nizant [of building] the right lake so that the analytics that comes out of it truly drives value for employees or customers.”

But in the earliest stages of an institution’s data journey, management may want to avoid the temptation to invest in fancy (and expensive) data repositories. While there are powerful tools available, these may be beyond the current skill or ambition of a small institution just starting out. Instead, the bank or credit union may just need a place where the data is accessible in an actionable format, and it can consider upgrading as its maturity increases.

“There are a lot of new technologies out there, a lot of shiny distractions, and sometimes there’s just that eagerness to build a new report,” Sud says. “That leads to a lot of false starts because you may end up building something that does not drive value.”
How Will the Data Come In?
Just because a storage tool can hold a lot of data doesn’t mean an institution should do a massive extraction and data dump. Especially early on, it may be better to be selective about what goes into the database. That could mean an institution identifying an initial use case and the data it would need to address it, and then scrubbing and validating only that information.

No matter the source of the data, the information being used must be high quality: complete, accurate, normalized and scrubbed. This translates into data governance that defines, oversees and validates the quality of any information an institution uses in analytics or modeling.

“The No. 1 risk is the quality of your data,” says Dennis Klemenz, chief technology officer at Jovia Financial Federal Credit Union in Westbury, New York. “You don’t want data getting pushed out that is inaccurate, and then people start making inaccurate decisions from it.”

Data quality is something that institutions can fix, if need be. One way to do so is by implementing an ETL (extract, transform, load) process. ETL combines data from multiple sources into a large, central repository, applying business rules to clean and organize the raw data so it can be stored for future use. Daniel Haisley, chief product officer at digital banking provider Apiture, says companies can validate data when they apply a transformation to it, ensuring the data is both accurate and standardized. Standardizing data means following a uniform style for all information; for instance, this could be spelling out the word “Street” in customer addresses rather than abbreviating it.

“Finding those sorts of conventions and then enforcing them across the database is really important,” he says. “Financial institutions broadly just haven’t done a great job of that because it is a knife fight every single day.”
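As a rough illustration of the validation and standardization Haisley describes, the sketch below applies one such convention during the transform step of an ETL job: incomplete records are rejected, and street-type abbreviations are spelled out so every address follows the same style. It is a minimal Python example with hypothetical field names, not any particular vendor’s pipeline.

```python
import re

# Illustrative convention: spell out common street-type abbreviations
# so every address follows one uniform style.
ABBREVIATIONS = {
    r"\bSt\b\.?": "Street",
    r"\bAve\b\.?": "Avenue",
    r"\bRd\b\.?": "Road",
}

REQUIRED_FIELDS = ("customer_id", "name", "address")  # hypothetical schema


def standardize_address(address: str) -> str:
    """Apply the uniform-style rules to a raw address string."""
    cleaned = address.strip()
    for pattern, replacement in ABBREVIATIONS.items():
        cleaned = re.sub(pattern, replacement, cleaned)
    return cleaned


def transform(record: dict) -> dict:
    """Validate and standardize one record during the 'T' step of ETL."""
    missing = [field for field in REQUIRED_FIELDS if not record.get(field)]
    if missing:
        # Reject incomplete data before it reaches the central repository.
        raise ValueError(f"record rejected, missing fields: {missing}")
    cleaned = dict(record)
    cleaned["address"] = standardize_address(cleaned["address"])
    return cleaned


# "123 Main St." is loaded as "123 Main Street".
print(transform({"customer_id": "42", "name": "A. Smith",
                 "address": "123 Main St."}))
```

In practice, rules like these would run on every load so the convention holds across the whole database, which is the enforcement Haisley is referring to.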
Firms like Wealth Access can help institutions with this extraction and normalization, allowing them to create a universal customer record, says Andy Zinn, the firm’s chief innovation officer. An institution does this by applying the company’s intelligence layer to extracted data; the layer’s software is programmed to normalize the data and deposit it into whatever data storage architecture the institution selects. Zinn says it can be easier for institutions to first normalize the data as part of a customer record and then deposit it into a data lake, compared with putting all the data into the lake and trying to build the customer profile there.
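The normalize-first flow Zinn describes can be pictured with a short sketch. The Python below uses hypothetical field names and mappings (it is not Wealth Access’s software); it maps differently shaped source records onto a single universal customer record before anything is written to the data lake, rather than dumping the raw records and trying to reconcile them inside the lake.

```python
from dataclasses import dataclass, asdict


@dataclass
class CustomerRecord:
    """Hypothetical universal customer record, built before loading."""
    customer_id: str
    full_name: str
    email: str
    source_system: str


# Each source system names its fields differently, so a small mapping
# normalizes them into the shared shape first.
FIELD_MAPS = {
    "core_banking": {"id": "customer_id", "name": "full_name", "mail": "email"},
    "wealth_platform": {"client_no": "customer_id",
                        "client_name": "full_name",
                        "email_addr": "email"},
}


def normalize(raw: dict, source: str) -> CustomerRecord:
    """Translate one raw extract into the universal customer record."""
    mapping = FIELD_MAPS[source]
    fields = {target: raw[origin] for origin, target in mapping.items()}
    return CustomerRecord(source_system=source, **fields)


def load_to_lake(record: CustomerRecord, lake: list) -> None:
    """Stand-in for writing to whatever storage the institution selects."""
    lake.append(asdict(record))


lake: list = []
for raw, source in [
    ({"id": "42", "name": "A. Smith", "mail": "a.smith@example.com"},
     "core_banking"),
    ({"client_no": "42", "client_name": "A. Smith",
      "email_addr": "a.smith@example.com"}, "wealth_platform"),
]:
    load_to_lake(normalize(raw, source), lake)

print(lake)  # both rows now share one schema keyed by customer_id
```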