Data integrity can no longer be neglected in anti-money laundering (AML) programs

Editor's Note: This article originally appeared on the Gresham Tech blog on May 15, 2017.

The New York State Department of Financial Services (NYDFS) risk-based banking rule went into effect on January 1, 2017, and will have a significant impact on the validation of financial institutions’ transaction monitoring and sanctions filtering systems.  The final rule requires regulated institutions to certify annually that they have taken the necessary steps to ensure compliance. 

Data integrity is particularly interesting because it arguably hasn’t been given the same emphasis as other components of an effective anti-money laundering (AML) program, such as a risk assessment. 

There has always been an interesting dynamic in the way compliance and technology departments interact with one another.  This new rule will force institutions to trace the end-to-end flow of data into their compliance monitoring systems, which could be a painful exercise.  It will demand interaction between groups which may have stayed isolated in the past, and it will require some parts of the organization to ask tough and uncomfortable questions of others.  Inevitably, gaps will be found and remediation projects will have to be launched to address them. 

But does data integrity need to be such a painful endeavor?  Could there be a better way to streamline the process - and even create value for the institution as a whole - increasing the enterprise value of data (EvD) by ensuring its integrity?

Can New York regulators really have a global impact?

Any financial institution which is regulated by the NYDFS should have already reviewed this rule in detail.  Some have even suggested that if financial institutions haven’t launched projects to ensure they can certify by April 15, 2018, then they are already behind.

What about financial institutions with no physical presence in the United States?  Should these banks be paying close attention to this regulation?  The answer is unequivocally yes.  First of all, the rule itself is straight to the point and makes a lot of sense (it’s only five pages long).  But its significance goes beyond common sense and raising the standards and accountability for AML programs. 

Imogen Nash of Accuity explored the global implications of NYDFS regulations in a blog post back in August 2016, which are succinctly summarized below:

“On the surface, one regulatory body that oversees one state would seemingly have little influence on a global scale. Look a little deeper and you’ll find that New York City is not only the financial capital of the United States, but also that 90% of foreign-exchange transactions and 80% of trade finance transactions (which often involve global parties and cross-border currency exchange) involve the US dollar and flow through financial institutions in New York.”[1]

Ultimately, the argument is that as long as the U.S. dollar continues to hold its dominance among the world’s currencies, the reverberations of what the NYDFS requires will be felt across the globe.

What is data integrity?

Data integrity can be described as the accuracy and consistency of data throughout its entire lifecycle, as it is used by business and technical processes.  The importance of accurate data in a financial institution can be easily demonstrated with a simple mortgage use case.   

On day one, a customer is approved for a $1,000,000 mortgage with a monthly payment of $5,000.  Let’s imagine that the loan system which approved that mortgage had a “bug” in it and on day thirty, the outstanding loan balance suddenly went to $0.  Obviously, maintaining an accurate balance of outstanding loans is an important data integrity task for financial institutions.

Do banks really make these types of errors?

It wouldn’t be the norm, but there have been documented cases of unusual bank glitches, such as the Melbourne man whose account was mistakenly credited $123,456,789.

The miraculous $123M credit to a customer’s account is definitely an outlier, because most core banking systems function fairly well.  However, the reliability of a core banking system can’t necessarily be assumed to extend to transaction monitoring and sanctions screening systems. 

The fundamental difference between the two types of systems is that the latter need to aggregate data from many different banking applications which are in scope for compliance monitoring.  While each core banking application may function reliably on its own, when all of these disparate systems are aggregated into one compliance monitoring system, various data integrity and quality issues arise.

But it’s not only the aggregation of disparate systems which is a problem; it’s how these systems are aggregated, which is like “fitting a square peg into a round hole.”  Bank systems come in many shapes and sizes, but compliance monitoring systems are one-size-fits-all.  Just imagine Shaquille O'Neal trying to fit into a toddler’s sneakers and then playing a full-court basketball game.  Some things just don’t fit, and when things don’t fit, performance suffers.  It’s the same idea when forcing data into compliance monitoring systems.

Why is the extract, transform and load (ETL) process a serious risk for compliance?

The process of moving data from the core banking applications to the transaction monitoring and sanctions screening systems presents several challenges and risks which should be validated on an ongoing basis.  The process which moves data from one source system to another is generally referred to as the extract, transform and load (ETL) process.  Questions arise at each stage of the ETL process, such as the following (a simple reconciliation sketch follows the list):

  1. Was all of the data extracted properly?
  2. Was all of the data transformed from the source to the target system as designed?
  3. Was all of the data loaded into the target system successfully?
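One way to make the answers to these questions tangible is a simple reconciliation report between the source extract and the data loaded into the monitoring system. The sketch below is illustrative only, written in Python against hypothetical file and field names (core_banking_extract.csv, transaction_id, amount); it is not any vendor’s implementation, but it shows the kind of evidence-producing checks that could answer each question.

```python
# Illustrative sketch only: reconciles a source extract against the data loaded
# into a compliance monitoring system. File and field names are hypothetical.
import csv
from decimal import Decimal

def load_rows(path):
    """Read a delimited extract into a list of dictionaries."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def reconcile(source_rows, target_rows, key_field="transaction_id", amount_field="amount"):
    """Answer the three ETL questions with simple, evidence-producing checks."""
    report = {}

    # 1. Extraction completeness: every source record should reach the target.
    source_keys = {r[key_field] for r in source_rows}
    target_keys = {r[key_field] for r in target_rows}
    report["missing_in_target"] = sorted(source_keys - target_keys)
    report["unexpected_in_target"] = sorted(target_keys - source_keys)

    # 2. Transformation accuracy: control totals should still match after mapping.
    source_total = sum(Decimal(r[amount_field]) for r in source_rows)
    target_total = sum(Decimal(r[amount_field]) for r in target_rows)
    report["amount_difference"] = source_total - target_total

    # 3. Load completeness: record counts should agree.
    report["source_count"] = len(source_rows)
    report["target_count"] = len(target_rows)

    return report

if __name__ == "__main__":
    source = load_rows("core_banking_extract.csv")  # hypothetical extract file
    target = load_rows("tms_staging_load.csv")      # hypothetical loaded data
    print(reconcile(source, target))
```

Even a report this basic gives compliance a view into the “black box” and produces evidence that can be retained to support the annual certification.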

Unless a financial institution’s information technology (IT) department has already implemented data integrity and quality reports for their compliance partners, the ETL process which supports financial crime monitoring is nothing more than a “black box.”  In other words, there would be no reliable way to determine the effectiveness of the data migration processes and controls without some level of reporting outside of IT.

The excerpt below highlights two requirements in the NYDFS rule, but the key word to notice is “validation.”  Validating the integrity, accuracy and quality of data is actually a significant effort, because validation not only implies that a process is taking place, but it is also safe to assume that evidence will be requested to prove that the validation itself was complete and accurate.  Since the rule requires the Board of Directors or Senior Officer(s) to certify that their institution is complying, senior management should clearly require reporting around the ETL process, or “black box,” which feeds data into the transaction monitoring and sanctions screening systems.

2. validation of the integrity, accuracy and quality of data to ensure that accurate and complete data flows through the Transaction Monitoring and Filtering Program;

3. data extraction and loading processes to ensure a complete and accurate transfer of data from its source to automated monitoring and filtering systems, if automated systems are used;
Source: http://www.dfs.ny.gov/legal/regulations/adoptions/dfsp504t.pdf

It’s just data. No big deal, right?

Implementing data integrity and quality checks is no simple feat, and transaction monitoring and sanctions screening systems will have their own unique requirements.  Transaction monitoring systems (TMS) generally require some level of transformation logic which categorizes various transaction types into groups, which are then monitored by different AML models. 

There are other common transformations which occur to support the TMS, such as converting foreign currencies to a base currency used for calculation logic.  Obviously, you can’t simply sum transactions denominated in euros (€) and Japanese yen (¥), because the result would be meaningless. 
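As a rough illustration of this kind of transformation, the sketch below converts amounts to a base currency before aggregating them. The rate table, field names and base currency are hypothetical; a real TMS would source rates from a governed reference-data feed rather than a hard-coded dictionary.

```python
# Minimal sketch of normalizing transaction amounts to a base currency before
# aggregation. Rates, field names and base currency are illustrative only.
from decimal import Decimal

# Hypothetical daily FX rates to the institution's base currency (USD).
RATES_TO_USD = {"USD": Decimal("1.0"), "EUR": Decimal("1.08"), "JPY": Decimal("0.0067")}

def to_base_currency(amount, currency):
    """Convert a single transaction amount to the base currency."""
    return Decimal(str(amount)) * RATES_TO_USD[currency]

transactions = [
    {"amount": "10000", "currency": "EUR"},
    {"amount": "1500000", "currency": "JPY"},
]

# Summing raw amounts across currencies would be meaningless; convert first.
total_usd = sum(to_base_currency(t["amount"], t["currency"]) for t in transactions)
print(f"Aggregate exposure in base currency: {total_usd} USD")
```

Using Decimal rather than floating point is a deliberate choice here, since rounding drift in monetary aggregates is itself a data integrity issue.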

These transformation rules are also susceptible to data integrity issues if the source systems change their applications without informing the owners of the TMS, who may then need to make corresponding changes to their own processes.

Why data integrity projects have failed in the past

Data integrity is not a new concept for those who work with AML systems.  In fact, if industry professionals were asked to name the major impediments to the effectiveness of AML systems, data integrity issues would likely rank among the most common responses.  Some banks are still using the data integrity processes they built internally or with third parties, with varying degrees of success and sustainability.

One of the main issues is that many of these data integrity processes are built within the databases themselves, or a dedicated data warehouse is constructed for this purpose.  This may seem like a minor detail, but there are four ramifications:

  1. Exclusions: If any exclusions are applied to customer, account or transaction data, then it can be difficult to monitor what data is in scope and whether it was extracted properly.
  2. Square peg in a round hole: The transform process is susceptible to significant data integrity issues because the source system is forced to conform to the compliance monitoring system’s data structure.
  3. Data validation: Generally, if a financial institution does have any data validation processes, they will be performed in the “staging area” of the compliance monitoring system to ensure basic data integrity. Once the data validation process is pushed forward to the “staging area,” it essentially skips steps one and two of the ETL process, which can allow data integrity issues to go unidentified.  Another question which arises is whether the process is ongoing or was performed only once.
  4. Data lineage: When financial institutions use inflexible monitoring systems with strict data models, it presents a challenge to data lineage. How can the financial institution trace data in the monitoring system back to what’s in the source system?  Even if they could follow the breadcrumbs, did the transform step in the ETL process manipulate the data so extensively that it has become unrecognizable? (A minimal lineage sketch follows this list.)
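A minimal sketch of the lineage idea in point four, assuming hypothetical source, field and system names: if the transform step carries the source system and source key along with every record it produces, data in the monitoring system can always be traced back to its origin, however much the transform reshapes it.

```python
# Illustrative only: one way to preserve data lineage during the transform step,
# by carrying the source system and source key alongside every target record.
# Field, category and system names here are hypothetical.
from dataclasses import dataclass

@dataclass
class MonitoredTransaction:
    target_id: str
    amount: str
    category: str          # group used by the AML models
    source_system: str     # lineage: which application the record came from
    source_key: str        # lineage: the record's key in that source system

def transform(source_record, source_system):
    """Map a source record into the monitoring system's model without losing lineage."""
    return MonitoredTransaction(
        target_id=f"{source_system}-{source_record['id']}",
        amount=source_record["amt"],
        category="WIRE" if source_record["type"] == "SWIFT" else "OTHER",
        source_system=source_system,
        source_key=source_record["id"],
    )

record = {"id": "78912", "amt": "25000.00", "type": "SWIFT"}
monitored = transform(record, source_system="core_banking_eu")
# The lineage fields make it possible to trace this record back to its source.
print(monitored)
```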

A path forward under the spotlight

Data integrity doesn’t seem to be going away anytime soon, given that the NYDFS has just shone a spotlight on the issue.  In our highly fragmented and compartmentalized modern world, we may need to see the challenges we face through a new and objective lens. 

Traditional approaches to data integrity projects have yielded limited quantifiable results.  Isn’t it about time to try a more flexible solution which can evolve seamlessly with the institution’s regulatory requirements and technological landscape? 

Cost, flexibility and speed are three components of monitoring systems that financial institutions must manage effectively as regulatory requirements continue to expand due to growing threats such as cybercrime.  Data integrity is much broader than just compliance, but building the foundation to get it right for compliance monitoring can cross-pollinate into other areas of the enterprise.  Ultimately, data integrity can become a strategic business advantage when entering new markets and launching new products.

[1] Onaran, Yalman. “Dollar Dominance Intact as U.S. Fines on Banks Raise Ire.” Bloomberg, 16 July 2014. Web. Accessed 13 July 2016. http://www.bloomberg.com/news/articles/2014-07-15/dollar-dominance-intact-as-u-s-fines-on-banks-raise-ire