AML Transaction Monitoring Data Governance
Executive summary
AML compliance centers on sifting through thousands of transactions and matching them
against risk profiles, with the goal of identifying suspicious transactions for focused
examination.
Financial institutions create a customer risk profile against which they measure specific
transactions as they occur. This profile serves as a yardstick for testing transactions and
identifying those that warrant in-depth examination.
The basic purpose of a strong AML transaction monitoring system is to identify
transactions that may indicate money laundering or terrorist financing, to protect the
institution from them, and to enable the institution to file the relevant Suspicious Activity
Reports (SARs).
Most financial institutions rely on AML software to cull the transactions and pick out the
potentially suspect ones; some smaller institutions use manually designed systems.
Automated AML solutions include sanctions/blacklist screening, customer profiling, and
comprehensive transaction monitoring with reports and alerts.
Transactions are monitored based on a customer profile and specific details relating to that
customer. The monitoring rules can reflect a number of factors relating to that customer (e.g.,
aggregate activity, transaction type, amount, frequency, and line of business).
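To make the rule concept concrete, the following is a minimal sketch of profile-based checks; the field names (such as expected_monthly_volume) and thresholds are illustrative assumptions, not a prescribed rule set:

```python
# Minimal sketch of profile-based transaction rules (illustrative only;
# field names like "expected_monthly_volume" are hypothetical).
from dataclasses import dataclass

@dataclass
class CustomerProfile:
    customer_id: str
    risk_level: str                 # "low", "medium", "high"
    expected_monthly_volume: float  # expected aggregate value per month
    usual_countries: set[str]

@dataclass
class Transaction:
    customer_id: str
    amount: float
    country: str
    month_to_date_total: float      # running aggregate for the period

def flag_transaction(txn: Transaction, profile: CustomerProfile) -> list[str]:
    """Return reasons the transaction deviates from the customer profile."""
    reasons = []
    if txn.month_to_date_total + txn.amount > 2 * profile.expected_monthly_volume:
        reasons.append("aggregate activity exceeds twice the expected volume")
    if txn.country not in profile.usual_countries:
        reasons.append(f"transaction involves unusual country: {txn.country}")
    if profile.risk_level == "high" and txn.amount > 10_000:
        reasons.append("large transaction for a high-risk customer")
    return reasons

profile = CustomerProfile("C-001", "high", 50_000.0, {"US", "GB"})
txn = Transaction("C-001", 12_000.0, "US", 95_000.0)
print(flag_transaction(txn, profile))
```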
When a transaction is flagged, a notice has to be generated, and a procedure for resolving the
red flag has to be defined and enforced. For flagged transactions, AML staff have to
investigate the specific circumstances surrounding the transaction. High-risk products, areas
of operation, business lines, and basic customer information can influence the amount of
transaction testing.
An investigation can lead to the filing of a SAR, an important source of information from
which law enforcement agencies initiate enforcement actions. Investigators collect all
relevant information, prepare a report, and submit it to a manager, who reviews it and
decides whether or not to file a SAR.
AML Structures and Processes:
1. Know-your-customer (KYC) procedures are the tools that help financial institutions gain a
detailed understanding of their customers, including their identity, citizenship status,
occupation, source of funds, volume and type of expected activity, countries with which they
do business, etc. By collecting this information and keeping it continually updated via
transaction monitoring, companies are able to assign their customers to high-, medium-, and
low-risk categories and apply further due diligence as appropriate.
2. Surveillance processes allow banks to monitor for money laundering typologies: people
moving money into and out of the bank very quickly; a pattern of “structuring,” in which
a customer continually makes deposits just below the reporting threshold; a single beneficiary
receiving money from multiple originators; customers who are depositing large sums and
making wire transfers to high-risk countries; and so on. Surveillance also typically includes
Office of Foreign Assets Control (OFAC) screening, in which bank customers’ names are
compared against lists of known terrorists and other high-risk individuals. (Illustrative
sketches of structuring detection and name screening follow this list.)
3. Investigations and reporting efforts are based on KYC and surveillance data. Once a
customer or transaction has been flagged, it goes through a case management workflow in
which analysts manually investigate the case and, where warranted, file SARs with the
Treasury Department’s Financial Crimes Enforcement Network (FinCEN).
4. Enterprise foundational and core components underlie the entire AML effort, assuring that
the institution has conducted a risk assessment to identify money-laundering and terrorist
financing exposures across its products, services, customers, and geographic locations;
understands how money-laundering and terrorist financing typologies apply across those
products, services, and geographies; has put the appropriate AML policies, procedures, and
training mechanisms in place; and runs regular audits to test its AML program controls.
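As referenced in item 2 above, the following sketches illustrate two of these surveillance capabilities. First, a toy structuring check that flags several deposits just below a reporting threshold within a short window; the threshold, band, and window values are assumptions for illustration, not regulatory guidance:

```python
# Illustrative sketch of a "structuring" typology check: repeated deposits
# just below a reporting threshold within a short window. The threshold and
# window values are assumptions, not regulatory guidance.
from datetime import date, timedelta

REPORTING_THRESHOLD = 10_000.0
NEAR_THRESHOLD_BAND = 0.9   # deposits at 90-100% of the threshold
WINDOW_DAYS = 7
MIN_OCCURRENCES = 3

def detect_structuring(deposits: list[tuple[date, float]]) -> bool:
    """Flag if several near-threshold deposits cluster within the window."""
    near = sorted(d for d, amt in deposits
                  if NEAR_THRESHOLD_BAND * REPORTING_THRESHOLD <= amt < REPORTING_THRESHOLD)
    for i in range(len(near) - MIN_OCCURRENCES + 1):
        if near[i + MIN_OCCURRENCES - 1] - near[i] <= timedelta(days=WINDOW_DAYS):
            return True
    return False

deposits = [(date(2024, 1, 2), 9_500.0), (date(2024, 1, 3), 9_800.0),
            (date(2024, 1, 5), 9_900.0), (date(2024, 2, 1), 2_000.0)]
print(detect_structuring(deposits))  # True: three near-threshold deposits in 4 days
```

Second, a minimal name-screening step in the spirit of OFAC list comparison; the watch-list entries and similarity cutoff are hypothetical, and production screening relies on vendor lists and far more sophisticated matching:

```python
# Minimal sketch of a sanctions/blacklist screening step. The watch list and
# the similarity cutoff are illustrative, not a real screening configuration.
from difflib import SequenceMatcher

WATCH_LIST = ["Ivan Petrov", "Acme Shell Trading Ltd"]  # hypothetical entries
MATCH_CUTOFF = 0.85

def screen_name(customer_name: str) -> list[tuple[str, float]]:
    """Return watch-list entries whose similarity exceeds the cutoff."""
    hits = []
    for listed in WATCH_LIST:
        score = SequenceMatcher(None, customer_name.lower(), listed.lower()).ratio()
        if score >= MATCH_CUTOFF:
            hits.append((listed, round(score, 2)))
    return hits

print(screen_name("Ivan Petroff"))  # near match -> review required
print(screen_name("Jane Doe"))      # no hits
```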
Transaction Monitoring Framework
If the risk analysis indicates that the use of a dedicated automated system is likely to be
effective as part of a risk-based AML transaction monitoring framework, some or all of
the following functional capabilities may be determined to be appropriate, including the
ability to:
Compare a client's or account's transaction activity during the reported period against
the relevant transaction history, over a period of time that the institution considers
reasonable and appropriate;
Compare customer- or transaction-specific data against risk scoring models;
Issue alerts if unusual and potentially suspicious activity is identified;
Track those alerts to ensure that they are appropriately managed within the financial
institution and that suspicious activity is reported to the appropriate authorities as
applicable;
Maintain an audit trail for inspection by the institution's audit function and by the bank's
supervisors;
Provide appropriate aggregated information and statistics.
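A minimal sketch of the first few capabilities follows: comparing a period's activity against the customer's history, raising an alert, and writing an audit-trail entry. The baseline statistic and the cutoff are illustrative assumptions:

```python
# Hedged sketch of the alerting capability described above: compare the
# current period's activity against the customer's history, raise an alert,
# and record an audit trail entry. Structures and thresholds are illustrative.
from datetime import datetime, timezone
import statistics

def check_activity(customer_id: str, current_total: float,
                   history: list[float], audit_log: list[dict]) -> dict | None:
    """Alert when the period total far exceeds the historical baseline."""
    baseline = statistics.mean(history)
    spread = statistics.stdev(history) if len(history) > 1 else 0.0
    alert = None
    if current_total > baseline + 3 * spread:   # assumed z-score style cutoff
        alert = {"customer_id": customer_id, "status": "open",
                 "reason": f"period total {current_total:,.0f} vs baseline {baseline:,.0f}"}
    audit_log.append({"ts": datetime.now(timezone.utc).isoformat(),
                      "customer_id": customer_id, "alerted": alert is not None})
    return alert

audit: list[dict] = []
print(check_activity("C-002", 240_000.0, [50_000, 55_000, 48_000, 60_000], audit))
print(audit[-1])
```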
Challenges
Lack of understanding of data – As financial institutions continue to grow and acquire and/or
update data sources, the enterprise and AML data governance team may fail to take into
account the downstream impacts to various applications, resulting in ineffective data usage.
Understanding the data requires not only knowledge of the technical lineage of the data, but
also the business knowledge to understand how the data is used within key business processes
and across the organization. This is one of the main reasons the AML compliance department
needs to drive, or at least be a key player in, the effort to understand data.
Data quality gaps – Many front-end systems and business processes capturing data for AML
may not populate key data elements (e.g., country of domicile, ISIN, counterparty) uniformly,
or may capture this data in free-form fields or hard-to-leverage formats. This limits the ability
to use this data for high-volume transaction analysis, leading to potential false positives or
overall misses in the identification process.
Lack of a centralized data dictionary and metadata – Many financial institutions do not have
dedicated resources (people and processes) who can act as data stewards and can educate the
downstream users on data changes, as well as decide how best to harness the data. Such data
stewardship is a key requirement in getting to know your data (KYD).
Technological gaps and challenges – Financial institutions are already inundated with both
structured and unstructured data, and the data flow is ever-increasing. Without common data
repositories/warehouses to support seamless integration, technology organizations are unable
to meet the business demands to integrate, process and sort this data on a timely basis.
Frequently, businesses attempt a solution through building data processes outside of IT
(“shadow IT” solutions). Unfortunately, this approach often exacerbates the problem. Many
times, these unsanctioned sources lack uniform master or reference data, may be using
outdated, inaccurate information, or may not have data of sufficient granularity.
Management silos – Larger institutions especially are often plagued by communication gaps
among departments. This can make effective data collaboration difficult, and often leads to
data duplication, disparate data processes and multiple versions of data transformation logic.
All of these issues make it difficult to centralize functions for AML compliance and result in
ineffective AML data analyses. KYD is key in integrating these silos by providing the
answers to important questions about the data – where it is stored, how it was created, what is
its definition, what business rules and standards have been applied to it, and how it is used
across the organization.
Other Challenges
Identifying and mapping the AML risks in scope across the financial institution's
products, customers, accounts, transactions, and associated reference data
Poor data integrity and validation capabilities
Lack of documentation and knowledge transfer for key data flows and operational
processes, and inadequate training
Inaccurate and inconsistent KYC and risk assessment data
Systemic deficiencies in transaction monitoring and customer due diligence processes
Inadequate system of internal controls and ineffective independent testing
Lack of central data governance and an inconsistent change control process
Managing regulatory expectations
Excerpts from the new NYS DFS regulation, Part 504 (Section 504.3(c))
The transaction monitoring and filtering programs should identify the data sources and
validate the quality of the data as it flows from its source into the monitoring and filtering
programs.
The specific aspects of this requirement include the identification of all data sources, data
extraction and loading processes to achieve accurate transfer and vendor selection processes,
among others.
Even the largest Regulated Institutions may grapple with challenges as they introduce new
systems or programs and quality assurance processes to meet this requirement.
(c) Each Transaction Monitoring and Filtering Program shall, at a minimum, require the
following:
In practice, common data deficiencies undermine these requirements (a feed-reconciliation
sketch follows this list):
Inaccurate extraction logic for the source data files, which can result in lost data
records for products, accounts, customers, transactions, and their associated
reference data
Incomplete coverage of the source data feeds mapped to the TM system, which results in
severe gaps in AML monitoring
Incorrect transformation and data mapping between the source data and the TM system,
which results in inconsistent or invalid data being monitored for AML
Lack of a consistent mechanism for identifying, defining, and classifying the data
elements critical to transaction monitoring
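As a hedged illustration of a control against the extraction and coverage deficiencies above, the sketch below compares record keys between a source extract and what the TM system loaded, surfacing lost records or unmapped feeds; all identifiers are illustrative:

```python
# Sketch of a feed-reconciliation control: compare record keys between a
# source extract and what the TM system loaded, to catch lost records or
# unmapped feeds. Names are illustrative.
def reconcile_feed(source_keys: set[str], tm_keys: set[str]) -> dict:
    """Report records dropped or unexpectedly added between source and TM."""
    return {
        "source_count": len(source_keys),
        "tm_count": len(tm_keys),
        "missing_in_tm": sorted(source_keys - tm_keys),   # lost records
        "unexpected_in_tm": sorted(tm_keys - source_keys),
    }

source = {"TXN-1", "TXN-2", "TXN-3", "TXN-4"}
loaded = {"TXN-1", "TXN-2", "TXN-4"}
print(reconcile_feed(source, loaded))
# {'source_count': 4, 'tm_count': 3, 'missing_in_tm': ['TXN-3'], ...}
```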
The deficiencies below result in poor data integrity, completeness, and validation
capabilities at a financial institution and lead to severe risks to monitoring
coverage:
Lack of a consistent mechanism for identifying new applications or changes to
existing applications that affect TM coverage and monitoring
Insufficient TM data governance, quality standards, assurance framework, and
process sustainability procedures, leading to poor oversight over the usage of data
assets
Insufficient identification and documentation of technical information about critical
data elements and data flows
Insufficient oversight of data flows and inadequate controls over data assets
For AML professionals already stretched and weary of continuing scrutiny, becoming
proficient in data management may sound like a lot of extra work, adding yet another layer
of complexity to an already difficult job. Still, AML departments stand to benefit the most
from knowing their data and from improved data management.
Data governance is the best approach to combining these three components: sophisticated
software applications, knowledge of what the customer needs, and an accurate
understanding of data definitions for inputting the appropriate data. Data governance
efforts are viewed well by regulators, who increasingly press financial institutions
to formally document business processes, data controls, and source-to-target mappings, and
to defend all activities around data management.
Data governance is a wide set of management and technical disciplines designed to ensure
that an institution has the right data available at the right time and that the data is accurate and
in the correct format required to satisfy specific business needs. Much like AML compliance
generally, technology enables the process, but it is specific business knowledge and context
being applied to a set of information that really adds the value.
While technology platforms are certainly enablers in supporting this governance (e.g., data
quality monitoring, centralized data dictionaries), AML leads must work closely with first-
line process owners to ensure good definition, ownership, and monitoring of the key data
assets required for the AML program. Technology components supporting this include the
management of master and reference data, which helps to ensure uniformity and improve
quality across data sets flowing from diverse systems.
From a transaction monitoring standpoint, a single customer who holds multiple accounts
and conducts multiple types of transactions will have their name, transaction details, and
other identifying information appear in multiple records across multiple systems.
Consolidating this information into a single customer record for monitoring purposes (to
prevent the same customer from generating duplicate alerts) can be facilitated through
strong reference and master data management.
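The consolidation step might look like the following toy sketch, where records from multiple systems are grouped under a simple match key; the key (normalized name plus date of birth) is an assumption, and real master data management uses much richer matching logic:

```python
# Illustrative sketch of consolidating multiple account records into a single
# customer view so one customer does not generate duplicate alerts.
from collections import defaultdict

def master_key(record: dict) -> tuple[str, str]:
    """Build a simple match key from normalized name and date of birth."""
    name = " ".join(record["name"].lower().split())
    return (name, record["dob"])

def consolidate(records: list[dict]) -> dict[tuple[str, str], list[dict]]:
    golden: dict[tuple[str, str], list[dict]] = defaultdict(list)
    for rec in records:
        golden[master_key(rec)].append(rec)
    return golden

records = [
    {"name": "Jane  Doe", "dob": "1980-04-01", "account": "A-100", "system": "retail"},
    {"name": "jane doe",  "dob": "1980-04-01", "account": "A-200", "system": "wealth"},
    {"name": "John Roe",  "dob": "1975-09-12", "account": "B-300", "system": "retail"},
]
for key, accounts in consolidate(records).items():
    print(key, "->", [a["account"] for a in accounts])
# ('jane doe', '1980-04-01') -> ['A-100', 'A-200']  ... one alertable entity
```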
Having a consistent transaction monitoring data governance framework can provide the
following benefits:
1. Efficient monitoring of data quality, ensuring the effectiveness of transaction monitoring
data
2. Reliable and sustainable control over data quality and completeness
3. Reduced operational and regulatory risk
4. Facilitated DQ issue management, root cause analysis, and prioritisation of DQ issue
remediation
5. Clarity around roles and responsibilities for data, resulting in better data management
The above benefits can be achieved by defining and implementing:
1. Data governance standards, policies, procedures, rules, and controls to test for data quality
and data completeness on an ongoing basis
2. Controls to monitor any changes to the business, products, systems, and transactions in the
AML landscape
3. A methodology, process, and procedures to implement the data quality framework and
controls
4. Control metrics to provide regular oversight of any issues, discrepancies, or enhancements
5. An issue management and remediation process
6. Change control and process sustainability embedded in the data governance framework
Establish a data governance policy and data councils. The governance policy should be
supported by a set of governance and decision-making bodies that ensure adequate adoption
of, and subsequent compliance with, the mandates in the policy.
Institute and enforce effective master and reference-data management programs. This will
enable the institution to uncover data structure issues and, in the event of data unavailability,
elicit new efforts to source data that downstream applications like AML transaction
monitoring systems can leverage to perform a more refined data analysis.
Define and implement data quality rules and the monitoring of data quality and controls. It is
also important to provide models that include minimum standards, types of controls, and
tools for continuous monitoring of data quality, and to assign responsibility for any problems
that may arise with the data.
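One way to picture such rules is the following sketch, in which each data quality rule carries an assigned owner and a tolerated pass rate before escalation; the rules, thresholds, and owner names are hypothetical:

```python
# Sketch of data quality rules with assigned owners and continuous monitoring.
# Rule definitions, thresholds and owner names are hypothetical.
from dataclasses import dataclass
from typing import Callable

@dataclass
class DQRule:
    name: str
    owner: str                       # who is responsible when the rule fails
    check: Callable[[dict], bool]    # returns True when the record passes
    min_pass_rate: float             # tolerated pass rate before escalation

RULES = [
    DQRule("country_populated", "kyc-data-steward",
           lambda r: bool(r.get("country")), 0.99),
    DQRule("amount_positive", "payments-data-steward",
           lambda r: r.get("amount", 0) > 0, 1.00),
]

def run_rules(records: list[dict]) -> None:
    for rule in RULES:
        passed = sum(1 for r in records if rule.check(r))
        rate = passed / len(records)
        status = "OK" if rate >= rule.min_pass_rate else f"ESCALATE to {rule.owner}"
        print(f"{rule.name}: {rate:.0%} pass -> {status}")

run_rules([{"country": "US", "amount": 100.0},
           {"country": "",   "amount": 250.0},
           {"country": "GB", "amount": -5.0}])
```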
Create a centralized repository for metadata, the data model/dictionary, data lineage, data
flows, and golden-source documentation. A centralized repository will help the institution
identify redundant data processes and eliminate them. This will streamline downstream
consumption and reduce the total cost of ownership of the various data sourcing
applications. It will also make new data processes less time-consuming and cheaper to
implement, thanks to a clearer understanding of the data available to support them.
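A centralized dictionary entry might capture, at minimum, the definition, golden source, lineage, and consumers of each element, as in this illustrative sketch (all names and paths are assumed):

```python
# Minimal sketch of a centralized data dictionary entry capturing definition,
# golden source and lineage. Field names and system names are assumed.
from dataclasses import dataclass, field

@dataclass
class DictionaryEntry:
    element: str
    definition: str
    golden_source: str
    lineage: list[str] = field(default_factory=list)   # source -> ... -> consumer
    consumers: list[str] = field(default_factory=list)

catalog: dict[str, DictionaryEntry] = {}

catalog["country_of_domicile"] = DictionaryEntry(
    element="country_of_domicile",
    definition="ISO 3166-1 alpha-2 country where the customer is domiciled",
    golden_source="core_banking.customer_master",
    lineage=["core_banking.customer_master", "edw.customer_dim", "tm.customer_feed"],
    consumers=["transaction_monitoring", "sanctions_screening"],
)

entry = catalog["country_of_domicile"]
print(entry.golden_source, "->", " -> ".join(entry.lineage[1:]))
```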
Support big data initiatives. Financial institutions are deluged with new data daily, and the
ability to incorporate new ways of monitoring the large volume of transactions and extract
value from the data is critical to effectively managing and maturing AML programs. It is
important for institutions to maintain strong data governance as it allows institutions to
transition easily to big data analytical platforms and tools through easier data integration.
Data quality is the degree to which data is accurate, complete, consistent, and relevant. The
data needs to be of high quality: fit for its intended use in operations, decision making, and
planning.
Data feed – a discrete data file with a file name, or a data extract from an API
Data record – an individual line of data composed of data elements separated by a
common delimiter
Data element – a single column or field containing a data point or a null value
Critical data elements (CDEs) are identified by analysing the fields that have a high impact
on the transaction monitoring system and meet at least one of the criteria below.
Data quality dimensions and rules should be adopted and implemented as part of the
organisational data policy, and may include the data quality checks below.
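As a hedged illustration of such checks, the sketch below applies three common dimensions (completeness, validity, uniqueness) to a toy record set; the specific dimensions and rules an institution adopts would come from its own data policy:

```python
# Illustrative checks for common data quality dimensions (completeness,
# validity, uniqueness) applied to a toy transaction record set.
import re

def completeness(records: list[dict], element: str) -> float:
    """Share of records where the element is populated."""
    return sum(1 for r in records if r.get(element) not in (None, "")) / len(records)

def validity_isin(records: list[dict]) -> float:
    """Share of records whose ISIN matches the standard 12-character shape."""
    pattern = re.compile(r"^[A-Z]{2}[A-Z0-9]{9}[0-9]$")
    return sum(1 for r in records if pattern.match(r.get("isin", ""))) / len(records)

def uniqueness(records: list[dict], element: str) -> bool:
    values = [r[element] for r in records]
    return len(values) == len(set(values))

txns = [{"id": "T1", "isin": "US0378331005", "country": "US"},
        {"id": "T2", "isin": "BAD",          "country": ""},
        {"id": "T3", "isin": "GB0002634946", "country": "GB"}]
print(completeness(txns, "country"), validity_isin(txns), uniqueness(txns, "id"))
```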
Data Controls Framework
The financial institution should have enterprise-level guidelines for data controls covering
the definition, implementation, execution, and monitoring of data-related controls, to ensure
that data risk is adequately mitigated.
Purpose of controls: control activities should be classified and implemented across the
end-to-end data lifecycle in order to control data risk.
Data control framework: the data control framework must be in line with the overall
operational risk framework; its components are simplified and adapted to be applicable to
data assets and aligned with the Enterprise Data Governance Policy.
Monitor & Control:
At this stage, the business-as-usual monitoring of scorecards, outstanding data quality issues,
the issue inventory, root cause analysis, remediation, and data quality improvements for the
organisation is performed. During this stage, production scorecards and dashboards are built,
ongoing control monitoring processes are put in place, operational owners and escalation
routines are formalised, and long-term planning is conducted.
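A business-as-usual scorecard could be as simple as the following sketch, aggregating open issues per data domain and flagging domains for escalation; the domains, severities, and escalation trigger are illustrative assumptions:

```python
# Sketch of a business-as-usual data quality scorecard: aggregate open issues
# per data domain and flag domains needing escalation. Names are illustrative.
from collections import defaultdict

issues = [  # (domain, severity, open?)
    ("customer", "high", True), ("customer", "low", True),
    ("transactions", "high", False), ("transactions", "medium", True),
]

def scorecard(issues: list[tuple[str, str, bool]]) -> dict[str, dict]:
    board: dict[str, dict] = defaultdict(lambda: {"open": 0, "open_high": 0})
    for domain, severity, is_open in issues:
        if is_open:
            board[domain]["open"] += 1
            if severity == "high":
                board[domain]["open_high"] += 1
    for domain, stats in board.items():
        stats["escalate"] = stats["open_high"] > 0   # assumed escalation trigger
    return dict(board)

for domain, stats in scorecard(issues).items():
    print(domain, stats)
```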
The team is responsible for ongoing monitoring of the design and operation of controls in the
first line of defence, as well as for providing advice and facilitating risk management
activities. The team should constantly and independently review the performance of key TM
controls, irrespective of the group, unit, or line of defence that performs them, in order to be
reasonably assured that the controls meant to identify and mitigate TM risks are designed
and operating effectively.
The QA program should define the minimum data quality standards for the key controls,
covering the associated risks, scope, frequency, sample size, review method, test scenarios,
documentation, reporting, and rates of compliance.
The QA program should also include a set of minimum operational standards for the
completion of QA reviews using an informed, independent, global process; processes for the
proactive identification of areas of TM risk so that they can be escalated promptly to senior
management; and requirements designed to ensure that effective and timely corrective action
plans are in place to address identified TM risks.
Controls verification testing and evidence documentation ensure that the requirements of
independent/internal audit are met with a high degree of confidence.