Jon Asprey, Vice President, Strategic Consulting at Trillium Software, looks at what institutions need to do now to ensure their client data will support FATCA compliance
In January 2013, the United States Treasury Department and Internal Revenue Service (IRS) released the long-awaited final regulations for the Foreign Account Tax Compliance Act, commonly referred to as “FATCA.” While significant changes have been made, much remains the same with regard to client identification and data due diligence
requirements. At least nine nations have now signed or initialled Intergovernmental Agreements (IGAs) with the US Treasury, including the United Kingdom, Ireland, Norway, Denmark, Spain, Switzerland, Italy, Germany and Mexico. With 50 or more countries engaging with the Treasury, more signatures will follow.
The deadline for ensuring client onboarding compliance is January 2014; pre-existing payees that are prima facie FFIs must be documented by 30 June 2014; and the first reporting deadline has been set at 31 March 2015. There is now limited time for the world's foreign (non-US) financial institutions (FFIs) to assess their position and ability to comply.
For many FFIs, a significant risk to achieving compliance lies within their client data quality processes. In particular, client onboarding data capture is a point of concern, as is the quality of existing client data.
Client identification problems
Under FATCA, FFIs must accurately identify, classify and report accounts held by US taxpayers or US majority-owned corporations to the IRS. This is not an easy task, and institutions would be well advised to urgently assess their ability to undertake this process effectively. They should not rely upon gut feel and assumptions regarding the client data they currently collect and hold.
To identify clients that should be investigated for potential reporting, FFIs need to search their electronic account data for indicia (indicators) suggesting an account might belong to a US taxpayer. Such information could include US citizenship or residency status, a US birth place, or a US correspondence address or telephone number. Of course, most firms will seek to use software-based tools to make such initial searches before approaching clients for documentary evidence proving their status under FATCA one way or the other. But there's a problem. Automated searches can only work effectively where the institution's client data is complete, accurate and consistent. In reality, institutions' data is often of varying quality, full of gaps and scattered across a web of disjointed systems. The search process is made even more difficult by the complexity of the client relationships that often exist for high-net-worth individuals and commercial entities.
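As a minimal sketch of what such an automated first-pass search involves, consider the checks below. The record fields (citizenship, birth_place, address, phone) and the rules themselves are illustrative assumptions, not a real schema or a complete list of the indicia the regulations define:

```python
# Hypothetical sketch of a first-pass US-indicia scan over client records.
# Field names and rules are illustrative only; real scans cover the full
# set of regulatory indicia and must cope with inconsistent source data.

US_INDICIA_CHECKS = {
    "us_citizenship": lambda r: r.get("citizenship", "").upper() in ("US", "USA"),
    "us_birth_place": lambda r: "UNITED STATES" in r.get("birth_place", "").upper(),
    "us_address": lambda r: r.get("address", "").upper().endswith(("USA", "UNITED STATES")),
    "us_phone": lambda r: r.get("phone", "").replace(" ", "").startswith("+1"),
}

def find_indicia(record):
    """Return the names of the US indicia a single client record triggers."""
    return [name for name, check in US_INDICIA_CHECKS.items() if check(record)]

def screen_accounts(records):
    """Yield (record, indicia) pairs for records that need documentary review."""
    for record in records:
        hits = find_indicia(record)
        if hits:
            yield record, hits
```

Note that every rule here presumes clean, consistently formatted values; on real data of varying quality, each check would first need standardisation of the underlying field, which is precisely the data quality problem described above.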
As financial institutions get deeper into the practical application of FATCA guidelines, many are just starting to think about the complexities. These could include multiple account touch points resulting from “power of attorney” stipulations or different billing and mailing addresses. A private banking client may have several accounts with multiple addresses for correspondence and may also have nominated a signatory on a number of these accounts.
All of these contact details must be screened for FATCA indicia; the client should then be classified and corresponded with if further information is required. There may also be varying levels of data completeness across other important details such as nationality and place of birth, where different client onboarding systems are involved.
One important area of FATCA causing FFIs concern is the requirement to determine the status of legal entities holding accounts. For example, they must classify whether an entity is an FFI, a non-financial foreign entity (NFFE), a US financial institution (USFI) or exempt/deemed-compliant. This obligation requires the business type of the entity to be identified first. Some organisations plan to rely on their existing standard industrial classification (SIC) coding of entity customers to drive FATCA classification. However, without any view on the completeness and accuracy of the SIC coding, it cannot be relied upon as a robust source of client business type.
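The risk in the SIC-driven approach can be made concrete with a small sketch. The SIC-to-FATCA mapping below is entirely hypothetical (it is not a regulatory mapping), and the point is the fallback branches: records with missing or unrecognised codes must be surfaced for review rather than silently classified:

```python
# Hypothetical sketch of driving a first-pass FATCA entity classification
# from SIC codes. The mapping is illustrative only; the key design point
# is that gaps in the coding are exposed, not papered over.

SIC_TO_FATCA = {
    "6021": "FFI",    # national commercial banks (illustrative)
    "6211": "FFI",    # security brokers and dealers (illustrative)
    "2834": "NFFE",   # pharmaceutical preparations (illustrative)
}

def classify_entity(sic_code):
    """Classify an entity from its SIC code, flagging gaps for manual review."""
    if not sic_code:
        return "UNCLASSIFIED: missing SIC code"
    return SIC_TO_FATCA.get(sic_code, "UNCLASSIFIED: unknown SIC code")
```

An institution could run such a pass over its entity book simply to measure how many accounts fall into the "UNCLASSIFIED" buckets; that number is a direct measure of how far its SIC coding can actually be trusted to drive classification.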
Given these complexities, one global institution retained the Trillium compliance team to determine its ability to detect which of its many millions of client accounts held US indicia. Following an assessment, the team presented clear evidence that as many as 100,000 accounts appeared to have data issues requiring their further review. Either the data would need to be rectified, or each account would need manual investigation before it could be confidently verified as either subject to, or not subject to FATCA. Compliance for that institution will potentially involve many thousands of hours of work in account identification alone; work it hadn’t planned for.
Towards a solution
In order to plan their compliance programmes, FFIs now need to determine whether their existing client data and client onboarding processes are fit for purpose for FATCA. They need to determine the scale of any data or process issues and where the gaps lie. They then need to remediate client data errors and apply improved controls in client onboarding.
To undertake such a data management programme effectively, the institution needs to first define and configure the rules by which it will assess records for indicia. It must also determine how it will profile data that might be held across a wide variety of systems in a wide variety of electronic formats. Pre-configured data profiling tools are the way to go here. They eliminate the need for custom coding—saving time, cost and unproductive maintenance effort. They also support accurate reporting. Such tools can also offer sophisticated matching technology to support client data alignment and be used to show complex customer entity relationships.
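The most basic profiling rule is completeness: for each field relevant to FATCA screening, what proportion of records actually carries a usable value? The sketch below shows that single rule, under the assumption of simple dictionary-shaped records with illustrative field names; commercial profiling tools apply far richer rule sets (format, validity, cross-field consistency) without custom coding:

```python
# Minimal sketch of one data profiling rule: field completeness.
# Field names are illustrative; real profiling covers many rule types.

def profile_completeness(records, fields):
    """Return {field: fraction of records with a non-empty value}."""
    totals = {f: 0 for f in fields}
    for record in records:
        for f in fields:
            value = record.get(f)
            if value is not None and str(value).strip():
                totals[f] += 1
    n = len(records) or 1  # avoid division by zero on an empty extract
    return {f: totals[f] / n for f in fields}
```

Run across each source system in turn, even this one rule gives an institution a defensible, numeric answer to "can our data support an indicia search?" before any remediation work is scoped.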
Accounts that are positive for FATCA indicia should then be put into a central repository in which client details can be linked together. A financial institution’s client service teams can then have the confidence to draw from this central repository of client information and centrally coordinate communications to affected clients. Reliable client information will also support accurate, consistent and timely communications with individual and commercial clients, ensuring they are not wrongly subjected to onerous FATCA related processes that could impact account retention.
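Linking flagged accounts into a single client view can be sketched as grouping on a client key. The exact-match key used below (name plus date of birth) is a deliberate simplification for illustration; production matching technology of the kind described above uses fuzzy and probabilistic techniques precisely because real client data does not yield reliable exact keys:

```python
# Simplified sketch of linking flagged accounts under one client view.
# The exact-match (name, date_of_birth) key is illustrative only;
# real matching engines use fuzzy/probabilistic record linkage.

from collections import defaultdict

def link_accounts(accounts):
    """Group account records by a simple client key."""
    clients = defaultdict(list)
    for acct in accounts:
        key = (acct["name"].strip().upper(), acct["date_of_birth"])
        clients[key].append(acct["account_id"])
    return dict(clients)
```

With accounts grouped this way, a client with several flagged accounts receives one coordinated communication rather than several, which is the account retention benefit the central repository is meant to deliver.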
The bottom line
With the final regulations published and key nations now swiftly signing up to IGAs, FATCA is sure to become a reality. With the deadlines for compliance looming, FFIs would be well advised to assess their existing client data and onboarding processes. In doing so now, they should then be able to scope, plan and resource necessary work in good time to ensure they possess the necessary high-quality client information to support compliance and ease the burden on their clients.
You can keep up with Jon Asprey’s thoughts via the Trillium Software Insights blog at http://blogs.trilliumsoftware.com/trilliuminsights/ or for more details on the importance of data quality to banking and compliance, visit www.trilliumsoftware.com/banking
Vice President, Strategic Consulting
Harte-Hanks Trillium Software
Jon leads the advisory consulting practice at Trillium Software where he is responsible for the delivery of best practice advice and guidance to Trillium’s many international clients. He has over 15 years’ experience in information management, data quality management, data governance and data analysis, gained through working for both global consultancies and software vendors across a variety of international financial services clients.
During the course of his work, Jon has advised senior business officers at a number of global financial services firms in support of Governance, Risk & Compliance (GRC) initiatives within credit risk and regulatory compliance engagements linked to FATCA, Dodd-Frank, Basel II, Solvency II, customer deposit guarantee (FSCS) and credit risk data assurance.
Prior to his role at Trillium, Jon held senior consulting positions at both HP Consulting and Deloitte LLP, where he was the data quality lead within the Enterprise Risk practice.