According to 2018 research from software market intelligence provider CAST, 47% of financial services organisations run 26% to 50% of their business on legacy systems – more than any other sector.
As data management in financial services becomes increasingly complex, Financial Institutions (FIs) have seen their customer and payments data become fragmented and scattered across hundreds of silos. The result is that transactions take place in any number of legacy payment systems, which must then integrate with multiple customer and reporting systems, none of which share uniform definitions for their underlying data fields – further weakening data-governance capabilities.
A key challenge with legacy technologies, then, is their inability to efficiently retrieve, transport, or share data internally or externally. This is complicated by the addition of new data lakes, data warehouses, and other repositories to meet growing storage demands and manage the exponential growth of client and compliance data.
To highlight this big data explosion, consider that IBM Watson estimates that 90% of the world’s data was created in the last two years. And one global financial institution recently reported storage demands surging from 100 petabytes in 2018 to 169 petabytes last year – a 69% increase. Also impacting visibility is the lack of centralisation in data storage, with most FIs storing data across diversified cloud, hybrid cloud, and on-premise servers.
Financial institutions’ lack of visibility into data can then be attributed to three intersecting problems: fragmented storage, aged and incoherent systems, and increasing data overload. The result is that these organisations can sometimes lack insight into critical regulatory reporting issues.
As such, it can be difficult for institutions not only to recognise reportable transactions, but also to identify, collect and store the transaction data they do need to report to AUSTRAC and other regulators. It also doesn’t help that the financial industry has been singled out as one of those suffering most from “stale,” or expired, data.
Since most financial services operations are just “paper with a digital wrapper,” according to 2017 research from data management provider Veritas, they are particularly vulnerable to dark data. In fact, Veritas found that 20% of financial institutions’ stale data was attributable to document files. And AUSTRAC reporting entities themselves often use Excel spreadsheets to mine transaction information.
Another specific issue facing AUSTRAC reporting entities is the new level of information they are now required to hold about transactions. For example, they must be able to report both their bank-to-bank MT202 cover payment (COV) messages and the multiple underlying MT103 transactions to AUSTRAC.
While an MT202 COV message tells you the value of a payment between banks, MT103 messages are the underlying transactions that together make up the value of the MT202 cover payment. Previously, the requirement was only to report the COV transaction, but following recent money-laundering scandals in Australia and beyond, AUSTRAC and other regulators are cracking down on COV reporting for direct, third-party, and non-bank financial institution (NBFI) entities. Now, all underlying MT103 transactions must be reported to regulators.
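To make the relationship concrete, the link between a cover payment and its underlying transactions can be sketched as a simple reconciliation check. This is a minimal illustrative sketch only, not a representation of actual SWIFT message parsing: the class names, the `related_cover` linking field, and the sample references are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class MT103:
    """An underlying customer credit transfer (illustrative model)."""
    reference: str       # transaction reference (hypothetical field name)
    related_cover: str   # reference linking it to its MT202 COV (assumption)
    amount: float

@dataclass
class MT202COV:
    """A bank-to-bank cover payment (illustrative model)."""
    reference: str
    amount: float

def underlying_transactions(cover: MT202COV, mt103s: list[MT103]) -> list[MT103]:
    """Return the MT103s that belong to the given MT202 COV."""
    return [m for m in mt103s if m.related_cover == cover.reference]

def reconciles(cover: MT202COV, mt103s: list[MT103]) -> bool:
    """Check that the underlying MT103s sum to the cover payment's value."""
    linked = underlying_transactions(cover, mt103s)
    return round(sum(m.amount for m in linked), 2) == round(cover.amount, 2)

# Example: one cover payment made up of two underlying transactions.
cover = MT202COV("COV-001", 1500.00)
txns = [
    MT103("TX-1", "COV-001", 1000.00),
    MT103("TX-2", "COV-001", 500.00),
    MT103("TX-3", "COV-999", 250.00),  # belongs to a different cover payment
]

print(reconciles(cover, txns))                    # the two linked MT103s sum to 1500.00
print(len(underlying_transactions(cover, txns)))  # only TX-1 and TX-2 are linked
```

The point of the sketch is the reporting gap the text describes: an entity that only sees `cover` cannot produce `txns` – the underlying MT103s have to be captured and linked somewhere for the reconciliation (and the regulatory report) to be possible.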
From a visibility standpoint, the issue this reform raises for AUSTRAC-covered entities is how to report the underlying MT103s when they only have visibility of the MT202 message. This is one area where a central regulatory data repository, or “single source of truth”, can help.
This is the second blog in a series looking at the technology challenges facing AUSTRAC reporting entities. To read the first blog click here. Or you can download the complete series as a whitepaper by clicking below.
Clare Rhodes is Identitii’s Director of Marketing and Communications, based in Sydney.