Banks must seize control of their dangerous data silos

Remaining compliant in today’s financial services industry requires a comprehensive surveillance function. Every call, message and communication between employees, customers and trading partners must be captured, stored and monitored to demonstrate compliance.

As regulatory requirements have become more demanding, many banks have sought to plug gaps in their coverage by adding new point solutions to their legacy technology. Each time a regulator makes a new request, more software is stitched into an ever-larger patchwork quilt of systems.

The result? With every added solution comes a data silo – a collection of information that isn’t fully or easily accessible to the bank because it’s recorded or stored differently from everything else. A typical tier one bank now runs countless voice-recording, trade surveillance and data-archiving systems.

Each formats and time-stamps data differently, and references the individual employees under surveillance in ways that are inconsistent with the other systems. In short, it’s a disparate mess, both in terms of the technology investment and the human resources required to filter through all the information.

Prohibitive export costs

Banks are rightly attempting to clean up by taking on the lengthy task of reducing the number of data silos they work with. But this isn’t without its own challenges, the biggest of which is the web of contracts that often accompanies the introduction of new surveillance technology.

The terms of many current framework agreements between banks and their archive and surveillance suppliers stipulate that data can only be extracted before the end of the contract if banks pay a charge that can stretch into hundreds of thousands of pounds for order management systems, and even higher for data-heavy voice recorders.

If a bank has to run a discovery process, the cost of exporting the data can be prohibitive, not to mention hugely time-consuming. Even after paying, it can take months for the bank to access the data, which will still be in a proprietary format. Changing the way a system records or formats data can incur a further charge.

As contracts come up for renewal, therefore, banks should negotiate hard for new terms that allow them to change the way data is formatted and to pull information out of applications – either continuously or on demand – so that a second, bank-owned copy can be created and maintained: a ‘golden source’, as it’s sometimes called.

Data in this copy can then be normalised and pooled with other normalised data sets so that anyone in the bank with analysis tools – from within compliance and beyond – is able to interrogate it as one.
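
To make this concrete, here is a minimal sketch in Python, assuming two hypothetical silo exports – a voice recorder and a trade-surveillance feed – whose field names, timestamp formats and employee references are invented for illustration. It maps both into a single normalised, bank-owned schema that any analysis tool could then query as one data set.

```python
# Minimal sketch: two hypothetical silo exports normalised into one schema.
# All field names, formats and identifiers are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class GoldenRecord:
    """Normalised schema shared by every silo in the bank-owned copy."""
    source_system: str
    employee_id: str      # single canonical identifier for the person
    event_time_utc: str   # ISO 8601, UTC
    event_type: str
    payload_ref: str      # pointer back to the original, unaltered record

# Hypothetical cross-reference from each silo's local user reference
# to one canonical employee ID.
EMPLOYEE_XREF = {"VOX-00412": "E1001", "j.smith@bank.example": "E1001"}

def from_voice_silo(rec: dict) -> GoldenRecord:
    # Assumed voice-recorder stamp: "DD/MM/YYYY HH:MM:SS", already in UTC.
    ts = datetime.strptime(rec["recorded_at"], "%d/%m/%Y %H:%M:%S")
    return GoldenRecord(
        source_system="voice_recorder",
        employee_id=EMPLOYEE_XREF[rec["extension_owner"]],
        event_time_utc=ts.replace(tzinfo=timezone.utc).isoformat(),
        event_type="call",
        payload_ref=rec["call_id"],
    )

def from_trade_surveillance(rec: dict) -> GoldenRecord:
    # Assumed trade-surveillance export: epoch milliseconds and an email.
    ts = datetime.fromtimestamp(rec["ts_ms"] / 1000, tz=timezone.utc)
    return GoldenRecord(
        source_system="trade_surveillance",
        employee_id=EMPLOYEE_XREF[rec["user"]],
        event_time_utc=ts.isoformat(),
        event_type="order",
        payload_ref=rec["order_id"],
    )

# Pooled, normalised view that can be interrogated as one.
golden = [
    from_voice_silo({"recorded_at": "03/02/2024 14:05:33",
                     "extension_owner": "VOX-00412", "call_id": "C-9"}),
    from_trade_surveillance({"ts_ms": 1706969133000,
                             "user": "j.smith@bank.example",
                             "order_id": "O-7"}),
]
```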

Even if a bank takes the strategic decision to maintain some silos, two considerations apply. First, it’s imperative that the bank can extract its data at any point, and that right must be written into the contract. Second, the bank must take stock of all the costs associated with each data silo so it can plan for the future.

Tilt the power balance

This approach helps shift the power dynamic between banks and their technology suppliers back in the banks’ favour, because a bank can run all of its reporting and analysis from its own holistic copy of the data. It is then free to change technology suppliers, avoiding being tied into silos and the costs that come with them.

Once banks have created a holistic data store that they own – albeit in copy form – they can also render it immutable, meaning nothing can be modified without leaving a clear audit trail. This can be achieved with blockchain-style techniques, in which records are hash-linked and distributed, making any tampering evident rather than leaving the data silently alterable.
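
As a rough illustration of the tamper-evidence idea – a simple hash chain rather than a full distributed ledger, with invented field names – the sketch below appends records so that each entry commits to the hash of the one before it; any silent edit then breaks verification.

```python
# Minimal sketch of a tamper-evident, append-only store.
# Field names and the JSON serialisation are illustrative assumptions.
import hashlib
import json

def _entry_hash(entry: dict) -> str:
    return hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode("utf-8")
    ).hexdigest()

def append(ledger: list, record: dict) -> None:
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {"record": record, "prev_hash": prev_hash}
    entry["hash"] = _entry_hash({"record": record, "prev_hash": prev_hash})
    ledger.append(entry)

def verify(ledger: list) -> bool:
    prev_hash = "0" * 64
    for entry in ledger:
        expected = _entry_hash({"record": entry["record"],
                                "prev_hash": entry["prev_hash"]})
        if entry["prev_hash"] != prev_hash or entry["hash"] != expected:
            return False  # chain broken: a record was altered or reordered
        prev_hash = entry["hash"]
    return True

ledger: list = []
append(ledger, {"employee_id": "E1001", "event_type": "call", "ref": "C-9"})
append(ledger, {"employee_id": "E1001", "event_type": "order", "ref": "O-7"})
assert verify(ledger)

ledger[0]["record"]["ref"] = "C-999"  # simulate a silent edit
assert not verify(ledger)             # the tampering is now detectable
```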

Furthermore, the cost of maintaining copies tends not to be as heavy as imagined, at least for less data-intensive applications. The data associated with voice and e-comms surveillance systems can be huge, but a second copy of order management records, or of the metadata generated by a bank’s other activities, isn’t a great deal of information, however large the operation. In fact, storing a copy of a data silo is relatively inexpensive, owing to cheap disk space.
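
As a back-of-envelope illustration only, with every figure below a hypothetical assumption rather than a measured benchmark, even a high-volume order-management copy stays in the tens of terabytes:

```python
# Back-of-envelope only: all figures are assumptions used to show why a
# second copy of order-level records is cheap next to voice archives.
ORDERS_PER_DAY = 5_000_000    # assumed order-management event volume
BYTES_PER_RECORD = 2_000      # assumed normalised record size (~2 KB)
RETENTION_YEARS = 7           # assumed retention horizon

total_bytes = ORDERS_PER_DAY * BYTES_PER_RECORD * 365 * RETENTION_YEARS
print(f"{total_bytes / 1e12:.1f} TB over {RETENTION_YEARS} years")
# ~25.6 TB under these assumptions, a modest footprint on commodity storage.
```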

Whatever approach banks take in solving the data silo challenge – whether fast-tracking the break-up of silos or opting for a more gradual reduction while negotiating better access to data – an essential first step is the creation of a ‘golden source’.  

Once banks have control of all the key metadata associated with their silos, they can build a fully integrated solution that reflects their interpretation of the regulations, the policies required for each individual’s role and the organisation’s stance on its corporate responsibilities.
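
Purely as a hypothetical illustration of what such an integrated view might hold – the rules attached to each role here are simplified placeholders, not a statement of any regulation’s actual requirements – a role-to-policy mapping could look something like this:

```python
# Illustrative only: a hypothetical mapping from roles to the surveillance
# policies and regulatory drivers applied to them. Values are placeholders.
SURVEILLANCE_POLICY = {
    "fixed_income_trader": {
        "regulations": ["MiFID II", "MAR"],
        "capture": ["voice", "chat", "email", "orders"],
        "retention_years": 5,
        "review": "daily lexicon and anomaly checks",
    },
    "research_analyst": {
        "regulations": ["MAR"],
        "capture": ["email", "chat"],
        "retention_years": 5,
        "review": "weekly sampling",
    },
}

def required_capture(role: str) -> list:
    """Channels that must be captured and monitored for a given role."""
    return SURVEILLANCE_POLICY[role]["capture"]
```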

Only by doing so will banks begin to solve this problem while also remaining compliant and reducing overall costs. Now is the time to act and start addressing a patchwork of systems that at worst could lead to trouble with regulators, and at best will cost a fortune to unstitch.

