Originally posted on https://suade.org/learn/transforming-data-collection-fire-as-a-common-input-layer/
In January 2020, the Bank of England initiated a project entitled ‘transforming data collection’ with the aim of improving data at financial institutions and facilitating regulatory compliance processes. Among the Bank’s key considerations were common data inputs, modern reporting instructions, and changes to the reporting architecture. Stakeholders were asked to provide their input on the Bank’s proposals. In an exciting next step, the Bank recently released a follow-up paper summarising its findings from the initial written consultation round and subsequent interviews with financial institutions, industry bodies, and technology service providers. In this paper, the Bank sets out a plan, together with its key objectives, for transforming data collection at financial institutions.
Anyone familiar with financial regulation or regulatory reporting will understand that this is not an easy task. Since the global financial crisis (GFC) of 2007/2008, the regulatory burden on financial institutions has increased immensely. The GFC sparked a fundamental review of regulatory standards designed to improve the resilience of individual financial institutions and the financial system as a whole. The Basel Committee’s Basel III standards introduced many important reforms that have significantly complicated the tasks of regulatory compliance departments across the financial services sector ever since. The Bank’s initiative to transform data collection is an important step towards bringing technology systems and data management into the 21st century, ensuring that regulatory reporting and compliance teams can keep up with the increased regulatory burden. The Bank’s initial findings indicate three key reforms for transforming data collection.
Suade Labs welcomes the Bank’s ambitious plans. As a RegTech firm, we are passionate about all things data and driven by our mission to bridge the regulatory gap with modern technology. A cornerstone of this is FIRE (short for Financial Regulation), the open-source data standard that Suade’s RegTech engineers created with funding from the European Commission in 2016. This data standard has helped Suade revolutionise the regulatory reporting process by providing a common input layer that can be deployed across the financial services industry. We are excited to see our work reflected in the Bank’s plans for transforming data collection at financial institutions. What do the Bank’s findings on common input layers and data standards indicate?
Data standards as common input layers
The Bank highlights the strong support for common input layers that it found in its conversations with financial institutions and solution providers. A common input layer ensures consistency in data within an institution. In its conversations, the Bank found that an internally designed common input layer had helped one financial institution streamline data from 40 group entities. Solution providers’ common input layers have similarly helped simplify the process of managing data. This appears, however, to come at a high cost to financial institutions mapping their data to a solution provider’s unique input layer. The Bank suggests that a possible solution could be the design of industry data standards in connection with a common input layer. Such a data standard would offer more prescriptive definitions that cover operational processes at firms and thus further streamline data at financial institutions. The Bank’s analysis surfaced several challenges, which the common input layer contained in FIRE successfully resolves.
A data standard for the financial services industry
FIRE is an open-source data standard that is based on definitions contained in financial regulation. Because it is available open source, it is not proprietary to a particular vendor. As a result, it can be adopted and deployed by any financial institution that chooses to do so. Because its data points are based on definitions in financial regulation, FIRE covers exactly those data points that are required for data collection requests set out in financial regulation. This has several important benefits that address the challenges raised in the Bank’s review of its consultation exercise.
Operational data
Rather than defining an industry data standard in terms of operational processes, FIRE derives its definitions directly from financial regulation. Whereas redefining operational data could result in changes to operational processes, defining data in terms of financial regulation will not conflict with legacy financial products. In their responses to the consultation, several financial institutions raised concerns that legacy IT systems and financial products were so deeply intertwined that redefining operational data in legacy IT systems would adversely affect long-dated financial products, such as mortgages. This concern is easily addressed in FIRE.
Redefining operational data associated with mortgages inevitably affects operational processes associated with those mortgages. If, on the other hand, the data associated with a mortgage is defined in terms of the required data points contained in financial regulation, the operational processes associated with the mortgages are no longer affected. Instead of reinventing the wheel on operational processes, a data standard based on financial regulation merely identifies exactly that data which is required for compliance purposes and regulatory reporting. The operational processes concerning mortgage payments remain the same, while a smooth transition from a legacy IT system to the data standard is possible. It follows that FIRE offers a tool for transforming data collection without any disruption to operational processes.
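As a minimal sketch of this idea, consider mapping a legacy core-banking mortgage record onto regulation-derived data points. All field names below are hypothetical, chosen to resemble a FIRE-style loan object; they are not the official schema, and a real mapping would cover many more attributes.

```python
# Illustrative sketch: extract only the data points regulation asks for,
# leaving the legacy system and its operational processes untouched.
# Field names are hypothetical, not the actual FIRE schema.

def to_regulatory_loan(legacy: dict) -> dict:
    """Translate a legacy mortgage record into regulation-derived data points."""
    return {
        "type": "mortgage",
        "balance": legacy["outstanding_principal"],   # current exposure
        "currency_code": legacy["ccy"],
        "rate": legacy["interest_rate_pct"] / 100,    # decimal form
        "ltv": legacy["outstanding_principal"] / legacy["property_value"],
    }

legacy_record = {
    "outstanding_principal": 180_000,
    "ccy": "GBP",
    "interest_rate_pct": 3.5,
    "property_value": 240_000,
}

print(to_regulatory_loan(legacy_record))
# ltv = 180000 / 240000 = 0.75
```

The legacy record keeps its own shape and continues to drive payment processing; the regulatory view is derived from it on demand, so reporting needs never dictate how the operational system stores its data.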
Standardising heterogeneous data
Defining data in terms of financial regulation makes standardisation of heterogeneous data possible. Because financial products must comply with regulatory provisions, definitions contained in financial regulation act as a common denominator across heterogeneous financial products and financial institutions. All mortgage products, for instance, must comply with requirements concerning interest rate setting, loan-to-value ratios, and due diligence data, among many other factors. Rather than forcing one particular means of recording that data onto financial institutions, FIRE identifies the common denominator and helps financial institutions record their data in those terms.
Similar reasoning applies to differences in operations among financial institutions. The Bank’s findings highlight a concern that, in the process of defining a data standard, challenges arise over which operations to include in the standard. A data standard defined narrowly around mortgages, for instance, could end up providing clear definitions for a certain set of practices at the expense of others, thus undermining its effectiveness for the industry as a whole. Regulatory definitions have the power to address these concerns and offer a data standard that works for a wide range of common practices. The net stable funding ratio (NSFR) and the simplified NSFR (sNSFR) offer interesting use cases.
The NSFR requires institutions to maintain a ratio of at least 100% between their available stable funding and their required stable funding. The PRA’s recently published consultation paper on the Basel standards highlights the regulator’s desire to reduce the complexity of the NSFR for smaller institutions whilst keeping the sNSFR at least as conservative as the NSFR. To the extent that the NSFR and the sNSFR overlap, the definitions of the NSFR can be reused, while definitions specific to the sNSFR ensure that the smaller institutions to which it applies are appropriately covered. This way, a data standard is guaranteed to cover a wide range of operations in the financial services industry despite the heterogeneity of data. It follows that FIRE enables the standardisation of heterogeneous data.
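To make the ratio concrete, here is a minimal sketch of an NSFR check. The weighting factors below are simplified illustrations only; Basel III prescribes detailed factor tables per funding and asset category, and the category names here are hypothetical.

```python
# Simplified NSFR check: available stable funding (ASF) divided by
# required stable funding (RSF) must be at least 100%.
# Factors and category names are illustrative, not the Basel tables.

ASF_FACTORS = {
    "regulatory_capital": 1.0,
    "retail_deposits": 0.95,
    "short_term_wholesale": 0.0,
}
RSF_FACTORS = {
    "cash": 0.0,
    "residential_mortgages": 0.65,
    "other_loans": 0.85,
}

def nsfr(funding: dict, assets: dict) -> float:
    """Weighted ASF over weighted RSF."""
    asf = sum(amt * ASF_FACTORS[cat] for cat, amt in funding.items())
    rsf = sum(amt * RSF_FACTORS[cat] for cat, amt in assets.items())
    return asf / rsf

ratio = nsfr(
    {"regulatory_capital": 100, "retail_deposits": 800, "short_term_wholesale": 50},
    {"cash": 50, "residential_mortgages": 900, "other_loans": 200},
)
print(f"NSFR = {ratio:.0%}, compliant: {ratio >= 1.0}")
# → NSFR = 114%, compliant: True
```

The same category definitions and factors can be reused for an sNSFR calculation, with only the categories relevant to smaller institutions retained, which is exactly the reuse the consultation describes.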
Scaling complexity
FIRE makes data collection and reporting tasks scalable. The Bank’s findings indicate a concern that the bespoke solutions of larger solution providers are not scalable: they contain thousands of data points and require hundreds of staff to maintain. For larger institutions, this complicates the process of scaling reporting solutions across their global operations. Another key concern among firms is that the level of aggregation of a data standard cannot depend on the intended use alone. Rather, the data standard must be scalable across financial products. Bespoke data standards and reporting solutions appear unable to cope with this demand for scalability.
A data standard based on regulatory definitions offers scalability to address both concerns: data points can be reused for different jurisdictions and across different financial products. Because prudential standards in the banking sector are largely derived from the Basel standards, regulatory standards across jurisdictions are comparable, if not the same. Equally, because the global financial system is highly interconnected, financial products are similarly comparable. Data points based on financial regulation can thus be reused across different jurisdictions and for a variety of financial products. Any differences are likely attributable to distinct legal concepts in jurisdictions. These can be added as data points using regulatory definitions. As a result, the number of data points is kept to a minimum, thus reducing the number of people needed to maintain the standard. It follows that FIRE makes complex data collection requirements scalable.
Transforming data collection
The Bank’s work on transforming data collection is an important milestone in realising the benefits of modern technologies for the financial services industry. A key step is the development of a common input layer using industry data standards. Suade’s work on the open-source data standard FIRE addresses many of the concerns raised in the Bank’s review of the consultation evidence. FIRE removes concerns over disruption to operational processes, while standardising heterogeneous data and scaling complex processes.
This content is provided by an external author without editing by Finextra. It expresses the views and opinions of the author.