Creating a Shared Service for FRTB Compliance

Financial institutions are increasingly leveraging shared services, from Know Your Customer (KYC) compliance to post-trade reference data management, to reduce both cost and the compliance resources required. And as the data requirements associated with the Fundamental Review of the Trading Book (FRTB) become clearer, from the new risk models to the depth of historical information demanded, there is growing industry concern about the challenges ahead and the tight timescales.

From quote collection to risk factor approval, organisations are beginning to question the viability of institution-specific compliance activity. While there are undoubtedly challenges to address in areas such as instrument classification and determining the modellability of risk factors, the potential upsides of a single-service approach, one that leverages data pooling and sharing to mutualise risk factor creation and modellability approval, are compelling.

Early Collaboration

It has become patently clear over the past decade that early collaboration with regulators is an essential part of the compliance process. As organisations look for commonalities across regulatory data requirements, it is industry feedback on the procedures and standards needed to realise each requirement that now underpins the necessary change management programmes.

FRTB is a prime example. Since the standard was finalised in January 2016, organisations have started to get to grips with the data requirements associated with the new market risk calculation and reporting regime and its revised modelling methodology. FRTB's replacement of Value-at-Risk (VaR) with Expected Shortfall (ES) as the standard risk measure has particularly significant data implications.
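To make that difference concrete, here is a minimal Python sketch contrasting historical-simulation VaR at the familiar 99% level with ES at FRTB's 97.5% level. The simulated P&L series and function names are illustrative assumptions, not a prescribed implementation.

```python
import numpy as np

def historical_var(pnl: np.ndarray, confidence: float = 0.99) -> float:
    """Value-at-Risk: the loss level exceeded with probability 1 - confidence."""
    losses = -pnl
    return float(np.quantile(losses, confidence))

def expected_shortfall(pnl: np.ndarray, confidence: float = 0.975) -> float:
    """Expected shortfall: the average loss beyond the VaR threshold."""
    losses = -pnl
    threshold = np.quantile(losses, confidence)
    return float(losses[losses >= threshold].mean())

# Illustrative only: roughly ten years of fat-tailed daily P&L.
rng = np.random.default_rng(42)
pnl = rng.standard_t(df=4, size=2500) * 1_000_000

print(f"99.0% VaR: {historical_var(pnl):14,.0f}")
print(f"97.5% ES : {expected_shortfall(pnl):14,.0f}")
```

Because ES averages the entire tail rather than reading off a single quantile, every observation in the tail matters, which is one reason the depth and quality of the underlying history become so important.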

Most notably, the concept of non-modellable risk factors (NMRF) will require banks to demonstrate that the data going into their risk models is real, derived from actual transactions or committed quotes. The expected shortfall measure itself will be calibrated on a ten-year history. Regulators have become more prescriptive, not only about the content of the data (length of history and modellability) but also about enterprise-wide integration and explicit links to P&L and Prudent Valuation.
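As a rough sketch of what demonstrating modellability involves, the check below applies the criterion set out in the January 2016 text: at least 24 real price observations over the previous twelve months, with no more than one month between consecutive observations. Treating 'one month' as 31 days, like the sample dates, is a simplifying assumption.

```python
from datetime import date, timedelta

def is_modellable(observations: list[date]) -> bool:
    """FRTB real-price test: at least 24 observations in the period,
    with no gap between consecutive observations exceeding one month
    (approximated here as 31 days)."""
    if len(observations) < 24:
        return False
    ordered = sorted(observations)
    max_gap = timedelta(days=31)
    return all(b - a <= max_gap for a, b in zip(ordered, ordered[1:]))

# A hypothetical risk factor quoted weekly passes comfortably...
weekly = [date(2016, 1, 4) + timedelta(weeks=i) for i in range(52)]
print(is_modellable(weekly))       # True

# ...while an illiquid factor quoted quarterly fails on both counts.
quarterly = [date(2016, 1, 4) + timedelta(days=91 * i) for i in range(4)]
print(is_modellable(quarterly))    # False
```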

The depth, range, volume and quality of information now required are unprecedented. Where Basel II risk engines could work with relatively simple price histories, FRTB requires those histories to be managed as risk factors, which implies an understanding of their behaviour and relationships. Moreover, data quality is not just a modellability concern: the increased computational requirements mean data errors become harder and more expensive to correct.

Historical Data

From a data management perspective, this will demand the collection, analysis, validation and reporting of information across multiple product silos, organisational entities and risk areas. It raises two key issues: the need for a common data foundation and access to deep historical time series. FRTB, however, is just one component of a reinvigorated regulatory focus on historical data.

From identifying gaps in history, to flagging data that does not qualify for use because of inaccuracy, to adding external sources and proxies, institutions need a strong information management architecture to support the growing regulatory focus on historical time series data.
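As a sketch of the first of those tasks, the snippet below scans a date-indexed price history for gaps longer than a threshold, flagging the spans that would need an external source or a proxy. The 31-day threshold and the sample series are illustrative assumptions.

```python
import pandas as pd

def find_history_gaps(prices: pd.Series, max_gap_days: int = 31) -> list[tuple]:
    """Return (start, end, days) for each gap in a date-indexed price
    series longer than max_gap_days."""
    dates = prices.dropna().index.sort_values()
    gaps = []
    for prev, nxt in zip(dates, dates[1:]):
        delta = (nxt - prev).days
        if delta > max_gap_days:
            gaps.append((prev.date(), nxt.date(), delta))
    return gaps

# Hypothetical illiquid instrument with a six-month quiet period.
idx = pd.to_datetime(["2015-01-05", "2015-02-02", "2015-08-14", "2015-09-01"])
history = pd.Series([101.2, 100.8, 99.5, 99.9], index=idx)
print(find_history_gaps(history))
# [(datetime.date(2015, 2, 2), datetime.date(2015, 8, 14), 193)]
```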

Does it, however, make sense for each and every institution to collect transactional data, identify gaps, introduce new sources and validate ten years of history across every single risk factor? Few, if any, institutions routinely store real price data, so collaboration will be required at some level to fill the gaps. If each bank tries to solve the problem separately, not only will costs rise, but data gaps and inconsistencies will remain.

There is clearly an opportunity for a shared service model, where one provider undertakes to consolidate this information and provide it as a service to the market.

Data Challenge

The challenge in creating this unified model will be defining a common understanding of risk factors and then mapping and cross-referencing the data. Enterprise data management (EDM) will play a key role: collecting and reconciling quotation data in multiple formats from numerous banks, and cross-referencing different instrument classes and the alternative ways of labelling the same financial product types.
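To picture the cross-referencing role, the sketch below maps each contributor's local label for a product onto a canonical identifier maintained by the shared service. The contributor names, identifiers and Quote structure are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical cross-reference table: each bank's local label mapped to
# a canonical instrument identifier owned by the shared service.
XREF = {
    ("BANK_A", "EUR_IRS_10Y"): "IRS-EUR-10Y",
    ("BANK_B", "EURSWAP10"):   "IRS-EUR-10Y",
    ("BANK_C", "IRS.EUR.10"):  "IRS-EUR-10Y",
}

@dataclass
class Quote:
    contributor: str
    local_id: str
    price: float

def normalise(quote: Quote):
    """Resolve a contributed quote to its canonical identifier, or
    return None to flag it for manual mapping."""
    canonical = XREF.get((quote.contributor, quote.local_id))
    return (canonical, quote.price) if canonical else None

print(normalise(Quote("BANK_B", "EURSWAP10", 0.0125)))
# ('IRS-EUR-10Y', 0.0125)
```

Once every contribution resolves to the same canonical risk factor, pooled quotes from multiple banks can be counted together for the modellability test sketched earlier.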

With a common data foundation and a common basis on which to create or derive the various risk factors, the contribution of quotes to the shared service by multiple organisations would resolve the data acquisition problem: there should be no gaps, and hence no need for complex estimates.

The shared service can then leverage that pooled resource to undertake risk factor mapping and provide proof of modellability. The resulting 'on-demand' service would give institutions a cost-effective risk data foundation, free of the traditional data collection and supply chain costs and integration issues.

The benefits would extend beyond financial institutions. Regulators would have to approve the shared facility, but once risk factors and definitions are agreed, only the shared service would require audit, rather than each individual bank, significantly reducing the burden on regulators.

Proven Approach

The way the market has responded to other regulatory requirements, such as KYC, with new consolidated data providers clearly demonstrates the industry's appetite for shared services. Given the challenges financial institutions now face in meeting FRTB reporting requirements, there is a strong case for collaboration in the middle office.

Given the time constraints associated with FRTB, is it really viable for every institution to source and validate the required data from multiple internal and external sources, map that data to risk factors, and prove there is sufficient market data for each to be deemed modellable?

By sharing the data collection burden and creating a single, audited model for data structure and risk definition, a shared service will enable institutions to significantly reduce the financial and resource overhead associated with FRTB compliance.

The onus is now on the industry to engage with regulators and embark on a collaborative process to realise the benefits of this shared service approach.

