In my previous entry on August 13, I discussed what we might consider a solid foundation for Risk and Finance initiatives. Obviously, it is not just about the data model, and that is the first point I would like to address today.
Many vendors striving to deliver a warehouse that serves more than a handful of users within a business department tout an extensive, comprehensive logical data model. The role of a logical model is important and uncontested, but its ability to act as a true project accelerator is greatly over-valued and over-sold.
There is not a single application in financial services that can run off a Third Normal Form (3NF) logical model. All applications need a physicalized model, and physicalizing a logical model, no matter how detailed, extensive and comprehensive, is no small or easy task.
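To make that concrete, here is a minimal sketch of what even the simplest physicalization step looks like: collapsing normalized tables into one reporting-ready structure. The toy 3NF tables (customers, accounts, balances), column names and figures are my own illustrative assumptions, not drawn from any vendor model.

```python
import pandas as pd

# Hypothetical 3NF-style source tables (illustrative names and values only).
customers = pd.DataFrame({
    "customer_id": [1, 2],
    "customer_name": ["Acme Corp", "Globex"],
    "segment": ["Corporate", "SME"],
})
accounts = pd.DataFrame({
    "account_id": [10, 11, 12],
    "customer_id": [1, 1, 2],
    "product_code": ["LOAN", "DEPOSIT", "LOAN"],
})
balances = pd.DataFrame({
    "account_id": [10, 11, 12],
    "as_of_date": ["2013-08-31"] * 3,
    "balance": [1_000_000.0, 250_000.0, 400_000.0],
})

# Physicalized, denormalized reporting table: one wide row per account,
# shaped the way analysts actually query, not the way the logical model
# normalizes the data.
reporting_fact = (
    balances
    .merge(accounts, on="account_id")
    .merge(customers, on="customer_id")
)

print(reporting_fact[["as_of_date", "customer_name", "segment",
                      "product_code", "balance"]])
```

Even this toy case glosses over the decisions (grain, history, surrogate keys, indexing, partitioning) that make physicalization a sizeable piece of work at warehouse scale.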
A typical iteration cycle of a data-centric project is:
Multiply this by every line of business, major application and group of key stakeholders. Where does the budget go?
So we are back to asking: how well does the technology truly reflect your business? What good is all the data in the world, in the fastest processing environment, if you have to pull it out and put it in a data mart to run your reports?
Therefore, as accelerators go, the one to value most is a well-conceived, business-relevant, wide-ranging physical model designed to drive analytics and reporting in the way business users actually operate. Requirements are locked down much faster, are far more likely to be right the first time, and can be implemented before major changes overtake them. Users see value sooner, and more meaningfully, than with any other alternative. But as I mentioned, it is not just about the data model, and many architecture discussions about supporting the needs of Risk and Finance stop right there.
In effect, does it not make sense to use this fantastic repository you have just created as the Golden Source of data for a single processing environment? What I mean by a processing environment is very simple. Consider a Data Warehouse already in place: in most instances the bank will use it as the source for calculations executed in separate risk engines, and will extract data from the warehouse to populate a given Data Mart. FTP, Regulatory Capital, Credit Economic Capital, Channel Performance and so on are all analytical applications that use that data for business decisions, and each effectively has its own decision, calculation and data processing engine.
Fundamentally, whether you are producing the result of a cost allocation or the calculation of Regulatory Capital, you are simply ‘doing stuff’ to data. Some of that ‘stuff’ is simple and prescriptive in nature; some is very complex, involving, for instance, stochastic calculus (think Economic Capital).
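Purely as an illustration of that spectrum, and with every name and number below being my own assumption rather than anything taken from a real engine, the range runs from a one-line prescriptive rule to a simulation-based measure:

```python
import random
import statistics

# Simple, prescriptive 'stuff': allocate a shared cost to business units
# in proportion to headcount (an illustrative allocation rule).
def allocate_cost(total_cost: float, headcounts: dict[str, int]) -> dict[str, float]:
    total_heads = sum(headcounts.values())
    return {unit: total_cost * heads / total_heads
            for unit, heads in headcounts.items()}

# Complex, stochastic 'stuff': a toy Monte Carlo estimate of economic
# capital as the 99.9th percentile of simulated losses minus the expected
# loss (grossly simplified, for illustration only).
def economic_capital(exposures: list[float], pd_: float, lgd: float,
                     n_sims: int = 100_000, quantile: float = 0.999) -> float:
    losses = []
    for _ in range(n_sims):
        loss = sum(e * lgd for e in exposures if random.random() < pd_)
        losses.append(loss)
    losses.sort()
    var = losses[int(quantile * n_sims) - 1]
    return var - statistics.mean(losses)

print(allocate_cost(1_000_000, {"Retail": 300, "Corporate": 150, "Markets": 50}))
print(round(economic_capital([5e6, 3e6, 2e6], pd_=0.02, lgd=0.45), 2))
```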
Since each of these engines is essentially doing its own calculation, why not take the synergy a step further and have a single processing layer where all the calculations and transformations are executed in a simple, consistent manner, with only one definition of dimensionality, security, auditability and so on?
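One way to picture such a layer, as a sketch built entirely on my own assumptions (the class, dimensions and calculations are invented for illustration), is a thin registry that runs every calculation against the same Golden Source rows, enforcing one definition of dimensionality and writing one audit trail:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Callable

# Shared dimensionality: every calculation sees the same keys.
DIMENSIONS = ("business_line", "product", "as_of_date")

@dataclass
class ProcessingLayer:
    """Illustrative single processing layer over a golden-source dataset."""
    golden_source: list[dict]                      # rows from the warehouse
    calculations: dict[str, Callable] = field(default_factory=dict)
    audit_log: list[str] = field(default_factory=list)

    def register(self, name: str, func: Callable) -> None:
        self.calculations[name] = func

    def run(self, name: str) -> dict:
        # One definition of dimensionality, checked once for every calculation.
        for row in self.golden_source:
            assert all(dim in row for dim in DIMENSIONS), "row missing a shared dimension"
        result = self.calculations[name](self.golden_source)
        # One consistent audit trail instead of one per engine.
        self.audit_log.append(f"{datetime.utcnow().isoformat()} ran {name}")
        return result

# Two very different calculations, one data set, one execution path.
def total_balance_by_line(rows):
    out = {}
    for r in rows:
        out[r["business_line"]] = out.get(r["business_line"], 0.0) + r["balance"]
    return out

def regulatory_capital(rows, risk_weight=0.08):
    return {"capital": sum(r["balance"] * risk_weight for r in rows)}

layer = ProcessingLayer(golden_source=[
    {"business_line": "Retail", "product": "LOAN", "as_of_date": "2013-08-31", "balance": 1_000_000.0},
    {"business_line": "Corporate", "product": "LOAN", "as_of_date": "2013-08-31", "balance": 4_000_000.0},
])
layer.register("total_balance_by_line", total_balance_by_line)
layer.register("regulatory_capital", regulatory_capital)
print(layer.run("total_balance_by_line"))
print(layer.run("regulatory_capital"))
print(layer.audit_log)
```

The point is not the code but the shape: the calculations differ wildly, yet dimensionality, controls and audit live in exactly one place.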
In our next and final blog entry, we will look at some examples of the benefits seen by customers who have embarked on this journey.