There seems to be some shock that bank execs might be dissatisfied with Big Data.
Big Data (the ability to analyse and understand massive data sets) has been heralded as revolutionary for financial services and is seen as a major IT trend.
The promise is huge. The compute power and analytic software now exist to interrogate unimaginably large amounts of data, on a scale that would previously have been impossible. The possible benefits of crunching your existing data, using it to understand what your customers want, and then already being aligned to meet their needs, seem limitless. Even more tempting is the idea that with more crunching of your existing data, risk can be better managed, events predicted and disasters potentially avoided.
Financial services, where data has been digital for decades, would seem the ideal industry for Big Data, so why the disillusionment?
I think there are a few reasons: two that could apply to almost any financial services IT project, and one that I think is unique to Big Data.
Firstly, IT industry marketing concepts have a major weakness: all too often the promise is not delivered, or takes too long to arrive, and Big Data seems no exception. Big Data is a catch-all term for a number of quite distinct concepts. You only need to think back to the hype and promise around CRM (Customer Relationship Management) or EDW (Enterprise Data Warehouse), also catch-all terms for some very complicated ideas, and then look at how those concepts developed in financial services to see the issues.
I rather like the Gartner Hype Cycle as an explanation. Technology develops rapidly, gets hyped, then gets over-hyped and fails to deliver. Despite that, there is real value in the technology, and as deployments come online and start to deliver some of that value, interest recovers and the technology delivers on some of the hype. I see Big Data as now past the peak of the hype, hence the disillusionment.
The second issue is also common to IT projects: projects fail. The 2012 study of IT project failure by McKinsey and the BT Centre for Major Programme Management at the University of Oxford found that outright failure, or at least failure to deliver benefits on time and on budget, was widespread. Having run very large projects in the past, I find that many of the Big Data projects I see meet the criteria that make a project high-risk. My criteria include design risks ("…is it a first of a kind?", "…is it bespoke?" and "…how many systems/vendors need integration?") and, perhaps more critically, project risks ("…does it have a clear owner inside the organisation?" and "…are the benefits defined/documented/agreed?"). Unsurprisingly, a Big Data project is almost by definition innovative and complex, and tries to deliver benefits that are new and therefore difficult to quantify, so the chances of failing to deliver on the hype are high.
There is a final problem, though, that is unique to Big Data. It's straightforward enough to run queries on lots of data and come back with answers. The challenge is actually extracting semantics from a data set (the actual meaning, or truth), and then understanding that truth and applying it. These steps are each complicated enough in their own right, and combining them makes Big Data uniquely challenging. CRM is again a good parallel. Customer Relationship Management may seem a simple concept, but for banks each step (interaction capture, data management, system integration with transactions, end-to-end security, etc.) is a large project in its own right, with much business process change. An example may help. An analysis of a complex trading system (such as a derivatives exchange) might reveal that its participants are closely interconnected and interdependent. That might identify systemic risk, but the real value of a semantic approach to Big Data is to understand how systemic risk applies to your specific situation in multiple scenarios, and what you need to do to mitigate it.
Furthermore, to really get value out of Big Data you need the rigour to understand what the data cannot tell you. Mr Rumsfeld was mocked for his "known unknowns" and "unknown unknowns", but knowing the limitations of the knowledge in your data is useful, especially when you start looking at the so-called "Black Swan" type of market events.
The presentation of the output, whether visual or numerical, is key to success. Without clear presentation, users often struggle to understand the true meaning of the Big Data work they are doing. This is an area where I think we will see further development, and the financial services sector (as an industry that works with abstractions) is likely to be one of the leaders. From what I've seen, presentation is where many Big Data projects struggle: they find it challenging to present the output of their analytics clearly. I've seen some very interesting examples of presentation from SAS and Palantir, but friendly user interfaces are still not the norm.
In short, I suspect the Gartner Hype Cycle will hold true and Big Data will deliver much more of its promise. For me, the crucial element in getting the real value is the development of the user interface and the ability to represent complexity clearly to many different types of audience.
This content is provided by an external author without editing by Finextra. It expresses the views and opinions of the author.