Introduction
Over the last three years, I have come across many companies trying to build their own data platforms. Unique approaches, aggressive architectures, and varied use cases underline the distinct features of each proposed solution. It is great to see that the industry is embracing the big data age, and even post-big-data evolutionary cycles, so willingly.
One thing that many companies underestimate is the speed of delivery, or the so-called time to market, of these platforms for internal stakeholders. In almost every big organization, the data evolution journey saw the adoption of data warehouses in the mid to late 2000s. Before the big data wave, standard data mining and data transformation strategies on data warehouses were all the rage. The telecom sector led this wave of data analytics, as many telecom service providers wanted to identify where they could maximize their revenue and expand their channels.
One big achievement of these data warehouses was the centralization of data in a single golden source. At the core of this data mining and data-insights revolution was the idea of centralizing the data in one place, flattening the schema, and running mining programs against this golden source to extract meaningful relations and insights.
Once all the different systems were plugging their data streams or batches into this golden source, new dependencies on analytics reports and business workflows developed over time.
The Change
With the idea of microservices gaining firm acceptance, coupled with ever-decreasing computing costs, it became evident that big data could be managed and optimized for insights more efficiently. The advent of ML models has boosted the use cases for this huge mountain of data: more hidden patterns can be examined to find the ever so elusive secret sauce of understanding consumer behavior, trend cycles, and even predicting product appeal among the clientele.
While many companies are on their way to building a holistic data platform, it is the urgency of doing so, and of getting things done, that is found wanting. The reason is not always a lack of project management; the choice of technology can restrict the implementation too. Many people with brilliant ideas from the developer community have jumped in to start ambitious open source projects, and the promise of these projects opens new possibilities.
These projects also increase the headaches of many organizations: when they plan to build a new system, the number of options available for the toolkit grows exponentially. Business users bringing in requirements to make the system fully capable of delivering features that match the latest buzzwords add further delays. This undermines the whole first-mover advantage for which these data platforms were envisioned in the first place.
Meanwhile, the teams that are quick to adopt the data platform find themselves on a cruise with few passengers. The dream of a party on a cruise where the whole company is heading in a new direction suddenly turns into an endless waiting game for other teams to adopt the platform, so that the desired data integration and further insights can be created. Many teams end up creating their own APIs as an interim solution to keep business use case development on track.
The harm this creates is that by the time other teams onboard the data platform, with its promise of a single golden source of data for the whole business, almost all of them already have alternate API bridge connections working. Refactoring code written only one or two years earlier to repurpose it for the data platform is seen as toil, and teams build a natural resistance to changing the code again, making data platform adoption a drag on the timeline.
Conclusion
To deliver a new-age single data platform that enables a Data-as-a-Service model, it is imperative to keep internal stakeholders and system teams engaged and involved. Planning for delays is practical, but ensure that no new sticky structures are created that would hamper adoption of the platform later; on the other hand, keep the data platform team staffed to a capacity that allows it to onboard critical customer use cases simultaneously.
Be precise and firm in the choice of toolkit; constant changes and too much experimentation will only add drag and toil to the overall system onboarding process. They also discourage system teams from early adoption of the platform, as those teams will have to keep changing their codebase along with the platform's changing toolkit.
This journey is exciting; make it enjoyable for your adopting teams and help them build great applications for end users.
This content is provided by an external author without editing by Finextra. It expresses the views and opinions of the author.