The importance of distributing market data in the cloud



Capital markets firms have long understood how the cloud can help establish competitive advantage, increase agility, reduce time to market for new products, and address the growing total cost of ownership of IT infrastructure. However, the financial services industry has reached a point where real-time trading and risk management have become more critical than ever, particularly amid market volatility and global economic uncertainty.

The Covid-19 pandemic forced firms to accelerate digital transformation, further escalating the need to derive actionable insights from data so that informed trading decisions can be made. Data can no longer be fragmented or siloed: capital markets firms must be prepared to rethink their internal technical footprint and rearchitect their data management infrastructure so that resilience and scalability continue to take precedence and business innovation can expand.

Finextra spoke to Julie Hutchinson, capital markets industry specialist for exchanges and trading platforms at Amazon Web Services (AWS); Jonathan Reeve, SVP of content and technology solutions at FactSet; and Demetry Zilberg, chief technology officer at FactSet. The conversation centred on the role of the cloud in market data distribution, enterprise-wide modernisation, cost reduction, and giving developers the tools and easier access to market data needed for application development.

Click here to read the AWS and FactSet report on how to benefit from data in the cloud.

Elasticity and scaling up to handle volatility

Hutchinson highlighted that with the pandemic and the shift to remote working that followed, financial players that needed connectivity to centralised infrastructure, but lacked access to centralised corporate resources from home or the right tooling to handle fluctuating capacity requirements, were at a loss.

“Customers had to have that elasticity and access to compute-on-demand to scale up to handle market volatility resulting in data amounts that were anywhere from two to three times previous peak usage. Think back to the first week of March [2020] when we were just going into the pandemic in the US, we had market volatility like we’ve never seen; it broke the volume record four days in a row.”

She continued: “Our customers needed to have that elasticity to scale up, but then when the markets settled down, they were able to turn everything off.” Other unexpected pandemic-related challenges included disruptions and delays in the supply chain for hardware shipments, as well as people not being able to gain access to physical data centres when cities were in lockdown.
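That scale-up, scale-down pattern is simple to express in code. A minimal sketch – assuming an AWS environment, the boto3 Python SDK, and a hypothetical Auto Scaling group of market data feed handlers – might look like this:

```python
# A minimal sketch of elastic scaling, assuming boto3 and a hypothetical
# Auto Scaling group named "market-data-feed-handlers".
import boto3

autoscaling = boto3.client("autoscaling", region_name="us-east-1")

def set_capacity(desired: int, maximum: int) -> None:
    """Adjust fleet size up for volatile sessions, down when markets settle."""
    autoscaling.update_auto_scaling_group(
        AutoScalingGroupName="market-data-feed-handlers",  # hypothetical name
        MinSize=2,
        MaxSize=maximum,
        DesiredCapacity=desired,
    )

set_capacity(desired=30, maximum=60)  # ~3x normal footprint for a volume spike
set_capacity(desired=10, maximum=20)  # back down once volatility subsides
```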

“Dealing effectively with those challenges proved a major advantage that customers already in the cloud were able to experience,” Hutchinson said. Zilberg agreed and added that financial institutions did not slow down their transformation in any way throughout the Covid-19 pandemic and “kept their foot on the gas.”

He went on to say that “it certainly was important to have the capability to scale certain types of infrastructure in the cloud. During the pandemic, trading activity became substantially more volatile just as lockdowns were being implemented. Things changed incredibly fast, which presented an opportunity for action. The ability to scale and have burst capacity in the cloud was critical for us when it came to pivoting to product opportunities, accelerating digital transformation programmes, and integrating data into various collaboration and chat platforms quickly to support teams who suddenly found themselves working remotely.”

Migrating real-time data to the cloud to unlock business intelligence

Hutchinson said: “Moving financial content to the cloud will continue to be important in driving broader adoption – as you can’t hydrate a data lake, run analytics or build market predictions without data.” Financial firms are utilising a wide variety of data types, including satellite imagery, IoT sensor data and news, and are even extracting insight from paper documents by making them digital. One of the more challenging aspects of managing disparate data sets is that they are not all structured in the same way, with common fields and formats.

Reeve believes that the first step is making data usable, and for this to happen efficiently, firms will need to leverage the on-demand, elastic nature of the cloud. “It has to be connected and do what we call ‘concordance’, which is allowing these data sets to talk to each other. It doesn’t matter if you’re looking at a trade, a piece of company information, an annual report, a transcript or a broker estimate. They have to be connected. You’ve got to have the ability to travel up and down the content hierarchy, and that makes it infinitely usable.”
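As a toy illustration of concordance, the sketch below joins hypothetical data sets on a shared ticker column using pandas; real mastering services resolve identifiers such as tickers, ISINs and CUSIPs far more rigorously than a naive join:

```python
# A toy illustration of 'concordance': disparate data sets linked through
# a shared security identifier so they can "talk to each other".
# File and column names here are hypothetical.
import pandas as pd

trades = pd.read_csv("trades.csv")               # ticker, price, volume
fundamentals = pd.read_csv("fundamentals.csv")   # ticker, revenue
transcripts = pd.read_csv("transcripts.csv")     # ticker, sentiment_score

# Once every record maps onto the same identifier, a single query can
# travel up and down the content hierarchy.
linked = (
    trades
    .merge(fundamentals, on="ticker", how="left")
    .merge(transcripts, on="ticker", how="left")
)

print(linked[["ticker", "price", "revenue", "sentiment_score"]].head())
```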

Once real-time data is compiled and data analysts have access to tooling, they can unlock the value within the data. Commodity traders can predict corn futures by looking at satellite imagery of crop yields, while also considering weather data and patterns to predict a drought. Energy traders will also want to utilise Internet of Things (IoT) sensor data to gauge the level of crude oil available at a refinery, while keeping an eye on any breaking news, such as an unexpected blockage of the Suez Canal.

Equity traders can also run sentiment analysis on annual shareholder meetings to predict earnings or potential mergers and acquisitions, in addition to probing signals on Reddit message boards. By operating in the cloud, users can combine disparate sets of data and utilise machine learning to find hidden correlations, make predictions and build visualisations. In turn, parameters can be changed if the output is not what was expected, without having to compete for computing resources with other internal teams – spinning up resources as needed.
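A hedged sketch of that workflow, with hypothetical feature names, a made-up input file and scikit-learn standing in for whatever modelling stack a desk actually runs:

```python
# A sketch of combining disparate signals (satellite crop-yield estimates,
# weather, news sentiment) into one predictive model. All column and file
# names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

data = pd.read_csv("corn_signals.csv")
X = data[["satellite_yield_estimate", "rainfall_mm", "news_sentiment"]]
y = data["futures_price"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestRegressor(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Cheap to re-run with new parameters when cloud compute is spun up on
# demand rather than contended with other internal teams.
print("R^2 on held-out data:", model.score(X_test, y_test))
```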

From compliance to business to app development, modernising with the cloud

One challenging aspect of IT modernisation in any industry – especially in highly regulated ones like government, financial services, and healthcare – can be getting data securely from on-premises environments into the cloud. Cloud services can offer private connectivity between virtual private clouds (VPCs), amongst different services within the cloud, and to on-premises applications – eliminating the need to manage hardware-based routers and improving the security of data shared with cloud-based applications by keeping it off the public internet.
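On AWS, one way to realise this private connectivity is an interface VPC endpoint (AWS PrivateLink). A minimal boto3 sketch, in which every identifier is a placeholder:

```python
# A minimal sketch of private connectivity via an interface VPC endpoint,
# so traffic to a cloud service never traverses the public internet.
# All IDs below are hypothetical placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",             # hypothetical VPC
    ServiceName="com.amazonaws.us-east-1.s3",  # the service to reach privately
    SubnetIds=["subnet-0123456789abcdef0"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
    PrivateDnsEnabled=False,
)

print(response["VpcEndpoint"]["VpcEndpointId"])
```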

For example, FINRA, which regulates all registered brokers in the US, established the Consolidated Audit Trail (CAT) on AWS to ingest and monitor over 100 billion market events per day across 22 stock exchanges and 1,500 broker-dealers. This allowed FINRA to track market trading activity by following orders throughout their life cycle. After moving CAT to the cloud, FINRA saw an opportunity to further modernise the way it allowed customers to securely submit encrypted data. This benefited broker-dealers grappling with legacy systems, as well as those that already had an existing cloud presence.

On this, Hutchinson added that clients that were already storing data in the cloud also required an “easier” solution, “which provides cloud-to-cloud connectivity between users on the broker side and an endpoint on the FINRA side, so they can seamlessly and electronically secure, encrypt and report their trades at the end of each day. Moving that process to the cloud makes the customer reporting process more transparent and easier to audit.”
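The sketch below is an illustrative approximation of such a flow – encrypting an end-of-day report with a KMS-managed key while writing it to an S3 bucket – and not FINRA’s actual CAT interface; the bucket, key alias and file names are hypothetical:

```python
# A hedged sketch of an encrypted end-of-day reporting flow: a trade
# report written to an S3 endpoint with server-side KMS encryption.
# Bucket, key alias and file names are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")

with open("eod_trades_2024-01-15.csv", "rb") as report:
    s3.put_object(
        Bucket="regulator-ingest-bucket",        # hypothetical endpoint bucket
        Key="broker-dealer-1234/eod_trades_2024-01-15.csv",
        Body=report,
        ServerSideEncryption="aws:kms",          # encrypt at rest with KMS
        SSEKMSKeyId="alias/trade-reporting-key", # hypothetical key alias
    )
```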

Zilberg also commented on modernisation to make the process easier for users. “We want our application teams to have a frictionless experience that balances agility, governance, and ease of use. Permissions – ensuring you grant access to the right things to the right people – matters across a lot of workflows. It is important to get this right because you want to ensure creativity and flexibility within the rules. One thing that we have done is to use micro accounts, where each AWS account is allocated for one project, owned by a single team. That way, each application team can be granted a broad set of permissions to explore and innovate within their own accounts without impacting other teams or workloads.”
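The micro-accounts pattern Zilberg describes can be automated through AWS Organizations. A minimal sketch, with hypothetical email and naming conventions:

```python
# A minimal sketch of the 'micro accounts' pattern: one AWS account per
# project, provisioned programmatically through AWS Organizations.
# Email addresses and account names are hypothetical.
import boto3

orgs = boto3.client("organizations")

def create_project_account(project: str) -> str:
    """Provision a dedicated account for a single application team."""
    response = orgs.create_account(
        Email=f"aws+{project}@example.com",  # hypothetical root email scheme
        AccountName=f"factset-{project}",    # hypothetical naming scheme
    )
    # Account creation is asynchronous; poll the request status in practice.
    return response["CreateAccountStatus"]["Id"]

request_id = create_project_account("realtime-quotes")
print("Create-account request:", request_id)
```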

Mastering data, repurposing, and finding a logical home for data sets

Beyond regulatory requirements, Hutchinson stated that there has been increased interest in building out a firm-wide data lake strategy, or an architecture where the data sits at the centre of the capital markets firm’s environment. Tools, analytics, risk modelling and machine learning algorithms can then all be integrated within that environment.

“When customers set up a centralised data lake or warehouse, each user can get access privileges into the data set, down to the row or column that they are entitled to view and use. We have identity and access management tools that help customers document who is getting access, when they’re getting access and what they’ve accessed – all working within the cloud,” Hutchinson added.
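On AWS, this kind of column-level entitlement can be expressed as a Lake Formation grant. A minimal boto3 sketch, with hypothetical principal, database, table and column names:

```python
# A hedged sketch of column-level entitlements in a centralised data lake,
# using an AWS Lake Formation grant. The principal ARN, database, table
# and column names are all hypothetical.
import boto3

lf = boto3.client("lakeformation")

lf.grant_permissions(
    Principal={
        "DataLakePrincipalIdentifier": "arn:aws:iam::111122223333:role/equity-analysts"
    },
    Resource={
        "TableWithColumns": {
            "DatabaseName": "market_data",
            "Name": "trades",
            # Analysts see prices and volumes, but not counterparty columns.
            "ColumnNames": ["ticker", "price", "volume"],
        }
    },
    Permissions=["SELECT"],
)
```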

Reeve elaborated on how “mastering data is important.” He added: “As our clients are accelerating their digital transformation, they are trying to make sure that every piece of data has a logical home, they know how to find it and how to make it useful. Mastering data means being able to deploy data for any use case, while at the same time, improving efficiency and accelerating the path to project success.”

Tech-specific benefits and the developer dimension

Considering the specific benefits for developers, Zilberg described how, without traditional impediments or dependencies, developers no longer have to “wait for infrastructure to be built, file internal tickets to request the server, increase the server or change the configuration of the server.”

A developer’s primary focus can now be innovating quickly, unimpeded by cumbersome, low-value tasks, and removing hurdles to produce better outcomes. For this to happen, barriers to entry need to be lowered for single-developer teams, so that they too are given the opportunity to ‘master the data’.

It is only when the data is mastered that, as Reeve said, “the next dimension can be reached, which is where you develop rapid prototyping so people can succeed or fail fast and see which of their prototypes are sticky.”

In conclusion, while the cloud provides well-understood benefits for financial services organisations – including increased agility, scalability, and lower total cost of ownership (TCO) – the challenges posed by the Covid-19 pandemic have highlighted the importance of other benefits, such as compute elasticity and efficient market data distribution, which enable better real-time trading, risk management and application development.

By operating in the cloud, users can combine disparate data sets and use technologies like machine learning to analyse and derive value from that data, in a way that is much harder to replicate using on-premises data sources.

Using cloud-based data also provides greater levels of centralised access control, which makes it easier than ever for data consumers across the organisation to access the content they need as soon as they want it.

Click here to read the AWS and FactSet report on how to benefit from data in the cloud.
