With a predicted global addressable market of nearly $600 billion in 2022, hyperautomation (HA) is a key focus for many industry transformation initiatives. In this article, I’ll lay out in layman’s terms what hyperautomation is, why it matters and what foundations firms should establish to ensure the delivery of successful hyperautomation initiatives.
Hyperautomation: Erm…that’s just automation, right?
There are some unstoppable forces in nature. The sun rises and sets, tides change, and IT analyst firms will create new buzzwords. Gartner obliged when they named hyperautomation as the number one strategic technology trend in 2020. Based on Gartner’s definition, hyperautomation is an endeavour that seeks to maximise the opportunity to automate processes via a kitchen sink of technologies, from advanced analytics to business process management. It is automation, but to an extent that has never previously been possible.
Historically, the extent of automation has been limited to highly prescribed processes composed of trivial tasks that handle highly structured transactional data. Think basic rule-based transaction processing. The reason is principally that humans have struggled to tell computers how to process unstructured data, which humans inherently use to communicate and which makes up 80-90% of the world’s data.
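To make the distinction concrete, here is a minimal sketch of what that kind of rule-based processing looks like in practice. It is illustrative only: the transaction fields, rules and thresholds are invented for the example, not drawn from any real system.

```python
# Minimal sketch of traditional rule-based transaction processing.
# The fields, rules and thresholds below are hypothetical examples.
from dataclasses import dataclass


@dataclass
class Transaction:
    amount: float   # structured, numeric field
    currency: str   # structured, categorical field
    country: str    # ISO country code


def route_transaction(tx: Transaction) -> str:
    """Apply fixed, prescribed rules to decide how a transaction is handled."""
    if tx.amount > 10_000:
        return "manual_review"          # large payments escalated to a human
    if tx.currency not in {"GBP", "EUR", "USD"}:
        return "manual_review"          # unfamiliar currencies escalated too
    if tx.country in {"XX", "YY"}:      # placeholder high-risk jurisdictions
        return "compliance_check"
    return "auto_approve"               # everything else processed straight through


print(route_transaction(Transaction(amount=250.0, currency="GBP", country="GB")))
```

Every input here is already structured, so a fixed set of rules is enough; the moment a scanned form or a phone call enters the process, this approach breaks down.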
So, what’s changed?
Arguably, the biggest single enabler of hyperautomation is machine learning, which has provided a mechanism to turn unstructured data (what humans understand) into structured data (what computers understand). This has created solutions to hitherto intractable problems such as recognition of speech, handwriting and assorted vision-based tasks. Industry is keen to use the term artificial intelligence (AI) to describe these capabilities, though “intelligence” is just a poor euphemism for “specific task competence”. Regardless, task-competent automation offers an abundance of additional automation opportunities where process tasks are complex (e.g., voice identification, processing of paper-based application forms) but the process itself is prescribed and trivial (e.g., rule-based).
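As a toy illustration of that “unstructured in, structured out” idea, the sketch below trains a small text classifier that maps free-text payment descriptions to structured category labels. The training phrases and categories are made up for the example; a production system would use far richer models and data.

```python
# Toy sketch: machine learning turning unstructured text into structured labels.
# Training phrases and categories are invented purely for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

descriptions = [                       # unstructured free text
    "monthly salary payment from employer",
    "coffee shop card purchase",
    "mortgage direct debit to bank",
    "supermarket groceries contactless",
]
labels = ["income", "leisure", "housing", "groceries"]  # structured categories

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(descriptions, labels)

# New, unseen free text comes out the other side as a structured category.
print(model.predict(["card payment at espresso bar"])[0])
```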
To realise even greater opportunities, more generalised intelligence is required to automate processes that are not well prescribed. For instance, not just using AI to extract data from images, but using AI to incorporate this data into complex decision processes requiring reasoning and other higher-level cognitive functions. This requires capabilities that are much closer to higher-level human intelligence than to task-specific competence.
The size of the prize
A study by Fenergo suggested that inefficient onboarding processes could cost individual commercial banks as much as $4.5bn in revenue. Business leaders with this problem are apt to ponder what they could do if only they had more (human) resources. Hyperautomation can bring some of those thoughts to reality by increasing the proportion of a firm’s workload that can be processed using automated “human-like” capabilities. Done successfully, this creates a win-win for firms and customers, as customers can receive a superior service for a given operational cost to the firm. In terms of hard operational outcomes, a Deloitte study reported that firms exceeded their expectations, realising a 27% reduction in operational costs.
But it’s not just about operational efficiency. Being able to create “human-like” operators on demand at low operational cost means that firms can optimise their human resource allocation. By using AI to automate more complex but mundane processes, firms can re-allocate their people, with real intelligence, to the gnarlier, more interesting business challenges that AI currently can’t solve. It’s hard to put a number on the potential for this dynamic to drive revenue growth, but the same Deloitte survey put estimates at 11%. Watch this space.
Laying the right foundations
Whether a financial services firm hires 1,000 real people or deploys their hyperautomation equivalent, aside from lunch breaks and annual holiday, both resource groups need the same thing: a solid foundation of transparent processes aligned to business strategy, and timely access to accurate data. If these solid foundations are not created, there’s a good chance that anything built on them won’t stay standing for long.
In the context of operational processes, premature automation is often seen as the root of all evil, creating the shakiest of foundations for operational improvement. Without first optimising the design of the process landscape, firms run the risk of automating operational processes that are redundant, non-conformant or incoherent in the context of the firm’s business strategy. Firms that stampede towards hyperautomation may make their train go faster, but that’s not much good if the destination is unclear. For customer-facing processes, the journey may seem faster, but it will still be convoluted. The long-term costs of managing incoherent or redundant hyperautomation solutions and their associated technical debt can very quickly exceed the benefits those solutions were designed to deliver.
Firms should adopt capable process management and process mining solutions to get a firm grasp on their operational process landscape. Most importantly, this will enable them to ensure their operations are strategically aligned. In the context of hyperautomation, it will also enable evidence-based decisions about where, when, and how hyperautomation solutions should be deployed.
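For readers unfamiliar with process mining, the sketch below shows the core idea: reconstructing how a process actually flows from the event logs that systems already record. The event log, case IDs and activity names are invented for illustration; dedicated process mining tools do this at enterprise scale and add conformance checking on top.

```python
# Minimal process-mining illustration: derive the "directly-follows" relations
# of a process from an event log. The log below is entirely made up.
from collections import Counter

import pandas as pd

event_log = pd.DataFrame(
    {
        "case_id": ["A", "A", "A", "B", "B", "B", "B"],
        "activity": ["receive application", "verify identity", "approve",
                     "receive application", "verify identity",
                     "request documents", "approve"],
        "timestamp": pd.to_datetime(
            ["2024-01-01 09:00", "2024-01-01 10:00", "2024-01-02 09:00",
             "2024-01-03 09:00", "2024-01-03 11:00", "2024-01-04 09:00",
             "2024-01-05 09:00"]
        ),
    }
)

# Count how often one activity directly follows another within the same case.
transitions = Counter()
for _, case in event_log.sort_values("timestamp").groupby("case_id"):
    activities = case["activity"].tolist()
    transitions.update(zip(activities, activities[1:]))

for (src, dst), count in transitions.items():
    print(f"{src} -> {dst}: {count}")
```

The output makes the real process variants visible, including the detours and rework that firms would otherwise automate blindly.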
Feeding the machine
The big data revolution has been under way in earnest since the mid-noughties, yet it is clear that firms are still much better at accumulating data than at exploiting it to deliver better business outcomes. In surveys, somewhere between 2% and 31% of respondents considered their firms to be data-driven, and most alarmingly this proportion is decreasing in spite of significant investment in big data and AI initiatives. With firms contributing to exponentially increasing data volumes, they appear to be becoming overwhelmed. This is a major impediment to hyperautomation.
To address the challenge, firms must invest significantly in data and application integration and API management solutions to break down siloed data landscapes and create transparent information-sharing pipelines across the enterprise. Importantly, these solutions must be able to extend across the entire data footprint of the organisation: from on-premises mainframes to cloud SaaS applications and data lakes, and everything in between.
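As a simple illustration of what breaking down silos can look like at the data level, the sketch below normalises customer records from two hypothetical systems (a core banking platform and a CRM) into one canonical schema that downstream automation can consume. All system names, field names and records are invented for the example.

```python
# Illustrative only: unify records from two siloed systems into one canonical
# schema. Every field name and record here is hypothetical.
from typing import TypedDict


class CanonicalCustomer(TypedDict):
    customer_id: str
    full_name: str
    email: str
    source_system: str


def from_core_banking(record: dict) -> CanonicalCustomer:
    # The core banking export uses terse, legacy-style field names.
    return CanonicalCustomer(
        customer_id=record["CUST_NO"],
        full_name=f'{record["FIRST_NM"]} {record["LAST_NM"]}',
        email=record["EMAIL_ADDR"].lower(),
        source_system="core_banking",
    )


def from_crm(record: dict) -> CanonicalCustomer:
    # The CRM exposes nested JSON with different naming conventions.
    return CanonicalCustomer(
        customer_id=record["id"],
        full_name=record["profile"]["name"],
        email=record["profile"]["email"].lower(),
        source_system="crm",
    )


customers = [
    from_core_banking({"CUST_NO": "C-001", "FIRST_NM": "Ada", "LAST_NM": "Lovelace",
                       "EMAIL_ADDR": "ADA@EXAMPLE.COM"}),
    from_crm({"id": "C-002", "profile": {"name": "Alan Turing",
                                         "email": "alan@example.com"}}),
]
print(customers)
```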
Putting it into practice
Hyperautomation initiatives are both a source of significant challenge and reward for firms looking to improve their operational efficiencies and excellence. But before engaging in significant hyperautomation initiatives, firms must ensure the essential foundations are in place to maximise the likelihood of long-term success.