In a data-rich world, financial institutions are already struggling with fractured bank master data. Add data decay on top of that, plus the drive to migrate as many ERP (enterprise resource planning) providers approach end-of-life dates for their traditional systems, and migration becomes a herculean task.
In the face of a looming ERP migration, financial institutions are forced to address issues they have been trying to put off: understanding their level of data decay, finding the time and budget to fill data gaps, and rooting out decayed data. Deploying a bank master data strategy is imperative to revitalise systems by improving data quality and streamlining efficiency.
We spoke to Neil Tagg, sales director at LexisNexis® Risk Solutions, about the impact of decayed bank master data and how ERP migrations can be improved so that banks migrate clean bank master data and reorganise their data estates to prevent future decay.
Tagg likens bank master data to petrol in a car: it is the fuel that drives the entire system. “There are lots of different types of data required to manage an ERP, and significantly bank master data. The biggest challenge that we have seen in our own organisation is how to effectively migrate that data, and make sure that when you do migrate it, it is not decayed data.”
What is the drive to migrate ERPs?
The ERP market is expected to exceed $48.5 billion this year and reach $78.4 billion by 2026. According to Technology Evaluation, almost 50% of companies are already starting to upgrade their ERP systems or are planning to do so.
The benefits of migrating to a newer version of software lie in the new functionality available, which can provide better services from a consumer perspective. For clients, ERP migration is often necessary when software is nearing end-of-life and requires an update. Older software has limitations on features and functionality, so software giants such as SAP and Oracle are consistently working to improve and level up their ERP products.
Tagg explains: “Some ERP systems had limitations because they were on-premise, and that still exists today. But many new versions of the software have new features such as SaaS solutions on the public cloud. All of them will also have the capability to operate on private cloud, using providers such as Amazon Web Services, Google Cloud, or Azure. Customers are seeing the benefit of moving to a version where they can put data into a private cloud; not having the technical debt associated with an on-premise solution is a big attraction.”
According to Gartner, the majority of companies adopting new ERP systems are looking at cloud or hybrid delivery models.
The new functionality that comes with ERP migration includes inherent changes in how processes are managed. As the system is updated, how an entity is classified (as a customer, vendor, or business partner) dictates how it is managed. Tagg cites the example of newer versions of SAP, where the ‘customer’ and ‘vendor’ classifications have shifted to ‘business partner’, so internal processes now need to adjust accordingly.
Bank master data and discovering decayed data
Decayed data and failed payments carry a major cost: the global economy reportedly lost $118.5 billion in fees, labour, and lost business to unsuccessful transactions in 2020. According to Gartner, implementing a bank master data management strategy can help banks improve their data quality by 20% and enhance operational efficiency by 15%.
Decayed data often slips through the cracks via minor errors, particularly in manual processing.
Gartner reported that B2B data can decay at a rate of up to 70.3% per year. During onboarding, checking the details of a vendor or consumer and keying those details into the database can introduce minor mistakes that lead to larger problems. These errors create friction in transactions, which then requires costly operational repair by banks and can strain vendor relationships, says Tagg.
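To make this concrete, below is a minimal sketch of the kind of automated check that can catch keying errors before they enter a bank master record. It assumes IBAN-format account identifiers and applies the standard ISO 13616 mod-97 checksum; the function name is illustrative rather than taken from any particular product.

```python
def iban_checksum_ok(iban: str) -> bool:
    """Return True if an IBAN passes the ISO 13616 mod-97 check."""
    s = iban.replace(" ", "").upper()
    if len(s) < 5 or not s.isalnum():
        return False
    # Move the country code and check digits to the end, then map
    # letters to numbers (A=10 ... Z=35) before the mod-97 test.
    rearranged = s[4:] + s[:4]
    numeric = "".join(str(int(ch, 36)) for ch in rearranged)
    return int(numeric) % 97 == 1

# Transposing two digits fails the check, so the mistake can be caught
# at keying time instead of surfacing later as a failed payment.
print(iban_checksum_ok("GB82 WEST 1234 5698 7654 32"))  # True
print(iban_checksum_ok("GB82 WEST 1234 5698 7654 23"))  # False
```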
Factors that lead to data decay include inaccurate data mapping, poor data cleansing, unstable data governance, flawed analysis of historical data, incorrect data validation and testing, and impractical migration strategies. Automation can support better data hygiene.
Finding decayed data requires daily internal feedback: speaking to accounts payable, payment operations, master data management, and treasury teams to weed out any transaction friction. Tagg emphasises that constant internal communication is vital here.
Third-party vendors can support the migration process
Third-party companies can help with transaction friction and with finding decayed data. Banks can export their entire bank master database table to a vendor that checks it against its own database and flags errors or missing records.
However, the most cost-effective and efficient way to manage ERP migrations and remove decayed data is to have a third-party provider check the data in an automated fashion.
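As a hedged sketch of what that automated check can look like in practice, the example below compares an exported bank master table row by row against a vendor's reference dataset, flagging mismatches and missing records for review. The file names, key column, and compared fields here are illustrative assumptions, not any specific vendor's schema.

```python
import csv

def load_records(path: str, key: str) -> dict:
    """Load a CSV export into a dict keyed on a unique identifier."""
    with open(path, newline="") as f:
        return {row[key]: row for row in csv.DictReader(f)}

def reconcile(master_path: str, reference_path: str) -> list:
    """Flag bank master rows missing from, or disagreeing with, the reference data."""
    master = load_records(master_path, key="bank_key")
    reference = load_records(reference_path, key="bank_key")
    findings = []
    for bank_key, row in master.items():
        ref = reference.get(bank_key)
        if ref is None:
            findings.append((bank_key, "missing from reference"))
            continue
        for field in ("swift_bic", "routing_number", "branch_name"):
            if row.get(field, "").strip().upper() != ref.get(field, "").strip().upper():
                findings.append((bank_key, f"mismatch in {field}"))
    return findings

for bank_key, issue in reconcile("bank_master_export.csv", "vendor_reference.csv"):
    print(bank_key, issue)
```

In a real integration the comparison would typically run on the provider's side on an ongoing basis rather than as a one-off file export, but the flag-and-review loop is the same.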
In a 2024 report that surveyed 225 companies on their ERP migrations, nearly 40% reported that ‘functionality’ was their primary consideration when upgrading and choosing a vendor.
It is more effective to start with a clean slate and carry no decayed data over when migrating to a new version of software or to the cloud. A third-party vendor can offer automated updates that pre-populate the bank master database on an ongoing basis.
It can be a challenge to find an opportune time to reorganise your data, and an upgrade provides one, says Tagg. “You can start the new database table from scratch and populate your database without having to migrate anything over, and that will remove the whole headache of understanding how decayed the data was.”
To successfully migrate bank master data, organisations need to leverage database tables, link records, and consider the new capabilities that come with migration to newer versions. Onboarding web portals can pre-validate details as they are keyed in, using technologies such as APIs to ensure clean transfer of data and to refresh that data on an ongoing basis. Removing manual effort from data processing tasks will keep databases well maintained going forward.
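As an illustration of the pre-validation pattern described above, the sketch below shows an onboarding step calling an external validation API before a record reaches the bank master table. The endpoint URL, field names, and response shape are hypothetical stand-ins, not any particular provider's API.

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical validation endpoint; a real integration would use a
# provider's documented API and authentication scheme.
VALIDATION_URL = "https://api.example-validator.com/v1/bank-details/verify"

def prevalidate_bank_details(sort_code: str, account_number: str) -> dict:
    """Ask the validation service to confirm details before they are saved."""
    resp = requests.post(
        VALIDATION_URL,
        json={"sort_code": sort_code, "account_number": account_number},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

def save_to_bank_master(record: dict) -> None:
    """Stub for persisting a validated record to the bank master table."""
    print("saved:", record)

def onboard_vendor(form: dict) -> None:
    result = prevalidate_bank_details(form["sort_code"], form["account_number"])
    if not result.get("valid", False):
        # Reject at the point of entry rather than letting a bad record
        # decay quietly in the master table.
        raise ValueError(f"Bank details failed validation: {result.get('reason', 'unknown')}")
    save_to_bank_master(form)
```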