Finastra has published a five-point plan to tackle gender bias in algorithms.
The use of algorithms in financial services has come under increasing regulatory scrutiny, amid concerns that unconscious biases can manifest in applications built to guide crucial consumer decisions on credit scoring, rate-setting and insurance.
Financial services providers increasingly rely on algorithms to decide how to price a policy, or whether to extend credit and on what terms.
US regulators opened a probe into the algorithm used to determine the creditworthiness of Apple Card applicants in 2019, after a man took to Twitter to call it "sexist" for giving him a credit limit 20 times higher than his wife's.
To understand the scale of the problem, Finastra commissioned KPMG to produce a report listing 12 common biases for organisations to be aware of when designing, building and putting algorithms into production. These include biases specific to data, algorithmic design and human involvement.
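As an illustrative sketch only, and not a description of Finastra's or KPMG's methodology, the kind of data bias the report warns about can be surfaced with a simple disparity check on historical lending decisions; the column names and tolerance threshold below are assumptions for the example.

import pandas as pd

# Hypothetical historical lending decisions of the sort used to train a scoring model.
decisions = pd.DataFrame({
    "gender":   ["F", "F", "F", "F", "M", "M", "M", "M"],
    "approved": [0,    1,   0,   1,   1,   1,   0,   1],
})

# Approval rate per group; a large gap suggests the training data itself
# encodes a bias that a model trained on it is likely to reproduce.
rates = decisions.groupby("gender")["approved"].mean()
gap = rates.max() - rates.min()

print(rates)
print(f"Approval-rate gap between groups: {gap:.2f}")
if gap > 0.2:  # assumed tolerance for this sketch
    print("Warning: disparity exceeds tolerance; review the data and model.")

Checks like this only catch disparities in outcomes; the report's other bias categories, such as algorithmic design and human involvement, require review of how the model and its surrounding processes are built, not just the data.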
Simon Paris, CEO at Finastra, says: “Without this being a priority in the financial industry, AI will become a flywheel that will accelerate the negative impact on human lives. Finastra doesn’t have all the answers but we believe that the industry must first acknowledge that there is a problem with algorithmic bias.”
For its part, Finastra intends to reform its developer agreement, updating terms and conditions for FusionFabric.cloud, its open platform and marketplace for developers. Under the new terms, developers using the platform will be expected to account for algorithmic bias, and Finastra will have the right to inspect any new application for such bias.
The vendor is also in the proof-of-concept stage of a new product, FinEqual, which will provide banks with a technological basis for tackling algorithmic bias within their own businesses. The firm aims to make FinEqual commercially available in the next 18 months, says Paris.