
Australia's Asic highlights governance lag in AI adoption

The Australian Securities and Investments Commission (Asic) is urging financial services and credit licensees to ensure their governance practices keep pace with their accelerating adoption of artificial intelligence.




The call comes as Asic’s first state-of-the-market review of AI use and adoption by 23 licensees found that governance arrangements risk lagging behind AI adoption, even though current AI use remains relatively cautious.

Asic Chair Joe Longo says making sure governance frameworks are updated for the planned use of AI is crucial to licensees meeting future challenges posed by the technology.

"Our review shows AI use by the licensees has to date focussed predominantly on supporting human decisions and improving efficiencies. However, the volume of AI use is accelerating rapidly, with around 60% of licensees intending to ramp up AI usage, which could change the way AI impacts consumers," he says.

Asic’s findings revealed nearly half of licensees did not have policies in place that considered consumer fairness or bias, and even fewer had policies governing the disclosure of AI use to consumers.

Says Longo: "It is clear that work needs to be done — and quickly — to ensure governance is adequate for the potential surge in consumer-facing AI.

"Without appropriate governance, we risk seeing misinformation, unintended discrimination or bias, manipulation of consumer sentiment and data security and privacy failures, all of which has the potential to cause consumer harm and damage to market confidence."

Longo says licensees must consider their existing obligations and duties for consumer protection when it comes to the deployment of AI and avoid simply waiting for AI laws and regulations to be introduced.

"Where there is misconduct, Asic will take enforcement action if appropriate and where necessary," he states.

The governance lag is not confined to Australia. A survey of 200 US compliance professionals conducted by ACA Aponix and the National Society of Compliance Professionals found that only 32% of respondents have established an AI committee or governance group, only 12% of those using AI have adopted an AI risk management framework, and just 18% have established a formal testing programme for AI tools.

Furthermore, most respondents (92%) have yet to adopt policies and procedures to govern AI use by third parties or service providers, leaving firms vulnerable to cybersecurity, privacy, and operational risks across their third-party networks.

“We’re seeing widespread interest in using AI across the financial sector, yet there’s a clear disconnect when it comes to establishing the necessary safeguards,” said Lisa Crossley, Executive Director, NSCP. “Our survey shows that while many firms recognize the potential of AI, they lack the frameworks to manage it responsibly. This gap not only exposes firms to regulatory scrutiny, but also underscores the importance of building robust AI governance protocols as usage continues to grow.”

Discover the new challenges and opportunities artificial intelligence brings to the banking sector at Finextra's first NextGenAI conference on November 26 2024.


