The Australian Competition and Consumer Commission (ACCC) is increasing its scrutiny of artificial intelligence as insurers and other financial services firms expand their use of emerging technologies, including agentic AI.
In an industry snapshot released this year, the competition regulator notes that AI-enabled products and services are being deployed across the economy, with implications for competition and consumer outcomes. “AI-enabled products and services are growing more and more important to consumers and businesses across Australia. New developments have the potential to transform how Australians work, communicate, and engage with digital services. However, they also come with risks of potential harms to consumers and competition,” ACCC chair Gina Cass-Gottlieb said. The snapshot tracks developments in generative and agentic AI since the ACCC’s March 2025 Final Report of the Digital Platform Services Inquiry and restates the commission’s support for a formal monitoring role for emerging digital technologies under the federal government’s proposed digital competition regime.
According to the ACCC, AI technologies and markets are changing rapidly, with advances in foundation models, generative tools, and AI agents. As these systems are integrated into large digital platforms, the regulator is examining how they may affect market structure, data control, and user choice. “Our snapshot has outlined increasing interconnections between AI offerings and existing digital platform services, often supplied by tech giants, as AI technology matures. While these integrations can improve user experience, they may also have negative implications by raising barriers to entry or expansion, and limiting consumers’ ability and willingness to switch service providers,” Cass-Gottlieb said.
The snapshot identifies agentic AI – including AI agents capable of carrying out tasks or making recommendations with a degree of autonomy – as an area of focus. “Use of agentic AI has the potential to impact how users deal with businesses online, or use digital platform services such as searching the internet. Their use may also give rise to new risks, such as the possibility of AI agents colluding, even where this is not expressly intended or programmed by human creators,” Cass-Gottlieb said. For insurers, this intersects with growing reliance on third-party technology providers, cloud infrastructure, and data platforms to support underwriting, pricing, distribution, and claims management, and with potential concentration around a small number of AI vendors.
Alongside competition concerns, the ACCC snapshot outlines several consumer risk areas linked to greater AI adoption. These include extensive collection and use of personal data, AI-generated misrepresentations in marketing and product information, and the use of generative tools to create fake reviews or more convincing scam content. “The integration of AI into various digital products and services is already delivering benefits to Australian consumers, including by enabling new app functionalities and simplifying some tasks. However, AI also has the potential to amplify existing consumer risks relating to how businesses communicate with consumers, whether consumers are well-informed about businesses’ use of their data, and risks posed by scammers,” Cass-Gottlieb said.
Research commissioned for the Digital Platform Services Inquiry found that 83% of surveyed Australian consumers believe companies should seek consent before using personal data to train AI models. Nonetheless, the ACCC said large amounts of consumer data are already used to train AI systems, often without clear awareness by users, in part because of lengthy and complex terms of service and privacy policies.
The regulator also cites cases in which generative AI is used to support “false representations about the performance or characteristics of a product or service.” Cass-Gottlieb referred to “ghost websites, which misrepresent themselves as local businesses,” and online product listings that may use AI-generated material “to make products appear more sophisticated, or of a higher quality, than they actually are.” She added that AI may generate “large volumes of fake reviews” that are harder to identify, and that it is “increasingly being used by scammers to facilitate and enhance online scam activity.” For insurers, these developments are relevant to fraud detection, claims assessment, and cyber risk modelling, and may influence demand for covers addressing scams, misrepresentation, and digital misconduct, as well as expectations around consumer protection and disclosure.
The ACCC’s attention to agentic AI comes at a time when insurance organisations are testing these tools inside their businesses, according to a separate report from the Capgemini Research Institute. Capgemini’s study estimates that agentic AI could generate up to US$450 billion in economic value by 2028. It finds that 20% of insurance organisations are piloting AI agent use cases and 12% have implemented the technology partially or at scale, while only 2% have fully scaled AI agent deployments. The research also indicates that insurers are exploring the use of AI agents in underwriting, claims processing, and customer service, including for triage, decision support, and workflow management. Although 93% of business leaders surveyed believe that scaling AI agents within the next year would provide a competitive advantage, only 4% of insurance organisations fully trust AI agents.
The AI snapshot forms part of the ACCC’s wider five-year inquiry into digital platform services, launched in 2020. In its fifth interim report, the commission recommended service-specific mandatory codes of conduct for designated digital platforms to address competition and consumer risks in the digital economy. In December 2023, the federal government accepted the ACCC’s conclusion that existing competition law is not sufficient on its own to deal with current and emerging harms, and supported in principle a new digital competition regime. Public consultation on the framework began in December 2024.
The March 2025 Final Report recommended that the ACCC retain a monitoring function for emerging digital technologies under the new regime and that the government adopt a whole-of-government approach to digital platform regulation, with the Digital Platform Regulators Forum endorsed as a permanent body for information sharing and coordination. “The pace of continued changes since the ACCC provided the Australian government with the Final Report of the Digital Platform Services Inquiry in March this year underscores the importance of regulators and governments continuing to monitor changing digital technologies,” Cass-Gottlieb said.