ACCC and AI: Australian Consumer Law in the Age of Chatbots
The ACCC has been watching AI closely. As AI-powered chatbots, recommendation engines, and automated customer service tools become standard in Australian business, the consumer protection questions are sharpening. Here’s where the lines are, and what Australian small businesses need to watch.
Misleading conduct is misleading conduct
The Australian Consumer Law prohibits misleading or deceptive conduct in trade or commerce. The ACCC’s position is clear: that prohibition applies whether the misleading conduct is produced by a human or an AI system. If your AI chatbot tells customers things that are false, whether about product features, pricing, availability, or anything else, you’re potentially liable under the ACL, regardless of whether you knew the AI was wrong.
The ACCC has flagged this explicitly in its Digital Platform Services inquiry and in statements on AI. The fact that an AI system generated the misleading statement doesn’t transfer responsibility away from the business that deployed it.
AI-generated claims about your products
If you use AI to generate product descriptions, marketing copy, or customer communications, and that content contains false or misleading claims, you’re exposed. AI models sometimes hallucinate, generating confident-sounding claims that have no factual basis. A product description that says your supplement “has been clinically proven” to do something it hasn’t, because the AI generated that phrasing, is your problem to fix and your liability if it causes harm.
The ACCC has been particularly active on health claims, financial product claims, and environmental claims, all areas where AI tools are prone to generating impressive-sounding but inaccurate statements. Review AI-generated content carefully before publishing, especially in these categories.
Consumer guarantees still apply
If you sell an AI-powered service to consumers, the statutory consumer guarantees under the ACL apply. The service must be fit for purpose, provided with due care and skill, and match any description you’ve given of it. An AI customer service tool that gives wrong information about your return policy, or an AI recommendation engine that fails to work as described, may breach these guarantees, exposing you to remedies including refunds and compensation.
What this means for your business
Four practical steps to reduce your ACCC exposure:
- Review AI-generated customer-facing content before it goes live, particularly any claims about product features, health benefits, financial returns, or environmental credentials.
- Test your AI tools regularly: chatbots drift, models update, and an AI that was accurate six months ago may not be accurate now.
- Have a correction process: if your AI gives a customer wrong information, how do you find out and fix it? Build that feedback loop.
- Don’t oversell AI capabilities: if you’re marketing a product or service as AI-powered and the AI component doesn’t deliver on your claims, that’s a potential ACL issue on its own.
Sources
- ACCC. Digital Platform Services Inquiry
- ACCC. Consumer Guarantees under Australian Consumer Law
- Competition and Consumer Act 2010 (Cth), Schedule 2 (Australian Consumer Law), s 18: Misleading or Deceptive Conduct