KPMG partner fined AU$10,000 for using AI to cheat on internal AI training exam


A KPMG partner in Australia was fined AU$10,000 for using artificial intelligence to cheat during an internal training course on AI. The unnamed partner uploaded training materials to an AI platform to help generate responses. More than two dozen staff at KPMG Australia have been caught cheating on internal exams since July, raising concerns about AI misuse in professional settings across major consultancy and accounting firms.


KPMG Partner Fined for Using AI to Pass AI Training Test

A partner at consultancy giant KPMG has been fined AU$10,000 (approximately $7,084 or £5,195) after using artificial intelligence to cheat during an internal training course designed to teach employees about AI [1][2]. The unnamed partner uploaded training materials to an AI platform to help generate responses to exam questions, according to the Australian Financial Review [1]. The incident highlights growing tensions around AI adoption in professional environments, where firms simultaneously mandate AI use while struggling to prevent AI-fueled academic misconduct.

Over Two Dozen Cases of AI Cheating on Internal Exams

The fined partner represents just one case in a broader pattern of AI cheating at the firm's Australian operations. More than two dozen KPMG Australia staff have been caught using AI tools to cheat on internal exams since July, the company confirmed [2]. The consultancy discovered the cheating using its own AI detection tools, underscoring the cat-and-mouse dynamic emerging between AI misuse in professional settings and efforts to detect it [2]. The case was flagged during an Australian Senate inquiry into governance, where Senator Barbara Pocock of the Greens party expressed disappointment with the fine, calling the system "toothless" [1].

KPMG's History with Professional Misconduct and Integrity Issues

This isn't KPMG's first brush with cheating scandals. In 2021, KPMG Australia was fined AU$615,000 over "widespread" professional misconduct after more than 1,100 partners were found to have been involved in "improper answer-sharing" on tests designed to assess skill and integrity [2]. The latest incidents involving AI introduce new complexities to maintaining integrity in internal training programs. Andrew Yates, chief executive of KPMG Australia, acknowledged the firm is "grappling" with AI's impact on internal training and testing, stating: "It's a very hard thing to get on top of given how quickly society has embraced it" [1][2].

The Paradox of Mandating AI While Punishing Its Use

The irony hasn't escaped observers: firms like KPMG and PricewaterhouseCoopers are mandating that staff use AI at work to boost profits and cut costs, while simultaneously punishing employees for using AI on internal tests [2]. KPMG partners will reportedly be assessed on their ability to use AI tools during their 2026 performance reviews, with the firm's global AI workforce lead, Niale Cleobury, stating: "We all have a responsibility to be bringing AI to all of our work" [2]. Some commenters on LinkedIn questioned whether this represents a training problem rather than a cheating problem, suggesting firms need to redesign how they approach education in an AI-enabled world.

Broader Pattern of AI Hallucinations Plaguing Accounting Firms

KPMG isn't alone in struggling with AI-related challenges. Rival Deloitte Australia had to refund a significant portion of its fees for a report containing AI hallucinations, including fabricated quotes from court rulings and non-existent academic research [1]. Beyond accounting firms, the issue extends to public institutions. West Midlands police chief constable Craig Guildford was prompted into retirement after his force relied on Copilot when weighing whether to block Israeli football fans from an Aston Villa v Maccabi Tel Aviv match [1]. The AI-generated research cited concerns about disruption at a match between Maccabi Tel Aviv and West Ham, a fixture that never took place [1].

What This Means for Professional Standards and AI Adoption

The UK's largest accounting body, the Association of Chartered Certified Accountants (ACCA), announced in December that it would require accounting students to take exams in person because AI tools had reached a "tipping point" where their use was outpacing safeguards against cheating [2]. ACCA chief executive Helen Brand said AI use was moving faster than the association's ability to implement protective measures. KPMG has stated it will keep track of how many workers misuse the technology and is looking at ways to strengthen its approach under the current self-reporting regime [2]. As firms navigate the tension between encouraging AI adoption and maintaining professional standards, the question remains whether traditional testing and training methods can survive in an AI-saturated environment.
