2 Sources
[1]
KPMG partner in Oz turned to AI to pass an exam on... AI
Unnamed consultant - one of two dozen cases at the company's Australian arm - now nursing a fine AIpocalypse

A partner at accounting and consultancy giant KPMG in Australia was forced to cough up an AU$10k ($7,084 / £5,195) fine after he used AI to ace an internal training course on... AI. Faced with questions on the use of AI, the unnamed partner uploaded training materials to an AI platform to help generate his responses, according to a report in the FT.

The partner was just one of more than two dozen staff who have been rumbled for using AI tooling while taking internal exams, the consulting giant confirmed. The unnamed KPMG partner's attempted use of AI on an internal test was first reported by the Australian Financial Review, and was flagged up in an Australian senate inquiry into governance.

Barbara Pocock, a senator with Australia's Greens party, referred to a "misdemeanor" at the firm, and said she was disappointed with the fine. "We've got a toothless system where con artists... get away with so much," she told a parliamentary committee last week.

KPMG Australia chief exec Andrew Yates was quoted as saying the firm was "grappling" with the impact of AI when it comes to internal training and testing. "It's a very hard thing to get on top of given how quickly society has embraced it," he said. A KPMG spokesperson confirmed the FT story and told The Register the "two dozen" figure was confined to Australia. No similar reports have emerged outside the antipodes, apparently.

The accounting and consulting biz is not alone in its struggle. Rival Deloitte Australia had to refund a large chunk of the fee for a report it produced for the Australian government last year. The report was littered with AI hallucinations, including imagined quotes from a court ruling and non-existent academic research.
In the UK, West Midlands' chief constable Craig Guildford was prompted into retirement earlier this year after it emerged his force relied on Copilot when considering whether to block Israeli football fans from an Aston Villa - Maccabi Tel Aviv match. The research threw up concerns about disruption at a match between Maccabi Tel Aviv and West Ham. The only problem: no such match ever occurred.

Guildford's acting successor, Scott Green, said the matter is being investigated by the IOPC and the force's own professional standards department. In the meantime, he said, Copilot had been switched off across the force. For now, at least. ®
[2]
KPMG partner fined for using artificial intelligence to cheat in AI training test
Firm says person fined A$10,000 is one of over two dozen staff in Australia caught using AI in exams since July

A partner at the consultancy KPMG has been fined for using artificial intelligence to cheat during an internal training course on AI. The unnamed partner was fined A$10,000 (£5,200) for using the technology to cheat, one of a number of staff reportedly using the tactic.

More than two dozen KPMG Australia staff have been caught using AI tools to cheat on internal exams since July, the company said, increasing concerns over AI-fuelled cheating in accountancy firms. The consultancy used its own AI detection tools to discover the cheating, according to the Australian Financial Review, which first reported on it.

The big four accountancy firms have grappled with cheating scandals in recent years. In 2021, KPMG Australia was fined A$615,000 over "widespread" misconduct, after it was found that more than 1,100 partners had been involved in "improper answer-sharing" on tests designed to assess skill and integrity. But AI tools have introduced new possibilities for rule-breaking.

In December, the UK's largest accounting body, the Association of Chartered Certified Accountants (ACCA), said it would require accounting students to take exams in person, because otherwise it was too difficult to stop AI cheating. Helen Brand, the chief executive of the ACCA, said at the time that AI tools had led to a "tipping point", as their use was outpacing the safeguards against cheating put in place by the association.

Firms such as KPMG and PricewaterhouseCoopers have also been mandating that their staff use AI at work, reportedly in an effort to boost profits and cut costs. KPMG partners will reportedly be assessed on their ability to use AI tools during their 2026 performance reviews, with the firm's global AI workforce lead, Niale Cleobury, saying: "We all have a responsibility to be bringing AI to all of our work."
Some commenters on LinkedIn noted the irony of using AI to cheat in AI training. KPMG is "fighting AI adoption instead of redesigning how they train people. This is not a cheating problem - if we look at the new world order. This is a training problem," wrote Iwo Szapar, the creator of a platform that ranks organisations' "AI maturity".

KPMG said it had adopted measures to identify the use of AI by its staff and would keep track of how many of its workers misused the technology. Andrew Yates, the chief executive of KPMG Australia, said: "Like most organisations, we have been grappling with the role and use of AI as it relates to internal training and testing. It's a very hard thing to get on top of given how quickly society has embraced it.

"Given the everyday use of these tools, some people breach our policy. We take it seriously when they do. We are also looking at ways to strengthen our approach in the current self-reporting regime."
A KPMG partner in Australia was fined AU$10,000 for using artificial intelligence to cheat during an internal training course on AI. The unnamed partner uploaded training materials to an AI platform to help generate responses. More than two dozen staff at KPMG Australia have been caught cheating on internal exams since July, raising concerns about AI misuse in professional settings across major consultancy and accounting firms.

A partner at consultancy giant KPMG has been fined AU$10,000 (approximately $7,084 or £5,195) after using artificial intelligence to cheat during an internal training course designed to teach employees about AI [1][2]. The unnamed partner uploaded training materials to an AI platform to help generate responses to exam questions, according to the Australian Financial Review [1]. The incident highlights growing tensions around AI adoption in professional environments, where firms simultaneously mandate AI use while struggling to prevent AI-fuelled academic misconduct.

The fined partner represents just one case in a broader pattern of AI cheating at the firm's Australian operations. More than two dozen KPMG Australia staff have been caught using AI tools to cheat on internal exams since July, the company confirmed [2]. The consultancy discovered the cheating using its own AI detection tools, underscoring the cat-and-mouse dynamic emerging between AI misuse in professional settings and efforts to detect it [2]. The case was flagged during an Australian senate inquiry into governance, where Senator Barbara Pocock of the Greens party expressed disappointment with the fine, calling the system "toothless" [1].

This isn't KPMG's first brush with cheating scandals. In 2021, KPMG Australia was fined AU$615,000 over "widespread" professional misconduct after more than 1,100 partners were found to have been involved in "improper answer-sharing" on tests designed to assess skill and integrity [2]. The latest incidents involving AI introduce new complexities to maintaining integrity in internal training programmes. Andrew Yates, chief executive of KPMG Australia, acknowledged the firm is "grappling" with AI's impact on internal training and testing, stating: "It's a very hard thing to get on top of given how quickly society has embraced it" [1][2].

The irony hasn't escaped observers: firms such as KPMG and PricewaterhouseCoopers are mandating that staff use AI at work to boost profits and cut costs, while simultaneously punishing employees for using AI on internal tests [2]. KPMG partners will reportedly be assessed on their ability to use AI tools during their 2026 performance reviews, with the firm's global AI workforce lead, Niale Cleobury, stating: "We all have a responsibility to be bringing AI to all of our work" [2]. Some commenters on LinkedIn questioned whether this represents a training problem rather than a cheating problem, suggesting firms need to redesign how they approach education in an AI-enabled world.

KPMG isn't alone in struggling with AI-related challenges. Rival Deloitte Australia had to refund a significant portion of its fee for a government report containing AI hallucinations, including imagined quotes from court rulings and non-existent academic research [1]. Beyond accounting firms, the issue extends to public institutions: West Midlands police chief constable Craig Guildford was prompted into retirement after his force relied on Copilot when considering whether to block Israeli football fans from an Aston Villa - Maccabi Tel Aviv match [1]. The AI research cited concerns about disruption at a match between Maccabi Tel Aviv and West Ham, a match that never occurred [1].

The UK's largest accounting body, the Association of Chartered Certified Accountants (ACCA), announced in December that it would require accounting students to take exams in person because AI tools had reached a "tipping point" where their use was outpacing safeguards against cheating [2]. ACCA chief executive Helen Brand said AI use was moving faster than the association's ability to implement protective measures. KPMG has said it will keep track of how many workers misuse the technology and is looking at ways to strengthen its approach under the current self-reporting regime [2]. As firms navigate the tension between encouraging AI adoption and maintaining professional standards, the question remains whether traditional testing and training methods can survive in an AI-saturated environment.