5 Sources
[1]
The Productivity Commission is floating AI copyright exemptions - with worrying implications for Australian authors and publishers
Australian National University provides funding as a member of The Conversation AU.

In an interim report released overnight, Harnessing data and digital technology, the Productivity Commission has floated a text and data mining exception for the Australian Copyright Act. This would make it legal to train artificial intelligence large language models, such as ChatGPT, on copyrighted Australian work. AI training would be added to the list of "fair dealing" exceptions already existing in the Copyright Act.

Why? The Productivity Commission estimates a potential A$116 billion could flow into the Australian economy over ten years, thanks to AI.

Of course, this comes after large language models have already "trained" on masses of Australian copyright material, breaching copyright law. In March, many Australian authors were outraged to find their works included in a dataset of pirated books used by Meta to train its AI systems (including books by former prime ministers John Howard and Julia Gillard).

Writers, publishers and their industry bodies oppose any such exception - which would "preference the interests of multinational technology companies at the expense of our own creative industries", according to the Copyright Agency. And this isn't the first time the Productivity Commission has proposed changes that would harm Australian publishers.

Copyright is how authors earn a living

Sophie Cunningham, a writer, former book publisher and chair of the Australian Society of Authors (ASA), pointed out most writers "don't receive wages but they do have copyright". The Australian Publishers Association is "deeply concerned" by the exception, and the ASA opposes it too. ASA CEO Lucy Hayward says,

A text and data mining exception would give tech companies a free pass to use [authors'] work to train artificial intelligence models - and profit from it - while Australian creators get nothing.
On average, Australian writers earn around $18,500 per year from their writing practice. A recent study found that they are overwhelmingly opposed to their work being used to train AI models.

Stephen King, one of two commissioners leading the inquiry, said:

The obvious harm is that an AI company may use copyright materials without providing appropriate compensation. On the other side, we want the development of AI-specific tools that use that copyrighted material.

The report claims the provision "would not be a 'blank cheque' for all copyrighted materials to be used as inputs into all AI models". But creating greater leeway in Australian laws can be read as tacitly endorsing currently unlawful practices.

Imagine grabbing the keys for a rental car and just driving around for a while without paying to hire it or filling in any paperwork. Then imagine that instead of being prosecuted for breaking the law, the government changed the law to make driving around in a rental car legal. This gives you an idea of what is being proposed.

Unproductive suggestions

This is not the first time the Productivity Commission has shown little regard for the local publishing industry. In 2009, it recommended the government remove parallel importation restrictions - a regulation that says if a local company publishes a book, no foreign editions of the same book can be sold here for the following 90 days.

Local publishers at the time argued removing the restriction would put them at a disadvantage compared to overseas publishers. While it might result in some cheaper books, it would also remove a major revenue stream for local publishers, whose local editions of bestsellers underwrite the rest of their local publishing program.

When the argument resurfaced in 2016, author Richard Flanagan said, "The Productivity Commission is like a deranged hairdresser insisting their client wears a mullet wig." The benefits of local editions of foreign titles for local publishers have long been clear.
Text Publishing produced local editions of Barack Obama's memoirs in the year he ran for his first presidency and first published Elena Ferrante in Australia. Scribe's edition of Canadian neuroscientist Norman Doidge's The Brain That Changes Itself sold over 100,000 copies in Australia and New Zealand within a couple of years, helping fund their local publishing program.

In 1995, when the commission was called the Industry Commission, it recommended the end of the Book Bounty, a subsidy that supported local printing of books. The Industry Commission's argument was that overseas printing was often cheaper and the model was outdated.

Since then, Australian printing has effectively dwindled to just two printers: if you look on the imprint page of most Australian books, you will see either Griffin or McPherson's - if they were printed locally. So much for productivity - as seen with Coles and Woolworths, a duopoly risks less competition and higher prices.

Similarly, the Industry Commission did not give much regard to the high transaction costs of overseas printing. Overseas printing adds several months to the production schedule, meaning local books now take longer to publish. Without a competitive local printing industry, and with margins pushing printing offshore where it is cheaper, it's possible some time-sensitive books won't be published at all.

Australia can lead, not follow

Australians may be accustomed to thinking of ourselves as small players on the international stage, but we are the 13th largest economy in the world. Our actions set precedents that other countries follow. Making sweetheart copyright deals here could lead to other countries copying our legislative choices.

Australia is increasingly recognising the importance of Indigenous Cultural and Intellectual Property - and some of our institutions are working to develop best practice. Australia can be a leader in this space, not a follower.
We can choose to hand over the keys - or we can signal, locally and around the world, that we value our cultural products and creators.
[2]
AI companies want copyright exemption, but the arts minister says there are 'no plans' to weaken these laws. What's going on?
Macquarie University provides funding as a member of The Conversation AU.

"We have copyright laws," said arts minister Tony Burke last week. "We have no plans, no intention, no appetite to be weakening those copyright laws based on this draft report that's floating around."

He was referring to the Productivity Commission's controversial floating of a text and data mining exception to the Australian Copyright Act, which would make it legal to train artificial intelligence (AI) large language models, such as ChatGPT, on copyrighted Australian work.

In the internet age, all that is solid melts into data, easily copied and distributed instantly across the internet. This includes the work of authors, songwriters and artists, ostensibly protected by the law of copyright.

"The rampant opportunism of big tech aiming to pillage other people's work for their own profit is galling and shameful," songwriter, author and former arts minister Peter Garrett told the Australian last week, in response to the Productivity Commission's report. He urged the federal government to urgently strengthen copyright laws to help preserve cultural sovereignty and our valuable intellectual property in the face of powerful corporate forces who want to strip mine it and pay nothing.

I have researched how piracy, illegal streaming and remix culture violate these rights. Somehow, authors and artists have survived over 25 years of it. But AI poses a new threat.

AI companies pushing against copyright

"You can't be expected to have a successful AI program when every single article, book, or anything else that you've read or studied, you're supposed to pay for," United States president Donald Trump said at an AI Summit last month, launching his government's AI Action Plan.
On July 23, he signed a trio of executive orders, including one on preventing "woke" AI in the US government, one on deregulating AI development (including removing environmental protections that could hamper the construction of data centres) and another on promoting the export of American AI technology.

Major AI companies, including Google and Microsoft, have put the copyright exemption argument to the Australian government. Australian tech billionaire Scott Farquhar, co-founder of software company Atlassian and chair of the Tech Council of Australia, said in a National Press Club address on 30 July that our "outdated" copyright laws are a barrier for AI companies wanting to train or host their models here. He explicitly called for a text and data mining exception like the one the Productivity Commission is floating.

What is an author?

In the early 19th century, English poet and literary critic Samuel Taylor Coleridge defined the poet - or creative author - as an elevated representative of the human race: a cultural hero worthy of the highest respect. The genius author was a divinely inspired creator of completely original works. And the best way to understand the meaning of a work was to determine the author's intention in creating it.

Tech companies promise their advanced AI systems are capable of creating new creative works. But for most of us, we read to find truths about humanity, or reflections of it. "The interaction in your own mind, is very much with the author," reflected arts minister Burke last week.

Roland Barthes, in his 1968 essay The Death of the Author, argued language itself, in its constant flux and change, is what generates new work. He proposed a new model for the author: a scriptor, or copyist, who mixes writings, none of them original, so a new work is a "tissue of quotations" drawn from the "immense dictionary" of language. Ironically, in this way, Barthes predicted AI.
A shift in wealth

Established in the 18th century, copyright enabled an author, songwriter or artist to make a living from royalties. It was meant to protect authors from illegal copying of their works.

The 21st century marked a massive shift in the wealth generated by creativity. Authors, and especially songwriters, suffered an enormous loss of revenue in the period 2000-2015, due to online piracy. In 1999, global revenue for the music industry was US$39 billion; in 2014, that figure fell to $15 billion.

At the same time, the owners of online platforms and big tech companies - who benefited from clicks to pirate sites offering works stolen from artists - enjoyed soaring profits. Google's annual revenue jumped from US$0.4 billion in 2002 to $74.5 billion in 2015.

Several author and publisher lawsuits are underway regarding the unauthorised use of books to train large language models. In June, a US federal judge ruled Anthropic did not breach copyright in using books to train its model, comparing the process to a "reader aspiring to be a writer".

Copyright law reformers in the US have proposed that because all creativity is algorithmic, involving the novel combination of words, images or sounds, an AI model, which uses algorithms to generate new works, should have its works protected by copyright. This would raise AI-as-author to the same legal status as human authors. These reformers believe rejecting this argument betrays an anthropocentric or "speciesist" bias. If AI models were legally accepted as authors, this would represent another blow to the esteem accorded human authors.

'Erasure disguised as efficiency'

An AI model may be capable of generating works of fiction, but these works will be poor imitations of human creativity. The missing element: emotion.
Works of fiction come from an author's lifetime of experiences, from joy to grief, and the work engages readers or viewers on an emotional level. The emotional capacity of an AI model: zero.

I have seen some atrocious AI-generated non-fiction books for sale on Amazon. The tell-tale signs are: no author byline, and sentences such as: "Since my dataset has not been updated since 2023, I cannot provide information past that date." If someone buys those books, the publisher and Amazon will profit. No royalties will be paid to an author.

This is happening in other media, too. In the Hollywood Reporter last month, British producer Remy Blumenfeld told of a "showrunner with multiple global hits" being asked to rewrite a pilot generated by ChatGPT. He called it "erasure disguised as efficiency".

In January, the US Authors Guild introduced an official certification system, for use by its members, to indicate that a book is human-written. In April, the European Writers Council also called for "an effective transparency obligation for AI-generated products that clearly distinguishes them from the works made by human beings".

In 2025, readers still flock to writers' festivals, and we still esteem great authors as cultural heroes. But this status is threatened by AI. We must resist the possibility that human authors become mere content in AI systems' datasets. The ultimate degradation would be for authors to become unwilling data donors to AI. Big tech must be opposed in that ambition.
[3]
Australia's potential surrender of creative content to tech giants for free is shocking. Labor must decide where it stands | Josh Taylor
Tech companies have devalued the work of creative industries for years. The latest iteration of this is their insistence that the AI models they plan to make lots of money from need the labour of all of human creation for free in perpetuity. It's just surprising that the Productivity Commission appears to have bought into the argument - and caught the Australian government off-guard.

The Productivity Commission's view on AI trained on the copyrighted works of others without compensation, published Wednesday, is that the horse has already bolted for big tech companies - that providing a text and data mining (TDM) exception in copyright law would not change much, but should be worth considering.

The commission stated that AI models, trained overseas on unlicensed copyrighted materials, are already used in Australia by larger institutions, and that a TDM exception is unlikely to change this. The argument is that providing an exception could allow smaller local institutions to train their own models. That argument would carry more weight if it wasn't the giants of Google, Meta and Atlassian that have so far argued for carte blanche AI access to all available human data.

As the UK recently experienced, there is already almighty pushback from news, film, music and TV companies, as well as authors and publishers. This is the case for essentially any industry that might now be suddenly expected to hand over its labour for free to the same very large tech companies that spent millions of dollars to arguably seek favourable AI regulation (ie very little) from the Trump administration.

Music industry bodies in Australia have said a TDM exception here would "legitimise digital piracy under guise of productivity". It is hard not to view the push from tech companies cynically, given recent political donations and the current AI job-hiring arms race.
Companies are reportedly offering up to US$100m pay packets for AI researchers in a highly competitive jobs market, while at the same time crying poor when it comes to paying for the data that will make those AI models useful.

Many in media have been through waves of redundancies because tech companies promised the rivers of gold would return with a pivot to on-platform video, hopes suddenly dashed when Facebook deprioritised video. When the Coalition government forced Meta and Google to negotiate with publishers for payment for their content, Meta temporarily removed news from its platforms in Australia, before eventually coming to the table. When those deals came up for renewal, Meta had turned the tap off for news content appearing in people's feeds, and argued news wasn't important for the platform anymore.

The attitude to training AI models on the work of others seems no different. The claim that the models are already trained and it is therefore too late to do anything about it ignores the question of what happens next. News companies - including the Guardian, which has opposed the exception proposal in the UK - will remain a vital resource that AI will need to train on, to respond to growing user demand for the latest information. But allowing AI access to that offers very little in return to those companies if they aren't paid.

AI summaries in Google search results mean people now click through less and less to find the details in a news story. A recent study suggests a site previously ranked top in search results could see a 79% drop in clickthroughs. Cloudflare - the internet infrastructure company that has launched a way to block AI crawlers from sites unless they pay up - has said the results are even worse in AI chatbots. As of June, Cloudflare said, Google crawled websites about 14 times for every referral, while OpenAI's crawl-to-referral ratio was 1,700:1 and Anthropic's was 73,000:1.
One can argue sites already do have an option to opt out: by including a "robots.txt" file on their website to say "don't crawl my page". But as Cloudflare points out, that amounts to little more than putting up a rules sign next to the pool.

The Albanese government appears to have been caught unaware that an overhaul of copyright law for AI was something tech companies have been pushing for - and something the Productivity Commission appears to be open to. The treasurer, Jim Chalmers, referred questions on it to Tim Ayres, the industry and science minister. Ayres told ABC News Breakfast on Wednesday there were "no plans to make changes" in regards to copyright law. But while the opposition has already come out strongly against the use of copyrighted material without compensation, Labor needs to figure out where it stands.

At a time when the government is championing its under-16s social media ban as "world-leading", and with various new regulations facing the tech sector - from child protection, to paying news companies for news, to competition changes for Apple and Google app stores being floated - Australia's potential surrender of all the content of human creation to the tech giants, for free, seems jarring.

To argue that without it we will "fall behind" seems to ignore years of regulators struggling to play catch-up with tech companies. Despite Bill Heslop-ian cries of "you can't stop progress" from those who stand to gain the most, they may soon realise it won't last when those whose content they need for AI cannot viably continue to produce it.
[4]
Arts and media groups demand Labor take a stand against 'rampant theft' of Australian content to train AI
Arts, creative and media groups have demanded the government rule out allowing big tech companies to take Australian content to train their artificial intelligence models, with concerns such a shift would "sell out" Australian workers and lead to "rampant theft" of intellectual property.

The Albanese government has said it has no plans to change copyright law, but any changes must consider effects on artists and news media. The opposition leader, Sussan Ley, has demanded that copyrighted material must not be used without compensation. "It is not appropriate for big tech to steal the work of Australian artists, musicians, creators, news media, journalism, and use it for their own ends without paying for it," Ley said on Wednesday.

In an interim report on "harnessing data and digital technology", the Productivity Commission set out proposals for how tech including AI could be regulated and treated in Australia, suggesting it could boost productivity by between 0.5% and 13% over the next decade, adding up to $116bn to Australia's GDP. The report said building AI models required large amounts of data, and several stakeholders in the field, including Creative Australia and the Copyright Agency, had "expressed concern about the unauthorised use of copyrighted materials to train AI models".

The PC suggested several possible remedies, including expanding licensing schemes, or an exemption for "text and data mining" that would expand the existing fair dealing rules, which the commission said existed in other countries. The latter suggestion prompted fierce pushback from arts, creative and media companies, which raised alarm their work could be left open for massively wealthy tech companies to use - without compensation or payment - to train AI models. Such moves could undermine licensing deals currently being negotiated by publishers and creatives with big tech companies.
It would also raise questions about the viability of the news media bargaining incentive, where news publishers strike commercial deals with major social media networks for the use of their journalism online.

The Australian Council of Trade Unions accused the Productivity Commission of having "swallowed the arguments of large multinational tech companies hook, line and sinker", warning its approach would do little to help working Australians. "The report's extensive canvassing of the possibility of a text and data mining exemption opens the door to legitimising the rampant theft of the creative output of Australia's creative workers and of Indigenous cultural and intellectual property," the ACTU said. Joseph Mitchell, the ACTU assistant secretary, said such an exemption would create a situation where "tech bros get all the benefits of the new technology and productivity benefits are not fairly shared".

Apra Amcos, Australasia's music rights collecting agency, and the National Aboriginal and Torres Strait Islander Music Office said they were disappointed at the commission's suggestions, raising concerns about such moves "potentially devastating Australia's $9bn music industry". Apra's chair, Jenny Morris, claimed the recommendations would "legitimise what they themselves acknowledge is already widespread theft".

The attorney general, Michelle Rowland, who has carriage of copyright law, said further adoption of AI must be done in a way that builds trust and confidence. "Any potential reform to Australia's copyright laws must consider the impacts on Australia's creative, content and news media sectors. I am committed to continuing to engage on these issues including through the Copyright and AI Reference Group that our government established last year," she said.

Ley, asked about the PC report, said she was concerned about a lack of "guardrails" from the government in responding to AI challenges. "We have to protect content creators ...
that work is theirs and it can't be taken without it being paid for," she said.

The treasurer, Jim Chalmers, said he believed AI could be "a force for good", but acknowledged risks in the expanding technology. "The only way to make our people and workers and industries beneficiaries is if we treat AI as an enabler, not an enemy of what we want to see in our economy," he told a press conference in Parliament House. He pointed out that copyright laws apply in Australia, which he said was in contrast to some other countries, and that the government was not seeking to change those laws.

The arts minister, Tony Burke, pointed to a submission to the review from Creative Australia, which he said "makes clear that with respect to copyright and labelling, there needs to be consent, transparency and remuneration".

The Australian Publishers Association raised fears about authors, researchers and publishers having their work used without permission or compensation, which it said would undermine local publishing, as well as federal government cultural policy. "We support responsible innovation, but this draft proposal rewards infringers over investors," said Patrizia Di Biase-Dyson, APA's CEO. "We reject the notion that Australian stories and learning materials - that shape our culture and democracy - should be treated as free inputs for corporate AI systems."

The Copyright Agency also opposed the text and data mining exemption, saying it would negatively affect creators' earning capacity. "The push to water down Australia's copyright system comes from multinational tech companies, and is not in the national interest," said CEO Josephine Johnston. "If we want high-quality Australian content to power the next phase of AI, we must ensure creators are paid for it."
[5]
Labor under pressure on plans to regulate AI as Coalition accuses government of mixed messages
The federal government is facing mounting pressure to confirm how it plans to regulate fast-growing artificial intelligence technology, with the Coalition critical of mixed messaging from Labor ministers about whether new laws are needed.

As debate erupts over big tech companies seeking access to Australian material including journalism and books to train AI models, Anthony Albanese has stressed the importance of protecting copyright. But the shadow productivity minister, Andrew Bragg, has urged Australia not to squander its opportunity to harness AI's benefits, warning against any major new rules. "The risk is that we over-regulate. The risk is that we make ourselves even more uncompetitive," Bragg told Guardian Australia. "[AI] might be the only free kick we get on productivity."

A suggestion from the Productivity Commission to give big tech companies an exemption to copyright laws for "text and data mining", or to expand existing fair dealing rules, prompted fierce pushback from arts, creative and media companies this week, alarmed that Australian work could be used by massively wealthy tech companies - without compensation - to train AI models. Federal ministers, including the treasurer, Jim Chalmers, have said they have no plans to change copyright law, and spoken in favour of creatives and rights holders.

Albanese on Thursday echoed concerns over protecting copyright, but also said the government was keen to reap the benefits of AI technology, including productivity gains, expected to be a focus of the upcoming economic reform roundtable. "My government's a government that supports the arts," Albanese said at a press conference in Melbourne, calling AI a "complex" issue. "We as a society will work [the balance of AI risks and opportunities] through. It's good there's debate about it, but copyright and intellectual property is important."
The government's plans to respond to the fast-moving technology have shifted, prompting Bragg to call on Labor to offer certainty to the industry. Former industry and science minister Ed Husic had set out plans for a standalone AI act to regulate the field; the productivity minister, Andrew Leigh, has advocated for a low-intervention approach described by some as "light-touch"; the new industry and science minister, Tim Ayres, has spoken about regulation and legislation among plans still to be decided, as well as giving trade unions more say in developing the sector. Chalmers has pushed for a "sensible middle path" between high and low regulation.

"I just think the government has no idea, really, what it wants to do. They have more positions than you can poke a stick at on AI," Bragg said. "We don't need new laws," he said. "The government need to say to the regulators, 'How are you going in enforcing the laws the parliament already has on the books?' before they look to put more laws on those books."

Julian Leeser, the shadow attorney general and arts spokesperson, echoed similar sentiments, saying creators deserve fair compensation and calling for clarity from the government. "In the real world, we wouldn't let someone use an artist's work for commercial purposes without paying for it. The virtual world should be no different," he said in a statement. "This government just doesn't know what it's doing when it comes to AI, and it has no plan to protect Australian artists."

Labor senator Tony Sheldon, who chaired an inquiry into AI in the last term of parliament, wrote on X that copyright laws "must be enforced to ensure big tech fairly licenses and compensates artists, writers, and other creatives". "Despite the Productivity Commission's interim report, the Albanese Government has been clear - we stand with Australia's creative workers and industries, and we will not compromise our copyright laws," Sheldon wrote.
"If the Googles and Amazons of the world want to use Australia's extraordinary trove of written and recorded treasures, they can license and pay for it just like everyone else."
The Productivity Commission's proposal for AI copyright exemptions sparks debate in Australia, pitting tech companies against creative industries and raising questions about the future of intellectual property rights in the AI era.
The Productivity Commission has sparked a heated debate in Australia by proposing a text and data mining exception to the Australian Copyright Act. This exception would make it legal to train artificial intelligence (AI) large language models, such as ChatGPT, on copyrighted Australian work [1]. The commission estimates that AI could potentially contribute A$116 billion to the Australian economy over ten years [1].
Source: The Conversation
Major tech companies, including Google, Microsoft, and Atlassian, are advocating for copyright exemptions to facilitate AI development in Australia. Scott Farquhar, co-founder of Atlassian and chair of the Tech Council of Australia, argued that Australia's "outdated" copyright laws are hindering AI companies from training or hosting their models in the country [2].
The proposal has met with strong opposition from writers, publishers, and their industry bodies. They argue that such an exception would "preference the interests of multinational technology companies at the expense of our own creative industries" [1]. The Australian Council of Trade Unions (ACTU) accused the Productivity Commission of having "swallowed the arguments of large multinational tech companies hook, line and sinker" [4].
The Australian government appears to be caught between competing interests. Arts Minister Tony Burke stated that there are "no plans, no intention, no appetite to be weakening those copyright laws" [2]. However, the government is facing mounting pressure to clarify its position on AI regulation [5].
The debate raises significant questions about the balance between fostering innovation and protecting intellectual property rights. While AI promises substantial economic benefits, critics argue that allowing unrestricted use of copyrighted material could undermine the viability of creative industries [3].
This debate is not unique to Australia. Similar discussions are occurring globally, with the United States and United Kingdom also grappling with the intersection of AI and copyright law [2][3].
The outcome of this debate could have far-reaching implications for the future of copyright law in the age of AI. It raises fundamental questions about the nature of authorship, the value of human creativity, and the appropriate balance between technological progress and the rights of content creators [2][3].
As the discussion continues, the Australian government will need to navigate carefully between encouraging AI innovation and protecting the interests of its creative industries. The decision made in Australia could potentially influence similar debates in other countries, highlighting the global significance of this issue [1][5].