Curated by THEOUTPOST
On Thu, 6 Mar, 12:03 AM UTC
2 Sources
AI deepfakes threaten democracy and people's identities. 'Personality rights' could help
How much is your voice worth? It could be as little as roughly A$100. That was how much ABC News Verify recently spent to clone federal senator Jacqui Lambie's voice - with her permission - using an easily accessible online platform.

This example highlights how artificial intelligence (AI) apps that create a synthetic replica of a person's image and/or voice - in the form of deepfakes or voice cloning - are becoming cheaper and easier to use. This poses a serious threat not only to the functioning of democracy (especially around elections), but also to a person's identity.

Current copyright laws in Australia are inadequate when it comes to protecting people whose image or voice is digitally cloned without their permission. Establishing "personality rights" could help.

Detecting what's fake is difficult

Deepfake technology can produce content that seems increasingly real, making it harder to detect what is fake and what is not. Indeed, several people for whom the ABC played the voice clone of Senator Lambie did not initially realise it was fake.

This shows how unauthorised deepfakes and voice cloning can easily be used to generate misinformation. They can also be extremely damaging to individuals. This was highlighted back in 2020, when one of Australia's first political deepfake videos was released. It featured the then Queensland premier Annastacia Palaszczuk claiming the state was "cooked" and in "massive debt". The video received around 1 million views on social media.

What laws cover this?

In Australia, defamation, privacy, image-based abuse laws, passing off and consumer protection laws might apply to situations involving deepfake video or audio clips. You may also be able to lodge a complaint with the eSafety commissioner.

In theory, copyright law can also protect a person's image and voice. However, its application is more nuanced.

First, a person whose likeness has been cloned by an AI platform often does not own the source material - the image, video or voice recording that was copied and uploaded. Even if your image and voice are depicted, if you do not own the source material, you cannot sue for infringement.

Using Senator Lambie as an example, the ABC needed only 90 seconds of original voice recording to create the AI clone. Senator Lambie's voice itself cannot be copyright-protected. That's because copyright can only attach to a tangible expression, say in written or recorded form. It cannot attach to speech or unexpressed ideas.

As the ABC arranged, recorded and produced the original 90-second recording, the broadcaster could hold copyright in it as a sound recording: a fixed, tangible expression of Senator Lambie's voice. However, unless the senator and the ABC made an agreement, Senator Lambie would have no economic rights in the original voice recording, such as the right of reproduction. Nor would she have any rights to the clone of her voice.

In fact, the AI-generated clone itself is unlikely to be protected by copyright, as it is considered authorless under Australian copyright law. Many AI-generated creations currently cannot be protected under Australian copyright, due to a lack of original, identifiable human authorship.

Moral rights - including the right of attribution (to be credited as the performer), the right against false attribution and the right of integrity - are also limited in scope. They could apply to the original audio clip, but not to a deepfake.

What are 'personality rights'?

In most jurisdictions in the United States, there exist what are commonly known as "personality rights". These include the right of publicity, which acknowledges that an individual's name, likeness, voice and other attributes are commercially valuable. Celebrities such as Bette Midler and Johnny Carson have successfully exercised this right to prevent companies from using elements of their identity for commercial purposes without permission.

However, personality rights might not always apply to AI voice clones, with some lawyers arguing that only actual recorded voices are protectable, not clones of voices. This has led states such as Tennessee to introduce legislation that specifically addresses AI-generated content. The Ensuring Likeness, Voice, and Image Security Act, introduced in 2024, addresses the misappropriation of an individual's voice through generative AI.

Urgent steps are needed

There has been longstanding scholarly debate about whether Australia should introduce statutory publicity rights. One challenge is overlap with pre-existing laws, such as Australian consumer law and tort law. Policymakers might be hesitant to introduce a new right, as these other areas of the law may provide partial protection. Another challenge is how to enforce these rights if an AI-generated deepfake is created overseas.

Australia could also consider introducing a law similar to the "No Fakes Bill" currently being debated in the US. If passed, that bill would allow people to protect their image and voice through intellectual property rights. This deserves serious consideration in Australia too.

Deepfakes are becoming increasingly common, and are now widespread during elections. Because of this, it's important that Australians remain vigilant against them in the lead-up to this year's federal election. And let's hope that whoever wins that election takes urgent steps to better protect everyone's image and voice.
A recent experiment by ABC News Verify highlights the ease and affordability of creating AI-generated voice clones, raising concerns about the impact of deepfakes on democracy and personal identity. The article explores the inadequacies of current copyright laws and proposes the establishment of "personality rights" as a potential solution.
In a recent experiment, ABC News Verify demonstrated the alarming ease with which AI-generated voice clones can be created. For just A$100, they were able to clone the voice of Australian federal senator Jacqui Lambie using an easily accessible online platform. This experiment highlights the growing concern surrounding deepfake technology and its potential to threaten both democratic processes and individual identities.
Deepfake technology has advanced to the point where it can produce incredibly realistic content, making it increasingly difficult to distinguish between genuine and fabricated media. In the ABC experiment, several listeners were initially unable to identify Senator Lambie's cloned voice as artificial. This level of realism poses a significant risk for the spread of misinformation, particularly during critical periods such as elections.
Australia's existing copyright laws are ill-equipped to protect individuals whose images or voices are digitally cloned without consent. The current legal framework presents several challenges:
Ownership of source material: Often, the person whose likeness is cloned does not own the original source material, leaving them unable to sue for infringement.
Copyright limitations: Copyright law only applies to tangible expressions, not to speech or unexpressed ideas. This means that a person's voice itself cannot be copyright-protected.
AI-generated content: Under Australian copyright law, AI-generated clones are considered authorless and therefore not protected by copyright, due to the lack of original, identifiable human authorship.
To address these legal gaps, experts are considering the introduction of "personality rights," a concept widely recognized in the United States. These rights include the right of publicity, which acknowledges the commercial value of an individual's name, likeness, voice, and other attributes.
However, the application of personality rights to AI-generated content is not straightforward. Some lawyers argue that only actual recorded voices are protectable, not AI-generated clones. This has led to new legislation being introduced in some U.S. states, such as Tennessee's Ensuring Likeness, Voice, and Image Security Act, which specifically addresses the misappropriation of an individual's voice through generative AI.
As deepfakes become increasingly prevalent, particularly during election periods, there is a pressing need for legal reform in Australia. Policymakers face several challenges in introducing new rights, including:
Overlap with existing laws: There is potential overlap with pre-existing legislation such as Australian consumer law and tort law.
Enforcement difficulties: Enforcing these rights against AI-generated deepfakes created overseas presents significant challenges.
One potential solution is the introduction of legislation similar to the "No Fakes Bill" currently under consideration in the United States. This bill would allow individuals to protect their image and voice through intellectual property rights.
As Australia approaches its federal election, it is crucial for citizens to remain vigilant against deepfakes and for policymakers to take urgent steps to better protect individuals' images and voices in the age of AI.
A new US bill aims to give individuals intellectual property rights over their likeness to protect against deepfakes, raising questions about its effectiveness and potential adoption in other countries like New Zealand.
2 Sources
A bipartisan group of U.S. senators has introduced legislation aimed at protecting individuals and artists from AI-generated deepfakes. The bill seeks to establish legal safeguards and address concerns about AI exploitation in various sectors.
5 Sources
AI-powered voice cloning technology is advancing rapidly, raising concerns about fraud, privacy, and legal implications. Celebrities like David Attenborough and Scarlett Johansson have been targeted, prompting calls for updated regulations.
3 Sources
The U.S. Copyright Office has called for urgent legislation to address the growing concerns surrounding AI-generated deepfakes and impersonation. The office emphasizes the need for a new federal law to protect individuals' rights and regulate the use of AI in content creation.
2 Sources
A recent Consumer Reports study finds that popular AI voice cloning tools lack sufficient safeguards against fraud and misuse, raising concerns about potential scams and privacy violations.
7 Sources
The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved