5 Sources
[1]
Industry leaders urge Senate to protect against AI deepfakes with No Fakes Act
Tech and music industry leaders testified about the dangers of deepfakes made with artificial intelligence on Wednesday, urging lawmakers to pass legislation that would protect people's voices and likenesses from being replicated without consent, while allowing responsible use of the technology. Speaking to members of the Senate Judiciary Committee's panel on privacy, technology, and the law, executives from YouTube and the Recording Industry Association of America, as well as country music singer Martina McBride, championed the bipartisan No Fakes Act, which seeks to create federal protections for artists' voice, likeness and image from unauthorized AI-generated deepfakes. The group argued that Americans across the board -- whether teenagers or high-profile music artists -- were at risk of their likenesses being misused. The legislation, reintroduced in the Senate last month, would combat deepfakes by holding individuals or companies liable if they produced an unauthorized digital replica of an individual in a performance. "AI technology is amazing and can be used for so many wonderful purposes," McBride told the panel. "But like all great technologies, it can also be abused, in this case by stealing people's voices and likenesses to scare and defraud families, manipulate the images of young girls in ways that are shocking to say the least, impersonate government officials, or make phony recordings posing as artists like me." The No Fakes Act would also hold platforms liable if they knew a replica was not authorized, while excluding certain digital replicas from coverage based on First Amendment protections. It would also establish a notice-and-takedown process so victims of unauthorized deepfakes "have an avenue to get online platforms to take down the deepfake," the bill's sponsors said last month. The bill would address the use of non-consensual digital replicas in audiovisual works, images, or sound recordings.
Nearly 400 artists, actors and performers have signed on in support of the legislation, according to the Human Artistry Campaign, which advocates for responsible AI use, including LeAnn Rimes, Bette Midler, Missy Elliott, Scarlett Johansson and Sean Astin. The testimony comes two days after President Donald Trump signed the Take It Down Act, bipartisan legislation that enacted stricter penalties for the distribution of non-consensual intimate imagery, sometimes called "revenge porn," as well as deepfakes created by AI. Mitch Glazier, CEO of the RIAA, said that the No Fakes Act is "the perfect next step to build on" that law. "It provides a remedy to victims of invasive harms that go beyond the intimate images addressed by that legislation, protecting artists like Martina from non-consensual deepfakes and voice clones that breach the trust she has built with millions of fans," he said, adding that it "empowers individuals to have unlawful deepfakes removed as soon as a platform is able without requiring anyone to hire lawyers or go to court." Suzana Carlos, head of music policy at YouTube, added that the bill would protect the credibility of online content. AI regulation should not penalize companies for providing tools that can be used for permitted and non-permitted uses, she said in written testimony, prior to addressing the subcommittee. The legislation offers a workable, tech-neutral and comprehensive legal solution, she said, and would streamline global operations for platforms like YouTube while empowering musicians and rights holders to manage their IP. Platforms have a responsibility to address the challenges posed by AI-generated content, she added. "YouTube largely supports this bill because we see the incredible opportunity of AI, but we also recognize those harms, and we believe that AI needs to be deployed responsibly," she said.
[2]
Industry leaders urge Senate to protect against AI deepfakes with No Fakes Act
[3]
Industry leaders urge Senate to protect against AI deepfakes with No Fakes Act
[4]
Industry Leaders Urge Senate to Protect Against AI Deepfakes With No Fakes Act
Copyright 2025 The Associated Press. All rights reserved.
[5]
Martina McBride Urges Congress to Pass Bill Addressing 'Terrifying' Deepfakes
Martina McBride spoke on Capitol Hill in support of the NO FAKES Act on Wednesday, calling unauthorized AI and deepfakes "just terrifying" and urging lawmakers to pass legislation defending artists. "I'm pleading with you to give me the tools to stop that kind of betrayal," McBride said at a Senate Judiciary subcommittee hearing, per Billboard. "[The NO FAKES Act could] set America on the right course to develop the world's best AI while preserving the sacred qualities that make our country so special: authenticity, integrity, humanity, and our endlessly inspiring spirit ... I urge you to pass this bill now." The bill -- the Nurture Originals, Foster Art and Keep Entertainment Safe Act -- was introduced in the House of Representatives and the U.S. Senate to protect celebrities from deepfakes hijacking their likenesses and images. During her testimony, the musician spoke about how deepfakes affect artists' reputations and their trust with fans, especially after they die. "[My fans] know when I say something, they can believe it," she said. "I don't know how I can stress enough how [much unauthorized deepfakes] can impact the careers [of] artists." The NO FAKES Act would create a federal right of publicity that does not expire at death and could be controlled by a person's heirs for up to 70 years after the person dies. The bill would also create a notice-and-takedown system: platforms would need to remove unauthorized deepfakes or AI-generated content quickly after notice and cut off repeat offenders, according to Deadline. Social media sites would be shielded from liability if they comply with the act, but must use digital fingerprint technology to prevent misuse from happening again. Violators could face damages of at least $5,000 per offense, plus punitive damages for willful misuse. The Capitol Hill hearing Wednesday also included testimony from executives at YouTube and the Recording Industry Association of America (RIAA).
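The "digital fingerprint" requirement described above can be pictured as a registry that remembers taken-down replicas so re-uploads are caught automatically. The sketch below is purely illustrative and not drawn from the bill's text: production systems use robust perceptual hashes that survive re-encoding, whereas this exact-match SHA-256 registry only demonstrates the register-then-block flow, and every class and method name is an invented assumption.

```python
import hashlib

# Illustrative sketch only: real platforms use perceptual/robust hashes
# that tolerate re-encoding; an exact cryptographic hash is used here
# just to show the register-then-block idea.
class FingerprintRegistry:
    def __init__(self) -> None:
        self._blocked: set[str] = set()

    @staticmethod
    def fingerprint(media_bytes: bytes) -> str:
        """Derive a stable identifier for a piece of media."""
        return hashlib.sha256(media_bytes).hexdigest()

    def block(self, media_bytes: bytes) -> None:
        """Register a taken-down replica so identical re-uploads are caught."""
        self._blocked.add(self.fingerprint(media_bytes))

    def is_blocked(self, media_bytes: bytes) -> bool:
        """Check an upload against previously taken-down replicas."""
        return self.fingerprint(media_bytes) in self._blocked
```

Under this toy design, a platform would fingerprint each upload at ingest time and reject anything already in the registry, which is the "prevent misuse from happening again" behavior the reporting describes.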
"I think there's a very small window, and an unusual window, for Congress to get ahead of what is happening before it becomes irreparable," said Mitch Glazier, RIAA's CEO. The NO FAKES Act was introduced as a draft bill in 2023 and formally brought to the Senate in the summer of 2024. The legislation is backed by a bipartisan group of lawmakers, including Senators Marsha Blackburn, Chris Coons, Thom Tillis, and Amy Klobuchar, as well as Representatives María Elvira Salazar, Madeleine Dean, Nathaniel Moran, and Becca Balint.
Tech and music industry leaders, including country singer Martina McBride, testified before the Senate Judiciary Committee, advocating for the bipartisan No Fakes Act to protect against unauthorized AI-generated deepfakes and establish federal protections for artists' voices and likenesses.
In a significant move to address the growing concerns surrounding artificial intelligence (AI) and deepfakes, tech and music industry leaders testified before the Senate Judiciary Committee's panel on privacy, technology, and the law on Wednesday. The hearing focused on the bipartisan No Fakes Act, which aims to create federal protections against unauthorized AI-generated deepfakes [1].
Country music singer Martina McBride, along with executives from YouTube and the Recording Industry Association of America (RIAA), championed the legislation. McBride emphasized the dual nature of AI technology, stating, "AI technology is amazing and can be used for so many wonderful purposes. But like all great technologies, it can also be abused" [2].
Mitch Glazier, CEO of the RIAA, described the No Fakes Act as "the perfect next step" following the recently signed Take It Down Act, which addresses non-consensual intimate imagery and AI-created deepfakes [3].
The proposed legislation includes several key provisions:
Liability for unauthorized digital replicas: Individuals or companies producing unauthorized digital replicas of a person in a performance would be held liable [4].
Platform responsibility: Platforms would be held liable if they knowingly host unauthorized replicas.
Notice-and-takedown process: Victims of unauthorized deepfakes would have a mechanism to request removal from online platforms.
First Amendment protections: Certain digital replicas would be excluded from coverage based on First Amendment considerations.
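The provisions listed above amount to a simple workflow: process takedown notices, cut off repeat offenders, and assess damages with a statutory floor. The toy model below is an assumption-laden sketch, not the statute's mechanics; only the reported $5,000-per-offense minimum and the notice-and-takedown/repeat-offender ideas come from the coverage, and the three-notice cutoff threshold is invented for illustration.

```python
from dataclasses import dataclass, field

STATUTORY_MINIMUM = 5_000   # reported minimum damages per offense
REPEAT_OFFENDER_LIMIT = 3   # assumed cutoff; the bill reportedly requires
                            # cutting off repeat offenders, but no specific
                            # notice count appears in the coverage

@dataclass
class Platform:
    """Toy model of a platform's notice-and-takedown bookkeeping."""
    notices: dict[str, int] = field(default_factory=dict)  # uploader -> notice count
    cut_off: set[str] = field(default_factory=set)

    def handle_notice(self, uploader: str) -> bool:
        """Record a takedown notice; return True if the uploader is now cut off."""
        self.notices[uploader] = self.notices.get(uploader, 0) + 1
        if self.notices[uploader] >= REPEAT_OFFENDER_LIMIT:
            self.cut_off.add(uploader)
        return uploader in self.cut_off

def minimum_damages(offenses: int) -> int:
    """Statutory floor of $5,000 per offense; punitive damages would come on top."""
    return offenses * STATUTORY_MINIMUM
```

In this sketch, each notice both triggers a removal and increments the uploader's tally, so the repeat-offender cutoff falls out of the same bookkeeping the takedown process already requires.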
Nearly 400 artists, actors, and performers have endorsed the legislation, according to the Human Artistry Campaign [1]. Suzana Carlos, head of music policy at YouTube, expressed support for the bill, stating that it offers a "workable, tech-neutral and comprehensive legal solution" [2].
The legislation aims to protect not only high-profile artists but also everyday Americans from the misuse of their likenesses. McBride highlighted the potential for AI abuse, including scaring and defrauding families, manipulating images of young girls, and impersonating government officials [5].
RIAA CEO Mitch Glazier emphasized the urgency of the situation, stating, "I think there's a very small window, and an unusual window, for Congress to get ahead of what is happening before it becomes irreparable" [5].
As AI technology continues to advance rapidly, the No Fakes Act represents a crucial step in balancing the benefits of AI with the need to protect individuals' rights and maintain the integrity of online content.
Summarized by Navi