Curated by THEOUTPOST
On Mon, 30 Sept, 8:01 AM UTC
7 Sources
[1]
New laws close gap in California on deepfake child pornography
LOS ANGELES - Using an AI-powered app to create fake nude pictures of people without their consent violates all sorts of norms, especially when those people are minors. It would not, however, violate California law - yet. But soon it will.

A pair of bills newly signed by Gov. Gavin Newsom outlaw the creation, possession and distribution of sexually charged images of minors even when they're created with computers, not cameras. The measures take effect Jan. 1.

The expansion of state prohibitions comes as students are increasingly being victimized by apps that use artificial intelligence either to take a photo of a fully clothed real person and digitally generate a nude body ("undresser" apps) or seamlessly superimpose the image of a person's face onto a nude body from a pornographic video.

According to a survey released last month by the Center for Democracy & Technology, 40% of the students polled said they had heard about some kind of deepfake imagery being shared at their school. Of that group, nearly 38% said the images were nonconsensual and intimate or sexually explicit.

Very few teachers polled said their schools had steps in place to slow the spread of nonconsensual deepfakes, the center's report said, adding, "This unfortunately leaves many students and parents in the dark and seeking answers from schools that are ill-equipped to provide them."

What schools have tended to do, according to the center, is respond with expulsions or other penalties for the students who create and spread the deepfakes. For example, the Beverly Hills school district expelled five eighth-graders who shared fake nudes of 16 other eighth-graders in February.

The Beverly Hills case was referred to police, but legal experts said at the time that a gap in state law appeared to leave computer-generated child sexual abuse material out of state prosecutors' reach - a situation that would apply even if the images are being created and distributed by adults.

The gap stems in part from the state's legal definition of child pornography, which did not mention computer-generated images. A state appeals court ruled in 2011 that, to violate California law, "it would appear that a real child must have been used in production and actually engaged in or simulated the sexual conduct depicted."

Assembly Bill 1831, authored by Assemblymember Marc Berman, D-Menlo Park, expands the state's child-porn prohibition to material that "contains a digitally altered or artificial-intelligence-generated depiction [of] what appears to be a person under 18 years of age" engaging in or simulating sexual conduct. State law defines sexual conduct not just as sexual acts, but graphic displays of nude bodies or bodily functions for the purpose of sexual stimulation.

Once AB 1831 goes into effect next year, AI-generated and digitally altered material will join other types of obscene child pornography in being illegal to knowingly possess, sell to adults or distribute to minors. It will also be illegal to be involved in any way in the noncommercial distribution or exchange of such material to adults knowing that it involves child pornography, even if it is not obscene.

Senate Bill 1381, authored by Sen. Aisha Wahab, D-Hayward, covers similar ground, amending state law to clearly prohibit using AI to create images of real children engaged in sexual conduct, or using children as models for digitally altered or AI-generated child pornography.
Ventura County resident and former Disney actress Kaylin Hayman, 16, was a vocal advocate for AB 1831, having experienced the problem firsthand. According to the Ventura County district attorney's office, a Pennsylvania man created images that spliced her face onto sexually explicit bodies - images that were not punishable at the time under California law. Instead, the man was prosecuted in federal court, convicted and sentenced to 14 years in prison, the D.A.'s office said. "Advocating for this bill has been extremely empowering, and I am grateful to the DA's office as well as my parents for supporting me through this process," Hayman said in a news release. "This law will be revolutionary, and justice will be served to future victims." "Through our work with Kaylin Hayman, who courageously shared her experience as a victim, we were able to expose the real-life evils of computer-generated images of child sexual abuse," Dist. Atty. Erik Nasarenko said in the release. "Kaylin's strength and determination to advocate for this bill will protect minors in the future, and her efforts played a pivotal role in enacting this legislation."
[2]
California governor signs bills to protect children from AI deepfake nudes
SACRAMENTO, Calif. (AP) -- California Gov. Gavin Newsom signed a pair of proposals Sunday aiming to help shield minors from the increasingly prevalent misuse of artificial intelligence tools to generate harmful sexual imagery of children.

The measures are part of California's concerted efforts to ramp up regulation of the marquee industry, which is increasingly affecting the daily lives of Americans but has had little to no oversight in the United States. Earlier this month, Newsom also signed off on some of the toughest laws to tackle election deepfakes, though those laws are being challenged in court. California is widely seen as a potential leader in regulating the AI industry in the U.S.

The new laws, which received overwhelming bipartisan support, close a legal loophole around AI-generated imagery of child sexual abuse and make it clear that child pornography is illegal even if it's AI-generated. Current law does not allow district attorneys to go after people who possess or distribute AI-generated child sexual abuse images if they cannot prove the materials depict a real person, supporters said. Under the new laws, such an offense would qualify as a felony.

"Child sexual abuse material must be illegal to create, possess, and distribute in California, whether the images are AI generated or of actual children," Democratic Assemblymember Marc Berman, who authored one of the bills, said in a statement. "AI that is used to create these awful images is trained from thousands of images of real children being abused, revictimizing those children all over again."

Newsom earlier this month also signed two other bills to strengthen laws on revenge porn, with the goal of protecting more women, teenage girls and others from sexual exploitation and harassment enabled by AI tools. It will now be illegal under state law for an adult to create or share AI-generated sexually explicit deepfakes of a person without their consent. Social media platforms are also required to allow users to report such materials for removal.

But some of the laws don't go far enough, said Los Angeles County District Attorney George Gascón, whose office sponsored some of the proposals. Gascón said new penalties for sharing AI-generated revenge porn should have included those under 18, too. The measure was narrowed by state lawmakers last month to apply only to adults. "There has to be consequences, you don't get a free pass because you're under 18," Gascón said in a recent interview.

The laws come after San Francisco brought a first-in-the-nation lawsuit against more than a dozen websites offering AI tools that promise to "undress any photo" uploaded to the site within seconds.

The problem with deepfakes isn't new, but experts say it's getting worse as the technology to produce them becomes more accessible and easier to use. Researchers have been sounding the alarm for the past two years about the explosion of AI-generated child sexual abuse material using depictions of real victims or virtual characters. In March, a school district in Beverly Hills expelled five middle school students for creating and sharing fake nudes of their classmates.

The issue has prompted swift bipartisan action in nearly 30 states to help address the proliferation of AI-generated sexually abusive materials. Some of those laws protect all victims, while others outlaw only material depicting minors.
Newsom has touted California as an early adopter as well as regulator of AI technology, saying the state could soon deploy generative AI tools to address highway congestion and provide tax guidance, even as his administration considers new rules against AI discrimination in hiring practices.
[7]
California Governor Signs AI Bills to Protect Children From Deepfake Nudes
California Gov. Gavin Newsom has signed two bills that will make it illegal for people to possess or distribute AI-generated deepfake nudes of children.

On Sunday, Newsom signed a pair of proposals that aim to protect minors from AI technology being used to create harmful sexual imagery of them. The legislation will allow prosecutors to go after any individuals who own or share such AI-generated child sexual abuse images.

The California governor's new laws, which received overwhelming bipartisan support, close a legal loophole around AI-generated imagery of child sexual abuse and make it clear child pornography is illegal even if it's AI-generated. According to ABC News, under current law, district attorneys cannot prosecute individuals for possessing or distributing AI-generated child sexual abuse images unless they can prove that the material depicts a real person. The new laws make such offenses a felony.

"Child sexual abuse material must be illegal to create, possess, and distribute in California, whether the images are AI-generated or of actual children," Democratic Assemblymember Marc Berman, who authored one of the bills, says in a statement. "AI that is used to create these awful images is trained from thousands of images of real children being abused, revictimizing those children all over again."

It will now be illegal under state law for an adult to create or share AI-generated sexually explicit deepfakes of a person without their consent. Social media platforms will also be required to allow users to report such materials for removal.

In the last few weeks, Newsom has signed some of the toughest laws against AI-generated content in the U.S. yet -- including legislation that makes it illegal to create deepfakes related to the 2024 election. The California governor also signed two bills into law that will protect actors and performers from unauthorized AI clones. He also vetoed a first-of-its-kind bill (S.B. 1047) that would have required safety testing of large AI systems or models before their release to the public.

California's legislation could serve as a guide for regulators nationwide seeking to curb the spread of AI-driven manipulative content in the U.S. However, the state's new laws have already encountered legal challenges. A lawsuit challenging California's new laws was recently filed by a content creator who makes AI-generated parody videos, including one of Vice President Kamala Harris that was shared by Elon Musk.
Governor Gavin Newsom signs bills closing legal loopholes and criminalizing AI-generated child sexual abuse material, positioning California as a leader in AI regulation.
In a significant move to protect minors from the misuse of artificial intelligence, California Governor Gavin Newsom has signed two groundbreaking bills into law. These measures aim to shield children from the growing threat of AI-generated sexual imagery and close existing legal loopholes [1].
The new laws, Assembly Bill 1831 and Senate Bill 1381, expand the state's child pornography prohibitions to include AI-generated and digitally altered material depicting minors in sexual situations [1]. This legislative action addresses a critical gap in California's legal framework, which previously did not explicitly cover computer-generated images in its definition of child pornography [1].
These bills received overwhelming bipartisan support, reflecting the urgency of the issue. Under the new laws, possessing or distributing AI-generated child sexual abuse images will be classified as a felony, even if the materials do not depict a real person [2].
The legislation is part of a larger initiative by California to regulate the AI industry. Earlier this month, Newsom signed additional bills strengthening laws on revenge porn and requiring social media platforms to allow users to report AI-generated sexually explicit content for removal [2].
The push for these laws was partly inspired by real cases, including that of former Disney actress Kaylin Hayman, who advocated for AB 1831 after AI-generated sexual imagery of her was created without her consent [1]. The Beverly Hills school district's expulsion of five eighth-graders for sharing fake nudes of classmates also highlighted the urgency of addressing this issue [2].
The problem of AI-generated child sexual abuse material has been escalating, with a recent survey indicating that 40% of students polled had heard about deepfake imagery being shared at their school [1]. This growing concern has prompted nearly 30 states to take action against AI-generated sexually abusive materials [2].
With these new laws, California positions itself as a potential leader in regulating the AI industry in the United States. Governor Newsom has emphasized the state's role as both an early adopter and regulator of AI technology, suggesting future applications in areas such as traffic management and tax guidance [2].
As AI technology continues to advance and become more accessible, these laws represent a crucial step in protecting minors from exploitation and abuse in the digital age. The effectiveness of these measures and their potential to serve as a model for other states will likely be closely watched in the coming years.