Curated by THEOUTPOST
On Wed, 5 Mar, 8:05 AM UTC
5 Sources
[1]
Minnesota Considers Blocking 'Nudify' Apps That Use AI to Make Explicit Images Without Consent
ST. PAUL, Minn. (AP) -- Molly Kelly was stunned to discover in June that someone she knew had used widely available "nudification" technology to create highly realistic and sexually explicit videos and images of her, using family photos that were posted on social media.

"My initial shock turned to horror when I learned that the same person targeted about 80, 85 other women, most of whom live in Minnesota, some of whom I know personally, and all of them had connections in some way to the offender," Kelly said.

Backed by her testimony, Minnesota is considering a new strategy for cracking down on deepfake pornography. A bill that has bipartisan support would target companies that run websites and apps allowing people to upload a photo that then would be transformed into explicit images or videos.

States across the country and Congress are considering strategies for regulating artificial intelligence. Most have banned the dissemination of sexually explicit deepfakes or revenge porn whether they were produced with AI or not. The idea behind the Minnesota legislation is to prevent the material from ever being created -- before it spreads online. Experts on AI law caution the proposal might be unconstitutional on free speech grounds.

Why advocates say the bill is needed

The lead author, Democratic Sen. Erin Maye Quade, said additional restrictions are necessary because AI technology has advanced so rapidly. Her bill would require the operators of "nudification" sites and apps to turn them off to people in Minnesota or face civil penalties up to $500,000 "for each unlawful access, download, or use." Developers would need to figure out how to exclude Minnesota users.

It's not just the dissemination that's harmful to victims, she said. It's the fact that these images exist at all. Kelly told reporters last month that anyone can quickly create "hyper-realistic nude images or pornographic video" in minutes.

Most law enforcement attention so far has been focused on distribution and possession.

Congress, states and cities are also trying other tactics

San Francisco in August filed a first-of-its-kind lawsuit against several widely visited "nudification" websites, alleging they broke state laws against fraudulent business practices, nonconsensual pornography and the sexual abuse of children. That case remains pending.

The U.S. Senate last month unanimously approved a bill by Democrat Amy Klobuchar, of Minnesota, and Republican Ted Cruz, of Texas, to make it a federal crime to publish nonconsensual sexual imagery, including AI-generated deepfakes. Social media platforms would be required to remove them within 48 hours of notice from a victim. Melania Trump on Monday used her first solo appearance since becoming first lady again to urge passage by the Republican-controlled House, where it's pending.

The Kansas House last month approved a bill that expands the definition of illegal sexual exploitation of a child to include possession of images generated with AI if they're "indistinguishable from a real child, morphed from a real child's image or generated without any actual child involvement." A bill introduced in the Florida Legislature creates a new felony for people who use technology such as AI to generate nude images and criminalizes possession of child sexual abuse images generated with it.

Broadly similar bills have also been introduced in Illinois, Montana, New Jersey, New York, North Dakota, Oregon, Rhode Island, South Carolina and Texas, according to an Associated Press analysis using the bill-tracking software Plural.

Maye Quade said she'll be sharing her proposal with legislators in other states because few are aware the technology is so readily accessible. "If we can't get Congress to act, then we can maybe get as many states as possible to take action," Maye Quade said.

Victims tell their stories

Sandi Johnson, senior legislative policy counsel for the victims' rights group RAINN -- the Rape, Abuse and Incest National Network -- said the Minnesota bill would hold websites accountable. "Once the images are created, they can be posted anonymously, or rapidly widely disseminated, and become nearly impossible to remove," she testified recently.

Megan Hurley also was horrified to learn someone had generated explicit images and video of her using a "nudification" site. She said she feels especially humiliated because she's a massage therapist, a profession that's already sexualized in some minds.

"It is far too easy for one person to use their phone or computer and create convincing, synthetic, intimate imagery of you, your family, and friends, your children, your grandchildren," Hurley said. "I do not understand why this technology exists and I find it abhorrent there are companies out there making money in this manner."

AI experts urge caution

However, two AI law experts -- Wayne Unger of the Quinnipiac University School of Law and Riana Pfefferkorn of Stanford University's Institute for Human-Centered Artificial Intelligence -- said the Minnesota bill is too broadly constructed to survive a court challenge.

Limiting the scope only to images of real children might help it withstand a First Amendment challenge since those are generally not protected, Pfefferkorn said. But she said it would still potentially conflict with a federal law that says you can't sue websites for content that users generate.

"If Minnesota wants to go down this direction, they'll need to add a lot more clarity to the bill," Unger said. "And they'll have to narrow what they mean by nudify and nudification."

But Maye Quade said she thinks her legislation is on solid constitutional ground because it's regulating conduct, not speech. "This cannot continue," she said. "These tech companies cannot keep unleashing this technology into the world with no consequences. It is harmful by its very nature."

___

Associated Press reporters Matt O'Brien, John Hanna and Kate Payne contributed to this story from Providence, Rhode Island; Wichita, Kansas; and Tallahassee, Florida, respectively.

Copyright 2025 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
[2]
Minnesota considers blocking 'nudify' apps that use AI to make explicit images without consent
[3]
Minnesota considers blocking 'nudify' apps that use AI to make explicit images without consent
[4]
Minnesota considers blocking 'nudify' apps that use AI to make explicit images without consent
[5]
Minnesota Considers Law to Block 'Nudification' Apps That Use AI to Make Explicit Images
Minnesota is considering a landmark law to block widely accessible "nudification" technology and apps that use AI to create explicit images from regular photos without consent. A bipartisan bill, currently under review, aims to target the companies that run these nudification websites and apps. These apps -- which have soared in popularity in the last year -- allow users to upload an ordinary photo, which is then transformed to produce hyper-realistic nude images or pornographic videos.

According to a report by AP News, Democratic Senator Erin Maye Quade, the bill's lead author, argues that stronger regulations are needed to combat deepfake pornography as AI technology advances at an alarming pace. Maye Quade says the bill would require the operators of "nudification" sites and apps to turn them off to people in Minnesota or face civil penalties up to $500,000 "for each unlawful access, download, or use." App and website developers would need to determine how to turn off the function for Minnesota users.

Maye Quade also plans to share her proposal with lawmakers in other states, highlighting how little awareness exists around the ease of access to this technology and its ability to generate explicit images in minutes.

Some states have banned the distribution of sexually explicit deepfakes as a way to regulate AI-generated content. However, Minnesota's groundbreaking bill focuses on stopping such material from being created in the first place -- before it can spread online. "It's not just the dissemination that's harmful to victims," Maye Quade tells AP News. "It's the fact that these images exist at all."

The bill comes after San Francisco filed a first-of-its-kind lawsuit in August against several widely visited "nudification" websites, alleging they broke state laws against fraudulent business practices, nonconsensual pornography and the sexual abuse of children. The San Francisco City Attorney's office is suing 16 of the most frequently visited AI-powered "undressing" websites, often used to create nude deepfakes of women and girls without their consent. These platforms allow users to upload images of real, fully clothed people, which are then digitally "undressed" with AI tools that simulate nudity. San Francisco City Attorney David Chiu says the targeted websites were collectively visited over 200 million times in the first six months of 2024 alone.
Minnesota legislators are considering a pioneering bill to block AI-powered 'nudification' apps and websites, aiming to prevent the creation of non-consensual explicit images before they can spread online.
In a groundbreaking move, Minnesota legislators are considering a new bill to combat the rising threat of AI-powered "nudification" technology. This proposed law aims to prevent the creation of non-consensual explicit images by targeting the companies behind these controversial apps and websites [1][2][3].
The bill gained momentum after Molly Kelly, a victim of AI-generated explicit content, shared her harrowing experience. Kelly discovered that someone she knew had used readily available "nudification" technology to create highly realistic and sexually explicit videos and images of her, using family photos posted on social media [1][2][3].
"My initial shock turned to horror when I learned that the same person targeted about 80, 85 other women, most of whom live in Minnesota," Kelly recounted, highlighting the widespread nature of this issue [1][2][3].
Democratic Senator Erin Maye Quade, the bill's lead author, emphasizes the need for additional restrictions due to the rapid advancement of AI technology. The proposed legislation would:
- Require operators of "nudification" sites and apps to turn them off to people in Minnesota
- Impose civil penalties of up to $500,000 "for each unlawful access, download, or use"
- Leave developers to work out how to exclude Minnesota users (one illustrative approach is sketched below)
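The bill does not specify how operators should exclude Minnesota users. As a minimal, purely illustrative sketch (not drawn from the bill or any of the sources here), a web service might screen incoming requests with IP geolocation; `geolocate_region` below is a hypothetical placeholder for a real GeoIP lookup, not an actual library call:

```python
# Illustrative sketch only: one way a site might refuse requests that
# geolocate to Minnesota. The bill prescribes no mechanism; IP-based
# geolocation is an assumption made here for the sake of the example.
from flask import Flask, abort, request

app = Flask(__name__)

def geolocate_region(ip_address: str) -> str:
    """Hypothetical helper: map an IP address to an ISO region code.

    A real deployment would back this with a GeoIP database or service.
    """
    raise NotImplementedError("plug in a real IP-geolocation lookup")

@app.before_request
def block_minnesota_users():
    # Runs before every request; rejects traffic resolved to Minnesota.
    if geolocate_region(request.remote_addr) == "US-MN":
        abort(451)  # HTTP 451 Unavailable For Legal Reasons
```

Even under an approach like this, IP geolocation is approximate and can be sidestepped with a VPN, so the exclusion the bill envisions would be imperfect in practice.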
Unlike existing laws that primarily target the distribution of sexually explicit deepfakes, Minnesota's approach aims to prevent the creation of such material before it can spread online. Maye Quade argues, "It's not just the dissemination that's harmful to victims. It's the fact that these images exist at all" [1][2][3][4].
The Minnesota bill is part of a broader national effort to address AI-generated explicit content:
- San Francisco filed a first-of-its-kind lawsuit in August against several widely visited "nudification" websites; the case remains pending
- The U.S. Senate unanimously approved a bill by Sen. Amy Klobuchar and Sen. Ted Cruz that would make it a federal crime to publish nonconsensual sexual imagery, including AI-generated deepfakes, and require platforms to remove it within 48 hours of notice from a victim
- The Kansas House approved a bill expanding the definition of illegal sexual exploitation of a child to cover possession of certain AI-generated images
- A Florida bill would create a new felony for using technology such as AI to generate nude images and criminalize possession of AI-generated child sexual abuse images
- Broadly similar bills have been introduced in Illinois, Montana, New Jersey, New York, North Dakota, Oregon, Rhode Island, South Carolina and Texas
AI law experts Wayne Unger and Riana Pfefferkorn caution that the Minnesota bill might face constitutional challenges on free speech grounds. They suggest:
- Limiting the scope to images of real children, which are generally not protected by the First Amendment
- Addressing a potential conflict with the federal law that shields websites from suits over content their users generate
- Adding clarity to the bill and narrowing what it means by "nudify" and "nudification"
Despite potential legal hurdles, Maye Quade remains confident in the bill's constitutional standing, arguing that it regulates conduct rather than speech. As AI technology continues to evolve, this pioneering legislation could set a precedent for other states grappling with similar issues [1][2][3][4][5].
San Francisco's city attorney has filed a lawsuit against websites creating AI-generated nude images of women and girls without consent. The case highlights growing concerns over AI technology misuse and its impact on privacy and consent.
12 Sources
Governor Gavin Newsom signs bills closing legal loopholes and criminalizing AI-generated child sexual abuse material, positioning California as a leader in AI regulation.
7 Sources
A new study reveals that 1 in 6 congresswomen have been victims of AI-generated sexually explicit deepfakes, highlighting the urgent need for legislative action to combat this growing threat.
6 Sources
The U.S. Senate has unanimously passed the DEFIANCE Act, a bipartisan bill aimed at empowering victims of non-consensual deepfake pornography. The legislation allows victims to sue creators and distributors of such content.
4 Sources
U.S. law enforcement agencies are cracking down on the spread of AI-generated child sexual abuse imagery, as the Justice Department and states take action to prosecute offenders and update laws to address this emerging threat.
7 Sources
© 2025 TheOutpost.AI All rights reserved