2 Sources
[1]
Minnesota Passes Landmark Bill to Ban AI Nudification Apps
The Minnesota House has passed a landmark bill to ban AI nudification apps and websites that create explicit images of people, mostly women, without consent. On Thursday, the Minnesota House passed HF 1606, a bill to prohibit the use of nudification technology in the state. The legislation would make Minnesota the first state to take this step to prevent the creation of AI-generated non-consensual intimate images.

So-called nudification or "nudify" apps use AI to alter photographs of real people, making them appear naked, placing them in pornographic videos, or turning them into sexually explicit chatbots. The Minnesota bill "prohibits the access, download, or use of nudification technology, except when the website, app, or software requires the substantial application of technological or artistic skill by a human creator directing and controlling the output," according to the House. It targets programs that use AI to fabricate a nude photo or pornographic video from someone's image.

Under the bill, companies that violate the law could face civil penalties of up to $500,000. It also allows victims to pursue damages.

"The misuse of this technology has harmed too many people," Rep. Hanson, who authored the bill, says in a press release. "It has empowered and enabled pedophiles and sexual predators to increasingly profit while causing more and more harm around the globe, particularly to children." "These apps are readily available in nearly every app store, and we should not leave this dangerous tool to be weaponized by more and more predators," she adds.

HF 1606 now needs a Senate floor vote and passage to make its way to Governor Walz's desk to get signed into law. However, according to CBS News, in December, the Trump administration announced it would challenge AI laws made at the state level and revealed plans for a "comprehensive national legislative framework" for regulating the technology.
President Trump suggested he will sign an AI executive order, sparking fears it would upend Minnesota's regulations -- despite U.S. senators voting to strike down a proposed 10-year ban that would have blocked states and local governments from creating their own regulations around the technology.

The news of Minnesota's ban comes after an investigation revealed that Apple and Google are helping users find apps that create deepfake nude images. The Apple App Store and Google Play hosted dozens of apps designed to digitally remove clothing from photographs of women -- despite policies that prohibit apps enabling the creation of nonconsensual sexualized images. The latest investigation found that the platforms' own search and advertising systems direct users toward these apps, increasing their visibility.
[2]
Minnesota House passes bill banning "nudification technology"
The Minnesota House has passed a bill to ban apps and websites that create fake, sexualized pictures of individuals. The bill "prohibits the access, download, or use of nudification technology, except when the website, app, or software requires the substantial application of technological or artistic skill by a human creator directing and controlling the output," according to the House. It targets programs that use AI to fabricate a nude photo or pornographic video from someone's image.

The legislation subjects companies in violation to a civil penalty of up to half a million dollars, and allows victims to seek damages. A companion bill is also making its way through the state Senate.

"No one should have to worry that nude images of themselves can be generated by AI, without their permission, at the push of a button," said Rep. Jess Hanson, who authored the bill. "This bill would not have been possible without the brave victims who told their heartbreaking stories about this exploitative AI feature." The bill passed on a vote of 132-1.

State lawmakers previously made it illegal to create and distribute AI-generated sexually explicit material of someone, as well as to use deepfakes to influence the outcome of an election.

The Trump administration last year announced it would challenge AI laws made at the state level, and last month announced a "comprehensive national legislative framework" for regulating the technology. The White House said "a patchwork of conflicting state laws would undermine American innovation and our ability to lead the global AI race." The White House itself has posted AI-altered images of Minnesota protesters on social media, and President Trump earlier this month posted an AI image of himself as Jesus.
The Minnesota House has approved HF 1606, a groundbreaking bill to ban AI nudification technology that creates explicit images without consent. Companies violating the law could face civil penalties up to $500,000, while victims gain the right to pursue damages. The legislation now awaits Senate approval amid potential federal challenges.
The Minnesota House has taken a decisive step to combat AI-generated explicit content by passing HF 1606, a landmark bill targeting nudification technology that creates fake, sexualized images of individuals without their permission [1]. The legislation passed with overwhelming support on a vote of 132-1, positioning Minnesota to become the first state to ban AI-powered nudification apps that have increasingly been weaponized against women and children [2].

The Minnesota bill specifically prohibits the access, download, or use of nudification technology, with an exception for software requiring the substantial application of technological or artistic skill by a human creator directing and controlling the output [1]. These so-called "nudify" apps use AI to alter photographs of real people, making them appear naked, placing them in pornographic videos, or turning them into sexually explicit chatbots. The legislation targets programs designed to fabricate nude photos or pornographic videos from someone's image, addressing a growing crisis of non-consensual explicit images circulating online.
Source: CBS
Under HF 1606, companies found in violation of the law could face civil penalties of up to $500,000, establishing one of the most stringent financial consequences for AI misuse in the country [1]. The legislation also allows victims to pursue damages directly, providing a legal pathway for those harmed by deepfake nude images to seek compensation and justice [2].

Rep. Hanson, who authored the bill, emphasized the urgency of the legislation in protecting vulnerable populations. "The misuse of this technology has harmed too many people," she stated, noting that these apps have "empowered and enabled pedophiles and sexual predators to increasingly profit while causing more and more harm around the globe, particularly to children" [1]. She added that these apps are readily available in nearly every app store, making them easily accessible tools for predators.
Source: PetaPixel
The timing of the Minnesota bill coincides with revelations that Apple and Google have been helping users find apps that create deepfake nude images through their own search and advertising systems [1]. Despite policies prohibiting apps that enable the creation of nonconsensual sexualized images, both the Apple App Store and Google Play have hosted dozens of apps designed to digitally remove clothing from photographs of women. The investigation found that these platforms' algorithms actively direct users toward nudification technology, increasing the visibility and accessibility of these harmful tools.
While the Minnesota House has acted decisively, the legislation faces potential obstacles from the Trump administration's approach to AI regulation. In December, the administration announced it would challenge state-level AI laws and revealed plans for a comprehensive national legislative framework for regulating the technology [1]. The White House argued that "a patchwork of conflicting state laws would undermine American innovation and our ability to lead the global AI race" [2].

President Trump has suggested he will sign an AI executive order, sparking concerns it could upend Minnesota's regulations. However, U.S. senators voted to strike down a proposed 10-year ban that would have blocked states and local governments from creating their own AI regulations [1]. This tension between federal and state authority will likely shape the future of AI governance as HF 1606 awaits a Senate floor vote and passage before reaching Governor Walz's desk.

Minnesota lawmakers have previously established themselves as leaders in combating AI misuse. State legislation already makes it illegal to create and distribute AI-generated sexually explicit material of someone, as well as to use deepfakes to influence election outcomes [2]. Rep. Hanson acknowledged the courage of victims who shared their experiences: "No one should have to worry that nude images of themselves can be generated by AI, without their permission, at the push of a button. This bill would not have been possible without the brave victims who told their heartbreaking stories about this exploitative AI feature" [2].

As the companion bill makes its way through the state Senate, Minnesota's approach could serve as a template for other states grappling with the rapid proliferation of AI-generated explicit content and the urgent need to protect individuals from digital exploitation.
Summarized by Navi • 05 Mar 2025 • Policy and Regulation