Minnesota House Passes Landmark Bill to Ban AI Nudification Apps with $500K Penalties


The Minnesota House has approved HF 1606, a groundbreaking bill to ban AI nudification technology that creates explicit images without consent. Companies violating the law could face civil penalties up to $500,000, while victims gain the right to pursue damages. The legislation now awaits Senate approval amid potential federal challenges.

Minnesota House Advances First-of-Its-Kind Ban on AI Nudification Technology

The Minnesota House has taken a decisive step to combat AI-generated explicit content by passing HF 1606, a landmark bill targeting nudification technology that creates fake, sexualized images of individuals without their permission [1]. The legislation passed with overwhelming support on a vote of 132-1, positioning Minnesota to become the first state to ban AI-powered nudification apps, which have increasingly been weaponized against women and children [2].

The Minnesota bill specifically prohibits the access, download, or use of nudification technology, with an exception for software requiring substantial application of technological or artistic skill by a human creator directing and controlling the output [1]. These so-called "nudify" apps use AI to alter photographs of real people, making them appear naked, placing them in pornographic videos, or turning them into sexually explicit chatbots. The legislation targets programs designed to fabricate nude photos or pornographic videos from someone's image, addressing a growing crisis of non-consensual explicit images circulating online.

Source: CBS


Civil Penalties and Victim Protections Take Center Stage

Under HF 1606, companies found in violation of the law could face civil penalties of up to $500,000, establishing one of the most stringent financial consequences for AI misuse in the country [1]. The legislation also allows victims to pursue damages directly, providing a legal pathway for those harmed by deepfake nude images to seek compensation and justice [2].

Rep. Hanson, who authored the bill, emphasized the urgency of the legislation in protecting vulnerable populations. "The misuse of this technology has harmed too many people," she stated, noting that these apps have "empowered and enabled pedophiles and sexual predators to increasingly profit while causing more and more harm around the globe, particularly to children" [1]. She added that these apps are readily available in nearly every app store, making them easily accessible tools for predators.

Source: PetaPixel


Tech Giants Under Scrutiny as Investigation Reveals Platform Complicity

The timing of the Minnesota bill coincides with revelations that Apple and Google have been helping users find apps that create deepfake nude images through their own search and advertising systems [1]. Despite policies prohibiting apps that enable the creation of nonconsensual sexualized images, both the Apple App Store and Google Play have hosted dozens of apps designed to digitally remove clothing from photographs of women. The investigation found that these platforms' algorithms actively direct users toward nudification technology, increasing the visibility and accessibility of these harmful tools.

Federal-State Tension Looms Over AI Regulation Landscape

While the Minnesota House has acted decisively, the legislation faces potential obstacles from the Trump administration's approach to AI regulation. In December, the administration announced it would challenge state-level AI laws and revealed plans for a comprehensive national legislative framework for regulating the technology [1]. The White House argued that "a patchwork of conflicting state laws would undermine American innovation and our ability to lead the global AI race" [2].

President Trump has suggested he will sign an AI executive order, sparking concerns that it could upend Minnesota's regulations. However, U.S. senators voted to strike down a proposed 10-year ban that would have blocked states and local governments from creating their own AI regulations [1]. This tension between federal and state authority will likely shape the future of AI governance as HF 1606 awaits a Senate floor vote on its way to Governor Walz's desk.

Building on Existing Protections for Consent and Digital Rights

Minnesota lawmakers have previously established themselves as leaders in combating AI misuse. State legislation already makes it illegal to create and distribute AI-generated sexually explicit material of someone, as well as to use deepfakes to influence election outcomes [2]. Rep. Hanson acknowledged the courage of victims who shared their experiences: "No one should have to worry that nude images of themselves can be generated by AI, without their permission, at the push of a button. This bill would not have been possible without the brave victims who told their heartbreaking stories about this exploitative AI feature" [2].

As the companion bill makes its way through the state Senate, Minnesota's approach could serve as a template for other states grappling with the rapid proliferation of AI-generated explicit content and the urgent need to protect individuals from digital exploitation.
