On Tue, 29 Apr, 8:02 AM UTC
16 Sources
[1]
Deepfake 'Revenge Porn' Bill Clears Congress: What to Know Before You Post
Congress this week had a rare moment of bipartisanship, with the House passing the deepfake-focused Take It Down Act by a 409-2 vote. The bill criminalizes "revenge porn," the non-consensual publishing of sexually explicit content, including images generated using AI. Social media platforms must take down such media within 48 hours of notice. The Senate unanimously approved the Take It Down Act in February; it now heads to President Trump's desk. He's expected to sign the bill, which also has the support of First Lady Melania Trump via her "Be Best" anti-cyberbullying campaign.

Non-consensual intimate images (NCII) are a growing problem for younger users. One of the inspirations behind the bill is a 14-year-old girl from Texas whose male classmate created a deepfake nude image of her and posted it on social media. Though the image was eventually taken down, no law required its immediate removal.

Multiple AI tools help convert ordinary photographs or videos into deepfake pornography. Apple removed three such apps from the App Store last year, and in August San Francisco sued 16 AI-powered websites that help "undress" women. According to NBC News, at least 15% of high school students know of deepfake images depicting someone they know.

"If you're a victim of revenge porn or AI-generated explicit imagery, your life changes forever. Most likely, you've been targeted by someone you know, and you're likely struggling to have that material removed from the internet," says bill sponsor Sen. Ted Cruz (R-TX). "The Take It Down Act empowers victims across the entire United States. It makes it a felony for these deviants to publish any non-consensual intimate images."

Critics argue the law could be misused, mainly because its enforcement lies with the Federal Trade Commission. "This is an alarming expansion of the FTC's enforcement authority, especially under an administration that has openly expressed hostility to nonprofit organizations that do not serve its political interests," writes the Cyber Civil Rights Initiative (CCRI), an organization dedicated to combating image-based sexual abuse. "Platforms that feel confident that they are unlikely to be targeted by the FTC (for example, platforms that are closely aligned with the current administration) may feel emboldened to simply ignore reports of NDII," CCRI adds. The organization also found a loophole in the law "that would seemingly allow a person to disclose intimate images without consent so long as that person also appears in the image."

The bill could also be used for broader censorship, according to the Electronic Frontier Foundation (EFF). Platforms must view flagged content to remove it, whether it is publicly available or circulated through direct messages (DMs). Therefore, they "may respond by abandoning encryption entirely (for messages) in order to be able to monitor content -- turning private conversations into surveilled spaces," says India McKinney, EFF's director of federal affairs.
[2]
Take It Down Act, addressing nonconsensual deepfakes and 'revenge porn,' passes. What is it?
Congress has overwhelmingly approved bipartisan legislation to enact stricter penalties for the distribution of non-consensual intimate imagery, sometimes called "revenge porn." Known as the Take It Down Act, the bill is now headed to President Donald Trump's desk for his signature. The measure was introduced by Sen. Ted Cruz, a Republican from Texas, and Sen. Amy Klobuchar, a Democrat from Minnesota, and later gained the support of First Lady Melania Trump. Critics of the bill, which addresses both real and artificial intelligence-generated imagery, say the language is too broad and could lead to censorship and First Amendment issues.

The bill makes it illegal to "knowingly publish" or threaten to publish intimate images without a person's consent, including AI-created "deepfakes." It also requires websites and social media companies to remove such material within 48 hours of notice from a victim. The platforms must also take steps to delete duplicate content. Many states have already banned the dissemination of sexually explicit deepfakes or revenge porn, but the Take It Down Act is a rare example of federal regulators imposing requirements on internet companies.

The Take It Down Act has garnered strong bipartisan support and has been championed by Melania Trump, who lobbied on Capitol Hill in March, saying it was "heartbreaking" to see what teenagers, especially girls, go through after they are victimized by people who spread such content. President Trump is expected to sign it into law. Cruz said the measure was inspired by Elliston Berry and her mother, who visited his office after Snapchat refused for nearly a year to remove an AI-generated "deepfake" of the then 14-year-old.

Meta, which owns and operates Facebook and Instagram, supports the legislation. "Having an intimate image - real or AI-generated - shared without consent can be devastating and Meta developed and backs many efforts to help prevent it," Meta spokesman Andy Stone said last month. The Information Technology and Innovation Foundation, a tech industry-supported think tank, said in a statement Monday that the bill's passage "is an important step forward that will help people pursue justice when they are victims of non-consensual intimate imagery, including deepfake images generated using AI."

"We must provide victims of online abuse with the legal protections they need when intimate images are shared without their consent, especially now that deepfakes are creating horrifying new opportunities for abuse," Klobuchar said in a statement after the bill's passage late Monday. "These images can ruin lives and reputations, but now that our bipartisan legislation is becoming law, victims will be able to have this material removed from social media platforms and law enforcement can hold perpetrators accountable."

Free speech advocates and digital rights groups say the bill is too broad and could lead to the censorship of legitimate images, including legal pornography and LGBTQ content, as well as of government critics. "While the bill is meant to address a serious problem, good intentions alone are not enough to make good policy," said the nonprofit Electronic Frontier Foundation, a digital rights advocacy group. "Lawmakers should be strengthening and enforcing existing legal protections for victims, rather than inventing new takedown regimes that are ripe for abuse."
The takedown provision in the bill "applies to a much broader category of content -- potentially any images involving intimate or sexual content" than the narrower definitions of non-consensual intimate imagery found elsewhere in the text, EFF said. "The takedown provision also lacks critical safeguards against frivolous or bad-faith takedown requests. Services will rely on automated filters, which are infamously blunt tools," EFF said. "They frequently flag legal content, from fair-use commentary to news reporting. The law's tight time frame requires that apps and websites remove speech within 48 hours, rarely enough time to verify whether the speech is actually illegal."

As a result, the group said, online companies, especially smaller ones that lack the resources to wade through a lot of content, "will likely choose to avoid the onerous legal risk by simply depublishing the speech rather than even attempting to verify it." The measure, EFF said, also pressures platforms to "actively monitor speech, including speech that is presently encrypted" to address liability threats.

The Cyber Civil Rights Initiative, a nonprofit that helps victims of online crimes and abuse, said it has "serious reservations" about the bill, calling its takedown provision "unconstitutionally vague, unconstitutionally overbroad, and lacking adequate safeguards against misuse." For instance, the group said, platforms could be obligated to remove a journalist's photographs of a topless protest on a public street, photos of a subway flasher distributed by law enforcement to locate the perpetrator, commercially produced sexually explicit content, or sexually explicit material that is consensual but falsely reported as being nonconsensual.
[3]
US Congress passes 'Take It Down' revenge porn bill that also covers AI deepfakes
It will require online platforms to remove non-consensually posted sexual images within 48 hours of notice.

The US House of Representatives has passed the Take It Down Act, a bipartisan bill that criminalizes the "publication of non-consensual, sexually exploitative images," including AI-generated deepfakes that depict "identifiable, real people." It would also compel platforms, such as social networks, to remove those images within 48 hours of being notified. The bill enjoyed overwhelming support in Congress, clearing the House by a vote of 409 to 2. It passed the Senate unanimously in February, and Trump, who previously talked about it while addressing Congress, is expected to sign the bill into law.

Nearly every state in the country has its own laws addressing revenge porn, and 20 states already have laws that cover deepfakes. Take It Down's authors, who include Senator Ted Cruz, explained that those laws "vary in classification of crime and penalty and have uneven criminal prosecution." Victims also still have a tough time getting their images removed under those laws.

However, it's the takedown provision in the bill that has raised concerns among critics. According to the Electronic Frontier Foundation, the provision could potentially apply to any image that's perceived as sexual or intimate, even if it's not revenge porn. It uses a much broader definition of what a "non-consensual, sexually exploitative image" is than the narrower definitions found in other parts of the bill, the organization said. In addition, the EFF argued that the bill lacks safeguards against bad-faith takedown requests. Since online platforms typically use automated systems to remove content, and 48 hours is likely not enough time to verify each request's legitimacy, they'll most likely depublish most reported images without checking them first.

One of the two Republican representatives who voted against the bill said it was "ripe for abuse, with unintended consequences." But Cruz previously said after introducing Take It Down that it will "protect and empower all victims" of revenge porn by "creating a level playing field at the federal level and putting the responsibility on websites to have in place procedures to remove these images."
[4]
Activists Warn Newly Passed Bill to Combat Deepfakes Will Cause Online Chaos
While the legislation is well intended, it offers broad opportunity for abuse.

When it comes to digital rights and protection, the United States' federal government has generally failed to establish comprehensive legislation. On Monday, Congress seemed to make progress by nearly unanimously passing a bill to combat nonconsensual intimate images online. But as the bill heads to President Trump, who has already voiced his intention to sign it, digital rights advocates warn that a combination of vague language and a lack of safeguards makes it ripe for misuse.

Introduced by Sens. Ted Cruz and Amy Klobuchar in 2024, the Take It Down Act criminalizes the distribution of nonconsensual intimate images (NCII), which includes "revenge porn" and AI deepfakes. It also requires specific platforms to establish processes for reporting NCII and to remove offending content within 48 hours of notification.

The bill has garnered bipartisan support, including from Trump's own family: his wife, Melania, hosted a White House roundtable about it in March. That same month, Trump told Congress that he "look[s] forward to signing that bill into law" during his address at a joint session. He added, "I'm going to use that bill for myself too if you don't mind because nobody gets treated worse than I do online, nobody." In February, the bill unanimously passed the Senate, and it cleared the House with a 409-2 vote, which Cruz celebrated as a "historic win." In a statement, Cruz said, "By requiring social media companies to take down this abusive content quickly, we are sparing victims from repeated trauma and holding predators accountable." "These images can ruin lives and reputations," Klobuchar said in the same statement. "Victims will now be able to have this material removed from social media platforms and law enforcement can hold perpetrators accountable."

Although the majority of states have laws prohibiting nonconsensual pornography, they don't adequately protect a growing pool of victims. In a 2019 study, one in twelve participants reported victimization at least once in their lives, with women reporting higher rates. As AI accelerates the issue by generating content featuring adults and children, states also struggle to define and regulate "deepfakes."

On its surface, the Take It Down Act should be celebrated as a major advancement. But on Monday, the Cyber Civil Rights Initiative, which focuses on combating nonconsensual images, outlined numerous issues, including that the takedown provision is "highly susceptible to misuse and will likely be counter-productive for victims." For example, there aren't any safeguards against fake complaints, so the provision can easily be misappropriated to remove other content.

CCRI isn't alone in its criticisms. In a statement, the Electronic Frontier Foundation also criticized the takedown provision, writing, "Services will rely on automated filters, which are infamously blunt tools." Given the law's tight timeframe, platforms will likely "choose to avoid the onerous legal risk by simply depublishing" content before checking whether it's actually a problem.

These aren't unfounded concerns. While some argue that the legislation is sound and cannot be misused by bad-faith individuals, including government officials, there are examples to look to. Between June 2019 and January 2020, over thirty thousand false takedown notices were filed under the Digital Millennium Copyright Act, many of them apparent attempts to censor online speech or protect the reputations of public officials.
YouTube famously has a takedown-first, ask-questions-later copyright policy. With the Take It Down Act expected to become law soon, other pending legislation, like the DEFIANCE Act, which would allow deepfake victims to sue those who create, share, and receive them, could build on its protections. But in a statement, Public Knowledge's Senior Policy Counsel, Nick Garcia, said, "This was a chance to get it right, but unfortunately, Congress only got it half right -- and half right laws can do real damage."
[5]
Congress passes bill to fight deepfake nudes, revenge porn
Trump is expected to sign into law a bill that would force online platforms to quickly take down nonconsensual intimate images.

The U.S. House of Representatives on Monday voted overwhelmingly to pass a bill aimed at cracking down on the posting of sexual images and videos of people online without their consent, including AI-generated "deepfake" nudes of real people. The bipartisan Take It Down Act, which passed the Senate unanimously in February, now heads to the desk of President Donald Trump, who is expected to sign it into law.

The bill makes it a federal crime to publish nonconsensual intimate imagery, or NCII, of any person and requires online platforms to remove such imagery within 48 hours when someone reports it. That would make it the first significant internet law of Trump's second term and the first U.S. law to take aim at the fast-growing problem of NCII. The bill's passage delighted many advocates for survivors and victims of revenge porn and "sextortion" scams, while some free expression and privacy advocates said they worry it will be abused.

The legislation's passage by a vote of 409-2 marks a victory for first lady Melania Trump, who has championed the bill as part of her "Be Best" campaign against cyberbullying. The president indicated in March that he plans to sign it -- and quipped that it is a personal boon, "because nobody gets treated worse than I do online."

Republican support for the bill galvanized when the first lady and Sen. Ted Cruz (R-Texas), who co-authored the Senate's version of the bill with Sen. Amy Klobuchar (D-Minnesota), held a Capitol Hill roundtable in March with advocates who spoke about their personal experiences with NCII. They included Elliston Berry, who was 14 when a male classmate used an AI app to create fake pornographic images of her and posted them to Snapchat. Another speaker, South Carolina state Rep. Brandon Guffey (R), related that his 17-year-old son killed himself in 2022 after a sextortion scammer enticed him to send nude images and then blackmailed him. "Today's bipartisan passage of the Take It Down Act is a powerful statement that we stand united in protecting the dignity, privacy, and safety of our children," Melania Trump said in a statement.

Hundreds of AI "undress" apps that can forge images of real people in seconds have proliferated across the internet in recent years, harnessing the same wave of technology that has powered image-generation tools such as DALL-E and Midjourney. Some of those apps advertise on mainstream social networks such as Meta's Instagram, despite violating those platforms' rules. Among the most common targets are female celebrities, including singer Taylor Swift and comedian Bobbi Althoff, both of whom were the subject of sexually explicit AI fakes that went viral on Elon Musk's social network X in 2024. The imagery is also often used to harass, intimidate or embarrass young women and teens. Those victimized have described their efforts to get nonconsensual nudes scrubbed from the internet as a nightmarish game of whack-a-mole.

"Deepfakes are creating horrifying new opportunities for abuse," Klobuchar said in a statement Monday. "These images can ruin lives and reputations, but now that our bipartisan legislation is becoming law, victims will be able to have this material removed from social media platforms and law enforcement can hold perpetrators accountable."
Unlike several previous attempts to regulate social media harms, the act gained the support of several leading tech companies, including Meta, Google and Snap, clearing its path toward passage. The House version, co-sponsored by Reps. María Elvira Salazar (R-Florida) and Madeleine Dean (D-Pennsylvania), passed Monday under a process known as "suspension of the rules," in which bills that are not expected to be controversial can pass without debate with a two-thirds supermajority.

The act has its critics, however. Among them is Lia Holland, legislative director of the digital rights group Fight for the Future, who called it "well-intentioned but poorly drafted." Comparing the bill to the 1998 Digital Millennium Copyright Act, which requires online platforms to remove copyrighted material whenever someone declares it is being illegally used, Holland predicted that bad actors will use Take It Down to scrub from the internet legitimate content they dislike. "If only Senator Cruz or anyone on the Hill had taken the time to make a few minor corrections, this would be a great bill," Holland said.

Becca Branum, director of the nonprofit Center for Democracy and Technology's Free Expression Project, said the bill's reliance on Trump's "partisan" Federal Trade Commission raised concerns of "weaponized enforcement" for political ends. Several Democratic lawmakers proposed amendments to the bill in a markup by the House Energy and Commerce Committee earlier this month, but the committee's Republican majority voted them down. Meanwhile, some privacy advocates' concerns that the bill could affect private messaging apps were eased when backers clarified that it is only intended to apply to public-facing online forums.

The bill also had supporters on the left, including several who spoke at a news conference Monday convened by the advocacy group Americans for Responsible Innovation. They included Fordham University law professor Zephyr Teachout, who said that unlike some past efforts to regulate social media, the Take It Down Act is well-crafted to survive First Amendment challenges in court. Tim Wu, a Columbia Law School professor who advised former president Joe Biden on tech and antitrust regulation, also backed the bill, saying at the news conference that he hoped this is only the beginning of Congress taking a more active role in addressing social media's harms. "More needs to be done to protect children and vulnerable people," he said. "It's kind of shocking how inactive Congress has been."
[6]
AI deepfakes bill heads to Trump's desk
Why it matters: It's one of the first major tech bills to get through Congress -- and it's a bipartisan victory for the legislation, which would require platforms to quickly remove non-consensual intimate images and criminalize posting such content.

State of play: The bill moved fast through this Congress after Elon Musk helped tank it at the last minute last year.

Winners: The bill's passage is a big win for its lead GOP sponsors, Sen. Ted Cruz and Rep. Maria Salazar, and their Democratic co-sponsors Sen. Amy Klobuchar and Rep. Madeleine Dean.

Losers: Civil society groups and cybersecurity experts warned against passing the bill, saying it would lead to invasive content monitoring and unconstitutional suppression of speech.
[7]
Congress passes 'Take It Down' Act to fight AI-fueled deepfake pornography
Under the new law, platforms would have to take down certain deepfakes.

Congress has passed a bill that forces tech companies to take action against certain deepfakes and revenge porn posted on their platforms. In a 409-2 vote on Monday, the U.S. House of Representatives passed the "Take It Down" Act, which has received bipartisan support as well as vocal support from celebrities and First Lady Melania Trump. The bill passed the Senate in February. The Take It Down Act will now be sent to President Donald Trump, who is expected to sign it into law.

First introduced by Republican Senator Ted Cruz and Democratic Senator Amy Klobuchar in 2024, the Take It Down Act requires tech companies to take quick action against nonconsensual intimate imagery. Platforms must remove such content within 48 hours of a takedown request. The Federal Trade Commission can then sue platforms that do not comply with such requests.

In addition to targeting tech platforms, the Take It Down Act also carves out punishments, including fines and potential jail time, for those who create and share such imagery. The new law makes it a federal crime to publish -- or even threaten to publish -- explicit nonconsensual images, which includes revenge porn and deepfake imagery generated with AI.

Digital rights groups have shared their concerns about the Take It Down Act. Activists have said that the bill could be weaponized to censor legally protected speech, and that legal content could be inaccurately flagged for removal. Despite these concerns, the Take It Down Act received support even from the tech platforms it seeks to police, such as Snapchat and Roblox.

Congress isn't finished addressing AI and deepfakes this year, either. Both the NO FAKES Act of 2025 and the Content Origin Protection and Integrity from Edited and Deepfaked Media Act of 2025 have also been introduced this session. The former seeks to protect individuals from having their voices replicated by AI without their consent, whereas the latter looks to protect original works and require transparency around AI-generated content.
[8]
House passes "Take it Down Act," sending revenge porn bill backed by Melania Trump to president's desk
Washington -- The House passed a bipartisan bill Monday that makes it a federal crime to post real and fake sexually explicit imagery of a person online without their consent, sending the legislation backed by first lady Melania Trump to the president's desk. The bill, known as the "Take It Down Act," cleared the lower chamber in a 409-2 vote; the two "no" votes came from Republicans. The Senate unanimously passed the measure in February. The legislation requires social media companies and other websites to remove images and videos, including deepfakes generated by artificial intelligence, within 48 hours of a victim's request.

"If you're a victim of revenge porn or AI-generated explicit imagery, your life changes forever," Sen. Ted Cruz, a Texas Republican, said at a March 3 roundtable promoting the bill. Cruz, who introduced the bill, recalled the experience of a teenage victim, Elliston Berry, whose classmate used an app to create explicit images of her and then sent them to her classmates. Berry's mother had tried unsuccessfully for months to get Snapchat to remove the images before she contacted Cruz's office for help. "It should not take a sitting senator or sitting member of Congress picking up the phone to get a picture down or video down," Cruz said.

The first lady, who rarely appears in public, attended the March discussion at the U.S. Capitol to advocate for the bill's passage in the House. "It's heartbreaking to witness young teens, especially girls, grappling with the overwhelming challenges posed by malicious online content like deep fakes," she said. "This toxic environment can be severely damaging." The first lady applauded Congress after its passage and said the bipartisan vote made a "powerful statement that we stand united in protecting the dignity, privacy, and safety of our children." "I am thankful to the Members of Congress -- both in the House and Senate -- who voted to protect the well-being of our youth," she said in a statement.

According to the FBI, recent years have seen an alarming number of extortion cases that ended in the victim's suicide. Lawmakers said they hope the bill will save lives by providing recourse for victims. "The mission of this bill is simple, profound and long lasting. It stops cyber abuse. It prevents the bullying of one child against another, and even more importantly, it prevents suicide born out of shame," Republican Rep. Maria Elvira Salazar of Florida, who cosponsored the legislation in the House, said Monday during floor debate.

Meta, which owns Facebook and Instagram, along with TikTok and Snapchat, have all said they support the legislation. Digital rights groups, however, have warned that the legislation as written could lead to the suppression of lawful speech, including legitimate pornography, and does not contain protections against bad-faith takedown requests.
[9]
Congress passes 'revenge porn' ban, sending it to Trump
Washington (AFP) - The US House of Representatives voted almost unanimously Monday to make it a federal crime to post "revenge porn" -- whether real or AI-generated -- sending the bill to President Donald Trump's desk for approval. The Take It Down Act passed in a 409-2 vote and would criminalize the non-consensual publication of intimate images while also mandating their removal from online platforms, Republican Speaker of the House Mike Johnson said.

In March, the president vowed to sign the bill into law during a joint session of Congress. "I look forward to signing that bill into law. Thank you," Trump said. "And I'm going to use that bill for myself too if you don't mind, because nobody gets treated worse than I do online, nobody." House approval of the bill follows its unanimous passage in the Senate in February, an advancement that Johnson called "a critical step in fighting" the growing online problem.

Deepfakes often rely on artificial intelligence and other tools to create realistic-looking fake videos. They can be used to create falsified pornographic images of real women, which are then published without their consent and proliferate. First Lady Melania Trump endorsed the bill in early March and said in a statement Monday that the bipartisan passage "is a powerful statement that we stand united in protecting the dignity, privacy, and safety of our children." Some US states, including California and Florida, have already passed laws criminalizing the publication of sexually explicit deepfakes.

Critics voiced concern that the bill grants authorities increased censorship power. The Electronic Frontier Foundation, a nonprofit focused on free expression, posted a statement Monday saying the new legislation gives "the powerful a dangerous new route to manipulate platforms into removing lawful speech that they simply don't like." "President Trump himself has said that he would use the law to censor his critics," the group added.
[10]
House passes bipartisan bill to combat explicit deepfakes, sending it to Trump to sign into law
The measure would criminalize publishing nonconsensual pornographic images and videos online, including those generated by AI.

The House passed a bipartisan bill Monday aimed at combating deepfake pornography, tackling a sensitive issue that has become a growing problem amid advances in artificial intelligence. President Donald Trump is expected to sign the measure, which sailed through the House in a 409-2 vote, into law. The "Take It Down Act" would criminalize publishing nonconsensual, sexually explicit images and videos -- including those generated by AI -- and require platforms to remove the content within 48 hours of notice. The Senate passed the legislation by unanimous consent earlier this year.

Passing the bill is a rare legislative feat for Congress, which has been notoriously slow to keep up with the pace of technology. The effort attracted broad bipartisan support: Sens. Ted Cruz, R-Texas, and Amy Klobuchar, D-Minn., are the lead sponsors of the bill in the Senate, while first lady Melania Trump has used her platform to help champion it. President Trump touted the bill in a speech before Congress last month. "Today's bipartisan passage of the Take It Down Act is a powerful statement that we stand united in protecting the dignity, privacy, and safety of our children," Melania Trump said in a statement. "I am thankful to the Members of Congress -- both in the House and Senate -- who voted to protect the well-being of our youth. Through this critical legislation and our continued focus with 'Be Best,' we are building a future where every child can thrive and achieve their full potential."

Still, the bill has drawn some opponents. Digital rights groups have raised concerns that the measure as currently drafted could threaten free speech and privacy rights. Rep. Thomas Massie, R-Ky., cast one of the two "no" votes against the bill Monday, calling it "a slippery slope, ripe for abuse, with unintended consequences."

Deepfake pornography and harassment have become more prevalent, including in schools, as AI has grown more advanced. But no federal law explicitly bans its use, making it harder to get images removed or to hold the people who create and distribute them accountable.

One of the inspirations for the legislation was Texas teenager Elliston Berry, who woke up one morning during her freshman year of high school to learn that a male classmate had created a fake nude image of her and circulated it on social media. "I was completely shocked," Berry, who was 14 at the time of the incident, said in an interview. "It wasn't considered child pornography, although it completely is. So we were in a gray zone. No one knew what to do. School didn't know what to do. The local sheriff didn't know what to do." The images were eventually taken down, but Berry and her mother felt there were still glaring holes in the law and found that accountability was difficult to achieve. So they connected with Cruz, one of their home-state senators, to work on a legislative fix aimed at preventing the same thing from happening to other victims. Berry was also Melania Trump's guest for the president's March address. "The last thing I wanted to do was talk about it, but it's been super healing and super encouraging knowing that I'm able to have these opportunities to speak about this and to protect so many people and to be that voice," Berry said.
[11]
Congress passes Take It Down Act to combat illicit deepfakes
The Take It Down Act, legislation that criminalizes the publication of nonconsensual sexually explicit deepfake videos and images, passed the House today and is on its way to President Trump's desk. "The Senate just passed the Take It Down Act," Trump said in March. "Once it passes the House, I look forward to signing that bill into law." He added, "And I'm going to use that bill for myself too if you don't mind because nobody gets treated worse than I do online, nobody."

The vote went 409-2 today, with both no votes coming from Republicans. Social media companies and other websites will now have 48 hours to remove offending content when the person depicted requests it, including images or videos that have been created or enhanced by artificial intelligence.

Senate Commerce Chair Ted Cruz called it a "historic win in the fight to protect victims of revenge porn and deepfake abuse." Cruz believes the act will spare "victims from repeated trauma" while "holding predators accountable." Republican Representative Thomas Massie, who cast one of the no votes, disagrees, believing the law could be abused. He wrote on X today, "I'm voting NO because I feel this is a slippery slope, ripe for abuse, with unintended consequences."

For years, activists and politicians in the U.S. and abroad have fought for laws to address the proliferation of deepfake images shared online, often of a sexual nature. Until now, governments have been slow to counteract the distribution of such content, while social media companies have been accused of taking a lax approach to removal. Now, with so many AI products for creating images and videos, it's feared the proliferation will reach levels previously unimagined.

First Lady Melania Trump, who has supported the legislation from the start, called the act a "powerful statement." "It's heartbreaking to witness young teens, especially girls, grappling with the overwhelming challenges posed by malicious online content like deep fakes," she said. "This toxic environment can be severely damaging."

The Electronic Frontier Foundation (EFF) pointed out that the act could have a chilling effect. Smaller companies worried about legal action may now introduce filters in their products, which could be flawed. The foundation is also concerned that end-to-end encrypted private messaging systems and cloud storage are not exempt, possibly resulting in a loss of privacy. At the same time, the law may encourage bad-faith takedown requests, hampering journalism and satire. "While protecting victims of these heinous privacy invasions is a legitimate goal, good intentions alone are not enough to make good policy," the foundation said in a post. "As currently drafted, the Act mandates a notice-and-takedown system that threatens free expression, user privacy, and due process, without addressing the problem it claims to solve."
[12]
What to know about the 'revenge porn' bill that's headed to Trump's desk for approval
[13]
Take It Down Act, addressing nonconsensual deepfakes and 'revenge porn,' passes. What is it?
[14]
Take It Down Act, Addressing Nonconsensual Deepfakes and 'Revenge Porn,' Passes. What Is It?
[15]
With rare bipartisan support, Congress passes bill to outlaw deepfake pornography
Melania Trump spoke out in favor of legislation that would criminalize the publication of non-consensual deepfake sexual images.

A bill to criminalize AI-generated explicit images, or "deepfakes," is headed to President Donald Trump's desk after sailing through both chambers of Congress with near-unanimous approval. The Take It Down Act has enjoyed uncommon bipartisan support, along with a key endorsement from the first lady. "It's heartbreaking to witness young teens, especially girls, grappling with the overwhelming challenges posed by malicious online content, like deepfakes," Melania Trump said during a rare public appearance on Capitol Hill March 3 to lobby for the legislation.

Deepfakes are photos, videos or audio altered or created by artificial intelligence to appear real, often without the consent of the person depicted. Many of the images are manipulated to put people into compromising situations, showing them appearing inappropriately or placing them somewhere that could spark controversy or embarrassment. The images have become a major cause for concern with the explosion of AI technology.

The newly passed bill will require technology platforms to remove reported "non-consensual, sexually exploitative images" within 48 hours of receiving a valid request. Sens. Ted Cruz, R-Texas, and Amy Klobuchar, D-Minnesota, introduced the legislation in August.

Faked explicit images of pop star Taylor Swift circulated on social media last January, prompting backlash from fans and widespread calls for increased regulation. At the time, USA TODAY was able to identify 10 states outlawing deepfake pornography like the depictions of Swift; there was then no federal equivalent such as the Take It Down Act. Global celebrities are not the only targets of AI-generated attacks: one in eight teens say they personally know someone victimized by explicit deepfakes, according to a March report by Thorn, a nonprofit advocating for online child safety.

One of the victims, high schooler Elliston Berry, has been a vocal advocate for the Take It Down Act alongside the first lady. Berry was 14 when a classmate used AI to photoshop her face onto a naked body and shared the false images on social media. The Aledo, Texas, teenager joined Melania Trump as a special White House guest for the president's annual joint address to Congress March 3, as well as the day before for the first lady's Capitol Hill remarks. "Fear, shock and disgust were just some of the many emotions I felt," Berry, then 15, said at the March 3 event. "I felt responsible and began to blame myself and was ashamed to tell my parents, despite doing nothing wrong."

The Senate passed the Take It Down Act in February with unanimous consent. The House followed suit on April 28, approving it 409-2. President Trump is expected to sign the bill into law. "Today's bipartisan passage of the Take It Down Act is a powerful statement that we stand united in protecting the dignity, privacy, and safety of our children," Melania Trump said in a statement. "I am thankful to the Members of Congress -- both in the House and Senate -- who voted to protect the well-being of our youth."
[16]
Trump Set To Sign Bill That Criminalizes Creation of Deepfake A.I.-Generated Pornography
The legislation was due to be passed back in December, though Elon Musk stripped it out of a government funding deal to have fewer pages. President Trump will soon sign legislation to criminalize the distribution of explicit images and videos created with deceptive editing and artificial intelligence, something which has become more prevalent in recent years. The bill was due to be signed last year before Elon Musk objected to several parts of a government funding deal. The Take It Down Act, once signed, will establish new criminal and civil penalties for those who create altered pornographic images and videos of other people, whether it be through digital editing or by using artificial intelligence. Social media platforms will be required to take down the deepfake photos and videos within 48 hours of their original upload time. Individuals convicted of uploading deepfake pornography depicting adults will face up to two years in prison. Those uploading altered pornography of children face up to three years in prison. The Senate passed the bill by unanimous consent after Senator Cruz was contacted by a constituent whose daughter was a victim of such a scheme. The victim in question, Elliston Berry, became distressed in 2023 when she discovered that a classmate of hers had taken a photo from her Instagram account and put it through an A.I. platform to digitally alter the photo to remove her clothing. That altered photo was then sent to classmates. Ms. Berry was only 14 at the time of the event. First lady Melnia Trump became a champion for the legislation once she returned to the White House in January. On Monday, Mrs. Trump said the bill is key to protecting children online. "Advancing this legislation has been a key focus since I returned to my role as First Lady this past January. I am honored to have contributed to guiding it through Congress. By safeguarding children from hurtful online behavior today, we take a vital step in nurturing our leaders of tomorrow," the first lady said in a statement. Ms. Berry was invited to attend Mr. Trump's joint address to Congress back in March. In a video shared by the White House, Ms. Berry said it was "such an amazing experience" to work with Mrs. Trump and Mr. Cruz to get the bill passed. "She has the heart for my situation and ... she cares," Ms. Berry said of the first lady. Back in 2023, George Washington University law professor Spencer Overton told members of the House Oversight Committee that deepfake, A.I.-generated pornography was a threat to women across the country. He also said that it was growing more popular by the day. "Deepfake pornography accounts for 98 percent of deepfake videos online, and 99 percent of all deepfake porn features women while only one percent features men," Mr. Overton told lawmakers. "The total number of deepfake porn videos produced in 2023 increased 464 percent from 2022 -- to 21,019 from 3,725 -- and in 2023 the monthly traffic of the leading ten dedicated deepfake porn websites reached over 34 million." The legislation was killed in 2024 thanks to one of Mr. Trump's own senior advisors, however. The Take It Down Act was included in a 1,500-page government funding deal just before Christmas last year, though Mr. Musk was quick to object to the bill because he felt the package itself simply had too many pages. Just hours after Mr. Musk called on the Republican-led House to scrap the funding deal, Mr. Trump and Vice President Vance came out against that legislation and made demands that the bill be pared down. 
Congressional leaders then made the decision to strip the Take It Down Act from their broader agreement so that the funding deal would have fewer pages, soothing Mr. Musk's concerns. "Yesterday's bill vs today's bill," Mr. Musk wrote on X, comparing the 1,500-page bill which included the anti-deepfake pornography legislation to the slimmer government funding deal that he helped design. "Ever seen a bigger piece of pork?" Mr. Musk asked in another post, including a photo of the printed legislation. Only two House members voted against the Take It Down Act. Congressman Thomas Massie, who voted no on Monday, said he thought the new criminal and civil penalties could be used beyond their original intent. "I feel this is a slippery slope, ripe for abuse, with unintended consequences," Mr. Massie wrote on X.
The US Congress has passed the Take It Down Act, a bipartisan bill aimed at criminalizing the distribution of non-consensual intimate imagery, including AI-generated deepfakes. The bill requires online platforms to remove such content within 48 hours of notification.
In a rare display of bipartisanship, the US Congress has overwhelmingly passed the Take It Down Act, a bill aimed at addressing the growing problem of non-consensual intimate imagery (NCII), including AI-generated deepfakes. The legislation, which passed the House with a 409-2 vote and the Senate unanimously, is now headed to President Trump's desk for his signature [1][2].
The bill introduces several significant measures:
- It makes it a federal crime to "knowingly publish," or threaten to publish, non-consensual intimate imagery, including AI-generated deepfakes depicting identifiable, real people [2][3].
- Online platforms must remove reported NCII within 48 hours of a valid request and take steps to delete duplicate content [2].
- Violators face criminal penalties, including fines and potential jail time [7].
The Take It Down Act has garnered strong bipartisan support, with Senators Ted Cruz (R-TX) and Amy Klobuchar (D-MN) as its primary sponsors [2]. First Lady Melania Trump has also championed the bill as part of her "Be Best" anti-cyberbullying campaign [1][5].
The legislation was inspired by real-life cases, including that of Elliston Berry, a 14-year-old girl from Texas whose male classmate created and posted a deepfake nude image of her on social media [1][5].
Several major tech companies, including Meta (Facebook and Instagram), Google, and Snap, have expressed support for the legislation [5]. Meta spokesperson Andy Stone stated, "Having an intimate image - real or AI-generated - shared without consent can be devastating and Meta developed and backs many efforts to help prevent it" [2].
Despite widespread support, the bill has faced criticism from digital rights advocates and free speech proponents:
- The Electronic Frontier Foundation warns that the takedown provision reaches a far broader category of content than the bill's own definition of NCII, lacks safeguards against frivolous or bad-faith requests, and leaves platforms only 48 hours, rarely enough time to verify whether reported content is actually illegal [2].
- The Cyber Civil Rights Initiative calls the takedown provision vague and overbroad, and warns that selective FTC enforcement could leave favored platforms free to ignore victims' reports [1].
- Critics also caution that platforms may weaken or abandon end-to-end encryption in order to monitor flagged content in private messages [1].
The passage of the Take It Down Act is expected to have significant implications for online platforms and users:
- Covered platforms must establish notice-and-takedown processes, remove reported NCII within 48 hours, and delete duplicate content [2].
- The Federal Trade Commission will be able to sue platforms that fail to comply [7].
- Smaller platforms that lack moderation resources may simply depublish reported content without verifying it, raising over-removal concerns [2].
As the bill awaits President Trump's signature, it represents a significant step in addressing the growing issue of non-consensual intimate imagery and deepfakes. However, its implementation and potential unintended consequences remain subjects of ongoing debate in the tech and legal communities [4][5].
First Lady Melania Trump returns to public advocacy, championing legislation against non-consensual intimate imagery and AI-generated deepfakes. The bipartisan 'Take It Down Act' aims to criminalize revenge porn and protect victims, especially young people, from online exploitation.
5 Sources
A new study reveals that 1 in 6 congresswomen have been victims of AI-generated sexually explicit deepfakes, highlighting the urgent need for legislative action to combat this growing threat.
6 Sources
The U.S. Senate has unanimously passed the DEFIANCE Act, a bipartisan bill aimed at empowering victims of non-consensual deepfake pornography. The legislation allows victims to sue creators and distributors of such content.
4 Sources
Governor Gavin Newsom signs bills closing legal loopholes and criminalizing AI-generated child sexual abuse material, positioning California as a leader in AI regulation.
7 Sources
Minnesota legislators are considering a pioneering bill to block AI-powered 'nudification' apps and websites, aiming to prevent the creation of non-consensual explicit images before they can spread online.
5 Sources