Curated by THEOUTPOST
On Thu, 25 Jul, 12:02 AM UTC
4 Sources
[1]
Defiance Act passes in the Senate, potentially allowing deepfake victims to sue over non-consensual images
The Defiance Act would give victims the ability to sue anyone who creates, shares or receives nonconsensual sexually explicit deepfakes depicting them.

A federal bill that would allow victims of nonconsensual sexually explicit deepfakes to sue people who create, share and receive them has unanimously passed the Senate and now moves to the House for a vote. The Disrupt Explicit Forged Images and Non-Consensual Edits (Defiance) Act of 2024, introduced by Senate Judiciary Chair Dick Durbin, D-Ill., and Sen. Lindsey Graham, R-S.C., would create a federal civil remedy for identifiable victims of deepfake sexual abuse.

Rep. Alexandria Ocasio-Cortez, D-N.Y., who is sponsoring the legislation in the House, called the act the first federal protection for "survivors of nonconsensual deepfake pornography." "Over 90% of all deepfake videos made are nonconsensual sexually explicit images, and women are the targets 9 times out of 10," Ocasio-Cortez said in a statement after the Senate passed the bill. She has previously spoken out on being targeted with such deepfakes herself. The House version of the bill is still being considered in committee.

Deepfakes typically refer to digitally manipulated images that falsely depict someone saying or doing something. They often take the form of sexually explicit photos and videos spread online. The material frequently merges a victim's face with a body in a pornographic video. Generative artificial intelligence models can also create audio, videos and images that are entirely fake but look and sound realistic.

The Defiance Act defines a nonconsensual sexually explicit deepfake as a "visual depiction created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means to falsely appear to be authentic" that "depict[s] the victim in the nude, or engaged in sexually-explicit conduct or sexual scenarios."
The civil remedy would be enforceable against people who create or possess the deepfake with intent to distribute it, as well as people who distribute and receive it if they knew or recklessly disregarded that the victim did not consent to the deepfake. The statute of limitations for the remedy is 10 years.

The production of nonconsensual sexually explicit deepfakes has skyrocketed since 2023, first becoming popular with the likenesses of female public figures like influencers, politicians and celebrities. Cases have also sprung up at middle and high schools around the world, with teen girls frequently being victimized by their male classmates. While the trend has overwhelmingly targeted women and girls, men have also been depicted in deepfakes.

Following a high-profile incident with sexually suggestive AI-generated images of singer Taylor Swift going viral on X, formerly Twitter, multiple representatives introduced state and federal legislation to combat the issue. Durbin has been particularly vocal on the issue, sending a letter to the CEO of Google's parent company Alphabet, Sundar Pichai, in June asking for details on how the search engine planned to address the proliferation of deepfakes in search results.

"Current laws don't apply to deepfakes, leaving women and girls who suffer from this image-based sexual abuse without a legal remedy," Durbin posted on X after the Defiance Act passed the Senate. "It's time to give victims their day in court and the tools they need to fight back."
[2]
The US Senate unanimously passes a bill to empower victims of intimate deepfakes
If signed into law, it would give victims a 10-year window to sue for damages.

The US Senate unanimously passed a bill on Tuesday designed to hold accountable those who make or share deepfake porn. The Disrupt Explicit Forged Images and Non-Consensual Edits Act (DEFIANCE Act) would allow victims to sue those who create, share or possess AI-generated sexual images or videos using their likeness. The issue took root in the public consciousness after the infamous Taylor Swift deepfakes that circulated among online lowlifes early this year.

The bill would let victims sue for up to $150,000 in damages. That number grows to $250,000 if it's related to attempted sexual assault, stalking or harassment. It now moves to the House, where a companion bill awaits. Rep. Alexandria Ocasio-Cortez (D-NY) sponsors the sister bill. If it passes there (which sounds likely, given the unanimous nature of the Senate's vote), it will move to President Biden's desk for his signature.

"There's a shock to seeing images of yourself that someone could think are real," Ocasio-Cortez told Rolling Stone earlier this year. "And once you've seen it, you've seen it. It parallels the same exact intention of physical rape and sexual assault, [which] is about power, domination, and humiliation. Deepfakes are absolutely a way of digitizing violent humiliation against other people."

The bill, sponsored by Senator Dick Durbin (D-IL), lets victims of intimate digital forgeries (deepfakes) sue for damages. It would give victims a 10-year statute of limitations, beginning either from the discovery of the content or when they turn 18 in the (even more disturbing) case of minors.

"As we know, AI plays a bigger role in our lives than ever before, and while it has many benefits, it's also easier than ever to create sexually explicit deep fakes without a person's consent," Senate Majority Leader Chuck Schumer (D-NY) said on the Senate floor late Tuesday.
"It is a horrible attack on someone's privacy and dignity to have these fake images of them circulating online without recourse." Schumer cited Swift and Megan Thee Stallion in his floor speech as two celebrity examples who have fallen victim to the types of content the bill targets. However, The Verge notes online sexual deepfakes have affected those with much less clout (and money for lawyers) than A-list pop stars, like high school girls, some of whom have found out about contrived sexual images of them being passed around among their peers. Fortunately, the bill stipulates that victims would have privacy protections during court proceedings and that they could recover legal costs. "It's a grotesque practice and victims of these deep fakes deserve justice," Schumer said.
[3]
The Senate passed a bill cracking down on sexually explicit deepfakes
The Senate unanimously passed a bill on Tuesday letting victims of non-consensual intimate images created by AI -- or "deepfakes" -- sue their creators for damages. The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act lets victims of sexually explicit deepfakes pursue civil remedies against those who produced or possessed the image with the intent to distribute it. Victims who are identifiable in these kinds of deepfakes can receive up to $150,000 in damages under the bill, and up to $250,000 if the incident was connected to "actual or attempted sexual assault, stalking, or harassment" or "the direct and proximate cause" of those harms. It's now up to the House to take up the bill before it can be moved to the president's desk to be signed into law.
[4]
AOC's Deepfake AI Porn Bill Unanimously Passes the Senate
The Senate unanimously passed a bipartisan bill to provide recourse to victims of porn deepfakes -- or sexually explicit, non-consensual images created with artificial intelligence. The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress' upper chamber on Tuesday. It has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), as well as Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive deepfake pornography, if they "knew or recklessly disregarded" the fact that the victim did not consent to those images.

"Current laws don't apply to deepfakes, leaving women and girls who suffer from this image-based sexual abuse without a legal remedy," Durbin posted on X after the bill's passage. "It's time to give victims their day in court and the tools they need to fight back. I urge my House colleagues to pass this bill expediently."

Senate Majority Leader Chuck Schumer (D-N.Y.) praised the bill's passage, commending Durbin for his work. "This isn't just some fringe issue that happens to only a few people -- it's a widespread problem," said Schumer. "These types of malicious and hurtful pictures can destroy lives. Nobody is immune, not even celebrities like Taylor Swift or Megan Thee Stallion. It's a grotesque practice and victims of these deepfakes deserve justice."

Ocasio-Cortez, the progressive New York lawmaker, first announced she was co-leading the bicameral legislation in an interview with Rolling Stone. She's had personal experience with this specific type of abuse, and discussed the trauma it can cause. "There's a shock to seeing images of yourself that someone could think are real," she told us in March. "And once you've seen it, you've seen it.
It parallels the same exact intention of physical rape and sexual assault, [which] is about power, domination, and humiliation. Deepfakes are absolutely a way of digitizing violent humiliation against other people."

In a press release following the bill's passage in the Senate, Ocasio-Cortez said it "marks an important step in the fight to protect survivors of non-consensual deepfake pornography," adding: "I'm committed to collaborating with colleagues from both sides of the aisle to shepherd the bill through the House of Representatives to get it to the president's desk. Together, we can give survivors the justice they deserve."

When the bill first came up for a unanimous vote in the Senate in June, it was blocked by Sen. Cynthia Lummis (R-Wyo.). The most recent version of the bill now includes a "findings" section, refines the definition of "digital forgery," and updates the available damages to ensure victims receive appropriate compensation, along with some other clarifications.

The "findings" section discusses facts previously reported on by Rolling Stone: Technology like generative AI has made it easier for people to quickly generate digital forgeries without technological expertise; victims of this abuse can potentially experience depression, anxiety, and suicidal ideation; the harms of this abuse are not mitigated through labels depicting the image as fake; and victims do not know how to prevent future abuse.

"I had no idea if the person who did it was near me location-wise, [or] if they were going to do anything else to me," one survivor told Rolling Stone in our April print issue. "I had no idea if anyone who saw that video was going to try to find me. I was very physically vulnerable at that point."

The Senate introduced the DEFIANCE Act on Jan. 30, about a week after several AI-generated, sexually explicit deepfakes of Taylor Swift went viral on X.
If the bill passes the House, it would become the first federal law to create a civil cause of action for deepfakes. There are currently a few federal statutes that can be used for criminal prosecution of deepfakes involving minors, but DEFIANCE would allow both adults and minors to sue. Past legislative efforts by Rep. Yvette Clarke (D-N.Y.) and Rep. Joe Morelle (D-N.Y.) to rein in deepfakes were not successful. Those bills involved criminal penalties, while DEFIANCE focuses on civil legal recourse.

"It's so important to me that people understand that this is not just a form of interpersonal violence, it's not just about the harm that's done to the victim," Ocasio-Cortez previously told Rolling Stone. "Because this technology threatens to do it at scale -- this is about class subjugation. It's a subjugation of entire people. And then when you do intersect that with abortion, when you do intersect that with debates over bodily autonomy, when you are able to actively subjugate all women in society on a scale of millions, at once digitally, it's a direct connection [with] taking their rights away."
The U.S. Senate has unanimously passed the DEFIANCE Act, a bipartisan bill aimed at empowering victims of non-consensual deepfake pornography. The legislation allows victims to sue creators and distributors of such content.
In a rare display of bipartisan unity, the U.S. Senate has unanimously passed the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act. This landmark legislation aims to combat the growing threat of non-consensual deepfake pornography by providing victims with legal recourse against perpetrators [1].
Deepfakes are AI-generated or manipulated media that can realistically depict individuals saying or doing things they never actually said or did. While this technology has various applications, it has increasingly been misused to create non-consensual intimate content, raising serious concerns about privacy and consent [2].
The DEFIANCE Act empowers victims by allowing them to sue both the creators and distributors of non-consensual deepfake pornography. This includes individuals who share such content without the subject's permission. The legislation also enables victims to seek monetary damages and injunctive relief, potentially forcing the removal of the offending content from the internet [3].
The bill, introduced by Senators Dick Durbin and Lindsey Graham, garnered support from both sides of the aisle. Representative Alexandria Ocasio-Cortez, who has been a vocal advocate for addressing this issue, praised the Senate's action. Ocasio-Cortez herself has been a target of deepfake pornography, highlighting the personal nature of this legislative effort [4].
While the DEFIANCE Act represents a significant step forward in protecting individuals from non-consensual deepfake pornography, experts note that challenges remain. Identifying creators and distributors of such content can be difficult, and the global nature of the internet may complicate enforcement efforts [1].
With the Senate's approval secured, the DEFIANCE Act now moves to the House of Representatives for consideration. If passed by the House and signed into law by the President, this legislation could provide a powerful tool for victims and potentially deter the creation and distribution of non-consensual deepfake pornography [2].