Curated by THEOUTPOST
On Thu, 24 Apr, 8:03 AM UTC
9 Sources
[1]
Musk's X sues to block Minnesota 'deepfake' law over free speech concerns
WILMINGTON, Delaware, April 23 (Reuters) - Elon Musk's social media platform X sued Minnesota on Wednesday over a state law that bans people from using AI-generated "deepfakes" to influence an election, which the company said violates free speech protections.

The law replaces social media platforms' judgment about content with the judgment of the state and threatens criminal liability if the platforms get it wrong, according to the lawsuit, which was filed in Minnesota federal court. "This system will inevitably result in the censorship of wide swaths of valuable political speech and commentary," X said in its complaint.

Musk has described himself as a free speech absolutist, and he did away with Twitter's content moderation policy when he bought the company in 2022 and renamed it X. Minnesota Attorney General Keith Ellison, the named defendant, did not immediately respond to a request for comment.

Minnesota's law bans the use of deepfakes - videos, pictures or audio clips made with AI to look real - to influence an election. At least 22 states have enacted some form of prohibition on the use of deepfakes in elections, according to data compiled by Public Citizen, which says that AI can be used to manipulate voters.

X asked the federal judge to declare that the law violates the First Amendment of the U.S. Constitution and the Minnesota Constitution, and that it is impermissibly vague. It also wants the judge to find that the law is preempted by Section 230, a federal law that protects social media companies from being held liable for content posted by users. The company seeks a permanent injunction preventing the law from being enforced.

The Minnesota law has already been challenged on similar grounds by Republican state lawmaker Mary Franson and social media influencer Christopher Kohls. In January, U.S. District Judge Laura Provinzino rejected their bid for a preliminary injunction to block the law; they have appealed that ruling.
Provinzino's ruling did not address the merits of the lawsuit. Reporting by Tom Hals in Wilmington, Delaware. Editing by Marguerita Choy.
[2]
Elon Musk's X sues to overturn Minnesota political deepfakes ban
MINNEAPOLIS (AP) -- X Corp., the social media platform owned by Trump adviser Elon Musk, is challenging the constitutionality of a Minnesota ban on using deepfakes to influence elections and harm candidates, saying it violates First Amendment speech protections. The company's federal lawsuit filed this week also contends that the 2023 state law is preempted by a 1996 federal statute that shields social media platforms from being held responsible for material their users post.

"While the law's reference to banning 'deep fakes' might sound benign, in reality it would criminalize innocuous, election-related speech, including humor, and make social-media platforms criminally liable for not censoring such speech," the company said in a statement. "Instead of defending democracy, this law would erode it."

Minnesota's law imposes criminal penalties -- including jail time -- for disseminating a deepfake video, image or audio clip if a person knows it's fake, or acts with reckless disregard for its authenticity, either within 90 days before a party nominating convention or after the start of early voting in a primary or general election. It says the intent must be to injure a candidate or influence an election result. And it defines deepfakes as material so realistic that a reasonable person would believe it's real, and generated by artificial intelligence or other technical means.

"Elon Musk funneled hundreds of millions of dollars into the 2024 presidential election and tried to buy a Wisconsin Supreme Court seat," said the law's author, Democratic state Sen. Erin Maye Quade. "Of course he is upset that Minnesota law prevents him from spreading deepfakes that [are] meant to harm candidates and influence elections. Minnesota's law is clear and precise, while this lawsuit is petty, misguided and a waste of the Attorney General Office's time and resources," her statement said.
Democratic Minnesota Attorney General Keith Ellison's office, which is legally obligated to defend the constitutionality of state laws in court, said in a statement that it's "reviewing the lawsuit and will respond in the appropriate time and manner."

The Minnesota law was already the subject of a constitutional challenge by Christopher Kohls, a content creator, and GOP state Rep. Mary Franson, who likes to post AI-generated parodies of politicians. That case is on hold while they appeal a judge's denial of their request to suspend the law. The attorney general's office argues in that case that deepfakes are a real and growing threat to free elections and democratic institutions, that the law is a legitimate and constitutional response to the problem, and that it contains important limitations on its scope that protect satire and parody.

X, formerly known as Twitter, said it's the only social media platform challenging the Minnesota law, and that it has also challenged other laws it considers infringements of free speech, such as a 2024 California political deepfakes law that a judge has blocked. X said in its statement that its "Community Notes" feature allows users to flag content they consider problematic, and that it's been adopted by Facebook, YouTube and TikTok. The company's lawsuit said its "Authenticity Policy" and "Grok AI" tool provide additional safeguards.

Alan Rozenshtein, a University of Minnesota law professor and expert on technology law, said in an interview Friday that it's important to separate the free-speech issues from whatever one thinks of the controversial Musk. "I'm almost positive that this will be struck down," Rozenshtein said. There's no exception under the First Amendment for false or misleading political speech, even lies, he said. And the potential for criminal penalties gives social media companies like X and Facebook "an incentive to take down anything that might be a deepfake. ... 
You're going to censor a massive amount to comply with this law." Deepfakes aren't good, but it would be nice to have evidence that they're causing actual problems before imposing such limits on free speech, the professor said. And while it's easy to focus on the supply of misinformation, the larger problem is the demand for it. "People want to be fooled, and it's very bad for our democracy, but it's not something I think can be solved with a deepfakes ban," he said.
[6]
Musk's X sues over Minnesota deepfake law
Elon Musk's social media platform X filed a suit Wednesday over a Minnesota law prohibiting the sharing of "deepfake" videos to influence an election, alleging it violates free speech. In the complaint, filed in federal court in Minnesota, X argues the law will "lead to blanket censorship, including of fully protected, core political speech."

The law, passed in 2023, defines a "deepfake" as a video, audio recording or photo created with artificial intelligence (AI) tools to "realistically impersonate" a person without their permission or knowledge. Anyone who widely shares a deepfake within 90 days of an election could face criminal action.

X argues that by threatening criminal action against social media platforms, the law makes platforms more likely to err on the side of removing content, even when it is a "close call" whether the content is a deepfake. "Under this enforcement system, platforms that keep up content presenting a close call under the statute run the risk of criminal penalties, but there is no penalty for erring on the side of too much censorship," the complaint stated. "This system will inevitably result in the censorship of wide swaths of valuable political speech and commentary and will limit the type of 'uninhibited, robust, and wide-open' 'debate on public issues' that core First Amendment protections are designed to ensure," it continued.

The suit urges a federal judge to rule that the law violates the First Amendment of the U.S. Constitution, along with the Minnesota Constitution, and seeks a permanent injunction to block enforcement of the law. The Hill reached out to Minnesota Attorney General Keith Ellison, the listed defendant, for comment.

Minnesota is one of more than two dozen states to pass legislation regulating deepfakes in elections, according to a tracker from Public Citizen, a progressive consumer rights watchdog nonprofit.
Lawmakers have increasingly raised concerns that deepfake technology risks spreading disinformation and manipulating voters during an election. Musk, who purchased X, then known as Twitter, in 2022, has touted himself as a champion of free speech. Shortly after the purchase, he pulled back a number of content moderation policies on X, defending the move as a protection of free speech.
[9]
X Fights Minnesota's Deepfake Law, Citing Free Speech Violations
Adding to its list of ongoing lawsuits, social media company X (formerly Twitter) has filed a legal challenge against an anti-deepfake law in the US state of Minnesota. The law in question regulates the use of deepfakes during elections. X argues that this law violates its rights and its users' rights under the First Amendment of the U.S. Constitution (which includes the right to free speech). It also claims that the law violates Section 230 of the U.S. Communications Decency Act, which protects platforms from liability for what their users post (safe harbor).

"While the law's reference to banning 'deepfakes' might sound benign, in reality it would criminalize innocuous, election-related speech, including humor, and make social media platforms criminally liable for not censoring such speech," the company said in a blog post explaining its concerns with the law.

Under the law, anyone who disseminates a deepfake, or enters into a contract to disseminate one, is in violation provided that they know the material is fake or act with reckless disregard for its authenticity, the dissemination occurs in the period before an election, and the intent is to injure a candidate or influence an election result. The law states that people who post deepfakes during the election period can be sentenced to imprisonment for up to five years or fined up to US$3,000 if they have one or more prior convictions. In other cases, non-compliance can result in a prison sentence of up to 90 days or a fine of US$1,000.

This isn't the first time X has filed a lawsuit challenging such a law. In November last year, the company challenged a similar law regulating deepfakes during elections in the U.S. state of California. As in the current case, X's key opposition to the law lies in its implications for free speech and safe harbor. Similarly, in India, the company is legally challenging the government's use of Section 79(3)(b) of the Information Technology Act, 2000, to issue content takedown requests to social media platforms. Section 79 deals with safe harbor, or protection from liability for user-generated content, similar to Section 230 of the U.S. Communications Decency Act. 
Part (3)(b) of Section 79 states that platforms can lose this protection if they fail to remove unlawful content after the government or its agencies notify them about said content.

In June last year, a U.S. congressional subcommittee conducted a hearing to discuss a proposal to "sunset" safe harbor protections for platforms under Section 230. During the hearing, Congresswoman Cathy McMorris Rodgers said that U.S. courts have expanded the scope of safe harbor, making it harder to hold platforms responsible when they amplify illegal content. "As more and more companies integrate generative artificial intelligence technologies into their platforms, these harms will only get worse, and AI will redefine what it means to be a publisher, potentially creating new legal challenges for companies," she argued.

In the Indian context, the question of safe harbor for AI-generated content came up in 2023, when the government sent an advisory to social media platforms warning that if they failed to take down deepfake content, they could lose safe harbor protections. While governments, whether in the U.S. or India, may have reasons to act against deepfakes, X's arguments stress that people use deepfakes not only for harmful purposes but also for legitimate content, such as social commentary using AI. The company has reportedly presented the same perspective to the central committee currently conducting consultations on deepfake regulation.
X Corp., owned by Elon Musk, has filed a lawsuit against Minnesota's ban on using AI-generated deepfakes in elections, citing First Amendment concerns and potential censorship of political speech.
Elon Musk's social media platform X has filed a federal lawsuit against the state of Minnesota, challenging a law that prohibits the use of AI-generated deepfakes to influence elections 1. The company argues that the law violates First Amendment protections and could lead to widespread censorship of political speech.
The Minnesota law, enacted in 2023, imposes criminal penalties, including jail time, for knowingly disseminating deepfake videos, images, or audio within 90 days before a party nominating convention or during early voting periods 2. It defines deepfakes as AI-generated content so realistic that a reasonable person would believe it's authentic.
X Corp. contends that the law violates the First Amendment and the Minnesota Constitution, is impermissibly vague, is preempted by Section 230 of the federal Communications Decency Act, and replaces platforms' own judgment about content with the judgment of the state.
The company argues that the law would "criminalize innocuous, election-related speech, including humor" and make social media platforms criminally liable for censoring such content 4.
At least 22 states have enacted similar prohibitions on deepfakes in elections 5. However, legal experts like Alan Rozenshtein, a University of Minnesota law professor, believe the Minnesota law is likely to be struck down. Rozenshtein argues that there's no First Amendment exception for false or misleading political speech, and the law could lead to excessive censorship by social media companies 2.
The Minnesota law has already faced a constitutional challenge from content creator Christopher Kohls and GOP state Rep. Mary Franson. Their case is currently on hold pending an appeal 3.
Democratic state Sen. Erin Maye Quade, the law's author, criticized Musk's lawsuit, stating, "Of course he is upset that Minnesota law prevents him from spreading deepfakes that [are] meant to harm candidates and influence elections" 4.
X Corp. argues that its existing features, such as "Community Notes," "Authenticity Policy," and "Grok AI" tool, provide sufficient safeguards against misinformation 3. The company has also challenged similar laws in other states, including a 2024 California political deepfakes law that has been temporarily blocked by a judge 2.