Curated by THEOUTPOST
On July 27, 2024
4 Sources
[1]
Justice Department defends group's right to sue over AI robocalls sent to New Hampshire voters
CONCORD, N.H. (AP) -- The federal Justice Department is defending the legal right to challenge robocalls sent to New Hampshire voters that used artificial intelligence to mimic President Joe Biden's voice. Assistant Attorney General Kristen Clarke and U.S. Attorney Jane Young filed a statement of interest Thursday in the lawsuit brought by the League of Women Voters against Steve Kramer -- the political consultant behind the calls -- and the three companies involved in transmitting them. Kramer, who is facing separate criminal charges related to the calls, has yet to respond to the lawsuit filed in March, but the companies filed a motion to dismiss last month. Among other arguments, they said robocalls don't violate the section of the Voting Rights Act that prohibits attempting to or actually intimidating, threatening or coercing voters and that there is no private right of action under the law. The Justice Department countered that the law clearly allows aggrieved individuals and organizations representing them to enforce their rights under the law. And it said the companies were incorrect in arguing that the law doesn't apply to robocalls because they are merely "deceptive" and not intimidating, threatening or coercive. "Robocalls in particular can violate voting rights by incentivizing voters to remain away from the polls, deceive voters into believing false information and provoke fear among the targeted individuals," Young said in a statement. "The U.S. Attorney's Office commends any private citizen willing to stand up against these aggressive tactics and exercise their rights to participate in the enforcement process for the Voting Rights Act." At issue is a message sent to thousands of New Hampshire voters on Jan. 21 that featured a voice similar to Biden's falsely suggesting that voting in the state's first-in-the-nation presidential primary two days later would preclude them from casting ballots in November. 
Kramer, who paid a magician and self-described "digital nomad" who does technology consulting $150 to create the recording, has said he orchestrated the call to publicize the potential dangers of AI and spur action from lawmakers. He faces 26 criminal charges in New Hampshire, along with a proposed $6 million fine from the Federal Communications Commission, which has taken multiple steps in recent months to combat the growing use of AI tools in political communications. On Thursday, it advanced a proposal that would require political advertisers to disclose their use of artificial intelligence in broadcast television and radio ads, though it is unclear whether new regulations may be in place before the November presidential election.
The U.S. Justice Department has defended a voting rights group's right to sue over AI-generated robocalls that discouraged voting in New Hampshire's primary. The case highlights growing concerns about AI's potential misuse in elections.
In a significant development surrounding the use of artificial intelligence in political campaigns, the U.S. Justice Department has thrown its weight behind a lawsuit concerning AI-generated robocalls in New Hampshire. These calls, which mimicked President Joe Biden's voice, attempted to discourage voters from participating in the state's first-in-the-nation presidential primary, held two days after the calls went out on Jan. 21 [1].
The Department of Justice has filed a "statement of interest" in the federal court case, asserting that the League of Women Voters has the legal standing to sue over these deceptive robocalls [2]. This move underscores the government's commitment to protecting voting rights and combating election interference.
The League of Women Voters filed the lawsuit in March against Steve Kramer, the political consultant behind the calls, and the three companies involved in transmitting them, alleging violations of the Voting Rights Act. The companies have moved to dismiss the case, arguing that the robocalls were merely "deceptive" rather than intimidating, threatening or coercive, and that the law provides no private right of action [3].
However, the Justice Department contends that the law clearly allows aggrieved individuals, and organizations representing them, to enforce their rights. It argues that robocalls can violate voting rights by incentivizing voters to stay away from the polls, deceiving them with false information, and provoking fear among targeted individuals. This position strengthens the plaintiffs' case and highlights the broader harm such deceptive practices can inflict on civic organizations and the democratic process.
The robocalls in question used an AI-generated voice resembling President Biden's to falsely suggest that voting in the primary would preclude voters from casting ballots in the November general election. This message was both false and potentially harmful to the democratic process, as it could have discouraged voter participation in the primary [4].
This case has brought to the forefront growing concerns about the potential misuse of AI technology in elections. As AI becomes more sophisticated, there are fears that it could be used to create increasingly convincing and widespread misinformation campaigns, potentially influencing election outcomes.
The Justice Department's involvement in this case signals a growing recognition of the need to address AI-related election interference. It may pave the way for more robust legal frameworks and regulations surrounding the use of AI in political contexts, especially as the U.S. approaches the 2024 presidential election.
References
[1]–[3] Associated Press, "Justice Department defends group's right to sue over AI robocalls sent to New Hampshire voters"
[4] U.S. News & World Report, "Justice Department Defends Group's Right to Sue Over AI Robocalls Sent to New Hampshire Voters"