Curated by THEOUTPOST
On Fri, 1 Nov, 12:07 AM UTC
3 Sources
[1]
Pastors and secret codes: US election officials wage low-tech battle against AI robocalls
(Reuters) - While fake videos of Democratic candidate Kamala Harris spread across social media but fail to capture much interest, state officials are girding for what they consider a far more dangerous deception days before the U.S. presidential election - deepfake robocalls.
Officials in states from Arizona to Vermont are preparing for fake audio messages piped directly to home and mobile phones and out of public view, a concern exacerbated by rapidly advancing generative AI technology. And unlike AI-generated photos and videos, which often have small, telltale signs of manipulation such as an extra finger on a person's hand, it is more difficult for the average voter to spot a fake phone call, experts said.
Ahead of the Nov. 5 election that pits Harris against Republican Donald Trump, election officials are on alert given early examples of such calls. In January, a robocall impersonating U.S. President Joe Biden circulated in New Hampshire, urging Democrats to stay home during the primary and "save your vote for the November election." The political consultant behind the robocall was fined $6 million in September.
"We've already seen examples of audio deepfakes. It's not something that is this imaginary technology. It's here," said Colorado Secretary of State Jena Griswold.
Audio is most concerning because it is difficult to track and verify, said Amy Cohen, executive director of the National Association of State Election Directors, a nonpartisan professional organization for election directors. "Even without AI, every election official spends hours chasing their tails because of robocalls," she said.
That's because investigating robocalls - automated calls delivering a recorded message - depends on people hearing the call correctly, recognizing the call is fake and then reporting it to authorities. Rarely do election officials receive a recording of the robocall, Cohen added.
To prepare, election directors have considered potential scenarios in training sessions and discussions throughout the year, according to interviews with officials from six states.
'A WAKE-UP CALL'
To arm themselves, officials are using decidedly old-school strategies. In Colorado, election officials have considered how to react if they themselves are targeted with deepfake calls. For example, what should officials do if they receive a call with a voice that sounds like Griswold's, instructing them to alter voting hours at polling locations? Griswold says she has instructed officials to hang up and call her office if they suspect anything out of the ordinary.
"The issue with AI technology is that we literally need to train ourselves to not believe our eyes and ears," she said.
Another tactic is more commonly seen in spy novels - election officials can agree on a secret code word with their colleagues as an added measure to verify identities over the phone, Cohen said.
State officials say they are particularly worried about false information spreading just days before the vote, leaving them with little time to respond. In addition to working with media, Minnesota Secretary of State Steve Simon said his office would enlist local and religious leaders trusted by their communities to help debunk false information quickly. In a rural state like Maine, Secretary of State Shenna Bellows said even something as simple as posting signs in towns and at fire stations would help amplify important news.
As misleading content runs rampant on social media, the Illinois State Board of Elections in August began running its first ad campaign warning of election disinformation. The ads aired on about 37 television and 270 radio stations.
When thousands of New Hampshire residents received the purported call in January from "Biden" urging them not to vote, Secretary of State David Scanlan said his office sprang into action. The state attorney general and law enforcement officials issued a statement about the fake call, prompting coverage on local radio and television. In the event of another robocall, "I think we'd react the same way," Scanlan said. "We used all the resources that were available to us."
And while there was no indication that the fake Biden call swayed any voters, the incident showed that officials need to be prepared for new risks emerging from the advent of AI. "The robocall was a wake-up call to the country," he said.
(Reporting by Sheila Dang in Austin, editing by Deepa Babington)
As the US presidential election approaches, state officials are preparing to counter the threat of AI-generated deepfake robocalls using old-school tactics and community engagement to protect the integrity of the electoral process.
As the November 5th US presidential election approaches, state officials are gearing up to combat a novel threat: AI-generated deepfake robocalls. While fake videos of Democratic candidate Kamala Harris have failed to gain traction, election officials from Arizona to Vermont are more concerned about audio deepfakes delivered directly to voters' phones [1][2][3].
Unlike AI-generated photos and videos, which often contain visible artifacts, audio deepfakes are particularly challenging for the average voter to detect. Amy Cohen, executive director of the National Association of State Election Directors, says audio is the biggest concern because it is difficult to track and verify [1][2][3].
Colorado Secretary of State Jena Griswold states, "We've already seen examples of audio deepfakes. It's not something that is this imaginary technology. It's here" [1][2][3]. This sentiment underscores the immediacy of the threat and the need for proactive measures.
In response to this emerging threat, election officials are adopting surprisingly old-school tactics:
Secret Code Words: Officials are considering agreeing on secret code words to verify identities over the phone, a tactic reminiscent of spy novels [1][2][3].
Community Engagement: Minnesota Secretary of State Steve Simon plans to enlist local and religious leaders to help debunk false information quickly [1][2][3].
Traditional Media: In rural areas like Maine, Secretary of State Shenna Bellows suggests using simple methods like posting signs in towns and at fire stations to amplify important news [1][2][3].
Public Awareness Campaigns: The Illinois State Board of Elections launched its first ad campaign warning of election disinformation, airing on about 37 television and 270 radio stations [1][2][3].
A January 2024 incident in New Hampshire, where a robocall impersonating President Joe Biden urged Democrats not to vote, served as a wake-up call. The political consultant behind the call was fined $6 million in September [1][2][3]. New Hampshire Secretary of State David Scanlan described the rapid response to that incident, which involved issuing statements and leveraging local media coverage [1][2][3].
State officials are particularly concerned about false information spreading just days before the election, leaving little time for correction. This time pressure underscores the importance of preparedness and quick response strategies [1][2][3].
Griswold aptly summarizes the challenge: "The issue with AI technology is that we literally need to train ourselves to not believe our eyes and ears" [1][2][3]. This statement highlights the paradigm shift required in voter education and election security in the age of AI.
Reference
[1] "Pastors and secret codes: US election officials wage low-tech battle against AI robocalls" (Reuters)
[2] "Pastors and secret codes: US election officials wage low-tech battle against AI robocalls" (Reuters)
[3] U.S. News & World Report | "Pastors and Secret Codes: US Election Officials Wage Low-Tech Battle Against AI Robocalls"