Curated by THEOUTPOST
On Sat, 24 Aug, 8:02 AM UTC
2 Sources
[1]
The year of the AI election that wasn't
SAN FRANCISCO -- Matthew Diemer, a Democrat running for election in Ohio's 7th Congressional District, was approached in January with a pitch by artificial intelligence company Civox: AI-backed voice technology that could make tens of thousands of personalized phone calls to voters using Diemer's talking points and sense of humor. His campaign agreed to try out the technology.

But it turned out that the only thing voters hated more than a robocall was an AI-backed one. While Civox's AI program made almost 1,000 calls to voters in five minutes, nearly all of them hung up in the first few seconds when they heard a voice that described itself as an AI volunteer, Diemer said.

"People just didn't want to be on the phone, and they especially didn't want to be on the phone when they heard they were talking to an AI program," said the entrepreneur, who ran unsuccessfully in 2022 for the same seat he is seeking now. "Maybe people weren't ready yet for this type of technology."

This was supposed to be the year of the AI election. Fueled by a proliferation of AI tools such as chatbots and image generators, more than 30 tech companies have offered AI products to national, state and local U.S. political campaigns in recent months. The companies -- mostly smaller firms such as BHuman, VoterVoice and Poll the People -- make products that reorganize voter rolls and campaign emails, expand robocalls and create AI-generated likenesses of candidates that can meet and greet constituents virtually.

But campaigns are largely not biting -- and when they have, the technology has fallen flat. Only a handful of candidates are using AI, and even fewer are willing to admit it, according to interviews with 23 tech companies and seven political campaigns. Three of the companies said campaigns agreed to buy their tech only if they could ensure the public would never find out they had used AI.

Much of the hesitation stems from internal campaign polls that found voters were nervous about AI and distrusted the technology, said four officials involved in Democratic and Republican campaigns. When campaigns turned to AI to generate photos or videos of candidates, the numbers were even worse, one of them said.

Some uses of AI in political campaigns have already flopped. In January, an AI robocall that mimicked President Joe Biden's voice in the New Hampshire primary was denounced by political watchdogs and investigated by local law enforcement. On Monday, former President Donald Trump posted AI-generated images of Taylor Swift endorsing him to his social media site, Truth Social. The response from her fans was anger and condemnation.

"Political campaigns have trust issues to begin with," said Phillip Walzak, a political consultant in New York. "No candidate wants to be accused of posting deepfakes in the election or using AI in a way that deceives voters."

The skepticism is part of a new reality for AI as enthusiasm over the technology has cooled. This year, tech giants and startups that had celebrated AI as the wave of the future have begun hedging their promises. Wall Street has become wary of the financial goals set by AI companies. And lawmakers have proposed measures that could slow the AI industry's growth.

Just six months ago, it was a different story. Drawn by the promise of millions of dollars in campaign funds that candidates would spend to win, dozens of tech companies shifted their technology toward the U.S. election. They paired chatbots like ChatGPT with AI image generators to create walking, talking clones of candidates that could interact with voters virtually.

BHuman, a New York company founded in 2020 that uses AI to create videos, has pitched political campaigns on a product that personalizes videos of candidates for voters. Candidates could record themselves speaking on an issue, and BHuman's AI-based technology could then clone their face and voice to create new videos. The opening lines could be tweaked to greet a specific voter or recite a particular talking point.

"Imagine you're a voter and you get a video in which a candidate says your name and speaks to your issues," said Don Bosco, BHuman's founder. "That is creating human connection."

BHuman also offers a product that creates a digital replica of a candidate, mimicking the candidate's writing style to answer emails or engaging in virtual chats with voters. Bosco declined to comment on which campaigns had used his company's products.

Personaliz.ai, an AI company founded last year and based in Hyderabad, India, said it worked with more than 30 politicians in India's national elections this year. The firm made videos in which AI versions of candidates interacted with voters on LinkedIn and on campaign websites. It also sent personalized videos to people's phones through WhatsApp and text messages.

Santosh Thota, CEO of Personaliz.ai, said that the response from candidates and voters in India was "great" and that his company had seen interest from other Southeast Asian countries and had shown its tech to politicians in several African countries. But he has not seen the same interest from the United States and Europe, he said. "People in the U.S. are skeptical of the technology," Thota said.

Civox, which is based in London and worked with Diemer's campaign, said it was still experimenting with the right way to reach voters with its technology. Apart from its AI voice technology, the company offers chatbot-like programs that can answer voters' questions on behalf of a campaign. Ilya Mouzykantskii, Civox's CEO, said AI is not a magic bullet for winning, but that the tools could help campaigns -- especially small ones -- "run more automated and targeted outreach."

Some campaigns have been more willing to buy tech from AI companies for behind-the-scenes tasks, such as helping organize email lists and voter databases, three of the companies said.

When Diemer initially began working with Civox on AI voice technology, he asked that his voice be used to train the AI robocall, the company said. But Civox urged him to go with a voice that was clearly artificially generated, so voters would know that AI was involved and that the campaign was acting transparently.

Diemer's campaign eventually settled on an AI voice that said, "Hi, I'm Ashley, an artificial intelligence volunteer for Matt Diemer." The calls were placed in March, just before Super Tuesday.

The pickup rate of robocalls, whether done by AI or by human voice, was in the single digits, Civox said. Most people hung up on the calls from Diemer's campaign in the first few seconds. Civox declined to comment on how much the tech cost. The company worked with about a dozen political campaigns over four months in the spring and made hundreds of thousands of robocalls to test its AI technology.

Diemer said he didn't regret experimenting with AI. "I love AI and tech and what it could potentially do to make political campaigns more affordable and accessible for everyone," he said. "I don't think everyone got what we were trying to do, or gave it a chance to see that maybe AI was a great tool in reaching voters."

This article originally appeared in The New York Times.
Despite predictions that AI would significantly influence the 2024 elections, its impact has so far been less dramatic than anticipated. This story explores the actual role of AI in recent campaigns and the ongoing concerns about its potential future effects.
With the 2024 campaign season well underway, the anticipated widespread influence of artificial intelligence (AI) on elections has not materialized as dramatically as many had predicted. Despite concerns about AI's potential to sway voters and spread misinformation, its impact on recent campaigns has been more subdued than expected 1.
While AI-generated content has made appearances during election campaigns, its effect has not been as pervasive or disruptive as initially feared. In the United States, most campaigns have declined to adopt AI tools, and the few that have experimented with them, such as AI-backed robocalls, have found voters largely unreceptive 1.
Despite the relatively quiet year, experts and officials remain vigilant about AI's potential to influence future elections. The rapid advancement of AI technology, particularly in creating convincing deepfakes and generating persuasive text, continues to be a source of concern. Election authorities and tech companies are working to develop strategies to detect and combat AI-generated misinformation in anticipation of upcoming major elections 2.
Social media platforms and tech giants play a crucial role in mitigating the spread of AI-generated misinformation. Companies like Meta and Google have implemented policies to label AI-generated content and remove misleading information. However, the effectiveness of these measures and the platforms' ability to keep pace with evolving AI technologies remain subjects of debate 2.
As the 2024 U.S. presidential race heads toward November, concerns about AI's potential impact are intensifying. Experts warn that the technology's capabilities are advancing rapidly and that its influence could grow in the coming months. Election officials, tech companies and voters alike are being urged to remain vigilant and prepared for potential AI-driven challenges to electoral integrity 1 2.
While technological solutions are crucial, human judgment and critical thinking remain essential in combating AI-generated misinformation. Experts emphasize the importance of media literacy and public awareness in recognizing and resisting manipulated content. The role of journalists, fact-checkers, and informed citizens in verifying information and maintaining electoral integrity is more critical than ever in the age of AI 2.
Reference
[1]
[2]
A comprehensive look at how AI technologies were utilized in the 2024 global elections, highlighting both positive applications and potential risks.
4 Sources
Artificial Intelligence is playing a significant role in the 2024 US presidential race, but not in the ways experts initially feared. Instead of deepfakes and misinformation, AI is being used for campaign organization, voter outreach, and creating viral content.
6 Sources
Victor Miller, a mayoral candidate in Cheyenne, Wyoming, who promised to govern with AI assistance, loses the election. The campaign raises questions about the role of AI in politics and governance.
3 Sources
As the 2024 U.S. presidential election approaches, artificial intelligence emerges as a powerful and potentially disruptive force, raising concerns about misinformation, deepfakes, and foreign interference while also offering new campaign tools.
6 Sources
Artificial intelligence poses a significant threat to the integrity of the 2024 US elections. Experts warn about the potential for AI-generated misinformation to influence voters and disrupt the electoral process.
2 Sources
The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved