TikTok algorithm systematically skewed to the right during 2024 US elections, new study reveals


A groundbreaking study published in Nature found that TikTok's recommendation algorithm consistently favored pro-Republican content during the 2024 US presidential campaign. Researchers from New York University Abu Dhabi analyzed over 280,000 videos using 323 bot accounts across three states, discovering that Republican-aligned bots received 11.5% more reinforcing content than Democratic counterparts. The findings raise critical questions about algorithmic accountability and platform transparency as TikTok becomes a primary news source for young voters.

TikTok Algorithm Shows Systematic Imbalance During 2024 US Elections

Researchers at New York University Abu Dhabi have uncovered evidence that the TikTok algorithm systematically skewed to the right during the 2024 US elections, according to a study published in Nature [1][2]. Professors Talal Rahwan and Yasir Zaki led the investigation, which analyzed how TikTok's 'For You' recommendation algorithm distributed political content across different user profiles. The study matters because TikTok has emerged as a principal source of political information, particularly for young voters, who shifted toward Republican candidate Donald Trump by 10 percentage points between the 2020 and 2024 presidential elections [1].

The research team created 323 bot accounts designed to mimic real user behavior, training each to signal political preferences by watching videos aligned with either the Democratic Party or the Republican Party [1]. These automated accounts were artificially located through mock GPS and VPN routing in New York, Texas, and Georgia—representing strongly Democratic, strongly Republican, and competitive states respectively. Over 27 weeks during the 2024 presidential campaign, researchers collected more than 280,000 recommended videos and classified their political leanings using a combination of artificial intelligence and human review [1].

Political Bias Revealed Through Systematic Analysis

The findings revealed a consistent systematic imbalance in how the platform distributed content. Bot accounts trained on Republican-aligned content received approximately 11.5% more politically reinforcing material than bots trained on Democrat-aligned content [2]. Even more striking, bots trained on Democrat-aligned content were shown about 7.5% more content from the opposing side, predominantly videos critical of the Democratic Party [1]. These patterns remained consistent across all three states and could not be explained by differences in video popularity or sharing metrics.

Source: Nature

"Our finding isn't just about reinforcement; Democratic accounts were shown significantly more anti-Democratic content than Republican accounts were shown anti-Republican content," Rahwan explained . The research showed that the algorithm wasn't simply giving people what they wanted but was favoring Republican content in ways that transcended user choice. The imbalance concentrated in specific policy areas: immigration and crime for anti-Democrat content, and abortion for pro-Republican content

1

.

Shaping Political Information for Young Voters

The study gains urgency from TikTok's growing influence as a news source. According to Pew Research, about 42% of US social media users say these platforms are important for getting involved with political and social issues [2]. TikTok's unique structure makes it particularly susceptible to algorithmic influence. Unlike other platforms where users retain significant control over their feeds, TikTok's For You page is driven almost entirely by the platform's recommendation algorithm, with users having minimal ability to customize what appears [1].

PhD student Hazem Ibrahim, who worked on the study, noted that the amplification of content designed to attack the opposing side on its weakest ground represents "a more targeted and arguably more concerning pattern than a uniform ideological drift" [2]. The research team emphasized that on TikTok, users don't need to follow anyone—the system decides based on behavioral signals like watch time, making it a uniquely clean setting for studying algorithmic influence because user self-selection is minimized.

Implications for Platform Transparency and Regulation

The findings carry significant weight for ongoing debates about algorithmic accountability and platform regulation. The European Union's Digital Services Act already requires large online platforms to assess and mitigate risks to electoral processes, whereas US law grants platforms broader editorial discretion [1][2]. Yasir Zaki stressed that "in an environment where margins are thin, systematic differences in the kind of political information recommended to tens of millions of young voters are worth taking seriously" [2].

The study cannot determine exactly why the algorithm produces this imbalance—whether it stems from internal rules, content availability, or other factors not visible to outside researchers [1]. The researchers acknowledge limitations: their bots captured only the early stages of the user experience, only English-language video transcripts were analyzed, and the findings should not be generalized beyond the three states studied. Future work could pair automated audits with data from real users, extend analyses beyond election periods, and compare TikTok's behavior with other platforms to determine whether this imbalance is specific to TikTok or more widespread [1]. A survey of 1,008 US-based TikTok users supported the bot findings, with Republican-leaning respondents reporting seeing more content aligned with their views than Democrat-leaning respondents [1].
