This article is part of a series, Bots and ballots: How artificial intelligence is reshaping elections worldwide, presented by Luminate.
When Hamas attacked Israel on Oct. 7, many sought updates from their main source for news: social media.
But unlike previous global conflicts, where digital discourse was dominated by Facebook and X (formerly Twitter), the ongoing Middle East crisis has seen people flock in their millions to TikTok to share news and express opinions.
Even as the video-sharing app’s popularity has ballooned, the inner workings of its complex, artificial intelligence-powered algorithms remain a mystery.
Individuals see only a fraction of what is posted daily on TikTok. And what they do see is highly curated by the company’s automated systems, which are designed to keep people glued to their smartphones. Using machine learning in the form of so-called recommender systems, TikTok determines, within milliseconds, which content to display to each user.
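To make the idea concrete, a recommender system can be thought of as scoring candidate videos by predicted engagement and showing the highest-scoring ones first. The sketch below is purely illustrative: TikTok's actual model, signals and weights are not public, and every field name and weight here is an assumption.

```python
# Toy sketch of a recommender system: rank candidate videos by a
# predicted-engagement score. Illustrative only -- TikTok's real
# features and weights are not public; these are invented for the example.

def predict_engagement(video):
    """Hypothetical score: a weighted sum of simple engagement signals,
    each normalized to the range 0..1."""
    return (0.5 * video["similarity_to_watch_history"]
            + 0.3 * video["creator_follow_affinity"]
            + 0.2 * video["recent_like_rate"])

def rank_feed(candidates):
    """Order candidate videos by descending predicted engagement."""
    return sorted(candidates, key=predict_engagement, reverse=True)

candidates = [
    {"id": "a", "similarity_to_watch_history": 0.9,
     "creator_follow_affinity": 0.1, "recent_like_rate": 0.4},
    {"id": "b", "similarity_to_watch_history": 0.2,
     "creator_follow_affinity": 0.8, "recent_like_rate": 0.9},
]
feed = rank_feed(candidates)
print([v["id"] for v in feed])  # video "a" scores 0.56, "b" scores 0.52
```

Real systems use learned models over thousands of signals rather than a hand-set weighted sum, but the shape of the problem, score then rank in milliseconds, is the same.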
POLITICO set out to shed light on how TikTok’s algorithms work, and to root out which side in the war in the Middle East — Israeli or Palestinian — was winning hearts and minds on the social network now heavily favored by young people.
That’s become a hot political question after pro-Israeli groups and some Western lawmakers accused TikTok — owned by Beijing-based ByteDance — of unfairly promoting pro-Palestinian content for potential political impact. TikTok denies the accusations.
The conflict’s political effects are already evident in partisan clashes across Western democracies as people pick sides in the war — and decide how to vote. U.S. President Joe Biden’s support for Israel has drawn criticism from Arab-Americans, and it could eventually cost him the November election. In the United Kingdom, the populist independent candidate George Galloway harnessed pro-Palestinian sentiment to win a seat in the British parliament in March. University campus protests have erupted on both sides of the Atlantic.
TikTok’s algorithms are crucial to how all kinds of political content reaches social media feeds. Examining the company’s algorithms is a good proxy for how artificial intelligence is now a key player in determining what we see online.
POLITICO teamed up with Laura Edelson, a researcher at Northeastern University in Boston, to track pro-Palestinian and pro-Israeli TikTok content over four months between Oct. 7, 2023, and Jan. 29, 2024.
That involved creating a list of 50 popular hashtags like #IStandWithIsrael or #SavePalestine that could be directly associated with either side. More apolitical hashtags, like #Gaza or #Israel, were used to collect data on posts that did not have a specific leaning.
In total, Edelson analyzed 350,000 TikTok posts from the United States.
To make the data more digestible, she broke down the posts into three-day windows around specific events. Those include the initial Hamas attacks (Oct. 7-9); Israel’s invasion of Gaza (Oct. 27-29); and the release of the first Israeli hostages (Nov. 24-27). As a control for bias, she also included Nov. 6-8 in the analysis, as a proxy for a period when no major events took place.
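The approach described above, classifying posts by partisan hashtag lists and grouping them into event windows, can be sketched as follows. This is not the study's code: the hashtag lists, field names and window labels are assumptions for illustration, with the dates taken from the article.

```python
# Illustrative sketch of the methodology described above: classify posts
# by hashtag and count them per event window. Hashtag lists and field
# names are invented for the example; dates come from the article.
from datetime import date

PRO_ISRAELI = {"#istandwithisrael"}
PRO_PALESTINIAN = {"#savepalestine"}

WINDOWS = {
    "initial attacks": (date(2023, 10, 7), date(2023, 10, 9)),
    "gaza invasion":   (date(2023, 10, 27), date(2023, 10, 29)),
    "hostage release": (date(2023, 11, 24), date(2023, 11, 27)),
    "control period":  (date(2023, 11, 6), date(2023, 11, 8)),
}

def classify(hashtags):
    """Label a post by its hashtags; mixed or neutral posts are 'unaligned'."""
    tags = {t.lower() for t in hashtags}
    if tags & PRO_ISRAELI and not tags & PRO_PALESTINIAN:
        return "pro-israeli"
    if tags & PRO_PALESTINIAN and not tags & PRO_ISRAELI:
        return "pro-palestinian"
    return "unaligned"

def bucket(posts):
    """Count posts per event window, broken down by leaning."""
    counts = {w: {"pro-israeli": 0, "pro-palestinian": 0, "unaligned": 0}
              for w in WINDOWS}
    for post in posts:
        for window, (start, end) in WINDOWS.items():
            if start <= post["date"] <= end:
                counts[window][classify(post["hashtags"])] += 1
    return counts

posts = [
    {"date": date(2023, 10, 8), "hashtags": ["#IStandWithIsrael"]},
    {"date": date(2023, 10, 8), "hashtags": ["#SavePalestine"]},
    {"date": date(2023, 11, 7), "hashtags": ["#Gaza"]},
]
counts = bucket(posts)
print(counts["initial attacks"])
```

The actual study used 50 hashtags and 350,000 posts, but the bookkeeping, classify, then count per window, follows this pattern.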
“TikTok, like other social media platforms, amplifies some content more than others,” said Edelson. “That can have a distorting effect on what people see in their feeds.”
What emerged was evidence of TikTok grappling with its role — in real-time — as one of the main global digital town squares where people gather to express their opinions and, often, disagree.
Over the four-month period, Edelson’s research found approximately 20 times more pro-Palestinian content produced, based on the hashtags analyzed, than pro-Israeli material. Yet that didn’t necessarily mean more pro-Palestinian posts wound up in the average person’s TikTok feed.
Instead, Edelson found three distinct times when the likelihood of people seeing pro-Israeli or pro-Palestinian content in their TikTok feeds changed markedly — no matter how much overall material was being produced by either side.
TikTok did not respond to specific requests for comment about the Northeastern University research. In a blog post in April, the company said it had removed more than 3.1 million videos and suspended more than 140,000 livestreams in Israel and Palestine for violating its terms of service.
Much about how these social media algorithms work is unknown. It is unclear who within companies — engineers, policy officials or top executives — determines how they function. It’s also difficult to determine when changes are made, although regulatory efforts by the European Union and the United States are trying to shine a larger spotlight on these practices.
What follows is an example of how, once you dig into the numbers, much of what users see on social media turns out to rely on complex algorithms that are regularly tweaked with little, if any, oversight.
The TikTok posts were collected separately via Junkipedia, a repository of social media content managed by the National Conference on Citizenship, a nonprofit organization. They represent the most viewed partisan posts over each time period.