The WSJ Took a Closer Look at TikTok Videos on the Gaza Conflict and Here's What They Found

A new investigation and analysis by The Wall Street Journal turned up an unsurprising but no less shameful conclusion: a "majority" of videos served to users by TikTok's algorithm "supported the Palestinian view" of the conflict in Gaza that began on October 7 when Hamas terrorists launched a barbaric attack on Israel.

To conduct its survey, the WSJ used a "handful of automated accounts...to understand what TikTok shows young users about the conflict. Those bots, registered as 13-year-old users, browsed TikTok's For You feed, the highly personalized, never-ending stream of content curated by the algorithm."

According to the WSJ's report on the outcome of its survey, it didn't take long for TikTok's CCP-controlled algorithm to begin serving "highly polarized content, reflecting often extreme pro-Palestinian or pro-Israel positions about the conflict," of which "a majority supported the Palestinian view" (read: the anti-Israel view).

Specifically, the anti-Israel content accounted for "59% of the more than 4,800 videos served to the bots that the Journal reviewed and deemed relevant to the conflict or war" while just "15% of those shown were pro-Israel."

As the Wall Street Journal explained:

Dozens of these end-of-the-world or alarmist videos were shown more than 150 times across eight accounts registered by the Journal as 13-year-old users. Some urged viewers to prepare for an attack. “If you don’t own a gun, buy one,” one warns...

Some of the accounts quickly fell into so-called rabbit holes, where half or more of the videos served to them were related to the war. On one account, the first conflict-related video came up as the 58th video TikTok served. After lingering on several videos like it, the account was soon inundated with videos of protests, suffering children and descriptions of death...

The Journal set one of the accounts to a restricted mode, which TikTok says limits content that may not be suitable for all audiences. That didn’t stop the app from inundating the user with war. Soon after signing up, the account’s feed was almost entirely dominated by vivid images and descriptions of the conflict...

A TikTok spokesperson claimed, in a statement to the WSJ, that the Journal's experiment "in no way reflects the behaviors or experiences of real teens on TikTok" and that "the platform doesn't promote one side of an issue over another."
