
Second Victim Joins Lawsuit Accusing Twitter of Allowing Sex Trafficking


Last week, a second accuser joined a lawsuit against Twitter alleging that the tech giant illegally profited from the sharing of sexually trafficked content.

Filed in January, the original lawsuit was brought by a single accuser, known as John Doe #1, who alleged that sexually explicit content depicting him as a young teenager was monetized and shared across Twitter, and that the platform denied requests to have the explicit images taken down.


As of last week, a second alleged victim, referred to as John Doe #2, has joined the federal lawsuit, alleging he was likewise harmed by Twitter's irresponsible actions.

“The National Center on Sexual Exploitation Law Center (NCOSE), The Haba Law Firm, and The Matiasic Firm are suing Twitter on behalf of a second survivor of child sexual abuse who was trafficked on the social media platform,” said a statement released by NCOSE.

“Both plaintiffs,” the statement continues, “were harmed by Twitter’s distribution of material depicting their sexual abuse and trafficking, and by Twitter’s knowing refusal to remove the images of their sexual abuse (child pornography) when notified.”

John Doe #1, who is now 17, says he was no older than 14 when he first exchanged sexually graphic images over Snapchat with someone he believed to be a teenage girl. In reality, the supposed girl was an adult sex trafficker, and shortly thereafter Doe #1 found himself being blackmailed into sending more explicit content. 

Doe #1 complied with the trafficker’s demands at first, but ultimately he blocked the blackmailers and hoped to put the traumatic incident behind him. That hope was dashed in 2019, when some of the blackmail videos found their way to Twitter. As the videos spread online, Doe #1 learned of them only when his classmates began to tease and bully him.


Doe #1 and his family reported the child pornography to Twitter on multiple occasions, but the company refused to take down the explicit posts until law enforcement got involved.

In fact, according to the lawsuit, Twitter’s original response to Doe #1’s report that underage, explicit videos of him were being hosted on its platform was stunningly underwhelming.

“Thanks for reaching out,” wrote a Twitter support agent. “We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.” This response might very well be laughable if not for its horrid practical consequence: allowing child pornography to proliferate across Twitter.

For its part, Twitter has argued that it did not willfully neglect the requests to remove the explicit content; rather, the company says it simply takes time for such problems to be addressed on a platform as big as Twitter.

A Twitter spokesperson stated the company's position in earlier reporting: “Twitter has zero-tolerance for any material that features or promotes child sexual exploitation. We aggressively fight online child sexual abuse and have heavily invested in technology and tools to enforce our policy.”


Going forward, the legal battle will likely focus on Section 230 of the Communications Decency Act, which shields social media companies from liability for content posted to their platforms by users.

Twitter has filed a motion to dismiss the case based on Section 230, and the court will begin hearing the case on June 4.
