
Second Victim Joins Lawsuit Accusing Twitter of Allowing Sex Trafficking
AP Photo/Matt Rourke

Last week, a second accuser joined legal action against Twitter alleging the tech giant illegally profited from the sharing of sexually trafficked content.

Filed in January, the original lawsuit was brought by a single accuser, known as John Doe #1, who alleged that sexually trafficked content depicting him as a young teenager was monetized and shared across Twitter, and that the platform denied requests to have the explicit images taken down.


As of last week, a second alleged victim, referred to as John Doe #2, has joined the federal lawsuit, alleging he was likewise harmed by Twitter's conduct.

“The National Center on Sexual Exploitation Law Center (NCOSE), The Haba Law Firm, and The Matiasic Firm are suing Twitter on behalf of a second survivor of child sexual abuse who was trafficked on the social media platform,” said a statement released by NCOSE.

“Both plaintiffs,” the statement continues, ”were harmed by Twitter’s distribution of material depicting their sexual abuse and trafficking, and by Twitter’s knowing refusal to remove the images of their sexual abuse (child pornography) when notified.”

John Doe #1, who is now 17, says he was no older than 14 when he first exchanged sexually graphic images over Snapchat with someone he believed to be a teenage girl. In reality, the supposed girl was an adult sex trafficker, and shortly thereafter Doe #1 found himself being blackmailed into sending more explicit content. 

Doe #1 complied with the trafficker’s demands at first, but ultimately he blocked the blackmailers and hoped to put the traumatic incident behind him. That hope was dashed in 2019, when some of the blackmail videos found their way to Twitter. The videos spread online, and Doe #1 learned of them only when his classmates began to tease and bully him.


Doe #1 and his family reported the child pornography to Twitter on multiple occasions, but the company refused to take down the explicit posts until law enforcement got involved.

In fact, according to the lawsuit, Twitter’s original response to Doe #1’s report that underage, explicit videos of him were hosted on its platform was stunningly underwhelming.

“Thanks for reaching out,” wrote a Twitter support agent. “We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.” The response might well be laughable if not for its horrid practical consequence: allowing child pornography to proliferate across Twitter.

For its part, Twitter has argued that it did not willfully neglect the requests to remove the explicit content; rather, it says such problems simply take time to address on a platform of Twitter's size.

A Twitter spokesperson stated the company's position in earlier reporting: “Twitter has zero-tolerance for any material that features or promotes child sexual exploitation. We aggressively fight online child sexual abuse and have heavily invested in technology and tools to enforce our policy.”


Going forward, the legal battle will likely focus on Section 230 of the Communications Decency Act, which grants social media companies broad immunity from liability for content posted to their platforms by third parties.

Twitter has filed a motion to dismiss the case based on Section 230, and the court will hear the case beginning on June 4.
