Tipsheet

Second Victim Joins Lawsuit Accusing Twitter of Allowing Sex Trafficking

AP Photo/Matt Rourke

Last week, a second accuser joined legal action against Twitter alleging the tech giant illegally profited from the sharing of sexually trafficked content.

Filed in January, the original lawsuit was brought by a single accuser, known as John Doe #1, who alleged that sexually trafficked content depicting him as a young teenager was monetized and shared across Twitter, and that the platform denied requests to have the explicit images taken down.


As of last week, a second alleged victim, referred to as John Doe #2, has joined the federal lawsuit, alleging he was likewise harmed by Twitter's conduct.

“The National Center on Sexual Exploitation Law Center (NCOSE), The Haba Law Firm, and The Matiasic Firm are suing Twitter on behalf of a second survivor of child sexual abuse who was trafficked on the social media platform,” said a statement released by NCOSE.

“Both plaintiffs,” the statement continues, “were harmed by Twitter’s distribution of material depicting their sexual abuse and trafficking, and by Twitter’s knowing refusal to remove the images of their sexual abuse (child pornography) when notified.”

John Doe #1, who is now 17, says he was no older than 14 when he first exchanged sexually graphic images over Snapchat with someone he believed to be a teenage girl. In reality, the supposed girl was an adult sex trafficker, and shortly thereafter Doe #1 found himself being blackmailed into sending more explicit content. 

Doe #1 complied with the trafficker’s demands at first, but ultimately he blocked the blackmailers and hoped to put the traumatic incident behind him. That hope was short-lived: in 2019, some of the blackmailed videos found their way to Twitter. Doe #1 only learned the videos were spreading online when his classmates began to tease and bully him.


Doe #1 and his family reported the child pornography to Twitter on multiple occasions, but the company refused to take down the explicit posts until law enforcement got involved.

In fact, according to the lawsuit, Twitter’s original response to Doe #1’s report that underage and explicit videos of himself were being housed on their platform was stunningly underwhelming. 

“Thanks for reaching out,” wrote a Twitter support agent. “We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.” This response might very well be laughable if not for its horrid practical consequence: allowing child pornography to proliferate across Twitter.

For its part, Twitter has argued that it did not willfully ignore the requests to remove the explicit content; instead, the company says that addressing such problems simply takes time on a platform of its size.

A Twitter spokesperson clarified the company's position in earlier reporting: “Twitter has zero-tolerance for any material that features or promotes child sexual exploitation. We aggressively fight online child sexual abuse and have heavily invested in technology and tools to enforce our policy.”


Going forward, the legal battle will likely focus on Section 230 of the Communications Decency Act, which shields social media companies from liability for content that users post to their platforms.

Twitter has filed a motion to dismiss the case based on Section 230, and the court will hear the case beginning on June 4.
