Does X (previously known as Twitter) enjoy carte blanche immunity to violate federal sex trafficking and child pornography laws?
The Supreme Court has a rare opportunity to answer this important question by granting review of the petition for certiorari filed by two boys whose child pornography images were trafficked and monetized on X when they were 13 and 14 years old. The case is John Doe 1 and John Doe 2 v. Twitter, Inc.; X Corp. When sued by the boys and called out for its actions, Twitter claimed it was absolutely immune from civil suits for criminal behavior under §230 of the Communications Decency Act (CDA). Remarkably, the Ninth Circuit Court of Appeals agreed. Acknowledging that Twitter's conduct amounted to a crime under federal law, the federal appellate court nonetheless gave Twitter a complete pass.
This story begins in 2017, when 13-year-old John Doe 1 (a pseudonym) began engaging on Snapchat with someone he believed was a teenage girl who attended his school. His classmate, John Doe 2, then 14, also began interacting with her. At her request, they sent nude photos and videos. But she was not who she said she was. She was, in fact, one or more sex traffickers.
Once the traffickers realized that they had ensnared these two young boys, they convinced them to send more explicit videos. They then escalated to blackmail, demanding that John Doe 1 and John Doe 2 record sexually graphic videos and perform sexual acts. In exchange, the traffickers promised not to distribute the photos and videos to the boys' parents, coach, pastor, and others in the community. Trapped and naive, the boys did as the traffickers demanded.
But three years later, their worst nightmare became reality. A compilation video depicting the images and recorded sex acts—criminal child pornography—became public on Twitter. It was posted, re-posted, and "liked" thousands of times on Twitter. As it spread among their peers at school, the boys were bullied by those who saw it. John Doe 1 became suicidal, and John Doe 2 stopped attending school.
When John Doe 1 turned to his parents for help, his mother immediately contacted school officials, local law enforcement, and Twitter. John Doe 1 also repeatedly contacted Twitter. And so began a back-and-forth that resulted in Twitter's knowing and deliberate decision to keep the criminal child pornography on its platform.
First, on January 21, 2020, John Doe 1 sent a complaint alerting Twitter that a user—already the subject of another complaint for posting child pornography—had posted child pornography of him. Twitter asked John Doe 1 to confirm his identity. He did so the same day by providing his driver's license, confirming that he was a minor and the child shown.
The next day, John Doe 1's mother submitted two more reports to Twitter. Twitter initially responded with mere automated messages. After waiting four days, she again reported the illegal child pornography and reminded Twitter what was at stake: "You [have allowed] child pornography to be [on] your [website] for over a week now. … We want them removed immediately." Another two days passed.
All the while, the child pornography video amassed tens of thousands of views and thousands of re-postings. In only the first two days after John Doe 1 contacted Twitter, one post was viewed 167,000 times. It was "retweeted" 2,223 times and "liked" 6,640 times. The criminal child pornography remained live for another week, resulting in substantially more views and retweets. Even the most cursory glance confirmed its criminal nature. Commenters on one thread posted, "Is that kid a minor?," "they both are," "it sure is," "Is there a continuation for this," "Nice boys," while others labeled the videos "twinks."
Finally, on January 28, 2020, Twitter responded by email to John Doe 1. Despite taking steps to verify John Doe 1 was the child in the video, this was Twitter's response:
"We've reviewed the content, and didn't find a violation of our policies, so no action will be taken at this time."
Twitter actually made the knowing and deliberate decision to let the criminal child pornography proliferate on its platform. Twitter also failed to report it to the National Center for Missing & Exploited Children as federal law required. See 18 U.S.C. §2258A.
John Doe 1's mother turned to a Department of Homeland Security official for help. The official lodged a complaint with Twitter. Only then did Twitter remove it. By this point, nine days had passed since John Doe 1 alerted Twitter to the child pornography depicting him on its platform. And still, even after removing the video, Twitter failed to block the IP address or take other actions to stop the user who brought the criminal child pornography to Twitter's site. A year later, in January 2021, when the boys' lawsuit against Twitter was filed, the trafficker who posted the boys' child pornography images on Twitter was still being allowed to post criminal content victimizing minors to the site.
Here's where the profit angle comes in. Twitter financially benefited by allowing child pornography to proliferate on its platform. Twitter earns nearly all its revenue from advertising. It draws advertisers with its detailed knowledge of its users' activities, which allows advertisers to target their ads to particular users. As long as content on Twitter's platform remains live, Twitter earns money from that content, whether legal or illegal.
As alleged in the federal complaint, Twitter monetizes large amounts of human trafficking and commercial sexual exploitation material on its platform. Despite the site's stated "zero-tolerance child sexual exploitation policy," the facts of this case illuminate that the platform is popular among those wishing to view, buy, sell, or trade child pornography and other forms of sexual exploitation. The complaint states that the popularity of these criminal and sexually exploitive practices on its platform has resulted in significant revenue for Twitter.
What makes this case plainly different from others implicating bad behavior by online platforms is that Twitter knew that criminal child pornography involving John Doe 1 and John Doe 2 was proliferating on its platform. John Doe 1 himself told Twitter repeatedly. His mom told Twitter repeatedly. Twitter asked John Doe 1 to confirm his identity with a photo ID. He complied. Twitter's response? "We've reviewed the content, and didn't find a violation of our policies, so no action will be taken at this time." Twitter made the knowing and deliberate decision to keep the criminal video on its platform for another week. As noted, just one post with the criminal video amassed more than 160,000 views and many thousands of retweets and "likes." Even third-party commenters posted, "Is that kid a minor?" Others responded, "[T]hey both are," and worse. Twitter changed course and removed it only after a Department of Homeland Security official took action.
The crux of this case is that knowingly possessing and distributing child pornography and knowingly benefiting from a sex-trafficking venture are federal crimes. See 18 U.S.C. §2252A. Congress empowered victims of those crimes to seek civil remedies. See 18 U.S.C. §§2252A(f), 2255, 1595(a). The boys did so. Twitter responded that it was totally immune, invoking §230 of the Communications Decency Act. Unbelievably, the San Francisco-based Ninth Circuit sided with Twitter. It concluded that Twitter's knowing and deliberate decision to keep the criminal child pornography on its platform was immune from federal civil repercussions.
This case is the right case to begin to address "whether social-media platforms—some of the largest and most powerful companies in the world—can be held responsible for their own misconduct." Doe Through Roe v. Snap, Inc., 144 S. Ct. 2493, 2493 (2024) (Thomas, J., joined by Gorsuch, J., dissenting from the denial of cert.). This case would allow the Supreme Court to make its first ruling on whether knowingly possessing, distributing, and profiting from child pornography is immune under CDA 230.
We and many others believe the Communications Decency Act was never intended to give carte blanche immunity to platforms like Twitter to knowingly violate federal child pornography and sex trafficking law and thereby victimize countless children. There must be consequences for hurting kids in this way, particularly when Twitter knew what was happening and seemingly didn't care. Profit always comes first. For the first time, let's put protecting our children above the almighty dollar sign. The Supreme Court should grant certiorari in this profoundly important case.
Benjamin Bull is the general counsel for the National Center on Sexual Exploitation, the leading national non-partisan organization exposing the links between all forms of sexual exploitation, such as child sexual abuse, prostitution, sex trafficking, and the public health harms of pornography. On X: @NCOSE