The last several years have seen a tectonic shift in the way consumers, the media, and policymakers think about the internet. In short, there is increasing pressure on lawmakers to do something to address the myriad harms that have long festered online. One effort marching through Congress is Sens. Blumenthal’s (D-CT) and Blackburn’s (R-TN) Kids Online Safety Act (KOSA), which was recently marked up by the Senate Commerce Committee. According to a one-pager on Sen. Blumenthal’s website, KOSA is motivated by harms to children on social media sites filled with user-generated content (like Facebook and YouTube), including self-harm, suicide, eating disorders, substance abuse, and sexual exploitation.
But in typical government fashion, the bill’s remedies include a laundry list of new regulations, technology mandates, Washington, D.C. commissions, and expanded authority for unelected bureaucrats, all of which sweep far beyond the concerns motivating the legislation and lead one to question whether the goal was ever really to help children in the first place. The bill also presents a serious threat to free speech by imposing a vague “duty of care” obligation requiring a huge cross-section of online businesses to act “in the best interests of minors,” exposing them to vast and unpredictable liability enforceable by any of the 50 state attorneys general and the FTC. In these partisan times, no business could be sure that a grandstanding state official somewhere wouldn’t use this provision to lash out against companies he or she disagrees with.
Tellingly, the bill reaches far beyond the algorithm-driven social media platforms. Even after supposedly narrowing amendments, the bill remains breathtakingly broad, explicitly covering “educational services” and “video streaming services” that have nothing to do with the harmful online interactions and user-generated videos that motivated it.
This overly broad definition applies the bill’s sweeping mandates, design requirements, and duties of care to any educational software, streaming service, or game that a child could potentially interact with – a huge portion of the Internet, even including mainstream TV and streaming services that allow no interaction or user postings at all and already have robust parental controls.
That overreach won’t make kids any safer, but it will undermine innovation, drive up consumer costs, and distort a market that is healthy and serving consumers well. The bill would force all “covered” entities to let government-appointed academics rummage through their internal operations and sift through their customer data (jeopardizing both their competitive advantage and their customers’ privacy), impose new one-size-fits-all requirements on parental controls, require a costly written record of every minute every child spends on a platform, and give the predatory Federal Trade Commission (FTC) new enforcement powers at a time when the agency is under heavy scrutiny for attacking American businesses across the board. The result would be fewer options for consumers and weaker gaming and streaming services that consumers value and depend on.
It is not clear that Congress could write a bill narrowly tailored to the harms associated with user-generated content on platforms like YouTube and Facebook, but with KOSA it has not even tried to do so. Is Sen. Blumenthal really worried that Khan Academy might start offering content that could encourage self-harm? Or that Disney+ might decide to start promoting smoking?
If services like those began offering content perceived to promote those activities, they would be punished swiftly and mercilessly in the marketplace.
The disconnect between the bill’s rhetoric and its overly broad provisions demands far more scrutiny than it is receiving. The vast scope of the bill should convince members of Congress on the fence to vote no – no matter how tempting it might be to cast a vote in favor of “keeping kids safe online.”
Ideally, the best ideas in the bill – enhanced parental controls and age-based opt-outs from objectionable material – would be implemented voluntarily by the major social media platforms. But if Congress decides legislation is needed, it needs to be much more narrowly targeted than the current bill, which would impose sweeping new regulations on the rest of the internet.
—Jon Decker is the executive director of American Commitment