


Critics of the Kids Online Safety Act (KOSA) are making a last-ditch bid to scare policymakers away by invoking the specter of political censorship. David McGarry, a policy analyst at the Taxpayers Protection Alliance, provides a fine example in National Review. Responding to my own argument that KOSA reflects a reasonable effort to abide by our shared norms about child safety (for instance, keeping kids out of dangerous environments such as nightclubs), McGarry writes, “Going online resembles walking out the front door more than entering a nightclub. The internet — much of which flows through social-media sites these days — is not a place one frequents only for a night out.” In his view, “Speech-limiting proposals like KOSA invite First Amendment scrutiny while forbidding minors to buy alcohol does not.”
Policymakers should not allow themselves to be fooled by this misdirection. Of course, McGarry is quite correct: Logging on to social media, through which we increasingly encounter the internet at large, can indeed resemble walking out the front door. He is right that social media is ubiquitous and hard to escape. It is, by design, psychologically all-consuming: leaked internal research from Instagram indicates that, for many teenagers, the “boundary between social media and IRL [in real life] is blurred; feelings and boundaries cross between the two.” But it does not, and cannot, follow that because online space is ubiquitous — like “walking out the front door” — it should be unregulated.
We don’t lock kids inside — that, too, is harmful — but we do take note of the many dangers they might encounter when they walk outside, and thus regulate numerous aspects of the world outside their front door. Stop signs, crosswalks, and crossing guards regulate traffic to keep kids safe. Police restrain, detain, and deter those who would violate our shared commitments to child safety. We prohibit ten-year-olds from buying booze or even accessing otherwise-protected speech like pornography. And yes, if while outside their front door they try to walk into a nightclub, we collectively say “no,” with the force of law behind us.
Social media is not exempt from these norms just because it exists in digital rather than physical space. Kids are subject to the same — and sometimes significantly more — danger of addiction, predation, exploitation, and harm. As AEI’s Yuval Levin put it, “If Instagram and TikTok were brick-and-mortar spaces in your neighborhood, you probably would never let even your teenager go to them alone.”
McGarry’s analogy between going online and walking outside does nothing to refute KOSA’s premise that we should protect children in public spaces, which perhaps explains why he abandons it within a few sentences. Following the path trod by most KOSA critics, he moves quickly to the terrain of free speech, arguing that we cannot act, because efforts to regulate social media will inevitably fall prey to partisan censorship.
One facet of this argument holds that the FTC, to which KOSA would assign an enforcement role, cannot be trusted because the current commission has proven itself a zealous censor of conservative speech. True, the FTC understands its powers to be sweeping. But it claims those sweeping powers already, with or without KOSA. Legislation such as KOSA that provides a narrower and more concrete focus for regulation is an improvement on that status quo.
The other facet of the free-speech concern emphasizes extreme partisan divergence on divisive issues such as hormone treatments for transgender-identifying children. Would enforcement by state attorneys general (for whom KOSA also envisions an enforcement role) devolve into partisan warfare over how to interpret the statute? On the issue of social media and children, the evidence suggests otherwise. In October, the attorneys general of 41 states and the District of Columbia filed a coordinated set of lawsuits against Meta for inflicting extensive and documented harms on children — deep-red states such as Idaho, Indiana, Kansas, Missouri, Nebraska, Kentucky, both Dakotas, South Carolina, and Louisiana standing shoulder to shoulder with deep-blue states such as California and Hawaii. All agree that Meta is knowingly harming children. In one of these suits, 33 states and Washington, D.C., all cosigned a common description of these harms that sounds very much like the specific list targeted by KOSA.
Americans can, and do, agree on some basics: Deliberate product-design choices that promote child suicide, depression, and eating disorders are bad and should be mitigated. The chief law-enforcement officials of a majority of U.S. states; 47 U.S. senators and counting, from both parties; and over 80 percent of parents from across the political spectrum are demonstrating just that.
Of course, KOSA’s critics could make a constructive contribution to the debate if, rather than merely complaining that the FTC or state AGs are the wrong choices to enforce child-safety rules, they pointed to better options. But in opposing the remedy KOSA proposes without offering any alternative, they are in effect advocating that no one enforce child-safety rules, a position functionally equivalent to arguing that Big Tech should be free to engage with children as ordinary consumers, subject only to the market forces driving its bad behavior. In this respect, the argument has little to do with social media or the First Amendment per se, and is best understood as the standard libertarian rejection of government regulation applied in yet another realm. Society has sensibly rejected such arguments when it comes to protecting children in other contexts. The online world should be no different.
McGarry’s case for leaving the social-media giants alone appeared in National Review just a few days before documents were unsealed in the aforementioned states’ lawsuit against Meta. The evidence reveals how Meta refuses to shut down the majority of accounts belonging to children it knows are under the age of 13, collecting their personal information without parental consent in violation of existing federal law. It also outlines how Meta has attempted to deceive the public with false claims that the company “prioritized young users’ health and safety over maximizing profits,” when the truth is the other way around. (The lifetime dollar value of a 13-year-old on Instagram is $270, according to internal Meta emails, if you’re curious.)
A few days later, the Wall Street Journal reported that Instagram’s algorithm will respond to adult accounts that only follow “young gymnasts, cheerleaders and other teen and preteen influencers” by showing them a “toxic mix” of “risqué footage of children as well as overtly sexual adult videos,” alongside ads for major American brands (many have pulled their advertising in response). A few days after that, the attorney general of New Mexico filed a lawsuit alleging that Meta has permitted “Facebook and Instagram to become a marketplace for predators in search of children upon whom to prey.” This is shocking, but not surprising. In June, the Journal reported that Meta’s “community building” algorithms actively promote connections among a “vast pedophile network,” nudging predators toward one another.
Social media’s design causes harm — by its deliberately addictive construction, through its algorithmic insistence that children be maximally entranced, even at the cost of their mental and emotional well-being or physical safety. KOSA is an attempt to offer children some basic level of protection in this currently chaotic space — to vindicate the public’s right to insist on crosswalks and crossing guards, on traffic cops and bouncers at the door, where the safety of our kids is at stake.