


In February, the Supreme Court heard oral arguments in two cases related to Section 230 of the Communications Decency Act of 1996. These arguments occurred in a broader context than the legal issues presented by the two cases: many have called for legislative modification to (or outright repeal of) Section 230, and there are bipartisan — if incoherent — calls to “do something” about “big tech.” Reform of Section 230 fits the bill.
So, what is Section 230? In a nutshell, it does two things, both of which are extremely important to the functioning of the modern internet.
First, it absolves websites and apps of liability for content posted by individual users. If, for example, someone posts something defamatory about you on Twitter, then Section 230 means you can’t sue Twitter for defamation (but you can still sue the individual who wrote the defamatory post).
Second, it absolves websites and apps from liability for making content-moderation choices, which creates an opportunity for websites and apps to differentiate among themselves regarding how aggressively they filter content. If Facebook decides to take down a post it deems offensive, then Section 230 prevents Facebook from being sued for that decision. If 4chan decides not to take down a post that others might deem offensive, then Section 230 prevents 4chan from being sued for that decision.
The point is to encourage content moderation by websites, and by extension to encourage variation in content moderation across the internet. Section 230 is why the modern internet features heavily moderated websites and less moderated websites, and users can devote their time and attention to the websites they like best — and advertisers can make choices about the websites they patronize based upon these decisions by users. We are witnessing this dynamic in real time as Twitter, now owned by Elon Musk, makes various changes to the platform to which both users and advertisers react — market forces at work.
Why Section 230 Is So Important Now
You should be skeptical of all claims — including the one I just made — that any law or regulation is “extremely important to the functioning of the modern internet.” These sorts of “the-sky-is-falling” arguments are common, and they are rarely persuasive. Just a few years ago, Chicken Littles across the country warned of impending doom if the FCC during the Trump administration reversed the Obama-era net-neutrality regulations. The FCC did in fact reverse those regulations, and the doom never materialized. The sky remains just where it was.
Section 230 is different. Depending upon the specifics of reform, weakening or repealing Section 230 has the potential to impose costs on most internet platforms that host user-generated content, large and small. It may feel good to impose costs on certain companies for certain choices they have made, but consumers and the public will ultimately be the ones who pay when the bill comes due.
Basic economics teaches that when costs go up, market forces typically yield a new equilibrium with the following properties: (1) price goes up; and (2) quantity goes down. This doesn’t mean all cost increases are bad — if consumption of a product produces external costs not properly internalized by the transacting parties (think of the victims of automobile accidents caused by drunk driving), then a new, higher-cost equilibrium can be better than the status quo (e.g., taxes on alcohol). But we should be suspicious — most of the time, higher costs are bad.
Weakening or repealing Section 230 protection could increase costs in at least two ways. First, a website will face increased expected liability for hosting user-generated content: if the liability protections in Section 230 are weakened, platforms will have to defend against lawsuits that Section 230 previously barred. This will be costly even if those lawsuits lack merit or are unsuccessful — the mere threat of litigation raises costs.
Second, the platform will have to devote more resources to content moderation, or become more selective about who may post content, in order to reduce its now-increased expected liability for user-generated content. As a result, I predict we will see less variation among websites that host user-generated content: if expected liability goes up, websites’ content filtering will grow more similar than it is today.
As Henry Ford said of the Model T, you could buy the car in any color you liked, as long as it was black. Repealing or modifying Section 230 will likely push many platforms that host user-generated content toward a similarly uniform, lawyered-up approach to content filtering.
Reforming Section 230 will almost certainly mean less user-generated content (quantity goes down), and it could mean higher prices — if now-free websites decide to charge users or advertisers for access to recoup the increased costs of liability and content moderation. It probably also means reduced quality — websites might respond to reform by hosting content authored only by known, trusted users, which would make the internet much less open, diverse, and free (and isn’t this what the net-neutrality advocates were worried about in the first place?).
Why Section 230 Is So Important for the Future
But the logic I’ve just walked through is only what economists call a “static” analysis — what happens if you change just one variable and everything else stays the same. A “dynamic” analysis takes a longer-term view, and Section 230 reform looks much worse through that lens. It would risk reducing innovation on the internet, where much innovative activity takes place today. Platforms already invest in ways to recommend and moderate content, as well as user-reputation ratings systems — think of user ratings on eBay and Airbnb. Reforming Section 230 would almost certainly discourage this sort of innovation.
Increasing platform liability for user-generated content would raise the costs of complying with the new legal regime and would also serve as a barrier to entry for firms not yet competing in the marketplace. The effects of such cost increases are obvious: the new legal regime will favor large, incumbent platforms at the expense of small competitors and new entrants, which need strong protection against liability to enter and grow. According to thorough economic analysis, this is precisely the effect that Europe’s General Data Protection Regulation has had. We shouldn’t compound that problem by making a similar mistake here in the United States.
Finally, permitting more lawsuits against internet platforms would invite strategic behavior. There currently are many websites, and the government can investigate only a subset of them. Removing Section 230 protections will invite efforts to steer the government toward specific targets.
Consider antitrust, where certain companies have spent more than a decade urging the government to sue Google for antitrust violations. If Section 230 is reformed, then we should expect exactly these sorts of efforts to convince the government to sue rivals over their content-moderation choices: “My platform banned @DonaldJTrump, but my rival did not — you should sue my rival.” And, of course, Section 230 currently protects platforms against lawsuits by private entities as well. What if a user on an upstart social network defames Mark Zuckerberg, and Mark Zuckerberg is permitted to sue the upstart platform? The upstart would face a lawsuit brought by its far more deep-pocketed competitor. Who is better positioned to win such a case — one likely to be repeated ad infinitum?
Granted, proponents of modifying rather than repealing Section 230 contend that underlying liability rules — e.g., the elements of the tort or other legal violation — will prevent dramatic changes while also curbing some of the socially harmful conduct facilitated through platforms. Unfortunately, predicting the effects of such changes is almost impossible: even very small changes can have dramatic and unpredictable consequences. Regardless, any changes will increase costs, and those costs will be passed on to consumers through higher prices or reduced quality or quantity, as discussed above.
All in all, the case for overhauling Section 230 is incredibly weak. Section 230 currently incentivizes websites both to host user-generated content and to experiment with how to curate and moderate that content. If you don’t like one platform’s approach, there are plenty of others to use. And increasing costs — as reforming Section 230 surely would — has a host of other likely effects, all of which we should want to avoid. Reduced innovation, reduced new entry, a stacked deck that favors large platforms over small ones and incumbents over new entrants — these are the things reforming Section 230 will bring. We should say “no.”