900. That is the number of Community Notes Meta published in the first six months of its US rollout. Over a similar stretch, EU fact-checking flagged 35 million posts on Facebook alone. The ratio is not a rounding error. It is the difference between a restaurant that wipes down one table per shift and one that actually runs a kitchen. And I think platforms that collect nearly $16 billion in revenue from ads, including ads for scams and banned goods, while producing those numbers deserve exactly what a Los Angeles jury delivered on March 25: a bill.

That $3 million verdict against Meta and Google was about addiction, not misinformation specifically. But the logic travels. The jury found the companies 70% and 30% liable, respectively, for design choices that caused harm. Not for what users posted. For what the platforms built and how they profited. The plaintiff's lead counsel called it "a referendum from a jury, to an entire industry, that accountability has arrived." I would call it overdue.

The Garnish Where the Meal Should Be

I have spent years noticing when care is real and when it is decoration. A bodega that hand-slices its own jalapeños is doing something different from a chain that prints "artisanal" on the menu. Meta's Community Notes program is the printed menu. On X, where the system originated, a note takes an average of 15 hours to publish. By then, a post has already reached 80% of its total audience. That is not moderation. It is a garnish placed on a plate that has already been eaten and cleared.

Meta's own Oversight Board said as much on March 26, warning that Community Notes is "structurally ill-suited" to serve as a primary tool against misinformation. The Board flagged specific dangers: repressive regimes, election periods, markets saturated with disinformation. Meta's H1 2026 threat report detailed a TikTok-based campaign targeting Hungary's elections using AI-generated fake identities. Meanwhile, the number of deepfakes circulating online is projected to hit 8 million, up from 500,000 in 2023. The machinery producing misinformation scales exponentially. The machinery catching it does not.

Liability Means Design, Not Censorship

The consent decree in Missouri v. Biden, entered March 23, bars the Surgeon General, CDC, and CISA from coercing platforms to remove content. Fine. Government officials should not bully private companies into silencing speech. I grant that point freely. But the decree does not address what happens when platforms design their systems to amplify the most engaging content, sell ad space against it, and then claim neutrality when the content turns out to be a $25 million deepfake scam or a fabricated election narrative.

The distinction matters enormously. Liability for user speech would be absurd and unconstitutional. Liability for design and revenue choices is neither. We already hold bars liable for overserving. We hold car manufacturers liable for known defects. A platform that earns ad revenue from content it knows violates its own policies is not a passive host. It is a participant collecting a cut.

Zara Mitchell would argue, and she would not be entirely wrong, that liability pressure could push platforms into blunt over-moderation that crushes independent creators and small operators. I have seen what happens when compliance anxiety replaces editorial judgment: everything gets flattened, and the weird, specific, genuinely useful stuff disappears first. But the current arrangement is not protecting those creators either. It is protecting quarterly earnings. The 900-versus-35-million gap is not a feature of free expression. It is a feature of cost avoidance.

An AI model trained on 14,000 Urdu news stories already achieves 96% accuracy at detecting misinformation. The tools exist. The incentive to deploy them does not, because deploying them costs money and reducing engagement costs more. Liability changes that math. It makes the cleanup cheaper than the lawsuit.

A restaurant that poisons 35 million plates does not get to say it is just a building where food happens to be served. Section 230, as currently applied, offers exactly that defense. The jury in Los Angeles saw through it. Congress should too.