A candle maker in Memphis runs her entire business through Instagram Reels. She does about $11,000 a month in revenue, has no PR team, no legal counsel, and no fallback distribution channel. If Meta starts aggressively flagging and removing posts to limit its exposure under new liability rules, her product video that mentions "natural wellness" gets swept up by a filter built to catch health misinformation. She loses reach. She loses sales. She did nothing wrong.
That is the version of liability nobody wants to talk about. I think making social media companies legally responsible for misinformation will produce more collateral damage than accountability, and the people absorbing that damage will be creators, side hustlers, and small operators who have zero leverage.
The Compliance Reflex
When companies face legal risk, they do not hire thousands of moderators capable of nuance. They build blunt automated systems and set the threshold to "remove first, ask questions never." We have already seen this play out. YouTube's 2019 copyright liability crackdown led to a 30% spike in false-positive takedowns within six months. Entire channels lost monetization over background music that was actually royalty-free. The pattern is consistent: legal exposure produces defensive over-moderation.
Scale the same reflex to misinformation liability. Platforms would need to screen billions of posts daily for false claims across health, politics, finance, and science. The 96% accuracy rate on that Urdu-language AI detection model sounds impressive until you run the numbers: a 4% error rate, applied to the billions of posts Facebook's roughly 2 billion daily active users generate, means tens of millions of wrongful flags. Per day. And because genuine misinformation is a small fraction of what gets posted, most of those flags will land on legitimate content.
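A back-of-the-envelope sketch makes that base-rate problem concrete. Only the 96% accuracy figure comes from the detection model above; the daily volume, the 1% misinformation prevalence, and the reading of accuracy as both sensitivity and specificity are illustrative assumptions, not reported data.

```python
# Back-of-the-envelope: what a 96%-accurate misinformation filter does at scale.
# All inputs except the 96% figure are illustrative assumptions.

posts_per_day = 1_000_000_000  # assumed daily screening volume
base_rate = 0.01               # assumed share of posts that are actually misinformation
sensitivity = 0.96             # misinformation correctly flagged (assumed = accuracy)
specificity = 0.96             # legitimate posts correctly passed (assumed = accuracy)

misinfo_posts = posts_per_day * base_rate
legit_posts = posts_per_day - misinfo_posts

true_positives = misinfo_posts * sensitivity       # bad posts caught
false_positives = legit_posts * (1 - specificity)  # good posts wrongly flagged

precision = true_positives / (true_positives + false_positives)

print(f"Wrongful flags per day: {false_positives:,.0f}")      # 39,600,000
print(f"Share of flags that are wrong: {1 - precision:.0%}")  # 80%
```

Under those assumptions, roughly four out of five flags hit legitimate posts. Change the inputs and the totals move, but as long as real misinformation is rare relative to everything else, false positives dominate, and they fall on people like the candle maker.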
Small creators get buried. Large media companies with legal departments survive. The independent nutrition coach in Detroit posting meal prep tips? Flagged. The retired engineer in Tucson explaining solar panel installation with slightly outdated specs? Removed. The compliance math does not care about context. It cares about minimizing lawsuits.
Who Actually Profits From the Cleanup
The $3 million LA verdict on March 25 assigned Meta 70% liability and Google 30% for addictive design. The plaintiff's lawyer declared that "accountability has arrived." Maybe. But follow the incentives forward. If platforms face open-ended liability for what users post, the rational corporate response is not better moderation. It is restricting who gets to post.
Verification requirements go up. Algorithmic reach for unverified accounts goes down. Paid distribution becomes the only reliable way to reach an audience. That is not a hypothetical; it is already happening. Meta's organic reach for business pages dropped to roughly 5.2% of followers in 2024. Liability pressure accelerates that decline toward zero for anyone who cannot pay.
The consent decree in Missouri v. Biden, settled March 23, already drew a line against government coercion of platforms. But private liability creates a different kind of pressure with the same censorious result. Platforms will not moderate carefully. They will moderate cheaply. Cheap moderation means algorithmic bluntness.
I will grant this: the gap between the roughly 900 Community Notes Meta has produced and the 35 million fact-checking labels applied under the EU regime is genuinely damning. Platforms are not trying hard enough. That criticism is fair.
But the fix for a company not trying hard enough is not a legal regime that incentivizes it to try too hard in the wrong direction. Liability for design choices, the kind the LA verdict addressed, is a narrower and smarter tool. Liability for the mere presence of misinformation on a platform with 3 billion users is a sledgehammer swung in a china shop full of small businesses.
The 38% of US adults who used social media as an information source in 2024 are not going to stop. The information environment is not going to get simpler. Deepfake volume will hit 8 million. State-sponsored campaigns will keep adapting. These are real problems. But the solution that sounds most satisfying, making platforms pay for every lie that crosses their servers, will cost the candle maker in Memphis before it costs Mark Zuckerberg a single night's sleep.
Liability regimes always land on the people with the least power to absorb them. That is not a guess. That is how compliance costs work, every time.