Picture this: you're refreshing Zillow at midnight, watching the same apartment tick up $80 every few days, and nobody at the property management company made a single call to a competitor. They didn't have to. The software did it. RealPage, a pricing platform used by landlords across the US, was pulling rent data from competing properties and adjusting prices in real time. The Department of Justice eventually stepped in and banned that practice. But the playbook didn't die with RealPage. It's everywhere now.

On March 2, 2026, the UK's Competition and Markets Authority opened a probe into three hotel chains for sharing pricing data through a third-party analytics provider. California's AB 325 went live January 1, making it a criminal offense to use a competitor's data inside a pricing algorithm. Germany already fined Amazon €59 million for its automated pricing mechanisms. The wave is here. The question is whether the US federal government gets ahead of it or keeps playing catch-up.

When Software Became the Middleman for Price-Fixing

Here's what makes algorithmic collusion so slippery: no exec has to say "let's all charge more." The platform just... learns. It ingests competitor pricing, finds the profit-maximizing number, and every landlord or hotel manager who subscribes to the same tool ends up at the same price. Lawyers call this "hub-and-spoke" coordination. The EU's Eturas case established that doctrine. Regulators in the US are now applying the same logic to AI.
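The dynamic is easy to caricature in a few lines of toy Python. To be clear, this is a hypothetical sketch, not any real platform's logic: a shared recommender ingests every subscriber's current price and hands the same "optimal" number back to all of them. Nobody picks up a phone, and prices still converge and ratchet upward.

```python
def platform_recommendation(competitor_prices, margin=1.05):
    """The hub: scans all subscribers' prices and recommends
    slightly above the current maximum. (Illustrative only.)"""
    return round(max(competitor_prices) * margin, 2)

def simulate(initial_prices, rounds=10):
    """The spokes: each subscriber independently adopts the
    shared tool's recommendation every round."""
    prices = list(initial_prices)
    for _ in range(rounds):
        rec = platform_recommendation(prices)
        prices = [rec] * len(prices)  # everyone lands on the same number
    return prices

# Three "competing" landlords who never communicate directly:
final = simulate([1500.0, 1480.0, 1520.0], rounds=3)
print(final)  # → [1759.59, 1759.59, 1759.59]
```

Three rounds in, three rivals are charging an identical price above anyone's starting point. That is the whole case in miniature: the coordination lives in the shared tool, not in any conversation between competitors.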

The Small Business and Entrepreneurship Council argued on March 18, 2026 that AI pricing tools help small firms compete. Fair point, actually. A family-owned hotel using data tools to price against a Marriott isn't the same thing as 20 Marriotts all running the same algorithm off the same competitor data. But the proposed California surveillance pricing bill, AB 2564, would penalize any personalized pricing built on surveillance of personal data, with fines up to $37,500 per intentional violation. That's a different fight, and conflating the two helps nobody except the lawyers billing for both cases.

The relevant issue is narrower: when competing businesses feed their pricing data into a shared platform and that platform coordinates prices across all of them, that is collusion. The algorithm being the coordinator doesn't make it legal. It just makes it harder to see.

Why "Trust the Market" Doesn't Work When the Market Is One Algorithm

The R Street Institute warned about "AI policy contagion" and over-regulation killing innovation. I get the instinct. I've complained about California regulating the shine off every piece of tech that crosses its border. But there's a difference between regulating how a product works and regulating whether competitors can secretly share pricing power through a shared tool. The second one has a name: it's called a cartel.

Congress kicked off its own AI pricing investigation on March 5, 2026. That's the right move, but investigations aren't law. The US needs a federal standard that does what California's AB 325 does at the state level: make it explicitly illegal to train or operate a pricing algorithm on live competitor data, with enforcement teeth. The FTC and DOJ already have antitrust authority. They don't need to wait for new legislation to use it more aggressively, starting with the sectors where the harm is most concrete: housing, healthcare, groceries.

The Linklaters team put it cleanly on March 16: "Software using algorithms or AI does not provide any protection against liability." That's the standard. The companies deploying these tools knew what they were doing. The app doesn't earn the cartel a pass.