My neighbor has 4 Ring cameras. I know this because I can see them. What I didn't know until last week is that his "Familiar Faces" feature has probably catalogued my face, stored it in Amazon's cloud, and will keep it for up to 180 days. I never signed anything. Nobody asked me. I just walk past his house to get my mail.
Ring's official line is reassuring: law enforcement can't track users through the system, biometric data auto-deletes, you can request deletion anytime. Fine. I'll grant that Amazon hasn't announced a direct pipeline from Ring facial data to police departments. That's a real distinction.
But here's what actually happened in February 2026. Amazon quietly cancelled a partnership between Ring and Flock Safety, a company that sells automated license plate readers to police departments, after reporting brought the deal to light and the backlash got loud. Amazon didn't cancel it because they decided it was a bad idea. They cancelled it because you found out.
The Gap Between "We Don't" and "We Won't"
Flock's own story is the one that should keep you up at night. Flock publicly states its cameras are not used to enforce traffic violations. In December 2025, Georgia State Patrol used a Flock camera to ticket a motorcyclist for holding a cell phone. A traffic violation. The gap between stated policy and actual use took roughly two years to appear.
Ring founder Jamie Siminoff's internal messages, obtained by 404 Media in February 2026, describe the "Search Party" pet-finding feature as "one of the most important pieces of tech" toward the goal of zeroing out neighborhood crime. That's not a pet feature. That's a surveillance ambition wearing a golden retriever costume.
The architecture is already there: centralized cloud storage, AI analysis, 10 million-plus cameras in American neighborhoods, and a company with a demonstrated interest in law enforcement partnerships. You don't build that and then leave it alone.
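To make that concrete: once face data from many cameras lands in one cloud store as embeddings, "find this person everywhere" stops being a feature and starts being a lookup. The sketch below is purely illustrative and assumes nothing about Ring's actual systems; the store layout, field names, and threshold are all hypothetical. It just shows how little code stands between a centralized embedding database and a cross-camera face search.

```python
import math

# Hypothetical illustration -- NOT Ring's actual code, API, or data model.
# Shows that cross-camera face search over centralized embeddings
# reduces to a nearest-neighbor similarity scan.

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search_faces(store, probe, threshold=0.9):
    """Return (camera_id, timestamp) for every stored sighting whose
    embedding matches the probe face above the threshold."""
    return [
        (rec["camera"], rec["time"])
        for rec in store
        if cosine_similarity(rec["embedding"], probe) >= threshold
    ]

# Toy centralized store: embeddings uploaded from different cameras.
store = [
    {"camera": "cam-12", "time": "09:14", "embedding": [0.9, 0.1, 0.4]},
    {"camera": "cam-77", "time": "17:02", "embedding": [0.88, 0.12, 0.41]},
    {"camera": "cam-03", "time": "11:45", "embedding": [0.1, 0.9, 0.2]},
]

probe = [0.9, 0.1, 0.4]  # embedding of one person's face
print(search_faces(store, probe))  # matches cam-12 and cam-77
```

Real systems use learned embeddings and approximate-nearest-neighbor indexes instead of toy vectors and a linear scan, but the point stands: the hard part is collecting and centralizing the data, and that part is already done.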
The Person Nobody Is Talking About
At least 16 states have passed laws requiring opt-in consent for biometric data collection. Which sounds great until you remember that the people a camera actually sees (the delivery driver, the mail carrier, the kid cutting through your block on a bike) never opted in to anything. They just exist near someone's house.
Devon Reyes would tell you I'm catastrophizing, that the current policy has real protections and I should wait for actual harm before sounding alarms. He's not entirely wrong that the current policy is better than nothing. But "better than nothing" is a low bar for a system that scans faces without consent and stores them in a database controlled by the world's largest e-commerce company.
I spent $250 on a Ring setup two years ago and I genuinely like it. The motion alerts are useful. The two-way audio is great for telling delivery people where to leave packages. I'm not here to tell you to throw it in the trash.
I am here to tell you to go into your Ring app right now, find the Familiar Faces settings, and delete whatever biometric data is stored there. Then call your state rep and ask where they stand on biometric opt-in consent, because 34 states still don't require it.
Amazon cancelled the Flock partnership because people made noise. That's the only lever that actually works here, and the window before this gets normalized is shorter than the 180-day storage timer on your neighbor's camera.