My neighbor Karen knocked on my door last Tuesday to ask if I'd gotten a weird alert from my Ring doorbell. She had. Her husband had walked up their own front path and the app flagged him as an unrecognized person. He's lived there for 11 years. This is the product Amazon wants you to pay an extra $19.99 a month for.

Ring's "Familiar Faces" feature is currently in beta inside the AI Pro subscription tier. The pitch is simple: your doorbell learns who belongs at your house and alerts you specifically when it sees someone it doesn't recognize. On paper, that sounds genuinely useful. In practice, after a week of testing, I think it is making homes less safe, not more, and I want to be specific about why.

## The Alert You Learn to Ignore

The core problem is false positives. My delivery driver got flagged 3 times in 4 days. My friend Marcus, who has been to my apartment probably 40 times, got flagged twice because the lighting changed between visits. When everything triggers an alert, nothing does. You stop looking. That is the opposite of security.
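To put numbers on that alert fatigue: even a recognizer that is right most of the time per appearance will almost certainly misfire over dozens of visits. Here's a quick back-of-the-envelope sketch. The 95% per-event accuracy figure is my assumption for illustration, not a number Ring publishes:

```python
# Illustrative only: the per-event accuracy is an assumed figure,
# not a number Ring publishes for "Familiar Faces".
def p_at_least_one_false_flag(per_event_accuracy: float, visits: int) -> float:
    """Probability a familiar face gets wrongly flagged as unrecognized
    at least once across `visits` independent appearances."""
    return 1 - per_event_accuracy ** visits

# A system that correctly recognizes a known face 95% of the time per
# appearance still misfires at least once, with high probability,
# over 40 visits (roughly how often Marcus has been here).
print(round(p_at_least_one_false_flag(0.95, 40), 3))
```

That's an 87% chance of at least one bogus "unrecognized person" alert for a single frequent visitor, and every false alert trains you to swipe the next one away.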

Compare this to a $120 Blink camera with no subscription and basic motion zones. It doesn't know who anyone is, but it alerts me when something moves where nothing should be moving. Simple. Reliable. I actually check it.

The facial recognition layer adds cognitive load without adding accuracy, and in security, cognitive load is the enemy. The moment you start dismissing alerts as "probably just the mailman again," you have trained yourself to ignore your own security system.

## The Part Amazon Isn't Saying Out Loud

Here's what bothers me more than the false positives. Every face your Ring camera enrolls in "Familiar Faces" is biometric data, and Amazon stores it. The terms of service give Amazon broad rights to use aggregated data to improve its services, which is the kind of language that has cost Meta $650 million and Snap $35 million in court. Those settlements were about facial recognition data collected without clear consent. Ring does ask for your consent, technically, but it's buried in a subscription agreement most people click through in about 4 seconds.

Privacy advocates have been raising this concern since the feature entered beta, and they are not wrong to. I will give Amazon this: at least it requires an opt-in rather than scanning faces by default. That matters. But opting in to a beta feature does not mean you fully understand what you are handing over or how long Amazon keeps it.

The research on facial recognition accuracy consistently shows performance drops with poor lighting, aging, weight changes, and different angles. A front door camera sees all of those conditions constantly. You are basically enrolling faces under ideal conditions and then asking the system to match them under real ones.
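Here's a toy sketch of why that matters for threshold-based face matching. The embeddings, noise levels, and 0.8 cutoff below are all invented for illustration (this is not Ring's actual pipeline), but the failure mode is generic: the further conditions drift from the enrollment shot, the further the similarity score falls, until a known face drops below the match threshold:

```python
# Toy illustration of threshold-based face matching. All values here
# (128-dim embeddings, noise levels, 0.8 cutoff) are made up; they are
# not Ring's actual pipeline.
import math
import random

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

random.seed(1)
# The face as enrolled: one capture under good conditions.
enrolled = [random.gauss(0, 1) for _ in range(128)]

# The same face seen later, perturbed by changing light and angle.
mild = [x + random.gauss(0, 0.3) for x in enrolled]
harsh = [x + random.gauss(0, 1.0) for x in enrolled]

THRESHOLD = 0.8  # assumed match cutoff
for label, probe in [("mild drift", mild), ("harsh drift", harsh)]:
    sim = cosine(enrolled, probe)
    print(f"{label}: similarity={sim:.2f}, match={sim >= THRESHOLD}")
```

Under mild drift the score stays comfortably above the cutoff; under harsh drift it sinks toward the point where your husband of 11 years becomes an "unrecognized person." Raising the threshold trades those misses for strangers being waved through, which is the tradeoff no marketing page mentions.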

The practical question I keep coming back to: what does this feature actually do that a $0 addition to your existing setup doesn't? Ring already sends motion alerts. It already lets you review footage. Adding a face database that Amazon owns and that misfires regularly is not an upgrade. It's a liability.

If you are already a Ring user and you got a popup about AI Pro, close it. If you want actual home security improvement for $20 a month, better outdoor lighting covers your whole property and never misidentifies your husband of 11 years. Karen's husband would agree.