Amazon’s Ring has begun rolling out its AI-powered “Familiar Faces” facial-recognition feature to video doorbells in the United States, following an announcement in September. The feature enables users to create a catalog of up to 50 faces for personalized notifications when recognized individuals approach the door.
The “Familiar Faces” functionality lets Ring device owners identify visitors such as family members, friends, neighbors, delivery drivers, and household staff. Users build the catalog by labeling faces captured by the Ring camera. Once a face is labeled, the system recognizes that individual on subsequent visits and sends a customized alert through the Ring app, such as “Mom at Front Door,” rather than the generic “a person is at your door” notification. The personalization extends to the app’s timeline and Event History, where labeled faces appear with their assigned names.
The feature is disabled by default; users must enable it in the Ring app’s settings. Labeling can be done directly from the Event History, which logs past detections, or from the dedicated Familiar Faces library introduced with the rollout, which serves as a central place for managing the catalog. Within the app, users can edit the names associated with faces, merge entries that turn out to represent the same person (for example, because of differences in lighting or camera angle), or delete faces entirely, giving them ongoing control over the stored data.
Amazon emphasizes the security measures applied to face data. The company states that all biometric information is encrypted in storage and in transit. Amazon also asserts that the data remains confined to the user’s account and is not shared with third parties, including law enforcement, without explicit user consent. Faces that users do not label are automatically deleted after 30 days, preventing indefinite retention of unidentified captures.
The introduction of “Familiar Faces” has drawn concern from privacy advocates and lawmakers. The Electronic Frontier Foundation (EFF), a digital-rights nonprofit, has criticized the feature as a risk to personal privacy, and U.S. Senator Ed Markey, a Democrat from Massachusetts, has urged Amazon to abandon the rollout entirely, citing concerns over how biometric data will be handled in surveillance contexts.
Regulatory barriers have already limited the feature’s availability. Privacy laws in Illinois and Texas and an ordinance in the city of Portland, Oregon, restrict the deployment of facial-recognition technology without specific safeguards, leading Amazon to withhold “Familiar Faces” in those jurisdictions. The EFF pointed to these restrictions as evidence of the broader legal scrutiny such tools face.
Amazon’s history with data security contributes to the skepticism surrounding the feature. In 2023, the U.S. Federal Trade Commission ordered Ring to pay $5.8 million after an investigation found that employees and contractors had maintained broad, unrestricted access to customers’ video recordings for years, exposing sensitive footage from homes across the country. In addition, the Ring Neighbors app, which facilitates community sharing of footage, inadvertently disclosed users’ precise home addresses and locations to the public.
Security lapses extended beyond internal access. Ring user passwords have reportedly circulated on the dark web for years, leaving accounts vulnerable to unauthorized takeovers. These incidents underscore a pattern of shortcomings in Ring’s data protection practices, particularly as the company expands its surveillance footprint.
Amazon has established ties with law enforcement agencies through various programs. Previously, the company enabled police and fire departments to request doorbell footage directly from users via the Neighbors app, streamlining access to private videos. More recently, Amazon partnered with Flock Safety, a provider of AI-powered surveillance cameras deployed by police departments, federal law enforcement, and U.S. Immigration and Customs Enforcement (ICE). This collaboration integrates Ring devices into larger networks of automated monitoring.
In response to inquiries from the EFF, Amazon provided details on how it processes the data. Users’ biometric data from “Familiar Faces” is analyzed in the cloud to perform recognition, and Amazon says it does not use this data to train artificial intelligence models, limiting its reuse for broader algorithmic development. Regarding law enforcement requests, Amazon claims it is technically unable to compile a comprehensive history of a person’s detections across multiple locations, even under subpoena.
That assertion sits uneasily alongside Ring’s existing “Search Party” feature, which scans interconnected neighborhood cameras to locate lost pets by matching images. The similarity in cross-device image analysis casts doubt on the claim that aggregated location tracking is infeasible, though Amazon maintains that full detection histories remain inaccessible.
F. Mario Trujillo, a staff attorney at the EFF, commented on the rollout: “Knocking on a door, or even just walking in front of it, shouldn’t require abandoning your privacy. With this feature going live, it’s more important than ever that state privacy regulators step in to investigate, protect people’s privacy, and test the strength of their biometric privacy laws.”
The feature also offers practical conveniences, such as disabling alerts for specific individuals to cut down on unnecessary notifications. Owners can, for example, configure the system to ignore detections of household members arriving home. Per-face alert settings provide granular control, so notifications align with user preferences without overwhelming the user.
The rollout proceeds amid these debates, with Amazon positioning “Familiar Faces” as a home-security enhancement built on targeted identification. Users in eligible areas receive the update through the app, which prompts them to opt in if they choose.