4 Sources
[1]
Amazon's Ring rolls out controversial, AI-powered facial recognition feature to video doorbells | TechCrunch
Dystopian or useful? Amazon's Ring doorbells will now be able to identify your visitors through a new AI-powered facial recognition feature, the company said on Tuesday. The controversial feature, dubbed "Familiar Faces," was announced earlier this September and is now rolling out to Ring device owners in the United States. Amazon says the feature lets you identify the people who regularly come to your door by creating a catalog of up to 50 faces. These could include family members, friends and neighbors, delivery drivers, household staff, and others. After you label someone in the Ring app, the device will recognize them as they approach the Ring's camera. Then, instead of alerting you that "a person is at your door," you'll receive a personalized notification, like "Mom at Front Door," the company explains in its launch announcement. The feature has already received pushback from consumer protection organizations, like the EFF, and a U.S. Senator. Amazon Ring owners can use the feature to help them disable alerts they don't want to see -- like those notifications referencing their own comings and goings, for instance, the company says. And they can set these alerts on a per-face basis. The feature is not enabled by default. Instead, users will need to turn it on in their app's settings. Meanwhile, faces can be named in the app directly from the Event History section or from the new Familiar Faces library. Once labeled, the face will be named in all notifications, in the app's timeline, and in the Event History. These labels can be edited at any time, and there are tools to merge duplicates or delete faces. Amazon claims the face data is encrypted and never shared with others. Plus, it says unnamed faces are automatically removed after 30 days. Despite Amazon's privacy assurances, the addition of the feature raises concerns. 
The company has a history of forging partnerships with law enforcement, and even once gave police and fire departments the ability to request data from the Ring Neighbors app by asking Amazon directly for people's doorbell footage. More recently, Amazon partnered with Flock, the maker of AI-powered surveillance cameras used by police, federal law enforcement, and ICE. Ring's own security efforts have fallen short in the past. Ring had to pay a $5.8 million fine in 2023 after the U.S. Federal Trade Commission found that Ring employees and contractors had broad and unrestricted access to customers' videos for years. Its Neighbors app also exposed users' home addresses and precise locations, and users' Ring passwords have been floating around the dark web for years. Given Amazon's willingness to work with law enforcement and digital surveillance providers, combined with its poor security track record, we'd suggest Ring owners, at the very least, be careful about identifying anyone using their proper name; better yet, keep the feature disabled and just look to see who it is. Not everything needs an AI upgrade. As a result of the privacy concerns, Amazon's Ring has already faced calls from U.S. Senator Ed Markey (D-Mass.) to abandon this feature, and is facing backlash from consumer protection organizations, like the EFF. Privacy laws are preventing Amazon from launching the feature in Illinois, Texas, and Portland, Oregon, the EFF has also noted. In response to questions posed by the organization, Amazon said the users' biometric data will be processed in the cloud, and claimed it doesn't use the data to train AI models. It also claimed it wouldn't be able to identify all the locations where a person had been detected, from a technical standpoint, even if law enforcement requested this data.
However, it's unclear why that would not be the case, given the similarity to the "Search Party" feature that looks across a neighborhood's network of Ring cameras to find lost dogs and cats.
[2]
Why Amazon's new facial-recognition AI for Ring doorbells has privacy experts worried
Amazon has launched its new Familiar Faces feature. It lets Ring users save people's faces to a library in the app. Critics argue it's a dangerous violation of privacy. Amazon has launched a new feature that enables Ring doorbell cameras to recognize and catalog faces of people using AI. While the company is promoting the technology as a convenient way for homeowners to customize notifications and boost security, some are calling it a dangerous violation of privacy and a stepping stone to mass surveillance. Introduced in September, the Familiar Faces feature is not turned on by default. But once Ring users opt in, it can automatically scan the faces of guests and passersby using facial-recognition technology, or FRT. FRT works by scanning your face and, with the help of AI, translating it into a unique patchwork of numbers, also known as a "faceprint." While many tech companies have begun integrating the technology into products -- it's commonly used to unlock iPhones, for example -- it's drawn growing criticism from consumer advocacy groups, industry watchdogs, and lawmakers. Some experts are now arguing that Amazon's new feature poses a particularly high risk, both due to the fact that it collects biometric data from anyone who's within sight of a Ring camera and in light of Amazon's previous partnerships with law enforcement agencies, including with Flock, a surveillance company that shares footage with ICE. "Amazon's system forces non-consenting bystanders into a biometric database without their knowledge or consent," Massachusetts senator Edward Markey wrote in a public letter published in October. "This is an unacceptable privacy violation." Familiar Faces is now being rolled out to Ring users across the United States. The new feature is ostensibly designed to make the Ring App more personalized.
An AI algorithm scans faces picked up by the doorbell's camera, and in the app, users will then have the option to label and save up to 50 of them in the Event History or Familiar Faces section. Adding a name to a person's face will then cause the app to deliver a more specific notification: "Laura at front door," for example, rather than "Person at front door." The technology is useful for "eliminating guesswork and making it effortless to find and review important moments involving specific familiar people across the Ring App experience," Amazon wrote in its September announcement. In that announcement, however, the mention of the upcoming launch of Familiar Faces was somewhat buried; the main focus was the launch of a higher-resolution camera, as well as Search Party, another AI feature that helps homeowners and neighborhoods find lost pets. Search Party "reflects Ring's vision of using AI not just to power individual devices, but to transform them into simple tools that make it easier for neighbors to look out for each other, and create safer, more connected communities," Amazon wrote. Some experts, however, argue that Amazon's deployment of its new AI-powered FRT will do anything but build safer communities. "Today's feature to recognize your friend at your front door can easily be repurposed tomorrow for mass surveillance," the Electronic Frontier Foundation (EFF), a nonprofit focused on digital privacy and free speech, wrote in a November blog post titled "The Legal Case Against Ring's Face Recognition Feature." According to the EFF, Amazon may retain a person's biometric data for up to six months even if they're not saved by a Ring user in the Familiar Faces library, though that data will not be used for algorithmic training purposes.
Amazon did not immediately respond to ZDNET's request for comment. A Ring spokesperson told The Washington Post that the new feature will not be available in Texas or Illinois, both of which require companies to obtain permission before collecting biometric data, or in Portland, Oregon, which has laws in place restricting the use of FRT. As you've probably noticed, we're very much in the midst of the "AI upgrade" era; every product and service that can be fused with some kind of AI probably will be, if it hasn't already. Some of these enhancements are genuinely useful; a chatbot that can help you find deals on flights, for example, or an agent that can manage your email inbox, can save time on what in the past were mundane and sometimes stressful tasks. But every time you opt in to a new AI tool -- or, as is sometimes the case, just start using it one day without providing explicit consent -- you're also agreeing to hand over more of your personal data to tech companies. This can lead to more targeted advertising, more addictive products, and sometimes, data breaches. Anytime you use AI, ask yourself: Is this worth handing over more of my data? In the case of Ring's new FRT technology, is the loss of privacy -- for you and for the many more people that come to or just pass by your front door -- worth the relatively small convenience boost resulting from notifications labeled with a person's name?
[3]
Ring doorbells can now identify faces
Ring has started rolling out its AI-powered facial recognition feature, Familiar Faces, to video doorbell owners across the United States. This addition means your smart doorbell can now identify your visitors, a major step forward for personalization that also brings significant privacy concerns. The new feature, which was announced back in September, is designed to identify people who regularly approach the Ring camera. This lets you build a catalog of up to 50 faces, including those of family members, friends, delivery drivers, or just regular visitors. Instead of a generic motion alert, you'll get a specific notification identifying who is there, such as 'Mom at Front Door.' This level of detail is clearly intended to cut down on notification fatigue. Ring says users can set these alerts on a per-face basis, which is great if you want to disable notifications entirely when you or your kids are coming and going multiple times a day. However, you're handing the app a labeled list of your associates, which is a massive privacy trade-off. If you're worried about data security, tagging visitors with their legal names is risky. It's safer to keep the Familiar Faces feature disabled entirely. Not every home security feature needs this level of AI integration, especially when that integration requires giving a corporation access to your biometric data. If you'd like to use it, you'll need to turn the feature on in the app settings, since it isn't enabled by default. You can label and manage faces directly from the Event History section or the new Familiar Faces library, making it pretty straightforward to get started. Once labeled, that name appears in all notifications, the app timeline, and the Event History. This feature is arriving alongside Ring's newest hardware lineup, including the Wired Doorbell Pro and the Spotlight Cam Pro. These new devices feature what Ring calls Retinal 4K Vision. 
This uses AI tuning to optimize the video, offering features like 10x zoom and improved low-light performance. The company has also introduced Retinal 2K Vision on devices like the Indoor Cam Plus. Familiar Faces is not the only new AI tool Ring is introducing. Amazon also recently released Alexa+ Greetings, which turns Alexa into a digital doorman that handles deliveries and sends away solicitors for you. Another new feature is Search Party for dogs, which uses AI to scan nearby outdoor Ring cameras when a neighbor reports a lost dog, scanning for potential matches to help locate lost pets. While the prospect of personalized alerts and sharper video sounds great, the addition of facial recognition raises serious questions that we shouldn't ignore. Consumer protection organizations like the EFF have already pushed back against Familiar Faces. In fact, privacy laws have prevented Amazon from launching this feature in certain areas, including Illinois, Texas, and Portland, Oregon. Amazon claims face data is encrypted and that it never shares the data with others. The company also claims that any unnamed faces are automatically removed after 30 days and that the data isn't used to train its AI models. Source: TechCrunch
[4]
Amazon brings facial recognition to Ring doorbells in the US
Amazon's Ring has begun rolling out its AI-powered "Familiar Faces" facial-recognition feature to video doorbells in the United States, following an announcement in September. The feature enables users to create a catalog of up to 50 faces for personalized notifications when recognized individuals approach the door. The "Familiar Faces" functionality allows Ring device owners to identify visitors such as family members, friends, neighbors, delivery drivers, and household staff. Users build the catalog by labeling faces captured by the Ring camera. Once labeled, the system recognizes the individual upon approach and sends a customized alert through the Ring app, such as "Mom at Front Door," rather than a generic "a person is at your door" notification. This personalization extends to the app's timeline and Event History sections, where labeled faces appear with their assigned names. To activate the feature, users must manually enable it within the Ring app's settings, as it remains disabled by default. Labeling occurs directly from the Event History, which logs past detections, or from the dedicated Familiar Faces library introduced with the rollout. This library serves as a centralized repository for managing the catalog. Users access tools within the app to edit names associated with faces, merge entries that represent the same individual due to variations in lighting or angle, or delete faces entirely. These management options ensure ongoing control over the stored data. Amazon emphasizes security measures for the face data. The company states that all biometric information undergoes encryption during storage and transmission. Furthermore, Amazon asserts that this data remains confined to the user's account and is never shared with third parties, including law enforcement or other entities without explicit user consent. 
For faces that users do not label, the system automatically deletes the data after 30 days, preventing indefinite retention of unidentified captures. The introduction of "Familiar Faces" has elicited concerns from privacy advocates and lawmakers. The Electronic Frontier Foundation (EFF), a consumer protection organization, has criticized the feature for potential risks to personal privacy. U.S. Senator Ed Markey, a Democrat from Massachusetts, has urged Amazon to abandon the rollout entirely, citing worries over biometric data handling in surveillance contexts. Regulatory barriers have already limited the feature's availability. Privacy laws in Illinois, Texas, and the city of Portland, Oregon, prohibit the deployment of facial-recognition technologies without specific safeguards, leading Amazon to withhold "Familiar Faces" in those jurisdictions. The EFF highlighted these restrictions as evidence of broader legal scrutiny on such tools. Amazon's history with data security contributes to the skepticism surrounding the feature. In 2023, the U.S. Federal Trade Commission imposed a $5.8 million fine on Ring after an investigation revealed that employees and contractors maintained broad, unrestricted access to customers' video recordings for several years. This breach exposed sensitive footage from homes across the country. Additionally, the Ring Neighbors app, which facilitates community sharing of footage, inadvertently disclosed users' precise home addresses and locations to the public. Security lapses extended beyond internal access. Reports indicate that Ring user passwords have circulated on the dark web for years, increasing vulnerability to unauthorized account takeovers. These incidents underscore patterns in Ring's data protection practices, particularly in light of the company's expansions into surveillance ecosystems. Amazon has established ties with law enforcement agencies through various programs. 
Previously, the company enabled police and fire departments to request doorbell footage directly from users via the Neighbors app, streamlining access to private videos. More recently, Amazon partnered with Flock Safety, a provider of AI-powered surveillance cameras deployed by police departments, federal law enforcement, and U.S. Immigration and Customs Enforcement (ICE). This collaboration integrates Ring devices into larger networks of automated monitoring. In addressing inquiries from the EFF, Amazon provided details on its data processing protocols. The company explained that users' biometric data from "Familiar Faces" is analyzed in the cloud to perform recognitions. Amazon stated that it does not utilize this data to train artificial intelligence models, thereby limiting its reuse for broader algorithmic development. Regarding law enforcement requests, Amazon claimed a technical inability to compile a comprehensive history of a person's detections across multiple locations, even if subpoenaed. This assertion contrasts with Ring's existing "Search Party" feature, which scans a neighborhood's interconnected cameras to locate lost pets by matching images. The similarity in cross-device image analysis raises questions about the feasibility of aggregated location tracking, though Amazon maintains that full histories remain inaccessible. F. Mario Trujillo, a staff attorney at the EFF, commented on the rollout: "Knocking on a door, or even just walking in front of it, shouldn't require abandoning your privacy. With this feature going live, it's more important than ever that state privacy regulators step in to investigate, protect people's privacy, and test the strength of their biometric privacy laws." The feature also offers practical utilities for users, such as disabling alerts for specific individuals to reduce unnecessary notifications. 
For example, owners can configure the system to ignore detections of household members arriving home, tailoring the experience to daily routines. Per-face alert settings allow granular control, ensuring notifications align with user preferences without overwhelming the app. Overall, the rollout proceeds amid these debates, with Amazon positioning "Familiar Faces" as an enhancement for home security through targeted identifications. Users in eligible areas receive the update via the app, prompting them to opt in if desired.
Amazon has rolled out its AI-powered Familiar Faces feature to Ring doorbells across the United States, allowing users to catalog up to 50 faces for personalized notifications. While the company promotes convenience and security, privacy advocates including the Electronic Frontier Foundation and Senator Ed Markey warn of mass surveillance risks, especially given Amazon's partnerships with law enforcement and past data security failures that resulted in a $5.8 million FTC fine.
Amazon has officially launched Familiar Faces, an AI-powered facial recognition feature for Ring doorbells across the United States, marking a significant expansion of biometric surveillance capabilities in home security devices [1]. First announced in September, the feature enables Ring doorbells to identify faces and deliver personalized notifications instead of generic alerts. Users can now build a catalog of up to 50 faces, including family members, friends, delivery drivers, and household staff [2]. Instead of receiving a standard "person at your door" alert, Ring owners get specific notifications like "Mom at Front Door," designed to reduce notification fatigue and streamline home monitoring [3].
Source: TechCrunch
The facial recognition system works by scanning faces captured by Ring's camera and translating them into unique numerical patterns called "faceprints" using AI algorithms [2]. Users must manually enable the opt-in feature through the Ring app's settings, as it remains disabled by default [4]. Labeling occurs directly from the Event History section or the dedicated Familiar Faces library, where users can edit names, merge duplicate entries, or delete faces entirely [1]. Once labeled, these names appear across all notifications, the app timeline, and Event History. The feature arrives alongside Ring's newest hardware lineup featuring Retinal 4K Vision, which uses AI tuning to optimize video quality with 10x zoom and improved low-light performance [3].
Source: ZDNet
The rollout has triggered intense criticism from privacy advocates and lawmakers who view it as a dangerous step toward mass surveillance. "Amazon's system forces non-consenting bystanders into a biometric database without their knowledge or consent," wrote Senator Ed Markey in an October letter calling for Amazon to abandon the feature [2]. The Electronic Frontier Foundation warned that "today's feature to recognize your friend at your front door can easily be repurposed tomorrow for mass surveillance" [2]. Privacy laws in Illinois, Texas, and Portland, Oregon, which require companies to obtain explicit user consent before collecting and storing biometric data, have already blocked the feature's deployment in those jurisdictions [1].
Source: How-To Geek
Amazon's history of partnerships with law enforcement agencies intensifies scrutiny around the Familiar Faces feature. The company previously enabled police and fire departments to request doorbell footage directly from users through the Ring Neighbors app [4]. More recently, Amazon partnered with Flock Safety, which manufactures AI-powered surveillance cameras used by police departments, federal law enforcement, and ICE [1]. When questioned by the Electronic Frontier Foundation, Amazon claimed it lacks the technical capability to compile a comprehensive history of where a person has been detected across multiple locations, even if law enforcement requests such data. However, critics point to Ring's existing Search Party feature, which scans neighborhood networks of cameras to locate lost pets, demonstrating similar cross-device tracking capabilities [1].
Amazon's track record on data security amplifies concerns about the new feature. In 2023, the U.S. Federal Trade Commission imposed a $5.8 million fine on Ring after discovering that employees and contractors maintained broad, unrestricted access to customers' video recordings for years [1]. The Ring Neighbors app also inadvertently exposed users' precise home addresses and locations, while Ring user passwords have circulated on the dark web for years [4]. Despite these incidents, Amazon insists that biometric data is encrypted during storage and transmission, remains confined to user accounts, and is never shared with third parties without explicit user consent [4]. The company also states that unnamed faces are automatically deleted after 30 days and that the data isn't used to train AI models [3].
As facial recognition becomes embedded in everyday devices, users face critical decisions about trading convenience for privacy. Experts advise Ring owners to avoid labeling visitors with legal names or to keep the feature disabled entirely [1]. The feature exemplifies broader tensions in the AI upgrade era, where products integrate artificial intelligence capabilities that require handing over increasing amounts of personal data to tech companies [2]. While personalized notifications may reduce alert fatigue, they come at the cost of creating a labeled catalog of associates accessible to a corporation with documented security vulnerabilities. The long-term implications extend beyond individual homes, as interconnected networks of facial recognition-enabled devices could enable tracking across neighborhoods without meaningful oversight or accountability mechanisms.

Summarized by Navi