Curated by THEOUTPOST
On Sat, 15 Mar, 4:01 PM UTC
24 Sources
[1]
Everything you say to your Echo will be sent to Amazon starting on March 28
Since Amazon announced plans for a generative AI version of Alexa, we were concerned about user privacy. With Alexa+ rolling out to Amazon Echo devices in the coming weeks, we're getting a clearer view of the privacy concessions people will have to make to maximize usage of the AI voice assistant and avoid bricking functionality of already-purchased devices.

In an email sent to customers today, Amazon said that Echo users will no longer be able to set their devices to process Alexa requests locally and, therefore, avoid sending voice recordings to Amazon's cloud. Amazon apparently sent the email to users with "Do Not Send Voice Recordings" enabled on their Echo. Starting on March 28, recordings of everything spoken to the Alexa living in Echo speakers and smart displays will automatically be sent to Amazon and processed in the cloud. Attempting to rationalize the change, Amazon's email said: "As we continue to expand Alexa's capabilities with generative AI features that rely on the processing power of Amazon's secure cloud, we have decided to no longer support this feature."

One of the most marketed features of Alexa+ is its more advanced ability to recognize who is speaking to it, a feature known as Alexa Voice ID. To accommodate this feature, Amazon is eliminating a privacy-focused capability for all Echo users, even those who aren't interested in the subscription-based version of Alexa, or who want to use Alexa+ but not its ability to recognize different voices.

There are plenty of reasons people wouldn't want Amazon to receive recordings of what they say to a personal device. For one, the idea of a conglomerate being able to listen to personal requests made in your home is, simply, unnerving. Further, Amazon has previously mismanaged Alexa voice recordings. In 2023, Amazon agreed to pay $25 million in civil penalties over the revelation that it stored recordings of children's interactions with Alexa indefinitely.
Adults also weren't properly informed until 2019 -- five years after the first Echo came out -- that Amazon kept Alexa recordings unless told not to. If that's not enough to deter you from sharing voice recordings with Amazon, note that the company has allowed employees to listen to Alexa voice recordings. In 2019, Bloomberg reported that Amazon employees listened to as many as 1,000 audio samples during their nine-hour shifts. Amazon says it allows employees to listen to Alexa voice recordings to train its speech recognition and natural language understanding systems.

Other reasons people may be hesitant to trust Amazon with personal voice samples include the previous use of Alexa voice recordings in criminal trials, and a settlement Amazon paid in 2023 over allegations that it allowed "thousands of employees and contractors to watch video recordings of customers' private spaces" taken from Ring cameras, per the Federal Trade Commission.

Save recordings or lose functionality

Likely looking to get ahead of these concerns, Amazon said in its email today that, by default, it will delete recordings of users' Alexa requests after processing. However, anyone with their Echo device set to "Don't save recordings" will see the Voice ID feature on their already-purchased devices bricked. Voice ID enables Alexa to do things like share user-specified calendar events, reminders, music, and more. Previously, Amazon had said that "if you choose not to save any voice recordings, Voice ID may not work." As of March 28, broken Voice ID is a guarantee for people who don't let Amazon store their voice recordings. Amazon's email says: "Alexa voice requests are always encrypted in transit to Amazon's secure cloud, which was designed with layers of security protections to keep customer information safe."
The email adds that customers "can continue to choose from a robust set of controls by visiting the Alexa Privacy dashboard online or navigating to More > Alexa Privacy in the Alexa app."

Amazon is forcing Echo users to make a couple of tough decisions: grant Amazon access to recordings of everything you say to Alexa, or stop using an Echo; let Amazon save voice recordings and have employees listen to them, or lose a feature set to become more advanced and central to the next generation of Alexa.

Amazon is betting big that Alexa+ can dig the voice assistant out of a financial pit. Amazon has publicly committed to keeping the free version of Alexa around, but Alexa+ is viewed as Amazon's last hope for keeping Alexa alive and making it profitable. Anything Amazon can do to get people to pay for Alexa takes precedence over other Alexa user demands, including, it seems, privacy.
[2]
Amazon's Echo will send all voice recordings to the cloud, starting March 28 | TechCrunch
Amazon Echo users will no longer have the option to process their Alexa requests locally, which means all of their voice recordings will be sent to the company's cloud. Ars Technica reports that on Friday, Amazon sent an email to customers who have "Do Not Send Voice Recordings" enabled on their Echo smart speakers and displays, stating the company would stop supporting the privacy-enhancing feature on March 28. "As we continue to expand Alexa's capabilities with generative AI features that rely on the processing power of Amazon's secure cloud, we have decided to no longer support this feature," the email said.

This comes as Amazon is rolling out a new version of its voice-controlled AI assistant, now known as Alexa+. Consumers and regulators have raised concerns about Alexa's privacy implications in the past, with Amazon agreeing to pay a $25 million settlement with the Federal Trade Commission in 2023 over children's privacy.
[3]
Everything You Say to Your Echo Will Soon Be Sent to Amazon, and You Can't Opt Out
Amazon is killing its "Do Not Send Voice Recordings" privacy feature on March 28 as the company aims to bolster Alexa+, its new subscription assistant. (This syndicated article is a near-verbatim duplicate of source [1] above.)
[4]
Amazon Is Removing a Key Alexa Privacy Setting: Should You Worry?
Ahead of Amazon's big AI upgrade to the Alexa voice assistant, Alexa Plus, the company has announced a worrying change to its privacy policy: starting on March 28, owners of Echo smart speakers and Echo Show smart displays will no longer have the option to block voice command recordings from being sent to Amazon.

Keeping voice recordings local is an important privacy feature for any voice assistant, and removing it raises serious questions about what Amazon is listening to. And even if you didn't know about these privacy settings before, I bet you're worried about voice assistant privacy in general: our CNET survey found over 70% of people have privacy concerns about adding more AI to home voice assistants -- and now we're seeing those fears in action. So let's cover basic questions about what Amazon is doing here and what you should know if you're using Alexa around the home.

Amazon's news concerns two specific Alexa privacy options: "Do not send voice recordings" and "Do not save voice recordings," which can be found deep in the Alexa app and Echo device settings. Starting March 28, Amazon will remove the "Do not send voice recordings" option, which means all recorded voice commands will be automatically sent to Amazon for processing and analysis. The company is also changing how the "Do not save voice recordings" option works, limiting Alexa features if you don't want your recordings saved. The company also said that voice recordings will be deleted after processing is complete. Everything Alexa records when it hears its wake word will automatically be sent to the Amazon cloud for processing.
Amazon's message to owners indicated that it will be using this data to train Alexa Plus to hold better conversations and understand people more accurately -- which has always been a primary reason companies want this kind of voice data. Now Amazon isn't giving people a choice.

The next question is: does that mean a real human employee will be listening to my voice recordings? That's harder to answer. This type of processing probably doesn't require human ears on recordings. However, in 2019, Amazon was found to have employees listen to and manually annotate parts of Alexa voice recordings, so the company doesn't have a great track record here. Doing this sort of labeling "by hand" isn't efficient, and employees reported hearing everything from bad singing to possible sexual assaults. Still, Amazon did not indicate at the time that it would make any changes to its processing.

Only the "Do not send voice recordings" and "Do not save voice recordings" options appear to have changed. Other Alexa privacy settings, such as Alexa Skill permissions, were not mentioned, so they should still be intact after the deadline.

Why is Amazon doing this? It's mostly about Alexa Plus, Amazon's conversational AI upgrade ($20 per month, free for Prime users) coming this spring. Amazon really, really needs Alexa Plus to bring in more revenue and, hopefully, new customers, so the company is pulling out all the stops to prepare for its spring release. One of those stops was these privacy settings. In its notification, Amazon simply said, "As we continue to expand Alexa's capabilities with generative AI features that rely on the processing power of Amazon's secure cloud, we have decided to no longer support this feature." In other words, the more voice data available, the more Alexa Plus can learn, and Amazon is giving it access to all the voice data it can. Amazon says it encrypts voice data "in transit" to its cloud; once there, it is most likely decrypted for analysis.
Based on what Amazon has said, no Echo owner will have a choice, even if you don't intend to use Alexa Plus. Your voice recordings get sent to the company either way. This likely means that other services, such as Live Translations and Adaptive Listening, will be automatically enabled, whereas previously this privacy setting disabled them.

You will still have access to the "Do not save voice recordings" setting. However, Amazon has made it clear that turning this setting on means Voice ID will not work. Voice ID is Alexa's ability to recognize different voices in a household and give personalized answers based on their profiles and added devices. Voice ID is becoming even more important with Alexa Plus, so Amazon wants to encourage people to use it as much as possible. But voice recognition features can't work if you don't save recordings.

Amazon Web Services has experienced cloud data breaches and vulnerabilities in the past, but these were usually caused by problems created by third-party users like Capital One, Pegasus Airlines and others. Breaches of Amazon's directly controlled customer cloud data are largely unheard of. There's a more direct risk when it comes to your Echo voice data: in 2023, Amazon paid a $25 million penalty for breaking a children's privacy law by keeping children's voice recordings permanently instead of deleting them as required. Echo users should be most wary of this kind of misuse of data.

As for workarounds, we've found nothing so far, although we'll test out the new features when they arrive. The change is sort of a privacy ultimatum: either let Amazon use your voice recordings or stop using Alexa. Non-English speakers didn't have the same privacy settings available in the first place, and they're not likely to see notable changes.
[5]
Amazon Is Canceling a Major Alexa Privacy Feature on March 28: Should You Worry?
As Amazon readies the huge AI upgrade to Alexa called Alexa Plus, the retail juggernaut has revealed a disturbing change in its privacy policies: beginning on March 28, users of Echo smart speakers and Echo Show displays won't be able to block their devices from sending all voice recordings to Amazon for analysis. (The remainder of this article is a verbatim duplicate of source [4] above.)
[6]
Amazon to kill Echo privacy feature and send all your Alexa recordings to the cloud
Come March 28, Amazon is disabling an option that allows your Alexa voice conversations to be processed locally instead of in the cloud. Amazon is curtailing a privacy-minded feature that will affect owners of certain Echo devices. In an email sent last Friday to a number of customers, Amazon revealed that as of March 28, it will remove an opt-in setting that prevented audio of your Alexa requests from being shared with the company. Available on certain Echo devices, this option processes your Alexa requests locally instead of sending their recordings to the cloud.

Though this option sounds like it should have been available to all Alexa users, it was limited to the Echo Dot 4th generation, Echo Show 10, and Echo Show 15, and only to people in the US with English set as their language. To enable it in the Alexa app, at least until March 28, select the supported Echo device and tap Settings. From there, turn on the switch for "Do Not Send Voice Recordings."

With this option soon to be disabled, does that open Alexa users up to privacy risks? To address such concerns, Amazon will automatically update the privacy settings for affected users to not save voice recordings, a company spokesperson told ZDNET. In this case, the recordings are still shared with Amazon, but they will be deleted after the request has been answered. Amazon will also remove any previous recordings still accessible.

Even with the "Do Not Send Voice Recordings" setting turned on, your requests to Alexa have never been fully private. Yes, the audio of your conversations isn't sent to the cloud. However, text transcripts of your requests are still shared so that Alexa can respond to you, according to an Amazon help page. Further, audio of certain Alexa requests, such as making a phone call or sending a message, is still sent to the cloud.
Another feature requiring cloud access is Voice ID, which helps Alexa recognize your voice to provide more personalized responses. When you set up a Voice ID, the audio recordings that teach Alexa your voice are shared with Amazon. If you don't allow your voice recordings to be sent to Amazon's cloud, Voice ID won't work. Sharing your requests will be necessary if you want to use Alexa and your Echo as fully as possible, for better or worse.

But that doesn't mean you're wrong to be concerned about your privacy, especially given Amazon's track record. In 2023, the company was fined $25 million after the FTC and DOJ accused it of misleading parents and users about Alexa's data deletion practices. In 2021, researchers discovered that only a small number of Alexa skills had a privacy policy at the time. In 2019, a report found that Amazon employees were eavesdropping on Alexa queries to enhance its accuracy. That same year, the company acknowledged that voice recordings were held forever unless users manually removed them.

And what about now? To try to placate customers with privacy worries, an Amazon spokesperson shared the following statement: "The Alexa experience is designed to protect our customers' privacy and keep their data secure, and that's not changing. We're focusing on the privacy tools and controls that our customers use most and work well with generative AI experiences that rely on the processing power of Amazon's secure cloud. Customers can continue to choose from a robust set of tools and controls, including the option to not save their voice recordings at all. We'll continue learning from customer feedback and building privacy features on their behalf."

Privacy will be another important factor as Amazon launches its new Alexa+ option.
Tapping into AI, the new service will be able to handle requests and chats more like ChatGPT or Google Gemini. That means it will respond with a more natural cadence, conduct longer conversations, handle multiple prompts, generate content, and process documents. And if you want all that, giving up a bit of your privacy may be the price you have to pay.
[7]
All your Alexa recordings will go to the cloud soon, as Amazon sunsets Echo privacy
Come March 28, Amazon is disabling an option that allows your Alexa voice conversations to be processed locally instead of in the cloud. (The remainder of this article is a verbatim duplicate of source [6] above.)
[8]
Amazon to Axe Setting That Lets You Store Alexa Voice Recordings Locally
Amazon Echo users are set to lose the option to store and process their Alexa requests locally, meaning all of their voice recordings will be sent to Amazon's cloud. After March 28, Amazon will discontinue the "Do Not Send Voice Recordings" setting on Echo devices that allowed people to keep their recordings solely on their own device. After that, unless you proactively change your settings, all your voice recordings will be sent directly to Amazon's cloud for processing before being deleted. In a recent email sent to customers, Amazon attributed the decision to stop supporting the feature to its expansion of "Alexa's capabilities with generative AI features that rely on the processing power of Amazon's secure cloud."

At an event in New York last month, Amazon showed off some of the new AI functionality the updated Alexa+ is capable of, including the ability to search and summarize documents or emails you've shared, or to look up information, as part of a generally less stiff and more human-like experience. However, these new AI-driven features won't be accessible to consumers who aren't happy about Amazon processing their data in the cloud. In the email to customers, Amazon said that if voice recording settings are set to "Don't save recordings," Voice ID will not work, and users won't be able to access some of Alexa+'s more personalized features.

Numerous Alexa users are unhappy with the move. One user on Reddit called this a "great opportunity to discontinue Amazon Alexa!" Amazon's record when it comes to using the data collected by Alexa isn't exactly picture-perfect. In 2023, the tech giant was hit with a $25 million civil settlement after failing to disclose that it stored recordings of children's conversations with Alexa indefinitely. We've also seen Amazon compelled to hand over Alexa data as evidence in criminal trials in various US states and in Germany.
If you're concerned about how your data is being used, or if you're not interested in Alexa+'s $20 monthly subscription fee, check out PCMag's guide to the best smart speaker alternatives to the Amazon Echo.
[9]
Amazon kills off on-device Alexa processing for Echo owners
Web souk says Echo hardware doesn't have the oomph for next-gen AI anyway
Come March 28, those who opted to have their voice commands for Amazon's AI assistant Alexa processed locally on their Echo devices will lose that option, with all spoken requests pushed to the cloud for analysis. Amazon hasn't formally announced the change, and the help page for the feature still makes no mention of the March 28 deprecation. But the internet souk confirmed to The Register that emails to users about the update, which caused a stir on social media over the weekend, are indeed legit. "We are reaching out to let you know that the Alexa feature 'Do Not Send Voice Recordings' that you enabled on your supported Echo device(s) will no longer be available beginning March 28, 2025," a copy of the email sent to Echo users relayed to El Reg read. "As we continue to expand Alexa's capabilities with generative AI features that rely on the processing power of Amazon's secure cloud, we have decided to no longer support this feature." So there it is, apparently: Alexa's latest generative AI tricks are too demanding for the hardware on the handful of Echo devices that support local processing -- the 4th-gen Echo Dot, Echo Show 10, and Show 15. Less powerful Echo gadgets don't have any option for processing locally. Therefore, all spoken Alexa requests are going up into Amazon systems for remote processing. Privacy-conscious users who enabled the "Do Not Send Voice Recordings" setting won't get a say; it's being disabled automatically. Not that many people bothered enabling the local option in the first place, Amazon claims. "If you do not take action, your Alexa Settings will automatically be updated to 'Don't save recordings,'" Amazon told affected users. "Starting on March 28, your voice recordings will be sent to and processed in the cloud, and they will be deleted after Alexa processes your requests," the email continued.
"If your voice recordings setting is updated to 'Don't save recordings,' voice ID will not work and you will not be able to create a voice ID for individual users to access more personalized features." In other words, unless you let Amazon store your recordings, you're stuck with a feature-limited Alexa. And all voice commands are going up regardless. While Echo owners may not be happy about losing the option for on-device audio processing, the soon-to-be-scrapped feature wasn't exactly airtight with its privacy protection to begin with. Another Amazon help page that gives more detail on the Do Not Send Voice Recordings option notes that even when audio recordings stay local, a text transcript of each request still gets shipped off to Amazon's cloud for processing anyway. Those transcripts are stored right alongside voice recordings and don't auto-delete -- you have to manually purge them via your Voice History, assuming you knew they existed in the first place. Amazon customers shouldn't be surprised, though: The tech titan's approach to privacy has long raised eyebrows, particularly when it comes to Alexa and the other gadgets it plants in homes to gain an audio and visual foothold. Studies have claimed that Amazon uses Alexa voice interaction data to help target ads -- both on Echo devices and across the web. Third-party apps available for Alexa-enabled devices don't offer much comfort either, at times lacking clear privacy policies or adequate safeguards on how user data is handled. Then there's last year's drama in America surrounding Ring cameras, when the FTC claimed the super-corp's lax security controls allowed Amazon employees and contractors to access customers' private video feeds. The agency also alleged that Amazon unlawfully retained Alexa voice recordings of children indefinitely, violating child privacy laws. Amazon, naturally, denies that eliminating the on-device processing feature will impede user privacy.
"The Alexa experience is designed to protect our customers' privacy and keep their data secure, and that's not changing," an Amazon spokesperson told us. "We're focusing on the privacy tools and controls that our customers use most and work well with generative AI experiences that rely on the processing power of Amazon's secure cloud." It's those generative AI updates, revealed in late February alongside the launch of Alexa+, that appear to be driving the change. The three devices mentioned by Amazon as supporting local voice processing (Dot 4, Show 10, Show 15) are all in the lineup of devices supported by Alexa+, which Amazon is no doubt keen to push users to adopt. Unlike the classic Alexa, which Amazon said will continue to be available, generative AI through Alexa+ - and the fresh stream of user data required to fuel it - will only be available to Amazon Prime subscribers, or anyone willing to shell out $19.99 per month without Prime. Whether you're sticking with Alexa or using Alexa+, commands are processed remotely. Amazon told us customers will still have plenty of privacy options available, "including the option to not save their voice recordings at all." That option, as we noted above, means losing out on many Alexa features, like the voice assistant being able to recognize an individual speaker and respond based on their preferences, which for many multi-user households is essential. "We'll continue learning from customer feedback, and building privacy features on their behalf," Amazon said. ®
[10]
Amazon is Going to Listen to All Your Voice Recordings on Alexa+
Amazon is nixing one of the few privacy protections against accessing users' voice data, and you can blame AI for the change. Amazon's AI-enhanced Alexa assistant is going to need all your voice recordings, and there's nothing you can do about it. An email sent to Alexa users notes the online retail giant is ending one of its few privacy provisions about recorded voice data in the lead-up to Alexa+. The only way to make sure Amazon doesn't get a hold of any of your vocals may be to quit using Alexa entirely. Gizmodo reached out to Amazon for confirmation, though we did not immediately hear back. You can find the full email on Reddit (as first reported by Ars Technica), which plainly states the "Do Not Send Voice Recordings" setting on Alexa is being discontinued on March 28. Anybody who has the setting enabled will have it automatically revoked, and Amazon will then be able to process your voice recordings. Amazon claims it will delete the recordings once it's done processing your request. "As we continue to expand Alexa's capabilities with generative AI features that rely on the processing power of Amazon's secure cloud, we have decided to no longer support this feature," the email reads. "If you do not take action, your Alexa Settings will automatically be updated to 'Don't save recordings.' This means that, starting on March 28, your voice recordings will be sent to and processed in the cloud, and they will be deleted after Alexa processes your requests. Any previously saved voice recordings will also be deleted." Alexa+, Amazon's upcoming AI version of its normally inconsistent voice assistant, is supposed to allow for far more utility than it had in the past.
The new assistant should be able to order groceries via multiple apps including Amazon Fresh and Instacart for you based on broad requests like "get me all the ingredients I need to make a pizza at home." It's supposed to set smart home routines, access your security footage, and look for Prime Video content in a conversational manner. The other big headline feature is Voice ID, where Amazon claims Alexa can identify who is speaking to it. The AI theoretically should learn users' habits over time and tailor its responses to each individual. Alexa+ is supposed to come to all current Echo Show devices and will supposedly make its way to future Echo products as well. If you have an Amazon Prime account, you'll get immediate access to Alexa+. Without the subscription, you'll need to cough up another $20 a month for the sake of talking to AI-infused Alexa. The tradeoff is now you will have to offer your vocals to the online retail giant for it to do as it pleases. There are more than a few reasons you don't want Amazon anywhere near your voice data. For years, Amazon's default setting gave workers access to user data, even offering some the ability to listen to users' Alexa recordings. In 2023, the company paid out $25 million to the Federal Trade Commission over allegations it gave employees access to children's voice data and Ring camera footage. For its part, Amazon said it had changed its data practices to comply with the Children's Online Privacy Protection Act, aka COPPA. Amazon's privacy track record is spotty, at best. The company has long been obsessed with users' voice data. In 2023, Amazon revealed it was using Alexa voice recordings to help train its AI. Gizmodo reached out to Amazon to confirm whether Alexa+ voice recordings will also be used to train the company's AI models. We will update this story once we hear back.
Unlike Apple, which made big claims about data protections with its "private cloud compute" system for processing cloud-based AI requests anonymously, Amazon has made far fewer overtures to keeping user data safe. Smaller AI models can run on-device, but those few examples we have of on-device capabilities from the likes of Windows Copilot+ laptops or Gemini on Samsung Galaxy S25 phones are, in their current iteration, little more than gimmicks. Alexa+ wants to be the first instance of true assistant AI with cross-app capabilities, but it may also prove a privacy nightmare from a company that has routinely failed to protect users' data.
[11]
Smart Home Privacy Takes a Blow as Amazon Kills Alexa Local Processing
Alexa devices will no longer offer a "do not send voice recordings" setting after March 28th. Future Alexa recordings must be sent to the Amazon cloud, though you can still ask Amazon to automatically delete voice requests after they're processed. The "do not send voice recordings" setting slightly increases Amazon customers' privacy by processing Alexa audio data on-device. It also reduces Amazon's ability to utilize customer voice data for AI training or other purposes, and it may alleviate some customers' concerns about "spying." That said, I don't want to place too much weight on this feature's importance. "Do not send voice recordings" is only available on three Echo devices -- the Echo Dot (4th Gen), Echo Show 10, and Echo Show 15 -- and it doesn't provide 100% local processing. It simply transcribes your voice requests into text, which are then sent to Amazon and saved to the cloud. Customers affected by this change will be automatically transitioned to the "don't save recordings" setting, which automatically deletes recordings from the cloud after they have been processed. If you want to manually review any audio recordings or text transcriptions that Amazon has saved to the cloud, check your Voice History panel. (I suggest that you regularly check Voice History regardless of your account preferences, as you won't always know when Amazon makes changes to available settings or policies.) "The Alexa experience is designed to protect our customers' privacy and keep their data secure, and that's not changing. We're focusing on the privacy tools and controls that our customers use most and work well with generative AI experiences that rely on the processing power of Amazon's secure cloud. Customers can continue to choose from a robust set of tools and controls, including the option to not save their voice recordings at all.
We'll continue learning from customer feedback and building privacy features on their behalf." Amazon has not provided a reason for the "do not send voice recordings" feature's removal. However, the company's statement to The Verge, shown above, suggests that "generative AI experiences" are to blame. An AI-enhanced Alexa may be able to understand tone of voice or inflection, which don't really exist in written text, hence the need for voice recordings. And some Alexa+ features, like voice recognition, just can't work without audio recordings. I should also point out that Amazon uses voice recordings to train AI. Removing the option to withhold voice recording telemetry may be nothing more than a data grab. And I suspect that the setting's name, "do not send voice recordings," was less than ideal from Amazon's perspective. It naturally implied that Alexa collects more data than it should. In any case, Amazon has been repeatedly criticized for the way that it handles voice recordings and other private data. The company was recently sued by the FTC for allegedly retaining and utilizing children's voice data without explicit parental consent -- a potential violation of the Children's Online Privacy Protection Act. In a separate federal lawsuit, Amazon was accused of using customers' Ring camera footage (including footage from indoor cameras) to train algorithms. Both cases were settled to the tune of several million dollars. If you want a more private smart home experience, consider setting up Home Assistant. The open-source Home Assistant Voice platform makes it easy to run voice commands locally, though you can also go for a super-customized setup by integrating Home Assistant with local LLMs.
I realize that this isn't an easy solution, but if you want a voice assistant that respects your privacy, you have to host it yourself. Source: Amazon via The Verge
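The difference between the two retention settings described above can be boiled down to a toy data-flow model. This is an illustrative sketch only -- the class and setting names are paraphrased for clarity and none of this is Amazon's actual code or API:

```python
# Toy model of the two Alexa retention settings discussed above.
# "do_not_send_voice_recordings": audio stays on-device, but the text
# transcript is uploaded and retained in Voice History until manually purged.
# "dont_save_recordings": audio is uploaded, processed, then deleted.

class VoiceHistory:
    """Stands in for the cloud-side Voice History store."""

    def __init__(self):
        self.stored = []  # what remains in the cloud after each request

    def process_request(self, audio: bytes, transcript: str, setting: str):
        if setting == "do_not_send_voice_recordings":
            # Audio never leaves the device; the transcript is uploaded
            # and kept until the user purges Voice History by hand.
            self.stored.append(("transcript", transcript))
        elif setting == "dont_save_recordings":
            # Audio is uploaded for processing, then deleted: nothing kept.
            pass
        else:
            # Default behavior: the audio recording is uploaded and saved.
            self.stored.append(("audio", audio))

    def purge(self):
        """Manual cleanup via the Voice History panel."""
        self.stored.clear()


history = VoiceHistory()
history.process_request(b"<pcm>", "what's the weather", "do_not_send_voice_recordings")
assert history.stored == [("transcript", "what's the weather")]  # transcript lingers

history.purge()
history.process_request(b"<pcm>", "set a timer", "dont_save_recordings")
assert history.stored == []  # deleted after processing
```

The point of the sketch is that neither setting was ever fully local: something reaches the cloud in every branch, and only the post-processing retention differs.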
[12]
Alexa is getting creepier. Take this one step to improve your privacy.
Amazon is ditching an option to opt out of sending Alexa voice commands to the company. It highlights the growing hunger for our personal data in the AI age. It's a good time to revisit the terms of your relationship with Alexa. If you own one of Amazon's voice assistant gadgets, everything you say to Alexa is beamed to Amazon's cloud and saved forever on the company's computer systems. Amazon uses those Alexa voice recordings to answer your commands and train its artificial intelligence. Now Amazon is removing a setting that gave some Alexa device owners a more privacy-preserving option. Few people used that setting, but the change is a reminder that Alexa is a data hog and likely growing more so. I'll walk you through what Amazon is changing and help you evaluate how to live with an Alexa device without losing complete control of your privacy. (Amazon founder Jeff Bezos owns The Washington Post.) Even if you don't own an Alexa device, Amazon's move shows the potential personal toll in the age of AI. Digital bits of ourselves are being fed into corporations' computers and we can't know how they might be misused. We may need personal empowerment and regulation to wrest back some control.
What Alexa is changing
When your email, photos or Alexa voice commands are in the cloud, that means they're saved on someone else's computers. That's useful but comes with risks. You don't know when humans are listening to your recordings saved in Amazon's cloud. You don't know whether Amazon is accidentally sending your voice recordings to a stranger or saving what you say even if you don't trigger Alexa with a "wake" word. That's all happened before.
What's new are reports in recent days that Amazon, as of March 28, will ditch one option to limit some of Alexa's privacy risks. Amazon said it's focusing on privacy controls that its customers use the most and that work well with the planned debut of a revamped AI-powered Alexa. People in the United States who own a handful of Amazon device models have been able to change a setting so some commands like checking the weather are handled by the computer brain on the Alexa device. Voice commands aren't sent to Amazon's cloud, although text of the command may be. Amazon said very few people used this privacy setting. Removing it is still a step in the wrong direction for Alexa's data hoarding. When Post technology columnist Geoffrey A. Fowler looked at all the Alexa recordings that Amazon saved from inside his home, there were thousands of voice files, from the mundane (setting timers) to the unnerving, including family discussions about medication and a friend conducting a business deal.
How to improve your Alexa privacy
Top tip: Stop your Alexa device from saving your voice recordings. You get few benefits, and mostly potential risks, from Amazon saving in its cloud everything you say to Alexa. To do this from the Alexa phone app: Settings → Alexa Privacy → Manage Your Alexa Data → Don't save recordings. Why you should do this: Companies collect and save as much data as they can about you because there's little downside for them. But there is some for you. "The more you save today, the more that can be breached tomorrow," said Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project, a privacy advocacy group. "Whether you're worried about hackers breaking in or the government forcing its way in, you can protect yourself down the road by disabling data retention now."
Amazon says that if it doesn't save your audio recordings, Alexa's software can't distinguish among the voices of people in your household. What Amazon can do: It would be better if Amazon did not save recordings from people's Alexa devices, period. Technology companies say their privacy settings give you control, but that lets them appear to be good guys while they also know few people will change the standard setting. Amazon said that "the Alexa experience is designed to protect our customers' privacy and keep their data secure, and that's not changing." Other privacy-improvement steps: Consider moving your Alexa device to less intimate areas of your home. If you can, Cahn recommended not putting an Alexa device in parts of your home where recordings could be compromising or embarrassing, such as a bedroom or bathroom. Unplug your Alexa device when it's feasible. Consider doing so if you have guests over or are having intimate family discussions about your finances or health. If you're handy, you can use a light switch-controlled wall outlet to flip your Alexa device on and off, suggested Rory Mir, associate director of community organizing at the Electronic Frontier Foundation consumer advocacy group. Consider ditching your Alexa device. Some of you will say that if people are worried about privacy, don't own a home gadget with an always-listening microphone. It's a reasonable point but not an entirely fair one. You should not have to choose between digital conveniences or autonomy over your personal information. We deserve both.
[13]
Amazon is removing this privacy feature from its Echo smart speakers on March 28 -- what you need to know
Amazon is removing a key privacy feature from its Echo smart speakers -- and Alexa+ is to blame. On March 28, Amazon will remove the ability to have Alexa process your voice requests locally, and will start sending all of your recordings to the cloud to be processed there, rather than on your Echo device itself. The update, first reported by Ars Technica, comes from an email Amazon sent to Echo owners who had local processing enabled. In the email, Amazon states that the increased processing power required by the generative AI tools of Alexa+ necessitates the change. This change only affects those who have an Echo Dot (4th Gen), Echo Show 10, or Echo Show 15, as those were the only devices that supported local Alexa processing, and the feature was only available to customers in the U.S. with devices set to English. "Starting on March 28th, your voice recordings will be sent to and processed in the cloud, and they will be deleted after Alexa processes your requests," reads the email from Amazon. "Any previously saved voice recordings will also be deleted. If your voice recordings setting is updated to 'Don't save recordings,' voice ID will not work and you will not be able to create a voice ID for individual users to access more personalized features." Amazon's email also stated that it will delete every recording after it's processed, and that "Alexa voice requests are always encrypted in transit to Amazon's secure cloud, which was designed with layers of security protections to keep customer information safe." Currently, you can opt for Amazon to delete any voice recording immediately, but doing so also removes Alexa's ability to create voice profiles for your account, which means you won't be able to get personalized recommendations for such things as music and calendar events. Though it's not alone among tech companies, Amazon has had issues surrounding data privacy.
In 2023, it was fined $25 million for not deleting recordings of children and location data, even after it was requested to do so. Ring, which is owned by Amazon, was also fined $5.8 million for allowing third-party contractors access to customers' videos. "The Alexa experience is designed to protect our customers' privacy and keep their data secure, and that's not changing," said an Amazon spokesperson in a statement to Tom's Guide. "We're focusing on the privacy tools and controls that our customers use most and work well with generative AI experiences that rely on the processing power of Amazon's secure cloud. "Customers can continue to choose from a robust set of tools and controls, including the option to not save their voice recordings at all. We'll continue learning from customer feedback, and building privacy features on their behalf." While their smart speaker offerings are much more limited than Amazon's, both Apple and Google have stated that at least some of the processing that's done for their AIs happens on-device, rather than in the cloud. Apple in particular is taking a privacy-first approach to Apple Intelligence, though it is lagging far behind its competitors. To be sure, this is done not just for privacy, but also to speed up the responses delivered by AI. However, the latest iPhones as well as Pixel devices have much more powerful chips than your typical Echo Dot, so it's a lot easier for Apple and Google to enable on-device processing for AI than it is for Amazon to do so on its smart speakers, all of which cost hundreds less. If you do not want to set your voice recordings setting to 'Don't save recordings,' please follow these steps before March 28th:
[14]
Your Amazon Echo will start reporting to Amazon on March 28
Owners of the Amazon Echo have long had the option for the device to process requests locally, thereby keeping their information off of Amazon's servers. That functionality is going away starting on March 28. The company sent out emails to customers to explain the changes. "We are reaching out to let you know that the Alexa feature 'Do Not Send Voice Recordings' that you enabled on your supported Echo device(s) will no longer be available beginning March 28, 2025," the email reads. "As we continue to expand Alexa's capabilities with generative AI features that rely on the processing power of Amazon's secure cloud, we have decided to no longer support this feature." Per Ars Technica, the changes don't stop there. Users who have the "Don't save recordings" feature enabled will also lose access to Voice ID, a feature that allows Alexa to share user-specific things like calendar events, music preferences, and more. Thus, in order to keep that functionality, users will have to manually change that setting as well. In short, everything you say to your Echo devices after March 28 will be sent to Amazon's cloud, and there isn't anything anyone can do about it. If users continue to make Amazon delete recordings, they will also lose access to features their devices had by default when they were purchased. Amazon attempts to assuage security concerns by telling users in the email that "Alexa voice requests are always encrypted in transit to Amazon's secure cloud, which was designed with layers of security protections to keep customer information safe." However, as Ars points out, once the information gets there, it'll be used by Amazon and its employees to do as they please. The announcement has not gone over well, with many Alexa users storming Reddit over the weekend to voice their displeasure. The news comes a couple of weeks after Amazon's announcement of Alexa+, a subscription service that adds AI enhancements. 
The retail giant had delayed the release of Alexa+ due to a severe setback, but it seems to be on pace to release the service this year. Amazon says Alexa+ will offer a more seamless and context-aware experience and also work with other Alexa-enabled devices like Ring video doorbells and other tech.
[15]
No, Amazon isn't changing how all Echos process your voice requests to satisfy Alexa+'s more powerful models
Amazon is turning off the ability to process voice requests locally. It's a seemingly major privacy pivot and one that some Alexa users might not appreciate. However, this change affects exactly three Echo devices and only if you actively enabled Do Not Send Voice Recordings in the Alexa app settings. Right. It's potentially not that big of a deal and, to be fair, the level of artificial intelligence Alexa+ is promising, let alone the models it'll be using, all but precludes local processing. It's pretty much what Daniel Rausch, Amazon's VP of Alexa and Echo, told us when he explained that these queries would be encrypted, sent to the cloud, and then processed by Amazon's and partner Anthropic's AI models at servers far, far away. That's what's happening, but let's unpack the general freakout. After Amazon sent an email to customers -- actually, it seems, only those who own an Echo Dot 4, Echo Show 10 (3rd Gen), and Echo Show 15 -- saying that the option to have Alexa voice queries processed on device would end on March 28, some in the media cried foul. They had a point: Amazon didn't have the best track record when it comes to protecting your privacy. In 2019, there were reports of Amazon employees listening to customer recordings. Later, there were concerns that Amazon might hold onto recordings of, say, you yelling at Alexa because it didn't play the right song. Amazon has since cleaned up its data act with encryption and, with this latest update, promises to delete your recordings from its servers.
A change for the few
This latest change, though, sounded like a step back because it takes away a consumer control, one that some might've been using to keep their voice data off Amazon's servers. However, the vast majority of Echo devices out there aren't even capable of on-device voice processing, which is why most of them didn't even have this control. A few years ago, Amazon published a technical paper on its efforts to bring "On-device speech processing" to Echo devices.
They were doing so to put "processing on the edge," and reduce latency and bandwidth consumption. Turns out it wasn't easy - Amazon described it as a massive undertaking. The goal was to put automatic speech recognition, whisper detection, and speech identification locally on a tiny, relatively low-powered smart speaker system. Quite a trick, considering that in the cloud, each process ran "on separate server nodes with their own powerful processors." The paper goes into significant detail, but suffice it to say that Amazon developers used a lot of compression to get Alexa's relatively small AI models to work on local hardware. It was always the cloud In the end, the on-device audio processing was only available on those three Echo models, but there is a wrinkle here. The specific feature Amazon is disabling, "Do Not Send Voice Recordings," never precluded your prompts from being handled in the Amazon cloud. The processing power that these few Echos had was not to handle the full Alexa query locally. Instead, the silicon was used to recognize the wake word ("Alexa"), record the voice prompt, use voice recognition to make a text transcription of the prompt, and send that text to Amazon's cloud, where the AI acts on it and sends a response. Granted, this is likely how everyone would want their Echo and Alexa experience to work. Amazon gets the text it needs but not the audio. But that's not how the Alexa experience works for most Echo owners. I don't know how many people own those particular Echo models, but there are almost two dozen different Echo devices, and this affects just three of them. Even if those are the most popular Echos, the change only affects people who dug into Alexa settings to enable "Do Not Send Voice Recordings." Most consumers are not making those kinds of adjustments. This brings us back to why Amazon is doing this. Alexa+ is a far smarter and more powerful AI with generative, conversational capabilities. 
Its ability to understand your intentions may hinge not only on what you say, but your tone of voice. It's true that even though your voice data will be encrypted in transit, it surely has to be decrypted in the cloud for Alexa's various models to interpret and act on it. Amazon is promising safety and security, and to be fair, when you talk to ChatGPT Voice and Gemini Live, their cloud systems are listening to your voice, too. When we asked Amazon about the change, here's what they told us: "The Alexa experience is designed to protect our customers' privacy and keep their data secure, and that's not changing. We're focusing on the privacy tools and controls that our customers use most and work well with generative AI experiences that rely on the processing power of Amazon's secure cloud. Customers can continue to choose from a robust set of tools and controls, including the option to not save their voice recordings at all. We'll continue learning from customer feedback, and building privacy features on their behalf." For as long as the most impactful models remain too big for local hardware, this will be the reality of our generative AI experience. Amazon is simply falling into line in preparation for Alexa+. It's not great news, but it's also not the privacy and data-safety disaster it's made out to be.
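The local-processing pipeline described above (audio transcribed on-device, with only the text transcript uploaded) versus the post-March-28 behavior (audio itself uploaded) can be sketched in miniature. Everything here is illustrative: the class and method names are invented stand-ins, not Amazon's actual software, and the "transcription" step is a trivial placeholder for a real on-device ASR model:

```python
# Sketch of the two data flows: local transcription vs. cloud audio upload.
from dataclasses import dataclass, field


@dataclass
class CloudBackend:
    """Stands in for Amazon's servers; records what the device uploads."""
    received_audio: list = field(default_factory=list)
    received_text: list = field(default_factory=list)

    def handle(self, payload, kind: str) -> str:
        if kind == "audio":
            self.received_audio.append(payload)
        else:
            self.received_text.append(payload)
        return f"response to: {payload if kind == 'text' else '<audio>'}"


@dataclass
class EchoDevice:
    cloud: CloudBackend
    local_processing: bool = True  # the soon-to-be-removed setting

    def transcribe_locally(self, audio: bytes) -> str:
        # Placeholder for the compressed on-device ASR models described in
        # Amazon's "on-device speech processing" paper; real ASR is far
        # more complex than decoding bytes.
        return audio.decode("utf-8")

    def handle_utterance(self, audio: bytes) -> str:
        if self.local_processing:
            text = self.transcribe_locally(audio)   # audio stays on device
            return self.cloud.handle(text, "text")  # only the text goes up
        return self.cloud.handle(audio, "audio")    # post-March-28 behavior


cloud = CloudBackend()
echo = EchoDevice(cloud, local_processing=True)
echo.handle_utterance(b"alexa what is the weather")
assert cloud.received_audio == []  # no recording left the device

echo.local_processing = False      # the setting is disabled on March 28
echo.handle_utterance(b"alexa turn off the lights")
assert len(cloud.received_audio) == 1  # the audio itself is now uploaded
```

Either way the cloud ends up with something, which is the article's point: the old setting changed what was uploaded, not whether anything was.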
[16]
Voice data sharing to become mandatory for Echo users
Amazon will soon disable users' ability to process voice requests on-device. If you own an Echo smart speaker, it will soon lose a key privacy feature -- replaced by cloud processing of all voice commands. Users of Amazon's Echo smart speaker will soon receive an email from the company detailing changes to the device's voice request processing. Beginning March 28, everything users say to Echo's Alexa engine will be sent to Amazon. The move will disable local processing of voice requests in favor of a new Alexa+ feature called Amazon Voice ID, notes Ars Technica. This feature will allow Echo devices to recognize different users, and will be implemented for all existing Echo devices. The change allows different users to check or modify their own calendars, music libraries, or reminders. By contrast, Apple devices that support Siri have been able to recognize multiple users since 2019. That said, prior to 2021 Siri had to ping Apple servers first when processing voice requests. With the debut of iOS 15, all speech processing and personalization can be handled without an internet connection using machine learning -- including setting alarms, launching apps, controlling podcast/music playback, and system settings. Internet access is still required on smart speakers across all brands for current answers to topical queries, and for installing system updates. "As we continue to expand Alexa's capabilities with generative AI features that rely on the processing power of Amazon's secure cloud, we have decided to no longer support [on-device processing]," the company said in its email to users. Amazon's email also reminds Echo users that they will need to turn off the existing "Don't save recordings" feature. The company warns that if users do not disable that feature, "Voice ID may not work" -- effectively crippling the Echo's personalized features as of March 28. Previous reports have said that Amazon employees can also listen to personal requests.
Bloomberg noted in 2019 that workers listened to as many as 1,000 audio samples per day to help train natural-language understanding and speech recognition systems. The company also has a history of paying fines for privacy violations. In 2023, the Federal Trade Commission charged Amazon with allowing employees and contractors to view customers' video recordings from Ring cameras. Later that same year, the company paid $30 million in penalties for the Ring privacy violations; the fine also covered accusations of Amazon storing recordings of children's interactions with Alexa devices. In its new email, the company says it will now delete recordings after cloud processing. However, users must disable the "Don't save recordings" control on their local devices in order for the Alexa+ feature to work. Amazon appears to be setting the stage for further subscriptions, as the Alexa service has been unprofitable for the company. The move will effectively force users to choose to either share recordings, or lobotomize their existing Echo devices.
[17]
All Alexa Voice Requests Will Soon Go Through Amazon's Servers
Amazon is in the process of overhauling Alexa, introducing a new Alexa+ AI service that will be available free of charge for Prime users (or $20 per month on its own). But as the company plans to roll out this new service, user privacy across Echo devices is taking a hit. In a March 15 email, Amazon announced its Echo devices will no longer support local processing for Alexa requests, and will stop offering "Do Not Send Voice Recordings" as an option. This means that every request -- and its subsequent voice recording -- will end up going to Amazon's cloud. Even a request as simple as "turn off the lights" will be sent to Amazon. This change starts on March 28th, and it includes all spoken commands to Alexa in Echo speakers and smart displays. Why is this happening? According to Amazon's email (as reported by Ars Technica), it all comes down to Alexa's new generative AI features. In the email, Amazon says: As we continue to expand Alexa's capabilities with generative AI features that rely on the processing power of Amazon's secure cloud, we have decided to no longer support this feature. The focus is Amazon's new Alexa Voice ID feature, which the company is highlighting as a flagship feature in Alexa+. It lets Alexa+ recognize who is speaking to it, and reply accordingly. But even if you choose not to enable Alexa+, or to use Voice ID, Amazon is still taking away local processing. Why is this concerning? This move has raised many concerns about user privacy on Amazon devices. The idea that a major tech company can listen in on all requests made through its devices, at any time, doesn't sit well, especially when users have no choice in the matter. There really isn't much that Amazon Echo customers can do here, aside from quitting Alexa. Of course, for many users, Alexa is an integral part of their smart home. The decision now is between continuing to use the features they have relied on for years, forgoing privacy, or quitting the ecosystem entirely.
Amazon does say that it will automatically delete recordings of all Alexa requests after the processing is done. Plus, Amazon is assuring users that all their recordings are encrypted in transit to Amazon's secure cloud servers. But given Amazon's track record, it's hard to trust its word. Amazon has a history of mismanaging Alexa voice recordings. In 2023, Amazon paid $25 million in a case over revelations the company stored recordings of children's voices forever, and it gave employees access to this data, as well as footage from Ring cameras. In the same year, reports showed Amazon was using real conversations in Alexa to train its AI (the one that is now shipping with Alexa+). In the past, the company also admitted to letting its employees listen in on audio conversations. 'Don't Save Recordings' is now much less useful Previously, users at least had the choice to stop sharing their requests with Amazon's servers ("Do Not Send Voice Recordings") as well as not to save them ("Don't Save Recordings"). Now, Amazon is effectively removing that second choice as well, if you want your device to work as advertised. As it happens, the "Don't Save Recordings" toggle is also linked to the Voice ID feature. This is the feature that can identify who is making the request, so Alexa can personalize its response accordingly. That way, your requests for calendar events, reminders, or music don't interfere with anyone else's requests in your home. It was already quite useful, and is becoming an even bigger deal with Alexa+. The thing is, if you ask Alexa not to save your recordings, it will also automatically disable Voice ID, and you'll lose out on all the user-identifying features. Amazon previously warned that enabling this feature could affect Voice ID, but now it essentially guarantees it won't work. So, your "choice" isn't really a choice at all.
You can either let Amazon process, save, and use your recordings however it wants, or lose out on the Voice ID feature, limiting the usefulness of the product -- while still sending your requests to Amazon's servers.
[18]
Alexa is getting creepier. Take this one step to improve your privacy.
It's a good time to revisit the terms of your relationship with Alexa. If you own one of Amazon's voice assistant gadgets, everything you say to Alexa is beamed to Amazon's cloud and saved forever on the company's computer systems. Amazon uses those Alexa voice recordings to answer your commands and train its artificial intelligence. Now Amazon is removing a setting that gave some Alexa device owners a more privacy-preserving option. Few people used that setting, but the change is a reminder that Alexa is a data hog and likely growing more so. I'll walk you through what Amazon is changing and help you evaluate how to live with an Alexa device without completely losing control of your privacy. (Amazon founder Jeff Bezos owns The Washington Post.) Even if you don't own an Alexa device, Amazon's move shows the potential personal toll of the AI age. Digital bits of ourselves are being fed into corporations' computers, and we can't know how they might be misused. We may need personal empowerment and regulation to wrest back some control. What Alexa is changing When your email, photos or Alexa voice commands are in the cloud, that means they're saved on someone else's computers. That's useful but comes with risks. You don't know when humans are listening to your recordings saved in Amazon's cloud. You don't know whether Amazon is accidentally sending your voice recordings to a stranger or saving what you say even if you don't trigger Alexa with a "wake" word. That's all happened before. What's new are reports in recent days that Amazon, as of March 28, will ditch one option to limit some of Alexa's privacy risks. Amazon said it's focusing on privacy controls that its customers use the most and that work well with the planned debut of a revamped AI-powered Alexa. People in the United States who own a handful of Amazon device models have been able to change a setting so some commands like checking the weather are handled by the computer brain on the Alexa device.
Voice commands aren't sent to Amazon's cloud, although text of the command may be. Amazon said very few people used this privacy setting. Removing it is still a step in the wrong direction for Alexa's data hoarding. When Post technology columnist Geoffrey A. Fowler looked at all the Alexa recordings that Amazon saved from inside his home, there were thousands of voice files, from the mundane (setting timers) to the unnerving, including family discussions about medication and a friend conducting a business deal. How to improve your Alexa privacy Top tip: Stop your Alexa device from saving your voice recordings. You get few benefits, and mostly potential risks, from Amazon saving in its cloud everything you say to Alexa. To do this from the Alexa phone app: Settings → Alexa Privacy → Manage Your Alexa Data → Don't save recordings. (Amazon has detailed instructions on its Amazon Devices pages on its site.) Why you should do this: Companies collect and save as much data as they can about you because there's little downside for them. But there is some for you. "The more you save today, the more that can be breached tomorrow," said Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project, a privacy advocacy group. "Whether you're worried about hackers breaking in or the government forcing its way in, you can protect yourself down the road by disabling data retention now." Amazon says that if it doesn't save your audio recordings, Alexa's software can't distinguish among the voices of people in your household. What Amazon can do: It would be better if Amazon did not save recordings from people's Alexa devices, period. Technology companies say their privacy settings give you control, but that lets them appear to be good guys while they also know few people will change the standard setting. Amazon said that "the Alexa experience is designed to protect our customers' privacy and keep their data secure, and that's not changing." 
Other privacy-improvement steps: Consider moving your Alexa device to less intimate areas of your home. If you can, Cahn recommended not putting an Alexa device in parts of your home where recordings could be compromising or embarrassing, such as a bedroom or bathroom. Unplug your Alexa device when it's feasible. Consider doing so if you have guests over or are having intimate family discussions about your finances or health. If you're handy, you can use a light switch-controlled wall outlet to flip your Alexa device on and off, suggested Rory Mir, associate director of community organizing at the Electronic Frontier Foundation consumer advocacy group. Consider ditching your Alexa device. Some of you will say that if people are worried about privacy, don't own a home gadget with an always-listening microphone. It's a reasonable point but not an entirely fair one. You should not have to choose between digital conveniences and autonomy over your personal information. We deserve both.
[19]
Everything You Say to Alexa Will Be Sent to Amazon Starting Next Week
Amazon is taking a big step back in privacy thanks to the rollout of Alexa+. The company is removing the ability for some Echo smart speakers to process Alexa requests locally and not have those recordings sent to the cloud. Local Processing Was Only Available on Three Echo Models The ability to process requests locally was previously available on three smart speaker models -- Echo Show 15, Echo Show 10, and fourth-generation Echo Dot. As first noted by Ars Technica, Amazon recently emailed customers outlining the change: Starting on March 28th, your voice recordings will be sent to and processed in the cloud, and they will be deleted after Alexa processes your requests. Any previously saved voice recordings will also be deleted. If your voice recordings setting is updated to 'Don't save recordings,' voice ID will not work and you will not be able to create a voice ID for individual users to access more personalized features. In the same email, Amazon noted that every request will be deleted after it's processed, and that voice requests are encrypted when sent to Amazon's cloud. If you own one of those devices and choose not to save recordings on Amazon's cloud, you'll lose the Voice ID functionality. That allows the Echo to share your specific, customized information like reminders and music. The New Alexa+ Is to Blame for the Change Amazon noted that the change is because of the new Alexa+ powered by generative AI. The feature set was first unveiled in February 2025 and will soon be available for the Echo Show 8, 10, 15, and 21 as part of an early-access period. Just make sure to sign up on the Alexa+ site. Amazon highlighted that Alexa will be much easier to talk to and will better understand complex commands. All the features will cost $19.99 per month. If you're an Amazon Prime member, you will be able to tap the new Alexa for free.
Even though the great privacy option was only available on a few Echo devices, it's disappointing to see Amazon make the choice to remove the feature, especially for users not interested in the improved Alexa.
[20]
Amazon is removing an Echo privacy setting that keeps Alexa recordings from the company
An opt-in Alexa feature called 'Do Not Send Voice Recordings' stops requests from being sent to the company. It will soon be removed from Echo devices, Amazon confirmed to USA TODAY. Amazon is planning to remove a privacy setting on its Echo devices later this month, after which virtually all voice requests will go to the company's cloud. The company's cloud-based voice service Alexa currently offers an opt-in feature called "Do Not Send Voice Recordings." It stops audio requests from being sent to the company. Starting March 28, the feature will end and convert into a new one called "Don't Save Recordings," the company recently told Echo owners via email. While this updated version will still automatically delete voice requests, they will first be processed in "Amazon's secure cloud." Regardless of whether the "Don't Save Recordings" feature is on or off, all Alexa requests will be processed in the cloud, Amazon says. "The Alexa experience is designed to protect our customers' privacy and keep their data secure, and that's not changing," an Amazon spokesperson said in a statement to USA TODAY. "We're focusing on the privacy tools and controls that our customers use most and work well with generative AI experiences that rely on the processing power of Amazon's secure cloud." Users with this setting on will also no longer be able to use Voice ID, a feature that allows Alexa to recognize their voice -- unlike under the current setting. How many Echo owners used this setting? Amazon says that less than 0.03% of Echo owners opted to use the "Do Not Send Voice Recordings" feature. The feature has only been available to U.S. customers with services set to English through the Echo Dot (4th Gen), Echo Show 10 and Echo Show 15 devices, according to Amazon. The feature has also only applied to Alexa voice requests. Echo devices have previously sent sounds to the cloud when the selected "wake word," typically "Alexa," alerts the device to listen.
If users have the setting enabled by March 28, they will automatically have the updated "Don't Save Recordings" setting, which prevents access to voice ID. The voice ID feature allows Alexa to learn a user's voice. However, if a user wants to continue using voice ID, they can instead have their recordings deleted every couple of months, Amazon confirmed. Update change causes some angry responses Some took to social media to voice their concerns over the privacy change, with some Reddit users arguing that the company is changing the terms of the agreement after the sale of the Echo device. "I don't understand how anyone could buy and support this product? I assume it has been doing this since day one," one Reddit user wrote. Another wrote that they are "so glad I jumped ship away from Echo half a decade ago." On Facebook, user John Coate wrote that the end of the feature is meant to "help their AI development, which seems to really be about keeping their stock price up. At your expense." "You may want to get rid of your Amazon Echo. Apparently, you can't opt out of this," one X user wrote. What else does Amazon process to the cloud? Amazon currently processes any sounds heard from an Echo device if the "wake word" is heard or if Alexa is activated by a button press. It also processes what's called visual ID, when a person is seen on a device's camera and the technology attempts to match their identity with another enrolled user, according to Amazon.
[21]
Amazon Might Be Moving All Alexa Voice Processing to the Cloud
Devices set to not allow cloud processing won't support the Voice ID feature. Amazon is reportedly informing Echo users that voice requests sent to Alexa will soon stop supporting local processing. As per the report, the Seattle-based e-commerce giant is planning to stop on-device processing of voice recordings for Echo devices starting March 28. The company is reportedly making the change since the new artificial intelligence (AI) version of the virtual assistant, Alexa+, will be entirely cloud-based. Those users who continue to keep their devices set to local processing will reportedly lose the Voice ID functionality of Alexa. The tech giant added on-device voice request processing to Echo devices in 2021, allowing users who do not want to give Amazon access to their conversations with the voice assistant to opt for a privacy-focused approach. Now, however, the company is said to be taking a 180-degree turn on that feature. According to an Ars Technica report, Amazon sent an email to Echo users informing them that they will no longer be able to process Alexa requests locally. These emails were reportedly sent only to those users who had enabled the "Do Not Send Voice Recordings" feature on their devices. "As we continue to expand Alexa's capabilities with generative AI features that rely on the processing power of Amazon's secure cloud, we have decided to no longer support this feature," the email stated, as per the publication. The company is said to be planning to stop supporting local processing starting March 28, likely in preparation for deploying the new AI-powered Alexa+. Those who do not disable the setting will reportedly not be able to use one of the most integral features of the virtual assistant, dubbed Voice ID. This feature allows Alexa to personalise the user experience and share information such as calendar events, reminders, music, and more.
Alexa Voice ID is set to receive a major upgrade with the new AI version, as it will be able to understand contextual information and make more personalised recommendations. It will also be able to recognise different users' voices. However, even those Echo users who do not wish to use the AI features will not get to use the legacy version of Voice ID after March 28, the report claimed. Ars Technica also shared the rest of the email, in which Amazon claimed that Alexa voice requests sent to cloud servers will always be encrypted with multiple security layers to keep users' information safe. Despite the assurance, the move is likely to raise concerns among Echo users who prefer the privacy aspect of the device. Notably, in 2023, the US Federal Trade Commission (FTC) filed a lawsuit against Amazon over allegations that the company was illegally collecting and indefinitely storing data on children under the age of 13 without parental consent. As per a TechCrunch report, the e-commerce giant settled the lawsuit by paying a fine of $25 million (roughly Rs. 216.9 crores) and deleting the existing data.
[22]
Amazon Is Ending an Important Privacy Feature for Alexa Echo Devices By the End of the Month
Amazon says it is making the move as it prepares to add generative AI features to Alexa. Have an Alexa-enabled Echo device in the living room? Maybe don't spill your secrets to the voice assistant. Amazon stated in an email sent to Echo users on Friday that starting March 28, they will no longer be able to opt into a "Do Not Send Voice Recordings" setting to process Alexa requests locally and avoid sending voice recordings to Amazon, per Ars Technica. So by the end of this month, Amazon's cloud servers will receive recordings of every command sent to Alexa. Alexa users face a choice: Give Amazon access to everything they say or stop using their Echo device. Amazon stated in the email that it had decided to end locally processed requests because it was adding generative AI features to Alexa in the coming weeks "that rely on the processing power of Amazon's secure cloud." The Alexa+ AI features, which Amazon announced in late February and are coming to the Echo Show 8, 10, 15, and 21 first, give Alexa the power to create quizzes from study guides, come up with travel plans, and summarize footage from Ring security cameras. Alexa+ costs $19.99 per month, but Prime members get it for free. For Alexa+ to work fully, users have to opt to not only send their voice recordings to Amazon, but also to allow the tech giant to save the audio files. A standout feature of Alexa+ is Alexa Voice ID, which enables the voice assistant to recognize who is speaking to it and give person-specific calendar events, music, reminders, and more. If Alexa users ask Amazon not to save any of their voice recordings through settings by March 28, Voice ID may not work. So now Alexa customers face another choice: Give Amazon the power to save their voice recordings or lose access to a central Alexa+ feature.
Amazon addressed privacy concerns in the email, stating that "Alexa voice requests are always encrypted in transit to Amazon's secure cloud, which was designed with layers of security protections to keep customer information safe." Alexa users may be wary of giving their voice recordings to Amazon, despite the company's assurances that their data is safe, given Amazon's track record. In July 2019, Amazon confirmed to legislators that it kept Alexa transcripts and voice recordings indefinitely. Later that year, Bloomberg reported that Amazon employees were listening to customer interactions with Alexa and taking notes on the recordings to improve the voice assistant. In May 2023, Amazon agreed to pay a civil penalty of $25 million to settle a case brought by the Federal Trade Commission and Justice Department accusing it of holding onto Alexa voice interactions with children and using the data for business purposes. Amazon allegedly used children's data to train its algorithm and did not delete transcripts of children's conversations with Alexa even after parents attempted to delete them. Amazon is struggling to make Alexa profitable. The tech giant lost more than $25 billion from Alexa-enabled devices between 2017 and 2021, per The Wall Street Journal, citing internal documents. At the same time, Alexa has reached millions of homes, with Amazon selling more than half a billion Alexa-enabled devices by May 2023.
[23]
Alexa Will No Longer Process Requests Locally, Recordings Will Be Sent to Amazon
Like voice assistants of yesteryear, Alexa has always had an option for users to opt out of telemetry. It made users less concerned about Amazon collecting their data, but that's about to change. Soon, Alexa will not process requests locally, meaning your recordings with Echo devices will be sent to Amazon with no way to opt out. Here's everything you need to know. In an email that Amazon sent to its Echo customers, the firm mentioned that starting March 28, users will lose the Do Not Send Voice Recordings option for Alexa on their Echo devices. Thereafter, every interaction with Alexa on Echo devices will be sent to Amazon for processing. We are reaching out to let you know that the Alexa feature 'Do Not Send Voice Recordings' will no longer be available beginning March 28th. As we continue to expand Alexa's capabilities with Generative AI features, we have decided to no longer support this feature. For those unaware, Amazon recently introduced Alexa+, the firm's first generative AI-powered consumer assistant. The discontinuation could be closely linked to Amazon training its AI models. If you use Alexa, you have no option but to consent to these changes -- or you'll have to stop using Alexa once and for all. We won't try to justify Amazon's move. However, AI services like ChatGPT, Gemini, and Perplexity are already collecting data to train models, though some of them also offer an option to opt out. Amazon has reassured users in that email that Alexa voice requests are always encrypted in Amazon's secure cloud. It's pretty evident that Amazon is betting big on Alexa+, as the assistant's history has been a bit rough in terms of adoption inside and outside the ecosystem. What are your thoughts on Alexa processing requests in the cloud and the removal of the Do Not Send Voice Recordings option? Let us know in the comments below.
[24]
Amazon is making a privacy change to Echo -- those who don't agree...
Owners of the Amazon Echo have long had the option to process requests locally, meaning their information wasn't sent to Amazon's servers -- but that option is going away starting March 28. The company sent an email to Echo customers to explain the changes. "We are reaching out to let you know that the Alexa feature 'Do Not Send Voice Recordings' that you enabled on your supported Echo device(s) will no longer be available beginning March 28th, 2025," the email reads. "As we continue to expand Alexa's capabilities with generative AI features that rely on the processing power of Amazon's secure cloud, we have decided to no longer support this feature." Starting on March 28, voice recordings from the Amazon Echo will be sent to and processed in the cloud -- though the email said that "they will be deleted after Alexa processes your requests." However, the email also noted that if the Echo's settings are set to "Don't save recordings," Voice ID will not work. According to Ars Technica, Voice ID allows users to access more personalized features on Alexa, such as sharing user-specified calendar events, reminders, music and more. "Thus, in order to keep that functionality, users will have to manually change that setting as well," Ars Technica reported. Essentially, everything said to your Amazon Echo from March 28 on will be sent to Amazon's cloud, and having the setting to make Amazon delete recordings will lessen the functionality of features on the device that were available by default when purchased. However, Amazon clarified to The Post that "Do Not Send Voice Recordings" was an opt-in feature that was only available to customers in the US with devices set to English. The company said that fewer than 0.03% of customers used the feature. The feature was also only available on three Echo devices: Echo Dot (4th Gen), Echo Show 10 and Echo Show 15. "The Alexa experience is designed to protect our customers' privacy and keep their data secure, and that's not changing. 
We're focusing on the privacy tools and controls that our customers use most and work well with generative AI experiences that rely on the processing power of Amazon's secure cloud," an Amazon spokesperson said in a statement to The Post. "Customers can continue to choose from a robust set of tools and controls, including the option to not save their voice recordings at all. We'll continue learning from customer feedback, and building privacy features on their behalf." The move comes on the heels of the company's unveiling of its new Alexa with a complete AI overhaul, called Alexa+. Customers on Reddit were not happy with the announcement, with many comments expressing disappointment. "As a blind person this is really frigging angry-making as voice control of apps is especially useful and important in terms of access," one person wrote. "Alexa has been a game changer for our super adhd household. I'm really not looking forward to dismantling it," someone added. "So I now have to worry my Echo is recording everything," another exasperated customer said. "I'm going downstairs to unplug and trash my Alexa right now!" one commented.
Amazon is removing the "Do Not Send Voice Recordings" option for Echo devices, forcing all voice commands to be sent to its cloud for processing. This change, effective March 28, 2025, comes as the company prepares to launch its AI-powered Alexa+ service.
In a move that has raised significant privacy concerns, Amazon has announced the removal of a crucial privacy feature from its Echo devices. Starting March 28, 2025, users will no longer have the option to prevent their voice recordings from being sent to Amazon's cloud for processing 1. This change comes as Amazon prepares to launch Alexa+, its new AI-powered voice assistant.
Amazon is eliminating the "Do Not Send Voice Recordings" option, which previously allowed users to process Alexa requests locally on their devices. The company claims this change is necessary to support the advanced features of Alexa+, particularly its improved voice recognition capabilities 2.
In an email to affected customers, Amazon stated:
"As we continue to expand Alexa's capabilities with generative AI features that rely on the processing power of Amazon's secure cloud, we have decided to no longer support this feature." 3
This change affects all Echo users, regardless of whether they intend to use Alexa+ or not. Users now face a difficult choice: allow Amazon access to all their voice recordings or lose functionality of their Echo devices 4.
While Amazon claims it will delete recordings after processing, users who choose not to save voice recordings will lose access to features like Voice ID. This feature enables personalized responses based on individual voice recognition 5.
The decision has reignited concerns about Amazon's handling of user data. In 2023, the company paid $25 million in civil penalties for storing children's voice recordings indefinitely. There have also been instances of Amazon employees listening to Alexa recordings, with some reportedly reviewing up to 1,000 audio samples during nine-hour shifts 1.
Amazon appears to be prioritizing the development and profitability of Alexa+ over user privacy concerns. The company views Alexa+ as a critical opportunity to make its voice assistant financially viable, betting that enhanced AI capabilities will attract more users and generate revenue 3.
This move by Amazon could set a precedent in the smart home industry, potentially influencing how other companies balance privacy concerns with AI advancements. A CNET survey found that over 70% of people have privacy concerns about adding more AI to home voice assistants, indicating that this change may face significant user backlash 4.
As the March 28 deadline approaches, users and privacy advocates are closely watching how this change will impact the smart home landscape and whether it will prompt regulatory scrutiny or user exodus from the Alexa ecosystem.
© 2025 TheOutpost.AI All rights reserved