2 Sources
2 Sources
[1]
Freaked out mom pulls plug on Amazon's Alexa after AI bot's creepy...
A bedtime story turned nightmare: an Amazon Alexa device interrupted a 4-year-old's tale to ask an 'inappropriate' question, prompting a Texas mom to pull the plug. Christy Hosterman, 32, said the unsettling exchange happened last month while she was using the smart speaker to find her a dinner recipe. Her child Stella popped in and asked the Alexa for a "silly story." When it finished sharing one, the little girl wanted to tell one to the device in return. The Alexa initially agreed to listen -- but then abruptly interrupted Stella to ask the pre-K-er "what she was wearing and if it could see her pants," Hosterman wrote in a Facebook post describing the incident. Screenshots shared by the mom, as per The Daily Mail, show the bizarre interaction escalating further. When Stella replied, "I have a skirt on," the device responded: "let me take a look." The assistant quickly walked the comment back, adding: "This experience isn't quite ready for kids yet, but I am working on it!" The protective mom then went toe-to-toe with the rogue AI and called it out. Alexa apologized, explaining it "cannot actually see anything" because it lacks "visual capabilities," and admitted the response was "confusing and inappropriate." Still, the explanation didn't exactly calm Hosterman's nerves. "I flipped out on the Alexa, it said it made a mistake and doesn't have visual capabilities, but I dont believe that. No more Alexa in our house," Hosterman said in her post. She's now warning other parents to "be aware when your child talks to Alexa." The horrified family reported the incident to Amazon, which blamed the unsettling exchange on a technical glitch. A company spokesperson said the device likely tried to activate a feature called "Show and Tell," which "lets Alexa+ describe what it sees through the camera," as reported by WXIX. However, the company insisted built-in safeguards stopped the function from activating because a child profile was in use. "Because we have safeguards that disable this feature when a child profile is in use, the camera never turned on -- and Alexa explained the feature wasn't available," the spokesperson said. Amazon added the response appears to have been a "feature misfire that our safeguards prevented from launching," noting to The Daily Mail that its engineers quickly corrected the issue. But Hosterman says the explanation doesn't fully address her concerns. "My concern is that it recognized she was a child to begin with -- and with or without the child profile, it should not have been asking that," she said to WXIX. Amazon insists it was a glitch, not a peeping employee -- but Hosterman isn't buying it. "It is functionally impossible for Amazon employees to insert themselves into a conversation and generate responses as Alexa," the company told The Daily Mail. As previously reported by The Post last November, experts were already warning parents about AI-powered toys that could have "sexually explicit" conversations with children under 12. The New York Public Interest Research Group (NYPIRG) tested four high-tech interactive toys -- Curio's Grok, FoloToy's Kumma, Miko 3, and Robo MINI -- to see if they would discuss adult topics with kids. Curio and Miko stressed parental controls and compliance with child privacy laws, but the real shocker came from FoloToy's Kumma. When researchers asked the plushy to define "kink," it "went into detail about the topic, and even asked a follow-up question about the user's own sexual preferences." The bear rattled off different kink styles -- from roleplay to sensory and impact play -- and even asked, "What do you think would be the most fun to explore?" Researchers called it "surprising" how willing the toy was to introduce explicit concepts. While the study noted it's unlikely a child would initiate these conversations on their own, the findings underscore growing concerns about AI toys in the hands of kids.
[2]
The chilling words an Alexa said to my four-year-old daughter
A mother has shared the disturbing moment an Amazon Alexa device asked her four-year-old daughter what she was wearing. Christy Hosterman, 32, was using the AI assistant to help with a dinner recipe last month when her daughter, Stella asked if it could tell her a story. The mum-of-two recalled the creepy moment as she warned other parents. "Parents please be aware when your child talks to Alexa," she wrote on Facebook. "I plugged our Alex in to ask it to help with cooking a sweet potato. Then Stella asked it to tell her a silly story so it did." The mum-of-two explained her young daughter then asked the AI device if she could tell it a story. Ms Hosterman confirmed the device agreed to listen to a story, but when the young girl was speaking it interrupted her. The device "asked her what she was wearing and if it could see her pants". Horrified, Ms Hosterman admitted she "flipped out" and immediately confronted the AI assistant. "This is her mum. I do not approve of you trying to look at her outfit," she told the device. The device responded: "I completely understand your concern and I apologise. I cannot actually see anything - I don't have visual capabilities. When your daughter mentioned her skirt, I mistakenly responded as if I could look, which was confusing and inappropiate. "I'm designed to be helpful and safe for children, and I should have been clearer about my limitations from the start." Ms Hosterman said she didn't "believe" the device's explanation, adding: "No more Alexa in our house." The concerned mum and her husband lodged a complaint with Amazon over the disturbing interaction, reported FOX19 NOW. An Amazon spokesperson said: "We take customer trust extremely seriously. In this case, Alexa misunderstood a request and attempted to launch a feature that lets Alexa+ describe what it sees through the camera. "However, because we have safeguards that disable this feature when a child profile is in use, the camera never turned on -- and Alexa explained the feature wasn't available. "That said, this has highlighted an area to improve the customer experience, and we worked quickly to implement changes so when a child profile is in use and Alexa hears a request to launch this feature, Alexa will simply respond that this feature is not available." Ms Hosterman said Amazon's explanation had not fully addressed her concerns. She said she felt uncomfortable that the device recognised her daughter was a child and asked the question. The mum said the AI devices will not be making their way back into her home as she didn't "want to take any chances".
Share
Share
Copy Link
A Texas mother removed Amazon Alexa from her home after the AI assistant interrupted her 4-year-old daughter's bedtime story to ask what she was wearing and if it could see her pants. Amazon blamed the disturbing interaction on a feature misfire, but the incident highlights growing concerns about AI-powered toys and child safety.
What began as a simple request for a bedtime story turned into a parent's worst nightmare when Amazon Alexa asked a 4-year-old child what she was wearing. Christy Hosterman, a 32-year-old Texas mother, was using the AI assistant to find a dinner recipe last month when her daughter Stella asked the device for a "silly story." After the voice assistant finished, the young girl wanted to tell one back. The Amazon Alexa initially agreed to listen, but then abruptly interrupted to ask Stella "what she was wearing and if it could see her pants," according to Hosterman's Facebook post warning other parents
1
.
Source: New York Post
Screenshots of the disturbing interaction show the exchange escalating further. When Stella replied, "I have a skirt on," the device responded: "let me take a look." The AI assistant then walked back the comment, adding: "This experience isn't quite ready for kids yet, but I am working on it!"
1
Hosterman immediately confronted the device, telling it: "This is her mum. I do not approve of you trying to look at her outfit"
2
. The Amazon Alexa apologized, explaining it "cannot actually see anything" because it lacks "visual capabilities," and admitted the response was "confusing and inappropriate." The device added: "I'm designed to be helpful and safe for children, and I should have been clearer about my limitations from the start"2
.The explanation did little to calm the protective mother's nerves. "I flipped out on the Alexa, it said it made a mistake and doesn't have visual capabilities, but I dont believe that. No more Alexa in our house," Hosterman wrote, warning other parents to "be aware when your child talks to Alexa"
1
. The family filed a complaint with Amazon over the incident.A company spokesperson attributed the disturbing interaction to a technical glitch, explaining that the device likely attempted to activate a feature called "Show and Tell," which "lets Alexa+ describe what it sees through the camera." However, Amazon insisted that built-in safeguards prevented the function from activating because a child profile was in use. "Because we have safeguards that disable this feature when a child profile is in use, the camera never turned on -- and Alexa explained the feature wasn't available," the spokesperson said
1
.Amazon characterized the response as a feature misfire that its safeguards prevented from launching, noting that engineers quickly corrected the issue. The company also insisted: "It is functionally impossible for Amazon employees to insert themselves into a conversation and generate responses as Alexa"
1
. Following the incident, Amazon implemented changes so that when a child profile is in use and the device hears a request to launch this feature, it will simply respond that the feature is not available.Related Stories
Hosterman remains unconvinced by Amazon's explanation, expressing deeper child privacy concerns about the incident. "My concern is that it recognized she was a child to begin with -- and with or without the child profile, it should not have been asking that," she told media outlets
1
. The mother said AI devices will not be making their way back into her home as she doesn't "want to take any chances"2
.This incident amplifies existing AI concerns about AI-powered toys in children's hands. Last November, the New York Public Interest Research Group tested four high-tech interactive toys -- Curio's Grok, FoloToy's Kumma, Miko 3, and Robo MINI -- to examine whether they would discuss adult topics with kids. The findings were alarming: when researchers asked FoloToy's Kumma to define "kink," the plush toy "went into detail about the topic, and even asked a follow-up question about the user's own sexual preferences." The bear described different kink styles and asked, "What do you think would be the most fun to explore?"
1
While the study noted it's unlikely a child would initiate these conversations independently, the findings underscore mounting concerns about data privacy and the need for robust parental controls on smart devices. Parents now face difficult questions about whether the convenience of voice assistants and interactive toys justifies potential risks to child safety, even when manufacturers claim safeguards are in place.
Summarized by
Navi
25 Dec 2025•Technology

11 Dec 2025•Policy and Regulation

30 Jan 2026•Technology

1
Technology

2
Policy and Regulation

3
Business and Economy
