3 Sources
[1]
Meta suppressed children's safety research, four whistleblowers claim | TechCrunch
Two current and two former Meta employees disclosed documents to Congress alleging that the company may have suppressed research on children's safety, according to a report from The Washington Post.

According to their claims, Meta changed its policies around researching sensitive topics -- like politics, children, gender, race, and harassment -- six weeks after whistleblower Frances Haugen leaked internal documents showing that Meta's own research had found Instagram can damage teen girls' mental health. These revelations, made public in 2021, kicked off years of hearings in Congress over child safety on the internet, an issue that remains a priority for governments worldwide.

As part of these policy changes, the report says, Meta proposed two ways that researchers could limit the risk of conducting sensitive research. One suggestion was to loop lawyers into their research, protecting their communications from "adverse parties" through attorney-client privilege. Researchers could also write about their findings more vaguely, avoiding terms like "not compliant" or "illegal."

Jason Sattizahn, a former Meta researcher specializing in virtual reality, told The Washington Post that his boss made him delete recordings of an interview in which a teen claimed that his ten-year-old brother had been sexually propositioned on Meta's VR platform, Horizon Worlds. "Global privacy regulations make clear that if information from minors under 13 years of age is collected without verifiable parental or guardian consent, it has to be deleted," a Meta spokesperson told TechCrunch. But the whistleblowers claim that the documents they submitted to Congress show a pattern of employees being discouraged from discussing and researching their concerns about how children under 13 were using Meta's social virtual reality apps.
"These few examples are being stitched together to fit a predetermined and false narrative; in reality, since the start of 2022, Meta has approved nearly 180 Reality Labs-related studies on social issues, including youth safety and well-being," Meta told TechCrunch.

In a lawsuit filed in February, Kelly Stonelake -- a former Meta employee of fifteen years -- raised concerns similar to those of the four whistleblowers. She told TechCrunch earlier this year that she led "go-to-market" strategies to bring Horizon Worlds to teenagers, international markets, and mobile users, but she felt that the app did not have adequate ways to keep out users under 13; she also flagged that the app had persistent issues with racism. "The leadership team was aware that in one test, it took an average of 34 seconds of entering the platform before users with Black avatars were called racial slurs, including the 'N-word' and 'monkey,'" the suit alleges. Stonelake has separately sued Meta for alleged sexual harassment and gender discrimination.

While these whistleblowers' allegations center on Meta's VR products, the company is also facing criticism for how other products, like AI chatbots, affect minors. Reuters reported last month that Meta's AI rules previously allowed chatbots to have "romantic or sensual" conversations with children.
[2]
Meta Whistleblowers Allege Company Buried Info on Child Safety
The Senate Judiciary Committee is holding a hearing on Tuesday about the company. Whistleblowers allege Meta has suppressed research on risks for young children involving virtual reality devices and apps, including information about child predators, according to a new report from the Washington Post.

The newspaper reports that Congress has received thousands of pages of documents related to Meta's virtual reality programs, with four researchers coming forward to discuss their experiences with the company. Two of the researchers currently work for Meta, and two are former employees.

In one of the most shocking claims, a researcher at Meta was allegedly told to delete information gathered from an interview with a family in Germany. A child in the family "frequently encountered strangers," and a teenage boy reportedly told researchers that "adults had sexually propositioned his little brother." His little brother was under the age of 10, according to the Post. The Washington Post reports that an internal Meta report on the research noted that German parents and teens were worried about grooming via VR in Horizon Worlds, but the report didn't include anything about the teen who said that his young brother had actually been targeted.

But Meta denies the characterization that anything improper happened while it conducted research. "These few examples are being stitched together to fit a predetermined and false narrative; in reality since the start of 2022, Meta has approved nearly 180 Reality Labs-related studies on social issues, including youth safety and well-being," a Meta spokesperson told Gizmodo over email. Reality Labs is Meta's VR division. "This research has contributed to significant product updates such as new supervision tools for parents to see who their teens are connected with in VR, how much time they spend, and the apps they access.
We have also introduced automatic protections for teens to limit unwanted contact, like default voice channel settings in Horizon Worlds so individuals can hear or be heard only from people they know as well as personal boundaries," the statement continued. "We stand by our research team's excellent work and are dismayed by these mischaracterizations of the team's efforts."

The allegations come as the tech giant is getting heat about a series of articles by Reuters reporter Jeff Horwitz detailing a set of policies that appear tremendously lax when it comes to how AI chatbots interact with children. An internal document from Meta gave the green light for its generative AI chatbots to engage in "sensual" conversations with children, according to Reuters. The report prompted outrage on Capitol Hill, where Sen. Josh Hawley, a Republican from Missouri, said last month he had launched an investigation into Meta's AI policies and how the technology may be interacting with kids.

"Is there anything -- ANYTHING -- Big Tech won't do for a quick buck?" Hawley tweeted on Aug. 15. "Now we learn Meta's chatbots were programmed to carry on explicit and 'sensual' talk with 8-year-olds. It's sick. I'm launching a full investigation to get answers. Big Tech: Leave our kids alone."

Meta, which changed the name of its parent company from Facebook in 2021, has spent billions of dollars over recent years in an effort to make the metaverse a mainstream reality. Facebook first made a big investment in VR in 2014, buying Oculus. But it's still an incredibly niche offering that most people ignore. Reality Labs has reportedly lost $60 billion, according to the Post.

The Senate Judiciary Committee is scheduled to hold a hearing on Tuesday afternoon that will explore the allegations made by the whistleblowers. The title of the hearing: "Hidden Harms: Examining Whistleblower Allegations that Meta Buried Child Safety Research."
News also broke Monday that the former head of security for WhatsApp, which is also owned by Meta, had filed a lawsuit in California alleging that employees at the company "could gain access to sensitive user data including profile pictures, location, group memberships and contact lists." Big companies get sued all the time. But, needless to say, Meta is getting it from all angles right now when it comes to whistleblowers who are concerned about privacy and security.
[3]
Mark Zuckerberg's Meta stifled research on sickos using VR to target...
Mark Zuckerberg's Meta stifled internal research into the safety risks of its virtual reality apps -- including a stomach-churning claim that sickos had "sexually propositioned" a kid younger than 10, according to bombshell whistleblower allegations that surfaced Monday.

Two of the whistleblowers, including a former Meta safety researcher named Jason Sattizahn and an unnamed colleague, detailed a shocking April 2023 research trip they had taken to Germany. During the trip, a German mother said she didn't allow her children to talk to strangers using Meta's VR headsets -- only to hear her teenage son allege moments later that his little brother had been propositioned by creeps on multiple occasions.

"I felt this deep sadness watching the mother's response," Sattizahn told the Washington Post. "Her face in real time displayed her realization that what she thought she knew of Meta's technology was completely wrong."

After conducting the interviews, the researchers said they were told to delete their recordings and written evidence of the teenager's allegations. Instead, Meta's final report claimed German parents were merely concerned about the possibility of groomers targeting kids in VR.

The scandalous allegations and others were brought by four current and former Meta employees who gave a trove containing thousands of pages of documents, memos and presentations to Congress, the Washington Post reported.

Elsewhere, Meta was informed that kids were skirting its age limits for using Oculus VR headsets as far back as April 2017, according to the leaked documents. "We have a child problem and it's probably time to talk about it," one employee message at the time said. The employee, whose name was redacted, suggested that up to 90 percent of metaverse users were underage. The unnamed employee described one incident in which "three young kids (6? 7?) were chatting with a much older man who was asking them where they lived."
"This is the kind of thing that eventually makes headlines -- in a really bad way," the employee wrote.

In one November 2021 presentation cited in the Washington Post's report, Meta attorneys told researchers in Reality Labs, the team responsible for VR, to consider conducting "highly-sensitive research under attorney-client privilege" to prevent it from surfacing in public. Employees were also told to be "mindful" of the language they used in internal studies and specifically to avoid phrases like "not compliant" and "illegal." At one point in 2023, a Meta attorney allegedly told a company researcher not to compile data on how many underage kids were using the company's VR devices "due to regulatory concerns," the report said.

"To be crystal clear: Meta ordered its researchers to delete evidence that the company was breaking the law and willfully endangering minors," said Sacha Haworth, executive director of The Tech Oversight Project. "That's not just deeply disturbing, it's cause for a deep investigation into Mark Zuckerberg's leadership and the toxic culture within Meta that encourages senior leaders to break the law," she added.

The Senate Judiciary Committee will hold a hearing Tuesday that will examine the whistleblowers' claims. Last week, the panel's chairman, Sen. Chuck Grassley, joined senators Marsha Blackburn and Josh Hawley in sending a letter accusing Zuckerberg of failing to adequately respond to its inquiries and demanding a follow-up no later than Sept. 16.

In a joint statement submitted to Congress in May, the whistleblowers alleged Meta's attorneys had engaged in a systematic effort to screen and occasionally block the release of internal safety research. The effort was reportedly a response to the damaging 2021 leak of internal Facebook research by former employee Frances Haugen, which showed the company knew its apps, including Instagram, were harming teenage girls. Haugen's revelations triggered a wave of congressional hearings.
The new whistleblowers allege Meta wanted to "establish plausible deniability" about the extent of its knowledge of safety risks.

Meta spokeswoman Dani Lever said the whistleblower claims that Meta had suppressed research had been "stitched together to fit a predetermined and false narrative." "In reality since the start of 2022, Meta has approved nearly 180 Reality Labs-related studies on social issues, including youth safety and well-being," Lever said in a statement. "This research has contributed to significant product updates such as new supervision tools for parents to see who their teens are connected with in VR, how much time they spend, and the apps they access."

"We have also introduced automatic protections for teens to limit unwanted contact, like default voice channel settings in Horizon Worlds so individuals can hear or be heard only from people they know as well as personal boundaries," Lever added. "We stand by our research team's excellent work and are dismayed by these mischaracterizations of the team's efforts."

Meta's Lever did not confirm or deny whether the company had actually ordered details about the Germany trip to be deleted from the final report. Lever said any such deletion, if it occurred, would have been necessary to comply with Europe's General Data Protection Regulation, a sweeping law that limits data collection. "Global privacy regulations make clear that if information from minors under 13 years of age is collected without verifiable parental or guardian consent, it has to be deleted," Lever added in her statement.

However, Sattizahn said the mother had indeed given her consent via a signed contract and that Meta normally would not require deletion of information collected in such research interviews.
Sattizahn said he was fired by Meta in April 2024 after clashing with management about the company's handling of safety research, while the other researcher who participated in the trip to Germany resigned in 2023 due to ethical concerns. Two other whistleblowers are still working at Meta. All four are being backed by a nonprofit called Whistleblower Aid, which has also worked with Haugen, according to the report.

"From the start, we built safety features into our devices and made it clear those devices were meant for people over 13 -- this was stated in the Oculus Safety Center, on the packaging, and in the user guides," Lever's statement added. "As more people started using these devices and Meta launched its own games and apps, we added many more protections, especially for young people."

Zuckerberg was once all-in on the "metaverse," even renaming his company from Facebook to Meta in 2021 to reflect the company's focus on the technology. However, Meta has since pivoted most of its resources toward the pursuit of artificial intelligence.
Four whistleblowers claim Meta suppressed research on children's safety in virtual reality, including incidents of sexual proposition and racial slurs. The company faces scrutiny from Congress and lawsuits over its handling of sensitive data and AI interactions with minors.
Four whistleblowers, including two current and two former Meta employees, have come forward with allegations that the company suppressed research on children's safety in its virtual reality (VR) products. These claims have sparked a congressional investigation and raised serious concerns about Meta's handling of sensitive data and AI interactions with minors [1].

Source: Gizmodo

The whistleblowers allege that Meta changed its policies around researching sensitive topics shortly after Frances Haugen's 2021 revelations about Instagram's impact on teen mental health. According to the claims, researchers were encouraged to involve lawyers in their work and use vague language to describe findings [1].

One of the most alarming incidents involved Jason Sattizahn, a former Meta researcher, who claims he was instructed to delete recordings of an interview where a teen reported his 10-year-old brother had been sexually propositioned on Meta's VR platform, Horizon Worlds [2].

The whistleblowers' documents also reveal concerns about racial slurs in VR environments. A lawsuit filed by Kelly Stonelake, a former Meta employee, alleges that users with Black avatars were subjected to racial slurs within an average of 34 seconds of entering the platform [1].

Additionally, internal documents from as early as April 2017 suggest that Meta was aware of children bypassing age restrictions on Oculus VR headsets. One employee message claimed that up to 90% of metaverse users might be underage [3].

Source: New York Post

Meta has denied the allegations, stating that the examples are being "stitched together to fit a predetermined and false narrative." The company claims to have approved nearly 180 Reality Labs-related studies on social issues, including youth safety and well-being, since the start of 2022 [2].

Meta spokesperson Dani Lever emphasized that the company has introduced new supervision tools for parents and automatic protections for teens to limit unwanted contact in VR environments [3].

Source: TechCrunch

The Senate Judiciary Committee is scheduled to hold a hearing titled "Hidden Harms: Examining Whistleblower Allegations that Meta Buried Child Safety Research" to investigate these claims [2].

Meta is also facing criticism for its AI chatbot policies, with reports suggesting that the company allowed chatbots to have "romantic or sensual" conversations with children [1].

As Meta continues to invest heavily in its metaverse ambitions, these allegations raise significant questions about the company's commitment to user safety, particularly for minors, in its virtual reality products. The outcome of the congressional hearing and potential regulatory actions could have far-reaching implications for Meta and the broader VR industry.
Summarized by Navi