5 Sources
[1]
AI firms pledge to fight CSAM with new safety tools
Google, OpenAI, Discord, and Roblox have formed a non-profit organisation to improve child safety online, as per a report by The Verge. The companies launched the Robust Open Online Safety Tools (ROOST) initiative at the recently concluded AI Action Summit in Paris, France. It aims to bring together the resources, investments, and expertise of major technology companies and philanthropies to build "scalable, interoperable safety infrastructure suited for the AI era". A broad range of partners from AI, philanthropy, academia, open source, and child safety has offered to support the non-profit. The initiative will make core safety technologies more user-friendly and accessible, providing free, open-source tools to identify, review, and report child sexual abuse material (CSAM). ROOST says it is teaming up with leading AI foundation model developers to build a "community of practice" for content safeguards, which will involve providing vetted AI training datasets and identifying safety gaps. Former Google CEO and ROOST founding partner Eric Schmidt said: "Starting with a platform focused on child protection, ROOST's collaborative, open-source approach will foster innovation and make essential infrastructure more transparent, accessible, and inclusive." The non-profit will help organisations of any size integrate robust safety measures while continuing to innovate in the constantly evolving generative AI landscape. The move comes amid a colossal regulatory battle within the United States over child safety on social media and other online platforms, as companies seek to placate US lawmakers with self-regulation measures. Over half of American children were on Roblox as of 2020, and critics have repeatedly lambasted the company for failing to tackle child sexual exploitation and exposure to inappropriate content on its platform. A social media lawsuit filed in 2022 also accused Roblox and Discord of failing to prevent adults from messaging children without supervision. Clint Smith, Discord's Chief Legal Officer and Board Chair of ROOST, said: "While safety is integrated into every aspect of our work at Discord, we also know that continued industry collaboration is essential." Earlier, in April 2024, major AI companies including Meta, Google, OpenAI, and Microsoft pledged to prevent the spread of AI-generated CSAM, saying they would follow the Safety by Design approach to ensure AI training datasets are not corrupted by CSAM.
[2]
Roblox, Discord, OpenAI, and Google found new child safety group
Google, OpenAI, Roblox, and Discord have formed a new non-profit organization to help improve child safety online. The Robust Open Online Safety Tools (ROOST) initiative aims to make core safety technologies more accessible for companies and to provide free, open-source AI tools for identifying, reviewing, and reporting child sexual abuse material. The initiative was partly motivated by the changes generative AI has brought to online environments and aims to address "a critical need to accelerate innovation in online child safety," according to founding ROOST partner and former Google CEO Eric Schmidt. Details about the CSAM detection tools are slim, beyond the fact that they will use large language models and "unify" existing options for dealing with the content.
[3]
Roblox, Discord, OpenAI and Google found new child safety group
Roblox, Discord, OpenAI, and Google are launching a nonprofit organization called ROOST, or Robust Open Online Safety Tools, which hopes "to build scalable, interoperable safety infrastructure suited for the AI era." The organization plans to provide free, open-source safety tools that public and private organizations can use on their own platforms, with an initial focus on child safety. The press release announcing ROOST specifically calls out plans to offer "tools to detect, review, and report child sexual abuse material (CSAM)." Partner companies are providing both the funding for these tools and the technical expertise to build them. ROOST's operating theory is that access to generative AI is rapidly changing the online landscape, making the need for "reliable and accessible safety infrastructure" all the more urgent. Rather than expecting a smaller company or organization to create its own safety tools from scratch, ROOST wants to provide them free of charge. Child online safety has been the issue du jour since the Children and Teens' Online Privacy Protection Act (COPPA 2.0) and the Kids Online Safety Act (KOSA) started making their way through Congress, though both failed to pass in the House. At least some of the companies involved in ROOST, specifically Google and OpenAI, have already pledged to stop AI tools from being used to generate CSAM. The child safety issue is even more pressing for Roblox: as of 2020, two-thirds of all US children between nine and 12 played Roblox, and the platform has historically struggled to address child safety. In 2024, Bloomberg Businessweek reported that the company had a "pedophile problem," which prompted multiple policy changes and new restrictions on children's DMs. ROOST won't make all of these problems go away, but it should make dealing with them easier for any other organization or company that finds itself in Roblox's position.
[4]
Tech companies raise more than $27 million to build infrastructure for kids online safety
A group of major technology companies including OpenAI and Discord has raised more than $27 million for a new initiative focused on building open-source tools to boost online safety for kids. The project, dubbed Robust Open Online Safety Tools (ROOST), was created to "build scalable, interoperable safety infrastructure suited for the AI [artificial intelligence] era." ROOST, which was announced Monday at the AI Action Summit in Paris, will provide free, open-source tools to detect, review, and report child sexual abuse material, and will use large language models to "power safety infrastructure," according to a press release for the project. The founding partners of the ROOST project include Discord, OpenAI, Google, Roblox, and former Google CEO Eric Schmidt. ROOST "addresses a critical need to accelerate innovation in online child safety and AI" and will give small companies and nonprofits access to technologies they would otherwise lack, Schmidt said in a statement Monday. "We see AI as part of the solution and by combining the expertise of the different partners and sharing that knowledge with smaller companies and public organisations, we can make it easier to introduce robust online safety measures and make the digital world safer for everyone," Ryan Beiermeister, vice president of OpenAI's product policy, said Monday. The $27 million raised so far will cover the first four years of operation for ROOST, which will be run out of the Institute of Global Politics at Columbia University's School of International and Public Affairs. It comes amid mounting pressure on social media and technology companies to take action to prevent further harm to children and teens. According to the National Center for Missing and Exploited Children, reports of suspected online exploitation of children increased by 12 percent in 2023 from the previous year, amounting to more than 36.2 million reports. As AI development continues to ramp up, the project will harness the emerging technology to track and take down child exploitation on the internet. Other organizations including the AI Collaborative, Project Liberty Institute, Patrick J. McGovern Foundation, and Knight Foundation are also founding partners, while groups like the Center for Democracy and Technology (CDT) and social media platform Bluesky are partners. Bluesky rose in popularity late last year as some users departed X in the wake of Elon Musk's leadership. Discord was one of five companies to testify at a Senate committee hearing last year that brought kids' online safety into the spotlight. The company was notably not a large focus of the hearing, but has launched various efforts over the past year to boost safety and privacy on Discord. "Offering a platform that is safe and that fosters meaningful connection, especially for young people, is at the center of everything we do," Kate Sheerin, head of U.S. public policy, told The Hill last month when asked about the one-year anniversary of the hearing. Ahead of Safer Internet Day on Tuesday, Discord also launched a new feature, "Ignore," which will allow users to "take space" from specific users without their knowledge. While there is currently a block feature, some users told Discord that blocking can feel "confrontational and scary," the company said.
[5]
Roblox joins $27 million industry nonprofit to support online safety
A group of internet businesses, including Roblox, Google, OpenAI, and Discord, has cofounded a nonprofit called Robust Open Online Safety Tools (ROOST). The new organization will fund free, open-source tools that online businesses can use to promote online safety, says Naren Koneru, Roblox's vice president of engineering, trust, and safety. The move follows years of effort by Roblox to restrict inappropriate messaging on its platform, which is widely used by children and has at times come under fire for not doing enough to combat sexual content and adult sexual predators. And while human moderators are part of that equation, AI and automation have become critical for intercepting unwanted messages in real time across the platform's 85 million daily active users, Koneru says. "These decisions need to happen within milliseconds," he says.
Google, OpenAI, Discord, and Roblox form a non-profit organization called ROOST to develop AI-powered tools for combating online child exploitation and improving digital safety.
In a significant move to enhance online child safety, major technology companies including Google, OpenAI, Discord, and Roblox have joined forces to create a non-profit organization called the Robust Open Online Safety Tools (ROOST) initiative [1]. Announced at the AI Action Summit in Paris, ROOST aims to develop scalable, interoperable safety infrastructure suited for the AI era [2].
The initiative has raised over $27 million to fund its first four years of operation [4]. ROOST's primary goal is to provide free, open-source tools for identifying, reviewing, and reporting child sexual abuse material (CSAM). These tools will utilize large language models to power safety infrastructure, making them accessible to organizations of all sizes [3].
ROOST plans to leverage AI advancements to address the rapidly changing online landscape. The organization will work with leading AI foundation model developers to build a "community of practice" for content safeguards, providing vetted AI training datasets and identifying safety gaps [1]. Naren Koneru, Roblox's vice president of engineering, trust, and safety, emphasized the critical role of AI and automation in intercepting unwanted messages in real time across platforms with millions of daily active users [5].
The formation of ROOST comes at a time when online child safety is a pressing issue. According to the National Center for Missing and Exploited Children, reports of suspected online exploitation of children increased by 12 percent in 2023, amounting to over 36.2 million reports [4]. The initiative is particularly significant for platforms like Roblox, which has faced criticism for its handling of child safety issues, given that two-thirds of US children between nine and 12 play on the platform [3].
ROOST represents a collaborative effort among tech giants to address online safety challenges collectively. Former Google CEO and ROOST founding partner Eric Schmidt said the initiative "addresses a critical need to accelerate innovation in online child safety and AI" [4]. The project will be run out of the Institute of Global Politics at Columbia University's School of International and Public Affairs, with support from organizations across AI, philanthropy, academia, and child safety [1].