Curated by THEOUTPOST
On Thu, 19 Sept, 4:07 PM UTC
20 Sources
[1]
Facebook, YouTube, WhatsApp All 'Engaged in Vast Surveillance' to Earn Billions, According to the FTC
FTC chair Lina M. Khan said these social media companies "endanger people's privacy." In December 2020, the Federal Trade Commission ordered the biggest social media and streaming companies in the world, including Twitch owner Amazon, Facebook (now Meta), YouTube, Reddit, WhatsApp, Twitter (now X), Snap, Discord and TikTok's ByteDance, to share how they used their users' personal information. On Thursday, FTC staff released a 129-page report on what they found: these companies "harvest an enormous amount of Americans' personal data and monetize it to the tune of billions of dollars a year," FTC chair Lina M. Khan said. "While lucrative for the companies, these surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking," Khan said.

The report called out major social media companies for collecting vast swaths of personal data and using it in ways their users may not expect. The FTC found, for example, that "many" of these companies buy data from third-party brokers about where a user is located, how much they make per year, and what their interests are, to learn more about a user's activity on the Internet outside of the social media platform. This personal information becomes the basis of targeted ads, which most social media sites rely on for revenue. Meta, the parent company of Facebook, Instagram, WhatsApp, and other products and platforms, reported that 98% of its $39.07 billion revenue in its second quarter came from ads on Facebook and Instagram.

According to the FTC report, it's difficult for users to understand how social media platforms collect their information and how much of it is used to tailor ads, and many may not even be aware of what's happening behind the scenes. Even users who are tuned in and know that social media platforms are using their data still don't have "any meaningful control over how personal information [is] used," the report found. Companies use personal information to fuel algorithms, data analytics, and AI that, in turn, shape content recommendations, search, advertising, and other crucial aspects of their business.

The FTC recommended that companies be transparent about the data they collect, do more to protect privacy, and put users in charge of their data. The agency further found that when a user asks to delete their data, some sites will de-identify the data they have on hand but keep it on file instead of wiping it all. Even the platforms that did delete personal data upon request selected which parts to delete and failed to remove all of it, according to the report.

"Companies can and should do more to protect consumers' privacy, and Congress should enact comprehensive federal privacy legislation that limits surveillance and grants consumers data rights," the report stated.
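The report's distinction between de-identifying data and actually deleting it can be made concrete. The sketch below is a hypothetical illustration, with an invented record store, field names, and salt; it is not any company's actual pipeline. Here "de-identification" merely re-keys the behavioral records under a salted hash of the user ID, while deletion removes them outright.

```python
import hashlib

# Hypothetical in-memory store of behavioral records, keyed by user ID.
records = {
    "user_123": [{"event": "ad_click", "topic": "fitness"},
                 {"event": "page_view", "topic": "travel"}],
    "user_456": [{"event": "like", "topic": "cooking"}],
}

SALT = "example-salt"  # illustrative only


def de_identify(user_id):
    """Re-key the user's records under a salted hash but keep them all.

    This mirrors the practice the report describes: the data is no longer
    stored under the literal account identifier, yet nothing is erased.
    """
    pseudonym = hashlib.sha256((SALT + user_id).encode()).hexdigest()
    records[pseudonym] = records.pop(user_id)


def delete(user_id):
    """Actually remove the user's records in response to a deletion request."""
    records.pop(user_id, None)


de_identify("user_123")
delete("user_456")
print(len(records))  # 1: user_123's data survives under a hash; user_456's is gone
```

Both paths remove the literal account name from the store, but only genuine deletion erases the underlying data, which is the distinction the report draws.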
[2]
Social media users lack control over data used by AI, US FTC says
Social media companies collect, share and process vast troves of information about their users while offering little transparency or control, including over how it is used by systems incorporating artificial intelligence, the U.S. Federal Trade Commission said in a report released on Thursday. The report analyzed how Meta Platforms, ByteDance's TikTok, Amazon's gaming platform Twitch, and others manage user data, concluding that data management and retention policies at many of the companies were "woefully inadequate." YouTube, social media platform X, Snap, Discord and Reddit were also included in the FTC report, though its findings were anonymized and did not reveal specific companies' practices. YouTube is owned by Alphabet's Google. Discord, a communications platform, said the report lumps very different business models into one category, and that it did not offer advertising at the time the study was conducted. An X spokesperson said the report is based on practices from 2020 when the site was known as Twitter, which X has since improved. "X takes user data privacy seriously and ensures users are aware of the data they are sharing with the platform and how it is being used, while providing them with the option of limiting the data that is collected from their accounts," the spokesperson said. Only about 1% of X's current U.S. users are between ages 13 and 17, the spokesperson said. Other companies did not immediately reply to requests for comment. Social media companies gather data through tracking technologies used in online advertising and buying information from data brokers, and other means, the FTC said. "While lucrative for the companies, these surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking," said FTC Chair Lina Khan. Data privacy, particularly for kids and teens, has been a hot-button issue. The U.S. House of Representatives is considering bills passed by the Senate in July aimed at addressing social media's effects on younger users. And Meta recently rolled out teen accounts that incorporate enhanced parental controls. Meanwhile, Big Tech companies have been scrambling to acquire sources of data to train their emerging artificial-intelligence technologies. The data deals are infrequently disclosed and often involve private content locked behind paywalls and login screens, with scant or no notice to the users who posted it. In addition to collecting data about how users engage with their services, most of the companies the FTC reviewed collected users' age and gender or guessed it based on other information. Some also gathered information on users' income, education and family status, the FTC said. Companies gathered data on individuals who did not use their services, and some were not able to identify all of the ways they collected and used data, the FTC said. Advertising industry groups criticised the report on Thursday, saying that consumers recognize the value of ad-supported services. "We are disappointed with the FTC's continued characterization of the digital advertising industry as engaged in 'mass commercial surveillance,'" said David Cohen, chief executive of the Interactive Advertising Bureau, an advertising and marketing group which counts Snapchat, TikTok and Amazon among its members. (Reporting by Jody Godoy in New York; Editing by Matthew Lewis, Andrea Ricci and Stephen Coates)
[3]
Meta, TikTok, Twitch, and other social media platform users lack control over data used by AI: U.S. FTC
Social media companies collect, share and process vast troves of information about their users while offering little transparency or control, including over how it is used by systems incorporating artificial intelligence, the U.S. Federal Trade Commission said in a report released on Thursday. The report analysed how Meta Platforms, ByteDance's TikTok, Amazon's gaming platform Twitch, and others manage user data, concluding that data management and retention policies at many of the companies were "woefully inadequate." YouTube, social media platform X, Snap, Discord and Reddit were also included in the FTC report, though its findings were anonymised and did not reveal specific companies' practices. YouTube is owned by Alphabet's Google. Discord, a communications platform, said the report lumps very different business models into one category, and that it did not offer advertising at the time the study was conducted. Other companies did not immediately reply to requests for comment. Social media companies gather data through tracking technologies used in online advertising and buying information from data brokers, and other means, the FTC said. "While lucrative for the companies, these surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking," said FTC Chair Lina Khan. Data privacy, particularly for kids and teens, has been a hot-button issue. The U.S. House of Representatives is considering bills passed by the Senate in July aimed at addressing social media's effects on younger users. And Meta recently rolled out teen accounts that incorporate enhanced parental controls. Meanwhile, Big Tech companies have been scrambling to acquire sources of data to train their emerging artificial-intelligence technologies. The data deals are infrequently disclosed and often involve private content locked behind paywalls and login screens, with scant or no notice to the users who posted it. In addition to collecting data about how users engage with their services, most of the companies the FTC reviewed collected users' age and gender or guessed it based on other information. Some also gathered information on users' income, education and family status, the FTC said. Companies gathered data on individuals who did not use their services, and some were not able to identify all of the ways they collected and used data, the FTC said. Advertising industry groups criticised the report on Thursday, saying that consumers recognise the value of ad-supported services. "We are disappointed with the FTC's continued characterization of the digital advertising industry as engaged in 'mass commercial surveillance,'" said David Cohen, chief executive of the Interactive Advertising Bureau, an advertising and marketing group which counts Snapchat, TikTok and Amazon among its members. Published - September 20, 2024 10:15 am IST
[4]
Social media and online video firms are conducting 'vast surveillance' on users, FTC finds
Agency accuses Meta, Google, TikTok and other companies of sharing troves of user information with third-parties Social media and online video companies are collecting huge troves of your personal information on and off their websites or apps and sharing it with a wide range of third-party entities, a new Federal Trade Commission (FTC) staff report on nine tech companies confirms. The FTC report published on Thursday looked at the data-gathering practices of Facebook, WhatsApp, YouTube, Discord, Reddit, Amazon, Snap, TikTok and Twitter/X between January 2019 and 31 December 2020. The majority of the companies' business models incentivized tracking how people engaged with their platforms, collecting their personal data and using it to determine what content and ads users see on their feeds, the report states. The FTC's findings validate years of reporting on the depth and breadth of these companies' tracking practices and call out the tech firms for "vast surveillance of users". The agency is recommending Congress pass federal privacy regulations based on what it has documented. In particular, the agency is urging lawmakers to recognize that the business models of many of these companies do little to incentivize effective self-regulation or protection of user data. "Recognizing this basic fact is important for enforcers and policymakers alike because any efforts to limit or regulate how these firms harvest troves of people's personal data will conflict with their primary business incentives," FTC chair Lina Khan said in a statement. "To craft effective rules or remedies limiting this data collection, policymakers will need to ensure that violating the law is not more lucrative than abiding by it." The FTC is also asking that the companies mentioned in the report invest in "limiting data retention and sharing, restricting targeted advertising, and strengthening protections for teens". Notably, the report highlights how consumers have little control over how these companies use and share their personal details. Most companies collected or inferred demographic information about users such as age, gender and language. Some collected information about household income, education and parental and marital status. But even when this type of personal information was not explicitly collected, some companies could analyze user behavior on the platform to deduce the details of their personal lives without their knowledge. For instance, some companies' user interest categories included "baby, kids and maternity" which would reveal parental status or "newlyweds" and "divorce support", which would reveal marital status. This information was then used by some companies to tailor what content people saw to increase engagement on their platforms. In some cases, that demographic information was shared with third-party entities to help target them with more relevant advertisements. Whatever product was in use, it was not easy to opt out of data collection, according to the FTC. Nearly all the companies said they fed personal information to automated systems, most often to serve content and advertisements. On the flipside, almost none of them offered "a comprehensive ability to directly control or opt-out of use of their data by all Algorithms, Data Analytics, or AI", per the report. Several firms say it's impossible to even compile a full list of who they share your data with. 
When the companies were asked to enumerate which advertisers, data brokers or other entities they shared consumer data with, none of these nine firms provided the FTC with a full inventory. The FTC also found that despite evidence that children and teens use many of these platforms, many of the tech companies reported that, because their platforms are not directed at children, they do not need different data-sharing practices for children under 13 years of age. According to the report, none of the companies reported having data-sharing practices that treated the information collected about and from 13- to 17-year-olds via their sites and apps differently than adult data, even though data about minors is more sensitive. The FTC called the companies' data-minimization practices "woefully inadequate", finding that some of the companies did not delete information when users requested it. "Even those Companies that actually deleted data would only delete some data, but not all," the report stated. "That is the most basic requirement," said Mario Trujillo, a staff attorney at the Electronic Frontier Foundation. "The fact that some weren't doing that even in the face of state privacy laws that require it proves that stronger enforcement is needed, especially from consumers themselves." Some of the firms have disputed the report's findings. In a statement, Discord said the FTC report was an important step but lumped "very different models into one bucket." "Discord's business model is very different - we are a real-time communications platform with strong user privacy controls and no feeds for endless scrolling. At the time of the study, Discord did not run a formal digital advertising service," Kate Sheerin, Discord's head of public policy in the US and Canada, said in a statement. A Google spokesperson said the company had the strictest privacy policies in the industry. "We never sell people's personal information and we don't use sensitive information to serve ads. We prohibit ad personalization for users under 18 and we don't personalize ads to anyone watching 'made for kids content' on YouTube," said Google spokesperson José Castañeda. The other firms either did not provide an on-the-record comment or did not immediately respond to a request for comment. However, if companies dispute the FTC's findings, the onus is on them to provide evidence, says the Electronic Privacy Information Center (Epic), a Washington DC-based public interest research organization focused on privacy and free speech. "I used to work in privacy compliance for companies, and let's just say I believe absolutely nothing without documentation to back up claims," said Epic global privacy counsel Calli Schroeder. "And I agree with the FTC's conclusion that self-regulation is a failure. Companies have repeatedly shown that their priority is profit and they will only take consumer protection and privacy issues seriously when failing to do so affects that profit."
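The Guardian piece above describes how interest categories such as "baby, kids and maternity" or "newlyweds" can reveal parental or marital status without the user ever stating them. The toy sketch below illustrates that inference step under stated assumptions: the category names and rules are hypothetical and are not drawn from any platform's actual targeting system.

```python
# Hypothetical mapping from ad-interest categories to inferred attributes.
CATEGORY_INFERENCES = {
    "baby, kids and maternity": {"parental_status": "parent"},
    "newlyweds": {"marital_status": "married"},
    "divorce support": {"marital_status": "divorced"},
}


def infer_profile(interest_categories):
    """Derive demographic guesses purely from category membership.

    The user never answers a question about family or marriage; the
    attributes fall out of which interest buckets they were placed in.
    """
    profile = {}
    for category in interest_categories:
        profile.update(CATEGORY_INFERENCES.get(category, {}))
    return profile


print(infer_profile(["baby, kids and maternity", "divorce support"]))
# {'parental_status': 'parent', 'marital_status': 'divorced'}
```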
[5]
FTC Says Social Media Platforms Engage in 'Vast Surveillance' of Users
A scathing new report takes aim directly at Big Tech and alleged violations of privacy. Social media platforms are engaging in "vast surveillance" of people online and failing to protect children, according to a new report from the U.S. Federal Trade Commission. And if you thought Big Tech was serious about calling for FTC Chair Lina Khan to be fired before, just wait until this report properly trickles through Silicon Valley today.

The FTC issued a warning letter back in late 2020 to nine social media and video streaming services alleging their operations were "dangerously opaque" and said their data collection techniques and algorithms were "shrouded in secrecy." The companies (Amazon, Facebook, YouTube, X, Snap, ByteDance, Discord, Reddit, and WhatsApp) were told the FTC would be investigating their practices, and Thursday's report is the result of those efforts.

The report notes that the amount of data collected by large tech companies is enormous, even using the words "simply staggering" to describe how both users and non-users alike can be tracked in myriad ways. And the data collected directly by platforms is then combined with data from third-party brokers to compile an even more detailed picture of any given person, according to the FTC. "They track what we do on and off their platforms, often combining their own information with enormous data sets purchased through the largely unregulated consumer data market. And large firms are increasingly relying on hidden pixels and similar technologies, embedded on other websites, to track our behavior down to each click," the FTC report reads. "In fact, the Companies collected so much data that in response to the Commission's questions, they often could not even identify all the data points they collected or all of the third parties they shared that data with," the report continues.

The report also warns that AI is complicating the picture even more, with companies feeding data into their artificial intelligence training without consistent approaches to monitoring or testing standards. The report lists things the FTC would like policymakers to do, emphasizing that "self-regulation is not the answer," while also laying out changes the big tech companies are supposed to make. On the policymaker side, the FTC says Congress should pass comprehensive federal privacy legislation to limit surveillance and give consumers rights over their data. The FTC also advocates for new privacy legislation that it says will "fill in the gap in privacy protections" that exist in the Children's Online Privacy Protection Act of 1998, abbreviated as COPPA. As for the companies, the FTC wants to see these platforms limit data collection and implement "concrete and enforceable data minimization and retention policies." The FTC also calls on the companies to limit the sharing of data with third parties and to delete consumer data when it's not needed anymore. The new report also calls on companies to "not collect sensitive information through privacy-invasive ad tracking technologies," which include pixel trackers, and to give better protections to teens. But, again, this report is likely to only increase the calls for Khan to be fired, which have grown louder in the business community in recent months.

"The report lays out how social media and video streaming companies harvest an enormous amount of Americans' personal data and monetize it to the tune of billions of dollars a year," Lina Khan said in a statement published online. "While lucrative for the companies, these surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking. Several firms' failure to adequately protect kids and teens online is especially troubling. The Report's findings are timely, particularly as state and federal policymakers consider legislation to protect people from abusive data practices."

Gizmodo reached out to all nine of the tech companies mentioned by name in the new report, but only Discord and Google responded immediately, while Meta, which owns Facebook and WhatsApp, declined to comment. Google gave Gizmodo a very short statement about the 129-page report, focusing only on rather narrow issues like reselling data and ad personalization for kids. Discord sent a more robust statement and believes its business is very different from the other eight companies mentioned in the report.

"The FTC report's intent and focus on consumers is an important step. However, the report lumps very different models into one bucket and paints a broad brush, which might confuse consumers and portray some platforms, like Discord, inaccurately," said Kate Sheerin, Head of US/Canada Public Policy for Discord. "The report itself says 'the business model varies little across these nine companies.' Discord's business model is very different - we are a real-time communications platform with strong user privacy controls and no feeds for endless scrolling. At the time of the study, Discord did not run a formal digital advertising service, which is a central pillar of the report. We look forward to sharing more about Discord and how we protect our users."

We'll update this post if we hear back from any of the other companies referenced in the FTC report.
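The report language quoted above about "hidden pixels" refers to a simple mechanism: a tiny image or script embedded on a third-party page whose URL carries identifying parameters, so every page load fires a request the platform can log. The sketch below builds such a beacon URL with made-up endpoint and parameter names; it is an illustration of the general technique, not any specific vendor's pixel.

```python
from urllib.parse import urlencode

# Hypothetical endpoint and parameter names, for illustration only.
PIXEL_ENDPOINT = "https://tracker.example.com/pixel.gif"


def beacon_url(visitor_cookie, page_url, event):
    """Build the URL a 1x1 'pixel' image would request when a page loads.

    The tracker's server logs the query string, tying the visitor's cookie
    to the page they viewed and the action they took, even off-platform.
    """
    params = {"uid": visitor_cookie, "url": page_url, "ev": event}
    return f"{PIXEL_ENDPOINT}?{urlencode(params)}"


print(beacon_url("abc123", "https://shop.example.com/strollers", "view"))
```

Because the pixel sits on the third-party site rather than the platform itself, the resulting logs describe browsing activity that happens entirely off-platform, which is the behavior the report characterizes as surveillance.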
[6]
FTC exposes massive surveillance of kids, teens by social media giants
A Federal Trade Commission (FTC) staff report has found that social media and video streaming companies have been engaging in widespread user surveillance, particularly of children and teens, with insufficient privacy protections, while earning billions of dollars annually by monetizing their data. The FTC's findings come from a probe that began with 6(b) orders sent in December 2020 to Amazon (owner of Twitch), Meta (Facebook), YouTube, Twitter (now X Corp.), Snapchat, TikTok owner ByteDance, Discord, Reddit, and WhatsApp (also Meta). The report is based on an investigation into the companies' data collection methods, how they track personal and demographic information, and the impact of these practices on minors from 2019 to 2020. As the FTC revealed today, the results raise significant concerns about data retention, sharing practices, and targeted advertising.

FTC Chair Lina M. Khan underscored the gravity of these findings today, saying the report reveals how these companies monetize an "enormous amount of Americans' personal data," earning billions of dollars annually. "While lucrative for the companies, these surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking," Khan said. "Several firms' failure to adequately protect kids and teens online is especially troubling."

The FTC says the platforms collected massive amounts of data, most of which was retained indefinitely. Several companies engaged in broad data sharing, often with inadequate oversight, and failed to delete all user data even after users requested it. The companies' business models also encourage the mass collection of user data to drive targeted advertising, which generates most of their revenue. As the report noted, this approach directly conflicts with user privacy and increases the risk of misuse of personal information. The report also highlighted how social media and video streaming platforms fed user and non-user data into algorithms and artificial intelligence systems, often without allowing users to opt out, raising further concerns about transparency, oversight, and potential consumer harm.

One of the report's most significant findings was the lack of protections for children and teens on these platforms, with many companies saying "that there are no children on their platforms because their services were not directed to children," seemingly in an attempt to avoid compliance with the Children's Online Privacy Protection Act (COPPA). However, the FTC's report found that teens were often treated the same as adults on these platforms, with few, if any, additional safeguards.

The FTC staff report urges policymakers to take action, calling for Congress to pass comprehensive federal privacy legislation, including limits on data collection, stricter data minimization and retention policies, and more transparent and consumer-friendly privacy policies. It also asks companies to enhance protections for teens and children on their platforms, treating COPPA as a baseline and offering additional safety and privacy protection measures. "The Report's findings are timely, particularly as state and federal policymakers consider legislation to protect people from abusive data practices," Khan added.
[7]
Social media users lack control over data used by AI, US FTC says
(Reuters) - Social media companies collect, share and process vast troves of information about their users while offering little transparency or control, including over how it is used by systems incorporating artificial intelligence, the U.S. Federal Trade Commission said in a report released on Thursday. The report analyzed how Meta Platforms, ByteDance's TikTok, Amazon's gaming platform Twitch, and others manage user data, concluding that data management and retention policies at many of the companies were "woefully inadequate." YouTube, social media platform X, Snap, Discord and Reddit were also included in the FTC report, though its findings were anonymized and did not reveal specific companies' practices. YouTube is owned by Alphabet's Google. Social media companies gather data through tracking technologies used in online advertising and buying information from data brokers, and other means, the FTC said. "While lucrative for the companies, these surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking," said FTC Chair Lina Khan. Data privacy, particularly for kids and teens, has been a hot-button issue. The U.S. House of Representatives is considering bills passed by the Senate in July aimed at addressing social media's effects on younger users. And Meta recently rolled out teen accounts that incorporate enhanced parental controls. Meanwhile, Big Tech companies have been scrambling to acquire sources of data to train their emerging artificial-intelligence technologies. The data deals are infrequently disclosed and often involve private content locked behind paywalls and login screens, with scant or no notice to the users who posted it. In addition to collecting data about how users engage with their services, most of the companies the FTC reviewed collected users' age and gender or guessed it based on other information. Some also gathered information on users' income, education and family status, the FTC said. Companies gathered data on individuals who did not use their services, and some were not able to identify all of the ways they collected and used data, the FTC said. (Reporting by Jody Godoy in New York; Editing by Matthew Lewis)
[8]
FTC Agrees Social Media and Streaming Services Surveil People
The U.S. Federal Trade Commission (FTC) says that many popular social media websites and streaming services are engaged in "vast surveillance" of users. The FTC's survey concluded that social media sites and streaming services engage in "vast surveillance of consumers in order to monetize their personal information while failing to adequately protect users online," including children and teens. The surveillance compromises people's privacy, the FTC warned, exposing them to identity theft, stalking, and other online harms.

This is hardly a shocking reveal. We've known about Meta's poor privacy practices for years, and now there's a study to back up what privacy experts and advocates have been warning us about all along. Importantly, there is no enforceable action here. The FTC has decided to release its findings so that Congress could eventually act on them. That might be easier said than done, however, as House lawmakers haven't even passed the Kids Online Safety Act.

The biggest US-owned platforms in the survey include Amazon's gaming service Twitch, Meta's Facebook and WhatsApp, and Google's YouTube. Other popular sites were named, too, including Reddit, Snapchat, and X. ByteDance's TikTok is the only non-U.S. platform named. Worryingly, the survey found many services don't restrict accounts for teens. It also discovered multiple ways that social media sites and streaming services harvest useful information from people who use them, including tracking ads, collecting engagement data, and storing direct inputs from users. External data obtained from corporate affiliates and data brokers is leveraged, too. Services even collect data from people who don't use them!

Online services also use analytical processing, artificial intelligence, and algorithms to infer additional data. Coupled with "woefully inadequate" data handling controls and retention policies, this is a recipe for a privacy disaster if a company gets hacked. Unlike in the European Union, where online services must give people options to download and delete their data, there's no such legislation in the U.S. as yet. As a result, the survey made the startling discovery that some companies don't even bother deleting user data in response to account deletion requests.

Data is the strongest currency in the digital economy. The named services harvest "an enormous amount of Americans' personal data and monetize it to the tune of billions of dollars a year," said FTC Chair Lina M. Khan. The report also mentions earlier research that linked feelings of exclusion and mental distress in adolescents to exposure to social media. Source: FTC via Engadget
[9]
Social media users lack control over data used by AI, US FTC says
Social media companies collect, share and process vast troves of information about their users while offering little transparency or control, including over how it is used by systems incorporating artificial intelligence, the US Federal Trade Commission said in a report released on Thursday. The report analysed how Meta Platforms, ByteDance's TikTok, Amazon's gaming platform Twitch, and others manage user data, concluding that data management and retention policies at many of the companies were "woefully inadequate." YouTube, social media platform X, Snap, Discord and Reddit were also included in the FTC report, though its findings were anonymised and did not reveal specific companies' practices. YouTube is owned by Alphabet's Google. Social media companies gather data through tracking technologies used in online advertising and buying information from data brokers, and other means, the FTC said. "While lucrative for the companies, these surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking," said FTC Chair Lina Khan. Data privacy, particularly for kids and teens, has been a hot-button issue. The US House of Representatives is considering bills passed by the Senate in July aimed at addressing social media's effects on younger users. And Meta recently rolled out teen accounts that incorporate enhanced parental controls. Meanwhile, Big Tech companies have been scrambling to acquire sources of data to train their emerging artificial-intelligence technologies. The data deals are infrequently disclosed and often involve private content locked behind paywalls and login screens, with scant or no notice to the users who posted it. In addition to collecting data about how users engage with their services, most of the companies the FTC reviewed collected users' age and gender or guessed it based on other information. Some also gathered information on users' income, education and family status, the FTC said. Companies gathered data on individuals who did not use their services, and some were not able to identify all of the ways they collected and used data, the FTC said.
[10]
Social media companies engaged in 'vast surveillance,' FTC finds, calling status quo 'unacceptable'
The FTC issued a scathing report detailing how social media companies and video streaming services track users. (Ting Shen / Bloomberg via Getty Images file) Popular social media platforms and video streaming services pose serious risks to user privacy, with children and teenagers most at risk, the Federal Trade Commission found in a report published Thursday. The report, which stretches more than 100 pages, details the data, advertising and recommendation-system practices of these companies, and how they rely on information about users to sell ads. Users also "lacked any meaningful control over how personal information was used for AI-fueled systems" on the companies' platforms, according to the report. "While lucrative for the companies, these surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking," FTC Chair Lina Khan said in a press release.

The report includes staff recommendations calling for federal privacy legislation, as well as more efforts from companies to prioritize privacy in their data collection and recommendation systems. It also said parents should have more control over what information is collected from children and teenagers. "Protecting users - especially children and teens - requires clear baseline protections that apply across the board," the FTC said in the report. The report comes as concerns about data collection, privacy and recommendation systems powered by artificial intelligence have become an increasingly bipartisan issue in an era of deep political divisions. Some legislation has moved forward, most notably the Kids Online Safety Act (KOSA) and the Children and Teens' Online Privacy Protection Act (COPPA) 2.0, both of which have passed the Senate and recently advanced in the House Committee on Energy and Commerce. "COPPA should be the floor, not the ceiling," the FTC said in its recommendations.

The FTC initially ordered Amazon, Facebook and WhatsApp (now under Meta), Twitter (now X), ByteDance, YouTube, Reddit, Snap and Discord to provide data about how the companies collect and use personal information from their users in December 2020. The report examined 13 platforms owned by the companies, including Twitch, Facebook, Messenger, Messenger Kids, Instagram, WhatsApp, X, TikTok, YouTube, YouTube Kids, Snapchat, Reddit and Discord. The report found that companies engaged in "vast surveillance" by collecting and retaining personal information about consumers, whether or not they are users of the companies' platforms. Some companies purchased this information from data brokers, according to the report.

Representatives for Amazon, Meta, X, ByteDance, YouTube, Reddit and Snap did not immediately respond to requests for comment. Kate Sheerin, head of U.S. and Canada public policy at Discord, said that the FTC report is an "important step," but stated that it "lumps very different models into one bucket and paints a broad brush, which might confuse consumers and portray some platforms, like Discord, inaccurately." Sheerin disputed the report's claim that user privacy concerns "stem from a business model that varies little across these nine firms."

The privacy of children and teens was not adequately protected on these social media platforms and streaming services, according to the report. The FTC wrote that companies attempted to avoid liability under COPPA, which regulates the collection of data from children under 13, by claiming "that there are no child users on their platforms because children cannot create accounts." However, children and teens are known to be on social media, and the FTC wrote that companies "should not ignore this reality." The report found that most companies treated teen accounts the same as adult accounts, which puts teens' privacy and mental health at risk.
[11]
Social media companies slammed for 'woefully inadequate' data...
Social media companies collect, share and process vast troves of information about their users while offering little transparency or control, including over how it is used by systems incorporating artificial intelligence, the Federal Trade Commission said in a report released on Thursday. The report analyzed how Meta Platforms, ByteDance's TikTok, Amazon's gaming platform Twitch, and others manage user data, concluding that data management and retention policies at many of the companies were "woefully inadequate." YouTube, social media platform X, Snap, Discord and Reddit were also included in the FTC report, though its findings were anonymized and did not reveal specific companies' practices. YouTube is owned by Alphabet's Google. Social media companies gather data through tracking technologies used in online advertising and buying information from data brokers, and other means, the FTC said. "While lucrative for the companies, these surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking," said FTC Chair Lina Khan. Data privacy, particularly for kids and teens, has been a hot-button issue. The House of Representatives is considering bills passed by the Senate in July aimed at addressing social media's effects on younger users. And Meta recently rolled out teen accounts that incorporate enhanced parental controls. Meanwhile, Big Tech companies have been scrambling to acquire sources of data to train their emerging artificial-intelligence technologies. The data deals are infrequently disclosed and often involve private content locked behind paywalls and login screens, with scant or no notice to the users who posted it. In addition to collecting data about how users engage with their services, most of the companies the FTC reviewed collected users' age and gender or guessed it based on other information. Some also gathered information on users' income, education and family status, the FTC said. Companies gathered data on individuals who did not use their services, and some were not able to identify all of the ways they collected and used data, the FTC said.
[12]
Social media users lack control over data used by AI, US FTC says
Sept 19 (Reuters) - Social media companies collect, share and process vast troves of information about their users while offering little transparency or control, including over how it is used by systems incorporating artificial intelligence, the U.S. Federal Trade Commission said in a report released on Thursday. The report analyzed how Meta Platforms (META.O), ByteDance's TikTok, Amazon's (AMZN.O) gaming platform Twitch, and others manage user data, concluding that data management and retention policies at many of the companies were "woefully inadequate." YouTube, social media platform X, Snap (SNAP.N), Discord and Reddit (RDDT.N) were also included in the FTC report, though its findings were anonymized and did not reveal specific companies' practices. YouTube is owned by Alphabet's (GOOGL.O) Google. Social media companies gather data through tracking technologies used in online advertising and buying information from data brokers, and other means, the FTC said. "While lucrative for the companies, these surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking," said FTC Chair Lina Khan. Data privacy, particularly for kids and teens, has been a hot-button issue. The U.S. House of Representatives is considering bills passed by the Senate in July aimed at addressing social media's effects on younger users. And Meta recently rolled out teen accounts that incorporate enhanced parental controls. Meanwhile, Big Tech companies have been scrambling to acquire sources of data to train their emerging artificial-intelligence technologies. The data deals are infrequently disclosed and often involve private content locked behind paywalls and login screens, with scant or no notice to the users who posted it. In addition to collecting data about how users engage with their services, most of the companies the FTC reviewed collected users' age and gender or guessed it based on other information. Some also gathered information on users' income, education and family status, the FTC said. Companies gathered data on individuals who did not use their services, and some were not able to identify all of the ways they collected and used data, the FTC said. (Reporting by Jody Godoy in New York; Editing by Matthew Lewis)
[13]
FTC Says Social Media and Streaming Companies Have Engaged in Vast Surveillance
A report from the Federal Trade Commission (FTC) has found that major social media and video streaming companies have engaged in 'vast surveillance,' including of children. The FTC published a report today from a study of social media sites spanning the past four years. According to the findings, many social media sites and streaming services have been involved in "vast surveillance" of customers to profit from personal information. The report names Amazon, Facebook, YouTube, Twitter/X, Snapchat, ByteDance (the company that owns TikTok), and more.

According to the report, spotted by Engadget, these companies "indefinitely retain troves" of the data provided by users, and are found to have "woefully inadequate" security measures in place. Additionally, the FTC staff reported that "some companies did not delete all user data in response to deletion requests." Probably the most troubling news from this report is that the social media sites and streamers listed didn't "adequately protect children and teens on their sites." While some companies suggested that there are no children on their sites, the data in the report shows that most social media platforms "treated teens the same as adult users," often not imposing account restrictions. This is a similar conclusion to a Wall Street Journal report from earlier this year that found Instagram's content algorithms are serving sexual content to kids and teenagers. The report alleges that social media companies may claim children aren't using their services to avoid liability under existing child privacy protection regulations.

The FTC says the core of the issue here is that the business model of most of these platforms hinges on the "mass collection" of user data to implement revenue-generating targeted advertising. The report describes social media and video streaming companies as having constructed "the infrastructure for mass commercial surveillance," which the FTC says creates "serious costs to our privacy." "The report lays out how social media and video streaming companies harvest an enormous amount of Americans' personal data and monetize it to the tune of billions of dollars a year," says FTC Chair Lina M. Khan. "While lucrative for the companies, these surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking. Several firms' failure to adequately protect kids and teens online is especially troubling. The Report's findings are timely, particularly as state and federal policymakers consider legislation to protect people from abusive data practices."

The study concludes that self-regulation by these companies "has been a failure." The FTC report is pessimistic that social media companies will adequately protect their users so long as strict laws with regulatory teeth are not in effect. The commission recommends that Congress pass privacy legislation and limit social media platforms' ability to share information with third-party data collectors. Additionally, the report urges social media platforms to actually delete information when requests are made and to limit the use of ad tracking technology. This is a troubling yet unsurprising conclusion that many social media users are likely already aware of. As AI systems are further implemented into daily online platforms, the protection of users' personal data, especially that of children, is vital. The lengthy, detailed report is available in full on the FTC's website.
[14]
Social media users lack control over data used by AI, US FTC says
US FTC report analysed how Meta platforms, ByteDance's TikTok, Amazon's Twitch, and others manage user data Social media companies collect, share and process vast troves of information about their users while offering little transparency or control, including over how it is used by systems incorporating artificial intelligence, the US Federal Trade Commission (FTC) said in a report released on Thursday. The report analysed how Meta platforms, ByteDance's TikTok, Amazon's gaming platform Twitch, and others manage user data, concluding that data management and retention policies at many of the companies were "woefully inadequate." YouTube, social media platform X, Snap, Discord and Reddit were also included in the FTC report, though its findings were anonymised and did not reveal specific companies' practices. YouTube is owned by Alphabet's Google. Social media companies gather data through tracking technologies used in online advertising and buying information from data brokers, and other means, the FTC said. "While lucrative for the companies, these surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking," said FTC Chair Lina Khan. Data privacy, particularly for kids and teens, has been a hot-button issue. The US House of Representatives is considering bills passed by the Senate in July aimed at addressing social media's effects on younger users. And Meta recently rolled out teen accounts that incorporate enhanced parental controls. Meanwhile, Big Tech companies have been scrambling to acquire sources of data to train their emerging artificial-intelligence technologies. The data deals are infrequently disclosed and often involve private content locked behind paywalls and login screens, with scant or no notice to the users who posted it. In addition to collecting data about how users engage with their services, most of the companies the FTC reviewed collected users' age and gender or guessed it based on other information. Some also gathered information on users' income, education and family status, the FTC said. Companies gathered data on individuals who did not use their services, and some were not able to identify all of the ways they collected and used data, the FTC said.
Specific findings:
- Social media users lack control over data usage
- Social media companies gather data through tracking technologies used in online advertising and buying information from data brokers
- Most of the companies collected users' age and gender or guessed it based on other information
- Some also gathered information on users' income, education and family status
- Companies gathered data on individuals who did not use their services
- Some were not able to identify all of the ways they collected and used data
[15]
Sweeping FTC study finds that social media sites engage in 'vast surveillance' of their users
The FTC has released the findings of a sweeping study of social media sites that has been going on for four years. The organization said that many social media sites and streaming services engage in "vast surveillance of consumers in order to monetize their personal information." This mass surveillance impacts adult users, but also children and teens. This isn't exactly surprising. After all, the old saying goes "if you're not paying for the product, you are the product." Still, the study suggests a level of surveillance that could shock even the most cynical among us.

According to the FTC, these entities collect and "indefinitely retain troves of data." The companies also engage in "broad data sharing" with "woefully inadequate" security measures. The report also found that some companies didn't delete all user data in response to deletion requests. That's not a good look. Additionally, some companies were found to be using privacy-invasive technologies, such as tracking pixels, to "facilitate advertising to users based on preferences and interests."

But wait, there's more. The report found that users (and even non-users) had little or no way to opt out of how their data was used by automated systems, like algorithms, data analytics and AI. The FTC found that these companies employed "different, inconsistent and inadequate approaches to monitoring and testing the use" of these automated systems. Finally, the report found that "social media and video streaming services didn't adequately protect children and teens on their sites." The study goes on to suggest that social media, and digital technology as a whole, contributes to "negative mental health impacts on young users." This is nothing new, though some social media companies are putting tools in place to protect kids. Instagram, for instance, recently rolled out teen accounts that include parental controls.

The FTC says that all of these issues boil down to the profit models of the big social media and streaming companies. These business models mandate the "mass collection of user data to monetize, especially through targeted advertising." This is in "tension" with privacy concerns as, well, privacy doesn't make money. The study concludes that "self-regulation has been a failure." To that end, the FTC has issued several recommendations to help solve these problems. It wants Congress to pass comprehensive privacy legislation to limit surveillance and to offer "baseline protections." It also wants social media and streaming companies to limit data collection and data sharing with third parties. The FTC also recommends that these companies actually delete consumer data when it's no longer needed or upon request and stop using invasive ad tracking technologies like pixels. It also wants these entities to address the overall lack of transparency regarding their methods. As for kids and teens, the FTC says these organizations should "recognize teens are not adults and provide them greater privacy protections." Finally, it urges Congress to pass federal privacy legislation for teens over the age of 13.

Again, none of this information is new, but it's pretty damning to see it all laid out this way. You can read the full report on the FTC's website. The services involved in the study include X, TikTok, Reddit, Discord, Twitch, YouTube, Instagram and several others.
[16]
FTC says social networks' data privacy, safety policies are 'woefully inadequate'
Top social media and video streaming companies are facing new scrutiny from the Federal Trade Commission (FTC), which released a new report Thursday accusing the platforms of violating users' privacy on a vast scale and failing to provide safeguards for kids and teens. The 129-page report, published Thursday morning, found that several social media and video streaming platforms carried out practices in the last four years that "did not consistently" prioritize consumers' privacy. FTC Chair Lina Khan said the report determined these platforms "harvest an enormous amount of Americans' personal data and monetize it" for billions of dollars every year. "While lucrative for the companies, these surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking," she wrote in a release.

Those surveyed in the report included Meta Platforms, YouTube, X, Snapchat, Reddit, Discord, WhatsApp, Amazon -- the owner of gaming platform Twitch -- and ByteDance, the owner of TikTok. The Hill reached out to the companies for further comment. To carry out the examination, the FTC in 2020 asked the nine companies for information on how they collect and track users' personal and demographic information and whether they apply their content algorithms or data studies to this information. The companies were also asked how they determine which ads and other content are shown to users, and how their platforms might impact young users.

The companies' data management and retention practices were "woefully inadequate," the FTC said, noting the companies have collected troves of data "in ways consumers might not expect." This includes gathering data through online advertisement and buying information from data brokers, the report found. Some of the companies are increasingly using the data for their artificial intelligence systems, though users were often left in the dark when it came to how their data was involved in these products, the FTC report stated. The FTC noted each of its findings may not apply to each company, maintaining the report is instead a general summary of nearly four years of research.

The agency report separately looked at the impact of these practices on children and teens, finding they put such users at a "unique risk." FTC staff pointed to social media algorithms, which may push harmful content such as dangerous online challenges that can prompt negative health consequences for children and teens, as a particular danger for young people. "Several firms' failure to adequately protect kids and teens online is especially troubling. The Report's findings are timely, particularly as state and federal policymakers consider legislation to protect people from abusive data practices," Khan wrote Thursday.

The report comes as the privacy of users, especially children and teens, has captured the attention of various lawmakers and child safety advocates on Capitol Hill. The report came out a day after a House panel advanced the Kids Online Safety Act (KOSA), pushing forward legislation intended to boost online privacy and safety for children. KOSA would create regulations governing the kinds of features tech and social media companies can offer kids online and aims to reduce the addictive nature and mental health impact of these platforms. While KOSA received overwhelming support in the Senate and advanced through the House committee, the legislation could face challenges on the full House floor. Some Republicans are concerned the bill could give the FTC "sweeping authority" and lead to the potential censorship of conservative views, a House leadership source told The Hill this week.
[17]
Social media users lack control over data used by AI, US FTC says
Social media companies gather data through tracking technologies used in online advertising and buying information from data brokers, and other means, the FTC said. "While lucrative for the companies, these surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking," said FTC Chair Lina Khan. Data privacy, particularly for kids and teens, has been a hot-button issue. The U.S. House of Representatives is considering bills passed by the Senate in July aimed at addressing social media's effects on younger users. And Meta recently rolled out teen accounts that incorporate enhanced parental controls. Meanwhile, Big Tech companies have been scrambling to acquire sources of data to train their emerging artificial-intelligence technologies. The data deals are infrequently disclosed and often involve private content locked behind paywalls and login screens, with scant or no notice to the users who posted it. In addition to collecting data about how users engage with their services, most of the companies the FTC reviewed collected users' age and gender or guessed it based on other information. Some also gathered information on users' income, education and family status, the FTC said. Companies gathered data on individuals who did not use their services, and some were not able to identify all of the ways they collected and used data, the FTC said. (Reporting by Jody Godoy in New York; Editing by Matthew Lewis)
[18]
Social media users lack control over data used by AI, says US FTC
Social media companies collect, share and process vast troves of information about their users while offering little transparency or control, including over how it is used by systems incorporating artificial intelligence, the US Federal Trade Commission said in a report released on Thursday. The report analyzed how Meta Platforms, ByteDance's TikTok, Amazon's gaming platform Twitch, and others manage user data, concluding that data management and retention policies at many of the companies were "woefully inadequate." YouTube, social media platform X, Snap, Discord and Reddit were also included in the FTC report, though its findings were anonymized and did not reveal specific companies' practices. YouTube is owned by Alphabet's Google.
[19]
FTC report assails social networks' privacy, safety practices
Agency chief Lina Khan says companies' data practices can endanger users' privacy and freedom. The Federal Trade Commission rebuked social media and streaming companies including YouTube, Amazon and Facebook on Thursday, accusing them of failing to adequately protect users from privacy intrusions and safeguard children and teens on their sites. In a sprawling 129-page staff report, the agency summed up a years-long study into industry practices by criticizing the companies for not "consistently prioritizing" users' privacy, for broadly scooping up data to power new artificial intelligence tools and for refusing to confront potential risks to kids. FTC Chair Lina Khan, a Democrat whose aggressive oversight of the tech giants has drawn plaudits from liberals and conservatives alike, said the report shows how companies' practices "can endanger people's privacy, threaten their freedoms and expose them to a host of harms," adding that the findings on child safety were "especially troubling." In 2020, the FTC demanded that nine social networks and video streaming providers hand over information on how they collect, use and sell people's personal data, how their products are powered by algorithms and how their policies affect kids and teens. The agency was able to compel information from companies whose practices lawmakers and regulators have often criticized as being too opaque. They included Amazon, Facebook (now Meta), Google-owned YouTube, Twitter (now X), Snap, TikTok owner ByteDance, Discord, Reddit and Meta-owned WhatsApp. (Amazon founder Jeff Bezos owns The Washington Post.) FTC employees wrote that the report described "general findings" across those studied but noted that not all of them applied to every company in every instance. Still, agency staffers described numerous pervasive patterns they said exposed users to harm or left them in the dark about how their data was being used to make money for the companies. According to the report, the companies have collected troves of data on users and nonusers, often in "ways consumers might not expect," and many of the guardrails put in place to protect that information were erected only in response to global regulations. While the companies are increasingly mining that data to launch AI products, the agency found, consumers typically lacked "any meaningful control over how personal information was used" for them. The findings, staffers wrote, revealed "an inherent tension between business models that rely on the collection of user data and the protection of user privacy." The agency's Democratic leadership has spoken out before against "commercial surveillance" practices they say have come to dominate Silicon Valley. An FTC official, who briefed reporters on the condition of anonymity to discuss the findings, declined to comment on how the study might shape the agency's enforcement but said it showed that many of the issues they anticipated ran much deeper than expected. According to the report, many of the companies studied "bury their heads in the sand when it comes to children" on their sites. Many claimed that because their products were not directly targeted at children and their policies did not allow children on their sites, they knew nothing of children being present on them. "This is not credible," agency staffers wrote. 
Child safety advocates have long expressed concern that under existing federal child privacy law, known as the Children's Online Privacy Protection Act, or COPPA, companies can avoid accountability by claiming not to have knowledge that children are accessing their sites. Concerns about companies failing to protect younger users were particularly pronounced for teens, whom many platforms simply treated like "traditional adult users" and typically did not afford the same protections given to young children, the agency wrote. The FTC official declined to comment on Instagram's newly released safety tools for teens but said companies can't be relied upon to regulate themselves. The report recommended that Congress both pass comprehensive federal privacy legislation covering all consumers and extend existing guardrails for children to teens. Since the study began four years ago, the social media market has become more fractured and decentralized as upstarts such as TikTok challenge long-standing leaders and platforms such as Telegram cater to increasingly niche audiences. Asked whether the agency's analysis was still relevant, the FTC official said it was difficult to obtain information from the internet companies even with the agency's investigative authority. The official added that the practices the agency highlighted are tied to the companies' business models, which have not changed. While the study began during the Trump administration, the FTC under Khan has dialed up its enforcement against the tech sector over data privacy and child safety complaints, including by launching sprawling efforts to update privacy regulations. The study's release arrives as lawmakers at the federal and state levels push to pass expanded protections for children's privacy and safety. Dozens of states have passed laws to that effect over the past year, and a key House committee advanced a pair of bills Wednesday that would mark the most significant update to child online safety laws in decades. But those efforts face opposition from tech industry and business groups that say the measures trample on users' free speech rights, force companies to collect more data and stifle innovation. This is a developing story.
[20]
Big Tech under fire for 'harmful' harvesting of data
Says Lina Khan in latest push to rein in Meta, Google, Amazon and pals

Buried beneath the endless feeds and attention-grabbing videos of the modern internet is a network of data harvesting and sale that's far more vast than most people realize, and it desperately needs regulation. That's the conclusion the FTC made after spending nearly four years poring over internal data from nine major social media and video streaming corporations in the US. These internet behemoths are collecting vast amounts of data, both on and off their services, and the handling of such data is "woefully inadequate," particularly around data belonging to children and teenagers, the FTC said. "Social media and video streaming companies harvest an enormous amount of Americans' personal data and monetize it to the tune of billions of dollars a year," FTC chair Lina Khan said of the findings. "While lucrative for the companies, these surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms."

Twitch owner Amazon, Meta, YouTube, X, Snap, TikTok owner ByteDance, Discord, Reddit, and WhatsApp were all asked in late 2020 to provide the FTC with answers to questions regarding their data collection and use. The 129-page report [PDF] doesn't break things down by company, and the US watchdog said not all of its recommendations apply to every platform. That said, the findings are pretty serious.

First, the report concluded that many of the big players collect and indefinitely retain records on users and non-registered visitors from both their platforms and beyond with, as mentioned, subpar data protection practices. In addition, some services reported only de-identifying data when a user asked for its deletion, while others deleted only portions of the requested data. Second, many of the nine corporations relied financially on selling ad space to third parties based on user information with little way for netizens to see what's being shared and with whom. This leads to the next finding: Many of the businesses feed user data into AI models to train them, and users lack the ability to have much of that data deleted - especially if it was harvested off platform. Neither users nor non-user visitors had the ability "to review the information used by these systems or their outcomes, to correct incorrect data or determinations, or to understand how decisions were made, raising the potential of further harms when systems may be unreliable or infer sensitive information about individuals," the FTC said.

The regulator said it was also worried about how the web giants treated teenagers, whose data is no longer covered under the Children's Online Privacy Protection Rule (COPPA rule enforcement ends at age 13), but who still need to be treated as a special class. "Almost all of the companies allowed teens on their [social media and video streaming services] and placed no restrictions on their accounts, and collected personal information from teens just like they do from adults," the FTC said.

The Register has reached out to all nine big names in the report, and most haven't responded. Google, which acquired YouTube in 2006, told us it has "the strictest privacy policies in our industry," and doesn't sell personal data (though it could be argued it doesn't need to) nor use sensitive data to personalize ads or collect data from minors.
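The distinction the report draws between "de-identifying" data and actually deleting it is easier to see in code. The Python sketch below is purely illustrative, with an invented record store, field names, and functions rather than anything drawn from the report or any platform: de-identification strips or hashes the identifiers but keeps the behavioral record on file, while genuine deletion removes the record entirely.

```python
# Hypothetical sketch of "de-identify" vs. "delete" -- the store, fields, and
# functions are invented for illustration, not any platform's actual system.
import hashlib

records = {
    "user123": {
        "email": "jane@example.com",
        "location": "Austin, TX",
        "interests": ["fitness", "travel"],
    }
}

def deidentify(store: dict, user_id: str) -> None:
    """Replace direct identifiers with a one-way hash; keep the rest on file."""
    record = store.pop(user_id)
    record["email"] = None                                  # identifier removed...
    hashed = hashlib.sha256(user_id.encode()).hexdigest()
    store[hashed] = record                                  # ...but the profile survives

def delete(store: dict, user_id: str) -> None:
    """Remove the record entirely, retaining nothing about the user."""
    store.pop(user_id, None)

deidentify(records, "user123")
print(len(records))  # 1 -- the location and interests are still retained
```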
Discord said that, while it supports the intent of the study, its business model differs so much from the other orgs in the report that it doesn't think it should be lumped in with the rest. "At the time of the study, Discord did not run a formal digital advertising service, which is a central pillar of the report," Discord head of US public policy Kate Sheerin told us. "We look forward to sharing more about Discord and how we protect our users."

None of this is stuff we haven't heard from various government agencies and NGOs over the years. We know third-party companies hand lots of data over to Meta and other firms, we know Americans lack the same AI opt-outs that EU residents have and we know the amount of data those nine companies have can be a privacy concern. In order to address what is at this point a well-established and ongoing issue, the FTC made several suggestions, saying that online services should limit their data collection practices, eliminate privacy-invasive tracking technologies, add user controls, and treat teenagers differently than adults. But beyond that, the report concluded, a comprehensive federal privacy regulation is needed.

"America's hands-off approach has produced an enormous ecosystem of data extraction and targeting that takes place largely out of view to consumers," FTC bureau of consumer protection director Samuel Levine wrote in a preface to the report. "While there have been isolated instances of firms taking pro-privacy actions, those continue to be the exceptions that prove the rule." "This report makes clear that self-regulation has been a failure," Levine added.
The U.S. Federal Trade Commission has released a report highlighting the widespread surveillance and monetization of user data by major social media platforms. The study raises concerns about user privacy and the lack of control over personal information used in AI systems.
The U.S. Federal Trade Commission (FTC) has released a comprehensive report exposing the extensive surveillance and monetization of user data by major social media platforms. The study, which examined the practices of Meta (formerly Facebook), YouTube, TikTok, Twitter (now X), and other tech giants, reveals a disturbing trend in how these companies collect, use, and profit from personal information [1].
According to the FTC's findings, social media companies engage in "commercial surveillance" on a massive scale. These platforms collect vast amounts of personal data, including users' physical locations, personal messages, and browsing histories. This information is then used to create detailed profiles of individuals, which are subsequently monetized through targeted advertising and other means [2].
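To picture how such profiles feed targeted advertising, here is a minimal, entirely hypothetical Python sketch that merges on-platform activity with purchased broker data and matches the result against an ad campaign's criteria; every name, field, and rule below is invented for illustration and does not come from the report.

```python
# Hypothetical illustration of profile-based ad targeting; all data, field
# names, and the matching rule are invented for this example.
platform_activity = {
    "user123": {"pages_followed": ["runningclub"], "watch_topics": ["marathon training"]},
}
broker_data = {
    "user123": {"est_income": "75-100k", "location": "Denver, CO"},
}

def build_profile(user_id: str) -> dict:
    """Merge first-party activity with third-party broker attributes."""
    profile: dict = {}
    profile.update(platform_activity.get(user_id, {}))
    profile.update(broker_data.get(user_id, {}))
    return profile

def matches_campaign(profile: dict, campaign: dict) -> bool:
    """An ad 'targets' a user when every campaign criterion appears in the profile."""
    return all(profile.get(key) == value for key, value in campaign.items())

campaign = {"location": "Denver, CO", "est_income": "75-100k"}
print(matches_campaign(build_profile("user123"), campaign))  # True
```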
One of the most concerning aspects highlighted in the report is the lack of control users have over their data, especially when it comes to its use in artificial intelligence systems. The FTC noted that many platforms provide limited or no options for users to manage how their information is utilized in AI training and decision-making processes [3].
The extensive data collection practices not only raise privacy concerns but also have significant implications for market competition. The FTC report suggests that the vast troves of user data accumulated by these platforms create substantial barriers to entry for potential competitors, effectively entrenching the dominance of existing social media giants [4].
In light of these findings, there are growing calls for increased regulation and transparency in the social media industry. The FTC's report emphasizes the need for stronger safeguards to protect user privacy and ensure fair competition in the digital marketplace. Some experts and policymakers are advocating for more stringent data protection laws and enhanced user control over personal information [5].
While some social media companies have defended their practices as necessary for providing personalized services and maintaining free platforms, the FTC's report has sparked a broader debate about the balance between innovation and privacy. As public awareness grows and regulatory scrutiny intensifies, it remains to be seen how the social media landscape will evolve to address these pressing concerns surrounding user data and AI utilization.