4 Sources
[1]
Facebook Group admins complain of mass bans; Meta says it's fixing the problem | TechCrunch
After a wave of mass bans impacting Instagram and Facebook users alike, Meta users are now complaining that Facebook Groups are also being impacted by mass suspensions. According to individual complaints and organized efforts on sites like Reddit to share information, the bans have impacted thousands of groups both in the U.S. and abroad, and have spanned various categories.

Reached for comment, Meta spokesperson Andy Stone confirmed the company was aware of the issue and working to correct it. "We're aware of a technical error that impacted some Facebook Groups. We're fixing things now," he told TechCrunch in an emailed statement.

The reason for the mass bans is not yet known, though many suspect that faulty AI-based moderation could be to blame. Based on the information shared by impacted users, many of the suspended Facebook groups aren't the type that would regularly face moderation concerns, as they focus on fairly innocuous content -- like savings tips or deals, parenting support, groups for dog or cat owners, gaming groups, Pokémon groups, groups for mechanical keyboard enthusiasts, and more. Facebook Group admins report receiving vague violation notices related to things like "terrorism-related" content or nudity, which they claim their group hasn't posted. While some of the impacted groups are smaller in size, many are large, with tens of thousands, hundreds of thousands, or even millions of users.

Those who have organized to share tips about the problem are advising others not to appeal their group's ban, but rather wait a few days to see if the suspension is automatically reversed when the bug is fixed. Currently, Reddit's Facebook community (r/facebook) is filled with posts from group admins and users who are angry about the recent purge. Some report that all the groups they run have been removed at once. Some are incredulous about the supposed violations -- like a group for bird photos with just under a million users getting flagged for nudity. Others claim that their groups were already well-moderated against spam -- like a family-friendly Pokémon group with nearly 200,000 members, which received a violation notice saying its title referenced "dangerous organizations," or an interior design group that served millions, which received the same violation.

At least some Facebook Group admins who pay for Meta's Verified subscription, which includes priority customer support, have been able to get help. Others, however, report that their groups have been suspended or fully deleted.

It's unclear if the problem is related to the recent wave of bans impacting Meta users as individuals, but this seems to be a growing problem across social networks. In addition to Facebook and Instagram, social networks like Pinterest and Tumblr have also faced complaints about mass suspensions in recent weeks, leading users to suspect that AI-automated moderation efforts are to blame. Pinterest at least admitted to its mistake, saying the mass bans were due to an internal error, but it denied that AI was the issue. Tumblr said its issues were tied to tests of a new content filtering system, but did not clarify if that system involved AI. When asked about the recent Instagram bans, Meta declined to comment.

Users are now circulating a petition that has topped 12,380 signatures so far, asking Meta to address the problem. Others, including those whose businesses were impacted, are pursuing legal action. Meta has still not shared what's causing the issue with either individual accounts or groups.
[2]
Meta admits wrongly suspending Facebook Groups
Another admin, who runs a group on AI with 3.5 million members, posted on Reddit to say his group and his own account had been suspended for a few hours, with Meta telling him later: "Our technology made a mistake suspending your group."

It comes as Meta faces questions from thousands of people over the mass banning or suspension of accounts on Facebook and Instagram. A petition entitled "Meta wrongfully disabling accounts with no human customer support" has gathered almost 22,000 signatures on change.org at the time of writing. Meanwhile, a Reddit thread dedicated to the issue features many people sharing their stories of being banned in recent months. Some have posted about losing access to pages with significant sentimental value, while others say they have lost accounts linked to their businesses. There are even claims that users have been banned after being accused by Meta of breaching its policies on child sexual exploitation.

Users have blamed Meta's AI moderation tools, adding that it is almost impossible to speak to a person about their accounts after they have been suspended or banned. BBC News has not independently verified those claims.

In a statement, Meta said: "We take action on accounts that violate our policies, and people can appeal if they think we've made a mistake." It said it used a combination of people and technology to find and remove accounts that broke its rules, and was not aware of a spike in erroneous account suspensions.

Instagram states on its website that AI is "central to our content review process". It says AI can detect and remove content against its community standards before anyone reports it, while content is sent to human reviewers on certain occasions. Meta adds that accounts may be disabled after one severe violation, such as posting child sexual exploitation content.
[3]
Did a Facebook group randomly ban you? You're not alone.
Mass suspensions are hitting Facebook Groups, with users complaining that their groups have been deleted for unjustified reasons. Across Reddit and X, users said their groups have been banned or shut down, including a Pokémon group with 260,000 members, a "bad drivers" group with 120,000 members, and an interior design group with over three million members. According to these posts, the groups were taken down because they fell into the categories of "dangerous organizations and individuals" and "terrorism." A birding group with 927,000 members was deleted for "nudity and adult sexual content." "These are birds," wrote an admin of the group on Reddit. "Myself and my Modmins heavily moderate the group. If Facebook doesn't give me back my group, I'm done."

Meta has said it's aware of the technical error, according to a statement to TechCrunch. "We're fixing things now," said a Meta spokesperson. Users suspect AI-based moderation is behind these mass bans. The content and topics of these groups -- Pokémon, interior design, birding, and others -- are unlikely to break guidelines for things like "terrorism-related" content, nudity, or "dangerous criminal activity." Meta has not shared the reason behind the suspensions so far.

Group admins affected by this wave of bans are now taking action. Minneapolis-based creator Chris Moore said he is suing Meta through a class-action lawsuit, encouraging affected creators and businesses to join him. Meanwhile, petitions are making the rounds on Change.org, with one asking to "hold Meta accountable for wrongfully disabled accounts" garnering more than 20,200 signatures. On another petition, a user who moderated a science group with 600,000 members said his group was "shut down for 'terrorism'" even though the group had no previous violations.

Mass suspensions have been taking place on other platforms in recent weeks, including Pinterest and Tumblr. Instagram, also Meta-owned, was accused of unfair and widespread bans last week, with users suspecting faulty AI as the cause. Some banned users who pay for a Meta Verified subscription were able to contact Instagram for support, but the rest were unable to do so. Big tech companies like Google and Meta typically make it extremely difficult for users to contact them, and they lack traditional phone-based customer support. In this case, Facebook Group admins who pay for the Meta Verified subscription were also able to receive help and restore their groups.
[4]
Meta faces backlash as technical error bans thousands of Facebook groups
Meta is under fire once again after a technical error led to the mass banning of Facebook groups, leaving thousands of communities across the world suspended or deleted without clear reasons. The issue follows a recent wave of account suspensions on both Facebook and Instagram, raising growing concerns among users about the company's moderation systems.

Many affected group admins and users took to Reddit and other platforms to share their frustration. The banned groups range across various interests -- from parenting and pet owners to gaming, savings tips, Pokémon fans, and even interior design. Group admins say they received vague violation notices, with accusations of sharing "terrorism-related" or adult content which they claim they never posted, reports TechCrunch. Some groups with just a few thousand members were hit, while others had hundreds of thousands or even millions of users.

Responding to the situation, Meta spokesperson Andy Stone told TechCrunch, "We're aware of a technical error that impacted some Facebook Groups. We're fixing things now." Many users suspect that Meta's AI-based moderation system might be wrongly flagging and removing content or entire groups. Some admins who pay for Meta's Verified subscription, which offers access to priority support, were able to get help, according to the report.

Reddit's Facebook community is currently flooded with posts from confused and upset group admins and users. Meanwhile, other platforms like Pinterest and Tumblr have also seen similar mass bans recently. While Pinterest admitted it was due to an internal mistake, it denied AI was the cause. Tumblr linked the problem to a test of its content filtering system.
A technical error at Meta, which many users suspect involves its AI-based moderation, has led to widespread suspensions of Facebook Groups, affecting millions of users and raising questions about the reliability of automated content moderation.
Meta, the parent company of Facebook, is facing significant backlash after a technical error led to the mass suspension of Facebook Groups. The issue has affected thousands of communities worldwide, ranging from small groups to those with millions of members [1][2][3].
The suspensions have impacted a diverse array of groups, including those focused on savings tips, parenting support, pet owners, gaming, Pokémon, and even interior design. Many group administrators reported receiving vague violation notices, with accusations of sharing "terrorism-related" content or nudity, which they vehemently deny [1][3].

Some notable examples of affected groups include [3]:

- A Pokémon group with 260,000 members, a "bad drivers" group with 120,000 members, and an interior design group with over three million members, all reportedly removed under the "dangerous organizations and individuals" and "terrorism" categories
- A birding group with 927,000 members, deleted for "nudity and adult sexual content"
Meta spokesperson Andy Stone acknowledged the issue, stating, "We're aware of a technical error that impacted some Facebook Groups. We're fixing things now" [1]. However, the company has not provided detailed information about the cause of the problem or a timeline for resolution [2].

Users and group administrators have expressed their frustration and concerns across various platforms, including Reddit and X (formerly Twitter). Many suspect that faulty AI-based moderation is to blame for the mass bans [3]. The content and topics of the affected groups are generally innocuous, making it unlikely that they would violate guidelines related to terrorism or adult content [3].

This incident is not isolated to Facebook Groups. It follows a recent wave of account suspensions on both Facebook and Instagram, raising questions about the reliability of Meta's moderation systems [2][4]. Other social media platforms, including Pinterest and Tumblr, have also faced similar issues with mass suspensions in recent weeks [1].
In response to the mass bans, affected users and group administrators are taking various actions [1][2][3]:

- Sharing advice not to appeal a group's ban right away, but to wait a few days to see if the suspension is automatically reversed once the bug is fixed [1]
- Circulating petitions on Change.org asking Meta to address the wrongful suspensions, with one gathering more than 20,000 signatures [2][3]
- Pursuing legal action against Meta, including a class-action effort organized by Minneapolis-based creator Chris Moore [3]
Interestingly, some Facebook Group admins who subscribe to Meta's Verified service, which includes priority customer support, have been able to get help and restore their groups [1][4]. This disparity in support has further fueled frustration among non-subscribers who are unable to contact Meta for assistance [3].

As Meta works to resolve this technical error, the incident has reignited debates about the effectiveness and fairness of AI-based content moderation systems. It also highlights the challenges faced by social media platforms in balancing automated moderation with human oversight to ensure accurate and just enforcement of community guidelines.
Summarized by Navi