Curated by THEOUTPOST
On Mon, 16 Dec, 4:01 PM UTC
2 Sources
[1]
Over 700K CSAM Groups Blocked by Telegram in 2024
Messaging platform Telegram has released a new moderation page detailing the groups and channels it suspended for violating its terms of service. According to the page, Telegram suspended over 15.5 million groups and channels in 2024. "Telegram blocks tens of thousands of groups and channels daily and removes millions of pieces of content that violate its Terms of Service, including incitement to violence, sharing child abuse materials, and trading illegal goods," said the platform. The messaging service also stated that "cutting-edge AI moderation tools" had helped it enhance its moderation efforts.

The platform claimed to have blocked over 700,000 CSAM-related groups and channels during the year. Telegram operates a system that automatically checks images against a hash database of CSAM images the service has already banned in the past. It has now expanded this database with hashes from organizations like the Internet Watch Foundation. Hashing is a technique in which CSAM content is assigned a digital fingerprint and stored in a database, which can then be used to detect matching content uploaded to the platform.

In addition, Telegram receives thousands of manual reports on CSAM from users and NGOs, leading it to ban over 30,000 pieces of content from the platform between July and November this year. The platform's daily reports state that it blocked 1,908 child abuse-related groups and channels on December 15, bringing the total to 34,083 for the month.

Terrorism-related content also saw a major crackdown in the latter half of the year, with over 130,000 terrorism-related channels and groups blocked in 2024. Telegram banned 4,714 such communities in a single day on October 15. Similar to its CSAM detection mechanism, Telegram collaborates with NGOs like ETIDAL, which have helped it remove over 100 million pieces of content.
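The hash-matching workflow described above can be sketched roughly as follows. This is a minimal illustration only, not Telegram's actual system: the `fingerprint` and `HashDatabase` names are hypothetical, and real deployments use perceptual hashes (such as the IWF's hash lists or PhotoDNA) that survive resizing and re-encoding, whereas a cryptographic hash like SHA-256 only matches byte-identical files.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Assign a digital fingerprint to content (here, a SHA-256 digest)."""
    return hashlib.sha256(image_bytes).hexdigest()

class HashDatabase:
    """Stores fingerprints of previously banned content for fast lookup."""

    def __init__(self) -> None:
        self._known: set[str] = set()

    def add(self, image_bytes: bytes) -> None:
        # Record the fingerprint of content that has already been banned.
        self._known.add(fingerprint(image_bytes))

    def is_banned(self, image_bytes: bytes) -> bool:
        # Compare an upload's fingerprint against the database of known hashes.
        return fingerprint(image_bytes) in self._known

db = HashDatabase()
db.add(b"previously banned image data")
print(db.is_banned(b"previously banned image data"))  # True: exact match found
print(db.is_banned(b"new, unseen image data"))        # False: no match
```

Storing only fingerprints, rather than the content itself, is what makes this approach practical: platforms can screen uploads against a shared blocklist without ever redistributing the abusive material.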
According to the platform's daily reports, it blocked 1,001 terrorism-related bots and channels on December 15, bringing the total for December to 9,617.

Telegram's founder and CEO Pavel Durov was arrested by French authorities in August this year over an alleged lack of moderation that led to the proliferation of Child Sexual Abuse Material (CSAM) and hate speech. He also reportedly resisted cooperation with authorities. Besides France, Telegram is reportedly under investigation by Indian authorities for extortion and gambling, and by South Korean authorities over the spread of deepfake porn on the platform. This is only the latest in a long string of government actions against the company. Durov's own home country, Russia, had banned the app after the company refused to cooperate with the government over user privacy. The ban ultimately proved unsuccessful, as Telegram continued to thrive in Russia, and the country eventually rescinded it after Durov reportedly agreed to cooperate with Russian security services over terrorism and extremism present on the platform.

Durov's arrest in France also triggered a subsequent willingness to cooperate with authorities. He stated in September that the company would improve its content moderation efforts, and a line was added to the FAQ page informing users about a 'report' button they can use to flag illegal content to moderators. The company also agreed to provide a user's phone number and IP address to judicial authorities if it receives an order confirming that the user is a suspect in a criminal case. Telegram also partnered with the Internet Watch Foundation (IWF) to prevent the propagation of CSAM on its platform. The partnership grants Telegram access to the IWF's datasets and tools, including the aforementioned hash database, and will also help block AI-generated CSAM.
In fact, the statistics Telegram provided on its moderation page show a clear but gradual increase in the banning of communities in the latter half of 2024, following Durov's arrest. The difference is especially evident in the number of terrorism-related communities being banned. However, Telegram did not provide details on the number of user accounts it suspended over the year, if any, or on whether it reported incidents of CSAM or terrorism to authorities.
[2]
Telegram uses AI to block millions of illegal channels
According to the platform's moderation overview, it blocked more than 2.2m groups and channels on 23 March alone. Telegram, the messaging app heavily criticised for its hands-off approach to content moderation, said that it blocked more than 15.5m groups and channels that violated its terms of service in 2024 alone. According to the platform, the blocked content includes material that incited violence, shared child abuse material or traded in illegal goods. Content moderation on Telegram combines user reports with "proactive monitoring" powered by machine learning, the company said, an effort that was "further enhanced" earlier this year with the use of artificial intelligence moderation tools.

Moreover, Telegram said that since 2018, public images have been automatically checked against a child sexual abuse material (CSAM) database, leading to nearly 710,000 CSAM-related groups and channels on the platform being blocked this year, including more than 32,000 just this month. Meanwhile, more than 130,000 terrorist-related communities were also blocked this year. Data provided by Telegram and Etidal, the Saudi Arabian non-profit organisation combating extremist ideology that Telegram has partnered with since 2022, showed that the platform has removed nearly 130m pieces of extremist content and more than 14,500 extremist channels since February 2022. Telegram's daily transparency reports, which the platform has published since 2016, showed that it banned nearly 9,000 terrorist bots and channels this month alone. The organisation said that "calls to violence and terrorist propaganda have no place" on the platform.

However, Pavel Durov, the platform's CEO and co-founder, was arrested this August by French authorities over claims that he failed to take steps towards stopping criminal activity on the messaging app. Although out on €5m bail, the case against him is still pending and he is forbidden from leaving France.
The platform had initially pushed back against Durov's arrest, saying that its CEO has "nothing to hide". "It is absurd to claim that a platform or its owner are responsible for abuse of that platform," read a statement from the company published on the day of his arrest. Since the arrest, however, the company has updated its terms of service and privacy policy to "further deter criminals" from abusing the app. The app's privacy policy now states that a user's IP address and phone number may be shared with relevant authorities if the platform receives valid information that the user is a suspect in a case involving criminal activities. Earlier this month, Telegram joined the UK's Internet Watch Foundation, gaining access to the organisation's datasets and technology to prevent CSAM on its app. The Foundation's interim CEO, Derek Ray-Hill, called the move a "transformational first step" for Telegram.

Although Telegram may be removing millions of pieces of illegal material from its platform, a report published in October by the UN Office on Drugs and Crime found that criminal groups operate openly on sprawling Telegram forums, conducting illicit activities with little moderation. It also found that the platform has become an important tool for underground cryptocurrency exchanges, organised crime networks and online gambling rings.
Telegram intensifies its content moderation efforts using AI tools, blocking millions of illegal channels and groups in 2024, following the arrest of CEO Pavel Durov and increased scrutiny from global authorities.
In a significant shift towards stricter content moderation, Telegram has reported blocking over 15.5 million groups and channels violating its terms of service in 2024 [1][2]. This intensified effort comes in the wake of increased scrutiny from global authorities and the arrest of CEO Pavel Durov by French authorities in August 2024.
Telegram has implemented "cutting-edge AI moderation tools" to bolster its content filtering capabilities [1]. The platform combines user reports with proactive monitoring powered by machine learning, which has been further enhanced with AI this year [2]. This technological approach has enabled Telegram to block tens of thousands of groups and channels daily.
Telegram's moderation efforts have led to the blocking of over 700,000 CSAM-related groups and channels in 2024 [1][2]. The platform employs an automated system that checks images against a hash database of previously banned CSAM content. This database has been expanded with hashes from organizations like the Internet Watch Foundation (IWF), which Telegram recently partnered with [1][2].
More than 130,000 terrorism-related channels and groups were blocked in 2024 [1]. Telegram's collaboration with NGOs like ETIDAL has resulted in the removal of over 100 million pieces of extremist content since February 2022 [2].
The arrest of Pavel Durov in France over an alleged lack of moderation has been a turning point for Telegram's approach to content moderation [1]. Following the arrest, Telegram has shown increased willingness to cooperate with authorities, adding a 'report' button for flagging illegal content, agreeing to share suspects' phone numbers and IP addresses with judicial authorities, and partnering with the Internet Watch Foundation [1][2].
Despite these efforts, Telegram continues to face challenges: it remains under investigation in India and South Korea, and a UN Office on Drugs and Crime report found that criminal groups still operate openly on the platform [1][2].
The statistics provided by Telegram show a gradual increase in community bans in the latter half of 2024, particularly for terrorism-related content [1]. This trend suggests that the platform's new approach to moderation, driven by AI and increased cooperation with authorities, may be yielding results. However, the long-term effectiveness of these measures and their impact on user privacy remain to be seen.
Reference
[1]
[2]
The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved