5 Sources
[1]
Wikipedia: AI-Generated Summaries Are Hurting Our Traffic
After crunching the numbers to exclude armies of data-scraping AI bots, the Wikimedia Foundation says that between March and August this year, the number of Wikipedia page views coming from real humans declined by 8% year-on-year. The nonprofit, which supports Wikipedia, points the finger at the impact of generative AI and social media for the marked decline in how people now seek information online. For example, it highlighted that search engines are now providing answers directly to users -- often based on Wikipedia content -- and that younger generations are increasingly seeking information on social video platforms rather than the open web.

That doesn't mean Wikipedia isn't still a strong presence in the world of online information. Though users may not be visiting Wikipedia directly, the Foundation points out that it remains one of the most popular sources of training data for large language models like OpenAI's ChatGPT and Anthropic's Claude, meaning users are still getting knowledge from the site -- just in a much more indirect way. The sheer volume of these data-scraping bots has caused issues for the platform, straining its resources and increasing its hosting costs.

Wikipedia warned against the potential implications of falling viewer numbers on the site, saying it could lead to fewer volunteers filling the site with content and fewer donors to support its upkeep. It also called on AI firms and digital platforms to be more transparent about where their generated content comes from and to support their sources. "For people to trust information shared on the internet, platforms should make it clear where the information is sourced from and elevate opportunities to visit and participate in those sources," read the blog.

Wikipedia's recent problems aren't unique. Many of the world's most popular online news websites have also complained that AI-generated search engine summaries -- such as Google's AI Overviews -- are leading to fewer clicks and less revenue. Last month, an organization representing some of the world's largest publishers, Digital Content Next (DCN), found that median year-over-year referral traffic from Google Search was down 10% in May and June 2025. The nonprofit represents outlets such as The New York Times, Bloomberg, Fox News Digital, and NBC News. Some of the worst-hit publishers reported click-through declines of as much as 25%, though Google has denied having a role in the trend.

Meanwhile, there is simply much more content online than ever before for Wikipedia to compete with, now that AI-generated writing has broken into the mainstream. According to an analysis from SEO firm Graphite, the percentage of AI-generated articles on the web is now slightly above 50%, with the volume of AI-generated articles skyrocketing month by month since ChatGPT debuted in November 2022.
[2]
Wikimedia says AI bots and summaries are hurting Wikipedia's traffic
Wikimedia is sounding the alarm on the impact AI is having on reliable knowledge and information on the internet. In a blog post, Wikimedia's senior director of product, Marshall Miller, lays out the impact on page views that the foundation attributes to the rise of LLM chatbots and AI-generated summaries in search results. "We believe that these declines reflect the impact of generative AI and social media on how people seek information, especially with search engines providing answers directly to searchers, often based on Wikipedia content," said Miller. The foundation has increasingly faced scraping bots whose sophistication has made it difficult to parse human traffic from bot traffic. After improving bot detection to yield more accurate metrics, Wikipedia's data shows an 8 percent drop in page views year over year.

Miller paints a picture of an existential risk greater than that of a website's page views. He posits that if Wikipedia's traffic continues to decline, it could threaten what he calls "the only site of its scale with standards of verifiability, neutrality and transparency powering information all over the internet." He warns that fewer visits to Wikipedia would lead to fewer volunteers, less funding and ultimately less reliable content. The solution he offers is for LLMs and search results to be more intentional in giving users the opportunity to interact directly with the source for the information being presented. "For people to trust information shared on the internet, platforms should make it clear where the information is sourced from and elevate opportunities to visit and participate in those sources," Miller writes. Earlier this summer, Wikipedia floated the idea of AI-generated summaries that would appear at the top of articles. The project was paused before it began after fierce backlash from the site's volunteer editors.
[3]
AI Is Killing Wikipedia's Human Traffic
Wikipedia's human traffic is declining as more people rely on AI tools for answers. The Wikimedia Foundation, the nonprofit that runs Wikipedia, says that shifts in how people search for information online are cutting into its human traffic. In a blog post published today, Marshall Miller, the foundation's senior director of product, said Wikipedia's human visits are down about 8% over the past few months compared to the same period in 2024. The decline was revealed after the Foundation revised how it distinguishes between human and bot traffic, something it does to better understand real readership and enforce limits on how third-party bots scrape its data for commercial search and AI tools. The update came after Wikimedia noticed what looked like a spike in human traffic from Brazil, which turned out to be mostly bots.

"We believe that these declines reflect the impact of generative AI and social media on how people seek information, especially with search engines providing answers directly to searchers, often based on Wikipedia content," Miller wrote. He wrote that the drop wasn't exactly a surprise. Search engines are increasingly using AI to surface answers directly on results pages instead of linking to external sites like Wikipedia. At the same time, younger users are turning to platforms like YouTube and TikTok for information.

Unfortunately, these shifts could lead to negative ripple effects for Wikipedia. With fewer visits, Wikipedia's volunteer base, the community that writes and edits its content, could shrink, Miller warned. And with less traffic, individual donations that keep the nonprofit running could also decline. The situation is ironic, Miller noted, because almost all large language models (LLMs) rely on Wikipedia's datasets for training. Yet in doing so, they may be hurting one of their most trusted sources of reliable information.

Because of this, Wikimedia is urging LLMs, AI chatbots, search engines, and social platforms that use Wikipedia content to help drive more traffic back to the site. In order to combat the issue, the nonprofit said it's working to ensure third parties can access and reuse Wikipedia content responsibly and at scale by enforcing its policies and developing clearer attribution standards. It's also experimenting with new ways to reach younger audiences on platforms like YouTube, TikTok, Roblox, and Instagram, via videos, games, and chatbots.

Wikimedia itself isn't anti-AI. Just this month, the Foundation launched the Wikidata Embedding Project, a new resource that converted roughly 120 million open data points in Wikidata into a format that's easier for large language models to use. The goal is to give AI systems access to free, higher-quality data and improve the accuracy of their answers.
[4]
Wikipedia Says AI Is Causing a Dangerous Decline in Human Visitors
"With fewer visits to Wikipedia, fewer volunteers may grow and enrich the content, and fewer individual donors may support this work." The Wikimedia Foundation, the nonprofit organization that hosts Wikipedia, says that it's seeing a significant decline in human traffic to the online encyclopedia because more people are getting the information that's on Wikipedia via generative AI chatbots that were trained on its articles and search engines that summarize them without actually clicking through to the site. The Wikimedia Foundation said that this poses a risk to the long term sustainability of Wikipedia. "We welcome new ways for people to gain knowledge. However, AI chatbots, search engines, and social platforms that use Wikipedia content must encourage more visitors to Wikipedia, so that the free knowledge that so many people and platforms depend on can continue to flow Sustainably," the Foundation's Senior Director of Product Marshall Miller said in a blog post. "With fewer visits to Wikipedia, fewer volunteers may grow and enrich the content, and fewer individual donors may support this work." Ironically, while generative AI and search engines are causing a decline in direct traffic to Wikipedia, its data is more valuable to them than ever. Wikipedia articles are some of the most common training data for AI models, and Google and other platforms have for years mined Wikipedia articles to power its Snippets and Knowledge Panels, which siphon traffic away from Wikipedia itself. "Almost all large language models train on Wikipedia datasets, and search engines and social media platforms prioritize its information to respond to questions from their users," Miller said. That means that people are reading the knowledge created by Wikimedia volunteers all over the internet, even if they don't visit wikipedia.org -- this human-created knowledge has become even more important to the spread of reliable information online." Miller said that in May 2025 Wikipedia noticed unusually high amounts of apparently human traffic originating mostly from Brazil. He didn't go into details, but explained this caused the Foundation to update its bot detections systems. "After making this revision, we are seeing declines in human pageviews on Wikipedia over the past few months, amounting to a decrease of roughly 8% as compared to the same months in 2024," he said. "We believe that these declines reflect the impact of generative AI and social media on how people seek information, especially with search engines providing answers directly to searchers, often based on Wikipedia content." Miller told me in an email that Wikipedia has policies for third-party bots that crawl its content, such as specifying identifying information and following its robots.txt, and limits on request rate and concurrent requests. "For obvious reasons, we can't share details publicly about how exactly we block and detect bots," he said. "In the case of the adjustment we made to data over the past few months, we observed a substantial increase over the level of traffic we expected, centering on a particular region, and there wasn't a clear reason for it. When our engineers and analysts investigated the data, they discovered a new pattern of bot behavior, designed to appear human. We then adjusted our detection systems and re-applied them to the past several months of data. Because our bot detection has evolved over time, we can't make exact comparisons - but this adjustment is showing the decline in human pageviews." 
The Foundation's findings align with other research we've seen recently. In July, the Pew Research Center found that only 1 percent of Google searches resulted in users clicking on the link in the AI summary, which takes them to the page Google is summarizing. In April, the Foundation reported that it was getting hammered by AI scrapers, a problem that has also plagued libraries, archives, and museums. Wikipedia editors are also acutely aware of the risk generative AI poses to the reliability of Wikipedia articles if its use is not moderated effectively.

"These declines are not unexpected. Search engines are increasingly using generative AI to provide answers directly to searchers rather than linking to sites like ours," Miller said. "And younger generations are seeking information on social video platforms rather than the open web. This gradual shift is not unique to Wikipedia. Many other publishers and content platforms are reporting similar shifts as users spend more time on search engines, AI chatbots, and social media to find information. They are also experiencing the strain that these companies are putting on their infrastructure."

Miller said that the Foundation is "enforcing policies, developing a framework for attribution, and developing new technical capabilities" in order to ensure third parties responsibly access and reuse Wikipedia content, and continues to "strengthen" its partnerships with search engines and other large "re-users." The Foundation, he said, is also working on bringing Wikipedia content to younger audiences via YouTube, TikTok, Roblox, and Instagram.

However, Miller also called on users to "choose online behaviors that support content integrity and content creation." "When you search for information online, look for citations and click through to the original source material," he said. "Talk with the people you know about the importance of trusted, human curated knowledge, and help them understand that the content underlying generative AI was created by real people who deserve their support."
[5]
Wikipedia Is Getting Pretty Worried About AI
Over at the official blog of the Wikipedia community, Marshall Miller untangled a recent mystery. "Around May 2025, we began observing unusually high amounts of apparently human traffic," he wrote. Higher traffic would generally be good news for a volunteer-sourced platform that aspires to reach as many people as possible, but it would also be surprising: The rise of chatbots and the AI-ification of Google Search have left many big websites with fewer visitors. Maybe Wikipedia, like Reddit, is an exception? Nope! It was just bots:

This [rise] led us to investigate and update our bot detection systems. We then used the new logic to reclassify our traffic data for March-August 2025, and found that much of the unusually high traffic for the period of May and June was coming from bots that were built to evade detection ... after making this revision, we are seeing declines in human pageviews on Wikipedia over the past few months, amounting to a decrease of roughly 8% as compared to the same months in 2024.

To be clearer about what this means, these bots aren't just vaguely inauthentic users or some incidental side effect of the general spamminess of the internet. In many cases, they're bots working on behalf of AI firms, going undercover as humans to scrape Wikipedia for training or summarization. Miller got right to the point. "We welcome new ways for people to gain knowledge," he wrote. "However, LLMs, AI chatbots, search engines, and social platforms that use Wikipedia content must encourage more visitors to Wikipedia." Fewer real visits means fewer contributors and donors, and it's easy to see how such a situation could send one of the great experiments of the web into a death spiral.

Arguments like this are intuitive and easy to make, and you'll hear them beyond the ecosystem of the web: AI models ingest a lot of material, often without clear permission, and then offer it back to consumers in a form that's often directly competitive with the people or companies that provided it in the first place. Wikipedia's authority here is bolstered by how it isn't trying to make money -- it's run by a foundation, not an established commercial entity that feels threatened by a new one -- but also by its unique position. It was founded as a stand-alone reference resource before settling ambivalently into a new role: a site that people mostly just found through Google, but in greater numbers than ever. With the rise of LLMs, Wikipedia became important in a new way as a uniquely large, diverse, well-curated data set about the world; in return, AI platforms are now effectively keeping users away from Wikipedia even as they explicitly use and reference its materials.

Here's an example: Let's say you're reading this article and become curious about Wikipedia itself -- its early history, the wildly divergent opinions of its original founders, its funding, etc. Unless you've been paying attention to this stuff for decades, it may feel as if it's always been there. Surely, there's more to it than that, right? So you ask Google, perhaps as a shortcut for getting to a Wikipedia page, and Google uses AI to generate a blurb: an AI Overview that summarizes, among other things, Wikipedia. Formally, it's pretty close to an encyclopedia article. With a few formatting differences -- notice the bullet-point AI-ese -- it hits a lot of the same points as Wikipedia's article about itself. It's a bit shorter than the top section of the official article and contains far fewer details. It's fine!
But it's a summary of a summary. The next option you encounter still isn't Wikipedia's article -- that shows up further down. It's a prompt to "Dive deeper in AI Mode." If you do that, you get another summary, this time with a bit of commentary. (Also: If Wikipedia is "generally not considered a reliable source itself because it is a tertiary source that synthesizes information from other places," then what does that make a chatbot?) There are links in the form of footnotes, but as Miller's post suggests, people aren't really clicking them.

Google's treatment of Wikipedia's autobiography is about as pure an example as you'll see of AI companies' effective relationship to the web (and maybe much of the world) around them as they build strange, complicated, but often compelling products and deploy them to hundreds of millions of people. To these companies, it's a resource to be consumed, processed, and then turned into a product that attempts to render everything before it obsolete -- or at least to bury it under a heaping pile of its own output.
The Wikimedia Foundation reports an 8% drop in human traffic to Wikipedia, attributing the decline to AI-generated summaries and changing information-seeking behaviors. This trend poses potential risks to Wikipedia's sustainability and the broader landscape of online knowledge.
The Wikimedia Foundation, the nonprofit organization behind Wikipedia, has reported a significant decline in human traffic to its website. After improving its bot detection systems, the Foundation discovered an 8% year-over-year decrease in page views from real humans between March and August 2025 [1][2].
The Foundation attributes this decline to the growing influence of generative AI and changes in how people seek information online. Search engines now provide direct answers to users, often based on Wikipedia content, without requiring users to visit the site itself [3]. Additionally, younger generations are increasingly turning to social video platforms for information rather than traditional web sources.
Marshall Miller, Wikimedia's senior director of product, warns that this trend could have serious implications for Wikipedia's long-term sustainability [4]. Fewer visits to Wikipedia could lead to fewer volunteers growing and enriching its content, and fewer individual donors supporting the nonprofit's work.
Ironically, while AI-generated summaries are reducing direct traffic to Wikipedia, the encyclopedia remains one of the most popular sources of training data for large language models like OpenAI's ChatGPT and Anthropic's Claude [1]. This situation has led to increased strain on Wikipedia's resources and higher hosting costs due to the volume of data-scraping bots.

Wikipedia's challenges are not unique. Many popular online news websites have reported similar declines in click-through rates due to AI-generated search engine summaries. Digital Content Next (DCN), representing major publishers, found a 10% decrease in median year-over-year referral traffic from Google Search in May and June 2025 [1].
The Wikimedia Foundation is urging AI firms and digital platforms to be more transparent about their information sources and to support the original content creators [2]. They argue that for people to trust online information, platforms should clearly indicate their sources and provide opportunities for users to visit and engage with those sources directly.

To address these challenges, Wikipedia is taking several steps [3][5]: enforcing its policies for third-party reuse, developing clearer attribution standards, and experimenting with new ways to reach younger audiences on platforms like YouTube, TikTok, Roblox, and Instagram.

As the landscape of online information continues to evolve, the future of Wikipedia and other knowledge-sharing platforms remains uncertain, highlighting the need for a balanced approach to AI integration and content attribution in the digital age.
Summarized by Navi