15 Sources
[1]
Wikipedia says traffic is falling due to AI search summaries and social video | TechCrunch
Wikipedia is often described as the last good website on an internet increasingly filled with toxic social media and AI slop, but it seems the online encyclopedia is not completely immune to broader trends, with human pageviews falling 8% year-over-year, according to a new blog post from Marshall Miller of the Wikimedia Foundation. The foundation works to distinguish between traffic from humans and bots, and Miller writes that the decline "over the past few months" was revealed after an update to Wikipedia's bot detection systems appeared to show that "much of the unusually high traffic for the period of May and June was coming from bots that were built to evade detection." Why is traffic falling? Miller points to "the impact of generative AI and social media on how people seek information," particularly as "search engines are increasingly using generative AI to provide answers directly to searchers rather than linking to sites like ours" and as "younger generations are seeking information on social video platforms rather than the open web." (Google has disputed the claim that AI summaries reduce traffic from search.) Miller says the foundation welcomes "new ways for people to gain knowledge" and argues this doesn't make Wikipedia any less important, since knowledge sourced from the encyclopedia is still reaching people even if they don't visit the website. Wikipedia even experimented with AI summaries of its own, though it paused the effort after editors complained. But this shift does present risks, particularly if people are becoming less aware of where their information actually comes from. As Miller puts it, "With fewer visits to Wikipedia, fewer volunteers may grow and enrich the content, and fewer individual donors may support this work." (And some of those volunteers are truly remarkable, reportedly disarming a gunman at a Wikipedia editors' conference on Friday.) 
For that reason, he argues that AI, search, and social companies using content from Wikipedia "must encourage more visitors" to the website itself. And he says Wikipedia is taking steps of its own, for example by developing a new framework for attributing content from the encyclopedia. The organization also has two teams tasked with helping Wikipedia reach new readers, and it's looking for volunteers to help. Miller also encourages readers to "support content integrity and content creation" more broadly. "When you search for information online, look for citations and click through to the original source material," he writes. "Talk with the people you know about the importance of trusted, human curated knowledge, and help them understand that the content underlying generative AI was created by real people who deserve their support."
[2]
Wikipedia Says It's Losing Traffic Due to AI Summaries, Social Media Videos
Wikipedia has seen a decline in users this year due to artificial intelligence summaries in search engine results and the growing popularity of social media, according to a blog post Friday from Marshall Miller of the Wikimedia Foundation, the organization that oversees the free online encyclopedia. In the post, Miller describes an 8% drop in human pageviews over the last few months compared with the numbers Wikipedia saw in the same months in 2024. "We believe that these declines reflect the impact of generative AI and social media on how people seek information, especially with search engines providing answers directly to searchers, often based on Wikipedia content," Miller wrote. AI-generated summaries that pop up on search engines like Bing and Google often use bots called web crawlers to gather much of the information that users read at the top of the search results. Websites do their best to restrict how these bots handle their data, but web crawlers have become pretty skilled at going undetected. "Many bots that scrape websites like ours are continually getting more sophisticated and trying to appear human," Miller wrote. After reclassifying Wikipedia traffic data from earlier this year, Miller says the site "found that much of the unusually high traffic for the period of May and June was coming from bots built to evade detection." 
The Wikipedia blog post also noted that younger generations are turning to social-video platforms for their information rather than the open web and such sites as Wikipedia. There is a growing body of research on the impact of generative AI on the internet, especially concerning online publishers with business models that rely on users visiting their webpages. (Disclosure: Ziff Davis, CNET's parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.) In July, Pew Research examined browsing data from 900 US adults and found that the AI-generated summaries at the top of Google's search results affected web traffic. When the summary appeared in a search, users were less likely to click on links compared to when the search results didn't include the summaries. Google search is especially important, because Google.com is the world's most visited website -- it's how most of us find what we're looking for on the internet. "LLMs, AI chatbots, search engines and social platforms that use Wikipedia content must encourage more visitors to Wikipedia, so that the free knowledge that so many people and platforms depend on can continue to flow sustainably," Miller wrote. "With fewer visits to Wikipedia, fewer volunteers may grow and enrich the content, and fewer individual donors may support this work." Last year, CNET published an extensive report on how changes in Google's search algorithm decimated web traffic for online publishers.
[3]
Wikipedia: AI-Generated Summaries Are Hurting Our Traffic
After crunching the numbers to exclude armies of data-scraping AI bots, the Wikimedia Foundation says that between March and August this year, the number of Wikipedia page views coming from real humans declined by 8% year-on-year. The nonprofit, which supports Wikipedia, points the finger at generative AI and social media, which have markedly changed how people now seek information online. For example, it highlighted that search engines are now providing answers directly to users -- often based on Wikipedia content -- and that younger generations are increasingly seeking information on social video platforms rather than the open web. That doesn't mean Wikipedia isn't still a strong presence in the world of online information. Though users may not be visiting Wikipedia directly, the Foundation points out that it remains one of the most popular sources of training data for large language models like OpenAI's ChatGPT and Anthropic's Claude, meaning users are still getting knowledge from the site -- just in a much more indirect way. The sheer volume of these data-scraping bots has caused issues for the platform, straining its resources and increasing its hosting costs. Wikipedia warned against the potential implications of falling viewer numbers on the site, saying it could lead to fewer volunteers filling the site with content and fewer donors to support its upkeep. It also called on AI firms and digital platforms to be more transparent about where their generated answers come from and to support their sources. "For people to trust information shared on the internet, platforms should make it clear where the information is sourced from and elevate opportunities to visit and participate in those sources," read the blog. Wikipedia's recent problems aren't unique. 
Many of the world's most popular online news websites have also complained that AI-generated search engine summaries -- such as Google's AI Overviews -- are leading to fewer clicks and less revenue. Last month, an organization representing some of the world's largest publishers, Digital Content Next (DCN), found that median year-over-year referral traffic from Google Search was down 10% in May and June 2025. The nonprofit represents outlets such as The New York Times, Bloomberg, Fox News Digital, and NBC News. Some of the worst-hit publishers reported click-through declines of as much as 25%, though Google has denied having a role in the trend. Meanwhile, there is simply much more content online than ever before for Wikipedia to compete with, now that AI-generated writing has broken into the mainstream. According to an analysis from SEO firm Graphite, the percentage of AI-generated articles on the web is now slightly above 50%, with the volume of AI-generated articles skyrocketing month by month since ChatGPT debuted in November 2022.
[4]
Where does Wikipedia go in the age of AI?
On the second Sunday of every month you will find a small group of Wikipedia enthusiasts in a pub near London's Fleet Street discussing the most wildly obscure facts. Armed with flasks of coffee, laptops and the belief that knowledge should be freely shared, they form a volunteer bastion against the twin internet evils of misinformation and artificial intelligence slop. On a recent Sunday, 15 people showed up, including three women ("more than usual to be honest", murmurs one). Everyone here has their own specialist interest -- cotton mills in Lancashire, say, or the 19th-century newspaper launched by Benjamin Disraeli -- something that got them hooked on creating or correcting Wikipedia entries. It is, they say, addictive to see your work read by millions of people. Still, it can be a bit lonely, so the meetups are important. Wikipedia has always been a crowdsourced project. Created in 2001 by Jimmy Wales and Larry Sanger, the online encyclopedia is now a living relic of Gen X's version of the internet: text heavy, cookie-less, largely anonymous and advert free. Anyone can create a Wikipedia article and anyone else can change it. No matter how fierce political division and online arguments become, consensus must be reached through debate. It remains one of the 10 most popular websites. Over the past three years, however, Wikipedia has taken on a new role, acting as the feeding ground for generative AI models. Information curated by hand has been scraped, absorbed and regurgitated into chatbot summaries. Human traffic to the site is falling -- though bot traffic is up. As if that wasn't enough, Elon Musk has taken up arms against what he regards as Wikipedia's liberal bias, vowing to launch a rival called Grokipedia. You might think this would all be unsettling to people who have dedicated so much of their free time to building the site. And yet everyone I speak to seems to display a scholarly sense of serenity about its future. 
"If AI is the best way to spread accurate information around the world then that's what should be used," says Mike Peel, a softly spoken radio astronomer in a grey "Wikimedia" T-shirt. Peel, who is also one of Wikipedia's most prolific editors and sits on its foundation's board of (unpaid) trustees, thinks AI won't obliterate the site but might change the way people use it. "We have to look at it philosophically," he says. "Maybe our role is going to move from content creation to accuracy." Accuracy wasn't always Wikipedia's calling card. Open editing can result in errors and early, high-profile hoax entries attracted negative attention. The site's answer was to impose strict requirements for attribution -- raising the bar for factual reliability. "Things can get obnoxious online," says Charles Matthews, a former Cambridge university maths professor who has been a Wikipedia editor since 2003. "So we put tough content policies. You want us to believe something? Find a reference." Matthews, who has retreated to a corner to help a new editor set up an entry, says source requirements also tend to reduce AI-generated content and subdue arguments. Wikipedia pages on the most contentious subjects involve in-depth, citation-heavy debates. One pub goer who was blocked when an administrator opposed his edits says he has no hard feelings about it. He understands that the administrator (also a volunteer) was trying to guide the edits towards neutrality. ("Besides," he adds, "it was English Wikipedia, I can still edit the German one" -- there are more than 300 versions). AI companies could save themselves a lot of trouble if they followed Wikipedia's example. Earlier this month, OpenAI proudly claimed that it had succeeded in solving a difficult mathematical problem only to be informed that the answer had been scraped from an online source without attribution. ("Hoisted by their own GPTards" as Meta's chief AI scientist Yann LeCun put it). 
Instead of viewing AI as an extinction threat, Wikipedia volunteers point out how useful it can be -- they already use a machine-learning tool to spot editing vandalism. While Wikipedia can't replicate the content deal Reddit made with AI companies because its license allows free distribution, it has set up paid subscriptions for instant access to new entries. Google is already a subscriber. That should help keep the site ticking over. Besides, thanks to the volunteers, it's not that expensive to run. Reports show operating expenses of $178mn last year -- the money is largely spent on infrastructure such as servers. Maybe, says Peel, AI will even enhance Wikipedia's value. In the age of artificial content, human-made work deserves a premium.
[5]
Wikimedia says AI bots and summaries are hurting Wikipedia's traffic
Wikimedia is sounding the alarm on the impact AI is having on reliable knowledge and information on the internet. In a blog post, Wikimedia's senior director of product, Marshall Miller, lays out the impact on page views that the foundation attributes to the rise of LLM chatbots and AI-generated summaries in search results. "We believe that these declines reflect the impact of generative AI and social media on how people seek information, especially with search engines providing answers directly to searchers, often based on Wikipedia content," said Miller. The foundation has increasingly faced bot traffic whose sophistication has made it difficult to parse human traffic from bots. After improving bot detection to yield more accurate metrics, Wikipedia's data shows an 8 percent drop in page views year over year. Miller paints a picture of an existential risk greater than that of a website's page views. He posits that if Wikipedia's traffic continues to decline, it could threaten what he calls "the only site of its scale with standards of verifiability, neutrality and transparency powering information all over the internet." He warns that fewer visits to Wikipedia would lead to fewer volunteers, less funding and ultimately less reliable content. The solution he offers is for LLMs and search results to be more intentional in giving users the opportunity to interact directly with the source for the information being presented. "For people to trust information shared on the internet, platforms should make it clear where the information is sourced from and elevate opportunities to visit and participate in those sources," Miller writes. Earlier this summer, Wikipedia floated the idea of AI-generated summaries that would appear at the top of articles. The project was paused before it began after fierce backlash from the site's volunteer editors.
[6]
AI Is Killing Wikipedia’s Human Traffic
Wikipedia's human traffic is declining as more people rely on AI tools for answers. The Wikimedia Foundation, the nonprofit that runs Wikipedia, says that shifts in how people search for information online are cutting into its human traffic. In a blog post published today, Marshall Miller, the foundation's senior director of product, said Wikipedia's human visits are down about 8% over the past few months compared to the same period in 2024. The decline was revealed after the Foundation revised how it distinguishes between human and bot traffic, something it does to better understand real readership and enforce limits on how third-party bots scrape its data for commercial search and AI tools. The update came after Wikimedia noticed what looked like a spike in human traffic from Brazil, which turned out to be mostly bots. "We believe that these declines reflect the impact of generative AI and social media on how people seek information, especially with search engines providing answers directly to searchers, often based on Wikipedia content," Miller wrote. He wrote that the drop wasn't exactly a surprise. Search engines are increasingly using AI to surface answers directly on results pages instead of linking to external sites like Wikipedia. At the same time, younger users are turning to platforms like YouTube and TikTok for information. Unfortunately, these shifts could lead to negative ripple effects for Wikipedia. With fewer visits, Wikipedia's volunteer base, the community that writes and edits its content, could shrink, Miller warned. And with less traffic, individual donations that keep the nonprofit running could also decline. The situation is ironic, Miller noted, because almost all large language models (LLMs) rely on Wikipedia's datasets for training. Yet in doing so, they may be hurting one of their most trusted sources of reliable information. 
Because of this, Wikimedia is urging LLMs, AI chatbots, search engines, and social platforms that use Wikipedia content to help drive more traffic back to the site. In order to combat the issue, the nonprofit said it’s working to ensure third parties can access and reuse Wikipedia content responsibly and at scale by enforcing its policies and developing clearer attribution standards. It’s also experimenting with new ways to reach younger audiences on platforms like YouTube, TikTok, Roblox, and Instagram, via videos, games, and chatbots. Wikimedia itself isn't anti-AI. Just this month, the Foundation launched the Wikidata Embedding Project, a new resource that converted roughly 120 million open data points in Wikidata into a format that’s easier for large language models to use. The goal is to give AI systems access to free, higher-quality data and improve the accuracy of their answers.
[7]
Wikipedia Says AI Is Causing a Dangerous Decline in Human Visitors
"With fewer visits to Wikipedia, fewer volunteers may grow and enrich the content, and fewer individual donors may support this work." The Wikimedia Foundation, the nonprofit organization that hosts Wikipedia, says that it's seeing a significant decline in human traffic to the online encyclopedia because more people are getting the information that's on Wikipedia via generative AI chatbots that were trained on its articles and search engines that summarize them without actually clicking through to the site. The Wikimedia Foundation said that this poses a risk to the long term sustainability of Wikipedia. "We welcome new ways for people to gain knowledge. However, AI chatbots, search engines, and social platforms that use Wikipedia content must encourage more visitors to Wikipedia, so that the free knowledge that so many people and platforms depend on can continue to flow Sustainably," the Foundation's Senior Director of Product Marshall Miller said in a blog post. "With fewer visits to Wikipedia, fewer volunteers may grow and enrich the content, and fewer individual donors may support this work." Ironically, while generative AI and search engines are causing a decline in direct traffic to Wikipedia, its data is more valuable to them than ever. Wikipedia articles are some of the most common training data for AI models, and Google and other platforms have for years mined Wikipedia articles to power its Snippets and Knowledge Panels, which siphon traffic away from Wikipedia itself. "Almost all large language models train on Wikipedia datasets, and search engines and social media platforms prioritize its information to respond to questions from their users," Miller said. That means that people are reading the knowledge created by Wikimedia volunteers all over the internet, even if they don't visit wikipedia.org -- this human-created knowledge has become even more important to the spread of reliable information online." 
Miller said that in May 2025 Wikipedia noticed unusually high amounts of apparently human traffic originating mostly from Brazil. He didn't go into details, but explained this caused the Foundation to update its bot detection systems. "After making this revision, we are seeing declines in human pageviews on Wikipedia over the past few months, amounting to a decrease of roughly 8% as compared to the same months in 2024," he said. "We believe that these declines reflect the impact of generative AI and social media on how people seek information, especially with search engines providing answers directly to searchers, often based on Wikipedia content." Miller told me in an email that Wikipedia has policies for third-party bots that crawl its content, such as specifying identifying information and following its robots.txt, and limits on request rate and concurrent requests. "For obvious reasons, we can't share details publicly about how exactly we block and detect bots," he said. "In the case of the adjustment we made to data over the past few months, we observed a substantial increase over the level of traffic we expected, centering on a particular region, and there wasn't a clear reason for it. When our engineers and analysts investigated the data, they discovered a new pattern of bot behavior, designed to appear human. We then adjusted our detection systems and re-applied them to the past several months of data. Because our bot detection has evolved over time, we can't make exact comparisons -- but this adjustment is showing the decline in human pageviews." The Foundation's findings align with other research we've seen recently. In July, the Pew Research Center found that only 1 percent of Google searches resulted in users clicking on the link in the AI summary, which takes them to the page Google is summarizing. 
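The crawler etiquette Miller describes (bots identifying themselves, honoring robots.txt, and respecting rate limits) can be sketched with Python's standard library. This is a minimal illustration of how a well-behaved bot might check those rules, not Wikipedia's actual enforcement logic; the user-agent string and the robots.txt rules below are hypothetical.

```python
import urllib.robotparser

# Hypothetical robots.txt of the kind Wikipedia's bot policy expects
# crawlers to honor (the real rules live at the site's /robots.txt).
ROBOTS_TXT = """\
User-agent: *
Allow: /w/api.php
Disallow: /w/
Crawl-delay: 1
"""

def make_parser(robots_text: str) -> urllib.robotparser.RobotFileParser:
    """Parse robots.txt rules from a string, without a network fetch."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_text.splitlines())
    return rp

# A polite bot identifies itself with a descriptive user agent...
BOT_UA = "ExampleResearchBot/1.0 (contact@example.org)"

rp = make_parser(ROBOTS_TXT)

# ...checks every URL against the rules before fetching it...
print(rp.can_fetch(BOT_UA, "https://en.wikipedia.org/wiki/Earth"))   # allowed
print(rp.can_fetch(BOT_UA, "https://en.wikipedia.org/w/index.php"))  # blocked

# ...and sleeps for the advertised crawl delay between requests.
print(rp.crawl_delay(BOT_UA))  # seconds to wait
```

The bots Miller describes do the opposite on every count: they omit identifying information, ignore the disallow rules, and spread requests to mimic human pacing, which is why detection has to rely on traffic patterns instead.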
In April, the Foundation reported that it was getting hammered by AI scrapers, a problem that has also plagued libraries, archives, and museums. Wikipedia editors are also acutely aware of the risk generative AI poses to the reliability of Wikipedia articles if its use is not moderated effectively. "These declines are not unexpected. Search engines are increasingly using generative AI to provide answers directly to searchers rather than linking to sites like ours," Miller said. "And younger generations are seeking information on social video platforms rather than the open web. This gradual shift is not unique to Wikipedia. Many other publishers and content platforms are reporting similar shifts as users spend more time on search engines, AI chatbots, and social media to find information. They are also experiencing the strain that these companies are putting on their infrastructure." Miller said that the Foundation is "enforcing policies, developing a framework for attribution, and developing new technical capabilities" in order to ensure third parties responsibly access and reuse Wikipedia content, and continues to "strengthen" its partnerships with search engines and other large "re-users." The Foundation, he said, is also working on bringing Wikipedia content to younger audiences via YouTube, TikTok, Roblox, and Instagram. However, Miller also called on users to "choose online behaviors that support content integrity and content creation." "When you search for information online, look for citations and click through to the original source material," he said. "Talk with the people you know about the importance of trusted, human curated knowledge, and help them understand that the content underlying generative AI was created by real people who deserve their support."
[8]
Wikipedia Traffic Drops as AI Answers Eat the Free Encyclopedia - Decrypt
The Wikimedia Foundation announced this week that human traffic to Wikipedia fell roughly 8% between May and August compared to the same period last year. The decline came into focus after the foundation discovered that sophisticated bots, primarily from Brazil, had been disguising themselves as human visitors. After updating its detection systems in May, the foundation reclassified traffic data and found much of the unusually high traffic in May and June came from bots built to evade detection. The revised numbers revealed what many in publishing already knew: fewer people visit Wikipedia directly because search engines now provide answers on their own pages. "After making this revision, we are seeing declines in human pageviews on Wikipedia over the past few months, amounting to a decrease of roughly 8% as compared to the same months in 2024," Marshall Miller wrote. "We believe that these declines reflect the impact of generative AI and social media on how people seek information, especially with search engines providing answers directly to searchers, often based on Wikipedia content." AI is not just killing Wikipedia. Data from Pew Research showed median year-over-year referral traffic from Google Search to premium publishers has decreased almost every week during May and June 2025, with losses outpacing gains two-to-one. Nearly 60% of all Google searches now end in an AI summary rather than a click through to the actual source. Publishers across industries are sounding alarms and resorting to lawsuits to get some protection. 
Danielle Coffey, who leads the News/Media Alliance representing more than 2,000 outlets, said Google is using publisher content without compensation while offering no meaningful way to opt out without disappearing from search entirely. "It's parasitic, it's unsustainable, and it poses a real existential threat to many in our industry," she said. The volume of AI content online is rising fast. Research from SEO firm Graphite found that as of November 2024, almost half of new web articles were generated using AI in some form, up from just 5% before ChatGPT's launch. A post by Ask Perplexity on X claimed AI content went from around 5% in 2020 to 48% by May 2025, with projections of 90% or more by next year. The Wikimedia Foundation said fewer visits to Wikipedia could mean that fewer volunteers grow and enrich the content, and fewer individual donors support the work. The foundation is responding by enforcing policies for third-party access, developing a framework for attribution, and experimenting with ways to bring free knowledge to younger audiences on platforms like YouTube and TikTok. The foundation said Wikipedia's human knowledge is more valuable to the world than ever before, 25 years since its creation. The question is whether the platforms using that knowledge will support the ecosystem that creates it. The Wikimedia Foundation did not immediately respond to Decrypt's request for comment.
[9]
Wikipedia says it's getting fewer 'human pageviews' thanks to AI and social media but it's got a plan to deal with it
Hopefully that plan doesn't include even bigger donation banners. Whatever your feelings about it, we are heading into a somewhat different digital world thanks to AI. I'm ridiculously resistant to changing my habits, and even I am starting to consider keeping an AI chatbot open rather than Google for quick research. So really, it's little surprise that Wikipedia is apparently seeing fewer human pageviews lately. That's according to Marshall Miller of the Wikimedia Foundation, who says that over the past few months, the site has seen a "decrease of roughly 8% as compared to the same months in 2024" in human pageviews. After updating its bot detection systems, the Foundation "found that much of the unusually high traffic for the period of May and June was coming from bots that were built to evade detection." "We believe that these declines reflect the impact of generative AI and social media on how people seek information," says Miller, "especially with search engines providing answers directly to searchers, often based on Wikipedia content." The AI answers that Google spits out at the top of the SERPs, in my experience, tend to reference articles from publications -- such as yours truly -- more so than Wikipedia. And unfortunately we now know that AI content outnumbers human-written articles, although thankfully the trend in this direction seems to have plateaued. I don't have any stats to back up the notion that search engine AI answers reference a lot of articles from publications rather than Wikipedia, just my own anecdotal evidence. But I don't doubt that many chatbots get their unreferenced answers from Wikipedia, given it is the internet's biggest and most trustworthy encyclopaedia. That point is often disputed. Many a time have I happened across the common 'debate bro' retort: 'Wikipedia's not a real source.' 
And while there's something to that, it's hard to deny its usefulness and trustworthiness for getting a baseline understanding of at least the more common or popular topics, which are more widely vetted. Editing is democratic in nature, so expert updates can sometimes be overruled by a more baseline consensus, meaning you're not always getting entirely accurate or up-to-date information, but for a general overview of uncontroversial or popular topics, it's a great and wide-spanning resource. For instance, I have expertise in some areas of philosophy, and yet I still sometimes turn to Wikipedia for an initial understanding of areas of philosophy I'm less familiar with (prior to deeper SEP dives or even light IEP dives). Fewer people visiting Wikipedia might not be too much of an issue if the chatbots that people are instead presumably relying on attain their info from Wikipedia. That would just mean people are getting it from Wikipedia indirectly. Miller thinks this is the case: "[Wikipedia is] still among the most valuable datasets that these new formats of knowledge dissemination rely on. Almost all large language models (LLMs) train on Wikipedia datasets, and search engines and social media platforms prioritize its information to respond to questions from their users." But Miller identifies another issue, this being that, "with fewer visits to Wikipedia, fewer volunteers may grow and enrich the content, and fewer individual donors may support this work." Plus, there's the question of how accurately AI is taking this information from Wikipedia, and whether it picks up all the latest edits. And there's also the possibility that it's not AI but other forms of content that users are getting their information from, such as social media. That would be something Wikipedia probably can't do much about, though, given that's just humans learning and/or repeating things, which is different to AI scraping and repeating information. 
Regarding what Wikipedia might have more of a say in -- people using AI which has scraped and is relying on Wikipedia content -- The Wikimedia Foundation is taking steps to try and tackle this change. Amongst other things, the Foundation says, "To make sure third-parties responsibly access and reuse Wikipedia content at scale, we are enforcing policies, developing a framework for attribution, and developing new technical capabilities, including through Wikimedia Enterprise."
[10]
Wikipedia's human traffic drops 8% as AI takes the wheel
Search engines increasingly display AI-generated answers instead of linking to sites like Wikipedia. The Wikimedia Foundation announced an 8% year-over-year decrease in human pageviews to Wikipedia. A blog post by Marshall Miller, from the foundation, attributed the decline to the effects of generative AI and social media on information-seeking habits. The decline in human traffic became apparent after the foundation updated its bot-detection systems. Miller wrote that this update revealed "much of the unusually high traffic for the period of May and June was coming from bots that were built to evade detection." This adjustment in traffic analysis uncovered the 8% drop in human pageviews "over the past few months," correcting previously inflated metrics and providing a clearer picture of user engagement. Miller identified two primary factors for the reduction in visitors: "the impact of generative AI and social media on how people seek information." He specified that search engines are "increasingly using generative AI to provide answers directly to searchers rather than linking to sites like ours." He also noted a separate trend of "younger generations seeking information on social video platforms rather than the open web." In response to similar assertions, Google has disputed the claim that its AI-driven summaries reduce traffic from search results to external websites. Despite the traffic downturn, Miller stated the Wikimedia Foundation welcomes "new ways for people to gain knowledge," arguing this does not diminish Wikipedia's role, as its content still reaches people indirectly. Wikipedia had initiated its own experiment with AI-generated summaries but paused the project following complaints from its volunteer editors. The shift in user behavior presents specific risks. Miller stated, "With fewer visits to Wikipedia, fewer volunteers may grow and enrich the content, and fewer individual donors may support this work." 
The foundation noted the dedication of its volunteers, citing a recent incident where they reportedly disarmed a gunman at a Wikipedia editors' conference. Miller contended that AI, search, and social media companies that utilize Wikipedia's content "must encourage more visitors" to the website itself. In parallel, the Wikimedia Foundation is implementing its own strategies to address the trend. The organization is developing a new framework for the attribution of its content when used by third parties. It has also established two internal teams specifically tasked with developing methods for Wikipedia to reach new readers. The foundation continues to actively seek new volunteers to contribute to the platform. Miller also issued an appeal for readers to "support content integrity and content creation." He wrote, "When you search for information online, look for citations and click through to the original source material." He further encouraged people to "Talk with the people you know about the importance of trusted, human curated knowledge, and help them understand that the content underlying generative AI was created by real people who deserve their support."
[11]
Wikipedia Is Getting Pretty Worried About AI
Over at the official blog of the Wikipedia community, Marshall Miller untangled a recent mystery. "Around May 2025, we began observing unusually high amounts of apparently human traffic," he wrote. Higher traffic would generally be good news for a volunteer-sourced platform that aspires to reach as many people as possible, but it would also be surprising: The rise of chatbots and the AI-ification of Google Search have left many big websites with fewer visitors. Maybe Wikipedia, like Reddit, is an exception? Nope! It was just bots: This [rise] led us to investigate and update our bot detection systems. We then used the new logic to reclassify our traffic data for March-August 2025, and found that much of the unusually high traffic for the period of May and June was coming from bots that were built to evade detection ... after making this revision, we are seeing declines in human pageviews on Wikipedia over the past few months, amounting to a decrease of roughly 8% as compared to the same months in 2024. To be clearer about what this means, these bots aren't just vaguely inauthentic users or some incidental side effect of the general spamminess of the internet. In many cases, they're bots working on behalf of AI firms, going undercover as humans to scrape Wikipedia for training or summarization. Miller got right to the point. "We welcome new ways for people to gain knowledge," he wrote. "However, LLMs, AI chatbots, search engines, and social platforms that use Wikipedia content must encourage more visitors to Wikipedia." Fewer real visits means fewer contributors and donors, and it's easy to see how such a situation could send one of the great experiments of the web into a death spiral. 
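The arithmetic behind the revision is simple: once the newly detected bot traffic is reclassified, the remaining "human" count for each month is compared against the same month a year earlier. A minimal sketch of that calculation, using made-up figures (the function names and numbers below are illustrative only, not Wikimedia's actual data or pipeline):

```python
def human_pageviews(total_views, bot_views):
    """Human traffic left over after reclassifying detected bot requests."""
    return total_views - bot_views

def yoy_change(current, previous):
    """Year-over-year percentage change."""
    return (current - previous) / previous * 100

# Hypothetical monthly figures (billions of pageviews), not real Wikimedia data.
# Before the detection update, evasive bots inflated the apparent "human" count.
raw_2025, bots_2025 = 10.0, 1.8   # raw traffic looked unusually high
raw_2024, bots_2024 = 9.5, 0.6

h_2025 = human_pageviews(raw_2025, bots_2025)  # 8.2
h_2024 = human_pageviews(raw_2024, bots_2024)  # 8.9

print(round(yoy_change(h_2025, h_2024), 1))  # -7.9, i.e. roughly an 8% decline
```

The point of the sketch: raw totals can rise even while the bot-adjusted human figure falls, which is exactly the pattern the detection update exposed.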
Arguments like this are intuitive and easy to make, and you'll hear them beyond the ecosystem of the web: AI models ingest a lot of material, often without clear permission, and then offer it back to consumers in a form that's often directly competitive with the people or companies that provided it in the first place. Wikipedia's authority here is bolstered by how it isn't trying to make money -- it's run by a foundation, not an established commercial entity that feels threatened by a new one -- but also by its unique position. It was founded as a stand-alone reference resource before settling ambivalently into a new role: A site that people mostly just found through Google but in greater numbers than ever. With the rise of LLMs, Wikipedia became important in a new way as a uniquely large, diverse, well-curated data set about the world; in return, AI platforms are now effectively keeping users away from Wikipedia even as they explicitly use and reference its materials. Here's an example: Let's say you're reading this article and become curious about Wikipedia itself -- its early history, the wildly divergent opinions of its original founders, its funding, etc. Unless you've been paying attention to this stuff for decades, it may feel as if it's always been there. Surely, there's more to it than that, right? So you ask Google, perhaps as a shortcut for getting to a Wikipedia page, and Google uses AI to generate a blurb that looks like this: This is an AI Overview that summarizes, among other things, Wikipedia. Formally, it's pretty close to an encyclopedia article. With a few formatting differences -- notice the bullet-point AI-ese -- it hits a lot of the same points as Wikipedia's article about itself. It's a bit shorter than the top section of the official article and contains far fewer details. It's fine! But it's a summary of a summary. The next option you encounter still isn't Wikipedia's article -- that shows up further down. 
It's a prompt to "Dive deeper in AI Mode." If you do that, you see this: It's another summary, this time with a bit of commentary. (Also: If Wikipedia is "generally not considered a reliable source itself because it is a tertiary source that synthesizes information from other places," then what does that make a chatbot?) There are links in the form of footnotes, but as Miller's post suggests, people aren't really clicking them. Google's treatment of Wikipedia's autobiography is about as pure an example as you'll see of AI companies' effective relationship to the web (and maybe much of the world) around them as they build strange, complicated, but often compelling products and deploy them to hundreds of millions of people. To these companies, it's a resource to be consumed, processed, and then turned into a product that attempts to render everything before it obsolete -- or at least to bury it under a heaping pile of its own output.
[12]
Wikipedia Reports Traffic Decline as A.I. Changes How Users Seek Information
Wikipedia says A.I.-generated summaries and chatbots are cutting into its human readership, even as its content powers those same tools. It's hard to imagine a world without Wikipedia, the online encyclopedia that for a quarter of a century has stood as one of the enduring staples of the Internet. In recent months, however, the site's foundation has been shaken by the rise of generative A.I., with Wikipedia attributing an 8 percent decline in human visitors this year to the proliferation of A.I.-generated content. "These changes are not unexpected," said Marshall Miller, senior director of product at the Wikimedia Foundation, in a recent blog post. "Search engines are increasingly using generative A.I. to provide answers directly to searchers rather than linking to sites like ours." Wikipedia first noticed the decline after, ironically, observing an unusually high amount of seemingly human visitors this spring. The platform later discovered that the uptick was due to a surge in bots designed to evade detection. A subsequent revamp of its bot detection systems revealed that real human traffic over the past few months had fallen roughly 8 percent compared with the same period in 2024. "We believe that these declines reflect the impact of generative A.I. and social media on how people seek information," said Miller. One of the forces at play is Google's AI Overviews feature, which was introduced last year to provide A.I.-generated summaries above search results, drawing information from across the web.
According to a Pew Research Center report, the share of users likely to click on traditional search result links is halved when a Google AI Overview summary appears. Of 900 surveyed U.S. adults, only 1 percent clicked on a link within an AI Overview summary itself. Google, however, maintains that its A.I. integration has done little to reduce overall click volume to linked websites. Content from Wikipedia -- alongside YouTube and Reddit -- dominates not only traditional Google search results but the company's new A.I. summaries. Together, the three websites accounted for roughly 15 percent of A.I. content and 17 percent of standard search content, according to Pew. Wikipedia remains a prime target for data-hungry large language models (LLMs) that rely on vast troves of online information for training, often gathered through web-scraping bots. Nearly all LLMs train on Wikipedia datasets, according to Miller, who noted that the encyclopedia's information is prioritized by search engines and social media platforms alike when they generate chatbot responses. This means that even as site visits decline, Wikipedia's content continues to be consumed. Still, fewer visitors pose an existential threat to the encyclopedia's future, as the nonprofit relies on donations and volunteer editors to keep its vast trove of knowledge free and up to date. How does Wikipedia use A.I.? Wikipedia isn't necessarily anti-A.I. -- in fact, the encyclopedia has embraced A.I. to streamline tedious tasks for its editors. Its A.I. tools assist with workflows, giving volunteers more time for discussion and consensus building, while also automating translations of common topics and scaling up onboarding for new contributors. To better manage A.I.'s risks, Wikipedia is now pursuing a range of new projects. These include developing a framework for content attribution and making it easier for volunteers to edit from mobile devices, said Miller.
The foundation is also experimenting with new ways to reach younger audiences on platforms like TikTok and Roblox through mediums like videos, games and even A.I. chatbots. "Twenty-five years since its creation, Wikipedia's human knowledge is more valuable to the world than ever before," said Miller. "As we call upon everyone to support our knowledge platforms in old and new ways, we are optimistic that Wikipedia will be here, ensuring the internet provides free, accurate, human knowledge, for generations to come."
[13]
Wikipedia blames ChatGPT for falling traffic -- and claims bots are...
After threatening to replace humans in many sectors, generative AI is now targeting online platforms as well. Wikipedia is seeing a sharp decline in traffic as online users increasingly turn to ChatGPT and Google AI overviews to get their info. According to a new blog post by Marshall Miller of the Wikimedia Foundation, human page views are down 8% these past few months "as compared to the same months in 2024." This troubling phenomenon came to light after Wikipedia's bot detection systems seemed to show that "much of the unusually high traffic for the period of May and June was coming from bots that were built to evade detection." Miller believes that the trend reflects "the impact of generative AI and social media on how people seek information," noting "search engines providing answers directly to searchers, often based on Wikipedia content." Throw in the fact that "younger generations are seeking information on social video platforms rather than the open web," and it's no wonder that internet users are increasingly bypassing the Wiki middleman. To wit, an Adobe Express report conducted over the summer found that 77% of Americans who use ChatGPT treat it as a search engine while three in ten ChatGPT users trust it more than a search engine. Despite the looming threat of AI, Miller doesn't believe that the digital encyclopedia is going obsolete. "Almost all large language models (LLMs) train on Wikipedia datasets, and search engines and social media platforms prioritize its information to respond to questions from their users," he wrote. "That means that people are reading the knowledge created by Wikimedia volunteers all over the internet, even if they don't visit wikipedia.org." To help users get their info straight from the source, Wikipedia even experimented with AI summaries like Google, but put the kibosh on the movement after editors complained, Techcrunch reported.
Nonetheless, Miller expressed concern that the AI takeover would make it difficult to know where information is coming from. "With fewer visits to Wikipedia, fewer volunteers may grow and enrich the content, and fewer individual donors may support this work," he fretted. Wikipedia is not the only platform whose eyeballs have been impacted by generative AI. In a statement to the Competition and Markets Authority in July, DMG Media, owner of MailOnline, claimed that AI Overviews had caused click-through rates for their site to plummet by 89 percent. This comes amid a spike in AI slop -- low quality and misleading content that's auto-generated by artificial intelligence -- that's impacting every sector from academia to law. In May, a California judge slapped two law firms with a $31,000 fine after finding that they'd included AI slop in a legal brief sans any due diligence.
[14]
Wikipedia Sees Traffic Decline of Nearly 10 Percent as Search Engines' AI Summaries Divert Users
Wikimedia Foundation noticed the drop in human users after discovering an increase in bot traffic from Brazil. Wikipedia has reported a sharp decline in traffic because bots that scrape information from the website are generating AI summaries for the major search engines that do not require web users to click through to their sources. The Wikimedia Foundation, which operates the online encyclopedia, said in a recent blog post that human traffic has dipped by eight percent over the past few months, admitting that the decline went unnoticed until it spotted a surge in bot traffic from Brazil. "Around May 2025, we began observing unusually high amounts of apparently human traffic, mostly originating from Brazil. This led us to investigate and update our bot detection systems," the foundation said in the post, adding that upon further review it determined the traffic was not human but generated by bot programs. "We believe that these declines reflect the impact of generative AI and social media on how people seek information, especially with search engines providing answers directly to searchers, often based on Wikipedia content," the foundation says. "These declines are not unexpected. Search engines are increasingly using generative AI to provide answers directly to searchers rather than linking to sites like ours." The foundation's findings underscore a trend across the internet as users increasingly find answers directly in search results without the need to click through to the source link. Traditional search engines like Google and Bing now display AI-generated summaries prominently in their results, with clicks to news and reference websites declining substantially as a result.
The report also reveals a significant decline in Wikipedia usage in the last two years, with fewer people turning to the platform for fact-checking, dropping to 35 percent from more than 50 percent, while users seeking information on current events decreased to under 20 percent from more than 30 percent. The WMF claims that a "regular review of global trends" led them to take preemptive actions like developing new frameworks for attribution and technical capabilities. Wikipedia is also facing more direct competition of late, including from new online encyclopedias like Justapedia, the right-leaning Conservapedia, and the upcoming Grokipedia from Elon Musk. The foundation has also long faced criticism for bias and misinformation. One of Wikipedia's fiercest critics since he left the organization in 2002, Larry Sanger, has claimed that he has observed a significant increase in bias since 2020, describing a "syndrome of perspectives" favoring globalist, academic, secular, and progressive views. Mr. Sanger recently published what he has dubbed the "Nine Theses on Wikipedia," naming them after the 95 theses written by theologian Martin Luther in 1517 that are credited with launching the Protestant Reformation. His list of nine theses focuses on three areas of reform -- improving editorial standards by reviving the site's original neutrality policy, increasing community governance, and addressing concerns about how Wikipedia engages with the public. "The 62 most powerful editors in Wikipedia are anonymous," Mr. Sanger recently told the New York Sun. "They need to allow the public to rate articles right on Wikipedia, not just for their own benefit, but also for the benefit of the rest of the public."
[15]
AI chatbots are replacing Wikipedia for quick answers, human visits drops by 8 pct
Foundation warns of impact on volunteers, funding, and content credibility. Wikipedia, the free online encyclopedia, has seen a notable dip in human engagement due to AI chatbots. According to the Wikimedia Foundation, pageviews from genuine users have dropped by 8% compared to 2024, as AI tools, search engines, and social media increasingly serve up information without redirecting users to the site. In a blog post published on October 17, the Foundation explained that changing internet habits and evolving bot traffic patterns are reshaping how people access information. While the platform continues to serve as a backbone for large language models and AI systems that draw from its vast database, this shift raises questions about the survival of the platform, as fewer users actually visit the site. Notably, AI chatbots and modern search engines deliver direct, summarized answers often based on Wikipedia's own content, eliminating the need for users to visit the website. Additionally, younger audiences are turning to short-form video platforms like TikTok and YouTube for quick insights. As genuine human traffic drops, the Wikimedia Foundation warned that it could impact volunteer recruitment and donor funding, both of which are important to maintaining the encyclopedia's values of neutrality, transparency, and verifiability. The Foundation urged AI firms, search platforms, and social media companies to credit and link back to Wikipedia, ensuring visibility for its community-driven knowledge base. The website has also urged volunteers to participate in testing new tools, reviewing experiments, and helping Wikipedia serve readers while maintaining the platform's human-centred, free knowledge ethos.
Wikipedia reports an 8% drop in human pageviews, attributing the decline to AI-generated search summaries and shifts in information-seeking behavior. The Wikimedia Foundation expresses concerns about the potential impact on content creation and funding.
Wikipedia, often hailed as the last bastion of reliable information on the internet, is facing a significant challenge in the form of declining human traffic. According to Marshall Miller of the Wikimedia Foundation, the online encyclopedia has experienced an 8% drop in human pageviews year-over-year [1][2].
The decline in traffic is attributed to two main factors:
- AI-generated summaries: Search engines are increasingly using generative AI to provide direct answers to users' queries, often based on Wikipedia content, without requiring users to visit the site [1][3].
- Shift in information-seeking behavior: Younger generations are turning to social video platforms for information rather than traditional websites [2].
While Wikipedia remains a crucial source of information for AI training data and search engine summaries, the decline in direct traffic poses several risks:
- Reduced volunteer engagement: Fewer visitors may lead to a decrease in volunteers contributing to and enriching Wikipedia's content [1].
- Funding challenges: A decline in visitors could result in fewer individual donors supporting Wikipedia's work [3].
- Content quality concerns: With less direct engagement, there's a risk of diminished content accuracy and reliability [4].
Despite these challenges, Wikipedia and its community are adapting to the changing landscape:
- Calls for attribution: The Wikimedia Foundation is urging AI companies and platforms to provide clear attribution and encourage users to visit the original sources [5].
- Evolving role: Some Wikipedia editors see the site's future role shifting from content creation to ensuring accuracy in AI-generated information [4].
- Embracing AI tools: Wikipedia is exploring the use of AI for tasks such as detecting vandalism while maintaining its commitment to human-curated knowledge [4].

As the internet landscape continues to evolve, Wikipedia faces the challenge of maintaining its relevance and ensuring the sustainability of its crowdsourced model of knowledge creation and curation in an AI-driven world.
Summarized by Navi