6 Sources
[1]
At 25, Wikipedia embodies what the internet could have been - but can it survive AI?
Wikipedia is the world's most popular online encyclopedia and the most successful open data project of all time. AI, however, brings new challenges and long-term threats.

Today, when people ask, 'Where was Madonna born?' 'Who won the 1999 Super Bowl?' or 'Who's the current world classical chess champion?' (Bay City, Michigan; the Denver Broncos; and Gukesh Dommaraju), they turn to Wikipedia. Or, to be more exact, if they Google the answer, Wikipedia is the top source, but Google's AI Overview is what they'll see on the search results page. It's Wikipedia writers, however, who did the research for the answers.

Twenty-five years ago, it was another story. Before 15 January 2001, if you did a Google search, your answers to those earlier questions would have come from a Madonna fan site, ESPN, and the Internet Chess Club. On that day, a small nonprofit launched what seemed like a utopian idea: an encyclopedia that anyone could edit. Today, it's one of the top 10 websites in the world, cited in court rulings, academic papers, and journalism. And yet, volunteers and donations still run it, without a single ad in sight.

Wikipedia started as a side project of Nupedia, Jimmy Wales' and Larry Sanger's first attempt to create a peer-reviewed encyclopedia. Nupedia launched in March 2000 as a free online encyclopedia written and peer-reviewed by subject-matter experts through a seven-step approval process. It failed badly: in its first months of existence, it produced a mere two dozen articles. Wikipedia, which allows anyone to write and edit articles, soared in popularity after its 2001 launch. It quickly became the most successful open collaboration experiment ever. Today, Wikipedia boasts over six million English-language articles and content in over 320 languages.

Early skeptics doubted it would last. How could a website that anyone could change produce reliable facts? In 2005, Nature famously compared Wikipedia's accuracy to Encyclopedia Britannica's and found surprisingly little difference. Two and a half decades later, Wikipedia remains fallible, but self-correcting. Errors get fixed faster than they'd be noticed in print. Its openness, paradoxically, is also its safeguard. As Wales said at the time, Wikipedia was both self-policing and self-cleaning.

That's not to say Wikipedia is perfect; far from it. Wikipedia has spent 25 years walking a tightrope between openness and abuse, and most of its growing pains come from that contradiction. The biggest challenges have been maintaining reliability at scale, keeping a small volunteer community from burning out, and defending the project against political and legal attacks.

There are areas, Wikipedia editors admit, where the project has been found lacking. The site's open-edit model has encouraged systemic biases (gender, racial, national, and ideological), especially since most editors are male and concentrated in North America and Europe. New and minority editors often report a hostile climate of harassment, cliques, and "ownership" of articles that drives people away and feeds long-running "editor retention" efforts.
Even setting those issues aside, the platform isn't immune to abuse. Edit wars, coordinated disinformation campaigns, and cultural bias persist. The Wikimedia Foundation has had to double down on anti-manipulation policies, while editors wage daily battles to keep political and corporate spin in check. Still, the community itself, with over 250,000 active editors, remains its greatest defense.

Despite their best efforts, corporations, governments, and PR firms have repeatedly attempted to launder reputations through undisclosed paid editing, sockpuppet networks, and conflict-of-interest campaigns. In 2012, for example, a pair of senior Wikipedia editors was found to be writing and editing articles at the request of their clients for a fee. Since then, numerous other instances of Wikipedia editors collaborating to profit from writing and editing biased articles have emerged. It's an ongoing problem.

The site also suffers from a long history of "edit wars" and politicized editing around topics like biographies, climate, and geopolitics, which can turn article pages into battlegrounds instead of neutral references. Indeed, controversial pages, such as those on the Arab-Israeli conflict, caste topics in India, and Donald Trump, can't be touched by most editors.

That said, Wikipedia also helped to birth some key open technologies. The MediaWiki engine that powers Wikipedia also runs countless internal wikis, from NASA to Mozilla. Its open API and structured data project, Wikidata, quietly underpins parts of modern AI and search indexing. When you ask your phone a factual question, odds are the answer traces back, in part, to Wikipedia's structured metadata (a brief sketch of such a lookup appears at the end of this piece).

Unlike social media platforms fueled by outrage and engagement metrics, Wikipedia thrives on consensus and transparency. Its talk pages are messy democratic forums -- more C-SPAN than TikTok. And that's precisely why Wikipedia has lasted. It resists the dopamine economy.

Wikipedia's future is another matter. Donations fund its servers and staff, but its editor base is aging. Recruiting new contributors, especially from outside the English-speaking world, remains a challenge. The Wikimedia Foundation, now led by Maryana Iskander, is experimenting with partnerships, mobile tools, and even AI-assisted editing while insisting that human judgment stays central.

However, AI is also hurting Wikipedia. After cleaning up AI bot noise in 2025, Wikimedia reported that genuine human page views had declined by about 8% year-on-year in recent months. Wikipedia's own traffic analysis indicated that nearly all of the multi-year decline was attributable to people no longer clicking on Wikipedia search links. According to SimilarWeb, ChatGPT is now the world's fifth-favorite website, while Wikipedia has dropped to ninth.

Could Wikipedia die? That question might sound alarmist, but it was only a few years ago that Stack Overflow was everyone's favorite programming website. Stack Overflow's traffic began to fall when ChatGPT arrived, and as AI programming has become commonplace, its decline has accelerated. In December 2025, only 3,862 questions were posted on Stack Overflow, a 78% drop from the previous December.

At 25, Wikipedia embodies what the internet could have been: user-powered, open, and accountable.
Wikipedia may have its warts, but it's transparent about its imperfections. That's more than can be said for many tech giants. Wikipedia's quiet resilience proves that trust earned collectively can scale. The encyclopedia anyone can edit has outlived the dot-com boom, Web 2.0, and the first wave of AI hype. Whether it can survive the AI-enabled web of the next 25 years remains an open question, one that, fittingly, anyone can edit.
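To make the structured-data point above concrete, here is a minimal, hypothetical sketch of the kind of lookup that sits behind an assistant's factual answer: a query against Wikidata's public SPARQL endpoint for a person's place of birth. The property IDs (P31 for "instance of", P19 for "place of birth") are standard Wikidata vocabulary; the script name, label-matching approach, and User-Agent string are illustrative assumptions, and nothing here represents Wikipedia's, Google's, or any assistant's actual pipeline.

```python
import requests

# Wikidata's public SPARQL endpoint, part of the open API layer described above.
SPARQL_ENDPOINT = "https://query.wikidata.org/sparql"

# Find any human labeled "Madonna" in English and return their place of birth.
# P31 = "instance of", Q5 = "human", P19 = "place of birth".
QUERY = """
SELECT ?personLabel ?birthplaceLabel WHERE {
  ?person rdfs:label "Madonna"@en ;
          wdt:P31 wd:Q5 ;
          wdt:P19 ?birthplace .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
"""

response = requests.get(
    SPARQL_ENDPOINT,
    params={"query": QUERY, "format": "json"},
    # Wikimedia asks automated clients to identify themselves.
    headers={"User-Agent": "wikidata-birthplace-demo/0.1 (illustrative example)"},
    timeout=30,
)
response.raise_for_status()

# Print each matching person and their place of birth from the JSON result bindings.
for row in response.json()["results"]["bindings"]:
    print(f'{row["personLabel"]["value"]} was born in {row["birthplaceLabel"]["value"]}')
```

Run against the live endpoint, a query like this should return a row along the lines of "Madonna was born in Bay City", the same structured fact behind the article's opening question. Matching by label is deliberately simplistic; real systems resolve entities by identifier rather than by name.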
[2]
Wikipedia at 25: can its original ideals survive in the age of AI?
Around the turn of the century, the internet underwent a transformation dubbed "web 2.0". The world wide web of the 1990s had largely been read-only: static pages, hand-built homepages, portal sites with content from a few publishers. Then came the dotcom crash of 2000 to 2001, when many heavily financed, lightly useful internet businesses collapsed. In the aftermath, surviving companies and new entrants leaned into a different logic that the author-publisher Tim O'Reilly later described as "harnessing collective intelligence": platforms rather than pages, participation rather than passive consumption.

And on January 15 2001, a website was born that seemed to encapsulate this new era. The first entry on its homepage read simply: "This is the new WikiPedia!"

Wikipedia wasn't originally conceived as a not-for-profit website. In its early phase, it was hosted and supported through co-founder Jimmy Wales's for-profit search company, Bomis. But two years on, the Wikimedia Foundation was created as a dedicated non-profit to steward Wikipedia and its sibling projects. Wikipedia embodied the web 2.0 dream of a non-hierarchical, user-led internet built on participation and sharing.

One foundational idea - volunteer human editors reviewing and authenticating content incrementally after publication - was highlighted in a 2007 Los Angeles Times report about Wales himself trying to write an entry for a butcher shop in Gugulethu, South Africa. His additions were reverted or blocked by other editors who disagreed about the significance of a shop they had never heard of. The entry finally appeared with a clause that neatly encapsulated the platform's self-governance model: "A Wikipedia article on the shop was created by the encyclopedia's co-founder Jimmy Wales, which led to a debate on the crowdsourced project's inclusion criteria."

As a historical sociologist of artificial intelligence and the internet, I find Wikipedia revealing not because it is flawless, but because it shows its workings (and flaws). Behind almost every entry sits a largely uncredited layer of human judgement: editors weighing sources, disputing framing, clarifying ambiguous claims and enforcing standards such as verifiability and neutrality. Often, the most instructive way to read Wikipedia is to read its revision history. Scholarship has even used this edit history as a method - for example, when studying scientific discrepancies in the development of Crispr gene-editing technology, or the unfolding history of the 2011 Egyptian revolution.

The scale of human labour that goes into Wikipedia is easy to take for granted, given its disarming simplicity of presentation. Statista estimates 4.4 billion people accessed the site in 2024 - over half the world and two-thirds of internet users. More than 125 million people have edited at least one entry. Wikipedia carries no advertising and does not trade in users' data - central to its claim of editorial independence. But users regularly see fundraising banners and appeals, and the Wikimedia Foundation has built paid services to manage high-volume reuse of its content - particularly by bots scraping it for AI training. The foundation's total assets now stand at more than US$310 million (£230 million).

'Wokepedia' v Grokipedia

At 25, Wikipedia can still look like a rare triumph for the original web 2.0 ideals - at least in contrast to most of today's major open platforms, which have turned participation into surveillance advertising.
Some universities, including my own, have used the website's anniversary to soothe fears about student use of generative AI. We panicked about students relying on Wikipedia, then adapted and carried on. The same argument now suggests we should not over-worry about students relying on generative AI to do their work.

This comparison is sharpened by the rapid growth of Grokipedia, Elon Musk's AI-powered rival to Wikipedia (or "Wokepedia", as Musk dismissively refers to the original). While Grokipedia uses AI to generate most of its entries, some are near-identical to Wikipedia's (all of which are available for republication under Creative Commons licensing). Grokipedia entries cannot be directly edited, but registered users can suggest corrections for the AI to consider. Despite only launching on October 27 2025, this AI encyclopedia already has more than 5.6 million entries, compared with Wikipedia's total of over 7.1 million.

So, if Grokipedia overtakes its much older rival in scale at least, which now seems plausible, should we see this as the end of the web 2.0 dream, or simply another moment of adaptation?

Credibility tested

AI and the human-created internet have always been intertwined. Voluntary sharing is exploited for AI training with contested consent and thin attribution. Models trained on human writing generate new text that pollutes the web as "AI slop". Wikipedia has already collided with this. Editors report AI-written additions and plausible-looking citations that fail on checking. They have responded with measures such as WikiProject AI Cleanup, which offers guidance on how to detect generic AI phrasing and other false information.

But Wales does not want a full ban on AI within Wikipedia's domain. Rather, he has expressed hope for human-machine synergy, highlighting AI's potential to bring more non-native English contributors to the site. Wikipedia also acknowledges it has a serious gender imbalance, both in terms of entries and editors.

Wikipedia's own credibility has regularly been tested over its 25-year history. High-profile examples include the John Seigenthaler Sr biography hoax, when an unregistered editor falsely wrote about the journalist's supposed ties to the Kennedy assassinations, and the Essjay controversy, in which a prominent editor was found to have fabricated their educational credentials. There have also been recurring controversies over paid or state-linked conflicts of interest, including the 2012 Wiki-PR case, when volunteers traced coordinated editing patterns to the firm and banned hundreds of accounts.

These vulnerabilities have seen claims of political bias gain traction. Musk has repeatedly framed Wikipedia and mainstream outlets as ideologically slanted, and promoted Grokipedia as a "massive improvement" that needed to "purge out the propaganda".

As Wikipedia reaches its 25th anniversary, perhaps we are witnessing a new "tragedy of the commons", where volunteered knowledge becomes raw material for systems that may themselves produce unreliable material at scale. Ursula K. Le Guin's novel The Dispossessed (1974) dramatises the dilemma Wikipedia faces: an anarchist commons survives only through constant maintenance, while facing the pull of a wealthier capitalist neighbour. According to the critical theorist McKenzie Wark: "It is not knowledge which is power, but secrecy." AI often runs on closed, proprietary models that scrape whatever is available. Wikipedia's counter-model is public curation with legible histories and accountability.
But if Google's AI summaries and rankings start privileging Grokipedia, habits could change fast. This would repeat the "Californian ideology" that the journalist and author Wendy M. Grossman warned about in the year Wikipedia launched - namely, internet openness becoming fuel for Silicon Valley market power.

Wikipedia and generative AI both alter how knowledge circulates. One is a human publishing system with rules and revision histories. The other is a text production system that mimics knowledge without reliably grounding it. The choice, for the moment at least, is all of ours.
[3]
Wikipedia unveils new AI licensing deals as it marks 25th birthday
LONDON (AP) -- Wikipedia unveiled new business deals with a slew of artificial intelligence companies on Thursday as it marked its 25th anniversary. The online crowdsourced encyclopedia revealed that it has signed licensing deals with AI companies including Amazon, Meta Platforms, Perplexity, Microsoft and France's Mistral AI.

Wikipedia is one of the last bastions of the early internet, but that original vision of a free online space has been clouded by the dominance of Big Tech platforms and the rise of generative AI chatbots trained on content scraped from the web. Aggressive data collection methods by AI developers, including from Wikipedia's vast repository of free knowledge, have raised questions about who ultimately pays for the artificial intelligence boom.

The nonprofit that runs the site signed Google as one of its first customers in 2022 and announced other agreements last year with smaller AI players like search engine Ecosia. The new deals will help one of the world's most popular websites monetize heavy traffic from AI companies. They're paying to access Wikipedia content "at a volume and speed designed specifically for their needs," the Wikimedia Foundation said. It did not provide financial or other details.

While AI training has sparked legal battles elsewhere over copyright and other issues, Wikipedia founder Jimmy Wales said he welcomes it. "I'm very happy personally that AI models are training on Wikipedia data because it's human curated," Wales told The Associated Press in an interview. "I wouldn't really want to use an AI that's trained only on X, you know, like a very angry AI," Wales said, referring to billionaire Elon Musk's social media platform. Wales said the site wants to work with AI companies, not block them. But "you should probably chip in and pay for your fair share of the cost that you're putting on us."

The Wikimedia Foundation, the nonprofit group that runs Wikipedia, last year urged AI developers to pay for access through its enterprise platform and said human traffic had fallen 8%. Meanwhile, visits from bots, sometimes disguised to evade detection, were heavily taxing its servers as they scrape masses of content to feed AI large language models. The findings highlighted shifting online trends as search engine AI overviews and chatbots summarize information instead of sending users to sites by showing them links.

Wikipedia is the ninth most visited site on the internet. It has more than 65 million articles in 300 languages that are edited by some 250,000 volunteers. The site has become so popular in part because it's free for anyone to use. "But our infrastructure is not free, right?" Wikimedia Foundation CEO Maryana Iskander said in a separate interview in Johannesburg, South Africa. It costs money to maintain servers and other infrastructure that allows both individuals and tech companies to "draw data from Wikipedia," said Iskander, who's stepping down on Jan. 20 and will be replaced by Bernadette Meehan.

The bulk of Wikipedia's funding comes from 8 million donors, most of them individuals. "They're not donating in order to subsidize these huge AI companies," Wales said. They're saying, "You know what, actually you can't just smash our website. You have to sort of come in the right way."

Editors and users could benefit from AI in other ways. The Wikimedia Foundation has outlined an AI strategy that Wales said could result in tools that reduce tedious work for editors.
While AI isn't good enough to write Wikipedia entries from scratch, it could, for example, be used to update dead links by scanning the surrounding text and then searching online to find other sources. "We don't have that yet but that's the kind of thing that I think we will see in the future."

Artificial intelligence could also improve the Wikipedia search experience, by evolving from the traditional keyword method to more of a chatbot style, Wales said. "You can imagine a world where you can ask the Wikipedia search box a question and it will quote to you from Wikipedia," he said. It could respond by saying "here's the answer to your question from this article and here's the actual paragraph. That sounds really useful to me and so I think we'll move in that direction as well."

Reflecting on the early days, Wales said it was a thrilling time because many people were motivated to help build Wikipedia after he and co-founder Larry Sanger, who departed long ago, set it up as an experiment. However, while some might look back wistfully on what seems now to be a more innocent time, Wales said those early days of the internet also had a dark side. "People were pretty toxic back then as well. We didn't need algorithms to be mean to each other," he said. "But, you know, it was a time of great excitement and a real spirit of possibility."

Wikipedia has lately found itself under fire from figures on the political right, who have dubbed the site "Wokepedia" and accused it of being biased in favor of the left. Republican lawmakers in the U.S. Congress are investigating alleged "manipulation efforts" in Wikipedia's editing process that they said could inject bias and undermine neutral points of view on its platform and the AI systems that rely on it. A notable source of criticism is Musk, who last year launched his own AI-powered rival, Grokipedia. He has criticized Wikipedia for being filled with "propaganda" and urged people to stop donating to the site.

Wales said he doesn't consider Grokipedia a "real threat" to Wikipedia because it's based on large language models, which are trained on troves of online text. "Large language models aren't good enough to write really quality reference material. So a lot of it is just regurgitated Wikipedia," he said. "It often is quite rambling and sort of talks nonsense. And I think the more obscure topic you look into, the worse it is." He stressed that he wasn't singling Grokipedia out for criticism. "It's just the way large language models work."

Wales says he's known Musk for years but they haven't been in touch since Grokipedia launched. "'How's your family?' I'm a nice person, I don't really want to pick a fight with anybody."

____ AP writer Mogomotsi Magome in Johannesburg contributed to this report
[4]
Wikipedia unveils new AI licensing deals as it marks 25th birthday
[5]
At 25, Wikipedia Navigates a Quarter-Life Crisis in the Age of A.I.
As A.I. search reshapes how people get answers, the encyclopedia confronts falling page views and a fight for relevance.

Traffic to Wikipedia, the world's largest online encyclopedia, naturally ebbs and flows with the rhythms of daily life -- rising and falling with the school calendar, the news cycle or even the day of the week -- making routine fluctuations unremarkable for a site that draws roughly 15 billion page views a month. But sustained declines tell a different story. Last October, the Wikimedia Foundation, the nonprofit that oversees Wikipedia, disclosed that human traffic to the site had fallen 8 percent in recent months as a growing number of users turned to A.I. search engines and chatbots for answers. "I don't think that we've seen something like this happen in the last seven to eight years or so," Marshall Miller, senior director of product at the Wikimedia Foundation, told Observer.

Launched on Jan. 15, 2001, Wikipedia turns 25 today. The milestone comes at a pivotal point for the online encyclopedia, which is straddling a delicate line between fending off existential risks posed by A.I. and avoiding irrelevance as the technology transforms how people find and consume information. "It's really this question of long-term sustainability," Lane Becker, senior director of earned revenue at the Wikimedia Foundation, told Observer. "We'd like to make it at least another 25 years -- and ideally much longer."

While it's difficult to attribute Wikipedia's recent traffic decline to any single factor, it's evident that the drop coincides with the emergence of A.I. search features, according to Miller. Chatbots such as ChatGPT and Perplexity often cite and link to Wikipedia, but because the information is already embedded in the A.I.-generated response, users are less likely to click through to the source, depriving the site of page views.

Yet the spread of A.I.-generated content also underscores Wikipedia's central role in the online information ecosystem. Wikipedia's vast archive -- more than 65 million articles across over 300 languages -- plays a prominent role within A.I. tools, with the site's data scraped by nearly all large language models (LLMs). "Yes, there is a decline in traffic to our sites, but there may well be more people getting Wikipedia knowledge than ever because of how much it's being distributed through those platforms that are upstream of us," said Miller.

Surviving in the era of A.I.

Wikipedia must find a way to stay financially and editorially viable as the internet changes. Declining page views not only mean that fewer visitors are likely to donate to the platform, threatening its main source of revenue, but also risk shrinking the community of volunteer editors who sustain it. Fewer contributors would mean slower content growth, ultimately leaving less material for LLMs to draw from. Metrics that track volunteer participation have already begun to slip, according to Miller. While noting that "it's hard to parse out all the different reasons that this happens," he conceded that the Foundation has "reason to believe that declines in page views will lead to declines in volunteer activity."
To maintain a steady pipeline of contributors, users must first become aware of the platform and understand its collaborative model. That makes proper attribution by A.I. tools essential, Miller said. Beyond simply linking to Wikipedia, surfacing metadata -- such as when a page was last updated or how many editors contributed -- could spur curiosity and encourage users to engage more deeply with the platform.

Tech companies are becoming aware of the value of keeping Wikipedia relevant. Over the past year, Microsoft, Mistral AI, Perplexity AI, Ecosia, Pleias and ProRata have joined Wikimedia Enterprise, a commercial product that allows corporations to pay for large-scale access to and distribution of Wikipedia content. Google and Amazon have long been partners of Wikimedia Enterprise, which launched in 2021. The basic premise is that Wikimedia Enterprise customers can access Wikipedia content at a higher volume and speed while helping sustain the platform's mission. "I think there's a growing understanding on the part of these A.I. companies about the significance of the Wikipedia dataset, both as it currently exists and also its need to exist in the future," said Becker.

Wikipedia is hardly alone in this shift. News organizations, including CNN, the Associated Press and The New York Times, have struck licensing deals with A.I. companies to supply editorial content in exchange for payment, while infrastructure providers like Cloudflare offer tools that allow websites to charge A.I. crawlers for access. Last month, the licensing nonprofit Creative Commons announced its support of a "pay-to-crawl" approach for managing A.I. bots.

Preparing for an uncertain future

Wikipedia itself is also adapting to a younger generation of internet users. In an effort to make editing Wikipedia more appealing, the platform is working to enhance its mobile editing features, reflecting the fact that younger audiences are far more likely to engage on smartphones than desktop computers. Younger users' preference for social video platforms such as YouTube and TikTok has also pushed Wikipedia's Future Audiences team -- a division tasked with expanding readership -- to experiment with video. The effort has already paid off, producing viral clips on topics ranging from Wikipedia's most hotly disputed edits to the courtship dance of the black-footed albatross and Sino-Roman relations. The organization is also exploring a deeper presence on gaming platforms, another major draw for younger users.

Evolving with the times also means integrating A.I. further within the platform. Wikipedia has introduced features such as Edit Check, which offers real-time feedback on whether a proposed edit fits a page, and is developing features like Tone Check to help ensure articles adhere to a neutral point of view. A.I.-generated content has also begun to seep onto the platform. As of August 2024, roughly 5 percent of newly created English articles on the site were produced with the help of A.I., according to a Princeton study. Seeing this as a problem, Wikipedia introduced a "speedy deletion" policy that allows editors to quickly remove content that shows clear signs of being A.I.-generated. Still, the community remains divided over whether using A.I. for tasks such as drafting articles is inherently problematic, said Miller. "There's this active debate."

From streamlining editing to distributing its content ever more widely, Wikipedia is betting that A.I. can ultimately be an ally rather than an adversary.
If managed carefully, the technology could help accelerate the encyclopedia's mission over the next 25 years -- as long as it doesn't bring down the encyclopedia first. "Our whole thing is knowledge dissemination to anyone that wants it, anywhere that they want it," said Becker. "If this is how people are going to learn things -- and people are learning things and gaining value from the information that our community is able to bring forward -- we absolutely want to find a way to be there and support it in ways that align with our values."
[6]
Wikipedia unveils new AI licensing deals as it marks 25th birthday
Wikipedia celebrates its 25th anniversary by signing AI licensing deals with Amazon, Meta, Microsoft, Perplexity, and Mistral AI. But the online encyclopedia faces an 8% drop in human traffic as chatbots and AI search engines reshape how people access information. The Wikimedia Foundation is monetizing AI access while defending its volunteer-driven model in an era where AI-generated content threatens its sustainability.
Wikipedia reached its 25th anniversary on January 15, marking a milestone that few could have predicted when Jimmy Wales and Larry Sanger launched the crowdsourced encyclopedia in 2001 [1][3]. To commemorate the occasion, the Wikimedia Foundation unveiled AI licensing deals with major technology companies including Amazon, Meta, Microsoft, Perplexity, and France's Mistral AI [3][4]. These agreements allow AI companies to access Wikipedia content "at a volume and speed designed specifically for their needs," though financial details remain undisclosed [3]. The ninth most visited site on the internet, with more than 65 million articles in 300 languages edited by some 250,000 volunteer editors, now finds itself navigating a delicate balance in the age of AI [3][4].
The Wikipedia 25th anniversary celebration comes amid concerning signs that AI is reshaping the information ecosystem. Last October, the Wikimedia Foundation disclosed that human traffic to the site had fallen 8% in recent months, a sustained decline not seen in seven to eight years [5]. Marshall Miller, senior director of product at the Wikimedia Foundation, told Observer that while Wikipedia naturally experiences routine fluctuations in its roughly 15 billion monthly page views, this drop coincides with the emergence of AI search features [5]. Chatbots such as ChatGPT and Perplexity often cite and link to Wikipedia, but because information is already embedded in AI-generated responses, users are less likely to click through to the source, depriving the site of page views [5]. Meanwhile, visits from bots, sometimes disguised to evade detection, were heavily taxing servers as they scrape masses of content to feed AI large language models [3].
Wikipedia founder Jimmy Wales emphasized that he welcomes AI training on the platform's vast repository of free knowledge. "I'm very happy personally that AI models are training on Wikipedia data because it's human curated," Wales told The Associated Press [3][4]. However, he stressed that AI companies should "chip in and pay for your fair share of the cost that you're putting on us" [3]. The nonprofit signed Google as one of its first customers in 2022 and announced agreements last year with smaller AI players like search engine Ecosia [3]. Through Wikimedia Enterprise, launched in 2021, companies including Microsoft, Mistral AI, Perplexity AI, Ecosia, Pleias, and ProRata have joined as partners [5]. Lane Becker, senior director of earned revenue at the Wikimedia Foundation, noted growing understanding among AI companies about the significance of the Wikipedia dataset [5].
The bulk of Wikipedia's funding comes from 8 million donors, most of them individuals, and Wikimedia Foundation CEO Maryana Iskander emphasized that infrastructure costs remain substantial [3]. Wales pointed out that donors "are not donating in order to subsidize these huge AI companies" [3]. Beyond financial concerns, declining page views threaten the volunteer editors who sustain the platform. Metrics tracking volunteer participation have already begun to slip, and Miller acknowledged "reason to believe that declines in page views will lead to declines in volunteer activity" [5]. The platform has spent 25 years managing challenges including maintaining reliability at scale, preventing volunteer burnout, and defending against political and legal attacks [1]. Systemic biases persist, as most editors are male and concentrated in North America and Europe, with new and minority editors often reporting a hostile climate [1].
The rise of Elon Musk's Grokipedia, an AI-powered rival whose creator dismissively refers to Wikipedia as "Wokepedia," presents another challenge [2]. Launched on October 27, 2025, the AI encyclopedia already has more than 5.6 million entries, compared with Wikipedia's total of over 7.1 million [2]. While Grokipedia uses AI to generate most entries, some are near-identical to Wikipedia's, all of which are available under Creative Commons licensing [2]. Wikipedia has also found itself under fire from figures on the political right, who accuse it of bias [3]. Yet Wikipedia embodied the web 2.0 dream of a non-hierarchical, user-led internet built on participation and sharing, launched in the aftermath of the 2000-2001 dotcom crash [2].

While AI poses existential risks, Wales outlined ways the technology could support Wikipedia. The Wikimedia Foundation has developed an AI strategy that could reduce tedious work for volunteer editors [3]. Though AI isn't good enough to write Wikipedia entries from scratch, it could update dead links by scanning surrounding text and searching online to find other sources [3][4]. Artificial intelligence could also improve the Wikipedia search experience by evolving from traditional keyword methods to a chatbot style, allowing users to ask questions and receive quoted paragraphs with source attribution [3]. The Wikimedia Foundation's total assets now stand at more than $310 million as it confronts questions of long-term sustainability [2][5]. As Becker put it, the goal is clear: "We'd like to make it at least another 25 years -- and ideally much longer" [5].