
Wikipedia Announces Plans To Use AI
The Wikimedia Foundation, the non-profit behind Wikipedia, has announced plans to integrate artificial intelligence (AI) into its work over the next three years. The organization plans to use “open-source or open-weight AI” to support volunteers by automating technical and repetitive tasks while maintaining the role of human editors.
Wikipedia emphasized that it will take “a human-centered approach” that prioritizes human agency. The goal is to use AI to “remove technical barriers” and perform “tedious tasks,” freeing up time for editors, moderators, and patrollers to accomplish what they need to rather than worrying about how to “technically achieve it.”
Wikipedia already uses AI to predict readability, translate content, and detect vandalism. Until this announcement, however, the foundation had not offered AI tools directly to its editors. Moving forward, it intends to apply AI to the specific use cases where it excels, such as streamlining work through AI-assisted workflows, automating translation, and making Wikipedia’s content easier to discover in search results.
The non-profit also believes AI will give editors more time to decide what Wikipedia should cover and to keep content accurate and up to date. In addition, the foundation expects AI to help editors onboard new volunteers.
“We believe that our future work with AI will be successful not only because of what we do, but how we do it,” reads the foundation’s blog post announcing the plans.
One of Wikipedia’s biggest challenges is that the volume of content is outpacing the number of volunteers able to moderate it. With the help of AI, the foundation expects to automate routine tasks, reducing the burden on human editors and helping maintain accuracy across its millions of articles.
Some experts have warned that AI might render Wikipedia obsolete. The foundation, however, sees its role becoming more important in an AI-dominated era, precisely because AI models are notorious for making mistakes and hallucinating fabricated facts.
Wikipedia has also faced strain on its servers from AI bots scraping its content to train large language models (LLMs). Traffic from the growing number of bots has overloaded Wikipedia’s servers, increasing bandwidth consumption by 50%. To address this, Wikimedia released a dedicated dataset optimized for machine learning, steering bots away from the infrastructure that serves human readers.