Wikipedia is now relying on artificial intelligence (AI) to help it keep track of bad edits on its site.
More than half a million changes are made to Wikipedia’s articles every day, and they are usually monitored by a large number of volunteers who ensure that edits are correct and factual. Because each edit has to be reviewed manually, the organisation is looking to AI to help out.
The Objective Revision Evaluation Service (ORES) is said to be like “a pair of X-ray specs”, highlighting edits that could potentially be damaging.
“By combining open data and open source machine learning algorithms, our goal is to make quality control in Wikipedia more transparent, auditable and easy to experiment with,” says the Wikimedia Foundation in a blog post. “Our hope is that ORES will enable critical advancements in how we do quality control – changes that will both make quality control work more efficient and make Wikipedia a more welcoming place for new editors.”
The system is trained on edit and article quality assessments made by the site’s editors, and it produces scores for both edits and articles. The AI judges the quality of an edit by looking at the context of the change and the language used.
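To make that concrete, a scoring system of this kind can be sketched as a classifier that turns an edit into numeric features and outputs a probability that the edit is damaging. The features, lexicon and weights below are purely hypothetical illustrations, not the Wikimedia Foundation’s actual model or API:

```python
import math

def extract_features(old_text, new_text):
    """Turn an edit (old revision -> new revision) into numeric features."""
    removed = max(len(old_text) - len(new_text), 0)  # large deletions are suspicious
    letters = [c for c in new_text if c.isalpha()]
    # Shouting (all-caps) is a common vandalism signal.
    upper_ratio = sum(c.isupper() for c in letters) / len(letters) if letters else 0.0
    bad_words = {"stupid", "fake", "lol"}  # toy vandalism lexicon (illustrative only)
    profanity = sum(word.lower() in bad_words for word in new_text.split())
    return [removed / 100.0, upper_ratio, float(profanity)]

# Hypothetical weights, standing in for parameters learned from
# editor-labelled revisions.
WEIGHTS = [0.8, 2.0, 1.5]
BIAS = -2.0

def damaging_probability(old_text, new_text):
    """Logistic model: higher score means the edit is more likely damaging."""
    z = BIAS + sum(w * f for w, f in zip(WEIGHTS, extract_features(old_text, new_text)))
    return 1.0 / (1.0 + math.exp(-z))
```

A benign copy-edit would score low here, while an all-caps insult would score high; a real service like ORES uses far richer features and models trained on editors’ own quality judgements.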
The idea is that using AI to tackle obvious errors or damaging changes would take some of the workload off current volunteers and encourage new ones to join up and contribute.
Alongside the introduction of ORES, the organisation is working on three further improvements to the service, including classifying edit types and detecting bias.