MIT develops AI tech to edit outdated Wikipedia articles
The technology uses machine learning to automatically spot and replace outdated information, with the goal of aiding human editors
Artificial intelligence could be used to rewrite outdated Wikipedia articles, reducing the workload for human editors, thanks to a system developed by the Massachusetts Institute of Technology (MIT).
Bots have been used to edit Wikipedia in the past, and while AI systems are now capable of generating text and checking facts, they often struggle to mimic the tone of human writers.
By training the system on two datasets, one containing pairs of sentences and the other matching claims to relevant Wikipedia sentences, the MIT researchers produced an AI that can find outdated text in Wikipedia pages and rewrite the affected passages in a human-like fashion.
The AI system takes new or updated information entered into its user interface as an unstructured sentence, one written without regard for grammar or style. The algorithm then finds the relevant Wikipedia entry, pinpoints the passage the updated information refers to, and rewrites that passage with the amended details in a human-like style.
The idea is that the AI removes the need for human editors to laboriously search through and amend Wikipedia entries.
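The pipeline described above can be sketched in a few lines. This is an illustrative simplification, not MIT's implementation: the `relevance` and `rewrite` helpers here are placeholder assumptions standing in for the system's learned neural components.

```python
# Minimal sketch of the update pipeline: locate the article sentence
# most relevant to a new claim, then rewrite it with the amended details.
# Helper names are hypothetical, chosen for illustration only.

def relevance(sentence, claim):
    """Crude relevance score: count of words shared by sentence and claim."""
    return len(set(sentence.lower().split()) & set(claim.lower().split()))

def update_article(sentences, claim, rewrite):
    """Replace the sentence most relevant to the claim with its rewrite."""
    target = max(sentences, key=lambda s: relevance(s, claim))
    return [rewrite(s, claim) if s is target else s for s in sentences]
```

In the actual system, both the sentence retrieval and the rewriting step are learned models rather than word-overlap heuristics and a callback.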
“There are so many updates constantly needed to Wikipedia articles. It would be beneficial to automatically modify exact portions of the articles, with little to no human intervention,” says Darsh Shah, a PhD student in the Computer Science and Artificial Intelligence Laboratory (CSAIL) and one of the lead authors of a paper detailing the AI research. “Instead of hundreds of people working on modifying each Wikipedia article, then you’ll only need a few, because the model is helping or doing it automatically. That offers dramatic improvements in efficiency.”
The AI can also help identify so-called 'fake news', as it has been trained to distinguish legitimate additions to a Wikipedia entry from information that is factually incorrect. The system seeks out evidence and other information to ascertain whether a given claim is true or false.
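The evidence-checking step described above can be sketched as a simple verdict over retrieved evidence. This is a hedged illustration, not MIT's code: the `entails` callback stands in for a learned model that judges whether an evidence sentence supports (1), refutes (-1), or is neutral (0) toward the claim.

```python
# Hypothetical sketch of claim verification: gather verdicts from
# evidence sentences and reduce them to a single label. The entails
# function is an assumed stand-in for a trained entailment model.

def verify_claim(claim, evidence, entails):
    """Return a verdict for the claim given a list of evidence sentences."""
    verdicts = [entails(e, claim) for e in evidence]
    if -1 in verdicts:
        return "REFUTED"
    if 1 in verdicts:
        return "SUPPORTED"
    return "NOT ENOUGH INFO"
```

The three-way labeling (supported, refuted, not enough info) mirrors the standard framing of fact-verification tasks.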
The system is not ready for full use on Wikipedia just yet: human editors scored it four out of five for factual accuracy and three-and-a-half out of five for grammatical accuracy. As such, it outperforms other AI tools but isn't quite able to match human editors.
However, it is a sign of the continued development and refinement of AI technology, and of how it can be used to make life easier for humans rather than replace them.