MIT develops AI tech to edit outdated Wikipedia articles

Artificial intelligence technology could be used to rewrite outdated Wikipedia articles, reducing the workload for human editors, thanks to a system developed by the Massachusetts Institute of Technology (MIT).

Bots have been used to edit Wikipedia in the past, and while AI systems can now generate text and check facts, they aren’t always accurate and often struggle to mimic the tone of human writers.

By training its system on two datasets, one of which contains pairs of sentences in which a claim is matched with a relevant Wikipedia sentence, the researchers at MIT built an AI system that can find outdated text in Wikipedia pages and then rewrite the affected passages in a human-like fashion.
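As a rough illustration of what such training pairs might look like, the snippet below shows one possible shape for the data. The field names and example sentences are invented for illustration and are not the actual dataset used by the researchers.

```python
# Hypothetical illustration of the paired training data described above;
# the field names and example sentences are invented, not the real dataset.
training_pairs = [
    {
        "claim": "The company employed 12,000 people as of 2020.",
        "wikipedia_sentence": "The company employs 9,500 people.",
    },
    {
        "claim": "The observatory began operations in 2019.",
        "wikipedia_sentence": "The observatory is under construction.",
    },
]
```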

The AI system works by taking new or updated information typed into the user interface as an unstructured sentence, one written without regard for grammar or style. The algorithm then finds the relevant Wikipedia entry, pinpoints the passage the updated information refers to, and rewrites the relevant sentence with the amended details in a human-like fashion.
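To make that workflow concrete, here is a minimal Python sketch of the pipeline described above. It is purely illustrative and not MIT’s model: the article lookup, sentence matching, and rewriting steps are stand-ins (simple word-overlap scoring and a hypothetical rewrite_sentence placeholder) for the neural components the researchers actually trained.

```python
# Illustrative sketch only -- NOT the MIT/CSAIL system. The matching and
# rewriting steps are naive placeholders for the trained neural models.

def _tokens(text: str) -> set:
    """Lowercase bag-of-words, used here as a crude relevance signal."""
    return set(text.lower().replace(",", " ").replace(".", " ").split())


def find_outdated_sentence(article_sentences: list, claim: str) -> int:
    """Pinpoint the sentence most related to the unstructured claim
    (the real system uses a trained model; word overlap stands in here)."""
    claim_words = _tokens(claim)
    scores = [len(claim_words & _tokens(s)) for s in article_sentences]
    return scores.index(max(scores))


def rewrite_sentence(old_sentence: str, claim: str) -> str:
    """Hypothetical placeholder for the step that merges the new facts
    into the old sentence while keeping its style."""
    return f"{old_sentence.rstrip('.')} (updated: {claim})."


def update_article(article_sentences: list, claim: str) -> list:
    """End-to-end flow described in the article: locate the outdated
    sentence, then rewrite it with the amended details."""
    idx = find_outdated_sentence(article_sentences, claim)
    updated = article_sentences.copy()
    updated[idx] = rewrite_sentence(article_sentences[idx], claim)
    return updated


if __name__ == "__main__":
    article = [
        "The spacecraft launched in 2018.",
        "It carries four scientific instruments.",
    ]
    claim = "spacecraft carries six instruments"
    print(update_article(article, claim))
```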

The idea is that the AI removes the need for human editors to laboriously search through and amend Wikipedia entries.

“There are so many updates constantly needed to Wikipedia articles. It would be beneficial to automatically modify exact portions of the articles, with little to no human intervention,” says Darsh Shah, a PhD student in the Computer Science and Artificial Intelligence Laboratory (CSAIL) and one of the lead authors of a paper detailing the AI research. “Instead of hundreds of people working on modifying each Wikipedia article, then you’ll only need a few, because the model is helping or doing it automatically. That offers dramatic improvements in efficiency.”

The AI can also help identify so-called ‘fake news’, as it has been trained to recognise the difference between legitimate additions to a Wikipedia entry and information that is factually incorrect. The system seeks out evidence and other information to ascertain whether such a claim is true or false.
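The sketch below is a toy illustration of that evidence-checking idea, not the trained fact-verification model: a real system would use a learned stance or entailment classifier over retrieved evidence, whereas here a crude word-overlap threshold stands in simply to show the flow of retrieving evidence and then judging the claim.

```python
# Toy illustration of evidence-based claim checking, NOT the trained model.
# Crude word overlap stands in for a learned stance/entailment classifier.

def _words(text: str) -> set:
    """Lowercase bag-of-words with any trailing full stop stripped."""
    return set(text.lower().strip(".").split())


def check_claim(claim: str, evidence_sentences: list, threshold: float = 0.6) -> str:
    """Find the best-matching evidence sentence and return a rough verdict
    based on how much of the claim's wording that evidence covers."""
    claim_words = _words(claim)
    best_overlap = max(
        len(claim_words & _words(evidence)) / max(len(claim_words), 1)
        for evidence in evidence_sentences
    )
    return "likely supported" if best_overlap >= threshold else "unverified"


if __name__ == "__main__":
    evidence = [
        "The telescope was launched in December 2021.",
        "It orbits the Sun near the second Lagrange point.",
    ]
    print(check_claim("The telescope was launched in 2021", evidence))  # likely supported
    print(check_claim("The telescope carries a gold-plated mirror", evidence))  # unverified
```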

The system is not ready for full use on Wikipedia just yet: human editors gave it a score of four out of five for factual accuracy and three-and-a-half out of five for grammatical accuracy. As such, it outperforms other AI tools but isn’t quite able to match human editors.

However, it is a sign of the continued development and refinement of AI technology and how it can be used to make life easier for humans rather than replace them, thus enshrining the ideals of digital transformation.

Roland Moore-Colyer

Roland is a passionate newshound whose journalism training initially involved a broadcast specialism, but he’s since found his home in breaking news stories online and in print.

He held a freelance news editor position at ITPro for a number of years after his lengthy stint writing news, analysis, features, and columns for The Inquirer, V3, and Computing. He was also the news editor at Silicon UK before joining Tom’s Guide in April 2020 where he started as the UK Editor and now assumes the role of Managing Editor of News.

Roland’s career has seen him develop expertise in both consumer and business technology, and during his freelance days, he dabbled in the world of automotive and gaming journalism, too.