Meta AI researchers develop a model to check Wikipedia citations

Meta AI researchers have developed a model that can help verify citations on Wikipedia. Wikipedia is often the first stop when someone is looking for information online, but the validity of that information depends entirely on the sources it cites.

Volunteer editors verify citations, the researchers say, but they often struggle to keep up with the more than 17,000 articles added to the site each month. Automated tools already exist to flag statements that lack citations entirely, but determining whether a cited source actually confirms a claim is more complex.

Meta AI’s new model can analyze hundreds of thousands of citations at once and flag questionable ones for volunteer editors. The model can also suggest better sources to replace a bad one.
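To make the idea concrete, here is a toy sketch of that workflow: score how well each cited source supports its claim, and flag low-scoring citations for human review. This is an illustrative assumption, not Meta's actual model, which uses learned language representations rather than the crude word-overlap proxy and hypothetical `support_score`/`flag_questionable` helpers used below.

```python
# Toy sketch (NOT Meta's model): flag citations whose source text
# shares little vocabulary with the claim it is supposed to support.
# All claims, sources, and the threshold are illustrative assumptions.

def tokenize(text):
    # Lowercase and strip trailing punctuation for a rough word set.
    return {w.strip(".,").lower() for w in text.split()}

def support_score(claim, source_text):
    """Crude lexical-overlap proxy for whether a source supports a claim."""
    claim_words = tokenize(claim)
    if not claim_words:
        return 0.0
    return len(claim_words & tokenize(source_text)) / len(claim_words)

def flag_questionable(citations, threshold=0.5):
    """Return citations whose source text poorly supports the claim."""
    return [c for c in citations
            if support_score(c["claim"], c["source"]) < threshold]

citations = [
    {"claim": "The bridge opened in 1932.",
     "source": "The bridge opened to traffic in 1932 after years of work."},
    {"claim": "The river is 300 km long.",
     "source": "The festival is held every summer in the town square."},
]

for c in flag_questionable(citations):
    print("Questionable citation:", c["claim"])
```

A real verifier would replace the overlap score with a model that understands meaning, since a source can confirm a claim without reusing its words, but the flag-and-review loop is the same.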

The ultimate goal, according to the researchers, is to build a platform that helps Wikipedia editors spot and correct bad citations more quickly, keeping Wikipedia pages as accurate as possible.

“Open source projects like these, which teach algorithms to understand dense materials with an ever-increasing degree of sophistication, help AI make sense of the real world. Although we cannot yet design a computer system with human-level understanding of language, our research is creating smarter and more flexible algorithms. This improvement will only grow in importance as we rely on computers to interpret the growing volume of text citations generated each day,” the researchers wrote in a blog post.

They also noted that this model is a first step toward an editor that could check documents in real time, suggesting auto-complete text based on documents found online and providing proofreading. Future models would cover multiple languages and could draw on various types of media, such as videos, images, and data tables.

James G. Williams