- AI can assist in clearing up these erroneous and incomplete Wikipedia references.
- SIDE can automate the process of evaluating the sources cited on Wikipedia pages.
- The Indian Supreme Court issued a warning earlier this year to authorities not to rely solely on information from Wikipedia.
For millions of people worldwide, Wikipedia is the first resource they turn to when researching virtually anything online. Thousands of volunteers contribute to the non-profit organization's site, regularly verifying the reliability and correctness of the sources cited there. Even so, some references are faulty, contain false information, or point to unreliable sources.
However, it appears that AI could help. A study published in Nature Machine Intelligence claims that artificial intelligence can assist in clearing up these erroneous and incomplete Wikipedia references.
The neural network, called SIDE, was created by the London-based startup Samaya AI. It can automate the evaluation of references cited on Wikipedia pages, determining whether they actually support the assertions made in an article, and can suggest substitutes for references that don't.
The researchers say SIDE was trained on a database of high-quality online references, ones that are frequently endorsed and attract the attention of Wikipedia's editors and moderators. According to the study, in nearly 50% of cases SIDE returned the same reference already cited in the article; in the remaining cases, it recommended a more appropriate one.
When the AI's suggestions were presented to a group of Wikipedia users, 21% said they preferred the AI's recommended references, 10% preferred the pre-existing sources, and 39% expressed no preference.
Earlier this year, the Indian Supreme Court warned authorities not to rely solely on information from Wikipedia and other user-edited platforms, stating that such models can even "promote misleading information" and are not entirely reliable in terms of academic veracity.