
“Badly written and unsourced” content: how Wikipedia is trying to combat AI-generated articles

The growing volume of AI-generated content is forcing Wikipedia's volunteers to rethink their rules. Several contributors are blacklisting AI-generated websites as sources and trying to delete chatbot-written articles as quickly as possible.

It is painstaking work. For several months, Wikipedia, the famous collaborative encyclopedia, has faced an influx of content generated by artificial intelligence. As one might expect, this information is often false or misleading.

To counter this phenomenon, Wikipedians, the volunteers who write for and moderate the online encyclopedia, have mobilized. As The Verge points out, this is the purpose of WikiProject AI Cleanup. Its goal? To fight the “growing problem of AI-generated content that is badly written and unsourced.” Marshall Miller, a product director at the Wikimedia Foundation, even compares this mobilization to a kind of “immune system” response.

Adapting to the new challenge of AI

To curb the proliferation of such content, volunteers track down fake news sites created with AI, which publish plagiarized or invented articles and are cited as sources on certain pages. In total, nearly 150 fake sites have been identified among Wikipedia's sources. In detail, “105 use AI to write their texts, 65 rely on plagiarism (with or without translation, with or without rewording by AI),” the community specifies in a note.

Volunteers also try to delete AI-written articles as quickly as possible. To that end, Wikipedia has adapted its rules. Normally, articles nominated for deletion are subject to a seven-day discussion period during which community members decide whether the article should be removed. A new rule allows administrators to bypass this discussion if an article is clearly AI-generated and has not been reviewed by the person who submitted it.

To detect such articles, several clues can tip editors off, because these articles are often very badly written. For example, they cite incorrect references to authors or publications, or include links that lead to non-existent sites. Telltale sentences such as “here is your Wikipedia article” may even appear in the text. There are also expressions and formatting traits typically found in chatbot-written prose. The list includes the excessive use of dashes associated with chatbots, certain conjunctions such as “moreover,” and promotional wording such as “impressive.”

A “double-edged” weapon

But policy around AI within Wikipedia has not always been so clear-cut. Last June, the Wikimedia Foundation, which hosts the encyclopedia but does not take part in drafting the site's policies, offered some contributors the chance to test generative AI tools for writing articles. AI-generated summaries were placed at the top of articles.

The concept was far from winning over Wikipedians. The community pointed to the risk of misinformation linked to AI hallucinations, as well as readability problems in mixing machine-generated content with human work. The test was therefore suspended.

For now, the Wikimedia Foundation instead plans to use AI to help editors with repetitive tasks and translation.

Author: Salome Ferraris
Source: BFM TV

