Bots are gaining ground in several disciplines; some do incredibly well, while others are complete disasters. Nevertheless, their presence continues to grow, as they are tools with hundreds of possibilities that can be used relatively easily. Wikipedia is a good example: there, bots are able to identify and undo vandalism, add links, correct spelling, and watch for any changes, which also means potential collisions with other bots that do exactly the same thing.
A group of researchers from Oxford and the Alan Turing Institute has published a study showing how bots can behave in completely unpredictable ways, like a group of children fighting to impose their own rules. Yes, much like humans, but with one difference: here there is no possibility of agreement, because there is no communication between them.
Bots revert each other again and again
The study analyzed billions of edits made to Wikipedia between 2001 and 2010 by five million humans and 2,000 bots. The interesting part is that, of all these edits, only 4.7 million are reversions, a figure that has been increasing year after year even though the number of human and bot editors has declined since 2001.
The volume of reversions has increased mainly because bots are programmed to check their changes at regular intervals, which causes them to fall into a kind of endless editing loop: each bot finds that its last version has been modified and reverts it, again and again. This will keep happening as long as the creators of these bots do not reach an agreement, an almost impossible task, and one where only Wikipedia itself could step in to bring order.
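To make the dynamic concrete, here is a minimal sketch of the loop described above. The bot names, terms, and function are hypothetical, not taken from the study: two bots each enforce a different preferred phrase, periodically re-check the article, and revert whatever the other one wrote.

```python
def run_revert_loop(term_a, term_b, checks=10):
    """Simulate two bots that each enforce a different preferred term.

    Each "check cycle" represents both bots' periodic scans of the
    same disputed phrase in an article.
    """
    article = term_a          # current text of the disputed phrase
    reverts = 0
    for _ in range(checks):
        # Bot A's periodic check: restore its term if it was changed.
        if article != term_a:
            article = term_a
            reverts += 1
        # Bot B's periodic check: it disagrees and reverts right back.
        if article != term_b:
            article = term_b
            reverts += 1
    return reverts

# Neither bot ever "wins": every check cycle produces more reverts,
# and the total grows without bound as long as both keep running.
print(run_revert_loop("Persian Gulf", "Arabian Gulf", checks=10))  # → 19
```

Because neither bot models the other's existence, the loop never converges; the revert count simply scales with how long the bots run, which matches the study's finding that bot-on-bot conflicts drag on far longer than human ones.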
Part of what the study presents is a sign that bots are still far from matching human behavior, since they have no real understanding of what they are doing: they undo the work of other bots, and this becomes an endless battle. However, most of the disputed changes come down to the expressions a bot uses, which depend on where it was programmed. For example, many reversals come from changing "Palestine" to "Palestinian Territory", or "Persian Gulf" to "Arabian Gulf", and so on with millions of words and concepts that do not coincide across the various regions of the world.
Another curious element of all these changes is the matter of language. Entries in German, for example, have suffered the fewest changes, with an average of 24 reverts, while in Portuguese the same entry reached 185 rollbacks on average. As the study notes, this is due to cultural differences among the people who program these bots.
On the other hand, over this nine-year period humans reverted each other an average of three times, while English-language bots reverted each other an average of 105 times. This shows that the editing patterns of humans and bots are completely different: humans usually modify an entry and rarely return to verify that the published information is still current, while bots are programmed to perform this task systematically.
Something that could help with this small chaos and unproductive conflict would be for Wikipedia to authorize the creation of cooperative bots capable of managing disagreements, so that tasks are carried out efficiently and everyone can agree.