The use of software tools and autonomous bots against vandalism: eroding Wikipedia's moral order?

Paul B. de Laat*

*Corresponding author for this work

Research output: Academic, peer reviewed

9 Citations (Scopus)
301 Downloads (Pure)


English-language Wikipedia is constantly plagued by vandalistic contributions on a massive scale. To fight them, its volunteer contributors deploy an array of software tools and autonomous bots. After an analysis of their functioning and of the 'coactivity' between humans and bots, this research 'discloses' the moral issues that emerge from their combined patrolling. Administrators provide the stronger tools only to trusted users, thereby creating a new hierarchical layer. Further, surveillance exhibits several troubling features: questionable profiling practices (concerning anonymous users in particular), the use of the controversial measure of reputation (under consideration), 'oversurveillance' in which quantity trumps quality, and a prospective loss of the required moral skills whenever bots take over from humans. The most troubling aspect, though, is that Wikipedia has become a Janus-faced institution. One face is the basic platform of MediaWiki software, transparent to all. Its other face is the anti-vandalism system, which, in contrast, is opaque to the average user, in particular as a result of the algorithms and neural networks in use. Finally, it is argued that this secrecy prevents a much-needed discussion from unfolding; a discussion that should focus on a 'rebalancing' of the anti-vandalism system and the development of more ethical information practices towards contributors.

Original language: English
Pages (from-to): 175-188
Number of pages: 14
Journal: Ethics and Information Technology
Issue number: 3
Status: Published - Sep. 2015
