Mobile devices as stigmatizing security sensors: The GDPR and a future of crowdsourced ‘broken windows’

Research output: Contribution to journal › Article › Academic › peer review

Abstract

Various smartphone apps and services encourage users to report where and when they feel they are in an unsafe or threatening environment. This user-generated content may be used to build datasets which show areas that are considered ‘bad’ and to map out ‘safe’ routes through such neighbourhoods. Despite certain advantages, such data inherently carries the danger that streets or neighbourhoods become stigmatized and that existing prejudices are reinforced. Such stigmas may also have negative consequences for property values and businesses, causing irreversible damage to certain parts of a municipality. Overcoming such an “evidence-based stigma”, even where it rests on biased, unreviewed, outdated, or inaccurate data, becomes nearly impossible and raises the question of how such data should be managed.
Original language: English
Pages (from-to): 69-85
Number of pages: 17
Journal: International Data Privacy Law
Volume: 8
Issue number: 1
Early online date: 19-Dec-2017
DOIs
Publication status: Published - Feb-2018

Keywords

  • Apps
  • Crowdsourcing
  • GDPR
  • Privacy
  • Public Space
  • Security
  • Data protection
  • Surveillance
  • Defense
  • Numbers
