Addressing discrimination in algorithmic profiling: Examining risk governance in Dutch public social security agencies

Research output: Contribution to journal › Article › Academic › peer-review


Abstract

In the social security sector, both in the Netherlands and abroad, the irresponsible use of algorithmic profiling technologies to combat the misallocation of social security benefits has contributed to instances of discrimination. While a developing legal framework—made up of fundamental rights, the GDPR, and the recently adopted AI Act—requires risks of discrimination to be identified and mitigated, it offers limited guidance on how to implement these obligations. Social security agencies seeking to address the systemic risk of discrimination in algorithmic profiling are frustrated by the associated sociotechnical complexity, scientific uncertainty, and socio-legal ambiguity. This study examines how two Dutch social security agencies address discrimination risks in algorithmic profiling, using van Asselt and Renn's principles of systemic risk governance—communication and inclusion, integration, and reflection—as a theoretical framework. Through case studies involving document analysis and interviews, the research explores how these agencies address discrimination risks in practice. The study highlights the importance of socially robust risk governance structures that encompass both simpler rule-based selection systems and trained algorithmic systems, that include scientific and client perspectives, and that draw on the experiences of other agencies.

Original language: English
Pages (from-to): 191-212
Number of pages: 22
Journal: European Journal of Social Security
Volume: 27
Issue number: 2
DOIs
Publication status: Published - Jun-2025

Keywords

  • Algorithmic profiling
  • discrimination
  • risk governance
  • social security

