Inscrutable systems, tangible harms

In 2018, it came to light that thousands of parents in the Netherlands had been wrongfully accused of fraud by the Dutch Tax Authority. A driving factor was the irresponsible deployment of an algorithmic profiling system that disproportionately selected parents with a migration background and a lower socio-economic status, contributing to their mislabelling as fraudsters. This scandal, now widely known as the Child Care Benefits Scandal, exposed the discriminatory harms that can result from the irresponsible use of algorithmic profiling in supervision and enforcement. This dissertation examines the regulation and governance of discrimination in algorithmic profiling.
This dissertation comprises three core parts. The first part examines discrimination risks in algorithmic profiling and critically analyses the applicable and emerging European regulatory framework. The second part examines how Dutch national and implementing organizations operationalize the European regulatory framework and address discrimination risks in algorithmic profiling; here, the focus shifts from the regulation to the governance of discrimination risks in algorithmic profiling in supervision and enforcement. The third part concludes the research by considering whether the current regulatory framework guarantees effective risk governance or whether additional legislation is necessary to safeguard against discrimination.