Jantina Tammes School of Digital Society, Technology and AI

Digital Technologies: Flagged and Failed

digital dilemmas

Context

Your predictive policing software, marketed for crime hotspot identification, is trained on biased historical arrest data. As a result, it disproportionately targets Black communities, using socioeconomic factors as proxies for race and thereby amplifying systemic racism. Critics warn of "tech-washing" that masks deep inequities.

Dilemma

A) Halt sales and redesign the software to explicitly mitigate bias and ensure transparency, even if it impacts "accuracy" or profits.
B) Sell the software, securing a major contract and market share, despite reinforcing existing societal biases.

Summary

Predictive policing algorithms perpetuate systemic racism because of biased training data and a lack of transparency. These tools, used to predict crime hotspots or individual risk, are trained on arrest records that disproportionately reflect the policing of Black communities. Even without explicit racial data, the algorithms pick up proxies such as socioeconomic factors, reinforcing existing biases. Critics argue that, rather than reducing bias, these systems amplify it, creating a "tech-washing" effect that masks underlying inequities. Calls are growing to dismantle these flawed systems, as they feed a cycle of discriminatory policing.
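
To make the proxy mechanism concrete, the sketch below is a hypothetical illustration, not code from any real predictive policing product. It generates synthetic arrest records in which low-income areas are over-policed for the same underlying behaviour, then trains a simple classifier on those records; the variable names, rates, and the choice of scikit-learn's LogisticRegression are all assumptions made for the example.

```python
# Hypothetical illustration of proxy-driven bias: synthetic data only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Underlying (unobserved) offending rate is identical for everyone.
true_offense = rng.binomial(1, 0.05, size=n)

# A socioeconomic proxy (e.g. neighbourhood income band) that in practice
# correlates with race; here it simply splits the population in two.
low_income_area = rng.binomial(1, 0.5, size=n)

# Historical arrests: over-policed (low-income) areas see a higher arrest
# rate for the same behaviour, so the training labels are biased.
arrest_prob = 0.9 * true_offense * np.where(low_income_area == 1, 1.0, 0.4)
arrested = rng.binomial(1, arrest_prob)

# Train a "risk" model on arrest records, using only the proxy feature.
X = low_income_area.reshape(-1, 1)
model = LogisticRegression().fit(X, arrested)

# The model assigns higher risk to low-income areas even though the
# underlying offending rate was equal by construction.
print("Predicted risk, low-income area :", model.predict_proba([[1]])[0, 1])
print("Predicted risk, other areas     :", model.predict_proba([[0]])[0, 1])
print("True offence rate (both groups) :", true_offense.mean())
```

Because the labels encode historical policing intensity rather than underlying behaviour, the model scores the over-policed group as higher risk, which is the feedback loop described above.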

Resources:

