Digital Technologies: Flagged and Failed

Context
Your predictive policing software, marketed for crime-hotspot identification, is trained on historical arrest data that over-represents Black communities. Even without explicit racial inputs, it uses socioeconomic factors as racial proxies and so amplifies systemic racism. Critics warn of "tech-washing" that masks deep inequities.
Dilemma
A) Halt sales and redesign the software to explicitly mitigate bias and ensure transparency, even if this reduces headline accuracy (as measured against the biased data) or profits.
B) Sell the software as is, securing a major contract and market share, despite reinforcing existing societal biases.
Summary
Predictive policing algorithms perpetuate systemic racism because of biased training data and a lack of transparency. These tools, used to predict crime hotspots or individual risk, rely on arrest data drawn disproportionately from over-policed Black communities. Even without explicit racial inputs, the algorithms pick up proxies such as socioeconomic factors, reinforcing existing biases. Critics argue that, far from reducing bias, these systems amplify it, creating a "tech-washing" effect that masks underlying inequities. Calls are growing to dismantle them, as they feed a cycle of discriminatory policing.
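To make the proxy mechanism concrete, the toy simulation below is a minimal sketch, not the vendor's actual system: every feature name, rate, and neighborhood label is hypothetical. It trains a classifier on arrest labels generated under unequal policing intensity. The model never sees race or neighborhood directly, yet it assigns higher risk to the over-policed group through a correlated socioeconomic proxy.

```python
# Toy illustration only (hypothetical data): how a classifier trained on
# biased arrest labels targets one group via a socioeconomic proxy,
# even though no racial or neighborhood feature is given to the model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Two neighborhoods with the SAME underlying offense rate (10%).
neighborhood = rng.integers(0, 2, size=n)          # 0 = A, 1 = B
true_offense = rng.random(n) < 0.10

# Historical policing is heavier in neighborhood B, so its offenses are
# recorded as arrests far more often -> biased training labels.
detection_rate = np.where(neighborhood == 1, 0.80, 0.20)
arrested = true_offense & (rng.random(n) < detection_rate)

# A socioeconomic "proxy" feature correlated with neighborhood, plus noise.
proxy = neighborhood + rng.normal(0, 0.5, size=n)
X = proxy.reshape(-1, 1)

model = LogisticRegression().fit(X, arrested)
risk = model.predict_proba(X)[:, 1]

# The model assigns higher "risk" to neighborhood B purely through the proxy,
# although the true offense rates were identical by construction.
for g in (0, 1):
    print(f"neighborhood {'AB'[g]}: "
          f"true offense rate = {true_offense[neighborhood == g].mean():.2f}, "
          f"mean predicted risk = {risk[neighborhood == g].mean():.2f}")
```

Note that any accuracy score computed against these arrest labels rewards exactly this skew, which is why option A treats reported accuracy with caution.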
Resources:
- https://www.technologyreview.com/2020/07/17/1005396/predictive-policing-algorithms-racist-dismantled-machine-learning-bias-criminal-justice/
- https://www.aclu-mn.org/en/news/biased-technology-automated-discrimination-facial-recognition