Educational Technology: Racial bias

Context
Your company sells online test proctoring software. The system uses facial recognition and biometrics to detect cheating. However, there is evidence that it disproportionately flags marginalized students, including students of color, transgender students, and students with disabilities.
Dilemma
A) Halt use of your proctoring software, publicly acknowledge its biases, and invest in a complete redesign with independent ethical oversight, forgoing significant current revenue.
B) Continue selling the software and publicly defend its utility in upholding academic integrity.
Summary
The COVID-19 pandemic significantly boosted online test proctoring, with companies like Proctorio seeing massive growth. However, this AI-powered surveillance technology, which records students' cameras, audio, and movements, faces intense criticism. Experts argue that its facial recognition and movement-tracking algorithms are biased, causing it to disproportionately flag marginalized students and thereby reinforce white supremacy, sexism, ableism, and transphobia.
Beyond discrimination, the software constitutes a significant privacy invasion, filming students' homes and giving professors access to the recordings. Critics also note a lack of peer-reviewed evidence that it is effective at preventing cheating. Many advocate abandoning these tools entirely, calling for compassion over surveillance and trust in students.
Resources:
Last modified: 06 June 2025, 2:33 p.m.