Companies such as Proctorio, Respondus, ProctorU, HonorLock, Kryterion Global Testing Solutions, and Examity have adapted human-proctoring models into software that proctors exams algorithmically. The automated proctoring program records audio and video of students while its algorithms flag conduct it deems dishonest. For these programs to recognize behavior, the algorithm is trained on a dataset from which it forms a baseline of normal behavior, and deviations from that baseline drive its decisions.
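The learn-a-norm, flag-the-deviation logic described above can be sketched as a minimal anomaly detector. Everything here is hypothetical: the feature (head movements per minute), the two-standard-deviation threshold, and the sample numbers are illustrative choices, not any vendor's actual model. The point is only the structure: whatever the training data treats as normal becomes the standard every student is measured against.

```python
import statistics

def build_baseline(training_samples):
    # Form a baseline (mean and standard deviation) from a training
    # dataset of per-session measurements of one behavioral feature,
    # e.g. head movements per minute (a hypothetical feature).
    mean = statistics.fmean(training_samples)
    stdev = statistics.stdev(training_samples)
    return mean, stdev

def flag_session(value, baseline, threshold=2.0):
    # Flag a session whose feature deviates from the baseline by more
    # than `threshold` standard deviations (a simple z-score test).
    mean, stdev = baseline
    z = abs(value - mean) / stdev
    return z > threshold

# Baseline built from sessions the system's designers treated as "normal".
baseline = build_baseline([4, 5, 6, 5, 4, 6, 5])

# A student who moves often, e.g. because of a neuromuscular condition,
# falls far outside the learned norm and is flagged.
print(flag_session(14, baseline))
```

Nothing in this rule distinguishes misconduct from disability or circumstance; it only measures distance from whatever population the baseline was built on.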
Eugenics is the practice of selecting desired heritable characteristics in an attempt to improve the future human population. A gaze is a societal outlook that creates and maintains a power disparity between groups of people. The eugenic gaze analyzes people’s bodies and behaviors, compares them to an idealized model, and punishes those who do not fit the norm. The algorithms in online proctoring services apply the eugenic gaze by evaluating students’ bodies and behaviors through facial recognition software, characterizing the features and actions associated with an ideal student. Any deviation from the cisgender, male, able-bodied, white model may be flagged as suspicious, putting the student at risk of being accused of academic misconduct. The eugenic gaze within these programs is sexist, racist, and ableist. The companies that build these technologies exploit higher education’s drive to prevent academic dishonesty.
The programs flag loud noises or any motion away from the camera as suspicious activity; however, not all students have a quiet, distraction-free environment. Students with neuromuscular disorders or other medical conditions that make it difficult to sit still for long periods are put at a disadvantage: if their behaviors do not satisfy the algorithmic model, they will be marked as suspicious. These programs also have difficulty identifying students with darker skin, who are frequently asked to shine more light on their faces in order to be recognized. Asian students have likewise reported that the programs struggle to distinguish among different Chinese students. Before beginning an exam, students must hold their ID in front of the camera, which may be distressing for students who identify as trans or non-binary: not only is the algorithm likely to flag them as suspicious, but they may also be outed to their instructors, something they may not consent to and may not even be aware of. The instructor and the proctoring company receive access to the recorded audio and video and can keep the recordings for as long as they want, a serious privacy concern.
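The noise and motion rules described above can be illustrated with a deliberately naive sketch. The function names, threshold values, and frame counts below are invented for illustration and do not reflect any vendor's actual implementation. What the sketch makes visible is that a fixed rule applied to raw sensor readings flags a student's circumstances, not their conduct.

```python
def flag_events(audio_db_samples, face_visible_samples,
                noise_threshold_db=70, max_absent_frames=30):
    # Naive rule-based flagging (illustrative only):
    # 1) flag every audio sample louder than a fixed decibel threshold;
    # 2) flag any stretch where the face stays out of frame too long.
    flags = []
    for i, db in enumerate(audio_db_samples):
        if db > noise_threshold_db:
            flags.append(("loud_noise", i))
    absent = 0
    for i, visible in enumerate(face_visible_samples):
        absent = absent + 1 if not visible else 0
        if absent == max_absent_frames:
            flags.append(("face_absent", i))
    return flags

# A student in a shared, noisy household and one who briefly leaves frame
# (e.g. to stretch for a medical reason) both accumulate flags,
# regardless of whether any misconduct occurred.
audio = [55, 72, 60]                       # one loud moment at index 1
faces = [True] * 5 + [False] * 30 + [True] * 5
print(flag_events(audio, faces))
```

Because the rule has no notion of intent or context, the same behavior that is invisible in a quiet private room becomes "suspicious activity" in a crowded home.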