

The coronavirus pandemic has been a boon for the test proctoring industry. About half a dozen companies in the US claim their software can accurately detect and prevent cheating in online tests. Examity, HonorLock, Proctorio, ProctorU, Respondus and others have rapidly grown since colleges and universities switched to remote classes.

While there’s no official tally, it’s reasonable to say that millions of algorithmically proctored tests are happening every month around the world. Proctorio told the New York Times in May that business had increased by 900% during the first few months of the pandemic, to the point where the company proctored 2.5 million tests worldwide in April alone.
I’m a university librarian and I’ve seen the impacts of these systems up close. My own employer, the University of Colorado Denver, has a contract with Proctorio.

It’s become clear to me that algorithmic proctoring is a modern surveillance technology that reinforces white supremacy, sexism, ableism, and transphobia. The use of these tools is an invasion of students’ privacy and, often, a civil rights violation.

If you’re a student taking an algorithmically proctored test, here’s how it works: When you begin, the software starts recording your computer’s camera, audio, and the websites you visit. It measures your body and watches you for the duration of the exam, tracking your movements to identify what it considers cheating behaviors. If you do anything that the software deems suspicious, it will alert your professor to view the recording and provide them with a color-coded probability of your academic misconduct.
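To make that pipeline concrete, here is a minimal, hypothetical sketch of the kind of heuristic flagging these products describe. No proctoring vendor publishes its actual model, so every signal name, threshold, and scoring rule below is invented for illustration; real products layer machine learning and biometrics on top of signals like these.

```python
from dataclasses import dataclass

@dataclass
class FrameSignals:
    """Signals a proctoring tool might extract from one camera frame (all hypothetical)."""
    face_detected: bool    # face detection found the test-taker
    gaze_on_screen: bool   # eye tracking says the eyes are on the screen
    audio_level: float     # microphone loudness, 0.0 to 1.0
    people_in_frame: int   # person count from the camera

def suspicion_score(frames: list[FrameSignals]) -> float:
    """Fraction of frames flagged by crude rules; stands in for a real model."""
    if not frames:
        return 0.0
    flagged = sum(
        1 for f in frames
        if not f.face_detected        # left the camera's view
        or not f.gaze_on_screen       # looked away from the screen
        or f.audio_level > 0.6        # noise in the room
        or f.people_in_frame != 1     # someone else entered the frame
    )
    return flagged / len(frames)

def color_code(score: float) -> str:
    """Map the score to the color-coded rating a professor would see."""
    if score < 0.2:
        return "green"
    if score < 0.5:
        return "yellow"
    return "red"

# A parent whose child interrupts, then a student who steps away to take
# medication, are flagged exactly as a cheater would be:
session = [
    FrameSignals(face_detected=True, gaze_on_screen=True, audio_level=0.1, people_in_frame=1),
    FrameSignals(face_detected=True, gaze_on_screen=False, audio_level=0.8, people_in_frame=2),
    FrameSignals(face_detected=False, gaze_on_screen=False, audio_level=0.2, people_in_frame=0),
]
print(color_code(suspicion_score(session)))  # -> "red"
```

Even this caricature shows why the everyday situations described below, such as background noise or leaving the camera’s view, read as “suspicious” to such a system.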

Depending on which company made the software, it will use some combination of machine learning, AI, and biometrics (including facial recognition, facial detection, or eye tracking) to do all of this. The problem is that facial recognition and detection have proven to be racist, sexist, and transphobic over, and over, and over again.

In general, technology has a pattern of reinforcing structural oppression like racism and sexism. Now these same biases are showing up in test proctoring software that disproportionately hurts marginalized students.

A Black woman at my university once told me that whenever she used Proctorio’s test proctoring software, it always prompted her to shine more light on her face. The software couldn’t validate her identity, and she was denied access to tests so often that she had to go to her professor to make other arrangements. But if you’re a white cis man (like most of the developers who make facial recognition software), you’ll probably be fine. Similar kinds of discrimination can happen if a student is trans or non-binary.

Students with children are also penalized by these systems. If you’ve ever tried to answer emails while caring for kids, you know how impossible it can be to get even a few uninterrupted minutes in front of the computer. But several proctoring programs will flag noises in the room or anyone who leaves the camera’s view as nefarious. That means students with medical conditions who must use the bathroom or administer medication frequently would be considered similarly suspect.

Beyond all the ways that proctoring software can discriminate against students, algorithmic proctoring is also a significant invasion of privacy. These products film students in their homes and often require them to complete “room scans,” which involve using their camera to show their surroundings. In many cases, professors can access the recordings of their students at any time, and even download these recordings to their personal machines.
