Column: Respondus is a dystopian nightmare

Luke Taylor

Online classes are breaking down the barriers between school time and personal time, and some of those breakdowns are more innocuous than others.

We’ve all noticed it: weekends don’t feel like weekends because school and work are now 24/7. On days off, many of us still have a nagging feeling that we need to be checking our online classes. Professors seem to forget that we still have jobs and need time to rest.

These are all somewhat expected consequences of online classes and a global pandemic, but one crossover between the classroom and home took me by surprise.

A few days before writing this, I wrote an article about the Respondus Browser.

This program aims to prevent cheating by recording video and audio of students as they take tests.

That already creeped me out, but it went further. The program tracks faces in the videos to see where students are looking. It also uses photo ID to confirm they are who they say they are.

It also tracks keystrokes and clicks, then analyzes that data with an undisclosed algorithm to flag suspicious behavior.

Further details about the program are available on the company’s website, but what I’ve described was already enough to make me hope I’m never required to use the Respondus Browser.

While I understand that preventing cheating in online courses is difficult, these measures feel extreme.

Last year, a New York Times article quoted professors and students alike who felt that there “has to be a better way.”

While I don’t agree with one quote that said this was like “communist Russia” (communism is not when computers are scary), I fully share the sentiment that this technology feels straight out of a dystopian novel.

Picture it: thousands of employees, probably at a huge corporation like Amazon, hunch over computers at lines and lines of desks. They can’t look up, they can’t stop typing for too long, they can’t step away from their computers, or dollars will be deducted from their paycheck for low productivity.

Yes, I’m being dramatic, but it seems bizarre to me to expect a computer algorithm to reliably interpret human behavior this way.

Sometimes you just need to stop staring at a screen; that doesn’t mean you’re looking at a sheet of test answers.

What if someone knocks on the door? Do you say, “Sorry, come back later,” and trigger the technology’s sensors? Or will the knock itself be interpreted as Morse code secretly sharing answers?

There does have to be another way.

Have tests in person, socially distanced.

Or, and stay with me here, try trusting students to behave like the adults we’re meant to be.


Luke Taylor is a sophomore journalism major. He can be reached at 581-2812 or [email protected].