Letter: ProctorU Raises Ethical Concerns
October 8, 2020
Malcolm opens the computer he has borrowed from a friend. He normally takes his exams on his Chromebook, tablet or phone, but his biology class uses proctoring software that only works with Macs or PCs. He let his three roommates know earlier this week that he has a quiz to take, and that any movement or noise ProctorU picks up could earn him a zero. They understand because two of their own classes have the same online proctoring requirement, and they made plans to work outside the house today. He’s secretly relieved because the bandwidth in their house isn’t very strong, and if his connection goes out (as it frequently does when all four of them have class at once) he will receive a zero. He logs in to his test and attempts the identity verification. The software doesn’t recognize him, which means he can’t start his quiz. Panicking, he emails his professor, who responds: “this has been happening to other Black students this week as well…”
Malcolm’s situation, though fictional, mirrors what thousands of students have experienced this semester since the University of Utah entered into its contract with ProctorU, an online proctoring company. And it represents only the tip of the iceberg of the issues associated with this technology. ProctorU and its competitors have gained national attention for problems with their algorithms, including racial bias; flagging neurodivergent students for their eye movements, hyperactivity, or reading questions to themselves; a data breach of 400,000 student accounts; and the selling of personal information. Additionally, ProctorU’s privacy policy reserves the right to transfer personal information if the company is ever sold or goes bankrupt. The same policy would allow the company to disclose information to ICE, potentially putting undocumented students at the U directly in harm’s way. Furthermore, ProctorU shares information with third-party applications and websites through API integrations. Students enrolled in classes that mandate this software are placed in harmful and impossible situations, since they are given no alternative way to comply with testing requirements. Even setting aside these grave concerns, it’s worth considering how this level of surveillance — which requires students to submit scans of their rooms and to sit almost perfectly still and silent, in some cases for hours at a time — erodes the relationship between instructors and students.
In her book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, Virginia Eubanks offers two “gut-check” questions to help determine whether an automated tool should be deployed on poor populations. The questions are:
1. Does the tool increase the self-determination and agency of the poor?
2. Would the tool be tolerated if it were targeted at non-poor people?
While Eubanks uses these questions to critique relationships between the administrative state and the poor, her questions are helpful ethical benchmarks for any relationship which features imbalanced power dynamics — including classrooms.
For the first question, algorithmic proctoring systems don’t even pass the sniff test. They have been criticized for normalizing a “Eugenic Gaze” because they are prone to flagging neurodivergent individuals, or individuals with neuromuscular disorders, as potential cheaters, and proctoring systems that use facial recognition are less likely to accurately identify people with darker skin. Further, invasive tools like ProctorU rely heavily on students having reliable internet access. Even brief lapses in bandwidth may flag a student as a potential cheater, a burden that invariably falls hardest on low-income students. And even where a good appeals process exists, low-income, first-generation and otherwise marginalized students often struggle to navigate institutional bureaucracy.
As for the second question, university administrators and professors should seriously consider whether such technology would be tolerated if it were targeted at them. Proctoring systems gather tremendous amounts of intimate data about the students they surveil — including audio and video recordings of the inside of students’ homes, screen captures and key logs, and even approximate location data. Such intrusive surveillance of professors and administrators — if it were allowed at all — would at the very least be met with intense scrutiny and calls for transparent and accountable oversight of its use. We should demand nothing less for students.
What Can Be Done?
A useful start would be the creation of an accessible and intelligible register disclosing all third-party uses of student data, particularly uses by automated tools. One model is the City of Amsterdam’s Algorithm Register, which provides, for each tool: a summary of the tool and its uses; an explanation of what data the tool uses and generates (and who owns and manages that data); what decisions it is empowered to make; who is responsible for human oversight and appeals processes; and what risks or known failures are associated with the tool’s use.
Second, the Academic Senate should have a permanent committee, with graduate and undergraduate student representation, to vote on and approve university contracts with third-party organizations that will be privy to sensitive data. Students, whenever possible, should have a viable option to opt out.
Last, instructors should simply avoid such intrusive technology whenever possible. If a proctor and ID verification are required, a simple process of logging in over video-conferencing software to manually check an ID would suffice. Often, however, well-designed exams and other measures of student achievement don’t require a proctor at all. Instructors should make use of the University’s Center for Teaching and Learning Excellence to design assignments that measure students’ higher-order learning and understanding of the material — the analysis, synthesis and application of ideas — rather than rote memorization. Such exams may be open-note and open-book, like the “real world,” where students have 24/7 access to magic rocks with complete world knowledge, but still pose rigorous intellectual challenges.
— Devon Cantwell and Zach Stickney, University of Utah Graduate Instructors
Melanie • Oct 21, 2020 at 4:43 pm
As someone who has been testing using ProctorU and other remote proctors for years, this op-ed doesn’t pass the sniff test. Describing your scenario as fictional is really the only valid thing in this article. The privacy policy linked is pretty standard. I know of non-profits that are collecting more personal data than ProctorU. None of the data is being sold to 3rd parties. The data that is collected does not contain PII. Cookies can be opted out of, again, pretty standard. Your personal data is more compromised by Twitter or Facebook. This feels like an extremist view from people who are resisting the changes that need to happen to ensure the safety of our communities right now. I am personally grateful to be able to test from the safety of my own home. I am aware of what is required to do so and hold myself accountable for being prepared for testing. None of what is required is unreasonable. Quite honestly, the only reason I can see to resist online proctoring is that the students are less likely to get away with cheating. That’s a welcome side-effect, in my opinion.
zek • Oct 12, 2020 at 9:35 am
Was good