
In re Online Test Proctoring Companies

Calling on the D.C. Attorney General to take enforcement action against five online test proctoring firms

On December 9, 2020, EPIC filed a complaint with the Office of the Attorney General for the District of Columbia alleging that five major providers of online test proctoring services have engaged in unfair and deceptive trade practices in violation of the D.C. Consumer Protection Procedures Act (DCCPPA) and the Federal Trade Commission Act. Specifically, EPIC's complaint charges that Respondus, ProctorU, Proctorio, Examity, and Honorlock have engaged in excessive collection of students' biometric and other personal data and have routinely relied on opaque, unproven, and potentially biased AI analysis to detect alleged signs of cheating. On the same day, EPIC sent letters to all five firms warning that EPIC is prepared to bring suit under the DCCPPA unless the companies agree to limit their collection of personal data, comply with basic requirements for trustworthy AI, and submit to annual third-party audits. EPIC aims to ensure that students are not subjected to unfair, unreliable AI determinations or forced to choose between preserving their privacy and receiving an education.

Background

Overview

Since the start of the COVID-19 pandemic, educational institutions have rapidly accelerated their adoption of online test proctoring systems as part of the shift to remote learning. Online proctoring systems collect extensive personal information from students during remote exam sessions, including video, audio, keystroke patterns, and other biometric data captured through students' computers. This vast array of personal data is analyzed by artificial intelligence systems (and in some cases live proctors) to assign a risk score or otherwise flag students for possible indications of cheating.

The growth of online test proctoring has all but forced many students to trade away their privacy rights in order to meet their academic obligations. Students enrolled in a class that uses a remote proctoring system cannot opt out of personal data collection or video surveillance of their intimate surroundings; cannot avoid the system's use of facial recognition and AI analysis; and are generally denied access to the system's underlying logic and determinations. The result is an opaque, unaccountable platform that can flag a student for cheating based on little more than atypical eye movements or unexpected shadows.

Even when a proctoring system does not raise a false flag, which can carry severe disciplinary consequences for the student, the system inherently invades students' privacy. Pointing a camera into a student's home for hours at a time can reveal sensitive details about a student's physical features, behaviors, disabilities, and family members and can induce undue stress on test-takers that undermines the integrity of exam sessions.

Moreover, the fundamental fairness of online proctoring systems has been called into serious doubt. Research has shown that AI—particularly facial recognition systems—can encode bias and disproportionately harm students of color and students with disabilities. Test-takers subjected to AI-based proctoring have reported that the systems struggle to recognize faces of color and flag students with certain disabilities at higher rates.

Online Test Proctoring Platforms

EPIC's complaint addresses the five largest providers of online test proctoring systems: Respondus, ProctorU, Proctorio, Examity, and Honorlock.

Respondus, the most widely used online proctoring service in the United States, offers several remote proctoring tools. One in particular, Respondus Monitor, uses a student's webcam and microphone to produce a recording of the student during an exam session. Respondus Monitor's AI analyzes facial imagery, motion, lighting, keyboard activity, mouse movements, and hardware changes, and compares each student's session to those of other students who took the same exam, to detect purported instances of cheating.

ProctorU, which proctored over 2 million tests from more than 750,000 students last year and has "expanded rapidly" during the pandemic, offers several remote proctoring tools. One in particular, Review+, offers a "live proctored launch, end-to-end recording solution with artificial intelligence, professional review and incident reporting." ProctorU requires each student to provide camera access, microphone access, screen access, and a photo ID to begin an exam. ProctorU uses facial recognition software and biometric keystroke measurements to authenticate a student's identity and monitors a student's screen, camera, and microphone continually during an exam. ProctorU also uses AI to flag certain behavior such as lighting changes and unusual noises. For example, ProctorU's AI monitors for low audible voices, slight lighting variations, and "other behavioral cues."

Proctorio uses a Google Chrome browser extension to collect video, audio, and screen captures and to create a recording of a student's exam session. Proctorio's system monitors students "by webcam, microphone, browser, desktop, or any other means necessary to uphold integrity." Proctorio tracks speech, eye movements, mouse clicks, and how long each student took to complete an exam in order to calculate a "suspicion level" for the student. This level is calculated from "the aggregation of frames during the exam which were deemed suspicious and the detection of abnormal behavior." Proctorio claims, without proof, that its software "eliminates human error [and] bias[.]"
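Proctorio does not publish the logic behind this score, so the following is only a minimal illustrative sketch of how a "suspicion level" might be aggregated from per-frame flags and an abnormal-behavior signal. The Frame fields, weights, and thresholds are hypothetical and do not describe Proctorio's actual method.

```python
# Hypothetical sketch only: Proctorio does not disclose its scoring logic.
# This illustrates how a "suspicion level" could be aggregated from per-frame
# flags plus a behavioral signal; every field, weight, and threshold here is
# invented for illustration.
from dataclasses import dataclass

@dataclass
class Frame:
    face_visible: bool      # was a face detected in this frame?
    gaze_on_screen: bool    # was the test-taker looking at the screen?
    audio_anomaly: bool     # did the microphone pick up unexpected sound?

def suspicion_level(frames: list[Frame], completion_seconds: float,
                    class_median_seconds: float) -> float:
    """Return a 0-100 score from the share of 'suspicious' frames plus
    a penalty for finishing unusually fast relative to the class."""
    if not frames:
        return 0.0
    flagged = sum(
        1 for f in frames
        if not f.face_visible or not f.gaze_on_screen or f.audio_anomaly
    )
    frame_score = flagged / len(frames)   # fraction of frames deemed suspicious
    speed_penalty = 0.2 if completion_seconds < 0.5 * class_median_seconds else 0.0
    return min(100.0, 100.0 * (frame_score + speed_penalty))
```

Even a toy heuristic like this makes clear why a briefly obscured face, a glance away from the screen, or an unusually fast completion time can drive the score, which is precisely the kind of opaque determination the complaint objects to.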

Examity's online proctoring tools fall into two categories: "Automated Proctoring" and "Live Proctoring." Examity offers two Automated Proctoring options: Automated Standard, which takes an image of a student's official ID, creates a digital "signature" from a student's keystrokes, and produces a recording of the exam with time-stamped comments; and Automated Premium, which provides the same services as Automated Standard but also includes a human audit of the authentication, exam, and AI-based findings. Examity also offers two Live Proctoring options: Live Standard, which combines the features of Automated Proctoring with human authentication and review (plus a required 360° camera sweep of the student's workspace); and Live Premium, which has the same capabilities as Live Standard but includes a live proctor for the duration of the exam. Examity states that it "may collect" a biometric record from students, defined as "a record of one or more measurable biological or behavioral characteristics that can be used for automated recognition of an individual, such as fingerprints, retina and iris patterns, voiceprints, DNA sequence, facial characteristics, and handwriting." Examity's "flag system" relies on AI that analyzes behaviors, such as a student's typing rhythm, to detect purported instances of cheating.
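Examity does not explain how it builds this keystroke "signature," but keystroke dynamics generally works by profiling the timing between key presses and comparing a new sample against the enrolled profile. The sketch below is a hypothetical illustration of that general technique, not Examity's implementation; the function names and the tolerance value are invented.

```python
# Hypothetical sketch of keystroke dynamics, not Examity's actual system.
# An enrolled timing profile is compared against a live sample; the tolerance
# value is arbitrary and for illustration only.
from statistics import mean

def keystroke_profile(press_times: list[float]) -> list[float]:
    """Inter-key latencies: seconds between successive key presses."""
    return [t2 - t1 for t1, t2 in zip(press_times, press_times[1:])]

def matches_profile(enrolled: list[float], sample: list[float],
                    tolerance: float = 0.08) -> bool:
    """Crude comparison: mean absolute difference of latencies, truncated
    to the shorter sequence, must fall within the tolerance."""
    n = min(len(enrolled), len(sample))
    if n == 0:
        return False
    diffs = [abs(e - s) for e, s in zip(enrolled[:n], sample[:n])]
    return mean(diffs) <= tolerance
```

A production system would use richer features (key dwell times, digraph latencies, statistical models), but the core idea of matching a live typing sample against an enrolled timing profile is the same.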

Honorlock uses a browser extension to record and review students during exams and employs AI to monitor exam sessions and generate incident reports. Honorlock collects video and audio recordings of test-takers, monitors desktop activity, and tracks webpages visited by each student during an exam. Honorlock uses a combination of AI proctoring and live proctoring. While a student takes an exam, Honorlock's AI monitors the session and will trigger a live proctor to drop in on the student's feed if the AI "senses that something is wrong." For example, Honorlock's AI may trigger a live proctor when a student diverts their eyes from the screen or gets up from their desk.

Unfair and Deceptive Trade Practices

EPIC's complaint identifies four categories of unfair and deceptive trade practices that the five test proctoring companies have engaged in. Each category violates both the DCCPPA and the FTC Act.

A. Unfair and Deceptive Collection of Excessive Personal Data

Despite claims by Respondus, ProctorU, Proctorio, Examity, and Honorlock that the firms only collect as much personal information as is required to provide test proctoring services, each company has in fact collected excessive amounts of students' personal data. Forcibly collecting personal information from test-takers, including sensitive biometric data, is inherently invasive. Such collection deprives students of control over their personal data; can reveal intimate details about a student's physical features, behaviors, disabilities, family members, and home; and can induce undue stress on test-takers that undermines the integrity of exam sessions—the very thing that test proctoring systems are ostensibly meant to guard against. Moreover, students enrolled in a course that uses online proctoring cannot reasonably avoid the collection of their personal information. Given the unavoidable harms suffered by students subject to online proctoring; the failure of test proctoring firms to provide a reasonable basis for such broad data collection; and the lack of countervailing benefits to consumers, EPIC alleges that each of the five companies has engaged in unfair business practices.

B. Unfair Use of Opaque, Unproven AI Systems

Respondus, ProctorU, Proctorio, Examity, and Honorlock use AI to analyze extensive personal data and purportedly identify signs of cheating. Yet these companies routinely fail to disclose to students the logic, factors, and determinations of those AI systems. The use of opaque and secret AI systems and algorithms prevents students from understanding how their biometric information is analyzed, making it impossible for students to consent to such uses. And students enrolled in a course that uses online proctoring cannot reasonably avoid these opaque, unproven, and potentially biased AI systems. Given the unavoidable harms suffered by students subject to AI evaluation and the lack of countervailing benefits to consumers, EPIC alleges that each of the five companies has engaged in unfair business practices. EPIC's complaint further explains how the companies' uses of AI violate the OECD AI Principles and the Universal Guidelines for Artificial Intelligence.

C. Proctorio and Honorlock’s Deceptive Uses of Facial Recognition

Proctorio and Honorlock have engaged in deceptive trade practices by misrepresenting their use of facial recognition technology. Proctorio and Honorlock each claim that, rather than using facial recognition, they use "facial detection" technology. Proctorio contends that facial detection technology "is used to see if a test taker is looking away from the screen for an extended period of time, if there are other people present in the test-taking environment, or if the test taker has left the exam for any reason," and Honorlock claims that it "only detects that there is a clear human face in the webcam." But as the FTC explained in a 2012 report, facial detection is facial recognition—not a separate technology. As the FTC wrote: "[C]ompanies are deploying facial recognition technologies in a wide array of contexts, reflecting a spectrum of increasing technological sophistication. At the simplest level, the technology can be used for facial detection; that is, merely to detect and locate a face in a photo." Accordingly, EPIC alleges that both firms' claims concerning facial recognition are misleading and deceptive.
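The FTC's "spectrum" framing is easier to see in a short sketch. The example below assumes the third-party OpenCV library and its bundled Haar cascade model and is not any company's actual code; it performs only facial detection, which is the front end of any system that goes on to identify a face, rather than a separate technology.

```python
# Illustrative sketch only (assumes the third-party OpenCV library); this is
# not Proctorio's or Honorlock's code. It performs "facial detection" in the
# FTC's terminology: locating faces in an image without identifying them.
import cv2

# Haar cascade face detector bundled with the opencv-python package.
_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def detect_faces(image_path: str) -> list[tuple[int, int, int, int]]:
    """Return (x, y, width, height) bounding boxes for faces in the image."""
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    return [tuple(int(v) for v in box)
            for box in _detector.detectMultiScale(gray, scaleFactor=1.1,
                                                  minNeighbors=5)]

# A recognition system would continue from here: crop each detected box,
# compute a face embedding, and compare it against enrolled identities.
# Detection is that same pipeline stopped at its first stage.
```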

D. Deceptive Claims About the Reliability of Test Proctoring Systems

Respondus, ProctorU, Proctorio, Examity, and Honorlock represent that their online test proctoring services can reliably detect signs of cheating. Yet many students subjected to these proctoring systems have reported groundless accusations of cheating, major technical glitches, indications of racial bias in the companies' facial recognition algorithms, and signs that students with disabilities or atypical patterns of movement are disproportionately flagged as potential "cheaters." EPIC argues that these reports are a sufficient basis for the Attorney General to investigate whether the systems do, in fact, detect signs of cheating. EPIC also argues that the companies' unverified claims of reliability are overstated and deceptive.

EPIC's Interest

EPIC is a longstanding advocate of legal, ethical, and human rights safeguards for the use of artificial intelligence. As EPIC has explained, "Algorithmic accountability is a complex topic, but the impact cuts broadly across life in America, from jobs and credit to housing and criminal justice." EPIC has also warned of the urgency to act now: "The United States must work with other democratic countries to establish red lines for certain AI applications and ensure fairness, accountability and transparency as AI systems are deployed."

EPIC has specifically urged regulators to respond to the emergence of commercial AI practices that are unfair or deceptive. In February 2020, EPIC petitioned the FTC to conduct a rulemaking on the use of AI and filed a complaint highlighting Airbnb's secret customer screening algorithm. In 2019, EPIC filed a complaint against recruiting company HireVue alleging that the company falsely denied using facial recognition and failed to comply with baseline standards for AI decision-making. And in 2018, EPIC filed a complaint against Facebook concerning the company's facial recognition practices.

EPIC also publishes the AI Policy Sourcebook, the first reference book on AI policy.

Legal Documents

Resources

EPIC's Complaint in the News

Online Proctoring in the News
