Privacy and Equity Considerations for Instructors Remote Automated Proctoring Software: Respondus Monitor

Pilot Project Background

In April 2020, during the COVID-19 crisis, a small group of instructors at Lethbridge College (LC) expressed interest in piloting software used for remote automated proctoring. Their interest arose from the unavailability of the usual face-to-face proctoring options, both in-class and in the testing centre. Examity—the approved proctoring solution at LC—was unavailable due to its own office closures. These instructors anticipated an increase in the number of assessments delivered online due to COVID and were looking for a proctoring solution that would deter cheating in their Spring/Summer courses.

The Centre for Teaching, Learning and Innovation (CTLI) initiated a short pilot project given several important factors: Respondus Monitor (RM)—the remote automated proctoring solution used in the pilot—relies on technology, not live proctors, to detect suspicious behaviours; such software was becoming increasingly common, even prior to the pandemic; and students' and instructors' experience of this type of software was not well understood. In addition to investigating instructor and student perceptions of RM, the pilot project also set out to learn more about potential privacy and equity-related issues as reflected in recent popular and academic literature concerning exam proctoring. The use of automated proctoring software does raise ethical and equity-related concerns—as documented in this report. In the context of teaching during a pandemic, popular online publications such as Inside Higher Ed and Educause have drawn attention to various issues:

Online proctoring has surged during the coronavirus pandemic, and so too have concerns about the practice, in which students take exams under the watchful eyes (human or automated) of third-party programs (Flaherty, 2020, para 2).

In their haste to deploy some forms of remote proctoring, institutions are spending money they don’t have to acquire products they don’t fully understand. Cost and concerns about students’ privacy are the most widespread challenges of adopting online proctoring solutions. Three in ten institutions are considering an alternative approach. Rather than moving traditional assessment online, they are exploring the idea of changing assessment entirely to adopt more authentic demonstrations of knowledge and skills (Grajek, 2020, para 4).

Students, likewise, have expressed concerns, having initiated petitions to ban varying types of proctoring software at a number of Canadian institutions including, but not limited to, the University of Ottawa (Benning, 2020), Carleton University (Aleman, 2020), the University of Regina (Sandin, 2020), and the University of Alberta (Wong, 2020), in addition to a variety of U.S. institutions (Kelly, 2020). As noted in the 2020 Educause Horizon Report (Brown et al., 2020), concerns expressed by students, such as those discussed in the articles above, illustrate that social trends concerning equity and fair practices, as well as well-being and mental health, are finding their way into education. Perhaps this helps to explain, in part, the dissatisfaction expressed by such a large subset of students.

Under a short timeline, a team from the CTLI was assembled to develop a small-scale, one-semester pilot project on the use of Respondus Monitor in response to the instructors' request. The project entailed configuring the Learning Management System (Canvas); self-paced instructor training; provision of technical support to instructors; research; data collection; and reporting. The information acquired from the pilot is intended to help inform future decision making concerning the adoption and implementation of automated proctoring software at Lethbridge College.

The CTLI project team included:

  • Erin Howard: Associate Dean
  • Kyle Snowden: Senior Manager, Library and Digital Learning
  • Lorne Deimert: LMS Administrator
  • Cameron Reimer: Testing Services Coordinator
  • Andy Benoit: Educational Development Specialist

Project Methodology and Goals

The design of the pilot was modelled after the Rapid Approach to evaluating Information Technology (RAIT) model developed by Sinkinson, Werner and Sieber (2014). The CTLI has used this model on one previous occasion, to evaluate what was then called Lynda.com (Benoit, 2016). For the current pilot, two surveys were developed based on input from the project team and released online using Microsoft Forms. Question types included multiple choice, multiple response, and open-ended questions. The student survey consisted of twenty questions, which participating instructors were asked to review; per the RAIT model, questions encompassed perceived benefits, drawbacks, overall satisfaction, and ease of use. The instructor survey consisted of seventeen questions; per the RAIT model, questions encompassed demographic information, participant attitudes towards the technology, actions taken to prepare courses for online delivery, and overall satisfaction. Because the project was developed as a form of quality assurance, it did not require approval from the Research Ethics Board.

The pilot project identified two broad goals:

  1. For the CTLI to learn more about the strengths/weaknesses of automated proctoring software from the viewpoint of students and instructors and to investigate the issues of privacy and equity, as discussed in current literature.
  2. For students and instructors to experience the Respondus Monitor (RM) software in the context of their disciplines and classes, and to create an opportunity for them to share their experiences and perceptions.

Why a Discussion Paper?

The process of assessing student learning is integral to the academic success of students in the classroom and to their longer-term success following completion of their studies. Developing and administering assessments, followed by evaluating student work, presents a broad range of ethical considerations, most commonly issues of reliability, validity and fairness. Changes to the assessment process, especially where technology mediates the delivery, present additional issues that may amplify existing concerns or create new ones. For these reasons, the issue of remote, online proctoring merits discussion.

This discussion paper begins from the viewpoint that the process of integrating technology into one's teaching practice requires careful consideration—going beyond identifying what a technology can do to also account for its broader effects and consequences, especially as they relate to ethics, pedagogy and one's philosophy of teaching. This viewpoint is generally consistent with the definition of Educational Technology put forward by the Association for Educational Communications and Technology (AECT):

Educational Technology is the study and ethical practice [italics added] of facilitating learning and improving performance by creating, using and managing appropriate technological processes and resources (Januszewski & Molenda, 2013, p. 1).

This discussion paper will share what the Centre for Teaching, Learning and Innovation (CTLI) learned from the pilot project, including findings from available literature and from student and instructor survey feedback. The purpose of this discussion paper is to:

  1. Share information from a brief scan of recent literature on the topics of privacy and equity as they relate to remote automated proctoring.
  2. Summarize information from the pilot project concerning the student and instructor experience of Respondus Monitor.
  3. Facilitate discussion and decision making, and help instructors make an informed choice about whether to integrate remote proctoring software into their teaching practice.

Additional work is required at Lethbridge College to (1) explore how best to support instructors interested in developing assessments that do not require automated or human proctoring; (2) determine how best to support those instructors who do require proctoring software; and (3) identify whether Lethbridge College requires standard operating procedures that reflect consideration for privacy and equity-related issues.

Reports

Equity and Privacy Considerations for Instructors – full report (internal version for staff at Lethbridge College).

Equity and Privacy Considerations for Instructors – full report (external version not including pilot findings).

Comments are closed.