Perspectives II: A Series of Articles on Teaching at Langara during the Pandemic

Early in 2020, students all over the world experienced a sudden, dramatic, and unprecedented shift in their academic world. Four months on, we have found ourselves finishing one semester remotely, delivering another fully online, and preparing for at least two more semesters teaching and learning in the virtual environment. At the end of the spring semester, two of our colleagues collected information to help them understand the impact of this move to remote learning.

Our second contribution comes from Cameron MacDonald in the department of Biology. Cameron has worked with his colleagues to collect and analyze data on learning before and after the shift to remote teaching. In this piece, the authors statistically evaluate performance on online assessments compared to face-to-face assessments.

TCDC is providing a platform for instructors and instructional staff to share their reflections, opinions, and findings. If you want to contribute a piece to the perspectives series, please contact Jessica Kalra at

Online Assessments Benefit Biology Students that Performed Poorly in Face-to-Face Assessments

by Cam MacDonald, Anoush Dadgar, Ken Naumann, Melissa Hamilton, & Stephen Connor


At Langara, like at many Canadian postsecondary institutions, there was a sudden switch from face-to-face to fully online instruction during the Spring 2020 semester due to the COVID-19 pandemic. This switch presented a unique opportunity to compare student performance on face-to-face assessments to their performance on online assessments. Many published studies of this sort have compared different sections of the same course (e.g. an online section compared to a face-to-face section) with instructors and/or students varying between compared sections. The COVID-19 pandemic allowed these comparisons to occur within sections, keeping instructors and students constant.

Here, we compare face-to-face to online exam performance within five biology courses: BIOL 1111, 1115, 1191, 1215, and 1218. Two questions drove these analyses: 1) Is face-to-face performance a good predictor of online performance, and 2) Did online assessments preferentially benefit students that performed poorly in face-to-face assessments?


For each course, two sections taught by the same experienced instructor were combined for all analyses. These courses all have two midterms and a final exam, and in this study the combined percentage for two face-to-face midterms was compared to one online final (except for BIOL 1215, where one face-to-face midterm was compared to an online midterm and online final). Students who withdrew or did not write all exams were dropped from analyses. Quiz and lab grades were not used for these comparisons. No outliers were removed. All analyses were done using Microsoft Excel.


Looking at all five courses together (Table 1), online percentages were, on average, 3.25 percentage points higher than face-to-face percentages, although there were large discrepancies between courses (range = -2.2% to +11.6%).

Table 1: Percent averages for the face-to-face and online portions of five biology courses (10 sections). The two sections of each course were taught by the same experienced instructor.

There were significant, positive correlations between face-to-face % and online % for all five courses, indicating that students that performed better in face-to-face assessments also performed better in online assessments (Fig. 1). However, the relatively small r-squared values indicate a large amount of variability in the ability of face-to-face % to predict online %.

Figure 1: Correlations between percent grades in the face-to-face and online portions of five biology courses. Students that withdrew or did not write all exams were dropped from all analyses.
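The correlation and r-squared values described above were computed in Excel, but the same calculation can be sketched in a few lines of Python. The grade arrays below are illustrative placeholders, not the study's actual data:

```python
import numpy as np

# Illustrative paired percent grades -- placeholders, not the study's data
f2f = np.array([45.0, 55.0, 62.0, 70.0, 78.0, 85.0, 91.0])     # face-to-face %
online = np.array([60.0, 58.0, 68.0, 72.0, 75.0, 80.0, 84.0])  # online %

# Pearson correlation between face-to-face % and online %
r = np.corrcoef(f2f, online)[0, 1]
r_squared = r ** 2  # share of variance in online % explained by face-to-face %

print(f"r = {r:.3f}, r^2 = {r_squared:.3f}")
```

A positive r with a modest r-squared matches the pattern the authors describe: students who do better face-to-face tend to do better online, but with substantial unexplained scatter.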

If face-to-face % were a direct predictor of online %, the data would cluster along a trendline forced through the origin (a zero, zero intercept). However, the data are generally a poor fit to a trendline forced through the origin (Fig. 2).

Figure 2: For these correlations the trendline is forced through the origin (0, 0 intercept). Students that achieved a similar grade in face-to-face and online components would fall on the trendline.
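Forcing a trendline through the origin, as in Fig. 2, amounts to a no-intercept least-squares fit. A minimal sketch, again using illustrative made-up grades rather than the study's data:

```python
import numpy as np

# Illustrative paired percent grades -- placeholders, not the study's data
f2f = np.array([45.0, 55.0, 62.0, 70.0, 78.0, 85.0, 91.0])
online = np.array([60.0, 58.0, 68.0, 72.0, 75.0, 80.0, 84.0])

# Least-squares slope for a line forced through (0, 0):
# minimizing sum((online - b * f2f)^2) gives b = sum(f2f * online) / sum(f2f^2)
b = np.sum(f2f * online) / np.sum(f2f ** 2)

# If online % simply mirrored face-to-face %, b would sit near 1
# and these residuals would all be small
residuals = online - b * f2f

print(f"slope through origin = {b:.3f}")
```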

The previous correlations (Fig. 1 and 2) obscure the fact that online assessments typically benefitted students that performed poorly in face-to-face assessments and hurt students that performed well (Fig. 3). It was not uncommon for weak students to perform >20% better in online assessments and good students to perform >20% worse. If online assessments corresponded closely with face-to-face assessments, the data here would fit tightly to the x-axes (a low r-squared value but small residuals).

Figure 3: Correlation between face-to-face % and difference between online % and face-to-face %. Students that achieved the same grade in face-to-face and online components would fall on the x-axis (bolded).

Combining data from these five courses is fraught with problems, as these courses differ in content and cater to different populations of students. That said, the increased sample size and the resulting significant negative correlation clarify the pattern seen in Fig. 3; online assessments benefitted students that performed poorly in face-to-face assessments and hurt students that performed well (Fig. 4).

Figure 4: Significant, negative correlation between face-to-face % and difference between online % and face-to-face % for all five courses combined.
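The analysis behind Fig. 3 and 4 plots each student's gain (online % minus face-to-face %) against their face-to-face %. Sketched in Python with the same kind of illustrative placeholder grades:

```python
import numpy as np

# Illustrative paired percent grades -- placeholders, not the study's data
f2f = np.array([45.0, 55.0, 62.0, 70.0, 78.0, 85.0, 91.0])
online = np.array([60.0, 58.0, 68.0, 72.0, 75.0, 80.0, 84.0])

# Each student's gain (or loss) when moving to online assessments
diff = online - f2f

# A negative correlation means weaker face-to-face students gained the most
r = np.corrcoef(f2f, diff)[0, 1]

print(f"r(f2f, online - f2f) = {r:.3f}")
```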


While it is challenging to deliver course content in an online environment, it is much harder to properly/fairly assess students in a fully online environment. The data here show that our online assessments disproportionately benefitted students who performed poorly in face-to-face assessments, and also did a poor job of mirroring student performance at face-to-face assessments. The five instructors involved in teaching these courses combined have over 100 years of face-to-face teaching experience at Langara, and thus we are reasonably confident that our face-to-face assessments were indicative of the students’ actual abilities.

Why did students that performed poorly in face-to-face assessments fare better at online assessments? Perhaps these students processed course material better in an online environment and were consequently better at online assessments; this is indeed likely the case for some of these students. There were undoubtedly also some students that were highly motivated to use all available means to improve assessment performance (e.g. online resources, classmates, ghostwriters, Google Lens, etc.). Despite employing strategies to limit nefarious exam writing, such as tight time restrictions, no moving backward through exam questions, and a focus on comprehensive long answer questions, data here suggest it is difficult to properly/fairly assess students in a fully online environment. This is an environment where a student who has a "helpful" friend with an advanced degree in biology has a great advantage over those students that do not. As students write more online assessments, we imagine their abilities to cheat at online assessments will improve and more than offset our limited abilities to counteract their efforts. Even the most experienced online instructor cannot eliminate the challenges that exist around cheating when all assessments are fully online and lack even basic invigilation. This is a serious concern for all biology instructors at Langara, and it is the primary reason the biology department has insisted on offering almost entirely face-to-face courses over the years despite pressures to offer more online courses.

As important, and often forgotten in the rush to hinder potential cheaters, is the fact that online assessments can hurt honest students that use only instructor-permitted resources to write assessments. Over ten percent (n = 28) of our students achieved >65% in the face-to-face portion of their course but did 10% to 35% worse in the online portion. We are worried that online exams with hopelessly tight time restrictions and harder-than-usual questions will force honest students to pursue all available means to improve assessment performance. The grades of honest students, and student honesty in general, may be yet more collateral damage of COVID-19.

The COVID-19 pandemic unquestionably necessitated moving content and assessments fully online in Spring and Summer 2020. But we hope the Langara community is as keen to get back to face-to-face assessments as we are. Restaurants are opening and kids are returning to school; it should soon be possible to have major assessments on campus while respecting students' physical distance. Of course, a serious second wave of COVID-19 would force courses fully back into the online realm, and this is a scenario that instructors will have to be ready for. But it is also worthwhile to plan for a new normal (Spring 2021?) that involves courses where content is largely delivered online but coupled with well-spaced, face-to-face assessments on campus. Then, eventually, hopefully, back to what we prefer … the old normal where both classes and assessments are face-to-face.


One Response to Perspectives II: A Series of Articles on Teaching at Langara during the Pandemic

  1. Jeremy says:

    Thanks for sharing these findings; it’s great to quantify statements like “we think students are cheating more frequently in online classes”.
