Teaching the same course for the third time, I felt fairly confident about its content. How it would need to be adapted for the “Hy-Flex” format was still to be determined. The uncertainty of the Fall 2020 semester required all instructors to be prepared for the unknown: be prepared if the university went into lockdown, be prepared if students were in quarantine, be prepared if the instructor needed to quarantine; yet deliver the lectures in-person for residential engagement. As I highlighted in my previous post, I implemented various technologies and techniques in my class. Some supported flexibility, such as the automatic recording and uploading of lectures, while others were dysfunctional during a pandemic, such as nametags in a large classroom, where walking the aisles to read them would undermine social distancing. In this article, I highlight what worked and what did not.
Look for resources provided by others
In the Spring of 2020, many instructors had to modify their courses abruptly as universities were shutting down. Fortunately, I was not teaching in the Spring semester and was able to focus on my research, but I did take note of the methods being used by my colleagues. When I started preparing for my Fall 2020 course, I looked for resources on what others had experimented with and selected a few to try out myself. I found the updated post from Brent and Felder as well as a post from Derek Bruff at Vanderbilt University. I had also gone through the Impact X Access course material offered by my university on revising our courses for Hy-Flex delivery. Although the Impact series of courses has been very helpful for faculty when participating in-person, the online version was quite useless. Despite a lot of carefully drafted content, it offered no practical advice and did not help me modify my course.
From other people's experiences, I noticed that keeping students engaged would be the most challenging part of the course delivery. Implementing think-pair-share activities in class usually allows for such engagement, but social distancing and online learning made this more difficult. After reviewing the resources, I settled on continuing to use many of my previously implemented tools: Boilercast (Kaltura) for recording lectures (while using a personal over-the-ear microphone provided by the university); Hotseat (online polling system); Respondus Lockdown Monitor (for quizzes and exams); Gradescope (for homework and written answers); Qualtrics (for peer review); and nametags (as in my previous classes). I also experimented with a few new approaches: Google Drive and Jamboard (suggested by the Vanderbilt article); Kahoot! (a gamified option for questions, suggested by students midway through the course); Webex (as a backup for remote online lectures); and Brightspace (the learning management platform recently implemented by my university). Of course, I was wearing a mask during the lectures, which softened my already mumbled voice; I would have been completely inaudible without the personal microphone.
Experiment with tools and adapt on the fly
To begin with, I had prepared for technology failures. For example, early in the semester, the Kaltura service was unable to handle the workload of all the universities using its systems simultaneously, which led to delays in uploading videos. Similarly, the live-stream feature in my classroom sometimes failed because of issues with signing into the desktop computer in that room. For both scenarios, I listed my Webex personal room link in the syllabus and used it as a backup. Webex (like Zoom) also allowed for recording and posting lectures. I would have used the same link if I had needed to quarantine during the semester. I did indeed use Webex when delivering the last two lectures after Thanksgiving, which were meant to be completely online.
On the first day of class, I enjoyed the energy in the classroom; it had been months since I had seen and heard a crowd in person. The engagement of students was refreshing, although it waned in the following weeks as in-person attendance dropped to about 15–20%, with an additional 30–40% attending live online. The remaining students were most likely catching up on content asynchronously, since 100% of the students were still completing the quizzes and exams.
In an attempt to mimic the think-pair-share activities, I introduced a Google Sheet where students could enter their names and then respond to questions; it took about 10 minutes to set up during class. I also posted a link to Google Jamboard, where students could collectively draw out potential solutions. After a few lectures, I quickly realized, with feedback from students, that the Google Drive approach was not helping anybody's learning. It was not an effective way for the online students to communicate quickly during these short activities (less than five minutes). So I stopped using Google Drive (Sheets and Jamboard) and instead focused on Hotseat as the main tool for engagement. Hotseat allowed me to associate responses with bonus credit, since all students were signed in with their university identification when using Hotseat, which is not the case with Kahoot!. Although Kahoot! is more fun because of its music and scoring (based on response time), its limitations on the number of answer choices and question types, along with the lack of student identification, prevented me from adopting it completely. I think it is still a useful tool for delivering a series of ungraded questions during a single lecture.
For quizzes and exams, I used Respondus Lockdown Monitor to record the students (video and audio) and Gradescope to allow submissions eligible for partial credit. Since I had already used both tools before, they were easy to adopt. Aside from the usual technical hiccups early in the course, the students seemed to figure out both tools.
As described in my previous post, I experimented with a series of mini-projects this year. These projects focused on targeting end-user jobs and designing solutions to their problems instead of simply on the technical concepts in the course. The final deliverable was a 3-minute video in which students presented what they learned and how they could solve the problem using spectroscopic methods. Some of the students did an excellent job being creative and engaged (as seen in an example here and the video below).
Obtain feedback and adjust course
Since I was still experimenting with some aspects of the course, I obtained early feedback and made changes to the course (e.g., removing Google Drive). The most common request was for more practice problems. Even though I provided several practice problems from multiple textbooks, students seemed to ignore them simply because no credit was attached to them. They are practice problems, after all, meant to prepare students for the quizzes and exams that do carry credit. The request for more practice problems has been constant across the three classes I have taught; even though I have found and provided dozens of them, it seems unfulfilled. I believe this is because the exam questions differ from the practice problems. Exams and quizzes are open-book, so I cannot include very similar problems, or it wouldn't be much of a test. Instead, I expect students to break a problem apart, determine what is still unknown, and then decide which parts of the course apply to it. Instilling this process will take more work; illustrating it in class was helpful, but perhaps I need to spend more time on it (e.g., a whole lecture). Practice exams would also be useful in the future, and I would most likely work through parts of them in class.
I did obtain a 96% response rate on course evaluations, and the responses were more consistent this time around instead of being bimodal. It seems that providing bonus points for filling out course evaluations at least increases the response rate. Overall, some students really enjoyed the course and understood that it was about learning in biological engineering, while others remained stuck on being unable to decipher the exams (due to the apparent lack of practice problems). There are still some conflicting comments, which makes it impossible to satisfy everyone. If you wish to read how the student evaluations have evolved over the years, you are welcome to look through the evaluations for 2018, 2019, and 2020.
To conclude, most of my previously implemented tools and approaches helped me maintain flexibility in my course. Fortunately, we did not face severe disruptions during the semester and maintained in-person lectures all the way through Thanksgiving (as planned). Given the turmoil of 2020, I am glad that teaching went smoothly enough. Since the vaccine for SARS-CoV-2 has arrived, I presume that in-person instruction will resume in the Fall of 2021.