Over the last few months, many people have conducted instructional courses virtually and shared their experiences online. In that respect, our reflections below on what we learned conducting a virtual course this summer are not ground-breaking. We provide them for those who attended the course, for our future selves, and for anyone else interested in our specific instructional circumstances.
How we ran the course:
online syllabus with links to course information, all lesson materials, and pre-course information
each lesson completely in text form as a website
series of short live-coding videos to supplement (not replace) each lesson
computational “lab” provided to participants via web-browser log-in
all software, data, and lesson materials pre-installed for participants in the lab
office hours and exercise review Zoom calls with 2 instructors
Slack workspace for the course with channels for each team and each day, plus general and welcome channels
video calling with screen sharing for teams in Slack
personalized team coaching on Zoom calls
What worked as expected:
combination of tools for collaborative virtual work was successful (Slack, Zoom, etc.)
participants introduced themselves on the Slack #welcome channel, and made some connections across teams
teams liked using Slack to chat during the course; it proved a productive tool
participants used Slack to ask questions of the instructors or tell instructors of problems with lessons
recording office hours in Zoom allowed those with slower/intermittent internet connections to view the videos later
supplemental coding videos received very positive feedback from participants
teams were largely able to stay on task and produce a good product by the end of the week; they will likely be able to use these products in their future work
teams that were already somewhat established made a lot of progress (some of the newer teams made less progress in the virtual format)
several teams met with one another on their own time to discuss potentially overlapping research…great!
What didn’t quite work as expected:
We offered a technology testing opportunity 5 days in advance, but no one called in. However, there were software installation problems, connection issues, and other technology questions throughout the week that should have been solved in advance.
Participants didn’t attend office hours to review exercises or ask questions as much as we expected. Therefore, it was really difficult to understand whether participants actually engaged with the lesson materials or not. We expected these office hours to function as “collaborative debugging sessions” for that day’s assigned lessons, but participants rarely brought their lesson “bugs”.
Participants didn’t fill out the polls we created asking which lesson exercises they wanted to cover in office hours. We therefore stopped putting out the polls, but after the course we received feedback that participants had expected us to walk through the exercises.
During in-person courses, we used daily stickies (one sticky for positive comments, one for struggles that day). We created daily channels in Slack for this type of feedback, but this still did not give us a good way of assessing how participants were doing each day.
After the course, we received feedback that it was unclear to participants how to apply the lesson material to their own projects. We provided ample office hours time, yet many participants didn’t ask us for assistance with this during office hours.
A few participants didn’t block off the entire week for this course, as we had expected them to when the course was advertised and open for applications. This made it difficult to foster a community among participants, because the “all participants” course meetings never benefited from everyone’s input.
A lot of participants kept their video off most of the time during video meetings. This did not foster community, and it inhibited non-verbal communication with instructors and fellow participants. It also made it really hard for the instructors to get to know the participants, which is one of the things we enjoy the most.
What can we do differently next time:
Offer technology testing closer to the first day of the course. While this leaves less time to solve technical problems, holding the testing session closer to the course may garner more participation and troubleshooting.
Better assess participant learning continuously throughout the course. In particular, use daily assessments to replace sticky-note feedback, and assess participants’ understanding of how the lesson material applies to their on-going and future work.
Use probing questions about the day’s lesson material that require answers from all participants during Zoom calls. Have participants put their answers in the Zoom chat, Slack, or even a Google Doc. Wait until all participants have answered before continuing the discussion.
Strongly encourage everyone to keep their video on at all times during course video calls to foster community and a collegial atmosphere.
Make acceptance to the course contingent on agreeing to a participant code of conduct that will address the issues outlined above. This combines strategies from the Carpentries and Central Queensland University.