In this abridged extract from our new e-book, The Professor’s Guide to Agile Teaching, we look at how technology has made it easier for professors to get student feedback at scale, identify gaps in learning and quickly make changes to their course.
Agile teachers need mechanisms to make sure students are actually doing the things for which they’re accountable, and to get insights into how those students are learning. Multiple types of technologies have been developed to make it easy for the instructor to assess or deliver feedback on active student work: clickers or cloud-based response systems like Top Hat, adaptive systems for at-home work, and assessments delivered through learning management system (LMS) software can all be used to collect insights into how students are progressing through a course.
Technology enables active learning and agile teaching to work at scale. Here are two professors who have used student feedback from classroom technology and achieved great results.
Troy Wood, Professor of Chemistry, University at Buffalo
Clickers were once Troy Wood’s hobby, the thing he tinkered with as a college educator. Then he got more serious about them, to the point where they also became, at various points in time, his Achilles’ heel, his moon mission and the bane of his existence.
“I started using classroom response in 2002, and back in those days we had receivers that we literally had to bring to the classroom with us, which was a management nightmare,” recalls the SUNY Buffalo professor, who teaches general chemistry and analytical chemistry to freshmen and sophomores respectively. He became so flustered by what he felt was a high-potential idea hampered by poor technology that he abandoned his clicker experiment in frustration for several years.
Wood rediscovered classroom response systems (CRSs) in 2011, about four years after the commercial introduction of the iPhone. When he learned that Top Hat was letting student smartphones double as clicker devices, with responses submitted via Wi-Fi, his reaction was, “I’m going to try again. Students are bringing their cell phones all of the time to class anyway. And I’m tired of fighting against that. Now I’m going to embrace it.”
When the technology had advanced to the point where it was robust and reliable, he became adept at using it. Then, he realized he was suddenly sitting on a goldmine of student learning data.
Now, he no longer organizes class time around his lectures, but around his CRS. He’ll ask as many classroom response questions as he can muster and check the results in real time. If he sees a common mistake, he’ll stop and address it; if not, he’ll move on. “It turns out that by using the classroom response systems, I’ve become much more efficient in my lectures,” he says. “I have actually gained time because of this.”
Leslie Sprunger, Associate Professor at Washington State University’s College of Veterinary Medicine
Leslie Sprunger teaches small-animal anatomy to first-year students, guiding (and sometimes herding) them through large volumes of foundational knowledge every week. But as the college’s associate dean, she’s also keenly aware that she’s not merely trying to graduate experts; she’s trying to turn them into clinicians.
As an early adopter of technology, as well as someone with a keen interest in the scholarship of teaching and learning, Sprunger has become a lifelong student of formative assessment. “My personal definition of formative assessment is that it is essentially feedback,” she says.
And she’s learned that the feedback students get from their peers, and even from themselves, is no less valuable than feedback they get from instructors.
To help her first-year veterinary anatomy students learn clinical communication skills, Sprunger has them attend lab sessions four days per week, working in groups of four. Because the class as a whole is too large for a single lab, students are assigned to one of two consecutively scheduled lab sessions, which always cover the same material each day.
Each student gives a 10-minute presentation; afterward, the presenter and his or her three audience members evaluate it using a standard rubric, which they complete on their tablets or smartphones using Top Hat. Sprunger retrieves the data, reformats it, and then sends out the results. “The same day that students gave one of these informal presentations, they’ll get an e-mail that has their own self-assessment data in it and, adjacent to that, the anonymized peer assessment data from their three-person audience,” she says.
The system generates reams of data over the course of a semester, but Sprunger says it also produces results. “The students actually get better over the course of the semester in doing these brief professional communications exercises,” she explains. “And I know that because I can look at all the data that we collect. I have both the self-assessment data and, importantly, the peer assessment data. I can see that the average scores for each of the items on the rubric improve over the course of the semester for both.”
Explore agile teaching in detail with our new e-book: from gleaning real-time insights from data to tearing your course apart and starting afresh.
Fill in the form below to get free exclusive access to The Professor’s Guide to Agile Teaching.