Welcome to Teaching, a free weekly newsletter from The Chronicle of Higher Education. This week:
- Dan tells you about a professor who asked his students to write their own test questions.
- Beth fills us in on some new research about online learning.
- We share some noteworthy books.
Asking for Answers
About a decade ago, Max Teplitski noticed something about the students he had been teaching online: They tended to collaborate, often asking and answering one another’s questions.
“They have a hive mind,” he said. “I thought, Why not capitalize on that?”
So here’s what Teplitski, who was then a professor of soil and water science at the University of Florida, did with students in “Ecology of Waterborne Pathogens,” a blended, online and in-person course offered chiefly as an elective for biology majors.
He asked each of them to write about 20 to 25 of their own multiple-choice questions, usually with four possible answers, before the second of two exams that semester (one year, they did it between the second and third exams). They could ask Teplitski and his teaching assistant for feedback along the way.
Many learning experts criticize multiple-choice questions because they seldom challenge students intellectually. Teplitski wanted his students to focus on higher-order cognitive processes, like analyzing or evaluating. So he had them learn about Bloom’s Taxonomy, too.
The students, most of them seniors and graduate students, saw a presentation on the taxonomy, which is an influential model for categorizing types of thinking and learning. They were told that only 20 percent of their questions could draw on the two lowest levels, which emphasize recall and understanding of basic facts. True/false questions wouldn’t be accepted. Nor would questions that were based on a single PowerPoint slide from a lecture.
“By explaining to them what a properly constructed question was, we were directing them to a more-complex question that makes them think,” he said. “We really wanted to engage them in a higher level of learning.”
For example, one student’s question asked how fecal coliform, a kind of bacteria, would behave in a particular medium. The question was an example of applying knowledge, according to Teplitski’s interpretation of the taxonomy, because it required not just the recall of a basic definition but also analysis and deduction.
The questions that were scientifically accurate and grammatically correct were put in a test bank, without the answer key. To prepare for the exam, students consulted the test bank and were encouraged to discuss the questions with one another in an e-learning platform.
According to a study of the strategy, which he conducted over several years, most of the students’ questions tested higher-order learning. The average grade increased by about 7.5 percent relative to a control group. Students who had scored between 60 and 80 percent on the first exam, before the method was used, benefited the most.
“Gains were not observed in the semesters when the intervention was not implemented,” he and his co-authors wrote in an article that appeared in the Journal of Food Science Education last year.
I wondered whether familiarity was a factor, because the students’ contributions accounted for about three-quarters of the exam’s 20 questions. In other words, did students perform better simply because they had already seen many of the questions and answers?
Teplitski explained that, for each semester that he used this approach, there were about 600 questions in the test bank. “We figured if they had 600 questions and they sat down and memorized answers to all of them,” he said, “then how different is that from really studying?”
A more likely explanation, according to Teplitski and, he said, some of the scholars who reviewed his paper, was that the strategy made low-stakes, low-stress use of the “testing effect.” Devising questions, and trying to answer both their own questions and those written by peers, encouraged students to practice retrieving information and related material.
“Testing,” said Teplitski, “is a learning opportunity.”
Teplitski has since accepted a position with the federal government, but his co-author and former teaching assistant, Cory Krediet, is now an assistant professor of marine science and biology at Eckerd College. He’s used a modified version of the test-writing strategy with his students, but he hasn’t emphasized Bloom’s Taxonomy. Partly, he says, that’s because he’s not sure how that aspect of the effort would fare with less-experienced students. And, as a relatively new professor, he’s still finding his way as a teacher.
But Krediet has still gotten positive feedback from students on his version of the exercise, and he believes it encourages them to think through the course material outside of class. It also draws on the idea that the most effective way to learn something is to teach it. “You’re solidifying in your mind what the answer is,” he said, “and why it’s that way.”
Have you involved students in designing parts of your course? How? If you send me an email at dan.berrett@chronicle.com, it might appear in a future newsletter.