Max Whittaker for The Chronicle
Specially trained undergraduates like Gautam Panakkal (right) lead sessions where students assess an instructor’s teaching in a course they’re currently taking.
Marcos E. García-Ojeda wants to improve his teaching. He has flipped his classroom and embraced active-learning techniques. And he’s even invited some observers to sit in on his “General Microbiology” class here at the University of California at Merced on a recent afternoon.
The observers will give Mr. García-Ojeda, an associate teaching professor of biology, a detailed depiction of the teaching and learning in his class — actions that are central to a college’s purpose but rarely examined.
This examination is especially unusual because of who’s performing it: undergraduates. The observers — three current students and a recent graduate — are part of a program called Students Assessing Teaching and Learning, or Satal. And they have been trained in an array of techniques: observations, interviews, focus groups.
Professors may be fond of students, on balance. But if an interloper wanted to blend in at a faculty gathering, casually dismissing the opinions of undergraduates would be a convincing approach. Course evaluations? Useless. Designing a college around students’ needs? Selling out.
The program pushes back against this view. At the heart of its work is the assertion that students have something valuable to say about teaching. In fact, it’s precisely because they’re students that they possess a unique ability to translate the perspectives of their peers.
The four observers arrive just as class is about to begin. The classroom they’re walking into is an unconventional one, specifically designed to foster active learning. It has no clear front or back — there’s a podium in the middle, with rectangular tables, each equipped with a screen, arranged in a square around it. As a result, there’s no obvious place for the observers to sit, so they improvise quickly, borrowing some chairs from the classroom next door and positioning themselves at the edges of the room, two on each side.
Gautam Panakkal is one of them. He pulls his chair to the far side of the room, across from the door. Leaning back in his seat with his backpack between his feet, he pulls up a spreadsheet on his laptop. The first of its five columns is for the time — the observers record what is happening every two minutes. The remaining columns are for notes on what the students and instructor are each doing and for coding those behaviors using a classroom-observation tool. The code “MG,” for instance, indicates that the instructor is “moving through class guiding ongoing student work” during an “active learning task.”
The observers had divided up the work of tracking the instructor’s behavior and the students’, but Mr. Panakkal, a psychology major, finds he has time to do both. The class unfolds without tangent or disruption, and students remain on task.
The class has nearly 90 students, who work with their tablemates in groups of about 10. Mr. García-Ojeda wanders from table to table, speaking briefly with one group before moving along to the next. Sometimes he gathers the whole class together and asks a series of questions.
About half an hour into the class, Mr. García-Ojeda projects a diagram showing the forces that influence how a positive ion moves across a membrane. He asks if the membrane potential and gradient always work in the same direction. A student murmurs “no.”
“Somebody said no,” Mr. García-Ojeda says. “An example of an ion in which that will not be the case, then?”
There’s a pause. Finally, a student says he has a guess: “Sodium.”
Mr. García-Ojeda welcomes the attempt and draws the student out. “Tell me about sodium,” he says. The student begins to explain, but trails off, apparently losing confidence. “Sodium … or potassium?” Mr. García-Ojeda offers. “So now tell me about potassium.”
The student finishes explaining, this time using the correct ion.
“There we go,” says the professor.
On his spreadsheet, Mr. Panakkal types: “Students feel comfortable answering questions because of the informal atmosphere the professor creates.”
It’s easy to dismiss students’ feedback about teaching as inexpert and rife with biases. But the program at Merced is rooted in an opposing view — that students may actually bring some special advantages to the task.
Because they’re taking other classes at the university, students in the program can tell professors how a class fits into a cross-section of students’ experiences of Merced. The students working in the Satal program can help explain where their peers’ feedback is coming from, and what it might mean. And because of that peer relationship, the student workers may be able to get better feedback in the first place. Students are often willing to talk more openly with fellow students than they are with a faculty member about challenges they encounter.
At some colleges, getting a new initiative off the ground can be a challenge. At Merced, it’s practically unavoidable — the university, which has some 7,000 undergraduates today, enrolled its first ones in 2005. So when Adriana Signorini returned from a conference excited about a program at Brigham Young University in which trained students gave teaching feedback, Merced’s Center for Engaged Teaching and Learning, where she works, gave her a green light to create something similar.
The Satal program began in 2009 with five paid student workers, and now has 11. New students go through an orientation, and continuing training is a key feature of these campus jobs. In the course of an academic year, the program will provide Merced professors with about a hundred services, which include observations, interviews, and focus groups.
If professors aren’t convinced that they should listen to what students have to say about teaching, there’s good reason for their skepticism. Typically, feedback comes from course evaluations. And course evaluations are as ubiquitous as they are famously flawed. They’re an important moment in the relationship between student and instructor, but they don’t seem to be working all that well for anyone.
Instructors’ disdain for course evaluations is often personal. Most can probably recall comments that were irrelevant, inappropriate, or simply outside of their control.
But the problems are systemic. Students’ comments, research shows, are influenced by their instructors’ race, gender, and attractiveness. Even if these biases were corrected, course evaluations might not be a good measure of teaching, anyhow. What they really capture, their many critics argue, is student satisfaction.
All of that matters because course evaluations often feed into decisions about tenure and promotion, or in the case of adjunct professors, continued employment. On top of everything else, feedback in evaluations is often quantified in ways that, one mathematically informed critique argues, make little statistical sense.
Students, for their part, have little incentive to put much effort into their feedback. Evaluations are usually distributed at the end of the term. So even if students have a great suggestion for how a class could be improved, and the professor decides to make the change, they won’t be around to benefit from it.
Ms. Signorini believes the Satal program can improve the quality of course evaluations at Merced. The plan: offer a short training on how to provide valuable feedback to a broad swath of the university’s students.
If that project is successful, it might have a side benefit: priming more of the university’s professors to seek out the program’s other services — its observations and interviews that can give professors insight into what’s really happening in their classrooms.
Those insights only make a difference, though, if professors are willing to adjust their teaching.
Noemi Petra, an assistant professor of applied mathematics, regularly seeks out the center’s services for new insights on her teaching. On this day, Mr. Panakkal and Brianna Vasquez are interviewing students in her numerical-analysis class, a small upper-level math course. About a dozen of them are clustered in the first few rows of a lecture hall.
Mr. Panakkal and Ms. Vasquez guide students through three questions: “What helps your learning in this class?” “What changes could the instructor make to improve your learning?” and “What actions could you take to improve your learning?” The questions are designed to keep the focus on students’ learning, not their satisfaction.
First, Ms. Petra’s students answer these questions in individual written surveys. Then they break into groups to discuss and record their answers. Students in one group have an animated conversation about how Ms. Petra should give them more examples. Those in another group keep to themselves, looking down at their papers.
Finally, Ms. Vasquez and Mr. Panakkal open the floor for students to share their comments and to gauge consensus. When the discussion turns to changes the instructor could make, the students say they want Ms. Petra to give more examples. “Could you put, like, in reference to, like, with actual numbers?” one student says. “More examples with numbers.”
Max Whittaker for The Chronicle
The questions asked by Gautam Panakkal (right) and other student workers during assessment sessions are designed to keep the focus on students’ learning, not their satisfaction with the class or the instructor.
Mr. Panakkal types the comments, which appear on a projector so that everyone can see. Ms. Vasquez asks students to raise their hands if they agree. The whole class thinks that more examples with real numbers would help. Mr. Panakkal takes note of that, too.
Seven of the students agree with the next suggestion, that Ms. Petra could offer more office hours. One student adds that they should be on Mondays and Fridays, but not everyone agrees. The students go on to raise several points about the homework, including a desire for the professor to post correct solutions after their work is graded.
The structure through which Mr. Panakkal and Ms. Vasquez solicit feedback is built deliberately. It helps Ms. Petra’s students articulate their own opinions instead of resorting to groupthink, and gives them space to share thoughts they might not be comfortable expressing in a group.
The next morning, the students and Ms. Signorini walk Ms. Petra through her results. “Apparently,” says Mr. Panakkal, “all 11 of them who were there said that they would like more math problems with actual numbers in them.”
Ms. Petra is not surprised — she’s heard students make this point before. “They’re hitting the level in math where things are getting more abstract,” she explains.
It’s only natural for students to ask for more real numbers. After all, that’s what math has been about up until this point. But it’s Ms. Petra’s job to get them to the next level of understanding the discipline. That puts her at odds with her students, at least a little bit. From the professor’s perspective, the students are asking for something that won’t ultimately help them.
It’s easy to dismiss students’ feedback about teaching. But students may bring some special advantages to the task.
Interviews like these can reveal small problems with easy fixes: misunderstandings between professors and their students, say, or things the instructor does that inconvenience students and serve no educational purpose. Sometimes, though, they describe students’ frustration with challenges that are inherent in the learning process.
Ms. Signorini mentions that she and the students discussed similar feedback with a different professor recently. It can help, Ms. Signorini explains to Ms. Petra, to link challenging work back to the course’s learning outcomes. A professor could even begin a course by having students buy into the idea that it will develop deeper abilities than rote memorization, she says.
Conversations like these are why Ms. Petra is a big fan of the program. Students observed the very first course she taught at Merced, in the spring of 2015, and she’s had a class interview done every semester since then. The reason? “Every single class is different,” Ms. Petra tells the interviewers. “I’m just going to fine-tune a couple of things.”
Improving how she frames her students’ expectations, as Ms. Signorini suggested, is something that Ms. Petra thinks she can do. She offers an analogy. She writes in pretty — but very small — cursive, and students often find it hard to read. Ms. Petra knows this, warns her students, and posts her notes online so that they can revisit them. “Not many complain about my writing,” Ms. Petra says, “because I tell them from the beginning.” Perhaps she needs to do the same thing when it comes to the absence of real numbers. “They have to know in advance, probably,” she says.
Sometimes students use the class interview as an opportunity to try to negotiate with their instructor, Ms. Signorini says. “It opens the conversation,” she says. “This is what we’re going to change. But I’m not going to change this because this is the outcome for the course.”
And sometimes, students will be content with the status quo if a professor explains the thinking behind it — or simply hears them out.
The conversation works in both directions. Ms. Petra asks the student interviewers for suggestions about one of the pain points in her class: Students don’t set aside enough time for their homework.
“They actually do know they should be doing it earlier,” Mr. Panakkal says. Eight of the students agreed it was something they could do to improve their learning. One idea, he adds, is to design the homework so that the first question is relatively easy.
Ms. Petra already does that. Eventually, she comes up with a solution that she later shares with her students: She adds an hour of office hours, not long after her class ends on Tuesday afternoons, which is the day she assigns the week’s homework. That way, her students can start on at least the first problem or two in her presence. And as Ms. Signorini suggested, Ms. Petra tells her students about how the absence of real numbers relates to the learning outcomes for the course — though she’ll give some more concrete examples when it makes sense.
Listening to students, in other words, doesn’t require giving them everything they ask for. Professors are still in charge.
What’s true for math students confronting problems without numbers is also true for professors working on their teaching: Learning does not move in a straight line. Carefully collected feedback, closely listened to, might result in small changes. Another hour of office hours. A sprinkling of concrete examples. Once in a while, professors might decide to overhaul the way they teach. But change often happens in fits and starts.
Before any of that can happen, though, professors need to know whether their actions in the classroom match their intentions.
All of the codes the observers jotted down during Mr. García-Ojeda’s microbiology class were converted into pie charts to illustrate how he and his students spent the 75-minute period. The largest slice of the professor’s time, 25 percent, went to guiding students through an activity. The next largest pieces were for posing a question, working one-on-one, and waiting. The students’ pie, meanwhile, showed that they were listening 24 percent of the time and working in groups 18 percent.
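The arithmetic behind that summary is simple. Below is a minimal, hypothetical sketch (in Python) of how per-interval codes like the ones Mr. Panakkal logged every two minutes might be tallied into shares of class time; the code labels and counts are illustrative, not the program’s actual data or tool.

```python
# Hypothetical sketch: tally two-minute observation codes into time percentages.
# The Satal program's real code set and workflow aren't specified here; the
# codes below (e.g., "MG" = moving through class guiding student work) are
# illustrative only.
from collections import Counter

# One code per two-minute observation interval for the instructor's behavior.
instructor_codes = [
    "MG", "MG", "PQ", "MG", "1o1", "W", "MG", "PQ", "MG", "Lec",
]

def time_allocation(codes):
    """Return the share of observed intervals spent on each coded behavior."""
    counts = Counter(codes)
    total = len(codes)
    return {code: round(100 * n / total) for code, n in counts.items()}

print(time_allocation(instructor_codes))
# e.g. {'MG': 50, 'PQ': 20, '1o1': 10, 'W': 10, 'Lec': 10}
```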
To Ms. Signorini, it all looks great. But the more important issue, she says, is whether the picture of Mr. García-Ojeda’s class reflects what he’s trying to do with his teaching.
“Are you happy with this?” she asks.
“I’m happy that the lecture time is only 10 percent,” Mr. García-Ojeda says.
The student observers have lots of positive things to say about his class. They liked the informal atmosphere, the high energy level, the fact that he knew his students’ names.
Most of the changes they suggest revolve around the finer points of his students’ engagement. One of these concerns relates to team-based learning. Students who don’t understand the material can go undetected when they work in teams, Mr. Panakkal says, since their collaborators provide cover.
“To counter that, because that’s been one of my big concerns,” Mr. García-Ojeda says, “I use the clicker questions.” Students’ individual answers count as their class participation, he says — and give him a sense of whether or not they grasp the material before they take a formal test. When he teaches, Mr. García-Ojeda makes a point of calling on students who have not yet participated. He does the same thing during the feedback session, specifically asking Valezka Murillo, an observer who’s been relatively quiet, what she thinks.
It turns out she thinks Mr. García-Ojeda’s solution is incomplete because, she says, he gives students time to discuss clicker questions before they punch in their answers. “So some students may be able to get away with not watching the lecture,” she says, “or not doing the readings because they’re, like, mooching off of what other people are saying.”
Ms. Vasquez suggests a solution. “What if you were to give them the question to answer on their own,” she says, “and then after, they could discuss it as a group and then answer again?”
“Good point,” Mr. García-Ojeda says.
Mr. García-Ojeda used to lecture much more. Back in 2010, he was teaching in a traditional lecture format, but he wasn’t satisfied with how his classes were going. He asked the program to observe one of his classes, hoping to find ways to get his students interested in the material.
The feedback he received sparked Mr. García-Ojeda’s interest in active learning, and he made some changes in his teaching. Then in the fall of 2014, he was asked to teach two sections of cellular biology, because Merced was short on large lecture halls. So he flipped both of his sections, and then studied how his students’ outcomes differed from those of the students who had taken the lecture-based version. His students’ exam scores were higher in the flipped version. For Mr. García-Ojeda, the recent class observation was further confirmation that his new approach was good for students.
Feedback serves more than one purpose. It can lead professors to make changes, big or small. It can also tell them something just as important — whether those changes are paying off.
Beckie Supiano writes about teaching, learning, and the human interactions that shape them. Follow her on Twitter @becksup, or drop her a line at beckie.supiano@chronicle.com.