
Exam Questions: Outsourcing vs. Crowdsourcing

[Image: Liverpool Street Station crowd blur]

Last week, academic scandal-mongers were treated to an unusual spectacle at the University of Central Florida. What at first looked like a juicy incident of mass cheating turned out, on closer inspection, also to point up a less attractive aspect of contemporary pedagogy: the reliance on publisher-provided test banks for exam questions. If this story is news to you, Robert Talbert has two great posts exploring it here and here. (The role of YouTube in this whole affair, both in establishing the professor's ethos and in facilitating a student response, is particularly interesting.)

It’s hard to defend faculty who rely exclusively, or even largely, on publishers’ test banks, although if you ask around on campus, you can probably find some folks who do this. (The fact that some publishers’ test banks are crafted to plug directly into prominent LMSes, such as Blackboard/Vista, is worth some reflection.) At the same time, it’s not immediately clear that the faculty are entirely to blame. Some departments may well turn to publisher-provided test banks as a way of responding to ever-heightening demands for assessment, and of course many part-timers are (explicitly or subtly) steered into such practices. (And, as Robert mentions in his second post on this story, the local context at UCF, often lauded as a model of cost-cutting educational innovation, is not irrelevant.)

Besides obvious benefits such as convenience, test banks do offer an appealing legitimacy to exam questions. Many students complain, or at least faculty worry that students will complain, that a final exam is unfairly arbitrary or idiosyncratic and doesn’t really reflect their knowledge of the subject. Outsourcing exam writing to textbook publishers appears to solve this problem: an exam drawn up by the publisher of the course textbook will (hopefully!) reflect that book and its material accurately. But the disadvantage of this process is just as clear: it reduces the faculty member to an appendage of the textbook. Indeed, on such a model it’s a bit hard to see why PhDs are needed at all.

A more interesting alternative to outsourcing the production of exams is to crowdsource them. My wiki-based courses feature student-written exams:

I then give students a week to comb through their notes and their books to come up with passages for identification, short answer questions, and essay questions. The deal I always make is that, if the students come up with an adequate number of smart questions, then I’ll draw the exam entirely (or close to it) from their questions, and will usually post it as a study guide a day or two in advance of the final.

What’s nice, and sometimes terrifying, about this approach is that the resulting questions usually do genuinely reflect the class’s work. That is, it quickly becomes clear what your students will be taking away from your class. Further, when students collaborate in this way, they have to do the reflective, synthesizing work of question-writing (which is better than cramming) and to come to an implicit agreement about what the course was about.

I like this assignment, and its wikified class notes component (background: here, here, and, from the pre-ProfHacker days, here), a great deal, but will freely confess that it doesn’t meet one of the needs that test banks serve: it does nothing to reduce the amount of work in a given semester. The point of these new technologies, though, ought to be to help us create pedagogically interesting courses, not to deliver someone else’s content more cheaply.

Have you experimented with crowdsourcing exam questions? Let us know in comments!

Image by Flickr user victoriapeckham / Creative Commons licensed
