Bang Your Head on Your Desk - the thread of teaching despair!


lohai0:
Quote from: sciencegrad on April 13, 2013, 11:46:53 am

Quote from: anakin on April 13, 2013, 11:29:41 am

Another thing you can do is offer the chance to earn back up to 25% of the points they missed if they complete a post-exam analysis. Although, mathematically, this is an insignificant portion of their final grade, even under the very best scenario, unless your exams are heavily weighted, it's been my experience that many students take this very seriously. In that post-exam analysis, they should both explain why their wrong answer was wrong (in 2-4 well-formed and complete sentences), and derive the correct answer. This exercise promotes metacognition and potentially provides really valuable feedback for you about what your students are actually thinking, rather than merely what you think they are thinking. For someone whose research partly centers on naive conceptions and metacognition, I had no idea what some of my students were thinking about X.

I want to chime in and say that some of the best classes I've taken did this. Not only does it make me feel better about my exam grade as a whole, but I have an incentive to look closely at the mistakes I made and to think more deeply about the questions.

I do this anyway. The B students got B's and the C students got C's, but the A students imploded: about 1/3 got A's as usual, 1/3 got B's, 1/3 got C's, and one got a D. Their lost points came on three kinds of questions (stating definitions/theorems, one problem completed on the requested homework solution, and the True/False questions, most of which were converses/contrapositives of statements in their homework). That looks like a test-prep effort issue to me.
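For concreteness, the earn-back cap in the quoted scheme works out like this. A minimal Python sketch: only the "25% of the points they missed" rule comes from the post; the function name, signature, and the completeness-scaling factor are my own assumptions.

```python
# Sketch of the quoted earn-back scheme: students recover up to 25%
# of the points they missed, scaled (my assumption) by how much of
# the post-exam analysis they completed.

def earned_back_score(raw_score: float, max_score: float,
                      fraction_corrected: float = 1.0) -> float:
    """Return the adjusted exam score after a post-exam analysis."""
    missed = max_score - raw_score
    return raw_score + 0.25 * missed * fraction_corrected

# e.g. a 70/100 exam with a fully completed analysis:
print(earned_back_score(70, 100))        # 77.5
# and an 80/100 exam with half the analysis done:
print(earned_back_score(80, 100, 0.5))   # 82.5
```

As the quoted post notes, the cap keeps the grade impact mathematically small unless exams are heavily weighted, which is presumably the point: the incentive matters more than the points.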

usukprof:
Quote from: anakin on April 13, 2013, 12:20:11 pm

Quote from: sciencegrad on April 13, 2013, 11:46:53 am

Quote from: anakin on April 13, 2013, 11:29:41 am

Another thing you can do is offer the chance to earn back up to 25% of the points they missed if they complete a post-exam analysis. Although, mathematically, this is an insignificant portion of their final grade, even under the very best scenario, unless your exams are heavily weighted, it's been my experience that many students take this very seriously. In that post-exam analysis, they should both explain why their wrong answer was wrong (in 2-4 well-formed and complete sentences), and derive the correct answer. This exercise promotes metacognition and potentially provides really valuable feedback for you about what your students are actually thinking, rather than merely what you think they are thinking. For someone whose research partly centers on naive conceptions and metacognition, I had no idea what some of my students were thinking about X.

I want to chime in and say that some of the best classes I've taken did this. Not only does it make me feel better about my exam grade as a whole, but I have an incentive to look closely at the mistakes I made and to think more deeply about the questions.

Precisely. And usukprof (heck, everyone), you may want to consider it for your undergrad courses as well. I'd submit this is a practice of critical thinking that is worth doing no matter the discipline.

I should have qualified my answer better. I teach mostly graduate courses and one mixed upper-division undergraduate/graduate course. I do the redo-for-half-credit on these exams (except the final, for which there is no practical way). I don't do this in the large freshman digital design class I sometimes teach, whose exams are a long sequence of problems to solve. While there are a few longish problems for which this might work (e.g., design and sketch a 4-bit add/subtract unit), the size of the class would make it difficult. I would do this for any class that (1) wasn't huge and (2) had exams of a style for which this works.

lohai0:
Quote from: anakin on April 13, 2013, 12:20:11 pm

Quote from: sciencegrad on April 13, 2013, 11:46:53 am

Quote from: anakin on April 13, 2013, 11:29:41 am

Another thing you can do is offer the chance to earn back up to 25% of the points they missed if they complete a post-exam analysis. Although, mathematically, this is an insignificant portion of their final grade, even under the very best scenario, unless your exams are heavily weighted, it's been my experience that many students take this very seriously. In that post-exam analysis, they should both explain why their wrong answer was wrong (in 2-4 well-formed and complete sentences), and derive the correct answer. This exercise promotes metacognition and potentially provides really valuable feedback for you about what your students are actually thinking, rather than merely what you think they are thinking. For someone whose research partly centers on naive conceptions and metacognition, I had no idea what some of my students were thinking about X.

I want to chime in and say that some of the best classes I've taken did this. Not only does it make me feel better about my exam grade as a whole, but I have an incentive to look closely at the mistakes I made and to think more deeply about the questions.

Precisely. And usukprof (heck, everyone), you may want to consider it for your undergrad courses as well. I'd submit this is a practice of critical thinking that is worth doing no matter the discipline.

I have some actual data, if anyone is interested. These aren't published yet and are, of course, subject to replication. (Anyone interested in a little experiment?)

In fall ecology, I allowed students to do an optional post-exam analysis. I wondered whether students who completed the Exam 1 post-analysis would perform any differently on Exam 2.

So here's what happened on exam 2 between the two populations:

                         Exam 1 mean*   Exam 2 mean   Change
Non-analyzers  (N=89)        83.5           76.7        -6.8
Post-analyzers (N=19)        76.4           79.9        +3.5

*a priori score

Three things popped out at me. First, students who undertook the post-exam analysis on exam 1 did significantly better on exam 2 compared to their own exam 1 mean score. Second, students who did not do a post-exam 1 analysis did significantly worse on exam 2 as a population. (Two-tailed t and non-parametric Wilcoxon, given the small N of the analyzers.) If this holds up, it is a very robust result. Third, it wasn't just the better students who did the post-exam 1 analysis; in fact, the analyzers scored significantly lower than the non-analyzers on exam 1, but significantly higher than them on exam 2. (Exact same exams and formats were administered to both populations, which I only split a posteriori.) So that suggests that 1) there was no self-selection by better students to complete the analysis on the first exam, and 2) self-selection therefore does not explain the subsequent gain.

In the bigger picture: isn't it almost educational malpractice NOT to do something I KNOW increases my students' learning both proximately and ultimately?

I've been meaning to do something quantitative on test corrections for ages. I didn't record the correction points separately this time (I just added them to the score where applicable), but I want to add it to the list, maybe for fall?
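The two-group comparison described in the quoted data (group mean changes plus a two-sample t statistic) can be sketched in Python without any dependencies. The per-student change scores below are invented for illustration; only the group mean changes (-6.8 and +3.5, the latter rounded to 3.5 here) echo the table in the post, and the Wilcoxon side of the analysis is omitted (in practice one would reach for scipy.stats for both tests).

```python
# Rough sketch of the comparison in the quoted data: compare
# per-student score changes (Exam 2 - Exam 1) between analyzers
# and non-analyzers with Welch's two-sample t statistic.
import math
import statistics

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances)."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    se = math.sqrt(var_a / len(a) + var_b / len(b))
    return (mean_a - mean_b) / se

# hypothetical per-student change scores, Exam 2 minus Exam 1
non_analyzers = [-9, -4, -8, -6, -7, -5, -10, -6, -7, -6]
analyzers = [2, 5, 3, 4, 1, 6, 3, 4]

print(statistics.mean(non_analyzers))          # group mean change
print(statistics.mean(analyzers))
print(welch_t(analyzers, non_analyzers) > 0)   # analyzers gained more
```

A positive t here just says the analyzers' mean change exceeds the non-analyzers'; significance would require the full per-student data and a p-value from the t and Wilcoxon tests the poster actually ran.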

polly_mer:
Quote from: anakin on April 13, 2013, 12:20:11 pm

In the bigger picture: isn't it almost educational malpractice NOT to do something I KNOW increases my students' learning both proximately and ultimately?

I go back and forth on this because what I KNOW sometimes conflicts with other things I know. If you'd asked me about test corrections around this time last year, I'd have been with you on how this has to be done.

But I've got another year of experience, and now I have a big caveat: test correction and analysis works with students who want to learn but happened to fall short this time. My motivated-but-background-deficient students improve just as you've seen. They need additional time and practice to get up to speed, but they use it well.

However, test meta-analysis with points attached is an utter disaster with students who can do the work but refuse to do more than the bare minimum, counting on the professor not to let them fail. People who should have earned an A if they'd done due diligence (i.e., done all the practice work instead of treating everything as an isolated activity and doing only about 75% of it, while not reading the book or paying attention to lectures) decide that a D is good enough, earn a D or F on the test, scurry around to get back all the allowed points on the correction version, and repeat the cycle. I have no idea how someone who did all the work for the correction could then fail the same question on the next test, but I'm watching it happen on basic questions like "pick the definition for this term from this list of six vocabulary words" and anything that involves math of more than one plug-and-chug step. Some students have a huge capacity for mind-wiping after the points have been earned. Learning is not a goal for them.

ptarmigan:
Quote from: anakin on April 13, 2013, 12:20:11 pm

Quote from: sciencegrad on April 13, 2013, 11:46:53 am

Quote from: anakin on April 13, 2013, 11:29:41 am

Another thing you can do is offer the chance to earn back up to 25% of the points they missed if they complete a post-exam analysis. Although, mathematically, this is an insignificant portion of their final grade, even under the very best scenario, unless your exams are heavily weighted, it's been my experience that many students take this very seriously. In that post-exam analysis, they should both explain why their wrong answer was wrong (in 2-4 well-formed and complete sentences), and derive the correct answer. This exercise promotes metacognition and potentially provides really valuable feedback for you about what your students are actually thinking, rather than merely what you think they are thinking. For someone whose research partly centers on naive conceptions and metacognition, I had no idea what some of my students were thinking about X.

I want to chime in and say that some of the best classes I've taken did this. Not only does it make me feel better about my exam grade as a whole, but I have an incentive to look closely at the mistakes I made and to think more deeply about the questions.

Precisely. And usukprof (heck, everyone), you may want to consider it for your undergrad courses as well. I'd submit this is a practice of critical thinking that is worth doing no matter the discipline.

I have some actual data, if anyone is interested. These aren't published yet and are, of course, subject to replication. (Anyone interested in a little experiment?)

In fall ecology, I allowed students to do an optional post-exam analysis. I wondered whether students who completed the Exam 1 post-analysis would perform any differently on Exam 2.

So here's what happened on exam 2 between the two populations:

                         Exam 1 mean*   Exam 2 mean   Change
Non-analyzers  (N=89)        83.5           76.7        -6.8
Post-analyzers (N=19)        76.4           79.9        +3.5

*a priori score

Three things popped out at me. First, students who undertook the post-exam analysis on exam 1 did significantly better on exam 2 compared to their own exam 1 mean score. Second, students who did not do a post-exam 1 analysis did significantly worse on exam 2 as a population. (Two-tailed t and non-parametric Wilcoxon, given the small N of the analyzers.) If this holds up, it is a very robust result. Third, it wasn't just the better students who did the post-exam 1 analysis; in fact, the analyzers scored significantly lower than the non-analyzers on exam 1, but significantly higher than them on exam 2. (Exact same exams and formats were administered to both populations, which I only split a posteriori.) So that suggests that 1) there was no self-selection by better students to complete the analysis on the first exam, and 2) self-selection therefore does not explain the subsequent gain.

In the bigger picture: isn't it almost educational malpractice NOT to do something I KNOW increases my students' learning both proximately and ultimately?

HOF'd. I really appreciated this and need to remember to do this in my classes, where feasible.
