This post continues the series of posts about the inverted/flipped calculus class that I taught in the Fall. In the previous post, I described the theoretical framework for the design of this course: self-regulated learning, as formulated by Paul Pintrich. In this post, I want to get into some of the design details of how we (myself and my colleague Marcia Frobish, who also taught a flipped section of calculus) tried to build self-regulated learning into the course structure itself.

We said last time that self-regulated learning is marked by four distinct kinds of behavior:

- Self-regulating learners are active participants in the learning process.
- Self-regulating learners can, and do, monitor and control aspects of their cognition, motivation, and learning behaviors.
- Self-regulating learners have criteria against which they can judge whether their current learning status is sufficient or whether more learning needs to take place. (And then they take initiative to close the gap, if it exists, because of #2.)
- Self-regulating learners select learning activities to serve as mediators between their learning goals and their own personal environment and circumstances.

This is really the vision that I have for each one of my students – that they would eventually become this kind of learner, and that when they take a class with me, the class moves them incrementally toward being a self-regulated learner. In fact I’ve come to believe that **the end goal of all of higher education is to produce self-regulating learners**.

I also said last time that the inverted/flipped classroom is an ideal setting for working on self-regulated learning behaviors because of its emphasis on independent acquisition of new content prior to class. While I think the real magic of the flipped classroom takes place *in class*, when students are working together on difficult problem solving tasks, it’s in the *pre-class* phase of the flipped design that the best chance for developing self-regulation happens. So one of the main design goals of the course was to build a recurring form of pre-class activity that not only leads students through new content but also explicitly builds basic skills pertaining to self-regulated learning. That role was filled by what I call **Guided Practice**.

I’ve written about Guided Practice before and much of what I am about to write recapitulates that post. I’d say, however, that the idea of self-regulated learning was not on my radar screen back then, but it is now, and it’s changed the way I think about this kind of assignment.

In the course, we had about one Guided Practice assignment per section of our textbook. Each Guided Practice was assigned about 10 days before the day on which we’d scheduled work on that section (so students could work ahead if they wanted), and was always due no later than one hour before class time. Each Guided Practice consisted of five parts:

- An *overview* of the section (including any connections to previous sections);
- *Two lists of learning objectives* for the section, one labelled “Basic”, which the students are expected to master prior to class, the other labelled “Advanced”, which the students are expected to master during and after the class with additional practice;
- A list of *resources* for students to use to encounter the new material, usually a combination of screencasts and readings from the book;
- A collection of *exercises* that help students practice with the ideas from the new section and master the Basic learning objectives;
- A link to a Google form where students submitted their work.

Here is an example of Guided Practice that I left on the course website if you want to see what it’s really like. The entire collection of Guided Practice assignments is available at the GitHub repository for the course.

So, what makes this structure for Guided Practice useful for developing self-regulated learning skills?

- **Students have to be active participants** in the learning process in order to complete the assignment. There is some passive listening going on in watching the screencasts, but soon the rubber meets the road in the sense that students have to *do something* with the information from the viewing and reading through the exercises.
- Many of the exercises ask students to **do the same thing in different ways**. In the example I linked, students have to estimate the distance traveled by the runner using a basic rectangle sum, and then calculate it exactly using an antiderivative. This sets students up in a situation where they might detect differences in the outcomes of their work, which in turn gives them an opportunity to pay attention to (= regulate) what they are doing. Not all students take this opportunity, of course, but it’s hard not to notice when you get wildly different answers for the same computation done two different ways. Having one’s B.S. detector go off when this happens is a very basic form of self-regulation.
- With the learning objectives clearly stated, students are given **a clear set of criteria** for what they need to know. With the exercises aimed right at the learning objectives, they get **a clear set of information** about whether they are meeting those objectives.
- The exercises given here are made up mainly by me thinking about the sorts of activities that I, as an expert learner, would choose to do if *I* were learning calculus. In general, the entire Guided Practice is sort of the script that an expert learner would follow if put into the same situation as the student. By following that script, students build up the basic skills and behaviors that experts use until those behaviors become habitual.

These Guided Practice assignments were graded on a scale of 0–2 on the basis of completeness and effort. Mathematical correctness was **not** part of the grading criteria. I wanted students to get the message that when you are first learning a subject, it’s OK – not just OK but *inevitable* – that you will make mistakes and misunderstand concepts. It takes concerted work over a long period with other people to finally iron those misconceptions out, and I don’t want to penalize students for making them early on. Indeed, I need to know what students don’t understand about a new subject to know how to calibrate the class time on that subject. The only way to lose credit on a Guided Practice was to simply not do it, or do it late, or put down “I don’t understand” for your answers.

Does this form of Guided Practice actually produce measurable gains in self-regulated learning skills? That’s a sticky question, and I don’t have the answer for it yet. It’s hard to measure self-regulated learning. Pintrich developed the Motivated Strategies for Learning Questionnaire (MSLQ) to do this, and it’s a good instrument. But the one time I tried administering it to students (in another class) in a pretest/posttest fashion to measure gains in self-regulating behaviors, I encountered a ceiling effect. That is, students rated themselves so highly as self-regulating learners on the pretest – almost certainly overrating themselves – that the posttest had nowhere to go, so the “gains” were minimal. I think this overrating happens with many student populations. I’m currently mapping out a study that, if approved, will run this fall: a largish population of calculus students will take the MSLQ and the Calculus Concept Inventory side-by-side, to see generally what data come out of it, and particularly whether there are any big differences among the flipped sections, the traditional sections, and some of the sections that are kind of in between.

I do think that Guided Practice helped. I definitely saw a change in the kinds of questions the students were asking by the end of the class – not “Can you tell me what the answer is?” but “Can you recommend a good place to get more information about this problem?”

And if you’re wondering if students actually did the work, the answer is yes. In both my sections of this class, there were 24 Guided Practices given for a total of 48 points. The median scores in the two sections were 44 and 46 points. So most of the time, most students were doing nearly all the Guided Practices. I didn’t need any coercive measures in place to make sure students did the work – they did it because it helped, and I think primarily because it was *doable*. There was enough challenge to make it interesting, enough structure to give adequate guidance, and very low risk.

One last thing: If you’re paying very close attention, you will notice that my formulation of learning objectives on Guided Practice is different than it used to be. Whereas I used to have just a single list of objectives, I now have two, the “Basic” and the “Advanced”. What’s the deal with this? I’ll explain in the next post.

*Image: “Maze Puzzle”, http://www.flickr.com/photos/61423903@N06/*