Carly Haddon (right), the creator of Pierce College’s student-data dashboards, talks with Rajesh Lal, a math professor, during a training session for faculty members. (Photo: Matthew Ryan Williams for The Chronicle)
When Pierce College, in Washington State, redesigned its precollege-level math courses in 2013, administrators asked students what had helped them succeed. The response was surprising: What mattered most, the students said, was which instructor you were assigned.
Tom Broxson, the community college’s dean of natural sciences and math, looked at the data and found that the students were right: In one Algebra 2 section, only 30 percent of students earned a 2.0 or better, the required minimum for moving on to college-level math. In another section of Algebra 2 taught by a different instructor, 95 percent of the students earned a 2.0 or better. Administrators found similar discrepancies in many other courses.
“It was a crapshoot for the students,” Mr. Broxson says. “If they didn’t know who to take, their chances for success could vary dramatically.”
It was also a problem for the college, which serves more than 20,000 students at two campuses about 30 minutes south of Seattle and on a local military base, as well as online. In 2012, Pierce had signed on with Achieving the Dream, a national effort to improve college-completion rates. The grading variances were thwarting the college’s progress toward those goals.
The frustration over the grading discrepancies set in motion an experiment that led to significantly higher college-completion rates at Pierce, and is earning the college national attention.
For the past three years, the college’s institutional-research office has created data dashboards that show course-completion rates for the classes of every Pierce instructor.
The dashboards allow instructors to see and compare, among other things, how black and Hispanic students are doing in their classes, how women are doing, and how first-generation college students and Pell Grant recipients are doing. And each instructor can look at those same data for every other course, and every other instructor, at the college.
“You can look at anybody who has taught at Pierce in the last 10 years,” Mr. Broxson says. “It’s like an Autotrader menu.”
An instructor who spends a few minutes looking at the dashboards might find, for example, that when he and his colleague across the hall each teach the same course, African-American men are 20 percent more likely to successfully complete the course taught by his colleague.
One of the reports most viewed by faculty members online is the “Successful Course Completion” dashboard, which examines how well students are doing in a course overall or in individual classes. One version lists average course-completion rates for all instructors of English 101, a course required of all students, from 2013 to 2017.
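The computation behind such a view is easy to picture. Below is a minimal sketch in Python with pandas, assuming a hypothetical enrollment table with one row per student per class; the file name, column names, and course code are invented for illustration and are not Pierce’s actual schema:

```python
import pandas as pd

# Hypothetical enrollment records, one row per student per class.
# The file name, column names, and course code below are invented
# for illustration; Pierce's actual schema is not public.
enrollments = pd.read_csv("enrollments.csv")
# columns: student_id, course, instructor, grade, race_ethnicity, pell_eligible

# A grade of 2.0 or better counts as successful completion, the
# threshold the article cites for moving on to college-level work.
enrollments["completed"] = enrollments["grade"] >= 2.0

# Average completion rate for every instructor of one course.
eng101 = enrollments[enrollments["course"] == "ENGL 101"]
by_instructor = eng101.groupby("instructor")["completed"].mean().sort_values()

# The same rates sliced by a demographic dimension, the kind of view
# the dashboards expose for every course and every instructor.
by_group = (
    eng101.groupby(["instructor", "race_ethnicity"])["completed"]
    .mean()
    .unstack("race_ethnicity")
)

print(by_instructor)
print(by_group)
```

Sorting the per-instructor rates makes the kind of section-to-section spread Mr. Broxson describes, 30 percent in one Algebra 2 section versus 95 percent in another, immediately visible.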
Offering such information is only the first step; getting often-skeptical faculty members to act on it is next. Pierce has been able to convince professors of the data’s value and to start conversations about how to use the dashboards to improve teaching.
“Teaching has always been somewhat of a solitary role — you have your class and your students, and everything that happens in your class may seem normal,” says Matthew Campbell, vice president of learning and student success at Pierce’s Puyallup campus. “This is a way to go in, and with a few clicks, instantly have useful data that tells you something about what’s going on in your own class and across the institution.”
Since introducing the completion dashboard, the college has added dozens of others, including a panel that shows how an instructor’s students go on to perform in the next course in a sequence, a way to identify the instructors who are helping students learn and acquire skills, as opposed to those who may simply be inflating grades.
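In spirit, that panel joins each student’s grade in one course to the same student’s result in the follow-on course. Here is a hedged sketch of the idea, again using the invented table, with placeholder course names standing in for a real two-course sequence:

```python
import pandas as pd

# Same invented enrollment table as the sketch above; the course names
# below are placeholders for a real two-course sequence.
enrollments = pd.read_csv("enrollments.csv")
first = enrollments[enrollments["course"] == "MATH 98"]   # initial course
nxt = enrollments[enrollments["course"] == "MATH 141"]    # subsequent course

# Pair each student's initial-course grade and instructor with the grade
# the same student later earned in the subsequent course.
paired = first.merge(
    nxt[["student_id", "grade"]], on="student_id", suffixes=("_first", "_next")
)

# For each initial-course instructor: the average grade awarded versus how
# those students fared downstream. Generous grades paired with weak
# downstream results is the grade-inflation signal the panel surfaces.
signal = paired.groupby("instructor").agg(
    grade_awarded=("grade_first", "mean"),
    grade_in_next_course=("grade_next", "mean"),
    students=("student_id", "count"),
)
print(signal.sort_values("grade_awarded", ascending=False))
```

A high average grade awarded alongside weak performance in the next course flags possible inflation; the reverse pattern suggests tough grading that reflects real preparation.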
Lee West, an assistant professor of chemistry, says he and his colleagues in the chemistry department dig into the dashboards — and one another’s grading practices — at staff meetings.
“Amazingly, people don’t get upset — we have a civilized conversation about it,” Mr. West says. “If somebody is giving abnormally high or low grades, that isn’t necessarily a problem. We can look at how the students perform in the subsequent class.”
Mr. West says that while full-time chemistry instructors grade in a tight range, the grading practices of adjunct faculty vary wildly, and their grades tend to be higher. One possible explanation for the higher grades: Adjunct faculty members are subject to more student-satisfaction assessments than full-time faculty are. Adjuncts, who make up nearly two-thirds of the roughly 380 instructors at Pierce, have less job security, and student feedback is one factor in determining which adjuncts are retained.
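That contrast, a tight full-time range against wide adjunct variation, is easy to check in data shaped like the invented table above, assuming it also carried an employment-status flag for each instructor; a hypothetical two-line check:

```python
import pandas as pd

# Same invented enrollment table as the sketches above, here assumed to
# also carry an employment_status column ("full-time" or "adjunct").
enrollments = pd.read_csv("enrollments.csv")

# Mean and spread of awarded grades by employment status; a higher mean
# and wider spread for adjuncts would match the pattern Mr. West describes.
print(enrollments.groupby("employment_status")["grade"].agg(["mean", "std"]))
```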
The dashboards provide evidence of the grading disparity, Mr. West says, which makes it easier to initiate a difficult conversation.
“We had a feeling that was happening, but it was just a gut feeling,” he says. “We would have felt really nervous talking about it in the absence of quantitative data.”
The dashboards can encourage experimentation in the classroom by allowing instructors to see the impact of new approaches. Carly Haddon, an analyst in Pierce’s institutional-research office and the creator of the dashboards, jokes that they have expanded Pierce’s institutional-research staff from three to more than 300 (more than 200 instructors and another 130 administrators now have access to the dashboards).
“The administration has been looking at this data for years,” Ms. Haddon says. “You can’t expect change unless you give the data to the people who can make that change.”
Melonie Rasmussen, a math professor, was surprised to see in the dashboards that large numbers of her students were relatively low-income, and were working and raising families. The dashboards don’t provide information linked to individual students, but Ms. Rasmussen began surveying students to learn more about what they were juggling: Do you work? How many credits are you taking? Can you study at night? What’s a reasonable turnaround time for homework?
One student replied that she typically worked on her homework from 10:30 p.m. to 2 a.m., after her oldest child went to bed. Ms. Rasmussen took action. Instead of requiring students to turn in assignments within a day or two — her custom before the dashboards — she gave them more time and typically included a weekend.
“I gained more sympathy for what students need to be successful here,” she says. “I was literally making them choose between working, feeding their kids, and spending time with their kids. There’s no need to have homework done every day when dealing with this population. I can wait for the weekend.”
That kind of adjustment is helping more students succeed. In 2010, fewer than 19 percent of Pierce students earned an associate degree or certificate within three years. By 2016, that number had risen to 31.4 percent. The college hopes to hit 45 percent by 2020.
In February, Pierce was one of two community colleges in the country (the other was Miami Dade College) to receive the Leah Meyer Austin Award, handed out annually by Achieving the Dream to colleges that are making changes that lead to measurable improvement in student outcomes.
Michele Johnson, Pierce’s chancellor, says the dashboards have been a “game changer” for the college as it works toward 2020 goals to improve graduation rates and reduce racial achievement gaps.
“We’re getting there,” she says, “but we have a long way to go.”
More than 95 percent of full-time instructors now have access to the dashboards, according to Ms. Haddon. And while there was plenty of early suspicion about whether administrators would use the dashboards in a punitive way, most of those concerns have faded, faculty members say.
Now most of the raised eyebrows occur when administrators and institutional-research staff go on the road — at conferences or in presentations to other colleges — to explain how the dashboards work.
When they describe how the dashboards show, by name, the teaching outcomes of every instructor at the college, what follows is “a big gasp that sucks all the oxygen out of the room,” Mr. Campbell says.
“How do you do it?” administrators at other colleges sometimes ask.
“There’s a feeling among other colleges that, ‘We couldn’t turn on that switch,’” Mr. Campbell says. “But we didn’t just flip the switch either. It took years of relationship building, and trust building.”
Ms. Johnson, who’s been at Pierce for 40 years, including 13 as chancellor, says Pierce began using data to improve outcomes for students in 2010, after she and the college’s board attended the Governance Institute for Student Success, a program offered by the Association of Community College Trustees that emphasizes data-informed governance and decision making.
“We’d always been about access and an open door, but we could see that we weren’t getting the level of success that we wanted,” Ms. Johnson says.
The college joined Achieving the Dream in 2012 and began taking an even deeper look at its data. Mr. Broxson, the dean, says he addressed a crowd of 600 people that year, primarily Pierce instructors, and detailed outcomes for underrepresented minority students and first-generation college students that were so poor that they suggested the college was actually exacerbating gaps between the haves and have-nots.
Instructors were disturbed to hear that the college was failing many of its students, but most didn’t see how they fit personally into the goal of turning things around, Mr. Broxson says. They continued to assume that in their own classes, the students who deserved to succeed were doing so.
“It kept me up at night that a lot of faculty couldn’t see themselves in the work,” Mr. Broxson says. “They couldn’t make that connection.”
The breakthrough came in 2014, when the institutional-research department shared an early version of the dashboards with the math department as it set out to redesign precollege math. The dashboards showed that while some instructors were helping nearly all students succeed, others were escorting most of their students to failure.
Instructors in other departments heard about the dashboards and wanted a look, especially as a collegewide review of courses and pedagogy began in 2015.
“Before there was frankly a conscious decision about this, people wanted to see the dashboard, and boom, out it came,” Ms. Johnson says.
Administrators emphasized from the beginning that the dashboards were a tool for improvement, not a way to punish lagging instructors.
And they acknowledged that the data were only a starting point. Erik Gimness, Pierce’s director of institutional research, likens the dashboards to metal detectors: A troubling or promising data point is merely an indicator — you have to dig into it to find its meaning.
“The course-level data is that beeping noise you hear on the beach,” Mr. Broxson says. “Then you have to dig in and do a qualitative assessment — is this a bottle cap or a gold coin?”
Before instructors or administrators get access to the dashboards, they must go through an hourlong training with Ms. Haddon. She wears a “Data to the People” shirt to help lighten the mood and make the trainings fun, but the college has a serious goal: It wants to make sure instructors know how to accurately interpret the data, and understand that the information should not be shared publicly.
The college’s commitment to the dashboards does come at a financial cost. Ms. Johnson says the college has cut back in other areas to put more money toward the data effort, including trimming lightly used Saturday-morning hours at libraries, and recently spent $100,000 on software to make it easier to produce the dashboards.
The college is also tying some professional development to projects that use the dashboards. One program offers a $2,000 increase in salary if instructors use student data in a yearlong classroom-oriented project.
The incentives in the professional-development program have increased salaries at Pierce by a total of more than $300,000 since 2012. Ms. Johnson says the investment is worth it, if it encourages instructors to not only look at the data but to take action.
“If all we have is data, and we’re not asking what it means, what’s the point?” Ms. Johnson says. “And if the analysis doesn’t lead to doing something, what good is it?”
Ben Gose is a freelance journalist and a regular contributor to The Chronicle of Higher Education. He was a senior editor at The Chronicle from 1994 to 2002.