The ability to digest quantitative information is crucial to everyday life, whether the purpose is trivial (following baseball) or profound (participating in democracy).
Yet college students don't always get exposure to activities that develop this ability, according to their answers to a new set of questions on this year's National Survey of Student Engagement, which was released on Thursday.
Since 2000, Nessie, as the survey is known, has collected wide-ranging data to help colleges develop effective educational practices and promote engagement. Students are asked, for instance, how much time they spend studying, how often they have discussions with people of a different race or ethnicity, and how they interact with their professors and peers.
This year the researchers, based at Indiana University at Bloomington, also explored quantitative reasoning among 335,000 students at 568 American colleges.
About 38 percent of first-year students surveyed and 44 percent of seniors said they had often or very often made use of numerical information to "examine a real-world problem or issue." Similar shares—37 percent of first-year students and 43 percent of seniors—said they had often or very often evaluated other people's conclusions that were drawn from numerical information.
The findings revealed differences according to gender and field of study. Unsurprisingly, seniors in science, technology, engineering, and mathematics fields did more quantitative reasoning than did students in other disciplines. Among students in non-STEM majors, men reported using quantitative reasoning more frequently than did women.
Disparities are to be expected among different fields of study, said Alexander C. McCormick, director of the survey. But colleges should be vigilant about how frequently their students use quantitative reasoning in their coursework, he said: "These are vital skills, not just in the workplace, but to be an effective, competent citizen." If some students are reporting much lower levels of quantitative reasoning than others, he said, those findings "are certainly worthy of attention."
New Measures of Learning
This year's Nessie results reflect the first substantial modification to the widely used survey since it was introduced, in 2000. In addition to new questions, researchers rephrased existing questions and reorganized and renamed the composite categories the survey uses to analyze students' responses.
In recent years, critics have faulted the researchers' approach. Nessie's questions were too vague for students' answers to be meaningful, critics said, and the composite categories—formerly known as "benchmarks"—carried high rates of measurement error and overlapped with one another.
One newly constituted category is "effective learning strategies," which reflects how often students report having identified key information in readings, reviewed notes after class, and summarized course material. Such activities, the authors note, bolster learning and retention. About two-thirds of freshmen and seniors surveyed reported that they often or very often reviewed notes after class.
A lukewarm level of academic effort was also reflected in other findings. Freshmen reported spending, on average, 14 hours a week preparing for class, including studying, reading, writing, and doing homework or lab work. For seniors, the average was 15 hours. Compared with students today, those in 1961 studied, on average, 10 more hours a week, two economists at University of California campuses found in 2010, using Nessie and other survey data.
How Much Time Do Students Spend on Schoolwork?

| Field of study | Preparing for class (hours per week) | Reading (hours per week) | Pages of writing assigned* |
| --- | --- | --- | --- |
| Arts and humanities | 16 | 8 | 80 |
| Biological sciences, agriculture, natural resources | 16 | 7 | 66 |
| Physical sciences, mathematics, computer science | 17 | 6 | 58 |
| Communications, media, and public relations | 12 | 6 | 81 |

\* Based on reported number of assigned papers of various lengths

Source: National Survey of Student Engagement
One exception to that pattern of modest effort in this year's Nessie survey was students who took all of their courses online. They spent slightly more time studying and reading than did those who took no online classes. And online-only seniors were assigned substantially more writing over the course of the year, an average of 107 pages, compared with 75 pages for students who took courses only in person. Online students also had more favorable views of the quality of their interactions with a range of people—faculty members, academic advisers, student-services staff members—who had a hand in their learning.
On a more encouraging note, most students of all types reported participating in what Nessie's researchers characterize as "high-impact practices." Those include special programs like learning communities, service-learning projects, research with faculty members, internships, study abroad, and capstone projects. Such opportunities, the authors argue, "can be life-changing."
Four out of five seniors reported participating in at least two high-impact practices, one during their first year and the other in their major. Service-learning projects were the most common, with 60 percent of seniors taking part, while study abroad remained rare, with 13-percent participation.
Colleges must examine whether existing services and activities meant to improve students' performance and satisfaction are as effective for those from nontraditional backgrounds as they are for traditional, residential learners, the authors urge.
Older students, new part-time or transfer students, military veterans, and students who take all of their courses online all described their campuses as less supportive. Compared with so-called traditional students, they reported less-favorable views of academic support programs, campus activities, and opportunities to interact with different kinds of people, among other items.
Nessie researchers also suggested that colleges could do more to support students in the STEM disciplines.
Using data from the 2012 Beginning College Survey of Student Engagement, the authors analyzed first-year students at 71 institutions who started off intending to major in a STEM field. Over the course of that year, researchers found, the makeup of STEM majors changed substantially: For every 100 first-year students who planned to major in a STEM field, roughly 25 had switched to a non-STEM major by the spring of their freshman year. Meanwhile, a roughly equal number of students who did not initially intend to major in a STEM field joined the group.
The analysis may provide more fodder for those concerned with attracting and retaining women in STEM fields. Male students were three times as likely as females to be "joiners," the report said, while female students were twice as likely to be "leavers." The percentage of first-generation students in STEM majors had dropped as well by the end of the year.
While the addition of the joiners meant that the total number of students in STEM majors remained roughly constant, Nessie researchers said STEM faculty members should make sure they're providing proper support to all students interested in those departments.
In addition to highlighting patterns like those, the authors try to provide practical examples of how colleges can use their survey data. This year's report includes short case studies of four institutions and one university system that have used their data in different ways: at Kenyon College, to spark candid discussions about the ideal liberal-arts education; at Pace University, to grapple with sophomore retention, or "sophomore slump"; and at the Catholic University of America, to improve a first-year-experience program.
The survey is administered by the Indiana University Center for Postsecondary Research, sponsored by the Carnegie Foundation for the Advancement of Teaching, and paid for by participating colleges. This year's report, "A Fresh Look at Student Engagement," is available online from the National Survey of Student Engagement.