Reduce the Technology, Rescue Your Job


November 09, 2009

For most of this decade, professors embraced the pedagogy of engagement, wooing students via technology and ignoring the costs because traditional methods, from textbooks to lectures, purportedly bored students who multitasked in the wireless classroom.

Now many state institutions are facing huge budget cuts in the worst recession since the Great Depression. In Iowa, we began our fiscal year with a 15-percent cut and were informed in October that we have to trim an additional 10 percent because of state shortfalls. Other states have it as bad or worse. Budget analysts in Louisiana see money for higher education declining by nearly 60 percent in the next two years. Gov. David A. Paterson of New York, facing a $3-billion deficit, ordered state agencies to slash their budgets by 11 percent, in effect reducing support for the State University of New York by $90-million and for the City University of New York by $53-million. California is coping with a $60-billion budget shortfall resulting in 20-percent reductions for major universities in addition to furloughs and tuition increases.

Those facts alone merit an immediate technological and curricular assessment, or else hundreds more professors and staff members could lose their jobs in the coming weeks and months. You may lose your job.

In making decisions about who goes and who stays, administrators typically evaluate which positions and units are marginal, useful, or essential. They fire or furlough on that basis. The irony is that we seldom assess technology in that manner, ascertaining which devices, applications, and tech courses are marginal, useful, or essential. In a sense, you cannot blame the budget cutters for overlooking the implements of our demise. Technology, interwoven in nearly everything we do in academe, has become an invisible, autonomous system that is, at times, so convenient we cannot recall how we ever functioned without it.

Lessons in e-history. The bull market of 2005-7 treated higher education well. Benefactors and corporations donated billions, and our foundations flourished. Colleges and universities invested heavily in virtual worlds, student-response systems, and social networks without considering their escalating costs. After all, our learners deserved all the digital acumen we could divine so as to prepare them for those fabled technology jobs yet to be invented.

Many of us believed the Silicon Valley hype. Better still, we could raise fees and tuition to cover budget overruns because students had easy access to loans.

It was the best of nonlinear times. We expanded curricula, creating new courses to accompany the gadgets that students brought with them to class under the untested hypothesis that what entertained them would also engage them. For example, a record number of colleges and universities—254 in 37 states and the District of Columbia—offer courses and degrees in video-game programming and design, according to the Entertainment Software Association.

Campus administrators were often willing accomplices to those curricular efforts, which resulted in more credit hours than necessary for technology-related degrees and more sequences than ever within those degrees. The Chronicle has profiled many such courses, among them "eCommerce/eBay" and "Digital Editing: From Breaking News to Tweets." They proliferated to such an extent that the satirist Robert Lanham conjured a mock course called "Writing for Nonreaders in the Postprint Era."

Campus libraries jumped on the bandwidth bandwagon, too. Once the body and soul of the university, libraries and their archives divested paper holdings in return for pricey data feeds of digital journals and e-books.

The pursuit of digital engagement became so acute that many business students found it easier to market imaginary products in virtual worlds than in real ones. Geology and horticulture students took field trips in that ethereal realm rather than explore and grow in inclement weather.

In the past, classroom engagement implied deep critical thinking and inspired commitment. Somehow that metamorphosed into convenience, which technology provides, for a fee. Centers of teaching excellence, once celebrated for curricular assessment and peer mentorship, increasingly began offering more workshops and seminars on technology, as if those centers were brand managers for companies like Linden Lab, creator of the virtual-life world Second Life, or Turning Technologies LLC, whose clickers number more than 14,000 on my campus alone.

The same digital phenomenon seemed to happen on every college campus, often with administrators celebrating technology the way they used to boost athletics programs, showcasing how "in touch" they were with innovation. Too few of them, however, were monitoring costs. And why bother? Professors told them that clickers, social networks, and virtual worlds were "free" (at least initially). Then purchasing departments started receiving invoices for site licenses, application updates, bandwidth expansion, state-of-the-art multimedia labs, and upkeep and replacement of equipment.

On some campuses, departments that invested in digital engagement never publicized the fees they were spending to rent virtual land, allowing faculty members to use institutional credit cards to equip and clothe avatars and their cartoon environs. Worse, users ignored terms of service that shifted liability to institutions or caused conflicts associated with online harassment and student privacy. Many users simply clicked "I agree," participating in a virtual world that not only defied gravity (avatars can fly) but that also defied common sense.

All of that technology cost money, passed on to students, in a market-driven economy that proved as illusory as a house of pixels. Now those bills have come due, and it's the avatar, the clicker, the tweeter—or you—that is going to go.

Lessons in e-future. I'm not being nostalgic in raising these matters. Those who tweeteth their demurs before finishing this article are being technostalgic, believing that we can continue to afford the costs and pedagogies spawned by digital devices as if the subprime and student-loan scandals never happened.

Neither am I hypothesizing here. I cannot speak for my entire university, but I can attest that our school of journalism and communication considered the price of digital and curricular expansion, reduced both or found external means to finance them, and so has been able to absorb reversions and budget cuts—with more to come—while maintaining essential technologies and reasonable workloads.

We had a supportive dean who encouraged us to face some hard truths. We did and decided we wanted to do the following:

• Instill in learners a commitment to make a difference in society, and do that through the fundamentals of teaching excellence—preparation, organization, and mastery of the subject matter.

• Negotiate with vendors for essential technology, such as statistical-analysis programs and videography, and introduce those applications and hardware into existing courses, rather than create new or experimental ones.

• Use seminar or workshop modules for curricular innovation, particularly with timely or topical subjects (often the case with technology), rather than propose new courses that need to be scheduled and staffed on a regular basis.

• Streamline curricula to maintain workloads by eliminating low-enrolled or potentially duplicative courses and by ending emphases that double as "silos." (We deleted emphases in public relations, print, electronic media, and visual and science communication, inasmuch as they weren't acknowledged in official journalism degrees anyway.)

• Revise elective courses so that media specialties are taught across those former emphases, giving students a sense of the integrated media world. (Incidentally, intra- and interdisciplinary course combinations can work in any department, not only journalism.)

• Reserve smaller laboratory classes for majors only, to reduce the number of extra sections.

• Funnel nonmajors into larger elective classes, preventing them from claiming seats in required courses so as to safeguard degree progress for our majors.

• Look to corporate benefactors rather than our budgets to pay for the latest communications technology, providing paid internships for our majors.

• Ask professors to do academic advising, spending more face time rather than Facebook time with students so as to ensure a good retention rate. (Professors who resist advising can teach an additional class.)

If additional budget cuts materialize, we may have to streamline again, using technological convergence as the impetus to meld existing courses rather than fabricate new ones. Our success, so far, is a credit to professors and staff members who put the school's interests above their own.

I challenge anyone objecting to these arguments to look the secretaries, janitors, adjuncts, advisers, and professors of eliminated programs in the eye and say that avatars, clickers, social networks, and tweets—and the pedagogies, IT expenses, and teaching centers supporting them—are more important than feeding their families. To believe we can afford both indicates how incapable many of us are of making the difficult choices that the times require.

Michael J. Bugeja directs the Greenlee School of Journalism and Communication at Iowa State University. He is the author of "Interpersonal Divide: The Search for Community in a Technological Age," from Oxford University Press.