Today we are sending a growing number of students to study abroad. We are setting up branch campuses in other countries. We are seeking foreign students to fill our seats (and budgets). So why am I worried that the internationalization of our curricula has stalled?
We often use the terms “international” and “global” interchangeably, but they should not be confused. Crudely, “global” means “concerning the whole world.” Global phenomena are those not limited to particular places. Most important, they affect the entire world—climate change is probably the most obvious example. “International” means just what the compound word implies—something that transcends the nation-state, existing or occurring across borders. Trade, for example.
For education, globalizing the curriculum means, at the very least, exposing students to the range of global phenomena in the present world and historically. It also means exploring the ways in which those phenomena affect just about everything we do nationally. There are exciting possibilities for reconceptualizing curricula globally, but there aren’t many good examples.
One of the most ambitious approaches I have encountered was put forward last year by four presidents of vastly different liberal-arts institutions from around the world. In the Association of American Colleges and Universities’ Liberal Education, they called for “an education for the stewardship of the global commons.” “An appropriate university education for everyone, not just a privileged elite, must prepare women and men for participation in these [global] cultures and this commons,” the group said. “The purpose of a 21st-century education is to produce graduates who recognize themselves to be of the world and who also assume responsibility for the world.”
The presidents proposed a curriculum that would require students to acquire a range of “literacies,” “skills,” and “dispositions,” including respect, vulnerability, hospitality, compassion, agency, agility, fairness, service, and leadership. That is a huge agenda.
I don’t think most institutions are prepared to make such a leap.
But we have been thinking globally for only a relatively short time—whereas we have been thinking internationally for at least the last 300 years of Western history. We should therefore be prepared to undertake the task of internationalizing the curriculum now.
The challenge is to transcend the underlying nationalism that inspired the growth of the modern university starting in the 18th century. At one level, that simply requires us to understand the extent to which phenomena are transacted across borders, or take place simultaneously in different places. But, at a more profound level, internationalizing demands that we think extra-nationally, see ourselves as involved in decisions and processes that cannot be understood on a purely national basis, even if we ordinarily have to act nationally and locally.
Perhaps the most compelling reason for encouraging undergraduates to think internationally is that it will enable them to situate their own lives in a wider world. My colleague Paul Krugman has noted, “The problem is that most of what a student is likely to read or hear about international economics is nonsense.” Therefore, he says, the most important thing students should learn is “how to detect that nonsense.” In other words, force them to develop intellectual BS detectors.
Most observers contend that the Cold War stimulated the creation of international programs in American universities beginning in the 1950s. It was partly a question of “knowing your enemy” and partly of knowing the terrain on which the war was being fought—mostly in the developing world. But the result was an influx of both government and philanthropic funds for the creation of a wide variety of international programs. The federal government was especially interested in supporting foreign-language programs, with an emphasis on non-Western languages. Both the government and private givers (particularly the Ford Foundation) were also concerned to support “area studies”—scholarship on the government, economy, and society of foreign nations and regions, again especially those behind the Iron Curtain and in what was called the Third World. Before World War II, the curricula of American higher education had been primarily oriented to our own continental island and the European nations from which most white Americans had emigrated.
With new support, the curriculum came to include more international content than traditional liberal-arts education had provided. Our universities also began to encourage American students to study abroad, and many colleges established campuses outside the United States. The Fulbright Program was founded in 1946, sending American undergraduates, graduate students, and faculty members for yearlong stays in other countries. By the 1960s, the trend was accelerated by the creation of the Peace Corps, which attracted American college graduates for life-changing experiences in formerly remote parts of the world, and then deposited them back on American campuses for graduate work. Private givers created fellowships and scholarships for foreign study, sending many young Americans abroad for postgraduate work. And the number of foreign students, mostly graduate students, enrolling in our universities expanded exponentially.
What had begun as an exercise in national-security preparedness became, over the decades, a program to internationalize the educational experience of American undergraduates. The study of Western civilization gave way either to world civilization or to particular non-Western civilizations. This trend reflected both the continuing stresses of the Cold War and the dynamics of postcolonialism, as the old European empires fell apart and new nations came into existence.
As a nation, Americans still traveled less and spoke fewer foreign languages than people in many other nations, but we were steadily becoming more cosmopolitan and less provincial. There had always been a tension, mainly hidden on campus, between education in the name of defense (against “the evil empire”) and the more positive message (“one world”) of internationalization, which was what most students perceived. That message was enhanced by the growing economic strength of the United States and by the rapidly expanding foreign and international business opportunities for students.
The problem, however, was that the positive impetus provided by the Cold War began to dissipate by the 1980s, and a reaction against internationalism began to rear its head. Attitudes that verged on xenophobia emerged during the culture wars of the late 1980s and early 1990s and reached a dangerous level after the terrorist attacks of September 11, 2001. For me, the most disturbing reaction was that of Lynne Cheney (the wife of the then vice president of the United States and the former chairwoman of the National Endowment for the Humanities), who, less than a month after 9/11, claimed that calling for more study of the world was akin to blaming the attack on the World Trade Center on a failure to understand Islam. She called for more study of “traditional” American history. Cheney was not alone in contending that liberal international programs would weaken America.
Just as important: While faculty members recommended that students take courses in foreign languages, area studies, and international relations, they built their own research centers instead of concentrating on the curriculum. Scholars did not spend the time and energy to reconceive the undergraduate curriculum along international lines, and when they did focus on education, it was on graduate training rather than on undergraduates. The result was that when government and foundation funding for area studies diminished or disappeared by the mid-1990s, the area-studies centers declined or crumbled, leaving only a small imprint on the undergraduate curriculum.
Yes, a larger proportion of faculty members than a generation ago are studying places and topics that are accurately described as “foreign” or “international.” Yes, the international programs that originated or grew during the Cold War continue to play a role (if diminished) on our campuses. But the impact on the curriculum of faculty research on other parts of the world has declined significantly.
That is why I worry that internationalization as an educational strategy has stalled.
Some of the underlying national problems that hamper teaching students to think in international terms have been slow to disappear. First among them is American parochialism. We have improved, but Americans still command too few foreign languages and live (and study) abroad too infrequently. Our news media pay too little attention to the world outside our borders, and we ourselves take too little interest in the rest of the world. That not only limits the knowledge and skills available to American faculty members and students, but, more important, shapes faculty attitudes. How essential do most arts-and-sciences professors consider the internationalization of the curriculum?
I do not know the answer, but I fear that in the social sciences, which during the Cold War were the bulwark of international studies, fewer and fewer are concerned. The reason seems clear. For at least a decade, social scientists have been moving in the direction of pure science, aspiring to establish high-level conceptual hypotheses through complex mathematical and logical processes. The rewards go to those who are most adept at the conceptualization and manipulation of data. In practice, that has meant that an economist or a political scientist finds it hard (or impossible) to gain tenure for knowledge of the Japanese economy or the Kenyan political system. Purporting to establish universal propositions is what gets you jobs and promotions. Young scholars understand that perfectly. They have fewer incentives to learn new languages, travel globally, or build research networks in the developing world. Who then will be the area-studies scholars and teachers of the next generation?
And the funding environment for international research continues to deteriorate. Federal programs such as Title VI and Title VIII have been cut back substantially. The Fulbright Program is funded at much lower levels—and has been redesigned to favor shorter stays abroad. The same is true in the private sector. For decades the Ford Foundation, until recently the country’s largest private philanthropic foundation, was the principal supporter of area-studies research. But Ford began to withdraw from that commitment in the early 1990s, and few other private sources support international research in our universities.
The net result has been that American universities have been forced to draw on their internal resources and their institutional fund-raising capacity to support international programs. Quite a few have responded to the challenge, but many others have lacked either the will or the capacity. On the research side, at least, we are witnessing a concentration of internationalization in a relatively small number of well-endowed institutions. The rich get richer, and the poor get poorer.
All American universities say they want to internationalize. Witness, as I mentioned, the vogue for setting up branch campuses abroad and recruiting foreign students. Some colleges stress study abroad, while others focus on research partnerships with institutions in different countries. I have no problem with this sort of diversity in approach. But it doesn’t, by itself, make a substantial change in the curriculum for our undergraduates in the arts and sciences. What might change it is reaffirming that the goals of international education are those of liberal education. Liberal learning is a broad and interactive approach to undergraduate education that prepares students for a future of active and responsible democratic citizenship, and for fulfilling lives, including an appetite for lifelong learning. I know that approach leaves out an important word: jobs.
Many individuals and institutions believe that vocational goals belong on any list of the ends of international education. Only a few years ago, in 2010, Martha J. Kanter, then the U.S. under secretary of education, told a conference of international educators, “The skills and knowledge acquired in international education are the same skills graduates need to succeed in the economy.” At the same meeting, Nancy Zimpher, president of the State University of New York system, contended that universities’ international work has to be done in the context of trade and immigration policy. I don’t disagree with either speaker—if what they mean is that the skills acquired in a liberal education will prove useful for lifelong employment. But if what they mean is that undergraduates should primarily learn skills that are immediately and specifically employment-related, I disagree strongly.
Indeed, I am not certain that even that defender of liberal learning, the Association of American Colleges and Universities, has its priorities clear here. In its most recent strategic plan (2013-17), the association asserts that liberal education “should be reclaimed and repositioned as providing Americans with a comparative global advantage in preparing for work, citizenship, and lifelong learning.” Let’s set aside the “comparative global advantage” language, although I object to the nationalist attitude. But what about “work, citizenship, and lifelong learning”?
To start, the order is wrong. More to the point, I would prioritize the skills and values that the four international educators noted in that Liberal Education article (respect, vulnerability, hospitality, compassion, agency, agility, fairness, service, and leadership). In these days of high unemployment, even for college graduates, universities feel compelled to stress the narrowest utilitarian goals of higher education—just like the Obama administration.
That is wrong, disappointing, and dangerous. I am in favor of nearly all the measures that the universities have taken to internationalize their undergraduate education. But discrete efforts, by themselves, are not enough. They do not necessarily accumulate to constitute an internationalized undergraduate education.
These programs are educationally useful only if they relate to one another in ways that are meaningful for the curriculum and for pedagogy. It is the curriculum itself that needs to change. We need majors, minors, sequences of courses, and better connections to the extracurriculum to bring the intellectual side of internationalism to life for students.
More important, each of the international aspects of an undergraduate’s learning experience should also contribute to his or her cognitive development. Most of our educational programs for undergraduates focus on content, as they should, but their long-term impact, if any, will be less in the material retained than in the habits of mind formed. Which takes us back to the skills and values of respect, vulnerability, etc.
Hannah Arendt put her finger on the problem when she criticized the “professional problem solvers” who left the university for government and think tanks in the 1960s. They had, she wrote, “lost their minds because they trusted the calculating power of their brains at the expense of the mind’s capacity for experience and its ability to learn from it.” John Dewey would have agreed.
We will have truly internationalized the undergraduate curriculum when our students develop the capacity to understand what it means to think internationally. That is a huge challenge.