More than five years ago, the then-White House science adviser, John H. Marburger III, asked researchers a seemingly simple question: Given the billions of tax dollars they get each year, why don’t they have good data on the value of what they produce?
That question may finally be getting answered.
Sparked by job-tracking requirements in the $787-billion economic-stimulus bill approved last year by Congress, the government’s major science-financing agencies have been working with universities to devise a way to bring scientific precision to the explanation of how their expenditures help the national economy.
The organizers of the effort, known as Star Metrics, reached a milestone this month by declaring success on what they consider a critical element: a low-cost way for universities to compile their records of federal grants, including the faculty members and students employed on them, so that the information can be fed automatically into a nationwide database.
Their method, tested at seven universities, took the institutions only about 12 to 15 hours of staff time to set up, and virtually no time to run after that, said Julia I. Lane, a program director at the National Science Foundation.
Users agree. “It was literally a nonissue in terms of the administrative burden,” said Susan Wyatt Sedwick, associate vice president for research and director of the Office of Sponsored Projects at the University of Texas at Austin, one of the participating institutions.
The idea, eventually, is to create a comprehensive system that relies largely on existing computerized databases, such as science-journal publications, patent registrations, venture-capital expenditures, and employment histories, to give policy makers a precise picture of what the more than $31-billion in annual federal spending on university-based research actually produces.
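Star Metrics has not published its data model, but the basic idea of joining existing records around federal awards can be sketched. The short Python example below is purely illustrative: the field names (award_id, researcher_id) and the sample records are hypothetical stand-ins, not drawn from the project itself.

```python
from collections import defaultdict

# Hypothetical grant-personnel records, as a university payroll or
# sponsored-projects system might export them.
grant_personnel = [
    {"award_id": "NSF-0001", "researcher_id": "R-17", "role": "graduate student"},
    {"award_id": "NSF-0001", "researcher_id": "R-03", "role": "principal investigator"},
]

# Hypothetical records from existing external databases.
publications = [{"researcher_id": "R-03", "title": "Example paper", "year": 2009}]
patents = [{"researcher_id": "R-17", "patent_no": "US0000000", "year": 2010}]


def outcomes_by_award(personnel, *sources):
    """Group externally recorded outcomes under each federal award,
    using the researcher identifier as the join key."""
    by_researcher = defaultdict(list)
    for source in sources:
        for record in source:
            by_researcher[record["researcher_id"]].append(record)

    report = defaultdict(list)
    for person in personnel:
        report[person["award_id"]].extend(by_researcher[person["researcher_id"]])
    return dict(report)


print(outcomes_by_award(grant_personnel, publications, patents))
```

The only institutional work in a scheme like this is supplying the grant-personnel list; the outcome records already exist elsewhere, which is consistent with the low setup burden the pilot universities reported.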
“There is huge pressure” from the White House and Congress, Ms. Lane said, “to show the value of investments.”
Benefits and Concerns for Institutions
Universities are largely cheering the project. For one thing, Ms. Sedwick said, the university's administrative burden alone might be reduced: current federal data requirements translate into more than two hours of work for each of the 160 or more grants that Austin received in the past year, upward of 320 hours in all.
And, said Tobin L. Smith, associate vice president for federal relations at the Association of American Universities, institutions will surely benefit from being able to tell taxpayers, with greater precision, what their money has produced. “It’s all in all a pretty good thing,” Mr. Smith said.
The organizers of Star Metrics are emphasizing to universities the voluntary and cooperative nature of the project, particularly its reliance on low-cost sources of existing data, Ms. Lane said.
The idea grew out of a 2005 speech by Mr. Marburger, then serving as director of the Office of Science and Technology Policy in the administration of President George W. Bush, to the American Association for the Advancement of Science. Mr. Marburger pointed out that systems for measuring the effectiveness of science spending were already nearly three decades old. A National Research Council report, he said, called them “often ill-suited for the purposes to which they have been employed.”
The effort took a more concrete form after the economic-stimulus measure, containing more than $15-billion in federal money for science research, required recipients to report back to the government on the jobs being created by their projects.
The resulting Star Metrics effort is not without concerns and uncertainties. For one thing, Ms. Lane said, the voluntary nature of the effort leaves its ultimate comprehensiveness unclear. The final level of detail is also uncertain at this stage, she said, with many questions remaining about how smoothly all the anticipated sources of data can be linked together.
Also, some colleges have objected in other contexts to the inclusion of personally identifiable data in nationwide research projects. In the case of Star Metrics, Ms. Lane said, all faculty members and students involved in federally sponsored research are expected to be assigned some form of identification number, to allow tracking of their outside activities such as journal publications, patents, and jobs.
And some had greeted Mr. Marburger’s original idea with suspicion, tied to their unhappiness with the Bush administration’s level of support for science spending. Mr. Marburger may largely be trying, with the new measures, “to make it impossible to assess the Bush record relative to past spending,” Sally T. Hillsman, executive officer of the American Sociological Association, wrote at the time.
Advocates of research spending also recognize the possible political risks of letting the government science budget be judged primarily for its job-producing qualities, even though they largely see that trade-off as positive.
New Incentives for Students
At the Polytechnic Institute of New York University, which was not one of the seven institutions that tested the data-collection system, the initiative raises hope of creating badly needed new incentives for students, said the institution’s president, Jerry M. Hultin.
Research students, including prospective new faculty members, are eager to apply their expertise to problems in the commercial marketplace, but they too often feel “boxed in” by measures of academic success tied to traditional pathways, such as journal-publication citations, Mr. Hultin said.
“This is an unboxing of young faculty,” he said, “and letting them head to some of the places they really want to go.”
In fact, one startup enterprise that originated from the institute, he said, could be part of the solution. The company, ChubbyBrain, is compiling a directory of private companies that contains detailed information in such areas as their financing sources, mergers and acquisitions, and customers.
That’s exactly the type of easily obtainable information that organizers of Star Metrics may hope to incorporate into their network, Mr. Hultin said. Ms. Lane agrees. “The notion here is to leverage existing data,” she said.
The assembled information could eventually form the basis for reports that reviewers, at agencies such as the NSF and the National Institutes of Health, consider when they decide which grant applications to approve, Ms. Lane said.
But that doesn’t mean the potential for job creation will be the only or even the primary measure for future grant applications, she said. Although the impetus from the economic-stimulus measure centered on job creation, Star Metrics is being designed to include the effects of federal science spending in three other broad areas, including the generation of basic scientific knowledge and improvements in long-term health and environmental conditions.
A classic example that the developers of Star Metrics keep in mind, Ms. Lane said, is that of Sergey M. Brin, a co-founder of Google Inc. His research at Stanford University, supported by a graduate fellowship from the National Science Foundation, led to the development, in a rented garage, of the world’s most popular Internet search engine.
“One grant,” she said, “doesn’t typically lead to one particular outcome.”