The “war on terror” has cost the United States something on the order of $4 trillion since 2001. The fighting in various countries has taken more than 200,000 lives.
Now, Neil F. Johnson, a professor of physics at the University of Miami, believes he’s found a mathematical formula that can both explain the course of the war and help predict future terrorist attacks.
In fact, Mr. Johnson said, a decade of testing has shown that the formula accurately models both the timing and the casualties in all manner of human strife—any time an asymmetrical power relationship exists between two sides.
“We predicted that all of these would follow that same rule, and they do,” he said, describing his research team’s findings, which cover 100,000 distinct events—as varied as armed conflict and crying babies. The work was described in a paper, “Simple Mathematical Law Benchmarks Human Confrontations,” that was published last month in Nature’s open-access journal Scientific Reports.
The study is intriguing, according to some who have reviewed it. But they say it’s not clear whether it’s reliable, useful, or based on sound science.
Rather, say experts in both the social and physical sciences, it’s another cautionary tale from the fast-growing practice of scientists trying to model and anticipate human behavior by crunching the vast volumes of computer data available to them. They’ve done that to analyze natural disasters, pandemics, rumors, criminal behavior—even people’s historical significance.
Scientists have always wanted to study such topics in depth, and now computers are giving them an entirely new way to do it. “When the data’s available,” said Russel E. Caflisch, a professor of mathematics at the University of California at Los Angeles, “then it actually is something that you can quantify rather than just have the conjecture.”
Meaning in Patterns
Popular raw materials for such work include search-engine results, home-buying records, and social-network relationships. Sociologists are jumping in, though many early practitioners are scientists who haven’t traditionally studied human activity, such as computer scientists and physicists.
Several researchers familiar with such work—in fields that also include math, statistics, and psychology—were reluctant to talk on the record about Mr. Johnson’s work, either because they didn’t want to criticize it directly or because they couldn’t be sure enough about the details.
But they described a world in which enthusiasm for mathematical analysis often far outpaces rigor. A key problem, they said, is that computers make it easy to find mathematical patterns in data, but a pattern by itself is not proof of anything unless the scientist can also offer a testable explanation for why the pattern might exist.
Examples of work that includes both a data-derived pattern and an explanation of an underlying human behavior that could account for it remain relatively rare.
One example is research by Matthew J. Salganik, a professor of sociology at Princeton University, and Duncan J. Watts, a principal researcher at Microsoft Research, who have studied music downloads to identify patterns in popularity. They have explained that users might initially rely on the opinions of others when buying new music, but that lower-quality music described in their tests as popular eventually loses sales to better-quality alternatives.
Two physics professors, Sidney Redner of Boston University and Mark E. Newman of the University of Michigan at Ann Arbor, have studied theories of why journal citation rates often show an exponential growth pattern. Their work has involved tracking a series of case studies to identify how citation rates can snowball.
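The “snowball” idea has a simple mechanical core: papers that already have citations are more likely to attract new ones. The sketch below is a generic toy model of that rich-get-richer process, often called preferential attachment; it is meant only to illustrate the mechanism, not to reconstruct the Redner or Newman analyses, and every number in it is an arbitrary choice.

```python
import random

random.seed(1)

# Start with ten papers, each holding a single citation.
citations = [1] * 10

# Each new citation picks a paper with probability proportional
# to its current citation count: the "rich get richer."
for _ in range(10_000):
    winner = random.choices(range(len(citations)), weights=citations)[0]
    citations[winner] += 1

# A handful of papers end up dominating the totals.
print(sorted(citations, reverse=True))
```

Even with identical starting conditions, a few papers pull far ahead simply because early luck compounds, which is the snowballing the two physicists set out to explain.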
A more widely known example is President Obama’s election campaign, which compiled large amounts of personal data on voters to form hypotheses about which messages would persuade, but then meticulously tested its ideas on small groups to be sure that the data-driven theories actually worked in real life.
A Forgiving Formula
The key finding in Mr. Johnson’s study, by contrast, is the mathematical pattern itself. For the timing of attacks, the expected interval before the next event is A times B raised to the power of negative C, where B is the number of the most recent event in a series, such as a terrorist attack; A is the time between the first events in the series; and C is an escalation exponent that ranges from zero to one.
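Put concretely, the formula behaves like the “progress curves” long used to describe learning by doing: each successive event tends to arrive sooner than the last. The sketch below assumes that form; the variable names and sample values are illustrative choices, not numbers drawn from the study’s data.

```python
# A toy version of the timing formula described above: the
# predicted interval before the next event is A * B**(-C), where
# A is the time between the first events in a series, B is the
# index of the most recent event, and C is an escalation exponent
# between zero and one. All values here are assumed for illustration.

def predicted_interval(first_interval: float, event_number: int,
                       escalation: float) -> float:
    """Expected time, in days, between this event and the next."""
    return first_interval * event_number ** (-escalation)

A = 30.0  # assume 30 days elapsed between the earliest events
C = 0.5   # assumed escalation rate; larger C means faster escalation

for B in range(1, 6):
    print(f"after event {B}: next event expected in "
          f"{predicted_interval(A, B, C):.1f} days")
```

With C near zero the predicted intervals barely change, while values near one compress them quickly; that flexibility is part of what lets the formula fit so many different data sets.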
In the summary of his report, Mr. Johnson makes ambitious claims about the value of that formula. He cites problems as varied as child-parent disputes, sexual violence against women, civil unrest, and acts of terror and cyberattacks, before asserting: “Our findings provide quantitative predictions concerning future attacks.” The discovery, he writes, represents “a tool to help detect common perpetrators and abnormal behaviors.”
But the study, while heavy on statistical analysis, lacks any real-world explanation of why its examples—computer hackers, Colombian drug traffickers, religious partisans in Northern Ireland, and newborn infants, among many others—might all behave in very similar ways.
Mr. Johnson acknowledged that his mathematical pattern allows for large uncertainties. Explaining it in statistical jargon, he said a chart showing the typical pattern of attacks or other events can have “very fat tails—it’s a black-swan type of distribution”—a reference to a surprise event that can be rationalized afterward.
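In plainer terms, a fat-tailed distribution is one in which events far larger than the typical one remain genuinely possible rather than vanishingly unlikely. The sketch below illustrates the idea with a power-law distribution, a standard fat-tailed example; the exponent and sample size are assumptions made for illustration, not values fitted in the paper.

```python
import random

random.seed(1)

def power_law_sample(alpha: float, x_min: float) -> float:
    """Draw one value from a power-law (Pareto) distribution with
    density proportional to x**(-alpha) for x >= x_min."""
    u = 1.0 - random.random()  # uniform in (0, 1], avoids zero
    return x_min * u ** (-1.0 / (alpha - 1.0))

# 100,000 hypothetical event sizes; alpha = 2.5 is an illustrative
# exponent, not a value taken from Mr. Johnson's analysis.
samples = sorted(power_law_sample(alpha=2.5, x_min=1.0)
                 for _ in range(100_000))

print(f"median size:  {samples[len(samples) // 2]:,.1f}")
print(f"largest size: {samples[-1]:,.1f}")
```

In a run like this, the largest of 100,000 draws routinely comes out hundreds or thousands of times larger than the median, which is the sense in which a single catastrophic event can sit inside an otherwise unremarkable statistical pattern.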
The formula is so forgiving, in fact, that it can absorb an anomaly such as the 3,000 deaths in the September 11 attacks by merely averaging them out over many years, according to Mr. Johnson. In general, he said, his findings suggest there will not be another attack that deadly in the next decade. But if it turns out that there is, he said, that also would show “a value of the work because that says that something absolutely fundamental has changed.”
‘Limitations of the Inquiry’
A co-author of the paper, John G. Horgan, a professor of security studies at the University of Massachusetts at Lowell, said he would defer to Mr. Johnson for public explanations of the study, though he said that both the article and its contributors acknowledge “the limitations of the inquiry.”
The study was supported by federal military and intelligence agencies, including $210,000 in grant money from the Office of Naval Research. The project’s program manager, Ivy Estabrooke, said her office understands the fundamental and preliminary nature of the study, and hopes to pursue it further.
“With future work, if these dynamics are found to be consistent and predictable,” she said in a written statement, “these mathematical models could assist naval forces in understanding and forecasting likely behavior of adversaries.”
Reuben Hersh, a professor emeritus of mathematics and statistics at the University of New Mexico, was one of the few outside experts willing to be quoted by name on Mr. Johnson’s work. He said the size and qualifications of Mr. Johnson’s research team and the large amounts of data it studied made the findings difficult to dismiss out of hand.
Yet, he said, “the claim is so broad and so pretentious that it seems preposterous.”
Mr. Johnson is holding firm. No other study compares to it, he said, “given the scope of the confrontations analyzed, together with the rigor of the statistical analysis that we employed and the simplicity of the findings.”
“We stand by all the statements in the paper, and every single claim made in the paper,” he said. “Any reader is free to download the data, repeat the analysis, and see the results themselves.”