Technology

2 New Threats Highlight Human-Factor Gaps in Cybersecurity Research

By Paul Basken | January 12, 2018
Kurt Rohloff, a computer scientist at the New Jersey Institute of Technology: “You’re never going to completely limit the ability of someone to steal information from a computer, but you can make it a heck of a lot harder.” (NJIT)

The latest cybersecurity scare is a big one.

Known as Meltdown and Spectre, the newly revealed security vulnerabilities expose major and longstanding flaws in virtually all computer chips, essentially giving hackers a widespread opportunity to steal data.

While these flaws represent hardware and software failures, they also highlight a larger struggle among university cybersecurity researchers to better incorporate an understanding of human behavior into their work.

In fact, said Kurt R. Rohloff, an associate professor of computer science at the New Jersey Institute of Technology who is working on defenses against Meltdown and Spectre, the cases show how badly the nation’s cybersecurity-research agenda still needs a better understanding of human factors.

In an interview with The Chronicle, Mr. Rohloff explained the basics of Meltdown and Spectre and described how academic structures and policies in areas such as tenure and promotion may hinder efforts to thwart computer-related crime. This interview has been edited for length and clarity.

Q. What are Meltdown and Spectre, and how do they work?

A. Modern software and processors are so sophisticated that they can basically make guesses about what the next computation is going to be and then run that computation before it’s actually called for. This is what’s known as “speculative execution,” and it helps computers run faster. The hacking opportunity comes when the chips begin performing tasks before the computer is ready for them. If one tried to run unauthorized code as a normal operation, it would be stopped by the processor. But these speculative-execution engines don’t have those guards.
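
For readers who want to see the shape of the problem in code, here is a minimal C sketch of the bounds-check-bypass pattern exploited by Spectre variant 1, modeled on the publicly disclosed example; the array names, sizes, and function name are illustrative.

```c
#include <stddef.h>
#include <stdint.h>

/* Illustrative globals: a small array guarded by a bounds check, and a large
 * "probe" array whose cache state an attacker can later measure. */
uint8_t array1[16];
size_t  array1_size = 16;
uint8_t array2[256 * 512];

void victim_function(size_t x) {
    /* Architecturally, an out-of-bounds x never gets past this check. But
     * while the comparison is still resolving, the processor may speculatively
     * run the body anyway. The speculative read of array1[x] then decides
     * which line of array2 is pulled into the cache, and that footprint
     * survives after the misprediction is rolled back, which is what an
     * attacker times to recover the out-of-bounds byte. */
    if (x < array1_size) {
        volatile uint8_t tmp = array2[array1[x] * 512];
        (void)tmp;
    }
}
```

Software mitigations, such as inserting a speculation barrier or masking the index after the bounds check, aim to keep the speculative path from ever touching attacker-chosen addresses.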

Q. Is there any way to have both the speed of speculative execution and the necessary security built in?

A. Right now, no. In the future, yes. It means waiting a few years and then buying new computers when they put out new hardware.

Q. Are we ever going to get past that kind of cycle in cybersecurity? Or will we just end up waiting a few years for a new computer that is not vulnerable to Meltdown and Spectre, then find it’s vulnerable to some new problem discovered by college students hashing out theories over beers?

A. That gets to the nature of cybersecurity and where the field is going. Right now the field looks like a game of gotcha, where somebody builds something and someone pokes a hole in it. And that’s been kind of the model since basically the 1960s. The push more recently, partially led by Darpa, has been this notion of “formal security analysis,” where you take a chip-set design or some software, study its properties, and then give what are known as formal guarantees, or mathematical proofs, that it’s resistant or protected from classes of attacks.
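
As a simplified, software-level illustration of that idea (not an example from the interview), here is a sketch of a C function with an ACSL-style contract of the kind that verification tools such as Frama-C can attempt to prove; the function name and contract are hypothetical.

```c
#include <stddef.h>

/*@ requires 0 <= i < len;
  @ requires \valid_read(buf + (0 .. len - 1));
  @ assigns \nothing;
  @ ensures \result == buf[i];
  @*/
int read_element(const int *buf, size_t len, size_t i)
{
    /* Given the precondition, a verifier can try to prove this read is in
     * bounds for every possible input, rather than relying on testing to
     * catch violations after the fact. */
    return buf[i];
}
```

Proving that an entire chip design resists whole classes of attacks is a far harder version of the same exercise: state the property precisely, then show mathematically that no execution can violate it.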

Q. OK, but what if somebody comes up with a new class of attack?

A. Yeah, that’s exactly the issue. It’s like a cat-and-mouse game. This is just the nature of cybersecurity.

Q. So that means there will never be secure computers?

A. That’s like saying there will never be safe drivers — there’s just a certain level of things you have to live with in your life. The challenge is: How can you go and protect yourself? Can you design safer roads and cars that have airbags and other kinds of inherent protections? You’re never going to completely limit the ability of someone to steal information from a computer, but you can make it a heck of a lot harder and learn from the lessons that we’ve experienced, and that’s the nature of cybersecurity right now.

Q. If that’s fundamentally true, then don’t we need a noncomputer component of research to help us live with that reality? And are researchers developing that?

A. Often the human is the weakest part. You’re getting at this philosophy of whether things should be permissive or restrictive as a rule — is the human someone to be trusted to know when to turn things on or off? Or is the human not to be trusted, because it’s usually the social-engineering attacks that are the most damaging, that are used to exfiltrate things like passwords and bank accounts?

Q. And are researchers doing that? As someone involved in computer security, are you talking with social scientists to figure this out together, or is it largely a world of computer scientists trying to figure out computer security?

A. It’s mostly computer scientists trying to do these things. There are definitely human-factors folks who are involved with this kind of stuff — folks who look at the nature of two-factor authentication, what kinds of biometrics make sense. But they’re not as common, and I wish there were a heck of a lot more.

Q. What does that tell you that needs to be done differently, especially as it concerns university work on cybersecurity?

A. That social science is a major part of this, and people in some sense give it short shrift. Cybersecurity comes from computer-science departments, which have not really spent a lot of time looking at human factors. The traditional soft-science departments seem to have had trouble with this, and I don’t know if it’s a top-down issue — where the people who control tenure and other things in this space don’t see the value of this type of research — or if the publishing opportunities aren’t set up for academics to succeed if they look at human-factors issues in security.

Q. So are you saying it’s the soft-science folks who don’t want to get involved, or is it the computer-science folks, or both?

A. It’s probably both. In the tenure process, there’s a certain set of hoops you have to jump through, about making sure you publish in such-and-such quality journals, and so many publications, and get so much funding. The funding for that kind of research, and the ability to get high-quality publications, is not there as it is for traditional computer science. And so there’s this kind of systemic bias against doing things like that if one wants to improve one’s chances for tenure.

I don’t know social scientists all that well, but I’d be shocked if their story was all that different.

Q. Is there anything anybody could do so that you did know social scientists a bit better? Like put you in the same cafeteria?

A. I’m active in the Darpa community, which is very big on these interdisciplinary programs. And I used to be involved in computational social sciences, and things like that. And these kinds of funding-agency-directed, need-based projects seem to be a great way of doing those kinds of things. But it tends not to be so much people at an individual institution as people working across departments at different institutions.

Q. What could be done to improve that situation?

A. I’m probably more biased toward solutions involving funders that incentivize people to work in a space.

Q. Meltdown and Spectre seem to be traditional computer-science problems, not so much an interdisciplinary issue, and still they got missed for decades. Does that suggest that even within your own discipline, “formal security analysis” isn’t getting the job done?

A. It’s what happens. I could point to any number of things that were out there for decades and are only now being discovered. For example, it was realized only a few years ago that you could put a relatively low-quality microphone near a laptop and, just by listening to the vibration of the capacitors and other electronic components, start to figure out what encryption keys were stored on these devices. And that’s just putting a microphone near a computer.

Paul Basken covers university research and its intersection with government policy. He can be found on Twitter @pbasken, or reached by email at paul.basken@chronicle.com.
