Technology

2 New Threats Highlight Human-Factor Gaps in Cybersecurity Research

By Paul Basken
January 12, 2018

Kurt Rohloff, a computer scientist at the New Jersey Institute of Technology: “You’re never going to completely limit the ability of someone to steal information from a computer, but you can make it a heck of a lot harder.”
NJIT

The latest cybersecurity scare is a big one.

Known as Meltdown and Spectre, the newly revealed security vulnerabilities expose major and longstanding flaws in virtually all computer chips, essentially giving hackers a widespread opportunity to steal data.

While these flaws represent hardware and software failures, they also highlight a larger struggle among university cybersecurity researchers to better incorporate an understanding of human behavior into their work.

In fact, said Kurt R. Rohloff, an associate professor of computer science at the New Jersey Institute of Technology who is working on defenses against Meltdown and Spectre, the cases show how badly the nation's cybersecurity-research agenda still needs an understanding of human factors.

In an interview with The Chronicle, Mr. Rohloff explained the basics of Meltdown and Spectre and described how academic structures and policies in areas such as tenure and promotion may hinder efforts to thwart computer-related crime. This interview has been edited for length and clarity.


Q. What are Meltdown and Spectre, and how do they work?

A. Modern software and processors are so sophisticated that they can basically guess what the next computation is going to be and run it before it's actually called for. This is what's known as "speculative execution," and it helps computers run faster. The hacking opportunity comes when the chips begin performing tasks before the computer is ready for them. If one tried to run unauthorized code as a normal operation, the processor would stop it. But the speculative-execution engines don't have those guards.
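
For readers who want to see the shape of the flaw, here is a minimal sketch in C of the bounds-check-bypass pattern (Spectre variant 1) described in the public disclosures. The array names, sizes, and page-sized stride are illustrative; a real exploit would also need branch-predictor training and a cache-timing step, both omitted here.

```c
#include <stddef.h>
#include <stdint.h>

uint8_t  array1[16];          /* in-bounds data; x is meant to index this */
size_t   array1_size = 16;
uint8_t  array2[256 * 4096];  /* probe array: one page per possible byte value */
volatile uint8_t temp;        /* sink so the compiler keeps the read */

/* While the processor is still resolving the bounds check, it may
 * speculatively execute the body with an attacker-chosen, out-of-bounds x.
 * That loads a page of array2, selected by the secret byte array1[x], into
 * the cache. The misprediction is rolled back architecturally, but the
 * cache state survives; timing later reads of array2 reveals the byte. */
void victim_function(size_t x)
{
    if (x < array1_size)                  /* the guard normal execution obeys */
        temp = array2[array1[x] * 4096];  /* runs speculatively past the guard */
}
```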

Q. Is there any way to have both the speed of speculative execution and the necessary security built in?

A. Right now, no. In the future, yes. It means waiting a few years and then buying new computers when chip makers put out new hardware.


Q. Are we ever going to get past that kind of cycle in cybersecurity? Or will we just end up waiting a few years for a new computer that is not vulnerable to Meltdown and Spectre, then find it’s vulnerable to some new problem discovered by college students hashing out theories over beers?

A. That gets to the nature of cybersecurity and where the field is going. Right now the field looks like a game of gotcha, where somebody builds something and someone pokes a hole in it. And that’s been kind of the model since basically the 1960s. The push more recently, partially led by Darpa, has been this notion of “formal security analysis,” where you take a chip-set design or some software, study its properties, and then give what are known as formal guarantees, or mathematical proofs, that it’s resistant or protected from classes of attacks.
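
As a hedged illustration of the kind of property such analysis targets, consider a comparison routine that is provably "constant time": verification tools can show, for all inputs, that its running time does not depend on the secret data, ruling out the whole class of timing attacks against it. The function here is a sketch with an illustrative name, not drawn from any particular verified library.

```c
#include <stddef.h>
#include <stdint.h>

/* Compares two n-byte buffers in time that depends only on n, never on
 * where (or whether) the buffers differ. Because there is no early exit
 * and no branch on secret data, a formal analysis can prove the timing
 * channel leaks nothing -- a guarantee against a class of attacks rather
 * than against one specific exploit. */
int ct_equal(const uint8_t *a, const uint8_t *b, size_t n)
{
    uint8_t diff = 0;
    for (size_t i = 0; i < n; i++)
        diff |= a[i] ^ b[i];  /* accumulate any mismatching bits */
    return diff == 0;         /* 1 if identical, 0 otherwise */
}
```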

Q. OK, but what if somebody comes up with a new class of attack?

A. Yeah, that’s exactly the issue. It’s like a cat-and-mouse game. This is just the nature of cybersecurity.

Q. So that means there will never be secure computers?


A. That’s like saying there will never be safe drivers — there’s just certain level of things you have to live with in your life. The challenge is: How can you go and protect yourself? Can you design safer roads and cars that have airbags and other kinds of inherent protections? You’re never going to completely limit the ability of someone to steal information from a computer, but you can make it a heck of a lot harder and learn from the lessons that we’ve experienced, and that’s the nature of cybersecurity right now.

Q. If that’s fundamentally true, then don’t we need a noncomputer component of research to help us live with that reality? And are researchers developing that?

A. Often the human is the weakest part. You're getting at this philosophy of whether things should be permissive or restrictive as a rule. Is the human someone to be trusted to know when to turn things on or off? Or is the human not to be trusted, because it's usually the social-engineering attacks, the ones used to exfiltrate things like passwords and bank-account credentials, that are the most damaging?

Q. And are researchers doing that? As someone involved in computer security, are you talking with social scientists to work together and figure this out together, or is it largely a world of computer scientists trying to figure out computer security?

A. It’s mostly computer scientists trying to do these things. There are definitely human-factors folks who are involved with this kind of stuff — folks who look at what is the nature of two-factor identification, what kind of biometrics make sense. But they’re not as common, and I wish there were a heck of a lot more.


Q. What does that tell you about what needs to be done differently, especially in university work on cybersecurity?

A. That social science is a major part of this, and people in some sense give it short shrift. Cybersecurity comes out of computer-science departments, which have not spent much time looking at human factors. The traditional soft-science departments seem to have had trouble with this too, and I don't know if it's a top-down issue, where the people who control tenure and other levers in this space don't see the value of this type of research, or if the publishing opportunities aren't set up for academics to succeed when they look at human-factors issues in security.

Q. So are you saying it’s the soft-science folks who don’t want to get involved, or is it the computer-science folks, or both?

A. It’s probably both. In the tenure process, there’s a certain set of hoops you have to jump through, about making sure you publish in such-and-such quality journals, and so many publications, and get so much funding. The funding for that kind of research, and the ability to get high-quality publications, is not there as it is for traditional computer science. And so there’s this kind of systemic bias against doing things like that if one wants to improve one’s chances for tenure.


I don’t know social scientists all that well, but I’d be shocked if their story was all that different.

Q. Is there anything anybody could do so that you did know social scientists a bit better? Like put you in the same cafeteria?

A. I’m active in the Darpa community, which is very big on these interdisciplinary programs. And I used to be involved in computational social sciences, and things like that. And these kinds of funding-agency-directed, need-based projects seem to be a great way of doing those kinds of things. But it tends not to be so much people at an individual institution as it is people working across institutions and across departments at different institutions,

Q. What could be done to improve that situation?

A. I’m probably more biased toward solutions involving funders that incentivize people to work in a space.


Q. Meltdown and Spectre seem like traditional computer-science problems, not so much an interdisciplinary issue, and still they went unnoticed for decades. Does that suggest that even within your own discipline, "formal security analysis" isn't getting the job done?

A. It’s what happens. I could point to any number of things that were out there for decades and only now people are discovering. For example, it was realized only a few years ago that you could put a relatively low-quality microphone near a laptop, and just by listening to the vibration of the capacitors and other electronic components, start to figure out what encryption keys were stored on these devices. And that’s just putting a microphone near a computer.

Paul Basken covers university research and its intersection with government policy. He can be found on Twitter @pbasken, or reached by email at paul.basken@chronicle.com.
