[ISN] How to Conduct a Vulnerability Assessment

From: InfoSec News (alerts@private)
Date: Tue Jun 05 2007 - 22:30:23 PDT


http://www.cio.com/article/116800/How_to_Conduct_a_Vulnerability_Assessment

By Sarah D. Scalet
June 05, 2007  
CSO  

Roger Johnston knows about security vulnerabilities, and not only 
because he works for the Los Alamos National Laboratory, which has 
experienced more than its share of security problems of late (including 
the loss of classified materials last autumn). As leader of the 
laboratory's Vulnerability Assessment Team, a research group devoted to 
improving physical security, Johnston is the guy who gets brought in to 
find security problems, not only at his own agency, but also at other 
agencies and at private companies. His team has been hired to conduct 
vulnerability assessments at government agencies with such high security 
stakes as the International Atomic Energy Agency, the Department of 
State and the Department of Defense, as well as at private companies 
that are developing or considering the use of high-tech security 
devices.

Senior Editor Sarah D. Scalet recently spoke with Johnston about 
strategies for running an effective vulnerability assessment and then 
communicating the results without also putting your job on the line. To 
help security leaders identify specific areas that need improvement, 
Johnston also developed a quiz that identifies the 28 attributes of a 
flawed security system. "We see the same things over and over," he says. 
"These are the common unifying themes." Find out how you rate. (Note: 
Johnston emphasized that his statements here are his own opinions and do 
not necessarily reflect the official position of the Los Alamos National 
Laboratory or the U.S. Department of Energy, its parent organization.)


CSO: You basically spend your days finding problems with things. Are 
people afraid to cook for you?

ROGER JOHNSTON: Yeah, well, we always try to have an upbeat message. 
There are often very simple fixes to problems. Say you're using a 
tamper-indicating seal for cargo security. When you inspect the seal, 
maybe you simply spend an extra second or two looking for a little 
scratch in the upper right-hand corner to discover an attack.


CSO: So training is a key to that upbeat message?

JOHNSTON: Right. We're very strong believers in showing security 
personnel a lot of vulnerability information. Often, low-level security 
people aren't given the information they need to do a good job. If they 
know what they're supposed to be looking for, instead of just being
turned loose and told to report "anomalous incidents," they generally will do a
lot better. You really haven't spent a lot of extra money, and it 
doesn't necessarily take a great deal of time.


CSO: When you're doing a vulnerability assessment, what's the best way 
to get into the mind-set of the adversary?

JOHNSTON: That's the real trick. The problem with a lot of vulnerability 
assessments is that they're done by very sincere security people who 
have devoted their lives and careers to being good guys. They really 
don't want security to have any problems. It's not a matter of 
dishonesty; it's just human nature. Also, in many cases security 
personnel come from military or police backgrounds. That kind of 
training and discipline can be very useful, but those backgrounds don't
typically attract people who are wildly creative.

You want to look around your organization and find people who are 
outside-the-box thinkers. They don't have to be in the field of 
security. You're looking for people who would normally be your worst 
security nightmare--people who are loophole finders, smart alecks, kind 
of skeptical. They're people who have to prove things for themselves and 
aren't sure they buy everything they hear from authority.


CSO: So you're looking for people who've been in trouble for violating 
some security policy?

JOHNSTON: I don't want to push it too far. If they're wanted in 35 
states for felonies, maybe that's not exactly who you want looking at 
your critical security vulnerabilities. It's more about finding the 
people who won't automatically toe the party line. These are people in 
your organization who are already thinking about how they could beat 
your security. They're probably not going to do it, but that's just the 
way they think. They may be graphic artist types; they may be the smart 
aleck on the loading dock who's always questioning the boss.


CSO: There's more of that ethos in the information security culture than 
in the physical security culture.

JOHNSTON: Absolutely. There's a huge cultural gap, of course, between IT 
security and physical security, and that's much of the problem of 
convergence, trying to bring the two together. I think IT is better off 
in this regard. A lot of the people who work on computers automatically 
think that way.


CSO: What's the risk of conducting a vulnerability assessment from the 
point of view of a good guy?

JOHNSTON: When vulnerability assessments are done by good guys thinking 
like good guys, number one, they let the existing security 
infrastructure and hardware and strategies define the vulnerability 
issues. For example, if there's a fence, they'll think about ways the 
bad guys might get over the fence. But of course that's all backwards. 
We need to think about what the bad guys want to accomplish and then 
decide if we even need a fence. Number two, there's that tendency not to 
want to try to find problems.

CSO: Not only are they possibly making themselves look bad if they find 
a problem, they're also creating more work for themselves, right?

JOHNSTON: Absolutely. In many cases when the fix is very simple, 
organizations are very reluctant to do it, because that is sometimes 
thought of as saying, "We've been screwing up all these years." So you 
don't want to go with people who have a history of doing a vulnerability 
assessment and then telling you everything is swell. There are always 
vulnerabilities, and they are always present in very large numbers. Any 
vulnerability assessment that finds zero vulnerabilities is completely 
useless.


CSO: When you actually do the assessment, are there warm-ups you can do 
to get yourself in the mind-set of a bad guy, or are there ways you 
should set up the room?

JOHNSTON: A lot of vulnerability assessment needs to be very similar to 
classic brainstorming. A lot of the tools that are applied to creative 
thinking in other fields can be applied directly to vulnerability 
assessments. This is kind of a radical position. A lot of people in the 
security business are not comfortable with this 1960s hippy, 
touchy-feely, "let's all get together" approach.


CSO: I'm imagining a bunch of beanbag chairs.

JOHNSTON: Yeah. A lot of people would much rather have a rigorous, 
quantitative approach, and I would claim that's largely a sham. I don't 
think it's a mistake to use analytical tools like a security survey, but 
we would like to combine those more closed-ended, straightforward tools 
with creative thinking. The fact is that creativity has been studied 
extensively over the last 50 years, and there's a lot of understanding 
of how you create an environment where people come up with good ideas. 
It's not quite the seat-of-the-pants, wacky kind of thing that it might 
look like from the outside.


CSO: Should the CSO even be there?

JOHNSTON: You don't want the boss in the room, because it constrains 
people. What you need are really nutty ideas, so we strongly encourage 
thinking about attacks that involve Elvis impersonators and flying 
monkeys and the use of space aliens. Early on, it's very important not 
to editorialize. Later on, we're going to prioritize them and think 
about the practicality of them. In many cases, we have people say, 
"Well, if I had the space aliens come down with a ray beam, they could 
do the following." Later on, it turns into a very viable attack, once we 
get rid of the space aliens and the laser beams.


CSO: Does this take hours? Days? Weeks?

JOHNSTON: It depends. If you're looking at a very complex security 
program, you may want to spend two or three weeks just kind of 
freewheeling. But it isn't just sitting around. You generate 
nutty ideas, and then you go back to the program or the hardware and 
play around a little bit to see if those nutty ideas might have some 
merit. Then you get back together again, and you think of more nutty 
ideas based on what you learned. We're very much in favor of hands-on 
work, and not just thinking in abstractions. Toss the device around. 
Chat up the security guards. Kick the fence. Play with the system and 
try to understand how it behaves.


CSO: When the CSO tells his or her company about a vulnerability, we've 
seen that there can be a kind of "shoot the messenger" effect. (See 
"Don't Shoot the Messenger" from the August 2006 CSO.) What are ways 
they can avoid that or at least mitigate the effect?

JOHNSTON: We try to encourage people to think about a vulnerability not as 
bad news. It's great news. When you find a vulnerability, you can do 
something about it.


CSO: But you still have to take people down the path of, something 
terrible could happen.

JOHNSTON: All our vulnerability assessment reports start out by pointing 
to the good things. There are *always* good things. Sometimes 
they're an accident, but by pointing them out, you get them recognized. 
Also, at the very beginning we always point out that we're going to find 
more vulnerabilities than you can possibly mitigate. We're going to 
make more suggestions for changes than you can possibly implement. 
That's OK. The bottom line is, vulnerability assessors are not here to 
tell you what changes to make. We're here to point out what we think are 
problems and what we think may be solutions. It's up to you to decide 
what you do with the findings.

This binary thinking about security--that something is either secure or 
not secure, or that we have to have all the vulnerabilities covered or 
we're not doing our job--is really nonsense. Security is a continuum, 
and there are always going to be vulnerabilities you can't do anything 
about. That doesn't mean anybody is screwing up. That's just the way 
security works.


CSO: In coming up with this laundry list of problems and possible 
solutions, is there oftentimes an 80/20 thing at play, where you can 
solve 80 percent of the problems with 20 percent of the solutions?

JOHNSTON: It does work that way. People say, "Gee, you're telling me I 
need to make this one little change, and this attack and this attack and 
this attack and this other attack basically go away?" It's really quite 
surprising. Sometimes the vulnerabilities are extraordinarily complex, 
and the solutions, while they may not be 100 percent perfect, are often 
really painless. We don't always have the most realistic view--we work 
for the government--about what's economically viable to implement. 
Sometimes what we think is simple isn't really simple in the real world. 
But that's OK too. Sometimes our suggestions get the end users thinking, 
and then maybe they come up with their own solution.


CSO: You've brought a couple of industrial-organizational psychologists 
onto your team. Why?

JOHNSTON: Industrial-organizational psychology has been applied across a 
wide range of fields, but for some weird reason, not security. When we 
first got these psychologists to work with us, they just couldn't 
believe that no one had applied all these powerful tools in industrial 
psychology towards security problems. Increasingly, we're using them to 
understand the human factors associated with security. In the end, 
security is really about how people interact with technology, how people 
use and think about technology, and how the technology was designed to 
enhance what people are already doing.


CSO: What kinds of things have the industrial-organizational 
psychologists found?

JOHNSTON: The main one early on was the recognition that security guard 
turnover is a huge problem. The numbers typically run 
between 40 percent and 400 percent per year. McDonald's has a turnover 
rate of about 35 to 40 percent, so McDonald's does a better job than 
security of finding the right people and hanging on to them. There are 
plenty of organizations that do very fine with turnover rates that don't 
pay people very well and don't necessarily represent fabulous careers. 
There are ways that IO psychologists have developed over the last couple 
decades that help these companies, but the tools never have been applied 
to security. The first things that our guys did was publish some papers 
basically saying, "Hey, wake up, we don't need to do any new R&D, there 
are all these tools already proven out there." They involve things like 
understanding who you hire and creating a realistic picture in their 
mind of what the job is like. If you simply do that, turnover rate 
plummets.

We're just beginning to look more specifically at how IO psychology 
applies to vulnerability assessments. It's a totally open field. One 
problem we want to look at is the tamper-indicating seals that are used 
for cargo security. We know from experience that some people are really 
good at finding seals that have been tampered with, and some people 
aren't. But we don't know why. One of the things we want to do is study 
the people who are good at it and try to understand what it is that 
they're doing or what characteristics they have that make them good. One 
of the studies we want to do, and we haven't found anybody to fund it, 
is an eye-tracking study. We want to look at what seal inspectors are 
looking at. You give them this little eyeglass thing, and it tells you what 
their eyes are looking at. It's used all the time to judge 
advertisements for TV; advertisers stick audiences in front of the 
proposed commercial to see if they're really looking at the product or 
they're looking at the pretty girl in the background. We want to apply 
this technology to understand what the people who are good at spotting 
tampered seals are actually looking at. Maybe we can 
train people better, or maybe we can do a screening exercise to find the 
people who are really good at it, for whatever reason.



CSO: I know you've done a lot of work around what to do once you 
actually find a vulnerability. Can you tell me about the Vulnerability 
Disclosure Index that you and your group have created?

JOHNSTON: One of the problems with finding a vulnerability is, exactly 
who do you tell? We have found vulnerabilities that were specific to the 
sponsor of the vulnerability assessment, and of course if they pay for 
the work, they get the findings. No issue there. But we'll find things 
that have more general applicability. Now the question is, what do you 
do? A classic example is spoofing a global positioning system. 
Everyone's focused on jamming GPS devices, but that's not an interesting 
attack, because the GPS receiver knows it's not getting satellite 
signals from space. Spoofing, however, turns out to be surprisingly 
easy. You can feed fake coordinate information to a GPS receiver.


CSO: How could the bad guys use that to their advantage?

JOHNSTON: A lot of national networks, like for financial transactions, 
get their critical time synchronization from the GPS satellite signals. 
If someone fed the GPS fake information, the networks could crash within 
milliseconds. It could potentially be very serious. There's some 
recognition that jamming might be an issue, but in our view spoofing is 
the far more serious issue and is not widely recognized. Now, do we 
discuss this? Do we write papers about this problem? Or do we just keep 
our mouths shut?

This kind of problem crops up all the time, but there are some fairly 
straightforward signs to look for. If there are a whole 
lot of good guys who don't seem to be very sophisticated in 
understanding the vulnerabilities, and there are only a small number of 
bad guys, you probably ought to just publicize it to the world. If the 
attack is pretty obvious--and I think GPS spoofing is--the bad guys are 
going to figure it out anyway. So again, you probably ought to just tell 
the whole world. On the other hand, if it's kind of a specialized 
security device not being used by very many people, but a whole bunch of 
potential bad guys might want to exploit it, then maybe you don't need 
to be publicizing that vulnerability. Instead, you want to seek out the 
specific end user and point out the potential problem. The Vulnerability 
Disclosure Index is a sort of semiquantitative attempt to provide some 
guidance as to whether you should disclose a given vulnerability, how 
publicly, and in how much detail.


CSO: Vulnerability disclosure has been especially contentious in the 
field of IT security. (See "The Chilling Effect" from the January 2007 
CSO.) Does this Vulnerability Disclosure Index apply to IT 
vulnerabilities as well?

JOHNSTON: It's really meant for physical security. IT lives in a very 
different world. Let's say you're playing around on your home computer, 
and you find a very serious software vulnerability. There's some 
controversy, but most people agree you should do the following: You 
should contact the software company and say, "I think there's a problem 
here." You give them a chance to fix that. If after a while they're just 
stonewalling and not doing anything, then maybe you go public. Once they 
fix the problem, it's no big deal. Everybody who bought the product 
typically checks for upgrades anyway.

Physical security is not like that. In many cases the physical security 
systems are from a bunch of different vendors and may be put together by 
a third-party vendor. Often there's no single company to complain to 
about a potential vulnerability. Moreover, the fix isn't just a 
software download. The fix may require service people going out and 
changing parts, and it could be very expensive, very disruptive. Before 
you get everybody all wound up about a physical security vulnerability, 
you may want to think about, is it even going to be practical to fix it?


CSO: You've written that when the vulnerability assessment is chartered, 
the sponsor owns the findings, but that that doesn't necessarily 
"relieve the vulnerability assessors of their responsibility to warn 
others of a clear and present danger." This might strike fear into the 
hearts of CSOs who think they're going to hire someone to do a 
vulnerability assessment and the contract will ensure that the findings 
remain private.

JOHNSTON: A typical example would be if a company is considering a 
commercial security device. Let's say we do a vulnerability assessment 
on that device and oh my gosh, if you poke it with a paperclip it will 
quit working. And we know that commercial device is being used for a 
wide variety of applications, including corporate security, U.S. 
national security and nuclear safeguards. We believe we have some moral 
responsibility to tell people there might be a problem. Most companies 
we've done that for have had no problem with it, and in some cases have 
encouraged us to do exactly that.


Senior Editor Sarah D. Scalet can be reached at sscalet (at) cxo.com. If 
you would like to see a copy of a paper Roger Johnston wrote about 
vulnerability disclosure, contact him at rogerj (at) lanl.gov.

Copyright 2002-2007 CXO Media Inc. All rights reserved.




