RE: [ISN] When to Shed Light

From: InfoSec News (isnat_private)
Date: Wed Jun 18 2003 - 00:57:16 PDT


    Forwarded from: Pete Lindstrom <petelindat_private>
    
    To further my comments in the article:
    
    I think actively seeking out vulnerabilities is just plain
    destructive. Sure, once a vulnerability is known we should disclose
    it, but it never should have gotten to that point. I believe there
    is a lot of faulty logic behind the disclosure phenomenon. For
    example:
    
    1. We claim that disclosure actually makes our systems stronger/more
    secure. Of course, if that were the case then Microsoft - the vendor
    whose flaws are disclosed most loudly and most often - would have
    the strongest software on the planet, and we should be happy to
    deploy it in our enterprises. Any takers? (By the way, I happen to
    believe Microsoft gets a bum rap, but I use it here as a common
    example of what goes on in the security space.) The whole concept of
    counting vulnerabilities as a measure of security is bogus - it is
    an unpopularity contest, nothing more, and says nothing about the
    software itself. Besides, enterprises have shown time and again that
    they don't patch their systems anyway, so we can't get more secure
    this way.
    
    2. The more vulnerabilities we find, the closer we are to "the
    cure," i.e. some sort of security nirvana where no more
    vulnerabilities exist in the world. Hmmm, this is a good one. So,
    count the number of lines of code in existence, then assume some
    rate of vulnerabilities per line of code (use a very, very low rate
    to be conservative). Now add in the number of lines of code being
    added to the world's code base every day. Finally, factor in the
    number of vulnerabilities actually found each year. Are we getting
    any closer to finding all the vulnerabilities in the world? Not a
    chance. More likely, we are getting further away (see the rough
    arithmetic sketched after this list). That shouldn't strengthen our
    resolve to try harder; it should make us look at alternatives.
    
    3. If we don't find it, then the bad guys will. This is another one
    that doesn't work in the "macroeconomics" of the world's code base.
    Though I can't prove this, I suspect that, given the amount of code
    in the world, the likelihood of a good guy finding the same hole as
    a bad guy is about the same as the likelihood of a collision in a
    crypto hash - nearly impossible (the same sketch below illustrates
    this). The recent WebDAV vulnerability is the only case I am aware
    of where an exploit appeared in the wild before the vulnerability
    was publicly known. So the real question is, how many new exploits
    would there be if there weren't such a large pool of disclosed
    vulnerabilities to choose from? At the very least, it would cut down
    a lot of the noise out there... (I would love to hear about other
    exploits that used previously unknown vulnerabilities, and am glad
    to keep them anonymous).
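
    To make the back-of-envelope arithmetic behind points 2 and 3
    concrete, here is a rough sketch in Python. Every constant in it is
    an assumed placeholder chosen for illustration - the size of the
    world's code base, the vulnerability density, the daily growth rate,
    and the yearly discovery rate are not measured figures.

        # Rough back-of-envelope estimate; every constant is an assumed
        # placeholder, not a measured figure.
        WORLD_LOC = 100e9          # assumed lines of code in existence
        VULNS_PER_KLOC = 0.1       # assumed (deliberately low) density
        NEW_LOC_PER_DAY = 50e6     # assumed lines of code added per day
        FOUND_PER_YEAR = 5_000     # assumed vulnerabilities found per year

        existing_vulns = WORLD_LOC / 1_000 * VULNS_PER_KLOC
        added_per_year = NEW_LOC_PER_DAY * 365 / 1_000 * VULNS_PER_KLOC

        print(f"existing vulnerabilities (est.): {existing_vulns:,.0f}")
        print(f"added per year (est.): {added_per_year:,.0f}")
        print(f"found per year (assumed): {FOUND_PER_YEAR:,}")
        print(f"net change per year: {added_per_year - FOUND_PER_YEAR:+,.0f}")

        # Point 3: expected overlap if a good guy and a bad guy each
        # independently dig holes out of that same pool.
        good_guy_finds = 1_000     # assumed
        bad_guy_finds = 1_000      # assumed
        overlap = good_guy_finds * bad_guy_finds / existing_vulns
        print(f"expected overlapping finds (est.): {overlap:.2f}")

    With these made-up numbers, new flaws outpace found ones by better
    than 300 to 1, and the expected good-guy/bad-guy overlap is a small
    fraction of a single vulnerability per year.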
    
    I guess what really bothers me are the pretenses under which we
    operate. Those engaged in seeking out new vulnerabilities should
    just go ahead and say that they think it proves they are smarter
    than their competition. Period. It has nothing to do with the common
    good; it has to do with boosting egos and generating revenue.
    
    If consultants really want to spend time on this (honestly, I don't
    understand how companies can absorb even the basic cost of it), they
    should be setting up honeypots. I don't advocate honeypots for most
    enterprises, but this would be the perfect fishbowl for determining
    what is really going on 'in the wild.' Setting up honeypots would
    truly further our understanding of things like the likelihood of
    attack, the prevalence of attacks, the nature of security on the
    Internet, etc. - all great stuff we have limited information on
    today, though what we do have is valuable (thanks, Lance).
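
    For what it's worth, here is a minimal sketch of the kind of
    low-interaction listener I have in mind: it accepts connections on a
    port nothing legitimate uses and logs who knocked and what they
    sent. The port number and log file name are arbitrary illustrative
    choices, not a recommendation.

        # Minimal low-interaction honeypot sketch: accept connections on
        # an otherwise-unused port and log the source and first bytes.
        # The port and log file are arbitrary illustrative choices.
        import datetime
        import socket

        LISTEN_PORT = 2323
        LOG_FILE = "honeypot.log"

        def main() -> None:
            srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind(("0.0.0.0", LISTEN_PORT))
            srv.listen(5)
            while True:
                conn, (addr, port) = srv.accept()
                conn.settimeout(5)
                try:
                    data = conn.recv(1024)
                except OSError:
                    data = b""
                finally:
                    conn.close()
                with open(LOG_FILE, "a") as log:
                    log.write(f"{datetime.datetime.utcnow().isoformat()} "
                              f"{addr}:{port} {data!r}\n")

        if __name__ == "__main__":
            main()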
    
    There is one other reason that is a bit more difficult to dispense
    with - that we really do this just to 'stick it to the vendor' and
    make them pay the price for having written poor software. In my
    opinion, this seems a bit spiteful and amounts to a Pyrrhic
    victory - sure, we sock it to 'em, but at what cost? The real losers
    end up being the enterprises.
    
    My solution for this one is still a bit sketchy, but let me try. I
    don't advocate software liability because liability judgments are
    too likely to get it wrong - the old "it's not a bug, it's a
    feature" cliché would create lots of problems, and when we make the
    liability argument we think only about Microsoft and not about the
    little guys. I also don't believe we will ever completely eradicate
    vulnerabilities, so we must come up with a new metric for 'software
    risk' (how about person hours per vulnerability found? A rough
    sketch of that ratio follows below).
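
    The "person hours per vulnerability found" idea is just a ratio, but
    writing it down makes the intent plain: the more skilled-reviewer
    hours it takes to turn up each flaw, the lower the software's risk.
    The figures below are made up purely for illustration.

        # "Person hours per vulnerability found" as a crude
        # software-risk metric. All inputs are illustrative assumptions.
        def hours_per_vuln(review_hours: float, vulns_found: int) -> float:
            """Higher values mean flaws are harder to find, i.e. lower risk."""
            if vulns_found == 0:
                return float("inf")   # nothing found in the time spent
            return review_hours / vulns_found

        print(hours_per_vuln(2_000, 10))  # 200.0 hours per vulnerability
        print(hours_per_vuln(2_000, 1))   # 2000.0 - harder to find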
    
    Instead of software liability, I advocate Material Safety Data
    Sheets for software. In the same way chemical/pharmaceutical
    manufacturers must document the interactions of their chemicals with
    "the world around them," we should have software vendors document
    their software's interactions with the rest of the operating
    environment. This would ensure that they have thoroughly tested
    their software and would give us a blueprint for creating security
    profiles in host intrusion prevention software (a rough sketch of
    what such a profile might look like follows below). At least then we
    would have a set of assertions from the vendor about how their
    software works. Heck, it also sets the stage for demonstrating
    negligence and fraud in the future.
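
    To show what such a vendor "data sheet" might feed into, here is a
    rough sketch: a declared profile of a program's expected
    interactions and a check that flags anything observed outside the
    declaration. The field names and example values are invented for
    illustration; this is not an existing format.

        # Sketch of a vendor-declared interaction profile ("MSDS for
        # software") and the kind of check a host intrusion prevention
        # tool could make against it. Field names/values are invented.
        DECLARED_PROFILE = {
            "writes_paths": ["/var/lib/exampleapp/", "/tmp/exampleapp-"],
            "listens_ports": [8080],
            "outbound_ports": [443],
            "spawns": ["/usr/bin/exampleapp-helper"],
        }

        def violations(observed: dict) -> list[str]:
            """Return observed behavior not covered by the declaration."""
            out = []
            for path in observed.get("writes_paths", []):
                if not any(path.startswith(p)
                           for p in DECLARED_PROFILE["writes_paths"]):
                    out.append(f"undeclared write: {path}")
            for port in observed.get("listens_ports", []):
                if port not in DECLARED_PROFILE["listens_ports"]:
                    out.append(f"undeclared listener on port {port}")
            return out

        print(violations({"writes_paths": ["/etc/passwd"],
                          "listens_ports": [6666]}))
        # ['undeclared write: /etc/passwd',
        #  'undeclared listener on port 6666']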
    
    Just some ideas.
    
    Regards,
    
    Pete  
    
    
    Pete Lindstrom, CISSP
    Research Director
    Spire Security, LLC
    
    
    
    -----Original Message-----
    From: owner-isnat_private [mailto:owner-isnat_private] On Behalf
    Of InfoSec News
    Sent: Tuesday, June 17, 2003 3:14 AM
    To: isnat_private
    Subject: [ISN] When to Shed Light
    
    
    http://www.eweek.com/article2/0,3959,1128749,00.asp
    
    By Dennis Fisher
    June 16, 2003 
    
    Until recently, software security vulnerabilities were discovered mostly
    by chance and by developers, security specialists or other
    professionals. Once the flaw was discovered, news about it spread slowly
    and typically by word of mouth on bulletin boards or perhaps the
    occasional security lecture.
    
    The huge network of security researchers - independent or otherwise -
    who race to find the next big vulnerability in Windows or Apache, for
    example, is a recent phenomenon.
    
    So, too, are the overlapping and interconnected mailing lists on which
    the researchers publish their vulnerability bulletins. Lists such as
    BugTraq and Full Disclosure were founded to give administrators and
    other IT professionals a place to get early information on developing
    software problems.
    
    But the amount of publicity and attention security has commanded in
    recent years has brought new, less experienced and less disciplined
    people into the security community. This, in turn, has led to
    vulnerability reports being published before patches are available,
    bulletins being stolen from researchers' computers and posted without
    their knowledge, and a litany of other problems.
    
    This chaos has led some in the community to question whether
    vulnerability research and disclosure, in its current form, does more
    harm than good. One side of the debate argues that because there is
    essentially an infinite number of potential vulnerabilities in software,
    finding and fixing a handful every year has no effect on the overall
    security landscape. On the other hand, since disclosing a vulnerability
    to the public means that good guys and bad guys alike get the
    information, disclosure can actually cause a great deal of damage.
    
    "The point is not to say that these folks don't have the right to 
    disclose anything they want - of course, they do. In fact, we must 
    assume that, in general, people are finding vulnerabilities and not 
    disclosing them and [that] they can be used against us," said Pete 
    Lindstrom, research director at Spire Security LLC, in Malvern, Pa. 
    "The point is to demonstrate that those folks that say full disclosure 
    is in some way good for us are actually doing more harm than good. 
    Just think how much better our security might be if the highly skilled 
    people who spend all day, every day, searching for vulnerabilities in 
    software would try to design a security solution."
    
    [...]
    
    
    
    -
    ISN is currently hosted by Attrition.org
    
    To unsubscribe email majordomoat_private with 'unsubscribe isn'
    in the BODY of the mail.
    


