Re: [ISN] Security Stats from Gartner Group

From: mea culpa (jerichot_private)
Date: Tue Apr 28 1998 - 15:45:59 PDT


    Reply From: David Kennedy CISSP <dmkennedyt_private>
    
    >[Moderator: It annoys me no end to see groups quote statistics like
    > this. For those of you who missed it, a year or so ago the DoD released
    > 'statistics' showing 250,000 computer attacks in the previous year. What
    > they failed to mention was the method they used to obtain that number.
    > It has since come out that the figure included failed logins among other
    > things, meaning every time a legitimate user mistyped their password,
    > that was counted. So when you read stats like this, be wary.]
    >
    > http://www.zdnet.com/icom/cyberstats/1997/12/
    
    
    [David Kennedy: Sorry for the latency.  Not really correct, AFAIK.  
     See this:]
    
    From: Julian Assange <profft_private>
    Date: 03 Feb 1998 13:21:02 +1100
    To: lacct_private
    Subject: LACC: DoD Computer Systems Vulnerability Assessments
    
    
    "Higgins, Michael R" <miket_private>
    Mr. Perillo,
    
    Notwithstanding the fine analysis of available data that you provide,
    might I add a little insight into a process which I developed and which
    resulted in the DISA statistics so often quoted in the open literature.
    The VAAP (Vulnerability Analysis and Assessment Program) was established
    with strong processes and procedures to provide the first statistical
    analysis of the security posture of DoD's unclassified information
    systems.
    
    Having fought the good fight for over five years, and having lost more
    often than not due to the senior staffs' inability to believe they were
    vulnerable, I persuaded LtGen Edmonds (then Director of DISA) to
    authorize a cooperative program of selected demonstrations.  The
    selection criteria for these demonstrations included volunteers (many
    of whom had high confidence in the security of their systems, but some
    from the other end of the spectrum who wanted to "prove" that they
    needed help) and "sensitive but unclassified" DoD interest systems.
    This latter group included Logistics, Financial, Medical and "First
    Alert Units" in the US.
    
    The tests were conducted using automated "intrusion tools" and manual
    exploitation scripts.  A successful intrusion was measured as "USER"
    level access to a host.  A system was defined as a single host.
    Therefore, the numbers you often see are a measure of the number of
    successful intrusions into "identified" hosts.  This last qualifier is
    important, since the initial scans were repeated up to five times at
    varying times of day and days of the week.  This method yielded the
    largest sample size and therefore somewhat artificially decreased our
    success rate.  What I mean is that many times we identified a host
    through our scanning, but it would then disappear during our probe
    phase because it had been turned off.  We did not, in any single case
    I am aware of, probe every machine identified.
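
    A toy simulation may make that sampling effect clearer.  Everything
    here is invented for illustration (the host count, online probability,
    and probe and exploit rates are not DISA's figures); it only shows how
    re-scanning at varying times enlarges the "identified" denominator
    while some hosts vanish before the probe phase, deflating the measured
    success rate:

        import random

        random.seed(1)

        # Hypothetical population: 1,000 hosts, each online during a random
        # subset of five scan windows (varying times of day, days of week).
        hosts = range(1000)
        online = {h: [random.random() < 0.6 for _ in range(5)] for h in hosts}

        # A host is "identified" if at least one of the five scans saw it.
        identified = [h for h in hosts if any(online[h])]

        # The probe phase happens later; some identified hosts have been
        # turned off by then, but they stay in the denominator.
        probed = [h for h in identified if random.random() < 0.8]

        # Assume a fixed fraction of probed hosts yield "USER" level access.
        intrusions = [h for h in probed if random.random() < 0.65]

        print(f"success over identified hosts: "
              f"{len(intrusions) / len(identified):.0%}")   # lower
        print(f"success over probed hosts:     "
              f"{len(intrusions) / len(probed):.0%}")       # higher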
    
    The next statistic that is often referred to is the "detected"
    intrusions.  This number was gained from the post-test analysis.  We
    asked!  We never considered that someone would lie, and we went to
    considerable effort to identify the owners/operators of the attacked
    systems and simply ask them if they knew they had been broken into.
    In only a handful of cases did we believe we were misled, because the
    system owner/operator couldn't describe what alert or condition caused
    him to believe he had been penetrated.  In these cases, we concluded
    we were detected because we were told we were!
    
    The last statistic is the most abused and the least mature of the
    information gained in our process: "reported."  The information was
    sometimes simple to gather: they called the ASSIST or Service hotlines
    to report the intrusions or attempts.  This is my single criticism of
    your analysis below: a report was considered to have been made when
    anyone external to the organization, or the senior staff of the
    organization, was told.  That is, if SSG Jones noticed something and
    told his office mate and that was it, then no report was made.  If SSG
    Jones told his immediate supervisor and no other report was made, then
    a report WAS made.  If SSG Jones called the ASSIST, a Service response
    center, the local military police, the local military intelligence
    organization, CERT-CC or any other response team, or (as I often
    joked) his mother, it was considered a report.
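
    Encoded as a predicate, the rule is easier to apply.  A minimal
    sketch, assuming the channel names below (they are mine, not DISA's):

        # Channels that counted as a "report" under the rule above: anyone
        # external to the organization, or the senior staff / chain of
        # command within it.
        REPORT_CHANNELS = {
            "immediate_supervisor",      # SSG Jones tells his supervisor
            "assist_hotline",            # the DoD ASSIST
            "service_response_center",
            "local_military_police",
            "local_military_intelligence",
            "cert_cc",
            "other_response_team",
            "mother",                    # the joke channel; it still counted
        }

        def was_reported(channels_told):
            """True if any notification qualifies as a report."""
            return any(c in REPORT_CHANNELS for c in channels_told)

        assert not was_reported(["office_mate"])       # no report made
        assert was_reported(["immediate_supervisor"])  # a report WAS made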
    
    The only point of conflict between these statistics and others (e.g.,
    the John Howard study) is the reporting statistic.  But I think the
    DISA statistics are still valid, for two reasons.  First, the length
    of the DISA testing was artificially capped at no more than 14 days of
    actual probing and exploiting.  This reduced some of the footprint you
    see in a normal "major" incident.  Second, the DISA testing did NOT
    stress invisibility.  We wore our muddy Army boots when we hit a site!
    We did nothing to hide our activity or alter any logs that would have
    resulted in detection.  I cannot remember the last incident I have
    worked (and that is hundreds) in which a lot of subterfuge was not in
    place to keep the perpetrator out of the sight of the administrator.
    
    So when I read about my statistics, and see how often they are
    misinterpreted or abused to support some political argument, I get
    angry.  You cannot extrapolate these statistics to the entire
    unclassified DoD infrastructure, as the sample was not truly random.
    
    It is my belief that DoD unclassified and sensitive-but-unclassified
    systems were insufficiently secure for the military missions they were
    designed to support.  DoD needs to address these insecurities in a big
    way!!
    
    As for me, I moved on, as you can see from my mail address, to the
    wonderful world of commercial Information Protection.  I do many of
    the same activities for financial institutions, high-technology firms,
    manufacturing concerns, and the entertainment, medical, and power
    industries.  I run a commercial incident response group called the
    Rapid Emergency Action Crisis Team (REACT), an intrusion testing
    service unmatched in the commercial sector (known as PROTECT), and an
    open source intelligence gathering program (DETECT).  I read, albeit
    irregularly, C4IPRO and welcome comments or questions.
    
    Mike
    
    
    Mike Higgins
    Information Protection Operations and Technology
    Center for Information Protection
    888-REACT12 (hotline)
    703-442-5687 (direct)
    800-700-8294 (pager)
    "The opinions expressed, though entirely correct, may not be shared by
    my employer"
    
    >     The correct statistic is that 96% of all successful attacks
    > against DoD computer systems connected to the Internet go
    > undetected.  This was out of a total sample of 38,000 attacks
    > conducted by DISA from 1992 to 1996, with 24,700 (65%) being
    > successful exploitations of known security vulnerabilities, of which
    > 988 were detected.  Of the 988 detected, only 27% were reported back
    > to DISA.  This last statistic is questionable, and I will elaborate
    > on that.
    > 
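
    As a quick back-of-the-envelope check of those figures (the ~267
    reports at the end is derived from the quoted percentages, not stated
    in the original):

        attacks   = 38_000   # DISA test attacks, 1992-1996
        successes = 24_700   # gained "USER" level access
        detected  = 988      # owners noticed the intrusion
        report_rate = 0.27   # share of detections reported back to DISA

        print(f"successful: {successes / attacks:.0%}")        # ~65%
        print(f"detected:   {detected / successes:.1%}")       # ~4.0%
        print(f"undetected: {1 - detected / successes:.0%}")   # ~96%
        print(f"reported:   {report_rate * detected:.0f}")     # ~267 reports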
    
    -o-
    Subscribe: mail majordomot_private with "subscribe isn".
    Today's ISN Sponsor: Dimensional Communications (www.dim.com)
    


