Re: [logs] Due Diligence for Admission in Court

From: todd glassey (todd.glasseyat_private)
Date: Tue Dec 04 2001 - 09:40:42 PST


    If you plan on submitting something to a court, you will likely need the
    following setup:
    
        0)    Understanding what it is you are trying to do
    
    Decide what it is you are ultimately trying to build: Information
    Assurance alone, or Information Assurance extended to include Information
    Content Integrity.
    
    The first instance (Information Assurance) says that this piece of data
    and the envelope it came in are the same now as when they got here, and
    that the method of transfer and management of the data can assure you of
    that. The second says that the content of the envelope, already vouched
    for by the Information Assurance process, is specific to XY&Z (Information
    Content Integrity) and that someone, such as the operator or a CA, will
    back that commercially. This is likely to be done by some PKI-based
    protocol process that is used to trigger a decision and logging instance.
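    
    A minimal sketch of the first case, Information Assurance, under stated
    assumptions: the register filename and function names below are purely
    illustrative, and a real deployment would have the arrival digest signed
    by the operator or a CA as part of the decision and logging step above.
    
    import hashlib
    import hmac
    
    def envelope_digest(path, chunk_size=65536):
        """SHA-256 over the stored envelope, read in chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()
    
    def record_on_arrival(path, register="arrival-digests.txt"):
        """Append the digest taken at the moment the envelope got here."""
        with open(register, "a") as reg:
            reg.write("%s %s\n" % (envelope_digest(path), path))
    
    def still_intact(path, recorded_digest):
        """True if the envelope is the same now as when it arrived."""
        return hmac.compare_digest(envelope_digest(path), recorded_digest)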
    
        1)    Some proof of the time data that was stamped on the document.
    
    This is a real issue, since NTP across the open Internet is not reliable
    (heck, the NTP RFC even says so), and SNTP even less so. GPS is likewise a
    problem: it is easily spoofed, and there is no digital way to prove where
    the time data in the computing model came from, so it (and all passive
    RF-based time services) has no more validity than looking at your watch
    and setting the computer's time. That leaves ACTS as the only reliable
    timebase service in the world. It is loggable in the sense that you get
    the long-distance bills to NIST or to the USNO, and the time transfer
    process is OOB (out of band) from any networking models.
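    
    Whatever the ultimate time source, the operating record is stronger if you
    also log what the local clock looked like against outside references each
    operating period. A minimal sketch under assumptions (the server names and
    audit-log path are hypothetical); it documents clock offsets for later
    review, it does not make NTP or SNTP any more trustworthy as a timebase.
    
    import socket
    import struct
    import time
    
    NTP_EPOCH_OFFSET = 2208988800   # seconds between 1900 and the Unix epoch
    SERVERS = ["time.nist.gov", "pool.ntp.org"]   # hypothetical reference set
    
    def sntp_time(server, timeout=5.0):
        """Return the server's transmit timestamp as Unix time (SNTP query)."""
        packet = b"\x1b" + 47 * b"\0"   # LI=0, VN=3, Mode=3 (client request)
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.settimeout(timeout)
            s.sendto(packet, (server, 123))
            data, _ = s.recvfrom(48)
        transmit_secs = struct.unpack("!12I", data)[10]
        return transmit_secs - NTP_EPOCH_OFFSET
    
    def audit_clock(logfile="clock-audit.log"):
        """Append local-vs-reference offsets so the record is reviewable."""
        stamp = time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime())
        with open(logfile, "a") as log:
            for server in SERVERS:
                try:
                    offset = time.time() - sntp_time(server)
                    log.write("%s %s offset=%+.3fs\n" % (stamp, server, offset))
                except OSError as exc:
                    log.write("%s %s error=%s\n" % (stamp, server, exc))
    
    if __name__ == "__main__":
        audit_clock()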
    
    You also may want to check your local court's filing requirements
    regarding time and timestamping (if they have any), because they might
    mandate that a particular time source in their country be used (for
    international filings, etc.).
    
        2)    Some level of operating integrity, meaning a regularly run IDS
    process to verify the consistency of the system, such as AIDE, SHADOW, or
    any of the commercial ones (Tripwire, ISS, etc.).
    
    The bottom line is that you need to prove that your environment was
    behaving properly, and to do that you will need these filesystem IDS
    tools. The IDS should be run once after initial system installation to
    create a local baseline, and again after each added application to update
    the baseline. Then, each operating period, the IDS should be run to ensure
    the integrity of the system, and its logs forwarded to the Log Management
    process (#4 below).
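    
    A minimal sketch of that baseline step, assuming a hypothetical watch list
    and baseline filename; AIDE, Tripwire, and the rest do this (plus
    ownership, permission, and inode checks) in production.
    
    import hashlib
    import json
    import os
    
    WATCHED = ["/etc", "/usr/local/bin"]   # hypothetical watch list
    BASELINE = "fs-baseline.json"          # hypothetical baseline file
    
    def snapshot(roots=WATCHED):
        """Map path -> SHA-256 of contents for every readable file in roots."""
        state = {}
        for root in roots:
            for dirpath, _dirs, files in os.walk(root):
                for name in files:
                    path = os.path.join(dirpath, name)
                    try:
                        with open(path, "rb") as f:
                            state[path] = hashlib.sha256(f.read()).hexdigest()
                    except OSError:
                        pass               # unreadable file: skip (or log it)
        return state
    
    def save_baseline(state, path=BASELINE):
        """Write the baseline after installation or after adding software."""
        with open(path, "w") as f:
            json.dump(state, f, indent=1, sort_keys=True)
    
    def compare_to_baseline(path=BASELINE):
        """Return (added, removed, changed) paths versus the stored baseline."""
        with open(path) as f:
            old = json.load(f)
        new = snapshot()
        added = sorted(set(new) - set(old))
        removed = sorted(set(old) - set(new))
        changed = sorted(p for p in set(old) & set(new) if old[p] != new[p])
        return added, removed, changed
    
    The output of compare_to_baseline() is what would be forwarded, each
    operating period, to the Log Management process in #4.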
    
        3)    Some kind of active threat management, such as firewalls and
    internal data-path IDS services like SNORT.
    
    To provide more real-time warnings, some sites may also want to run an
    intrusion detection system on the network to look for attacks as they
    occur. This does not take the place of the exercise mandated in #2 above
    but rather augments it.
    
        4)    Some kind of Log Management regimen wherein the logs are regularly
    timestamped, rotated, and made tamper-proof
    
    This is more to protect the SysAdmin than anyone else, but attackers also
    go after the logging subsystem to disguise or hide their tracks, and that
    needs to be prevented so all attacks stay logged. As it happens, Systems
    Administrators are the real weak link in most security models today, and
    this can only be addressed by making the logging services, and the regimen
    that operates and verifies the integrity of the logs, safe from the
    fingers of the SysAdmins. This may require delegating the logging
    operations to the Site Security Officer or to a Logger Designee.
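    
    A minimal sketch of one way to make rotated logs tamper-evident: chain
    each line to the previous one with a keyed MAC, so editing or deleting a
    line breaks every link after it. The key handling and names here are
    assumptions for illustration; in the regimen above the key would live
    with the Site Security Officer or Logger Designee, not the SysAdmin.
    
    import hashlib
    import hmac
    
    def seal_log(lines, key, anchor=b"log-start"):
        """Return (sealed_lines, final_mac); key is bytes, lines are strings.
        Each sealed entry carries the running MAC of everything before it."""
        mac = hmac.new(key, anchor, hashlib.sha256).hexdigest()
        sealed = []
        for line in lines:
            mac = hmac.new(key, (mac + line).encode(), hashlib.sha256).hexdigest()
            sealed.append("%s %s" % (mac, line))
        return sealed, mac
    
    def verify_log(sealed_lines, key, anchor=b"log-start"):
        """Recompute the chain; return the first failing line number, or None."""
        mac = hmac.new(key, anchor, hashlib.sha256).hexdigest()
        for lineno, entry in enumerate(sealed_lines, 1):
            recorded_mac, _, line = entry.partition(" ")
            mac = hmac.new(key, (mac + line).encode(), hashlib.sha256).hexdigest()
            if not hmac.compare_digest(mac, recorded_mac):
                return lineno
        return None
    
    Publishing the final MAC of each rotated file somewhere outside the
    SysAdmin's reach (the audit trail in #5, or a write-once archive) is what
    makes later alteration detectable.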
    
    
        5)    Some kind of active audit model that demonstrates that each one
    of these constraints or subsystem requirements is met.
    
    This is likely between you and your auditors, whoever they are. With all
    of the above in place, your data and process will likely be admissible
    just about anywhere.
    
    
    Todd Glassey
    CTO - ServerWerks/ForensicAgents
    
    
    ----- Original Message -----
    From: "Devdas Bhagat" <devdasat_private>
    To: "Log Analysis Mailing List" <loganalysisat_private>
    Sent: Monday, December 03, 2001 11:33 PM
    Subject: Re: [logs] Due Diligence for Admission in Court
    
    
    > On 03/12/01 20:34 -0600, Tina Bird wrote:
    > > Pardon me for re-opening this can of worms.
    > >
    > > Did we ever come to a consensus, or a pseudo-consensus,
    > > on due diligence for computer logs as evidentiary
    > > quality data?
    > >
    > > What makes a judge unlikely to admit my logs as evidence?
    > > - unauthenticated data sources ("anyone can write to this
    > > datastream, therefore none of it is reliable")
    > Current standards don't support quite a lot of stuff that should be
    > reliable.
    > For example, you can log the SMTP sender, but since that is in the hands
    > of the client, it cannot be verified, or trusted.
    
    That is only true if Sendmail/mailx is used. Many of the other mail agents
    have addressed this and allow for extended handshaking.
    
    > Even if you restrict only authorized clients writing to the datastream,
    > you have no knowledge that the data itself is valid.
    
    You are not worried about the integrity of the data, only the consistency
    of the data once it entered your system.
    
    >
    > > - lack of time synchronization
    > NTP, ideally following the new RFC.
    
    No No No - NTP is not reliable over the open Internet - period.
    
    >
    > > - long term storage that is not tamper-proof
    > If you can prove that the storage is secure, this should not be that
    > much of a problem.
    > "Yes, the floppy disk could have been written to, but it was sealed in
    > the presence of two people and the seal has not been broken."
    >
    > > - no strategy for dealing with all the data once it's collected
    > Or should this be no implemented strategy?
    > I believe there have been cases in the US where logs were not deemed
    > admissible because they weren't monitored (someone please hunt down any
    > links).
    >
    > <snip>
    > > 1) can't enforce secure transmission protocols throughout
    > > the network, because standards aren't sufficiently
    > > evolved -- so standard syslog, SNMP, SMTP are okay for
    > > transport protocols.  (although see #3 below)
    > Ummmm, IPSec between systems and log hosts? and syslog-ng or another
    > syslog variant over TCP can provide a higher degree of reasonableness.
    >
    > > 2) central loghost with NTP or other time synchronization
    > > throughout the network -- use ongoing record of process
    > > IDs on logging machines to verify reasonable expectation
    > > that a particular log message came from a given machine
    > > (does that make sense?  I know what I mean...)
    > It makes sense, for some value of reasonable. What you will need are two
    > copies of the logs, on different machines. If these match, you have a
    > reasonable chance that the logs are accurate, for a high value of
    > reasonable.
    >
    > > 3) access control enforced at loghost that limits which
    > > machines can log -- help reduce likelihood of spoofed
    > > traffic -- or implement other transports altogether, like
    > > the serial cable mechanism we've discussed
    > Essential.
    >
    > > 4) loghost is of course totally locked down, SSH only
    > > access, or console only access, and dumps logs to
    > > write-once archive format on regular basis
    > ssh only access, without unpassworded keys.
    > There was an interesting suggestion about putting a host in promiscuous
    > mode and grabbing syslog traffic on the promiscuous interface (which
    > would have no IP, or ARP) using tcpdump.
    >
    > > 5) log review and reduction strategy -- anyone want to
    > > take a stab?  since presumably part of showing that the
    > > data is reliable is showing that I've thought about how
    > > I should process it.
    > The administrator should be using some kind of automated alerting system
    > that parses the log file (something like logcheck). This provides log
    > reduction, and sends the interesting parts to the administrator for
    > further analysis.
    >
    > > 6) minimum list of machines on that non-existent typical
    > > network that I should be required to monitor to be
    > > credible?
    > Every machine that is deemed critical MUST be monitored. I would say:
    > file servers, database servers, firewalls, webservers, email servers,
    > routers and switches, the time server and the loghost(s).
    >
    > Devdas Bhagat
    >
    
    
    ---------------------------------------------------------------------
    To unsubscribe, e-mail: loganalysis-unsubscribeat_private
    For additional commands, e-mail: loganalysis-helpat_private
    


