Re: [logs] Due Diligence for Admission in Court

From: todd glassey (todd.glasseyat_private)
Date: Tue Dec 04 2001 - 14:27:39 PST


    All - it is not my intent to seem snotty about this, so I apologize if I
    seemed a bit gruff in my retorts, but I am very concerned with what
    constitutes digital evidentiary models, and how to qualify them. That is
    what drives my concern for logs and the like.
    
    Todd
    
    ----- Original Message -----
    From: "todd glassey" <todd.glasseyat_private>
    To: "Devdas Bhagat" <devdasat_private>; "Log Analysis Mailing List"
    <loganalysisat_private>
    Sent: Tuesday, December 04, 2001 1:30 PM
    Subject: Re: [logs] Due Diligence for Admission in Court
    
    
    > Bhagat - I disagree with a number of your comments - my text inline below.
    >
    > Todd
    >
    > ----- Original Message -----
    > From: "Devdas Bhagat" <devdasat_private>
    > To: "Log Analysis Mailing List" <loganalysisat_private>
    > Sent: Tuesday, December 04, 2001 10:49 AM
    > Subject: Re: [logs] Due Diligence for Admission in Court
    >
    >
    > > On 04/12/01 09:40 -0800, todd glassey wrote:
    > > > If you plan on submitting something to a court, you will likely
    > > > need to have the following setup:
    > > >
    > > >     0)    Understanding what it is you are trying to do
    > > Important point, missed this one.
    > > <snip>
    > > >
    > > >     1)    Some proof of the time data that was stamped on the
    > > >     document.
    > > >
    > > > This is a real issue since NTP across the open Internet is not
    > > > reliable.
    > > Ok, but you can set up your own local NTP server. If your local time
    > > stamps are consistent across multiple local systems, you have much less
    > > to worry about.
    > > For multiple locations, run a local NTP server and a local log server
    > > at each site, plus a central log server and a central NTP server, and
    > > make reasonable attempts to maintain timestamp synchronization.
    >
    > The question is how to get time to that NTP server in the first place,
    > in the form of an Initialization Event, and how to develop a Timebase
    > management process.
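
    To illustrate the kind of cross-check I have in mind, here is a minimal
    sketch in Python that sends a bare SNTP request to each in-house NTP
    server and reports its offset from the local clock. The hostnames are
    placeholders, and the fractional part of the NTP timestamp is ignored
    for brevity; this is an illustration, not a recommended tool:

        # sntp_check.py - compare several NTP servers' clocks against the
        # local system clock (rough, whole-second resolution).
        import socket
        import struct
        import time

        NTP_EPOCH_DELTA = 2208988800  # seconds between 1900 and the Unix epoch

        def sntp_offset(server, port=123, timeout=5.0):
            """Send one SNTP client request; return server_time - local_time."""
            # First byte 0x1B = LI 0, version 3, mode 3 (client); the rest of
            # the 48-byte request is zero.
            packet = b'\x1b' + 47 * b'\0'
            with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
                sock.settimeout(timeout)
                sock.sendto(packet, (server, port))
                data, _ = sock.recvfrom(48)
            local_now = time.time()
            # The Transmit Timestamp seconds live in bytes 40-43 of the reply.
            transmit_secs = struct.unpack('!I', data[40:44])[0]
            return (transmit_secs - NTP_EPOCH_DELTA) - local_now

        if __name__ == '__main__':
            # Placeholder names for the local and central NTP servers.
            for srv in ['ntp1.example.internal', 'ntp0.example.internal']:
                try:
                    print('%-25s offset %+.3f s' % (srv, sntp_offset(srv)))
                except OSError as err:
                    print('%-25s query failed: %s' % (srv, err))

    If the reported offsets of the local and central servers disagree by more
    than your documented tolerance, that is exactly the kind of event the
    Timebase management process would have to record.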
    >
    > >
    > > <snip>
    > >
    > > >     2)    Some level of operating integrity, which means a regularly
    > > > run IDS process to verify the consistency of the system, like AIDE,
    > > > SHADOW, or any of the commercial ones (Tripwire, ISS, etc.).
    > >
    > > Yes, logging and log checking would be a part of this process.
    > >
    > > > The bottom line is that you need to prove that your environment was
    > > > behaving properly, and to do that you will need these Filesystem IDS
    > > > tools. The IDS
    > >
    > > Actually, you need to prove that it had integrity before the
    > > compromise. Behaving properly or not is a different question, and will
    > > depend on what you want it to do.
    >
    > That's the point. The audit must take into account what the system was
    > supposed to be doing and what it was not. IDS processes need to start
    > long before any break-in or compromise can possibly occur, to set a
    > baseline for the operations of the specific system under logging
    > protection.
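
    As a concrete illustration of that baseline idea, here is a small Python
    sketch in the spirit of AIDE or Tripwire (much simplified): record a
    digest of every file under a directory once, while the system is still
    known-good, then compare against that record later. The paths and file
    names are placeholders:

        # baseline.py - "baseline first, verify later" file integrity sketch.
        import hashlib
        import json
        import os
        import sys

        def hash_file(path):
            """Return the SHA-256 hex digest of one file's contents."""
            digest = hashlib.sha256()
            with open(path, 'rb') as handle:
                for chunk in iter(lambda: handle.read(65536), b''):
                    digest.update(chunk)
            return digest.hexdigest()

        def snapshot(root):
            """Map every regular file under root to its digest."""
            result = {}
            for dirpath, _dirs, files in os.walk(root):
                for name in files:
                    path = os.path.join(dirpath, name)
                    if os.path.isfile(path):
                        result[path] = hash_file(path)
            return result

        if __name__ == '__main__':
            # Usage: baseline.py init|check <directory> <baseline.json>
            mode, root, db = sys.argv[1:4]
            if mode == 'init':
                with open(db, 'w') as out:
                    json.dump(snapshot(root), out, indent=2)
            else:
                with open(db) as saved:
                    old = json.load(saved)
                new = snapshot(root)
                for path in sorted(set(old) | set(new)):
                    if old.get(path) != new.get(path):
                        print('INTEGRITY CHANGE: %s' % path)

    The point is only that the "init" run has to happen before any compromise
    is possible; the later "check" runs and their results are what the audit
    trail preserves.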
    >
    > >
    > > <snip>
    > > >     4)    Some kind of Log Management regimen wherein the logs are
    > > > regularly timestamped, rotated, and made tamper-proof.
    > >
    > > Again, see point two. You need to show integrity. That's the only thing
    > > you need to prove.
    >
    > No, it's not. You need to get more into timestamping to see where the
    > holes in this statement are.
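
    To make the timestamping point concrete, here is a minimal Python sketch
    of one way to make rotated logs tamper-evident: every rotation appends a
    record holding the log file's hash, a timestamp, and the hash of the
    previous record, so altering any earlier file or record breaks the chain.
    The file names are placeholders, and a real regimen would also have the
    newest record countersigned by an external, trusted time source:

        # chainlog.py - hash-chained ledger of rotated log files.
        import hashlib
        import json
        import os
        import time

        CHAIN_FILE = 'log_chain.json'  # placeholder name for the ledger

        def sha256_file(path):
            digest = hashlib.sha256()
            with open(path, 'rb') as handle:
                for chunk in iter(lambda: handle.read(65536), b''):
                    digest.update(chunk)
            return digest.hexdigest()

        def seal_rotated_log(path):
            """Append a chained record for a freshly rotated log file."""
            chain = []
            if os.path.exists(CHAIN_FILE):
                with open(CHAIN_FILE) as handle:
                    chain = json.load(handle)
            prev_hash = chain[-1]['record_hash'] if chain else '0' * 64
            record = {
                'file': path,
                'file_hash': sha256_file(path),
                'timestamp': time.strftime('%Y-%m-%dT%H:%M:%SZ', time.gmtime()),
                'prev_record_hash': prev_hash,
            }
            # The record's own hash covers all of the fields above, which is
            # what chains it to its predecessor.
            record['record_hash'] = hashlib.sha256(
                json.dumps(record, sort_keys=True).encode()).hexdigest()
            chain.append(record)
            with open(CHAIN_FILE, 'w') as handle:
                json.dump(chain, handle, indent=2)
            return record

        if __name__ == '__main__':
            # 'messages.1' is a placeholder for a just-rotated log file.
            print(seal_rotated_log('messages.1'))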
    > >
    > > <snip>
    > > >     5)    Some kind of active audit model that demonstrates that
    > > > each one of these constraints or subsystem requirements is met.
    > > >
    > > > This is likely between you and your auditors, whoever they are, and
    > > > then your data and process will likely be admissible just about
    > > > anywhere.
    > >
    > > As long as a consistent audit model is maintained, and well documented.
    > > If the model is not documented, it will not be admissible.
    >
    > I disagree with this as a blanket statement; it has no founding in any
    > court of law that I am familiar with. If the model can be proven by
    > audit to be consistent, then the documentation is a nicety.
    >
    > >
    > > <snip>
    > > > > For example, you can log the SMTP sender, but since that is in the
    > > > > hands of the client, it cannot be verified or trusted.
    > > >
    > > > Only true if Sendmail/mailx is used. Many of the other mailer agents
    > > > have addressed this and allow for extended handshaking.
    > >
    > > What the client supplies cannot be trusted. Only locally
    > > originated/locally verifiable data can be trusted.
    >
    > This simply is untrue. If the system does not supply SMTP on port 25,
    > then this whole question is moot. It is also an assumption about the
    > client's operating criteria, and that is unacceptable in any audit model.
    >
    > >
    > > > > Even if you restrict writing to the datastream to authorized
    > > > > clients only, you have no knowledge that the data itself is valid.
    > > >
    > > > You are not worried about the integrity of the data, only the
    > > > consistency of the data once it has entered your system.
    > >
    > > You have to worry about both. If you cannot trust the integrity of the
    > > data itself, you have problems with your tools, since they have to trust
    > > this data to act on it.
    >
    > Simply not true - the application that is using the data needs to know
    > that the data is OK; the logging system doesn't care. Remember that the
    > logging system is part of the proofing model, not the decision support
    > system that makes up part of the application.
    >
    > > GIGO.
    > > If your data is not consistent, it should immediately ring alarm bells.
    >
    > Yes, but this is about interpretation and an operating model, not a
    > capability. Note that you are mixing MUSTs/SHOULDs, which have to do
    > with process and other constructs that are user-selectable, with
    > operating processes, and I think that is a mistake.
    >
    > > If the data is consistent, then its integrity has to be validated in
    > > some form, as far as possible.
    >
    > Only if one cares what the data is. Many processes, especially logging
    > ones, will not. The applications that log to them might, but that is a
    > different layer of the puzzle. Remember that there are two purposes for
    > information in a system: one is in support of some decision support
    > process, and the other is evidentiary in nature.
    >
    > >
    > > > >
    > > > > > - lack of time synchronization
    > > > > NTP, ideally following the new RFC.
    > > >
    > > > No No No - NTP is not reliable over the open Internet - period.
    > > See what I said above. Expect that errors will occur. Make reasonable
    > > attempts to keep the errors to a minimum.
    > >
    > > The operative word here is reasonable. I would not consider it
    > > reasonable to expect a typical home user to have the expertise needed
    > > to configure a firewall, but I would expect a much higher degree of
    > > competence from a professional administrator.
    > > <snip>
    >
    > OK, but can they use a dial-out client to set their time from NIST ACTS?
    >
    > >
    > > Devdas Bhagat
    > >
    
    
    ---------------------------------------------------------------------
    To unsubscribe, e-mail: loganalysis-unsubscribeat_private
    For additional commands, e-mail: loganalysis-helpat_private
    



    This archive was generated by hypermail 2b30 : Tue Dec 04 2001 - 14:39:47 PST