Re: [logs] Due Diligence for Admission in Court - Time

From: edward.j.sargissonat_private
Date: Tue Dec 04 2001 - 15:03:52 PST


    Just a comment here.
    I don't think (IANAL) that the court is going to care about accuracy past
    about a minute.
    So why don't you set up a local NTP server? Get it to sync across the net
    regularly. Make sure it logs any major changes. Check it against the time
    pips on the radio or whatever occasionally.
    That way you know that all your servers are on the same time, and that
    that time isn't going to be far off true time.
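
    A minimal sketch of the kind of offset check I mean, assuming the
    third-party Python ntplib module; the upstream host and the one-minute
    threshold are only placeholders:

        # Compare the local clock against an upstream NTP server and log
        # any offset larger than the threshold the court might care about.
        import logging

        import ntplib  # third-party module, assumed to be installed

        THRESHOLD_SECONDS = 60          # "about a minute"
        UPSTREAM = "pool.ntp.org"       # placeholder upstream server

        logging.basicConfig(filename="clock-check.log", level=logging.INFO,
                            format="%(asctime)s %(levelname)s %(message)s")

        def check_offset():
            response = ntplib.NTPClient().request(UPSTREAM, version=3, timeout=5)
            offset = response.offset    # seconds, local clock vs. server clock
            if abs(offset) > THRESHOLD_SECONDS:
                logging.warning("offset %.3f s exceeds %d s against %s",
                                offset, THRESHOLD_SECONDS, UPSTREAM)
            else:
                logging.info("offset %.3f s against %s", offset, UPSTREAM)

        if __name__ == "__main__":
            check_offset()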
    
    Comments?
    
    Edward
    
    
    
    
    Devdas Bhagat <devdasat_private> on 05/12/2001 07:49:08
    
    Please respond to Devdas Bhagat <devdasat_private>
    
    To:   Log Analysis Mailing List <loganalysisat_private>
    cc:
    Subject:  Re: [logs] Due Diligence for Admission in Court
    
    
    On 04/12/01 09:40 -0800, todd glassey wrote:
    > If you plan on submitting something to a court, you likely may need to
    > have the following setup:
    >
    >     0)    Understanding what it is you are trying to do
    Important point, missed this one.
    <snip>
    >
    >     1)    Some proof of the time data that was stamped on the document.
    >
    > This is a real issue since NTP across the open Internet is not reliable
    Ok, but you can set up your own local NTP server. If your local time
    stamps are consistent across multiple local systems, you have much less
    to worry about.
    For multiple locations, run one local NTP server and local log server per
    site, plus a central log server and central NTP server. That counts as a
    reasonable attempt to maintain timestamp synchronization.
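
    A rough sketch of keeping a local copy of the logs while also shipping
    them to a central log host, using Python's standard logging module;
    "loghost.example.com" and the file path are placeholders:

        # Write each record locally and also forward it over UDP syslog to
        # the central log server, so the two copies can be compared later.
        import logging
        import logging.handlers

        CENTRAL_LOGHOST = ("loghost.example.com", 514)   # placeholder

        logger = logging.getLogger("audit")
        logger.setLevel(logging.INFO)

        # Local copy, timestamped by the local (NTP-disciplined) clock.
        local = logging.FileHandler("/var/log/audit-local.log")
        local.setFormatter(logging.Formatter("%(asctime)s %(name)s %(message)s"))
        logger.addHandler(local)

        # Central copy, sent to the central log server via syslog.
        logger.addHandler(logging.handlers.SysLogHandler(address=CENTRAL_LOGHOST))

        logger.info("login succeeded user=%s from=%s", "alice", "10.0.0.5")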
    
    <snip>
    
    >     2)    Some level of operating integrity which means a regularly run
    > IDS process to verify the consistency of the system, like AIDE, SHADOW,
    > or any of the commercial ones (Tripwire, ISS, etc).
    Yes, logging and log checking would be a part of this process.
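
    A toy illustration of the kind of consistency check those tools automate,
    using only the Python standard library; the baseline path and watched
    file list are made up:

        # Record SHA-256 hashes of critical files, then report any change on
        # a later run -- a very small version of what AIDE/Tripwire do.
        import hashlib
        import json
        import os

        BASELINE = "/var/lib/integrity/baseline.json"        # made-up path
        WATCHED = ["/etc/passwd", "/etc/ssh/sshd_config"]     # example files

        def digest(path):
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            return h.hexdigest()

        def main():
            current = {p: digest(p) for p in WATCHED if os.path.exists(p)}
            if not os.path.exists(BASELINE):
                with open(BASELINE, "w") as f:
                    json.dump(current, f, indent=2)
                print("baseline written")
                return
            with open(BASELINE) as f:
                baseline = json.load(f)
            for path, h in current.items():
                if baseline.get(path) != h:
                    print("CHANGED:", path)

        if __name__ == "__main__":
            main()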
    
    > The bottom line is that you need to prove that your environment was
    > behaving properly and to do that you will need these Filesystem IDS
    > tools. The IDS
    Actually, you need to prove that it had integrity before the compromise.
    Whether it was "behaving properly" is a different question, and depends on
    what you want it to do.
    
    <snip>
    >     4)    Some kind of Log Management regimen wherein the logs are
    > regularly timestamped, rotated, and made tamper-proof
    Again, see point two. You need to show integrity. That's the only thing
    you need to prove.
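
    One simple way to make a rotated log tamper-evident, sketched in Python
    only as an illustration: chain each line's hash to the previous one, so
    that editing or deleting an earlier line breaks every hash after it.

        # Append a running SHA-256 hash chain to each log line; verification
        # recomputes the chain and fails on any altered or missing line.
        import hashlib

        def chain(lines, seed="log-start"):
            prev = hashlib.sha256(seed.encode()).hexdigest()
            out = []
            for line in lines:
                prev = hashlib.sha256((prev + line).encode()).hexdigest()
                out.append("%s %s" % (prev, line))
            return out

        def verify(chained, seed="log-start"):
            prev = hashlib.sha256(seed.encode()).hexdigest()
            for entry in chained:
                stored, line = entry.split(" ", 1)
                prev = hashlib.sha256((prev + line).encode()).hexdigest()
                if prev != stored:
                    return False
            return True

        if __name__ == "__main__":
            log = chain(["user alice logged in", "file /etc/passwd read"])
            print(verify(log))                              # True
            log[0] = log[0].replace("alice", "mallory")
            print(verify(log))                              # False: tampered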
    
    <snip>
    >     5)    Some kind of active audit model that demonstrates that each one
    > of these constraints or subsystem requirements is met.
    >
    > This is likely between you and your auditors, whoever they are, and then
    > your data and process will likely be admissible just about anywhere.
    As long as a consistent audit model is maintained, and well documented.
    If the model is not documented, it will not be admissible.
    
    <snip>
    > > For example, you can log the SMTP sender, but since that is in the
    > > hands of the client, it cannot be verified, or trusted.
    >
    > Only true if Sendmail/mailx is used. Many of the other mailer agents have
    > addressed this and allow for extended handshaking.
    What the client supplies cannot be trusted. Only locally
    originated/locally verifiable data can be trusted.
    
    > > Even if you restrict only authorized clients writing to the datastream,
    > > you have no knowledge that the data itself is valid.
    >
    > You are not worried as to the integrity of the data, only the consistency
    > of the data once it entered your system.
    You have to worry about both. If you cannot trust the integrity of the
    data itself, you have problems with your tools, since they have to trust
    this data to act on it. GIGO.
    If your data is not consistent, it should immediately ring alarm bells.
    If the data is consistent, then its integrity has to be validated in some
    form, as far as possible.
    
    > >
    > > > - lack of time synchronization
    > > NTP, ideally following the new RFC.
    >
    > No No No - NTP is not reliable over the open Internet - period.
    See what I said above. Expect that errors will occur. Make reasonable
    attempts to keep the errors to a minimum.
    The operative word here is reasonable. I would not consider it reasonable
    to expect a typical home user to have the expertise needed to configure a
    firewall, but I would expect a much higher degree of competence from a
    professional administrator.
    <snip>
    
    Devdas Bhagat
    
    
    ---------------------------------------------------------------------
    To unsubscribe, e-mail: loganalysis-unsubscribeat_private
    For additional commands, e-mail: loganalysis-helpat_private
    


