Pardon my pit-bull style here -

----- Original Message -----
From: "Tina Bird" <tbird@precision-guesswork.com>
To: "todd glassey" <todd.glasseyat_private>
Cc: "Devdas Bhagat" <devdasat_private>; "Log Analysis Mailing List" <loganalysisat_private>
Sent: Tuesday, December 04, 2001 1:32 PM
Subject: Re: [logs] Due Diligence for Admission in Court

> Todd -- I don't think you seem snotty.

Thanks - I think I was just a little acerbic in my wording.

> From my point of view, there are two separate but related sets of issues:
>
> 1) What can I, an overworked sys admin, be >reasonably< expected to do to convince my HR department, or a court, that my computer audit data has been protected "as much as anyone could expect,"

First off, the term "as much as anyone could expect" may not be enough anymore; in fact, I am sure it already isn't. There are laws and other regulatory requirements coming into force that carry criminal as well as civil penalties. This is a whole new world for us as systems and network admins, and for DBAs too.

So, in response to the question of what to do: my take is that this list could actually produce something beyond the list content itself. It could produce a set of BCPs for how we do it today and for ways we can make it better. Who better to do this than the systems admins? And think of the value of developing an operations model to address this. The auditors would love us for it, since it gives them an arm's length from the process.

As for the HR people, they should have separate systems. HIPAA compliance really mandates it; no other risk model makes sense (IMHO) because of the criminal penalties attached to getting HIPAA wrong.

> --> that is, what's real world, available now, and doesn't require years of integration work and programming to make functional,

Yes, but as part of maintaining our value to the organization we have an obligation not only to be technically savvy but also to be conscious of the laws and regulatory framework we operate our computers under. The days of just being a lowly droid installing and configuring OSes are rapidly coming to an end. Did you know that you personally could be held criminally liable for operating a system with health care data in it? The HIPAA statute sets three penalties for unauthorized disclosure of named health care data - 1, 5, and 10 years at your favorite Club Fed - and this is no joke. That section of the bill went into effect at the end of August 1999.

> cos' few organizations are going to foot the bill for years of work on this.
>
> 2) What can we the security community -- or the enlightened subset that reads this mailing list -- do in terms of making technologies more reliable and easy to use, to improve the quality of "the best one can reasonably expect" as defined above?

We have talked about this before, you and I. The answer is to create pre-packaged "operations templates" and a set of guidelines for using them, or, as a group, to work with the people who are implementing them. I think this is a more savvy group of sysadmins, though, and we could set these up better than a lot of the other efforts in this area. This means building a set of models and all of the pre-set configurations. It probably also means actually implementing one to test it, which could be fun - Bruce might even subsidize it, I would think. Oh, and to add value to the process, the rest of the world needs to be in on it too.
So we do this in conjunction with other standards orgs like CASPR and the IETF, so that the technologies are widely dispersed for commentary and adoption.

> Two URLs that I've been pointed to are relevant:
>
> http://www.usdoj.gov/criminal/cybercrime/usamarch2001_4.htm
> http://www.ietf.org/internet-drafts/draft-ietf-grip-prot-evidence-05.txt

This second one is of less use than you would think. The document is a 200,000-foot view of what the right things to do are, with nothing about how to do them. Look, the projects I would suggest are:

1) Setting up and operating a secured logging infrastructure/service - i.e., pick several templates and then describe them at reference level. This includes working with event representation as well, since that is critical for evidentiary processes.

2) Forensic log management processes - setting standards for long-term data storage and proofing - i.e., pick logging methods, develop operations templates, and then describe them at reference level.

3) Multi-event logging, management, and query access to events and event streams.

Todd

> tbird
>
> On Tue, 4 Dec 2001, todd glassey wrote:
>
> > Date: Tue, 4 Dec 2001 14:27:39 -0800
> > From: todd glassey <todd.glasseyat_private>
> > Reply-To: todd glassey <todd.glasseyat_private>
> > To: todd glassey <todd.glasseyat_private>, Devdas Bhagat <devdasat_private>, Log Analysis Mailing List <loganalysisat_private>
> > Subject: Re: [logs] Due Diligence for Admission in Court
> >
> > All - it is not my intent to seem snotty about this, so I apologize if I seemed a bit gruff in my retorts, but I am very concerned with what constitutes digital evidentiary models, and how to qualify them. That is what drives my concern for logs and the like.
> >
> > Todd
> >
> > ----- Original Message -----
> > From: "todd glassey" <todd.glasseyat_private>
> > To: "Devdas Bhagat" <devdasat_private>; "Log Analysis Mailing List" <loganalysisat_private>
> > Sent: Tuesday, December 04, 2001 1:30 PM
> > Subject: Re: [logs] Due Diligence for Admission in Court
> >
> > > Bhagat - I disagree with a number of your comments - my text is inline below.
> > >
> > > Todd
> > >
> > > ----- Original Message -----
> > > From: "Devdas Bhagat" <devdasat_private>
> > > To: "Log Analysis Mailing List" <loganalysisat_private>
> > > Sent: Tuesday, December 04, 2001 10:49 AM
> > > Subject: Re: [logs] Due Diligence for Admission in Court
> > >
> > > > On 04/12/01 09:40 -0800, todd glassey wrote:
> > > > > If you plan on submitting something to a court, you will likely need to have the following setup:
> > > > >
> > > > > 0) Understanding what it is you are trying to do
> > > >
> > > > Important point, missed this one.
> > > > <snip>
> > > > >
> > > > > 1) Some proof of the time data that was stamped on the document.
> > > > >
> > > > > This is a real issue, since NTP across the open Internet is not reliable.
> > > >
> > > > Ok, but you can set up your own local NTP server. If your local time stamps are consistent across multiple local systems, you have much less to worry about.
> > > > For multiple locations, one local NTP server and local log server, plus a central log server and central NTP server.
> > > > Reasonable attempts to maintain timestamp synchronization.
> > >
> > > The question is how to get time to that NTP server in the form of an Initialization Event, and how to develop a Timebase management process.
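To make the Initialization Event idea a bit more concrete, here is a minimal sketch (in Python) of what recording such an event might look like: query a designated time source once, capture the measured offset against the local clock, and append the whole thing as a hashed record so the starting point of the timebase can be demonstrated later. This is only an illustration, not a recommendation - it assumes the third-party ntplib package, and the server name and file path are placeholders.

import hashlib
import json
import time

import ntplib  # third-party package; assumed available (pip install ntplib)

def record_timebase_init(server="time.example.org", log_path="timebase-init.log"):
    """Query `server` once and append a hashed initialization-event record."""
    response = ntplib.NTPClient().request(server, version=3)
    event = {
        "event": "timebase-initialization",
        "ntp_server": server,
        "stratum": response.stratum,
        "offset_seconds": response.offset,   # local clock vs. server clock
        "round_trip_delay": response.delay,
        "local_time": time.time(),
    }
    # Hash the serialized record so later tampering with it is detectable.
    event["sha1"] = hashlib.sha1(
        json.dumps(event, sort_keys=True).encode()).hexdigest()
    with open(log_path, "a") as log:
        log.write(json.dumps(event, sort_keys=True) + "\n")
    return event

if __name__ == "__main__":
    print(record_timebase_init())

In practice the source would be something you can defend to an auditor (a GPS- or ACTS-disciplined reference rather than an anonymous public server), and the event would be re-recorded on a schedule so that drift is documented as part of the same Timebase management process.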
> > > >
> > > > <snip>
> > > > > 2) Some level of operating integrity, which means a regularly run IDS process to verify the consistency of the system, like AIDE, SHADOW, or any of the commercial ones (Tripwire, ISS, etc.).
> > > >
> > > > Yes, logging and log checking would be a part of this process.
> > > >
> > > > > The bottom line is that you need to prove that your environment was behaving properly, and to do that you will need these filesystem IDS tools. The IDS
> > > >
> > > > Actually, you need to prove that it had integrity before the compromise. Behaving properly or not is a different matter and will depend on what you want it to do.
> > >
> > > That's the point. The audit must take into account what the system was supposed to be doing and what it was not. IDS processes need to start long before any break-in or compromise can possibly occur, to set a baseline for the operations of the specific system under logging protection.
> > >
> > > > <snip>
> > > > > 4) Some kind of log management regimen wherein the logs are regularly timestamped, rotated, and made tamper-proof
> > > >
> > > > Again, see point two. You need to show integrity. That's the only thing you need to prove.
> > >
> > > No, it's not. You need to get more into timestamping to see where the holes in this statement are.
> > >
> > > > <snip>
> > > > > 5) Some kind of active audit model that demonstrates that each one of these constraints or subsystem requirements is met.
> > > > >
> > > > > This is likely between you and your auditors, whoever they are, and then your data and process will likely be admissible just about anywhere.
> > > >
> > > > As long as a consistent audit model is maintained, and well documented. If the model is not documented, it will not be admissible.
> > >
> > > I disagree with this as a blanket statement; it has no foundation in any court of law that I am familiar with. If the model can be proven by audit to be consistent, then the documentation is a nicety.
> > >
> > > > <snip>
> > > > > > For example, you can log the SMTP sender, but since that is in the hands of the client, it cannot be verified, or trusted.
> > > > >
> > > > > Only true if Sendmail/mailx is used. Many of the other mailer agents have addressed this and allow for extended handshaking.
> > > >
> > > > What the client supplies cannot be trusted. Only locally originated/locally verifiable data can be trusted.
> > >
> > > This simply is untrue. If the system does not supply SMTP on port 25, then this whole question is moot. It is also an assumption about the client's operating criteria, and that is unacceptable in any audit model.
> > >
> > > > > > Even if you restrict writing to the datastream to authorized clients only, you have no knowledge that the data itself is valid.
> > > > >
> > > > > You are not worried about the integrity of the data, only the consistency of the data once it has entered your system.
> > > >
> > > > You have to worry about both. If you cannot trust the integrity of the data itself, you have problems with your tools, since they have to trust this data to act on it.
> > >
> > > Simply not true - the application that is using the data needs to know that the data is OK; the logging system doesn't care. Remember that the logging system is part of the proofing model, not of the decision support systems that make up a part of the application.
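To put some flesh on that "proofing model" point, and on item 4 above (logs regularly timestamped, rotated, and made tamper-proof), here is a minimal sketch of a hash-chained log, again in Python: each record carries the hash of the record before it, so editing, inserting, or deleting an earlier record breaks every hash that follows. This only illustrates the idea - the file names are placeholders, and a real deployment would also have an external authority timestamp the chain head at every rotation.

import hashlib
import json
import time

GENESIS = "0" * 64  # anchor value for an empty chain

def append_record(log_path, message):
    """Append one chained record and return its hash."""
    prev_hash = GENESIS
    try:
        with open(log_path) as log:   # find the hash of the last record
            for line in log:
                prev_hash = json.loads(line)["hash"]
    except FileNotFoundError:
        pass                          # first record in a new log
    record = {"time": time.time(), "message": message, "prev": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    with open(log_path, "a") as log:
        log.write(json.dumps(record, sort_keys=True) + "\n")
    return record["hash"]

def verify_chain(log_path):
    """Recompute every hash; any edited, inserted, or deleted record fails."""
    prev_hash = GENESIS
    with open(log_path) as log:
        for line in log:
            record = json.loads(line)
            claimed = record.pop("hash")
            recomputed = hashlib.sha256(
                json.dumps(record, sort_keys=True).encode()).hexdigest()
            if record["prev"] != prev_hash or recomputed != claimed:
                return False
            prev_hash = claimed
    return True

if __name__ == "__main__":
    append_record("audit.log", "sshd: accepted publickey for operator")
    append_record("audit.log", "logrotate: rotated /var/log/messages")
    print("chain intact:", verify_chain("audit.log"))

The point is that verification is something the auditor can rerun independently: the logging system never needs to care what the messages mean, only that the chain from the first record forward is intact.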
> > > >
> > > > GIGO.
> > > > If your data is not consistent, it should immediately ring alarm bells.
> > >
> > > Yes, but this is about interpretation and an operating model, not a capability. Note that you are mixing MUSTs and SHOULDs, which have to do with process and other constructs that are user-selectable, with operating processes, and I think that is a mistake.
> > >
> > > > If the data is consistent, then its integrity has to be validated in some form, as far as possible.
> > >
> > > Only if one cares what the data is. Many processes, especially logging ones, will not. The applications that log to them might, but that is a different layer of the puzzle. Remember, there are two purposes for information in a system: one is in support of some decision support process, and the other is evidentiary in nature.
> > >
> > > > > > > - lack of time synchronization
> > > > > > NTP, ideally following the new RFC.
> > > > >
> > > > > No No No - NTP is not reliable over the open Internet - period.
> > > >
> > > > See what I said above. Expect that errors will occur. Make reasonable attempts to keep the errors to a minimum.
> > > >
> > > > The operative word here is reasonable. I would not consider it reasonable to expect a typical home user to have the expertise needed to configure a firewall, but I would expect a much higher degree of competence from a professional administrator.
> > > > <snip>
> > >
> > > OK, but can they use a dialout client to set their time from NIST ACTS?
> > >
> > > > Devdas Bhagat
> > > >
> > > > ---------------------------------------------------------------------
> > > > To unsubscribe, e-mail: loganalysis-unsubscribeat_private
> > > > For additional commands, e-mail: loganalysis-helpat_private
> > >
> > > ---------------------------------------------------------------------
> > > To unsubscribe, e-mail: loganalysis-unsubscribeat_private
> > > For additional commands, e-mail: loganalysis-helpat_private
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: loganalysis-unsubscribeat_private
> > For additional commands, e-mail: loganalysis-helpat_private
>
> "I was being patient, but it took too long." -
> Anya, "Buffy the Vampire Slayer"
>
> Log Analysis: http://www.counterpane.com/log-analysis.html
> VPN: http://kubarb.phsx.ukans.edu/~tbird/vpn.html

---------------------------------------------------------------------
To unsubscribe, e-mail: loganalysis-unsubscribeat_private
For additional commands, e-mail: loganalysis-helpat_private
This archive was generated by hypermail 2b30 : Tue Dec 04 2001 - 17:14:55 PST