On 03/12/01 20:34 -0600, Tina Bird wrote:
> Pardon me for re-opening this can of worms.
>
> Did we ever come to a consensus, or a pseudo-consensus,
> on due diligence for computer logs as evidentiary
> quality data?
>
> What makes a judge unlikely to admit my logs as evidence?
> - unauthenticated data sources ("anyone can write to this
> datastream, therefore none of it is reliable")

Current standards leave quite a lot of data unverifiable that ought to be reliable. For example, you can log the SMTP sender, but since that value is in the hands of the client, it can be neither verified nor trusted. Even if you restrict writes to the datastream to authorized clients, you still have no guarantee that the data itself is valid.

> - lack of time synchronization

NTP, ideally following the new RFC.

> - long term storage that is not tamper-proof

If you can prove that the storage is secure, this should not be much of a problem: "Yes, the floppy disk could have been written to, but it was sealed in the presence of two people and the seal has not been broken." (A rough sketch of making the archive itself tamper-evident is at the end of this mail.)

> - no strategy for dealing with all the data once it's collected

Or should this be no *implemented* strategy? I believe there have been cases in the US where logs were deemed inadmissible because they weren't being monitored (someone please hunt down the links).

<snip>

> 1) can't enforce secure transmission protocols throughout
> the network, because standards aren't sufficiently
> evolved -- so standard syslog, SNMP, SMTP are okay for
> transport protocols. (although see #3 below)

Ummmm, IPSec between the systems and the loghosts? And syslog-ng, or another syslog variant running over TCP, can provide a higher degree of reasonableness.

> 2) central loghost with NTP or other time synchronization
> throughout the network -- use ongoing record of process
> IDs on logging machines to verify reasonable expectation
> that a particular log message came from a given machine
> (does that make sense? I know what I mean...)

It makes sense, for some value of reasonable. What you will need is two copies of the logs, kept on different machines. If these match, you have a reasonable chance that the logs are accurate, for a high value of reasonable.

> 3) access control enforced at loghost that limits which
> machines can log -- help reduce likelihood of spoofed
> traffic -- or implement other transports altogether, like
> the serial cable mechanism we've discussed

Essential.

> 4) loghost is of course totally locked down, SSH only
> access, or console only access, and dumps logs to
> write-once archive format on regular basis

SSH-only access, and no keys without passphrases. There was also an interesting suggestion about putting a host into promiscuous mode and grabbing the syslog traffic with tcpdump on the promiscuous interface, which would have no IP address and would not answer ARP.

> 5) log review and reduction strategy -- anyone want to
> take a stab? since presumably part of showing that the
> data is reliable is showing that I've thought about how
> I should process it.

The administrator should be using some kind of automated alerting system that parses the log files (something like logcheck). This provides log reduction, and sends the interesting parts to the administrator for further analysis. (A toy sketch of that kind of filtering is also at the end of this mail.)

> 6) minimum list of machines on that non-existent typical
> network that I should be required to monitor to be
> credible?

Every machine that is deemed critical MUST be monitored. I would say: file servers, database servers, firewalls, web servers, email servers, routers and switches, the time server and the loghost(s).
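To make the tamper-evidence point a little more concrete, here is a rough Python sketch (not taken from any particular tool; the file names and record format are made up) of chaining each archived log line to a running SHA-256 digest. If anyone later edits or removes a line, every digest from that point on stops matching, and the final digest can be written down and sealed along with the media so the chain cannot simply be recomputed.

    #!/usr/bin/env python
    """Minimal hash-chain sketch for tamper-evident log archiving.

    Each record's digest covers the previous digest plus the line itself,
    so editing or removing any archived line invalidates every digest
    that follows it. Illustration only; file names are hypothetical.
    """
    import hashlib

    def seal_log(log_path, sealed_path):
        """Copy a log file, appending a chained SHA-256 digest to each line."""
        prev = b""
        with open(log_path, "rb") as src, open(sealed_path, "wb") as dst:
            for line in src:
                line = line.rstrip(b"\n")
                digest = hashlib.sha256(prev + line).hexdigest().encode()
                dst.write(line + b" | " + digest + b"\n")
                prev = digest
        return prev.decode()  # final digest: record this out-of-band (e.g. on paper)

    def verify_log(sealed_path):
        """Recompute the chain; return the first record number that does not match."""
        prev = b""
        with open(sealed_path, "rb") as f:
            for number, record in enumerate(f, 1):
                line, _, digest = record.rstrip(b"\n").rpartition(b" | ")
                expected = hashlib.sha256(prev + line).hexdigest().encode()
                if digest != expected:
                    return number  # chain broken at or before this record
                prev = digest
        return None  # chain intact

    if __name__ == "__main__":
        final = seal_log("messages", "messages.sealed")
        print("final digest (note this down separately):", final)
        print("first bad record:", verify_log("messages.sealed"))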
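And a toy sketch of the logcheck-style reduction mentioned under point 5. The ignore patterns below are placeholders, not recommendations; the point is only that known noise gets dropped and whatever is left goes to a human (here it is printed, where a real setup would mail it to the administrator).

    #!/usr/bin/env python
    """Toy logcheck-style reduction: drop known-noise lines, report the rest.

    The ignore patterns are placeholders; a real deployment would keep them
    in a separate, reviewed file and mail the report rather than print it.
    """
    import re
    import sys

    # Lines matching any of these are considered routine noise and dropped.
    IGNORE_PATTERNS = [
        re.compile(r"sshd\[\d+\]: Accepted publickey for backup "),
        re.compile(r"CRON\[\d+\]: \(root\) CMD "),
        re.compile(r"ntpd\[\d+\]: (kernel time sync|adjusting local clock)"),
    ]

    def reduce_log(lines):
        """Yield only the lines that do not match any ignore pattern."""
        for line in lines:
            if not any(p.search(line) for p in IGNORE_PATTERNS):
                yield line

    if __name__ == "__main__":
        # Usage: reduce.py /var/log/messages   (or pipe the log in on stdin)
        source = open(sys.argv[1]) if len(sys.argv) > 1 else sys.stdin
        interesting = list(reduce_log(source))
        if interesting:
            print("%d line(s) need a human:" % len(interesting))
            sys.stdout.writelines(interesting)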
Devdas Bhagat