I've always thought that ensuring the integrity of a log begins at collection and ends at archival. By integrity I mean that no log entries are altered and that the complete log is collected and centralized. As for best practices, I would recommend the following:

- Collect all logs as close to real-time as possible. The longer the source log sits on the source host, the more likely it is to be altered.
- When possible, transport logs with a TCP-based protocol and encrypt them. TCP ensures reliable delivery and, combined with encryption, makes altering log entries as they traverse the network significantly more challenging.
- If UDP syslog is the only option and the central repository is across the WAN, collect close to the source and then forward via a TCP-based protocol.
- The central repository should use strong access control at the operating system, database, and application layers to ensure that log data cannot be altered once written.
- Ideally, maintain an archive/backup copy of all centralized logs. This provides redundancy in the event the central log database fails or becomes corrupt.
- When logs are written to the central repository/archive, additional controls such as hashsums or digital signatures can be used to validate integrity. If hashsums and/or signatures are used, make sure proper access controls exist around storing the hashes and signatures themselves so they cannot be modified. That in itself is another line of thought...
- If you want to get really serious (and spend some money), write archive copies of the logs to write-once storage.

I'm not sure if or how Splunk addresses the above. Vendors focused on log management (vs. SIM) address most if not all of it.

Chris Petersen, CTO & Founder
LogRhythm, Inc.
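The hashsum idea above can be sketched roughly like this: chain the hash of each entry to the previous one, so altering, deleting, or reordering any earlier line invalidates every later digest. This is a generic illustration, not how Splunk or any particular log management product implements it; the seed value is made up for the example, and in practice the digests would be stored on a separate, access-controlled system.

```python
import hashlib

SEED = b"example-chain-seed"  # hypothetical; any fixed starting value works


def chain_hashes(lines, seed=SEED):
    """Compute a hash chain over log lines: each digest covers the
    current line plus the previous digest, so tampering with any line
    breaks verification of that line and all lines after it."""
    digests = []
    prev = hashlib.sha256(seed).digest()
    for line in lines:
        h = hashlib.sha256(prev + line.encode("utf-8")).digest()
        digests.append(h.hex())
        prev = h
    return digests


def verify_chain(lines, digests, seed=SEED):
    """Recompute the chain and compare against the stored digests."""
    return digests == chain_hashes(lines, seed)
```

Note that storing only the final digest somewhere safe is enough to detect tampering anywhere in the chain, which keeps the off-host storage requirement small.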
www.LogRhythm.com

> -----Original Message-----
> From: loganalysis-bounces+chris=security-conscious.com@private
> [mailto:loganalysis-bounces+chris=security-conscious.com@private]
> On Behalf Of Patrick Debois
> Sent: Tuesday, August 22, 2006 12:44 AM
> To: loganalysis@private
> Subject: [logs] Log integrity handling on central logsystem
>
> I'm looking for feedback on how centralized log solutions handle data
> integrity. If you log directly to a central system, that log is the
> only source, so you have nothing to compare it against.
>
> - Would you rely on taking checksums of the logs and storing them on
>   another system?
> - How do you protect yourself when the central logging system is
>   compromised while the logfile is still growing?
> - Would you consider signing each log line? Signing within a text file
>   is fairly easy, but what about content stored in a database?
>
> My customer is currently looking at Splunk. It seems a great way to go
> through the logfiles, but I'm not sure we can fulfill his data
> integrity requirements with it. Then again, it probably does not stand
> in the way of another solution doing so.
>
> Patrick
>
> _______________________________________________
> LogAnalysis mailing list
> LogAnalysis@private
> http://lists.shmoo.com/mailman/listinfo/loganalysis
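Patrick's question about signing each log line could look like the following minimal sketch: a per-line HMAC tag that works equally well for a text file row or a database column, since each line verifies independently of its neighbors. The key shown is a placeholder for the example; a real key would have to live off the logging host, otherwise an attacker who compromises the host can forge tags.

```python
import hmac
import hashlib

SECRET = b"demo-key"  # hypothetical; a real key must be kept off the log host


def sign_line(line: str) -> str:
    """Append an HMAC-SHA256 tag so the stored line can be verified
    on its own, whether it lives in a flat file or a database column."""
    tag = hmac.new(SECRET, line.encode("utf-8"), hashlib.sha256).hexdigest()
    return f"{line} HMAC={tag}"


def verify_line(signed: str) -> bool:
    """Split off the tag, recompute it, and compare in constant time."""
    line, _, tag = signed.rpartition(" HMAC=")
    expected = hmac.new(SECRET, line.encode("utf-8"), hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)
```

Unlike a hash chain, this does not detect deletion or reordering of whole lines, so the two approaches are complementary rather than interchangeable.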
This archive was generated by hypermail 2.1.3 : Tue Aug 22 2006 - 11:35:20 PDT