> It just depends. Sometimes collecting volatile info
> only becomes tedious and cumbersome, and really puts
> your footprints all over the system (more so than
> just imaging the live system).

Imaging a live system would then have to include imaging RAM (volatile memory), and then being able to pull something meaningful out of it; i.e., running processes. As far as footprints... well, there would be some, particularly within the volatile information you're collecting. For example, if you're grabbing process data using pslist.exe, then one of the processes you capture will be pslist.exe itself. If you open a socket to send the info to a remote server (w/ a listener), then yes, you do make some modifications to the system. However, as long as you aren't actually modifying any of the non-volatile data (or if you do, you document it thoroughly), I still don't see the advantages of making a "live" image.

> I'm not sure about admins and what they're doing.
> From the investigative side I think some folks are
> doing it right.

I would agree. My experience comes from working with admins, as a fellow admin and as a consultant, as well as from teaching my IR course.

> Methodologies do exist in various
> formats for a) steps to be taken b) tools to use c)
> chain of custody for evidence, etc. For both live
> and post mortem analysis.

Yes, I've seen them. However, what about specific methodologies for collecting volatile data from NT/2K systems? Or just how to do IR on these systems? Most of the methodologies you mention are specific to Linux... which leaves a lot of NT/2K admins out of the loop.

> Data forensics has been
> going on for a long time now and when looking at a
> stand alone pc or a simple server that's a no
> brainer. The toughies are now the handhelds and
> other non-standard PC devices that contain
> electronic data. As well as mainframes, mid
> frames, etc.

Agreed.
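To illustrate the point about shipping volatile data off-box over a socket (so that only the running collection process, not new files on disk, is the footprint), here is a minimal present-day sketch. The function name, the use of Python, and the choice of MD5 for documenting the transfer are my own assumptions for illustration, not part of any tool mentioned above:

```python
import hashlib
import socket

def ship_volatile_data(data: bytes, host: str, port: int) -> str:
    """Send collected volatile data (e.g. pslist output) to a remote
    listener over TCP, writing nothing to the local disk.  Returns the
    MD5 of what was sent so both ends can document/verify the transfer."""
    digest = hashlib.md5(data).hexdigest()
    with socket.create_connection((host, port)) as sock:
        sock.sendall(data)
    return digest
```

In practice you would pipe the output of a tool like pslist.exe straight into a sender like this, then log the digest as part of your collection documentation.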
But data forensics is at the extreme end of the IR spectrum... there is still a lot of ground that isn't covered. I think it's possible to develop a methodology for IR (particularly on NT/2K systems, which is where I "live" right now) that will allow an admin or incident handler to turn over his data and documentation to LEOs, and they will still be able to obtain a conviction. This steps into the arena of "evidence dynamics," as discussed in Eoghan Casey's latest book.

> As for tools . . . I think it would be 'difficult'
> for a tool to grab all necessary volatile info -
> especially an automated one.

Right now I'm "living" in the Win32 arena, as I mentioned, and I think I've come up with a framework for doing just this... and I'm working on implementing it. It uses three separate client components: one for collecting specific volatile/non-volatile data, one for running external commands (like fport, pslist, handle, etc.), and one for automating the copying of files (to include computing/verifying hashes, etc.). I think I've been able to cover almost all the bases. By building a backend server application that collects the data and documents the entire process, I think I've pretty much got it. I've presented this to others, who have found it useful enough to want to create clients for other systems.

Carv

-----------------------------------------------------------------
This list is provided by the SecurityFocus ARIS analyzer service.
For more information on this free incident handling, management
and tracking system please see: http://aris.securityfocus.com
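As a rough illustration of two of the client components described above (running external commands, and copying files with hash verification), here is a minimal sketch. The function names, the use of Python, and the choice of MD5 are my assumptions; the actual implementation is not shown in the post:

```python
import hashlib
import shutil
import subprocess
from pathlib import Path

def run_external(cmd: list) -> str:
    """Run an external collection tool (fport, pslist, handle, ...)
    and capture its output for the case documentation."""
    return subprocess.run(cmd, capture_output=True, text=True).stdout

def copy_with_hash(src: str, dst: str) -> str:
    """Copy a file, then verify the copy by comparing source and
    destination digests; return the verified digest."""
    md5 = lambda p: hashlib.md5(Path(p).read_bytes()).hexdigest()
    original = md5(src)
    shutil.copy2(src, dst)
    if md5(dst) != original:
        raise IOError("copy verification failed: hash mismatch")
    return original
```

The third component, the backend server that receives the data and documents the process, would sit on the listener side of a transfer like this.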
This archive was generated by hypermail 2b30 : Tue Jun 18 2002 - 08:21:41 PDT