There was a thread this week on one of the digital forensic email lists I follow in which the initial email came from an examiner who was seeing signs of an anti-forensic wiping program. The examiner was looking for help in determining which program might have been used. He had already done what any of us would normally do, such as looking in places like the registry. I responded to the list describing how I sometimes approach problems like these in an indirect manner by looking at web history. An examiner from a government digital forensics lab found the response useful; since he’s in the process of training some new examiners, he asked if he could pass it along. Of course, I was flattered that he thought it was useful information, so I was happy to see him make use of it in training his new examiners. I thought I’d share my thoughts with the rest of the team through this blog in case anyone else finds it useful as well.
Web history is still good for this sort of investigation. It's an indirect way of going after the problem, but one of the things I've learned about digital forensic examinations is that sometimes it pays to flank the enemy, so to speak.
For example, if you come up empty with the traditional registry forensic searches, hitting an image with something like HSTEX and going over all of the available browser history might get you some results. I've had cases like that where, if I can get an image soon enough after an application of interest is installed and used, I can see the predictable timeline of events: the user's Google searches for a particular application, the user accessing a specific application's website, the download link, and sometimes file access information from IE history once the user starts interacting with the program in question.
That's why even if you have a user who is using a non-IE browser, you still want to process all of the browser history, especially the IE index.dat files, because you can still get some interesting file hits from that history.
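To make the idea concrete, here is a minimal sketch of that kind of history sweep. It assumes you've already exported the browser history to a CSV with `url` and `title` columns (a hypothetical layout; adjust the field names to whatever your history tool actually emits), and it flags any row mentioning a wiper-related keyword:

```python
import csv
import io

# Keywords suggesting an anti-forensic/wiping tool (an illustrative list,
# not exhaustive -- build your own from the case at hand).
KEYWORDS = ["eraser", "wipe", "shredder", "ccleaner", "bleachbit"]

def flag_history_rows(csv_text, keywords=KEYWORDS):
    """Return history rows whose URL or page title mentions any keyword.

    Assumes a CSV export with 'url' and 'title' columns.
    """
    hits = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        haystack = (row.get("url", "") + " " + row.get("title", "")).lower()
        if any(k in haystack for k in keywords):
            hits.append(row)
    return hits

# Toy export: a Google search for a wiping tool followed by unrelated browsing.
sample = (
    "url,title\n"
    "http://www.google.com/search?q=secure+file+eraser,secure file eraser - Google Search\n"
    "http://example.com/news,Daily News\n"
)
for row in flag_history_rows(sample):
    print(row["url"])  # prints only the search for the wiping tool
```

In a real case you'd run this over the combined output from every browser on the box, for exactly the reason above: the interesting hit may live in a history store the user never knowingly touched.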
This is one of the reasons why digital forensics feels like an art sometimes. It’s certainly a science and we should be using the scientific method early and often in how we approach our jobs. In addition to having a strong analytical approach, one of the things I like to see in an examiner is a healthy amount of creativity and curiosity. These are qualities that greatly assist in solving challenges that we’re continuously faced with in digital forensic examinations.
I’ve been bookmarking quite a few websites that have come up through my Twitter feed and other resources that I’d like to share with the group.
The HSTEX tool that I mentioned above is Craig Wilson’s awesome web browser history extractor. You can find it here, and it goes together with his NetAnalysis product like chocolate and peanut butter.
The first is one that I learned about from Jonathan Krause and that’s ddrescue. Jonathan pointed out an article at howtogeek.com that talks about the use of the tool. This might be a tool that many are already familiar with especially if you are used to doing forensics in a Linux environment. I’ve recently rediscovered the joys of using Linux in digital forensic examinations so I’m enjoying learning about tools like this.
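For anyone who hasn't used it, GNU ddrescue takes a source device, an output image, and a map file (called a logfile in older versions) that lets an interrupted run resume where it left off. Here's a small sketch that just builds the command line rather than running it; the device path and output names are placeholders, and `-d` (direct disc access) and `-r` (retry passes on bad sectors) are standard ddrescue options:

```python
import shlex

def ddrescue_cmd(source, image, mapfile, retries=3, direct=True):
    """Build a GNU ddrescue command line for imaging a failing drive.

    -d reads the disc directly (bypassing the kernel cache);
    -rN retries bad sectors N times. Paths here are placeholders.
    """
    cmd = ["ddrescue"]
    if direct:
        cmd.append("-d")
    cmd.append(f"-r{retries}")
    cmd += [source, image, mapfile]
    return cmd

# Example: image /dev/sdb to evidence.img, tracking progress in evidence.map.
print(shlex.join(ddrescue_cmd("/dev/sdb", "evidence.img", "evidence.map")))
# -> ddrescue -d -r3 /dev/sdb evidence.img evidence.map
```

The map file is the part worth remembering: if a flaky drive drops off the bus mid-image, you rerun the same command and ddrescue picks up only the unread areas.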
The next tool of interest is the CAINE Live CD. I don’t remember how I learned about this tool, but it looks interesting enough that I’d like to play with it more to see if I should add it to the toolbox. The CAINE project is maintained by Nanni Bassetti and contains a whole host of forensic tools, including the previously mentioned ddrescue. Another nice feature is the WinTaylor component, which includes tools like the NirSoft Mega Report, which runs a variety of NirSoft tools to extract data for a report.
Next up is FirePasswordViewer and, again, I didn’t write down where I learned about this one. I haven’t tried this program either, but it looks like it could be a useful tool for extracting saved login passwords from Firefox. This gets back to the idea of flanking the enemy in forensic examinations. If I have an encrypted container that I can’t easily brute force, I might be able to cut the Gordian knot by obtaining passwords from easier-to-attack sources like this and trying those same passwords against the encrypted container. Sure, you used Serpent-Twofish-AES encryption on the TrueCrypt container that you didn’t want the police to examine, but you used the same password that you saved in your Firefox password store to log in to your Facebook account.
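The flanking attack above boils down to a wordlist loop: feed every recovered browser password to whatever utility attempts to open the container. As a toy illustration, the sketch below stands in a SHA-256 comparison for the real unlock attempt (in practice `try_password` would shell out to your mounting tool); the passwords and hash are entirely made up:

```python
import hashlib

# Stand-in for the real unlock attempt. In practice you would feed each
# candidate to whatever utility actually mounts the encrypted container.
_DEMO_HASH = hashlib.sha256(b"hunter2").hexdigest()

def try_password(candidate):
    """Pretend unlock check: true if the candidate matches the demo hash."""
    return hashlib.sha256(candidate.encode()).hexdigest() == _DEMO_HASH

def attack_with_wordlist(candidates):
    """Try each recovered password; return the first that works, else None."""
    for pw in candidates:
        if try_password(pw):
            return pw
    return None

# Candidates as recovered from, say, the browser's saved-password store.
recovered = ["letmein", "hunter2", "password1"]
print(attack_with_wordlist(recovered))  # -> hunter2
```

A tiny wordlist of passwords the user actually reuses is often worth far more than a giant generic dictionary.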
Lastly, we have the Paladin Live CD from Sumuri. The guys over at the Inside the Core podcast (see below) talked about it on their most recent episode. I haven’t been able to test this one yet either, but it’s a Live CD that can be used for making forensic images. The nice thing is that it can image Macs in addition to PCs. When people ask me how long it takes to remove a hard drive from a Mac laptop, I tell them about 15 years: four years of undergraduate school, four years of medical school, five years of general surgical residency, and a two-year fellowship before you have the necessary surgical skills to successfully remove a hard drive from a Mac and put it back in without the dreaded “Bag O’ Laptop”.
Podcasts and Blogs
I don’t perform Mac forensics, but given that Inside the Core defeated Forensic 4cast (where I’m a panelist from time to time) and CyberSpeak in the 4cast awards, I thought I’d give it a spin this week. I have to say that even though this isn’t an area of digital forensics I’m currently engaged in, I really enjoyed the podcast. The most recent episode features an extended section on Google Chrome forensics which, even though it was geared toward examination on the Mac platform, was useful information for Chrome forensics on a PC as well.
Ken Pryor put up a great blog post on the SANS Forensics Blog. Ken provided us with a nice compilation of the various test images that we can use for practice and research purposes. There were quite a few that I didn’t realize were available, and I’ll happily be making use of them in future research efforts.