Have you seen these questions on the lists:
* What is the best forensic file copy program?
* What is the best forensic file hash program?
* What is the best forensic file cataloging/listing program?
* What is the best forensic file zip/unzip program for long-term storage of your reports and work product?
Well, I have seen these questions, and I have seen dozens of responses naming various programs that many simply ASSUME are valid answers.
Introduction
In today's digital landscape, ensuring the reliability and effectiveness of forensic tools is critical for investigators, cybersecurity experts, and legal
professionals. Whether you're cataloging digital evidence, hashing files, copying data, or compressing evidence for storage, the accuracy of the tools used can
make or break a case. The failure or shortcoming of a program may be a valid defense argument.
Why Participate:
Many of you have likely come across discussions about the best forensic tools, whether for file copying, hashing, cataloging, or zipping evidence for storage.
Despite numerous recommendations, there seems to be little verified data on which programs consistently meet forensic standards.
And more importantly, little data on whether a program meets your current needs for this particular case. One investigator, or one case, may need to specifically validate a hash before and after a copy, or to examine an alternate data stream to see what URL a picture came from; another may have no such need. Each investigation may have a finely tuned need for a targeted capability or operation. And don't forget: the opposing side knows what information or evidence you are relying on, and has most likely tested your particular program to see whether it has any shortcomings in processing that particular piece of evidence. Not everyone needs to process evidence in an identical way, but you should know whether your software properly processes your evidence for this case.
This initiative offers a structured way to test these tools and share findings with the community, allowing for better-informed decisions when choosing forensic software.
For the past 5 or 6 years (time flies when you are having fun) I have been giving a 4-hour session at a local college on forensic software tool testing.
These sessions cover developing test data and basic testing of the software tools you use in your forensic analysis of seized computers, as well as the tools you may
need in your role as a cyber security expert.
Now, to address these concerns, you are invited to participate in a comprehensive forensic software testing initiative. This initiative is designed to validate commonly used forensic software, ensuring it can handle the demands of real-world investigations on NTFS file systems, one of the most widely used file systems in forensic investigations. You may choose to report your findings, or not. But you will see that much of the software you test may not meet all your needs all the time.
Scope of Testing
The tests focus on the NTFS file system because it is widely used in forensic investigations involving individual computers and corporate servers. The testing will
stress a few specific aspects of NTFS by using files with long names, alternate data streams, and preset MAC (Modified, Accessed, Created) dates, all of which
may be critical aspects of a digital forensic investigation. Not every situation or capability is tested; just those three or four items, which represent simple and
general evidence attributes that may at some point be challenged. As the user of the software, you can obviously add your own requirements so the test matches your own
investigations.
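
To make those stressors concrete, here is a minimal sketch (Python 3 on Windows; the folder, filename, and stream name are hypothetical, not taken from the actual test suite) of how one such stress file could be built with a long filename, an alternate data stream, and preset dates. Note that os.utime can only preset the accessed and modified dates; presetting the created date requires the Win32 SetFileTime call.

import os
import time

folder = r"C:\ntfs_test"            # hypothetical working folder
os.makedirs(folder, exist_ok=True)

# 1. A long filename, well past the old 8.3 limit and near NTFS's 255-character name limit.
path = os.path.join(folder, "long_filename_" + "x" * 200 + ".txt")
with open(path, "w") as f:
    f.write("primary data stream\n")

# 2. An alternate data stream, like the kind that can record where a picture came from.
with open(path + ":source_url", "w") as ads:
    ads.write("http://example.com/picture.jpg\n")

# 3. Preset the accessed and modified dates to 01/01/2019 07:34:56.
preset = time.mktime((2019, 1, 1, 7, 34, 56, 0, 0, -1))
os.utime(path, (preset, preset))    # (atime, mtime); created date needs SetFileTime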
Simple setup. Yes/No?
Participants (that is you) will:
1. Download the test suite (a 3.5 MB zip/encrypted file set containing about 150 files designed to stress your forensic tools).
2. Extract the files to a clean NTFS folder to ensure accurate testing.
3. Configure the NTFS registry key to enable last access date updates (a quick check script follows this list).
4. Run tests (explained more fully below) using your forensic tools, focusing on any or all of four key areas:
A: File cataloging/listing. (in other words, create an inventory of case work, and what you seize)
B: File hashing. (check and maintain integrity of the evidence)
C: File copying. (forensically copy ALL the evidence and your reports)
D: File zipping/unzipping for evidence preservation. (maintain the evidence for future review)
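
For step 3, the setting lives in the NtfsDisableLastAccessUpdate value under HKLM\SYSTEM\CurrentControlSet\Control\FileSystem; running "fsutil behavior set disablelastaccess 0" from an elevated prompt enables the updates. Here is a minimal read-only check script (Python 3 on Windows, standard winreg module):

import winreg

key_path = r"SYSTEM\CurrentControlSet\Control\FileSystem"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
    value, _ = winreg.QueryValueEx(key, "NtfsDisableLastAccessUpdate")

# On current Windows releases the low bit decides:
# even (0 or 2) = last access updates enabled, odd (1 or 3) = disabled.
if value & 1:
    print(f"Last access updates DISABLED (value={value}) -- fix this before testing.")
else:
    print(f"Last access updates enabled (value={value}) -- OK to test.")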
Each test will assess whether the software can accurately and defensibly process files while maintaining critical metadata such as long filenames, alternate data
streams, and MAC dates.
Test Requirements (catalog, hash, copy, zip)
Participants will run their forensic software through a series of tests and report the following for inclusion into a final consolidated report:
1. File Cataloging/Listing, to determine (a short cataloging sketch follows the sample output below):
Can it create an accurate catalog/inventory of evidence at the suspect source before seizure?
Is the output easily included in, or massaged into, a report or spreadsheet?
Does it find and properly list any long filename items?
Can the software find and list all files in the tree, including hidden files and alternate data streams?
Does it record all three MAC dates accurately?
Sample output (path removed for display purposes):
top_of_lfn_folders DIR 10/02/2024 16:08:21:643c 10/02/2024 16:08:21:643w 10/02/2024 16:08:27:565a EST ....D.
ALTERNATE_STREAM_FILE.TXT 48 01/01/2019 07:34:56:789c 01/01/2019 07:34:56:789w 01/01/2019 07:34:56:789a EST ......
ALTERNATE_STREAM_FILE.TXT:ALTERNATE 34 01/01/2019 07:34:56:789c 01/01/2019 07:34:56:789w 01/01/2019 07:34:56:789a EST .adata
logfile_hidden 498 01/01/2019 07:34:56:789c 01/01/2019 07:34:56:789w 01/01/2019 07:34:56:789a EST .H....
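
To appreciate what a cataloging tool has to get right, here is a minimal sketch (Python 3 on Windows, calling the documented Win32 FindFirstStreamW/FindNextStreamW APIs through ctypes; the folder path and output layout are illustrative, not any particular tool's format) that walks a tree and prints each file's size, its three MAC dates, and any alternate data streams:

import ctypes
import os
from datetime import datetime

kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)

class WIN32_FIND_STREAM_DATA(ctypes.Structure):
    _fields_ = [("StreamSize", ctypes.c_longlong),
                ("cStreamName", ctypes.c_wchar * (260 + 36))]

kernel32.FindFirstStreamW.argtypes = [ctypes.c_wchar_p, ctypes.c_int,
                                      ctypes.POINTER(WIN32_FIND_STREAM_DATA),
                                      ctypes.c_ulong]
kernel32.FindFirstStreamW.restype = ctypes.c_void_p
kernel32.FindNextStreamW.argtypes = [ctypes.c_void_p,
                                     ctypes.POINTER(WIN32_FIND_STREAM_DATA)]
kernel32.FindClose.argtypes = [ctypes.c_void_p]
INVALID_HANDLE_VALUE = ctypes.c_void_p(-1).value

def streams(path):
    # Yield (stream_name, size) for every NTFS data stream on a file.
    data = WIN32_FIND_STREAM_DATA()
    handle = kernel32.FindFirstStreamW(path, 0, ctypes.byref(data), 0)
    if handle == INVALID_HANDLE_VALUE:
        return
    try:
        while True:
            yield data.cStreamName, data.StreamSize
            if not kernel32.FindNextStreamW(handle, ctypes.byref(data)):
                break
    finally:
        kernel32.FindClose(handle)

def catalog(root):
    for dirpath, _dirs, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            st = os.stat(full)
            # On Windows, st_ctime is the creation date, not a "change" time.
            c, w, a = (datetime.fromtimestamp(t).strftime("%m/%d/%Y %H:%M:%S")
                       for t in (st.st_ctime, st.st_mtime, st.st_atime))
            print(f"{full}  {st.st_size}  {c}c  {w}w  {a}a")
            for sname, ssize in streams(full):
                if sname != "::$DATA":   # skip the default (unnamed) stream
                    # sname looks like ":ALTERNATE:$DATA"; trim the type suffix.
                    print(f"{full}{sname.rsplit(':$', 1)[0]}  {ssize}")

catalog(r"C:\ntfs_test")   # hypothetical test folder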
Sample hash output:
PATH            MD5                               SIZE  CDATE CTIME               MDATE MTIME               ADATE ATIME
I:\TMP\DC4.txt  26E47CB686A587BD43FB77CB9EBC2937  152   11/19/2019 07:39:03:245c  07/21/2017 16:04:09:248w  07/21/2021 12:47:01:264a
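
Hashing has the same trap: a tool that hashes only a file's default stream will report a clean verification while the alternate stream goes unchecked. Here is a minimal sketch (Python 3, standard hashlib; MD5 only because the sample above uses it, and the paths are hypothetical) that hashes both a file and one of its named streams:

import hashlib

def md5_of(path, chunk_size=1024 * 1024):
    # Read in chunks so large evidence files do not exhaust memory.
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest().upper()

# On NTFS, a named stream is opened with the file:stream syntax.
print(md5_of(r"C:\ntfs_test\ALTERNATE_STREAM_FILE.TXT"))
print(md5_of(r"C:\ntfs_test\ALTERNATE_STREAM_FILE.TXT:ALTERNATE"))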
Example of a consolidated report entry for the copy test:

Program Name  | Version | Long Filename | Original Last Access | Destination Access | Destination MAC | Alternate Data Stream
              |         | Retained      | Maintained           | Maintained         |                 | Copied
my_copy.exe   | ver 1.0 | fail          | fail                 | 1/2 pass           | 1/2 pass        | fail
your_copy.exe | ver 2.1 | pass          | pass                 | fail               | fail            | pass

A "1/2 pass" means the program failed on one or more of the requirement items. For instance: it retained the created date but failed modified and accessed; it processed the parent file but didn't see the ADS; or it processed short filenames but failed LFN's. Basically, some part of the data was not properly retained, processed, or recorded. You get the idea.
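
Whichever copy or zip tool you test, every pass/fail cell above boils down to one comparison: after the operation, do the destination's hash, three MAC dates, and stream list still match the source? Here is a minimal sketch of that comparison for a single file (Python 3, reusing the hypothetical md5_of and streams helpers from the sketches above; the paths are illustrative). Exact timestamp equality is deliberately strict: NTFS stores 100-nanosecond timestamps while the classic zip format keeps only a 2-second modified date, which is exactly the kind of shortcoming these tests expose.

import os
# Assumes md5_of() and streams() from the earlier sketches are defined in this script.

def compare_copy(src, dst):
    s, d = os.stat(src), os.stat(dst)
    return {
        "hash":     md5_of(src) == md5_of(dst),
        "created":  s.st_ctime_ns == d.st_ctime_ns,
        "modified": s.st_mtime_ns == d.st_mtime_ns,
        "accessed": s.st_atime_ns == d.st_atime_ns,
        "streams":  sorted(streams(src)) == sorted(streams(dst)),
    }

for item, ok in compare_copy(r"I:\TMP\DC4.txt", r"D:\copy_out\DC4.txt").items():
    print(f"{item}: {'pass' if ok else 'FAIL'}")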