The Other Side of CIS Critical Control 2 - Inventorying *Unwanted* Software
When I work with clients and we discuss CIS Critical Control 2, their focus is often on inventorying the software they installed. Today we'll talk about inventorying software that you didn't install. Malware is typically the primary target on that "we didn't install that software" list.
The method we're looking at today inventories the running processes across the enterprise, then "sifts" that information to find outliers - applications that are running on only one or two hosts (or on 5% or 10% of hosts, whatever your cutoff is). Note that this is hunting for *running* software, not software that was installed with a traditional MSI file, so it does a good job of finding malware - especially malware that hasn't yet spread far past its initial infection hosts.
OK, let's look at the base code. We're basically running get-process, getting the on-disk path for each process, then hashing that file on disk. If the hash operation errors out (which it will for file-less malware, for instance), that file is saved to an error log. The hash is the key item: it uniquely identifies each file, so even if malware has replaced a known filename, the hash will be different on that station. You can then use this hash to reference back to malware IOCs if that's helpful. Note that the hash in this case is SHA1 - you can change this to meet whatever your hashing requirements are, or add a few different hashing algorithms if that works better for you.
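The approach described above can be sketched roughly as follows - a minimal per-host sketch, where the output filenames are placeholders to adjust for your environment:

# sketch: enumerate processes, hash each executable on disk, log hash failures
$proclist = @()
foreach ($proc in Get-Process) {
    try {
        # hash the backing file; this fails for file-less malware or access-denied paths
        $hash = Get-FileHash $proc.Path -Algorithm SHA1 -ErrorAction Stop
        # attach the hash to the process object so it survives into the final CSV
        $proc | Add-Member -NotePropertyName FileHash -NotePropertyValue $hash.Hash
        $proc | Add-Member -NotePropertyName HashAlgo -NotePropertyValue $hash.Algorithm
        $proclist += $proc
    }
    catch {
        # couldn't hash - record the process for follow-up
        ($env:ComputerName, $proc.Name, $proc.Path) -join "," | Out-File err.log -Append
    }
}
$proclist | Select-Object Id,ProcessName,Path,FileHash,HashAlgo,FileVersion,Product,ProductVersion | Export-Csv proclist.csv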
We'll then run our script across the entire organization and save both the process data and the errors in one set of files. Because we're hashing the files, it's likely better (and certainly much faster) to run this operation on the remote systems rather than opening all the files over the network to hash them.
Note that when we do this we’ll be logging the error information out to a remote share.
function RemoteTaskList {
    # the per-host collection code shown above, returning the process objects
    # rather than writing them to a local file
}
$targets = get-adcomputer -filter * -Property DNSHostName
$DomainTaskList = @()
foreach ($targethost in $targets) {
    # note: Invoke-Command is assumed here - this line was truncated in the original listing
    $DomainTaskList += invoke-command -ComputerName $targethost.DNSHostName -ScriptBlock ${function:RemoteTaskList}
}
$DomainTaskList | select-object PSComputerName,Id,ProcessName,Path,FileHash,FileVersion,Product,ProductVersion,HashAlgo | export-csv domain-wide-tasks.csv
With that CSV file exported, you can now look at the domain-wide list in Excel or any tool of your choice that will read a CSV file.
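If you'd rather do the "sifting" in PowerShell than in Excel, one approach is to group the domain-wide output by hash and keep only the hashes seen on a handful of hosts - a sketch, assuming the domain-wide-tasks.csv produced above (the two-host cutoff and the outliers.csv filename are just examples):

# group the inventory by file hash; hashes seen on only a few hosts are the outliers
$all = Import-Csv domain-wide-tasks.csv
$all | Group-Object FileHash |
    Where-Object { ($_.Group.PSComputerName | Sort-Object -Unique).Count -le 2 } |
    ForEach-Object { $_.Group } |
    Select-Object PSComputerName,ProcessName,Path,FileHash |
    Export-Csv outliers.csv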
===============
Rob VandenBrink
Coherent Security
Comments
Anonymous
Jun 26th 2019
5 years ago
something like:
# collect the process list, then loop through the list
$hostname = $env:ComputerName
# update the filenames to suit
$errfname = "\\loghost\sharename\err-"+$hostname+".csv"
$outputfname = "\\loghost\sharename\proclist-"+$hostname+".csv"
$proclist = @()
foreach ($proc in get-process)
{
    try
    {
        # hash the executable file on disk
        $hash = Get-FileHash $proc.path -Algorithm SHA1 -ErrorAction stop
        # attach the hash to the process object so it survives into the final CSV
        $proc | Add-Member -NotePropertyName FileHash -NotePropertyValue $hash.Hash
        $proc | Add-Member -NotePropertyName HashAlgo -NotePropertyValue $hash.Algorithm
        $proclist += $proc
    }
    catch
    {
        # error handling - if the file can't be hashed, either it's not there or we don't have rights to it
        # note that you will need to edit the host and share for your environment
        ($env:ComputerName,$proc.name,$proc.path) -join "," | out-file $errfname -Append
    }
}
$proclist | select-object PSComputerName,Id,ProcessName,Path,FileHash,FileVersion,Product,ProductVersion,HashAlgo | export-csv $outputfname
Anonymous
Jun 28th 2019
5 years ago
There's something to be said for running the called functions in the background - the trick there is to limit how many background threads are in play, to reduce the impact on the CPU and particularly the memory of the machine running the script.
I've had more experience running concurrent threads like this in Python than in PowerShell - I guess it's time I researched the PowerShell possibilities for this.
There might be another story in my future on concurrency ....
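For what it's worth, Invoke-Command already supports bounded fan-out natively via its -ThrottleLimit parameter - a sketch, where the throttle value of 16 is just an example and RemoteTaskList is the collection function from the diary above:

# run the collection on all targets, with at most 16 concurrent remote sessions
$targets = (Get-ADComputer -Filter * -Property DNSHostName).DNSHostName
$DomainTaskList = Invoke-Command -ComputerName $targets -ThrottleLimit 16 `
    -ScriptBlock ${function:RemoteTaskList}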