"404" is not Malware

Published: 2019-03-30
Last Updated: 2019-03-30 23:11:13 UTC
by Didier Stevens (Version: 1)
4 comment(s)

Reader Chris submitted a PowerShell log. These are interesting too. Here's what we saw:

A typical downloader command.

When I tried to download this using wget and the URL, I got a 404 page.
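To reproduce that first check, a minimal sketch with wget could look like this; the URL is a placeholder, since the real one comes from Chris's PowerShell log:

    # Fetch the URL from the downloader command and show the HTTP status line.
    # http://example.com/payload is a placeholder, not the URL from this case.
    wget --server-response --content-on-error \
         --output-document=downloaded.html \
         "http://example.com/payload"

The --server-response option makes the 404 status visible, and --content-on-error tells wget to keep the body of the error page, so it can be hashed and compared later.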

Next, I did a search for the URL on the free version of VirusTotal:
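If you prefer to script this lookup instead of using the web interface, here is a sketch assuming VirusTotal's v3 REST API and an API key in the VT_API_KEY environment variable:

    # Query the VirusTotal URL report (API v3). The URL identifier is the
    # base64url-encoded URL without padding; the URL itself is a placeholder.
    url="http://example.com/payload"
    url_id=$(printf '%s' "$url" | base64 | tr -d '=\n' | tr '+/' '-_')
    curl --silent --header "x-apikey: $VT_API_KEY" \
         "https://www.virustotal.com/api/v3/urls/$url_id"

The JSON reply contains the detection stats for the URL and also links to related objects, such as files downloaded from it.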

The URL has some detections. But more importantly, there is a link to the downloaded file. This can help me find the actual malware that was downloaded:

Notice that the detection count is 0, but that the file has a very low community score. It's a very small file: 564 bytes.

And it turns out to be HTML:

This time, VirusTotal can't help me identify the actual malware either: the hash of that small HTML file is the same as the hash of the 404 page I downloaded with wget. It's also a 404 page.
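Verifying that hash match locally is simple; a sketch assuming the 404 page was saved as downloaded.html and the SHA256 shown on VirusTotal was copied into a variable (the value below is a placeholder):

    # Compare the hash of the locally downloaded 404 page with the hash
    # reported by VirusTotal. vt_sha256 holds a placeholder value.
    vt_sha256="0000000000000000000000000000000000000000000000000000000000000000"
    local_sha256=$(sha256sum downloaded.html | awk '{print $1}')
    if [ "$local_sha256" = "$vt_sha256" ]; then
        echo "Same file: the VirusTotal sample is the 404 page, not the payload."
    else
        echo "Different file: VirusTotal saw something else than we did."
    fi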

This is something that happens more often on VirusTotal: "404" downloads being scored as malware.

That doesn't mean the initial file (the PowerShell script) wasn't malware. But what was actually downloaded wasn't malware: it was a 404 page, probably because the compromised server had been cleaned up.

 

 

Didier Stevens
Senior handler
Microsoft MVP
blog.DidierStevens.com DidierStevensLabs.com


Comments

My comment may be off-topic a bit...but for what it’s worth, we’ve been seeing some IP Fingerprinting in play during some VirusTotal (non-Tor) results.

I wonder how many times our attempts to #wget (and other research) are going to be skewed in the future by more and more events where the threats are more focused/targeted/customized/intelligent/automated to deploy only once for each target, making our research even more difficult.

Curious on comments...apologies if off-topic.
Doc
Consider that the host might also be very picky about what it serves. I have seen plenty of malware cases where the user agent must match some very specific criteria.
So in this case your wget should mimic the PowerShell user agent field in order to validate whether the host is still infected.
I have different aliases for wget commands with different options, including a couple of User Agent Strings.
https://isc.sans.edu/forums/diary/Using+Curl+to+Retrieve+Malicious+Websites/8038/
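Such an alias could wrap a command along these lines; the User-Agent value is illustrative (something resembling a PowerShell client), the real one should be copied from the submitted PowerShell log:

    # Retry the download while mimicking the user agent from the PowerShell log.
    # The User-Agent string below is illustrative, not taken from this case.
    PS_UA='Mozilla/5.0 (Windows NT 10.0; en-US) WindowsPowerShell/5.1'
    wget --server-response --content-on-error \
         --user-agent="$PS_UA" \
         --output-document=downloaded_ps_ua.html \
         "http://example.com/payload"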
I've found a few sites that were clearly hosting malware but which 404'd too, but I found that with the right UserAgent they did respond with the right page. A bunch of phishing URLs are like this - use one UserAgent and you get a 404 and use the right UserAgent with the right plugins/versions/OSes/etc and you get an exploit toolkit. I suspect some may be tailored based on the requesting IP/country/location too. Maybe this is to hide from folks like us?

In the case of the one I investigated, I suspect it was meant to hide from the owner of the infected/compromised website in the hopes it would take longer to get cleaned up.
