Tracking Newly Registered Domains

Published: 2017-12-13. Last Updated: 2017-12-13 07:16:05 UTC
by Xavier Mertens (Version: 1)

Here is the next step in my series of diaries related to domain names. After tracking suspicious domains with a dashboard[1] and proactively searching for malicious domains[2], let’s focus on newly registered domains. There are a huge number of domain registrations performed every day (on average, a few thousand per day across all TLDs). Why focus on new domains? With the many DGAs (“Domain Generation Algorithms”) used by malware families, it is useful to track newly created domains and correlate them with your local resolvers’ logs. You could detect emerging threats or suspicious activities.

The challenge is to find a list of all those domains. There are plenty of online services that provide this kind of data. Some of them let you browse the new domains online[3]; others sell such a database, usually bundled with the corresponding whois data, for a monthly fee (usually around $65)[4]. Some registries offer a list for their own TLDs (like AFNIC in France[5]), but these are limited.

I was looking for a global list that covers all TLDs and, if possible, is available for free. I found whoisds.com[6], which offers this service. They provide a complete database (domains + whois data) for a monthly fee, but the “simple” list (domains only) is available for free and without any registration.

I’m fetching the file via a simple shell script and a cron job:

#!/bin/bash
# Fetch the daily list of newly registered domains from whoisds.com
# and export it as a Splunk lookup file.
TODAY=`date --date="-2 day" +"%Y-%m-%d"`
DESTDIR="/home/domains"
URL="https://whoisds.com/whois-database/newly-registered-domains/$TODAY.zip/nrd"
USERAGENT="XmeBot/1.0 (https://blog.rootshell.be/bot/)"
TEMPFILE=`mktemp /tmp/wget_XXXXXX.zip`
LOGFILE=`mktemp /tmp/wget_XXXXXX.log`
CSVFILE="/opt/splunk/etc/apps/search/lookups/newdomains.csv"

# Check if the destination directory exists
[ -d "$DESTDIR" ] || mkdir -p "$DESTDIR"
# Ensure that the file does not already exist
[ -r "$DESTDIR/$TODAY.txt" ] && rm "$DESTDIR/$TODAY.txt"

wget -o "$LOGFILE" -O "$TEMPFILE" --user-agent="$USERAGENT" "$URL"
RC=$?
if [ "$RC" != "0" ]; then
        echo "[ERROR] Cannot fetch $URL"
        cat "$LOGFILE"
else
        unzip -d "$DESTDIR" "$TEMPFILE" >"$LOGFILE" 2>&1
        RC=$?
        if [ "$RC" != "0" ]; then
                echo "[ERROR] Cannot unzip $TEMPFILE"
                cat "$LOGFILE"
        else
                # Build the Splunk lookup: header line followed by the domains
                echo "newdomain" >"$CSVFILE"
                cat "$DESTDIR/$TODAY.txt" >>"$CSVFILE"
                rm "$LOGFILE" "$TEMPFILE"
        fi
fi

This script is executed once a day to store the daily file in the specified directory. A CSV file is also created in the Splunk application's lookups directory. Note that the script fetches the file from two days ago (--date="-2 day") because I noticed that the previous day's file is sometimes published with some delay!
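
For reference, here is a sample crontab entry to schedule the daily run. The hour and the script path /usr/local/bin/fetch_nrd.sh are only illustrative; adjust them to your environment:

# Fetch the daily NRD list every morning (example schedule and path)
30 6 * * * /usr/local/bin/fetch_nrd.sh >>/var/log/fetch_nrd.log 2>&1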

With the CSV file created in Splunk, I can now search for newly created domains in my Bro DNS logs:

index=securityonion sourcetype=bro_dns rcode="A" OR rcode="AAAA"
|rex field=qclass ".*\.(?<newdomain>\w+\.\w+)"
|search [|inputlookup newdomains.csv]
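
If you prefer to double-check outside Splunk, the same correlation can be done on the shell, since the daily file is already stored on disk. Here is a minimal sketch; it assumes bro-cut is available and that your dns.log sits at /nsm/bro/logs/current/dns.log (adjust the paths to your setup):

# Extract the queried names from Bro's dns.log and keep those that
# contain a domain from the latest NRD file. Note that grep -F matches
# substrings, so "www.example.com" matches a listed "example.com".
TODAY=`date --date="-2 day" +"%Y-%m-%d"`
bro-cut query < /nsm/bro/logs/current/dns.log \
  | sort -u \
  | grep -F -f "/home/domains/$TODAY.txt"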

You can also search for specific keywords, like brands or terms related to your business:

# cat domains_keyword.csv
keyword
*bank*
*paypal*
*apple*
*ec2*

Here is an interesting Splunk query:

|inputlookup newdomains.csv
|rex field=newdomain "(?<keyword>\w+)\.\w+"
|search [|inputlookup domains_keyword.csv]

Here is what this search returned for yesterday:

halk-bankbireysel.com
storybankmaine.org
summitbank.org 
towercommunitybankmortgage.org
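
The same keyword hunt works on the raw daily file without Splunk. Here is a minimal sketch reusing domains_keyword.csv; the sed call strips the header line and the Splunk wildcards before feeding the keywords to grep:

# Strip the "keyword" header and the '*' wildcards, then search the
# latest NRD file for the remaining fixed strings.
TODAY=`date --date="-2 day" +"%Y-%m-%d"`
sed '1d; s/\*//g' domains_keyword.csv \
  | grep -F -f - "/home/domains/$TODAY.txt"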

Happy hunting! 

[1] https://isc.sans.edu/forums/diary/Suspicious+Domains+Tracking+Dashboard/23046/
[2] https://isc.sans.edu/forums/diary/Proactive+Malicious+Domain+Search/23065/
[3] https://domainpunch.com/tlds/daily.php
[4] https://www.whoisxmlapi.com/newly-registered-domains.php
[5] https://www.afnic.fr/en/products-and-services/services/daily-list-of-registered-domain-names/#
[6] https://whoisds.com/newly-registered-domains

Xavier Mertens (@xme)
ISC Handler - Freelance Security Consultant

Comments

Why not run it every 2 days? If it runs every day and looks for 2-day-old domains, you will be duplicating half of the domains.
I reread it, never mind...
I am trying to implement this in my environment, but I have run into an issue. The script runs fine and creates the lookup file w/o any issues. Instead of using inputlookup, I went ahead and created a lookup definition for the lookup file. I also only have access to firewall logs, so my logs do not contain domain names, only IP addresses. I am trying to get around this by using the dnsLookup function to convert the IP addresses to domain names. This appears to be working fine, but I am not matching anything, as the lookup file contains only the domain name while my logs are converted to FQDNs (with a host name). Thus, I think, I need to strip the hostname off of my log records before trying to match them with the lookup file. Any input on making this work would be greatly appreciated! Here is my search string:

sourcetype=cisco:asa src_ip!="aaa.bbb.0.0/16" src_ip!="aaa.bbb.0.0/16" dest_ip="aaa.bbb.0.0/16" (message_id=30214 OR message_id=302016)
| lookup dnsLookup ip as src_ip
| lookup newdomains newdomain as host OUTPUT newdomain as host_match
| where host_match!="NONE"
| table src_ip host host_match
| dedup src_ip host

"newdomains" is the lookup definition, "newdomain" is the supported field

Thanks!
Jon
The .nu and .se ccTLD zone data can be downloaded here: https://zonedata.iis.se/
