Tracking Newly Registered Domains
Here is the next step in my series of diaries related to domain names. After tracking suspicious domains with a dashboard[1] and proactively searching for malicious domains[2], let's focus on newly registered domains. There are a huge number of domain registrations performed every day (on average, a few thousand per day, all TLDs combined). Why focus on new domains? With the multiple DGAs ("Domain Generation Algorithms") used by malware families, it is useful to track newly created domains and correlate them with your local resolvers' logs. You could detect emerging threats or suspicious activities.
The challenge is to find a list of all those domains. There are plenty of online services that provide this kind of data. Some of them allow you to browse the new domains online[3], others sell such a database, usually combined with the corresponding whois data, for a monthly fee (usually around $65)[4]. Some registrars offer a list for their own TLDs (like the AFNIC in France[5]) but those lists are limited.
I was looking for a global list that covers all TLDs and, if possible, for free. I found whoisds.com[6], which offers this service. They provide a complete database (domains + whois data) for a monthly fee, but the "simple" list (domains only) is available for free and without any registration.
I’m fetching the file via a simple shell script and a cron job:
#!/bin/bash
TODAY=`date --date="-2 day" +"%Y-%m-%d"`
DESTDIR="/home/domains"
URL="https://whoisds.com/whois-database/newly-registered-domains/$TODAY.zip/nrd"
USERAGENT="XmeBot/1.0 (https://blog.rootshell.be/bot/)"
TEMPFILE=`mktemp /tmp/wget_XXXXXX.zip`
LOGFILE=`mktemp /tmp/wget_XXXXXX.log`
CSVFILE="/opt/splunk/etc/apps/search/lookups/newdomains.csv"

# Check if the destination directory exists
[ -d "$DESTDIR" ] || mkdir -p "$DESTDIR"

# Ensure that the file does not exist already
[ -r "$DESTDIR/$TODAY.txt" ] && rm "$DESTDIR/$TODAY.txt"

wget -o $LOGFILE -O $TEMPFILE --user-agent="$USERAGENT" $URL
RC=$?
if [ "$RC" != "0" ]; then
    echo "[ERROR] Cannot fetch $URL"
    cat $LOGFILE
else
    unzip -d $DESTDIR $TEMPFILE >$LOGFILE 2>&1
    RC=$?
    if [ "$RC" != "0" ]; then
        echo "[ERROR] Cannot unzip $TEMPFILE"
        cat $LOGFILE
    else
        echo "newdomain" >$CSVFILE
        cat "$DESTDIR/$TODAY.txt" >>$CSVFILE
        rm $LOGFILE $TEMPFILE
    fi
fi
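To run it daily, a crontab entry along these lines does the job (the script path, schedule, and log file below are only examples; adapt them to your environment):

# Fetch the list of newly registered domains every morning at 06:30
30 6 * * * /usr/local/bin/fetch_nrd.sh >>/var/log/fetch_nrd.log 2>&1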
This script is executed once a day to store the daily file in the specified directory. A CSV file is also created in the specified Splunk application. Note that the script fetches the file that is two days old (--date="-2 day") because I noticed that, sometimes, the previous day's file is published with some delay!
With the CSV file created in Splunk, I can now search for newly created domains in my Bro DNS logs:
index=securityonion sourcetype=bro_dns (rcode="A" OR rcode="AAAA")
| rex field=qclass ".*\.(?<newdomain>\w+\.\w+)"
| search [|inputlookup newdomains.csv]
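If you don't run Splunk, the same correlation can be approximated with standard shell tools. Here is a rough sketch, assuming that bro-cut is installed and that the Bro log and daily list locations match your setup (both paths are examples):

# Extract the queried names from the current Bro DNS log and keep the
# ones that match an entry in the daily list of newly registered domains.
TODAY=`date --date="-2 day" +"%Y-%m-%d"`
bro-cut query < /opt/bro/logs/current/dns.log \
  | sort -u \
  | grep -Ff "/home/domains/$TODAY.txt"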
You can also search for specific keywords, such as brands or keywords related to your business:
# cat domains_keyword.csv
keyword
*bank*
*paypal*
*apple*
*ec2*
Here is an interesting Splunk query:
|inputlookup newdomains.csv
| rex field=newdomain "(?<keyword>\w+)\.\w+"
| search [|inputlookup domains_keyword.csv]
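Outside Splunk, a quick grep against the daily file achieves a similar (if less flexible) result. A minimal sketch, with the keywords hard-coded for the example:

# Search the latest list of new domains for interesting keywords
TODAY=`date --date="-2 day" +"%Y-%m-%d"`
grep -Ei '(bank|paypal|apple|ec2)' "/home/domains/$TODAY.txt"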
This search returned the following for yesterday:
halk-bankbireysel.com
storybankmaine.org
summitbank.org
towercommunitybankmortgage.org
Happy hunting!
[1] https://isc.sans.edu/forums/diary/Suspicious+Domains+Tracking+Dashboard/23046/
[2] https://isc.sans.edu/forums/diary/Proactive+Malicious+Domain+Search/23065/
[3] https://domainpunch.com/tlds/daily.php
[4] https://www.whoisxmlapi.com/newly-registered-domains.php
[5] https://www.afnic.fr/en/products-and-services/services/daily-list-of-registered-domain-names/#
[6] https://whoisds.com/newly-registered-domains
Xavier Mertens (@xme)
ISC Handler - Freelance Security Consultant