Podcast Detail

SANS Stormcast Wednesday, May 21st 2025: Researchers Scanning the Internet; Forgotten DNS Records; openpgp.js Vulnerability

If you are not able to play the podcast using the player below: Use this direct link to the audio file: https://traffic.libsyn.com/securitypodcast/9460.mp3


Researchers Scanning the Internet
A “newish” RFC, RFC 9511, suggests researchers identify themselves by adding strings to the traffic they send, or by operating web servers on the machines from which the scan originates. We do offer lists of researchers and just added three new groups today.
https://isc.sans.edu/diary/Researchers%20Scanning%20the%20Internet/31964
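
As an illustration of the RFC's suggestions, here is a minimal sketch of an "attributable" probe that identifies itself via its User-Agent; the project name, contact URL, and target address are placeholders, not a real research project:

```typescript
// Minimal sketch of an RFC 9511-style attributable probe: the scan traffic
// itself carries a URL explaining who is scanning and how to reach them.
// All names, URLs, and addresses below are placeholders.
const RESEARCH_TAG =
  "example-research-scanner/1.0 (+https://scan.example.org/why-am-i-seeing-this)";

async function probe(target: string): Promise<void> {
  try {
    const res = await fetch(target, {
      // The most common pattern in practice: identify the project in the
      // User-Agent header. RFC 9511 also allows embedding a URI elsewhere in
      // the probe payload, or serving a description file from the scanning host.
      headers: { "User-Agent": RESEARCH_TAG },
      signal: AbortSignal.timeout(5_000),
    });
    console.log(`${target} -> HTTP ${res.status}`);
  } catch (err) {
    console.log(`${target} -> unreachable (${(err as Error).message})`);
  }
}

// 192.0.2.0/24 is a documentation range, used here only as a placeholder target.
probe("http://192.0.2.10/").catch(console.error);
```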

Cloudy with a Chance of Hijacking: Forgotten DNS Records
Organizations do not always remove unused CNAME records. An attacker may take advantage of this if they are able to take possession of the now-unused public cloud resource the name pointed to.
https://blogs.infoblox.com/threat-intelligence/cloudy-with-a-chance-of-hijacking-forgotten-dns-records-enable-scam-actor/
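
The underlying hygiene check can be automated: for each CNAME you publish, confirm that the target still resolves. Below is a rough sketch using Node's dns/promises module; the host name is a placeholder, and a real audit would also verify whether an unresolvable target is a registrable cloud resource name:

```typescript
import { resolveCname, resolve4 } from "node:dns/promises";

// Flag CNAME records whose targets no longer resolve. A dangling alias that
// points at an unclaimed cloud resource (S3 bucket, Azure host name, etc.)
// may be registrable by an attacker, who then serves content under your name.
async function checkDanglingCname(host: string): Promise<void> {
  let targets: string[];
  try {
    targets = await resolveCname(host);
  } catch {
    console.log(`${host}: no CNAME record published`);
    return;
  }
  for (const target of targets) {
    try {
      await resolve4(target);
      console.log(`${host} -> ${target}: target still resolves`);
    } catch (err) {
      const code = (err as NodeJS.ErrnoException).code;
      console.log(`${host} -> ${target}: dangling (${code}), review or remove`);
    }
  }
}

// Hypothetical host name used for illustration only.
checkDanglingCname("files.example.com").catch(console.error);
```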

Message signature verification can be spoofed (CVE-2025-47934)
A vulnerability in openpgp.js may be used to spoof message signatures. openpgp.js is a popular library in systems implementing end-to-end encrypted browser applications.
https://github.com/openpgpjs/openpgpjs/security/advisories/GHSA-8qff-qr5q-5pr8
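
For context, signature verification with openpgp.js normally looks roughly like the sketch below (a generic usage example based on the library's documented API, not the advisory's proof of concept; the remediation is simply to upgrade to a patched release):

```typescript
import * as openpgp from "openpgp";

// Verify a cleartext-signed message against a sender's public key.
// Returns true only if the signature checks out; per the advisory, affected
// versions could report a spoofed signature as valid in checks like this.
async function isSignatureValid(
  cleartextMessage: string,
  armoredKey: string
): Promise<boolean> {
  const verificationKeys = await openpgp.readKey({ armoredKey });
  const message = await openpgp.readCleartextMessage({ cleartextMessage });
  const { signatures } = await openpgp.verify({ message, verificationKeys });
  try {
    // `verified` is a promise that rejects if the signature does not match.
    await signatures[0].verified;
    return true;
  } catch {
    return false;
  }
}
```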


Podcast Transcript

 Hello and welcome to the Wednesday, May 21st, 2025
 edition of the SANS Internet Storm Center's Stormcast. My
 name is Johannes Ullrich, and this episode, brought to you by
 the Graduate Certificate Program in Cyber Defense
 Operations, is recorded again in Jacksonville, Florida.
 Well, today's diary is not a hot new threat for a change,
 but something that has been happening for a while. The
 reason I wrote about it today was that, earlier today, Caleb in
 our Slack channel asked the question whether or not
 researchers scanning the internet actually obey RFC
 9511. Now, you may ask, what is RFC 9511? It's a standard
 that suggests researchers who are scanning the internet,
 well, should do something to identify their scans as
 originating from a research project and also help people
 who are targeted by these scans to contact whoever is
 the origin organization of these scans. There are a
 number of methods that this particular RFC proposes. For
 example, you can set up a web server on the system
 performing the scans and then a standard file on that web
 server will identify the origin of the scan. Also, you
 can basically add some information like a URL or such
 as part of the payload of the data that you're sending. The
 most common implementation I see is that researchers are
 adding a string like a URL to the user agent when they're
 scanning web applications. And that's, of course, one
 thing that we are looking at. For a few years now, we have been
 publishing a list of IP addresses that we consider
 part of research organizations performing scans. Now, some of
 them are commercial, but the idea is the same that they're
 scanning the internet to, for example, identify infected
 systems or identify weakly configured systems. Shodan
 probably being the largest one, with Censys also being quite
 verbose and often cited when it comes to these types of
 scans. Overall, 33,000 addresses are currently in our
 database. As I wrote the diary, we had 36 groups. Well,
 while I wrote the diary, I looked at a couple of other
 logs and actually now we're up to 39 groups that we are
 tracking for these scans. How do we know that these are not
 just attackers claiming to be researchers? And the short
 answer is we don't know. The behavior is sort of consistent
 with what I would expect researchers to do. They may hit
 like a URL, like a .git or something like this, looking
 for exposed .git directories. But they don't necessarily
 then send an actual exploit and they try not to harm the
 systems that they are scanning. But then again,
 that's why it's important that they identify themselves. And
 certainly not all of them are doing so. And there are
 certainly some that I would say are a little bit in the shady,
 odd kind of range where it's not really clear how
 benevolent their research actually is. On the same note,
 if you are planning to do a scan like this, well, first
 take a look at some of the data sets that are already out
 there. For example, Censys is publishing quite good data
 sets. There's a ton of scanning already happening for
 these kind of research purposes. Actually, in some
 cases, I think we have seen about 30% of the scans that
 hit individual systems being just attributed to these
 researchers. And maybe I'll publish an update about what
 the exact number is later in the week if I get around to
 collecting some of that data.

 And then we got a great blog by
 Infoblox. Well, they are the DNS people, so, of course, it is
 about DNS, and it tells you why you need to keep your DNS
 house in order. In particular, it is again about the problem with
 dangling CNAME records. CNAME records are essentially
 aliases. So, you can redirect queries for one of your host
 names to another host name that, of course, can be at any
 other domain. This is often used to point users to cloud
 resources. So, you set up a particular AWS website or
 bucket. And now you would like to use your domain name to
 point users to that particular resource. Well, the option you
 have is to just point a CNAME at it. The problem is that users
 often don't remove those CNAME records when they are no longer used.
 What happens now is that the particular host name it points
 to, in the case of HazyHawk, the operation that Infoblox
 observed here, is usually a cloud resource. So, the
 attacker will now take over that cloud resource. Your host
 name still points to that cloud resource. And with that,
 the attacker now gained a foothold inside your domain,
 essentially inserting content into your website. This has
 been a common issue, and various variations of this problem,
 essentially CNAME records pointing to registered domains or
 cloud resources that are not renewed or maintained after they
 are no longer needed, are being abused by attackers. Now,
 Infoblox points out that this is not a trivial attack
 because in order to find these delegations, well, the
 attacker has to usually sift through quite a bit of DNS
 data and also needs to have access to a lot of DNS data.

 Well, in the last few years, we had many applications that
 do offer end-to-end encryption for browser-based
 applications. The goal is always the same. We keep the
 files encrypted on the server, and then they are being
 decrypted in the browser once they are received by the user.
 And whatever key is being used for this is usually itself
 encrypted using the user's password. Well, there
 is a problem here in that the integrity of these systems
 depends on the integrity of any JavaScript crypto
 libraries being used here. One that's particularly popular is
 OpenPGP.js. Well, this library is vulnerable in that it did not
 correctly verify message signatures, which then can be
 spoofed, which in a limited way at least does endanger the
 integrity of these encrypted messages being viewed using
 this library. Interesting vulnerability. And if you're
 working on a system like this, please make sure that you are
 updating this library.

 Well, this is it for today. So
 thanks for listening. And of course, a couple of you
 noticed that at the beginning, I lately always reference some
 sans.edu programs. It's really just, well, they're paying the
 bills for this podcast anyway since I'm working for them, so I
 may as well give them some credit. If you happen to talk
 to anybody about joining one of these programs, well, let
 them know that you like this podcast. And with that, thanks
 and talk to you again tomorrow. Bye.