
SANS ISC: InfoSec Handlers Diary Blog - Internet Storm Center


Analysis of the Shadow Z118 PayPal phishing site

Published: 2017-04-24
Last Updated: 2017-04-24 19:50:58 UTC
by Johannes Ullrich (Version: 1)

[This is a guest post submitted by Remco Verhoef. Got something interesting to share? Please use our contact form to suggest your topic]

Today I got lucky walking around within a phishing site and found some left-over deployment files containing the complete source code of the site. This gives a unique insight into the inner workings and complexity of the site. I've analyzed the source code of many phishing sites before, but this one is definitely more sophisticated than usual.

The site has been called Scam Paypal v1.10 by the author Shadow Z118.

I'll walk you through the source code and my findings. The source code consists of 127 files, 6 MB in size, with dates ranging from September 2016 until now. The author is definitely using an Agile development process here.

The `.htaccess` file contains several measures against (automated) analysis for known anti-phishing tools:

  • it rewrites requests to the actual phishing page
  • deny rules for certain IP ranges and domains
  • an environment variable called "stealthed"
  • rewrite conditions on the referer (Google, PayPal and Firefox)
  • rewrites based on the user agent
  • rewrites for requests coming from specific IPs
  • the environment variable "bad_bot" is set for specific user-agents
  • deny rules for all kinds of A/V vendors
  • directory indexes are disabled (which wasn't applied to the root, yay!)
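The kit's actual rules aren't reproduced here, but as a rough sketch of what such an `.htaccess` looks like (every pattern, variable name and IP range below is illustrative, not taken from the kit):

```apache
# Illustrative reconstruction of the evasion rules -- patterns and IPs are examples
RewriteEngine On

# flag known scanner/crawler user-agents
SetEnvIfNoCase User-Agent "(phishtank|netcraft|curl)" bad_bot

# deny flagged bots and example A/V vendor ranges
<RequireAll>
    Require all granted
    Require not env bad_bot
    Require not ip 192.0.2.0/24
</RequireAll>

# block requests arriving via suspicious referers
RewriteCond %{HTTP_REFERER} (google|paypal) [NC]
RewriteRule ^ - [F]

# no directory listings
Options -Indexes
```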

There is also a duplicate of this file, called "htaccess" (without the usual leading dot). I assume this is a mistake.

PhishTank had an issue last week where these kinds of redirects caused safe domains to be incorrectly flagged as phishing.

The `index.php` file contains several more checks for bots:

  • A function "is_bitch" is used to identify various bots. Worth mentioning that curl is included in the list.
  • Requests with a Google user-agent get a "404" response, so Google will not index the site.
  • Google is not only identified by user-agent, but also by the hostname its IP address resolves to.
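As a minimal Python sketch of this kind of check (the real kit is PHP; the function name, keyword list and hostname suffixes here are illustrative):

```python
# Bot check combining user-agent keywords and the reverse-DNS hostname,
# modeled on the behaviour described above (all lists are illustrative).
BOT_KEYWORDS = ("googlebot", "bingbot", "curl", "wget", "phishtank")
BOT_HOST_SUFFIXES = (".googlebot.com", ".google.com", ".amazonaws.com")

def is_bot(user_agent: str, resolved_hostname: str = "") -> bool:
    """Return True when the request looks like a crawler or scanner."""
    ua = user_agent.lower()
    if any(word in ua for word in BOT_KEYWORDS):
        return True
    # Google is also identified by the hostname its IP resolves to
    host = resolved_hostname.lower()
    return any(host.endswith(suffix) for suffix in BOT_HOST_SUFFIXES)

print(is_bot("curl/7.52.1"))                                      # True
print(is_bot("Mozilla/5.0", "crawl-66-249-66-1.googlebot.com"))   # True
print(is_bot("Mozilla/5.0", "host.example.net"))                  # False
```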

This index copies the code from a source/template folder to a random per-user folder (/customer_center/customer-IDPP00C followed by a random number). After the copy, the user is redirected to this new location. I'm not completely sure why this hasn't been done with rewrites.
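A sketch of the per-user path generation (the suffix length is an assumption; only the /customer_center/customer-IDPP00C prefix comes from the kit):

```python
import random
import string

def random_customer_path() -> str:
    """Generate a per-visitor folder following the kit's
    /customer_center/customer-IDPP00C<random number> pattern."""
    suffix = "".join(random.choices(string.digits, k=8))
    return f"/customer_center/customer-IDPP00C{suffix}"

print(random_customer_path())  # e.g. /customer_center/customer-IDPP00C48210375
```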

The `robots.txt` contains Disallow rules for all folders, to prevent indexing by robots.txt-respecting bots.
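The kit lists its folders individually; the generic form of such a robots.txt is simply:

```
User-agent: *
Disallow: /
```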

The source code contains quite a few evasive measures against automated analysis by A/V vendors and bots. The `bots` folder contains more scripts that return 404s based on all kinds of checks:

  • known IP ranges
  • parts of domains (including "tor-exit", "google" and "amazonaws")
  • user-agents
  • a list of banned IPs

A global hit counter is incremented on each request. When the counter exceeds 30, a deny record is created for that specific remote address, user-agent and hostname.
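The counter logic can be sketched as follows (the threshold of 30 comes from the kit; the in-memory storage and function name are simplifications for illustration):

```python
# Hit counter: after 30 hits, a deny record is written for the
# address/user-agent/hostname combination, as described above.
HIT_LIMIT = 30
hits = {}        # ip -> request count
deny_list = []   # deny records

def register_hit(ip: str, user_agent: str, hostname: str) -> bool:
    """Count a request; return True once the client is denied."""
    hits[ip] = hits.get(ip, 0) + 1
    if hits[ip] > HIT_LIMIT:
        deny_list.append({"ip": ip, "ua": user_agent, "host": hostname})
        return True
    return False

for _ in range(31):
    denied = register_hit("192.0.2.7", "curl/7.52.1", "scanner.example")
print(denied)  # True -- the 31st hit exceeds the limit
```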

All user-agents and IP addresses of bots are tracked based on a list of keywords, building a database of user-agents and IP addresses for specific A/V vendors.

An `HTTP/1.0 404 Not Found` will be returned, sometimes accompanied by the friendly message `HELLOOOOO BITCHES | I FUCKING LOVE YOU HAHAHAHAHAHAHA <3 |  TRY BYPASS ME NEXT TIME BB <3.`. Note that it returns HTTP/1.0 even when the request was made with HTTP/1.1.

All emails are sent to both `` and ``, accompanied by the text `PUT UR FUCKING E-MAIL BRO`. There is also a somewhat obfuscated reference to ``, which will receive a copy of each email. It looks like the maker of the software wants to keep track of things without the phishers' knowledge. Within the code I found references to `Mr-YcN Z.1.1.8` and `SHADOW Z.1.1.8`.

The code contains API calls for checking credit cards and detecting countries, uses external microservices, and includes an api_key as well.

There is a file which detects the visitor's operating system and browser family from the user-agent.
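An illustrative version of this kind of user-agent parsing (the substring lists and function name are examples, not the kit's actual tables):

```python
# Derive the OS and browser family from user-agent substrings.
def detect_platform(user_agent: str):
    ua = user_agent.lower()
    os_family = "Unknown"
    for needle, name in (("windows", "Windows"), ("mac os", "macOS"),
                         ("android", "Android"), ("linux", "Linux")):
        if needle in ua:
            os_family = name
            break
    browser = "Unknown"
    for needle, name in (("edge", "Edge"), ("firefox", "Firefox"),
                         ("chrome", "Chrome"), ("safari", "Safari")):
        if needle in ua:
            browser = name
            break
    return os_family, browser

print(detect_platform("Mozilla/5.0 (Windows NT 10.0) Firefox/53.0"))
# ('Windows', 'Firefox')
```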

There is a file containing language entries; currently, only English is supported.

The lib folder contains all scripts and css files.

The actual phishing flow is as follows:

* first the user has to sign in, using her email and password

* next the user is asked to verify the account by entering card number, card type, c_valid, expiry dates, CSC, name on card, full name, address, zip code, city, state and country

* then it checks whether the credit card type is Visa, MasterCard or Maestro; if so and you are from France, Spain or Norway, the next step is skipped

* you are asked to upload your identity photos, with allowed extensions gif, jpeg, jpg and png

* social security number details, date of birth, and country-specific social security numbers

* a success page containing a summary of all entered data, redirecting you to PayPal after 5 seconds
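The skip rule in the flow above can be sketched like this, reading the check as a conjunction of card type and country (the step names and country codes are my own shorthand, not the kit's):

```python
# Flow sketch: the ID-upload step is skipped for Visa/MasterCard/Maestro
# cards from France, Spain or Norway, per the described behaviour.
SKIP_CARD_TYPES = {"visa", "mastercard", "maestro"}
SKIP_COUNTRIES = {"FR", "ES", "NO"}

def steps_for(card_type: str, country: str):
    steps = ["login", "verify_card", "upload_id", "ssn_details", "summary"]
    if card_type.lower() in SKIP_CARD_TYPES and country in SKIP_COUNTRIES:
        steps.remove("upload_id")
    return steps

print(steps_for("visa", "FR"))  # ['login', 'verify_card', 'ssn_details', 'summary']
print(steps_for("amex", "US"))  # full flow, including 'upload_id'
```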

Each step the user goes through results in an email to the noted addresses. This email contains all entered information and is distinguishable by different subjects and different senders.

* "NEW BB XD ? LOGIN INFO FROM : ".$_SESSION['_forlogin_']." ? ".$_POST['login_email']." ?"

* "".$_SESSION['_cardholder_']." ? FULLZ : ".$_SESSION['_ccglobal_']." ? ".$_SESSION['_global_']." ? ".$_SESSION['_login_email_']." ?"

* "".$_SESSION['_cardholder_']." ? VBV FULLZ : ".$_SESSION['_ccglobal_']." ? ".$_SESSION['_global_']." ? ".$_SESSION['_login_email_']." ?"

* "".$_SESSION['_cardholder_']." ? NEW ID CARD - ENJOY BTC ? ".$_SESSION['_global_']." ?"

Some other interesting parts:

* some of the comments are in French

* the HTML contains a randomization routine for class names

* all user data is saved in PHP sessions for persistence between the steps

* the forms contain validation

* emails are sent only when useful data has been entered, e.g. a password or credit card number

* all pages contain anti-bot measures
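The class-name randomization mentioned above can be sketched as follows (a simplified stand-in for the kit's PHP routine; the class names are examples):

```python
import random
import string

# Replace a set of known HTML class names with random strings per page
# load, defeating signature matching on the markup.
def randomize_classes(html: str, class_names) -> str:
    for name in class_names:
        rand = "".join(random.choices(string.ascii_lowercase, k=10))
        html = html.replace(name, rand)
    return html

page = '<div class="login-box"><input class="login-field"></div>'
print(randomize_classes(page, ["login-box", "login-field"]))
```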


Johannes B. Ullrich, Ph.D., Dean of Research, SANS Technology Institute
