
Wednesday, November 4, 2015

An explanation of webserver logs that contain requests such as "\x16\x03\x01"

Recently I have started coming across somewhat unusual entries in the access and error logs for a few of the Apache web servers that I am responsible for maintaining. The entries look like this:

95.156.251.10 - - [03/Nov/2015:13:56:23 -0500] "\x16\x03\x02\x01o\x01" 400 226 "-" "-"

Here is another example:

184.105.139.68 - - [03/Nov/2015:23:48:54 -0500] "\x16\x03\x01" 400 226 "-" "-"

These errors will be generated on a website configured to use SSL - and in fact, error messages similar to these can be generated by misconfiguring SSL for your website. This error message, for instance, can indicate an attempt to access Apache through SSL while the OpenSSL engine is either disabled or misconfigured:

Invalid method in request \x80g\x01\x03

Connections that generate that error would not be successful. This post, however, assumes that your website works normally for legitimate visitors. So what gives?

The error indicates an attempt to scan OpenSSL for the SSLv3 POODLE vulnerability. No need to panic - getting scanned is an everyday occurrence for web server administrators, and hopefully your server was patched for POODLE long ago and has SSLv3 disabled entirely. Furthermore, many of the servers scanning the internet and making these connections belong to researchers - the example above referencing the IP address 184.105.139.68 is one such case, and belongs to a group called "The Shadowserver Foundation" that scans the internet for vulnerabilities and publishes trends in their findings.
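For the curious, those escaped bytes are easy to decode by hand: they are the first bytes of an SSL/TLS handshake record that arrived where Apache expected a plaintext HTTP request line. Here is a quick illustrative helper (my own sketch, not part of any Apache tooling):

```python
# Decode the first bytes of a TLS record as Apache logs them.
# 0x16 is record type 22 (handshake); the next two bytes are the
# record-layer protocol version.
VERSIONS = {
    (3, 0): "SSLv3",
    (3, 1): "TLS 1.0",
    (3, 2): "TLS 1.1",
    (3, 3): "TLS 1.2",
}

def describe_tls_prefix(data: bytes) -> str:
    """Label bytes that look like the start of an SSL/TLS handshake record."""
    if len(data) < 3 or data[0] != 0x16:
        return "not a TLS handshake record"
    version = VERSIONS.get((data[1], data[2]), "unknown version")
    return f"TLS handshake record, {version}"

# The two log samples above:
print(describe_tls_prefix(b"\x16\x03\x02\x01o\x01"))  # TLS handshake record, TLS 1.1
print(describe_tls_prefix(b"\x16\x03\x01"))           # TLS handshake record, TLS 1.0
```

Note that the version bytes describe the record layer of the ClientHello, not necessarily what the scanner hopes to negotiate - POODLE probes can arrive under several hello variants.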

Still, even the connections made by researchers are made without the consent of those being scanned - and some admins might not like that. Furthermore, there are plenty of people doing this sort of scanning who don't have the best interests of the internet community at heart. Blacklisting IP addresses that make these sorts of connections is possible - but I recommend against blacklisting based on generic OpenSSL error patterns, as such a policy risks banning legitimate users, or even yourself, during troubleshooting or maintenance.

Saturday, October 4, 2014

GoDaddy Has Hosted Malicious and Abusive Traffic for over a Year and Doesn't Care

A little over two weeks ago I attempted to contact GoDaddy's Abuse contact about malicious scanning coming from a GoDaddy IP. This post will describe how GoDaddy not only ignored my warnings about this criminal use of their IP space, but has allowed this same scammer to use this same IP to exploit legitimate users for years, ignoring numerous warnings from their own customers, industry security experts and even other hosting companies. I will also explore some possible reasons why GoDaddy has become a so-called "Bullet-Proof" host - an honor usually reserved for basement "data centers" in Southeast Asia and Eastern Europe.

This IP tried to scan my server for Wordpress vulnerabilities, and then tried to scrape some content. The traffic was ham-fisted and amateurish; the kind of traffic that is obviously malicious. The attempt was logged, the IP immediately blacklisted, and the report forwarded to me.

This sort of thing happens all the time. And ordinarily, I am very sympathetic to hosting companies. Most hosting companies spend a lot of money and energy getting rid of scammers that abuse their service in this way. Once, many years ago, I worked for a hosting company where resolving such complaints was one of my primary responsibilities.

We all know that this kind of malicious traffic is a danger to people who are new to the internet; normal folks who just want to blog with their friends or get a little free advertising for their small business. Web pedestrians. But that's not the only danger. Scanning like this devalues the IP space of the hosting company whose network is used to facilitate it. Scanning gets IPs blacklisted. When the next (legitimate) customer comes along and tries to use one of those IPs, she finds that email doesn't work like it should, and that some people can't reach the websites hosted on her server. This is a huge hassle. Word gets around: this hosting company sells broken IPs. Customers decide to go elsewhere.

GoDaddy has grown large over the years by going in the opposite direction of most hosting companies. Rather than providing quality resources with well-trained engineering staff, GoDaddy provides half-broken resources with incompetent customer service representatives. GoDaddy is the very bottom of the down market; the catfish of datacenters. I should point out that GoDaddy's approach is not just about low prices. Providers like Linode, for example, appeal to tech-savvy people by providing very good infrastructure with no support. GoDaddy provides unreliable infrastructure with no support. GoDaddy has survived by competing solely on price for inexperienced customers. Web pedestrians.

Despite GoDaddy's long-standing position as the butt of jokes, the general opinion is that they have done, and continue to do, the bare minimum. That's why I was so surprised to find them providing long-term hosting to scammers. Preventing the use of your data centers to steal from people is, by any measure, the absolute bare minimum.

Here is a sample of the scanning looking for Wordpress vulnerabilities:

64.202.161.41 - - [08/Sep/2014:10:26:27 -0400] "GET /admin HTTP/1.1" 404 15 "-" "User-Agent\tMozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)"

64.202.161.41 - - [08/Sep/2014:10:26:28 -0400] "GET /wp-login.php HTTP/1.1" 404 15 "-" "User-Agent\tMozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)"

64.202.161.41 - - [08/Sep/2014:10:26:28 -0400] "GET /administrator HTTP/1.1" 404 15 "-" "User-Agent\tMozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)"

64.202.161.41 - - [08/Sep/2014:10:26:28 -0400] "GET /user HTTP/1.1" 404 15 "-" "User-Agent\tMozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)"

And here is a sample of the same host attempting to scrape content:

64.202.161.41 - - [08/Sep/2014:10:26:26 -0400] "GET / HTTP/1.1" 200 7218 "-" "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; CrawlDaddy v0.3.0 abot v1.2.0.0 http://code.google.com/p/abot)"

As you can see, not much is hidden. What immediately interested me was the user agent string at the end of that last log entry. abot is an open-source web crawler - a completely legitimate one, I should add. 64.202.161.41 has apparently developed their own fork of abot that they are using to scrape. They are so confident of their relationship with GoDaddy that they have used the GoDaddy name to brand their fork - calling it CrawlDaddy.
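Incidentally, tallying what a suspect host requested takes only a few lines of log parsing. A minimal sketch, assuming the combined log format shown above (the regex and helper are my own, not a standard tool):

```python
import re
from collections import Counter

# Pull the client IP and request path out of a combined-format log line.
LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "\S+ (\S+) [^"]*" \d+ \S+ "[^"]*" "([^"]*)"')

# Two of the log entries quoted above, user agents trimmed for brevity.
sample = '''\
64.202.161.41 - - [08/Sep/2014:10:26:27 -0400] "GET /admin HTTP/1.1" 404 15 "-" "Mozilla/5.0"
64.202.161.41 - - [08/Sep/2014:10:26:28 -0400] "GET /wp-login.php HTTP/1.1" 404 15 "-" "Mozilla/5.0"'''

hits = Counter()
for line in sample.splitlines():
    m = LINE.match(line)
    if m and m.group(1) == "64.202.161.41":
        hits[m.group(2)] += 1

print(hits.most_common())
```

Pointed at a real access log instead of the embedded sample, the same loop gives a quick ranking of which paths a scanner probed most.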

This bit of branded hubris gave me a means to start doing some historical research. Most scammers have a fly-by-night relationship with hosts. A scammer gets at best a few months, and more usually a few days or weeks, of abusing a hosting service before getting the boot. You don't name your malware after a hosting provider that you plan on leaving anytime soon.

Sure enough, I found article after article talking about CrawlDaddy. Jetfire Networks, a VPS host, warned their customers of the scanning. Jetfire also blacklisted GoDaddy's IP space from reaching their shared hosting customers, apparently to prevent a successful Wordpress exploit. Jetfire had absolutely nothing to do with hosting the attacks - unlike GoDaddy - yet took proactive precautionary measures. Jetfire published their notification in October 2013; the earliest reports I found were from September 2013.

I should note at this point that I am a GoDaddy customer. Several months ago I purchased a single domain name from GoDaddy for 99 cents. The money isn't really the point; the point is that I have a GoDaddy customer ID number. I'm not just some random lunatic to them.

So I emailed GoDaddy. I outlined all the technical details above, confirming those details with valid log data. I provided URLs to websites that also posted valid log data. I even explained to them how they could verify my claims using traffic sampling (netflow, etc). That was over two weeks ago. I received no reply. The host is still online, and from what I can tell, still scanning.

Others have already contacted GoDaddy about 64.202.161.41 over the last year. 64.202.161.41 could just as easily (likely more easily) be scanning other GoDaddy customers. And the scanning continues.

Tuesday, September 9, 2014

RedIRIS Compromised?

For those not familiar with Spanish ISPs, RedIRIS is Spain's National Research and Education Network. They are part of the Consorci de Serveis Universitaris de Catalunya (CSUC) and the Forum of Incident Response and Security Teams (FIRST). Essentially, it's an organization devoted to university networking projects and advanced R&D. They get their own nice big netblock to mess around with (in this case 193.144.0.0/14). Similar projects in the US would be CalREN, Internet2 and National LambdaRail.

I'm seeing what looks like malicious scanning from the RedIRIS netblock, like this:

**** - - [08/Sep/2014:18:54:34 -0400] "GET /muieblackcat HTTP/1.1" 404 15 "-" "-"
**** - - [08/Sep/2014:18:54:34 -0400] "GET //phpMyAdmin/scripts/setup.php HTTP/1.1" 404 15 "-" "-"
**** - - [08/Sep/2014:18:54:34 -0400] "GET //phpmyadmin/scripts/setup.php HTTP/1.1" 404 15 "-" "-"
**** - - [08/Sep/2014:18:54:35 -0400] "GET //myadmin/scripts/setup.php HTTP/1.1" 404 15 "-" "-"
**** - - [08/Sep/2014:18:54:35 -0400] "GET //mysqladmin/scripts/setup.php HTTP/1.1" 404 15 "-" "-"
**** - - [08/Sep/2014:18:54:35 -0400] "GET //pma/scripts/setup.php HTTP/1.1" 404 15 "-" "-"
**** - - [08/Sep/2014:18:54:36 -0400] "GET //mysql/scripts/setup.php HTTP/1.1" 404 15 "-" "-"
**** - - [08/Sep/2014:18:54:36 -0400] "GET //scripts/setup.php HTTP/1.1" 404 15 "-" "-"
**** - - [08/Sep/2014:18:54:37 -0400] "GET //MyAdmin/scripts/setup.php HTTP/1.1" 404 15 "-" "-"
**** - - [08/Sep/2014:18:54:37 -0400] "GET //typo3/phpmyadmin/scripts/setup.php HTTP/1.1" 404 15 "-" "-"
**** - - [08/Sep/2014:18:54:37 -0400] "GET //phpadmin/scripts/setup.php HTTP/1.1" 404 15 "-" "-"
**** - - [08/Sep/2014:18:54:38 -0400] "GET //pma/scripts/setup.php HTTP/1.1" 404 15 "-" "-"
**** - - [08/Sep/2014:18:54:38 -0400] "GET //web/phpMyAdmin/scripts/setup.php HTTP/1.1" 404 15 "-" "-"
**** - - [08/Sep/2014:18:54:39 -0400] "GET //xampp/phpmyadmin/scripts/setup.php HTTP/1.1" 404 15 "-" "-"
**** - - [08/Sep/2014:18:54:39 -0400] "GET //web/scripts/setup.php HTTP/1.1" 404 15 "-" "-"
**** - - [08/Sep/2014:18:54:39 -0400] "GET //php-my-admin/scripts/setup.php HTTP/1.1" 404 15 "-" "-"
**** - - [08/Sep/2014:18:54:40 -0400] "GET //websql/scripts/setup.php HTTP/1.1" 404 15 "-" "-"

The traffic lacks the usual signs of IP spoofing. Spoofed scanning that I come across tends to show multiple IPs attempting the same types of connections within a fairly short period of time. Here, the access attempts are unique. If these connections were spoofed, they would be pointless - not enough connections to add any server load for a DoS attempt, and no way to route a reply. A few days ago, against another target host, I saw an identical block of requests from a server in a California data center. This all points to a botnet looking to expand itself.

I've tried to contact RedIRIS, but they are a big organization and my Spanish is barely comprehensible. If anyone affiliated with RedIRIS, FIRST or CSUC reads this, please email me or leave a comment below with your email. I would be happy to provide additional data that would help to identify and remove the source of malicious traffic.

As many readers already know, the files this scan looks for should never be accessible to public traffic. Best practice is to remove installation files once an application's install is complete. Keeping administrative tools like phpMyAdmin in uniquely named directories doesn't hurt, either.
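Belt-and-suspenders: besides deleting installer files, you can refuse these probe paths outright at the server level. A sketch using Apache 2.2-style access control to match the other directives on this blog (adjust the pattern to your own layout):

```apache
## Refuse the phpMyAdmin-style installer paths this scan probes for.
## Returns 403 instead of 404, so blocked probes stand out in the logs.
<LocationMatch "/scripts/setup\.php$">
    Order allow,deny
    Deny from all
</LocationMatch>
```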

Wednesday, September 3, 2014

An Example of Bad Referrer Traffic and How to Block it Using ModRewrite and IPTables

I've been getting requests like this on one of my web servers on an almost daily basis:

114.232.243.86 - - [01/Sep/2014:09:51:34 -0400] "GET http://hotel.qunar.com/render/hoteldiv.jsp?&__jscallback=XQScript_4 HTTP/1.1" 404 15 "http://hotel.qunar.com/" "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.1916.114 Safari/537.36"

The traffic comes from all sorts of different IPs owned by China Telecom: 114.232.243.86, 114.231.42.219, 222.209.137.232, 222.209.152.192, 118.113.227.95.

The host I am seeing this on does not need to speak to anyone or anything in China, so I used IPTables to filter the entire netblocks the hits come from. Here is an example of a filtering rule, along with a little comment for my own reference. Notice that this rule assumes two nonstandard chains - BLACKLIST and LOGDROP - that I use to organize my ruleset.

-A BLACKLIST -s 114.224.0.0/12 -m comment --comment "Chinanet Hotel Qunar Referrer" -j LOGDROP
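For anyone copying this: BLACKLIST and LOGDROP are chains I define myself, not iptables builtins. In iptables-save/-restore form, they might be declared and wired in like this (an illustrative sketch; your logging preferences will differ):

```
:BLACKLIST - [0:0]
:LOGDROP - [0:0]
# Send all inbound traffic through the blacklist first.
-A INPUT -j BLACKLIST
# Log a rate-limited sample of what gets dropped, then drop it.
-A LOGDROP -m limit --limit 5/min -j LOG --log-prefix "LOGDROP: "
-A LOGDROP -j DROP
```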

Because I'm not sure which IP the next connection will come from, but all of the connections rely on the hostname hotel.qunar.com, I also set up a RewriteMap in Apache for that hostname. RewriteMap directives have to be added at the virtualhost or server level - they can't be placed within an .htaccess file. So I added the following to an Apache Conf include file (again to keep things organized):

##
## Bad Referrer Deflection via RewriteMap
##
RewriteEngine on
RewriteMap deflector txt:/$PATHTOFILE/deflector.map
RewriteCond %{HTTP_REFERER} !=""
RewriteCond ${deflector:%{HTTP_REFERER}} =-
RewriteRule ^ %{HTTP_REFERER} [R,L]
RewriteCond %{HTTP_REFERER} !=""
RewriteCond ${deflector:%{HTTP_REFERER}|NOT-FOUND} !=NOT-FOUND
RewriteRule ^.* ${deflector:%{HTTP_REFERER}} [R,L]

My deflector.map file looks like this (make sure that the file has permissions necessary for Apache to read it):


##
## deflector.map
##
http://hotel.qunar.com -

The "-" after the bad hostname tells Apache where to send the connection: "-" redirects the request back to its own referrer. However, you can send the traffic to a page informing the scanner that you know what they are up to, if you are feeling confrontational (and don't mind the additional load).

Your deflector.map doesn't have to be a text file. Using a dbm hash file is both possible and considerably faster. Read more about the RewriteMap directive at the Apache project website.
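For the dbm variant, Apache ships a converter, httxt2dbm. A sketch, reusing the $PATHTOFILE placeholder from above (the output filename is my own choice):

```
httxt2dbm -i /$PATHTOFILE/deflector.map -o /$PATHTOFILE/deflector.dbm
```

Then point the RewriteMap at the hash file instead of the text file:

```apache
RewriteMap deflector dbm:/$PATHTOFILE/deflector.dbm
```

Remember to regenerate the dbm file whenever you edit the text source.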
