Webmastering

Moving a site to a subfolder under another site or domain

Suppose you want to move site2.com under site1.com, so that after the move site2.com's content is served from site1.com/new. Here are the steps I took to move it:

  1. Move the content of site2.com into a new folder (new/) under site1.com.
  2. Change RewriteBase in site1.com/new/.htaccess to match the new path (see the sketch after this list).
  3. Update robots.txt so that entries like /include/ become /new/include/, matching the moved paths.
  4. Set up a .htaccess redirect on site2.com, ensuring that even the mobile domain (m.site2.com) is properly redirected to m.site1.com/new:
    
    RewriteEngine On
    
    # Requests on the main host (not m.) go to www.site1.com/new, keeping the request path
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{HTTP_HOST} !^m\.
    RewriteRule ^ http://www.site1.com/new%{REQUEST_URI} [L,R=301,NE]
    
    # Requests on the mobile host (m.site2.com) go to m.site1.com/new instead
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{HTTP_HOST} ^m\.
    RewriteRule ^ http://m.site1.com/new%{REQUEST_URI} [L,R=301,NE]
    
    
    
  5. Adjust Google Custom Search settings for site1.com
  6. Update the site URLs on Facebook, Twitter, etc.
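
For step 2, this is roughly what the top of site1.com/new/.htaccess might start with after the move, assuming the folder is literally named new (adjust to your own folder name):

    RewriteEngine On
    RewriteBase /new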

How to Identify Malicious Pharma Hacking Attack Scripts in Your Website

I wrote this Perl script after my sites were attacked by Pharma hacking scripts (on 17th Sept 2012). If you have a VPS or dedicated server, cPanel already allows scanning for malware.

I then scanned all the files and also compared them (some 80,000) with an earlier backup using a folder comparison tool.

I was shocked to find another infection that had been present since 2009 but somehow had never managed to run!

Thank god I found the infection by chance (within 12 days or so) and saved my sites' rankings and revenue from falling. Google had already started showing "Site compromised" in search results. It would have taken months to recover the lost rankings if I had learnt about the infection later.

I decided to write my own tool to alert me whenever another attack occurs. I had to write it because I don't trust Joomla as much as I do Drupal, especially since I'm running older versions.

Assumptions

All hacking code is made up of one of the following:

  1. Base64-encoded strings in one form or another; these strings contain no whitespace or invisible characters, and their characters look random rather than following any pattern.
  2. A small piece of code that does something like reading input from $_REQUEST and calling eval() on it.
  3. An executable file; even if it does no direct harm, your site's rankings will suffer if Google finds a virus on it.
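
Based on these assumptions, detection boils down to a handful of pattern matches. Here is a rough sketch of the idea in Perl; it is only an illustration, not the actual logic of my script:

    #!/usr/bin/perl
    # Rough heuristics only - illustrative, not the real locatesuspiciousfiles.pl logic.
    use strict;
    use warnings;

    sub looks_suspicious {
        my ($name, $content) = @_;
        my @reasons;

        # 1. Long base64-like runs: no whitespace, random-looking [A-Za-z0-9+/] characters
        push @reasons, 'long base64-like string'
            if $content =~ m{[A-Za-z0-9+/]{200,}={0,2}};

        # 2. eval() applied to request input or to decoded base64
        push @reasons, 'eval on request input'
            if $content =~ m{eval\s*\(\s*(?:base64_decode\s*\(|\$_(?:REQUEST|GET|POST))}i;

        # 3. Executable files are reported regardless of content
        push @reasons, 'executable file'
            if $name =~ m{\.exe$}i;

        return @reasons;
    }

    # Example usage: check a single file given on the command line
    my $file = shift @ARGV or die "usage: $0 <file>\n";
    open my $fh, '<', $file or die "cannot open $file: $!\n";
    my $content = do { local $/; <$fh> };
    close $fh;

    if (my @why = looks_suspicious($file, $content)) {
        print "$file: ", join(', ', @why), "\n";
    }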

What's the objective?

  1. If any .htaccess file is changed, it will send an alert by email and to my mobile. This check runs once every hour.
  2. All files are scanned once a day (preferably at night), and if any hacking code is found we'll be alerted.

Limitation

Unfortunately I'm on shared hosting, so I can't write a CPU-intensive script; otherwise my account may be closed.

Mandatory Alerts

Hacking attacks occur once in months or years. So if the monitoring scripts stop running for one reason or another, we will be susceptible to attack again without knowing it. Therefore the scripts must send an alert at least once a day (after 12 midnight), whether infections are found or not.

If you don't receive any alert on a given day, don't forget to check whether the scripts are still running.

The Scripts

  1. Monitor changes in .htaccess
  2. Monitor for hacking infections (this page is about this one)

How to Run it

This is a Perl script, and remember that Perl is available on practically every system. Mine is on Linux hosting, and here is the line I've put in crontab:

45 0 * * *  /usr/local/bin/perl /home/john/www/tmp/all/hacked/locatesuspiciousfiles.pl --noverbose --onlyphpfiles --exitonsecondtime --workingdir='/home/john/www/tmp/all/hacked' --targetdir='/home/john/public_html'

Here 45 0 means it runs at the 45th minute of hour 00, i.e., at 00:45 am once every day. I also recommend not using the --onlyphpfiles option; I'm using it only because I'm on a shared server.
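
The hourly .htaccess monitor (script 1 above) goes into crontab the same way; the script path below is only a placeholder for whatever you've named your monitor script:

    # run at minute 0 of every hour
    0 * * * *  /usr/local/bin/perl /path/to/your/htaccess-monitor.pl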

Command line options

If you run it with -h or --help, these command line options will be printed:

  • --verbose/--noverbose sets/resets verbose mode
  • --onlyphpfiles/--noonlyphpfiles sets/resets the mode in which only PHP files are processed and the rest ignored
  • --exitonsecondtime/--noexitonsecondtime exits/keeps running if the program is called a second time after 00:00 hours on any day
  • --workingdir="path" sets the path where the log files will be created; if the path contains spaces, enclose it in double quotes (otherwise the quotes are optional)
  • --targetdir="path" sets the path whose directories/subdirectories will be processed by this script; if the path contains spaces, enclose it in double quotes (otherwise the quotes are optional)
  • --help invokes this help

Setting Parameters

Before you run it, open the script and set the following parameters:

  1. $ENV{TZ} = 'Asia/Kolkata'; #Change to your timezone
  2. Email Settings

    Set the SMTP server email settings; they are self-explanatory (see the sketch after this list).

  3. $MAX_FILE_SIZE = 4 * 1048576 # If any file exceeds this limit (4 MB), only the first 4 MB of its content will be read and examined. You can also ignore unnecessary files by placing them in skipfilelistAntiHack.txt.
  4. @EXTENSIONS_TO_REPORT # Add a regex for any filenames/extensions that must be reported at any cost. By default *.exe files are reported.
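
To give an idea of what the email part boils down to, here is a minimal sketch using the core Net::SMTP module; the host, addresses and variable names are placeholders, not the script's actual settings:

    use strict;
    use warnings;
    use Net::SMTP;

    # Placeholder settings - replace with your own SMTP server and addresses
    my $smtp_host = 'mail.example.com';
    my $from      = 'alerts@example.com';
    my $to        = 'you@example.com';

    my $smtp = Net::SMTP->new($smtp_host, Timeout => 30)
        or die "Cannot connect to $smtp_host\n";
    $smtp->mail($from);
    $smtp->to($to);
    $smtp->data();
    $smtp->datasend("To: $to\n");
    $smtp->datasend("Subject: AntiHack daily report\n");
    $smtp->datasend("\n");
    $smtp->datasend("No infections found today.\n");
    $smtp->dataend();
    $smtp->quit;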

Log Files

  1. LogfileAntiHack.txt: general log information which will not be emailed. All debug output etc. goes into this file.
  2. filelistAntiHack.txt: all files needing your attention are listed here. If you ever want the script to ignore the files listed in it, copy each complete line into skipfilelistAntiHack.txt.
  3. skipfilelistAntiHack.txt: create this file and list files in it so that the next time the script runs it will simply ignore them. The format is filename:size, as shown below. To be ignored, the size must match the current size of the file on the system, because a changed size could mean the file has been hacked.
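
For example, entries in skipfilelistAntiHack.txt look like this; the file names are only illustrations, what matters is the filename:size format:

    /home/john/public_html/templates/beez/index.php:4312
    /home/john/public_html/administrator/components/com_foo/helper.php:10987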

Output

When it runs, if any infections are found it will email you with a subject and body. All infections found are placed in filelistAntiHack.txt, which you can move into skipfilelistAntiHack.txt to have them ignored from the next run onwards. If it generates many false positives (Confirmed base64, large file size, etc.), just move all those safe lines from filelistAntiHack.txt to skipfilelistAntiHack.txt "as is". Note that if a file listed in skipfilelistAntiHack.txt ever changes size, its entry no longer applies and the file will be reported again.

License

Free to use for anything, but attribution to this page and my name "PP Gupta" is required.

Suggestions

Any suggestions? Email me at guptaprakashprem at_the_rate g-m-a-i-l dot com

How I Recovered After a Pharma Hack on My Site: Steps

Attention: please update the JCE Editor (Joomla); earlier versions contained an open vulnerability. Also change your PHP handler to DSO to prevent such infections.

Recovering my site

Here I want to list the steps I took when recovering my websites from the Pharma hacking attack. In fact, I'm not an expert, nor did I have any earlier experience in handling a Pharma hack attack on websites.

Terms Used

  • SERP: Search Engine Result Pages
  • SE: Search Engines (Google/Bing/Yahoo/Ask, etc.)

The hacking attack happened because the site was running an outdated Joomla version, and I guess it was mod_joomla.php that was placed there by exploiting some vulnerability.

Since then I've done quite a bit of surfing on the net, and I want to narrate all that I know now.

The Pharma attack has matured in the 4 years since it first appeared (today is 15 Oct 2012). The motive isn't destruction but making revenue in a nefarious way. It exploits vulnerabilities especially in Joomla and WordPress sites.

After the exploit, the script first places many other files in unsuspected places through which the hacker can regain access; if you don't remove them all, the hack will strike again. The script also contacts a central server (in my case I found one in the Netherlands) and downloads newer exploits or scripts.

The scripts also modify the sites' .htaccess files.

I found the following infections:

  1. mod_joomla.php
  2. 2012.php
  3. bourne.php
  4. common.php
  5. gymnastics.php
  6. lakers.php
  7. leryn.php
  8. medal.php
  9. rss.php
  10. story.php
  11. tom.php

Shockingly, I found an infection named LICESNE.php dating from 2009 which could never run; I don't know why. I also found another infection hidden in connect_95.zip! It had been uploaded through an upload form on the site.

Recovering your website from a pharma hack is not so simple; it requires careful attention, and it is possible to get infected again.

How it works

The hacking code modifies the .htaccess files. When a page is accessed by typing its URL directly, the genuine unmodified page is shown; but if that URL is reached from the search results of Google/Bing etc., this is recognized (for example via the HTTP_REFERER variable in .htaccess), and instead of index.php some other file takes control and redirects the user to an affiliate pharma or p-o-r-n site.
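
A typical, simplified pattern looks something like the following; the exact conditions and the handler file vary from attack to attack (in my case the handler was common.php):

    RewriteEngine On
    # send only visitors arriving from search engines to the hacker's script
    RewriteCond %{HTTP_REFERER} (google|bing|yahoo)\. [NC]
    RewriteRule ^(.*)$ common.php [L]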

I've observed that search engines are served different content than other visitors. If you change your browser identification to, say, Googlebot and then visit an infected site, you can see those "inserted" affiliate site links. Also, pages full of pharma terminology are served when search engines visit, so in Google you'll find search results with 'V-1-a-g-r-a/C-i-a-1-i-s' terms in the description.

In other cases it modifies the HTML by appending pharma/p-o-r-n site links below the content. This is done to increase the search engine rankings of those affiliate sites. On some sites I've found pharma links inside a div with a hidden attribute, placed just to influence rankings in the eyes of Google/Bing etc.

At the same time it continues to download new infectious code/data from the hacker's server. The data contains a changing list of affiliate sites and newly infected sites whose links will be placed in your pages. New pages are generated from the genuine pages, modified with bad links and also linking to other infected sites. In a similar fashion your site will be backlinked from other infected sites.

Purpose of Hacking

It is done to redirect traffic to pharma/p-o-r-n sites when visitors click on results in Google/Yahoo/Bing. Upon clicking, the search engine sends them to the infected site, and through the compromised .htaccess they are then redirected to some pharma or p-o-r-n site, where the hackers make a little money through affiliate links.

What is the side effect?

Search engines regularly recrawl such infected sites, so they eventually figure out (Google wasn't so quick) that the site has been compromised. They will start showing "site compromised" to visitors. Since search engines see the modified version of the pages, containing pharma content and links, your site's ranking will be geared towards those terms by Google/Bing etc. Your site will start plunging in searches for the actual products/services you are selling.

Once you lose your rankings it'll take months to recover. In a competitive environment you may lose permanently to competitors who were trailing you.

Soon expect an email from Google WMT with a "site compromised" message. You can also see the message in WMT. In my case I think the infection started on the 17th, but only on the 29th did Google show me the message in WMT. By then Google was already showing "Site compromised" in its search results to visitors.

How I came to know about it

By chance, and due to a bug in the hacking code, whenever I clicked through to one of my sites from the Google SERP (i.e., after redirection from Google) the site gave an Internal Server Error. But upon pressing F5 in the browser, the same URL worked. That means that by the 29th, every click from the Google SERP gave this error.

While waiting to complain to my web host about the internal server error, I peeked by chance into the web server access and error logs, and I found a common.php getting called instead of index.php.

It was the same day that Google finally dropped some of my sites' traffic heavily.

How to prevent such an attack from happening?

  1. Keep your site updated with latest releases and patches
  2. Run site monitoring utilities which can inform you of attacks and vulnerabilities ASAP. I've developed two utilities myself, to monitor .htaccess files and to find base64-encoded infections, which email/SMS me warnings immediately; you can use them for free.
  3. Remove all plugins/modules and components you don't need. I've heard that Jumi had some vulnerabilities, so I removed it; I wasn't using it anyway.
  4. Always disallow write permission for others on your site directories/files. Ideally, on Linux, set 755 on all files but 555 on important configuration and .htaccess files (see the sketch after this list).
  5. Regularly change the passwords of all your FTP/site/hosting accounts. Don't save your passwords in your FTP client, and use secure FTP if it is available.
  6. Regularly scan your site with a good antivirus program. When I ran ESET NOD32 over my site files it correctly flagged at least one file injected by the hacker: story.php. The best way is to run at least all your PHP files against 19 antivirus engines.
  7. Regularly check your web server logs and error logs.
  8. Set up a Google Alert for your sites so that if any pharmacy keywords get indexed by Googlebot, you'll be notified. Here is a nice post about it.
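
For point 4, one way to apply those permissions from the shell is shown below; configuration.php is just an example, substitute whichever configuration files your CMS uses:

    # 755 on all files, then lock .htaccess and the main configuration file down to 555
    find /home/john/public_html -type f -exec chmod 755 {} \;
    chmod 555 /home/john/public_html/.htaccess /home/john/public_html/configuration.php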

Useful Steps

  1. Keep plenty of regular backups of your sites. I regularly take complete cPanel backups, and I have a script which runs daily in the early morning when load is low, backs up the databases of all sites, and emails them to different accounts.
  2. Beware of extensions/plugins developed and used at a small scale; they can contain vulnerabilities.
  3. Even big third-party extensions can have vulnerabilities, but in that case the vulnerability will be well publicised, so a hacker will probably use it to gain access.

Recovering from the Pharma Attack - Steps I Took

  1. Take your sites offline. For example, you can move your site directory into a new folder named "j" or "xyz" and work within it.
  2. I analyzed the raw web server access and error logs to understand what was happening.
  3. Using the ps -eafl or top commands (on Linux/Unix/CentOS etc.) I could see plenty of PHP processes (named mod_joomla.php) being spawned, which were already hogging the CPU. I located the file and ran this command from the bash shell:
    % > mod_joomla.php; chmod 555 mod_joomla.php

    Just removing the file didn't work, since it kept coming back. So I truncated it to size 0 as well as removing its write permission - something unthinkable from the hacker's perspective. I did the same for all the other infected files like common.php.

  4. Restore and clean up all the .htaccess files; in them you'll find redirection rules targeting search engine traffic.
  5. Go to Google, search for the undesirable terms your site has been hacked with, and look at the URLs. Request deletion of the cache for pages into which spammy content has been inserted. For newly generated spammy URLs, file page removal requests in WMT.
  6. Scan your sites with one or two good antivirus programs. Learn how to run 19 antivirus engines for free on your site.
  7. Scan your site with my script for identifying malicious base64-encoded code (described above), and remove the infections.
  8. Try to ascertain the date the attack started. Get the backup taken just before it, compare all the files using a good folder comparison tool, and find modified and newly added files. I used Araxis Merge for the comparison.
  9. Check the database for new tables inserted by the infections.
  10. Now log in to Google WMT and check:
    • Health -> Crawl Errors
    • Health -> Malware
    • Optimization -> Content Keywords: see if any unusual words (like v-1-a-g-r-a) are found.
  11. If you find a notice in WMT about the hacking attack saying that you need to submit a reconsideration request, then do so after the cleanup. Without such a notice, don't file a reconsideration request; I think it would just be rejected by a bot.
  12. A common hacking trick is to insert the spammy text only when search engines fetch the web page, so the page appears normal to you while the spam is visible to Google/Bing etc. Install a UserAgent Switcher extension and set your agent to Googlebot; now you'll be able to see the spammy text in some cases (or fetch the page with a spoofed user agent from the command line, as in the sketch after this list). The catch is that sometimes those scripts also check the IP addresses of search engines and only then insert the text. To see the text in that case you must use Google's Fetch as Googlebot tool in WMT, since then the page is accessed from Google's own IP.
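
As a quick alternative to the browser extension in point 12, you can fetch a page with a spoofed user agent using curl (the URL below is a placeholder); this won't help when the script checks search engine IPs:

    curl -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" http://www.example.com/some-page.html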

How to calculate total DNS queries for my domain (answer from a DNS Made Easy expert)

How do you know how many DNS queries your domain will generate?

This answer was sent to me by Lance Vita, a Support and Sales specialist at dnsmadeeasy.com. I've made some improvements to the text. DNS Made Easy is the most popular of all the DNS service providers; it has an Alexa rank of around 25,000, and the others are nowhere near this mark!

Every different hostname within your domain that must be resolved for your pages to be rendered will generate a DNS query (some of these may be cached by the user's local resolving DNS server).

It is actually almost impossible to predict how many queries you will receive without knowing not only which IPs your customers are coming from but also the network configuration of each of your customers. Since this is not possible, you can calculate a rough (and worst-case) number with the following calculation.

  1. Take the average number of seconds a user stays on your website and divide it by the TTL of your record.
  2. Multiply this by the number of lookups required per page view.
  3. Multiply this by the total number of sessions per day.
  4. Multiply this by the number of days in the month. Then add one for each email received and one for each email sent.

Example:

  • The average user stays on your site for 10 minutes.
  • Your record TTL is 5 minutes. So the average user will cause you 2 queries per visit. (10 / 5 = 2)
  • You have 10,000 visits per day (2 * 10,000 = 20,000 queries per day)
  • You have 31 days in the month (20,000 * 31 = 620,000 queries per month)
  • So you would have 620,000 queries in the month from web traffic.
  • Take this number and add the total number of emails received and sent.
  • So if you received 5,000 emails and you sent 5,000 emails, then your total queries would be 630,000.

This is of course a rough estimate, and it is usually a worst-case number, since many users share the same resolving name server and thus effectively share the same query.

_________________

Lance Vita
DNS Made Easy Support and Sales specialist
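
To play with these numbers yourself, here is a small Perl sketch of the same arithmetic (my own addition, using the figures from the example above):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Rough, worst-case monthly DNS query estimate, per the formula above
    sub estimate_monthly_queries {
        my (%p) = @_;
        my $per_visit = ($p{avg_visit_seconds} / $p{ttl_seconds}) * $p{lookups_per_page};
        return $per_visit * $p{visits_per_day} * $p{days_in_month}
             + $p{emails_received} + $p{emails_sent};
    }

    print estimate_monthly_queries(
        avg_visit_seconds => 600,      # 10-minute average visit
        ttl_seconds       => 300,      # 5-minute record TTL
        lookups_per_page  => 1,
        visits_per_day    => 10_000,
        days_in_month     => 31,
        emails_received   => 5_000,
        emails_sent       => 5_000,
    ), "\n";                           # prints 630000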

Cheapest DNS Service Provider

Are you looking for the cheapest DNS service providers? It looks like there is a maze of them.

If you have just outgrown your shared web hosting account and plan to move to VPS hosting, it is better to delegate DNS service to those who are experts in the field. The advantage: your site and email will always be resolved to the right target. Otherwise, for a secondary name server, you'll need to rent another server and run BIND on it.

Free DNS Providers

Here is a list: http://www.lowendtalk.com/wiki/free-dns-providers, but it is not up to date. Mind you, lowendtalk.com is a nice site listing reliable and dirt-cheap web hosting offers.

Amazon Route 53

It is recommended in web hosting forums; I checked it and found it to be economical.

Godaddy Premium DNS

godaddy.com/domains/dns-hosting.aspx?ci=42423

GoDaddy seems to provide the cheapest services around, whether for buying domains or even online FTP storage (although I've found their online FTP storage product to be too much trouble).

Its Premium DNS covers unlimited domains for $3/month!

What's more, it offers 30% discount codes almost daily, and I'm sure nothing comes cheaper than that.

Power DNS Hosting

$2/month per domain; yes, it is one of the cheapest. But I have no experience with it.

DNS Made Easy

To tell the truth, I think DNS Made Easy is the most professional of them all. It has very nice video tutorials about DNS how-tos. Even if you don't buy from them, don't forget to use the free 1-month trial and the nice tutorials!

It costs $2.50/month for a maximum of 10 domains. Disadvantage: yearly pricing only.

Check your site immediately after web host migration before name server propagation

I searched everywhere but could not find a solution. As a webmaster, when migrating my site to another web host, I want to check the new site live as soon as I've set it up!

I don't want to lose any revenue, and name server propagation takes its own time (from a couple of hours up to a maximum of 72 hours).

Here is the ultimate way, which I could not find anywhere else, to check your site immediately after web host migration, before name server propagation.

Just find the IP address of your website on the new web host and enter this line in your hosts file:

new-ip-address   www.your-site.com
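
For example, if the new server's IP were 203.0.113.45 (a documentation placeholder) and your domain example.com, the line would be:

203.0.113.45   www.example.com example.com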

Save it and go! Now, on your computer, the website will be served from the new web host! But make sure that you've already added that domain to the cPanel add-on domains.

Hosts File

The "hosts" file is located on your Windows PC at C:\Windows\System32\drivers\etc . Also it is very important system file, first make a backup of it. You'll need some special permission to edit it. Not expert in this field, I give all permissions on "everyone" in this file. The go to Run, then type Notepad( but I use very popular on Windows - Notepad++, free ware), without left clicking on it, right click over it and click "Run as Administrator". The go to open and open this file. Or having opened Notepad this way, right click over hosts file and select open with Notepad. Remember Notepad should already be running as "Run as administrator".

Logic

It is simple. Windows has a name resolution service (the DNS Client, often just called the resolver) which accepts domain names from clients like FTP programs, browsers, ping etc. and contacts the name servers to convert the domain name into an IP address. Browsers need an IP address to work with; domain names are for humans only. Now, if you enter an IP address in the hosts file like this, the resolver does not need to contact the name servers at all! Name server changes (e.g. to NS1.hostgator.com) take from 2 to 72 hours to propagate worldwide.

So once the browser gets an IP, that's all it needs; for the next couple of minutes it does not even need to resolve the same domain again.

How to find the IP address of the new website?

Oh, I forgot to mention: that IP isn't necessarily the same as your web host's main server IP. To get it, you need to query the new name servers. The list of name servers is shown in your cPanel account, in the left sidebar. Once you have a name server, query it: go to http://www.kloth.net/services/dig.php, fill in your website name in the Domain field and the new name server in the Server field, then click "Look it up!". Take the IP from the answer and that's it. (You can also do the same from the command line, as sketched below.)
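
If you have dig installed locally, the same lookup can be done from the command line; ns1.hostgator.com and example.com below are placeholders for your new name server and your domain:

    dig @ns1.hostgator.com www.example.com A +short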
