Wednesday, 4 November 2009

How to protect your website from malware

What the hell, how is it possible that my website got infected with malware?

I have a theory. A nasty computer program infected the website. The way it was done looks like cross-site scripting to me.

OK, why me? Maybe because the nasty computer program searches for web pages called webform*.htm, for example with this search.

OK, why was it possible? Because I created my own forms? That should not be a problem (in theory).

I used the PHP strip_tags function, so there should have been no problem (in theory). It was clear that that was not enough (in practice), so I also added the PHP escapeshellcmd function. That nasty little computer program must have exploited the hole, because almost all pages were infected at once. It simply could not have been a manual edit.
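The idea behind strip_tags plus escaping is simple: remove markup from form input, then escape whatever is left before it ever reaches an HTML page. This is not my PHP code, just a rough Python sketch of the same two-step sanitization for illustration:

```python
import html
import re

def sanitize(value: str) -> str:
    """Strip HTML tags, then escape the remainder.

    Rough analogue of PHP's strip_tags() followed by htmlspecialchars().
    The tag-stripping regex is crude; it is here to show the idea, not
    to be a complete defence on its own.
    """
    no_tags = re.sub(r'<[^>]*>', '', value)      # drop anything that looks like a tag
    return html.escape(no_tags, quote=True)       # escape &, <, >, and quotes

# An injected script tag is stripped, and leftover specials are escaped:
print(sanitize('<script src="http://evil.example/x.js"></script>Hello & "bye"'))
```

Even with both steps, any form field that later ends up inside an attribute, a URL, or JavaScript needs context-specific escaping as well, which is exactly the lesson I learned the hard way.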

The most important protection against nasty computer programs is probably a Turing test via a CAPTCHA (such as reCAPTCHA). I think that this combination gives the most effective protection, although nowadays I am not so sure about that. Remember that I thought strip_tags was enough ;-).
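At its core, a CAPTCHA is just a challenge the server can verify but a dumb form-posting bot cannot answer. This is not how reCAPTCHA works internally; it is a minimal, hypothetical sketch of a home-made server-side challenge, with a made-up secret, to show the verify-on-submit idea:

```python
import hashlib
import hmac
import random

SECRET = b"change-me"  # hypothetical server-side secret, never sent to the client

def make_challenge():
    """Return a human-readable question plus a token the server can verify later."""
    a, b = random.randint(1, 9), random.randint(1, 9)
    question = f"What is {a} + {b}?"
    # The token commits to the expected answer without revealing it.
    token = hmac.new(SECRET, str(a + b).encode(), hashlib.sha256).hexdigest()
    return question, token

def verify(answer: str, token: str) -> bool:
    """Check the submitted answer against the token from the form."""
    expected = hmac.new(SECRET, answer.strip().encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token)

# The question and token go into the form; a bot that posts the form
# blindly does not know the answer, so verify() rejects its submission.
question, token = make_challenge()
```

Real CAPTCHA services are far harder to defeat than simple arithmetic, which optical character recognition or a lookup table breaks easily; this only shows where the check sits in the form-handling flow.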

Tip: Test your site with the free Acunetix cross-site scripting scanner


How to test if your site contains malware

A few weeks ago, as webmaster of the Helenahoeve site, I discovered a problem: the site was hacked. Each page contained some extra code between the closing head tag and the body tag.

The code was something like:

<script src=...></script>

I did not check it, but I think it is some nasty code. I discovered it because I saw in the status bar that the browser was waiting for another server, one whose name mentioned Mashaei. For what? Who is Mashaei? Well, that is a simple question to answer with a web search, but do not visit his site!

I removed all the rubbish, protected my site a little more, and watched my site more closely. I also suggest that you ask Google for advice.

There are other checkers, but they all state that the site is okay, which I think is not correct.
  1. McAfee's SiteAdvisor
  2. Finjan Vital Security
  3. Norton Safe Web
Luckily, there are other ways to test your site for malware.
  1. If your site contains valid HTML, validate it on a regular basis.
  2. If you know which URLs are valid, check your server logs for strange requests.
  3. Have your site checked by an external scanner.
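Since the injected code always sat in the same place, between the closing head tag and the body tag, you can also grep your own files for it. This is not a tool I used at the time, just a hypothetical Python sketch that flags external script tags in that exact spot:

```python
import re
from pathlib import Path

# Flag an external <script src=...> wedged between </head> and <body>,
# the spot where the injected code on my pages showed up.
INJECT_RE = re.compile(
    r'</head>\s*(<script\b[^>]*\bsrc\s*=[^>]*>\s*</script>)\s*<body',
    re.IGNORECASE | re.DOTALL,
)

def find_injected(html_text: str):
    """Return every suspicious script tag found in one page's HTML."""
    return [m.group(1) for m in INJECT_RE.finditer(html_text)]

def scan_site(root: str):
    """Walk a local copy of the site and map file path -> suspicious tags."""
    hits = {}
    for page in Path(root).rglob('*.htm*'):
        found = find_injected(page.read_text(errors='ignore'))
        if found:
            hits[str(page)] = found
    return hits
```

A pattern like this only catches the one injection style I saw; attackers also hide code at the end of the body, inside existing scripts, or in .htaccess, so treat any such scan as one check among several, not proof of a clean site.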