PHP just grows & grows

Netcraft began its Web Server Survey in 1995 and has tracked the deployment of a wide range of scripting technologies across the web since 2001. One such technology is PHP, which Netcraft presently finds on well over 200 million websites.

PHP Trend

The first version of PHP was named Personal Home Page Tools (PHP Tools) when it was released by Rasmus Lerdorf in 1995. PHP 1 can still be downloaded today: weighing in at only 26 kilobytes, php-108.tar.gz is diminutive by today's standards, yet it was capable of allowing users to implement guestbooks and other form-processing applications.

PHP 2 introduced built-in support for accessing databases, cookie handling, and user-defined functions. It was released in 1997, and by the following year, around 1% of sites on the internet were using PHP.

However, PHP 3 was the first release to closely resemble today's incarnation of PHP. A rewrite of the underlying parser by Andi Gutmans and Zeev Suraski led to what was arguably a different language; accordingly, it was renamed to simply PHP, which was a recursive acronym for "PHP: Hypertext Preprocessor". This was released in 1998 and the ease of extending the language played a large part in its tremendous success, as this aspect attracted dozens of developers to submit a variety of modules.

Andi Gutmans and Zeev Suraski continued to rewrite PHP's core, primarily to improve performance and increase the modularity of the codebase. This led to the creation of the Zend Engine, which was used by PHP 4 when it was released in 2000. As well as offering better performance, PHP 4 could be used with more web servers, supported HTTP sessions, output buffering and several new language constructs.

By September 2001, Netcraft's Web Server Survey found 1.8M sites running PHP.

PHP 5 was released in 2004, and remains the most recent major version release today (5.4.11 was released on 17 January 2013). Zend Engine 2.0 forms the core of this release.

By January 2013, PHP was being used by a remarkable 244M sites, meaning that 39% of sites in Netcraft's Web Server Survey were running PHP. Of the sites that run PHP, 78% are served from Linux computers, followed by 8% on FreeBSD. Precompiled Windows binaries are also available, which has helped Windows account for over 7% of PHP sites.

Popular web applications that use PHP include content management systems such as WordPress, Joomla and Drupal, along with several popular ecommerce solutions like Zencart, osCommerce and Magento. In January 2013, these six applications alone were found running on a total of 32M sites worldwide.

PHP also demonstrates a strong installation base across web-facing computers that are found as part of Netcraft's Computer Counting survey. Just as an individual IP address is capable of hosting many websites, an individual computer can also be configured to have multiple IP addresses. This survey allows us to identify unique web-facing computers and which operating systems they use regardless of how many sites or IP addresses they have. As of January 2013, 2.1M out of 4.3M web-facing computers are running PHP.

PHP has also become a victim of its own success in some respects: with so many servers running PHP, and with so many different web applications authored in PHP, hackers are presented with a huge and rather attractive attack surface. Because it is so easy to get started with programming in PHP, it attracts developers of all levels, many of whom may produce insecure applications through lack of experience and attention to detail. Netcraft's anti-phishing services find wave upon wave of phishing attacks hosted on compromised PHP applications, and the U.S. National Vulnerability Database (NVD) contains several thousand unique vulnerabilities that relate either to PHP itself, or to applications written in PHP.


The full list of hostnames from the Netcraft Web Server Survey forms the basis of our technology tracking. We make requests to each of these sites, or if there is a large number of sites hosted on a single IP address, we employ a proportional sampling technique. The content of each page and its HTTP headers are analysed to determine which technologies are being used. For PHP, we look for references to .php filename extensions or the existence of HTTP response headers like "X-Powered-By: PHP". Additional signature tests are used to identify particular PHP applications, such as WordPress.
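The signature tests described above can be sketched in a few lines. This is an illustrative reconstruction, not Netcraft's actual detection code: the `.php` extension check and the `X-Powered-By` header are taken from the text, while the `PHPSESSID` cookie check is an additional common PHP indicator added here as an assumption.

```python
def looks_like_php(url: str, headers: dict) -> bool:
    """Return True if the URL or HTTP response headers suggest the site runs PHP."""
    # .php filename extension in the request path (query string stripped first)
    path = url.split("?", 1)[0]
    if path.lower().endswith((".php", ".php3", ".php4", ".php5")):
        return True
    # Header-based signatures; header names are compared case-insensitively
    lower = {k.lower(): v for k, v in headers.items()}
    if "php" in lower.get("x-powered-by", "").lower():
        return True
    if "PHPSESSID" in lower.get("set-cookie", ""):
        return True
    return False

print(looks_like_php("http://example.com/index.php", {}))                     # True
print(looks_like_php("http://example.com/", {"X-Powered-By": "PHP/5.4.11"}))  # True
print(looks_like_php("http://example.com/", {"Server": "nginx"}))             # False
```

In practice such tests are layered: a site matching any one signature is counted, and further application-specific signatures (for WordPress and the like) run on the page content itself.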

Each metric is then calculated as follows:


Hostnames

For each IP address, we estimate the total number of PHP sites it serves by calculating the product of the proportion of sampled hostnames that are running PHP and the total number of hostnames on that IP address. In cases where the IP address is serving 100 or fewer sites, all sites will be sampled and thus be representative of the entire population for that IP address.
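The per-IP estimate above is a straightforward scaling calculation. A minimal sketch, with illustrative figures (the function name and numbers are assumptions, not Netcraft's):

```python
def estimate_php_sites(sampled: int, sampled_php: int, total_hostnames: int) -> int:
    """Scale the sampled PHP proportion up to the IP address's full hostname count."""
    if sampled == 0:
        return 0
    return round(sampled_php / sampled * total_hostnames)

# An IP address with 5,000 hostnames, of which 500 were sampled and 320 ran PHP:
print(estimate_php_sites(500, 320, 5000))  # 3200

# With 100 or fewer sites every hostname is sampled, so the estimate is exact:
print(estimate_php_sites(80, 33, 80))      # 33
```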

Active sites

To provide a more meaningful metric which counts the number of human-generated sites actively using PHP, our active site count excludes spam sites or other computer-generated content. This methodology is described in more detail here.

IP addresses

This metric counts the number of unique IP addresses where at least one hostname in its sample set was found to be running PHP.


Computers

A single physical or virtual computer may have more than one IP address. We are able to identify unique computers that are exposed to the internet via multiple IP addresses. If an IP address is running PHP, then the computer associated with it is marked as running PHP. Further details of this methodology are explained in our Hosting Provider Server Count.
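The computer-level rollup described above reduces to marking a computer as running PHP if any of its IP addresses does. A minimal sketch, which assumes the IP-to-computer mapping (derived separately by Netcraft) is already available; all names and addresses are illustrative:

```python
def computers_running_php(ip_to_computer: dict, php_ips: set) -> set:
    """Return the set of computer IDs with at least one PHP-serving IP address."""
    return {computer for ip, computer in ip_to_computer.items() if ip in php_ips}

ip_to_computer = {
    "192.0.2.1": "host-a",   # host-a is reachable on two IP addresses
    "192.0.2.2": "host-a",
    "192.0.2.3": "host-b",
}
php_ips = {"192.0.2.2"}      # only one of host-a's addresses serves PHP

print(sorted(computers_running_php(ip_to_computer, php_ips)))  # ['host-a']
```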

Netcraft removes phishing attacks in less than half the industry average time

Netcraft’s phishing site countermeasures service helps organisations targeted by phishing attacks remove the fraudsters’ forms as quickly as possible.

Recently we became aware that our median times for takedowns are very much better than the industry average calculated by the Anti-Phishing Working Group (APWG) in its most recent Global Phishing Survey. The APWG found that phishing attacks have a median lifetime of 5 hours and 45 minutes. In contrast, banks and other companies using our countermeasures service have experienced a median phishing attack availability of 2 hours and 12 minutes calculated over our most recent 100 takedowns, with the attacks removed in just 38% of the industry average time.
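The 38% figure follows from simple arithmetic on the two medians, expressed here in minutes:

```python
netcraft_median = 2 * 60 + 12    # 2h 12m -> 132 minutes
industry_median = 5 * 60 + 45    # 5h 45m -> 345 minutes

ratio = netcraft_median / industry_median
print(f"{ratio:.0%}")  # 38%
```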

The graph below shows the availability times of our most recent 100 phishing attacks.

Last 100 Takedown Times

The difference between the first and final outages reflects the fact that phishing attacks will sometimes fluctuate up & down on compromised hosts, where the fraudster may still have access to the system and be able to replace his content after the site owner removes it. In this scenario it is important to continue monitoring sites for some time after they go offline and to restart takedowns if & when the phishing content reappears. For example, 87% of phishing attacks we attended to had their first outage within 24 hours, and 90% had their final outage within 48 hours.

Takedown times do vary significantly from country to country. For example, all of our last 100 takedowns in the US were completed within three days, and 90% had their first outage within 12 hours. In contrast, takedown times in Russia are rather longer, although 90% went down within three days and 70% had their first outage within twelve hours.

Russia and the US are by no means the long and short of phishing attacks. Phishing attacks we dealt with in the UK & Ireland have a shorter median lifetime than those hosted in the US, whilst phishing attacks we have taken down in Iran have a median lifetime of just under 30 hours, around five times longer than Russia.

In addition to providing fast takedown of the fraudulent content, the countermeasures service is also linked to our phishing site feed, which is licensed by all of the main web browsers, together with many of the largest anti-virus and content filtering products, firewall and network appliance vendors, mail providers, registrars, hosting companies and ISPs. Consequently, once a phishing attack is verified, access to it is blocked for hundreds of millions of people shortly afterwards, significantly reducing the effectiveness of the attack even before it has been removed.

More information regarding our countermeasures service can be found here.

January 2013 Web Server Survey

In the January 2013 survey we received responses from 629,939,191 sites.

Apache continued the decline in market share that began in mid-2012, and now has 100 million fewer hostnames than in June 2012, though it still retains a clear majority with 55.26% of the market. Both within the million busiest sites and on the internet as a whole, nginx has continued its ascent, increasing its market share to 12.77% and 12.64% respectively. Where the version is known, the most widely deployed version of nginx is the current stable branch (1.2.x), but the bulk of Apache users are still on the 2.2.x branch of Apache httpd, despite the new features available in the 2.4.x branch, which has been available since February 2012.

Amazon now hosts 9.3 million hostnames on its cloud computing platforms, gaining more than one million sites this month and more than doubling within the past year. The most used web server at Amazon is nginx, which serves more than 44% of all hostnames there, many on behalf of Heroku, a Platform as a Service (PaaS) provider.

Notwithstanding Amazon's fast growth, Go Daddy hosts 36 million sites, nearly 6% of the world's websites, making it the largest hosting company in terms of hostnames. The number of sites hosted does not, however, necessarily scale with the number of computers (physical or virtual) used to serve the corresponding content: shared hosting providers can often host several hundred or even several thousand sites on a single machine, whereas VPS and dedicated hosting providers may serve only a few. Netcraft found 23k web-facing computers at Go Daddy, while Amazon, the largest hosting company in terms of web-facing computers since September 2012, has 139k this month; Go Daddy therefore hosts, on average, more than 23 times as many sites per web-facing computer as Amazon. Although Go Daddy is the largest hosting company by hostname, the distribution of its sites is skewed towards the less busy: it hosts 2.6% of the million busiest sites, and only a single site in the top 1,000. Amazon, on the other hand, hosts a similar number in the million busiest, and 5.1% of the top 1,000 sites.

Almost two-thirds of the web-facing computers at Go Daddy run Microsoft Windows, the vast majority on Windows Server 2008. With such a high proportion of Windows-powered websites, Go Daddy unsurprisingly hosts the largest number of sites powered by ASP.NET: more than 24 million sites hosted by Go Daddy were actively using ASP.NET, whereas relatively few (2.4 million) were using the otherwise popular PHP scripting language.


Most Reliable Hosting Company Sites in December 2012

Rank  Company site        OS                   Outage   Failed %  DNS    Connect  First byte  Total
1     ServerStack         Linux                0:00:00  0.000     0.039  0.027    0.053       0.054
2     Swishmail           FreeBSD              0:00:00  0.003     0.037  0.025    0.051       0.105
3     New York Internet   FreeBSD              0:00:00  0.006     0.078  0.025    0.677       0.774
4     Server Intellect    Windows Server 2008  0:00:00  0.006     0.035  0.066    0.132       0.328
5     Datapipe            FreeBSD              0:00:00  0.009     0.102  0.015    0.032       0.049
6     Pair Networks       FreeBSD              0:00:00  0.009     0.092  0.041    0.087       0.294
7     Virtual Internet    Linux                0:00:00  0.009     0.072  0.061    0.182       0.321
8                         Linux                0:00:00  0.009     0.121  0.066    0.134       0.220
9                         Linux                0:00:00  0.009     0.183  0.086    0.370       0.747
10    Reliable Servers    Linux                0:00:00  0.016     0.206  0.027    0.059       0.065

See full table

ServerStack had the most reliable hosting company site during December, responding to every request from our monitoring system. We have only been monitoring ServerStack for three months, but it has quickly established itself as one of the hosting company sites with the fewest failed requests over that period, despite being located in the area affected by Hurricane Sandy.

Swishmail (second), New York Internet (third), Datapipe (fifth) and Reliable Servers (tenth) are also hosted within the area where Hurricane Sandy made landfall. The presence of five such affected companies in the top ten reinforces Datapipe founder Robb Allen's assertion that the US North East's recent history, including grid blackouts, Hurricane Irene and the 9/11 attacks, has helped improve the resilience of the internet connectivity and hosting industry in that area.

December saw New York Internet (third) named NJBIZ's "Emerging Business of the Year" for 2012. NJBIZ profiled New York Internet's New Jersey datacentre in 2011, and praised the company's renovation and retrofitting of an older property in order to accommodate modern technology.

December's top ten is dominated by FreeBSD and Linux, the exception being Windows specialist Server Intellect (fourth), which runs the only Windows site in the list. Server Intellect, which now offers Windows Server 2012 as standard on all dedicated and cloud servers, was second last month and regularly features among the ten most reliable hosting company sites.

During December we added a new performance measurement point hosted at Webair's datacentre in Amsterdam, bringing the total number of measurement points to 11.

Netcraft measures and makes available the response times of around forty leading hosting providers' sites. The performance measurements are made at fifteen minute intervals from separate points around the internet, and averages are calculated over the immediately preceding 24 hour period.

From a customer's point of view, the percentage of failed requests is more pertinent than the length of outages on hosting companies' own sites, as it gives a pointer to the reliability of routing; this is why we rank our table by fewest failed requests rather than by shortest outage. When the numbers of failed requests are equal, sites are ranked by average connection time.

Information on the measurement process and current measurements is available.

World map of phishing attacks

Netcraft's new phishing attack map provides a real-time visualisation of the phishiest countries in the world. Measurements are determined by using IP address delegation information to attribute current phishing sites in our Phishing Site Feed to countries. We then use the number of active sites found by our Web Server Survey to calculate and display the ratio of phishing attacks to web sites in each country.
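The per-country metric is the ratio of currently blocked phishing sites attributed to a country to the active sites hosted there. A minimal sketch (function name assumed for illustration), using the Falkland Islands and Morocco figures quoted in the paragraphs below:

```python
def phishiness(phishing_sites: int, active_sites: int) -> float:
    """Ratio of blocked phishing attacks to active web sites in a country."""
    if active_sites == 0:
        return 0.0
    return phishing_sites / active_sites

print(f"{phishiness(1, 38):.1%}")       # Falkland Islands: 2.6%
print(f"{phishiness(200, 11000):.1%}")  # Morocco: 1.8%
```

Note how a single blocked site among a tiny hosting population produces a higher ratio than hundreds of blocked sites in a larger one, which is the volatility discussed below.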

A few themes become immediately apparent when studying the map. Countries with poor internet access may host very few phishing attacks, or even none at all, and therefore may appear very safe; however, countries with an extremely small number of websites can prove very volatile: for example, the Falkland Islands appear incredibly phishy by virtue of the fact that, of only 38 active sites hosted in the country, one is currently blocked for phishing.

Countries which respond slowly to taking down phishing sites are more likely to have a higher proportion of their sites engaged in phishing at any one time. As the map displays only currently blocked phishing attacks, this characteristic is highlighted particularly well in Morocco, which is the second phishiest country with nearly 200 of its 11,000 sites blocked.

Fraudsters commonly host their phishing sites on compromised servers, as this does not require a purchasing transaction, making it more difficult to correctly identify the perpetrators. Shared hosting services tend to be the least secure, so countries with a large number of sites running on shared hosts are likely to attract the attention of fraudsters.

Countries which host a large number of vulnerable and commonly targeted web applications consequently host a large number of phishing attacks, regardless of their responsiveness to takedown requests. This perhaps explains why the US appears phishier than either Russia or China, and why some US hosting companies host more phishing attacks than entire European countries: they provide proportionately more WordPress and hosting-control-panel administered sites, plus shared IP hosting configurations that allow customer content to be accessed from any domain that resolves to the same IP address. Our datasets show that these are the most favoured platforms for hosting fraudulent content on compromised servers.


Please contact us for pricing or further details about any of our anti-phishing services.

December 2012 Web Server Survey

In the December 2012 survey we received responses from 633,706,564 sites - an increase of over 8 million since November.

Microsoft IIS experienced the largest gain this month: the movement of an advertising network of 4.7M hostnames from Apache to IIS 7.5 contributed to an overall increase of 8.2M, its largest in over a year. As a result of the switch, Apache saw an equivalent loss, reducing its market share by 1.53 percentage points. Despite Apache's continuing downward trend over the last few months, it still holds more than half of the market (55.70%). nginx also saw strong growth this month, with a gain of 2M hostnames resulting in another increase in its market share.

nginx also further increased its market share within the million busiest sites, which now stands at 12.44%, as did Microsoft, which remains slightly ahead with a 13.22% share. While across the survey as a whole IIS/6.0 remains the most popular version of Microsoft's web server software, with a 41 percentage point lead over other versions, within the million busiest sites IIS/7.5 looks set to overtake it soon: IIS/7.5 now serves 40% of the IIS websites in the top million, just 4.8k hostnames (4 percentage points) behind IIS/6.0.

Linux Rootkit Found Infecting Webservers with iFrame Injection

A new rootkit has been discovered infecting web servers running 64-bit GNU/Linux and attacking web surfers with drive-by downloads. The malware works by injecting an iframe directly into the outgoing TCP packets of the infected machine, allowing it to taint all web traffic served from that machine. It was first discovered on a server running nginx; however, it does not appear to target nginx specifically.

ICANN Early Warnings Filed

More than half of the sites found by Netcraft's survey use the .com top-level domain, but ICANN is in the process of creating additional TLDs. On 20 November 2012, the Governmental Advisory Committee of ICANN filed 242 Early Warnings on individual applications for new top-level domains. These warnings are notices rather than formal objections, and do not directly lead to a process that can result in an application being rejected; however, they are indicative of likely formal objections later on in the application process. Most of the warnings that have been issued consist of "requests for information, or requests for clarity on certain aspects of an application".

Prominent among the list of Early Warnings is Amazon EU, which applied for .app, .book, .cloud, .game, .mail, .map, .mobile, .movie, .music, .news, .search, .shop, .show, .song, .store, .tunes and .video, plus several other Unicode TLDs in other scripts and languages. Many of these TLDs have been described as generic terms relating to broad market sectors, which could have a negative impact on competition were Amazon to exclude other entities from using them.

India, Australia and the United States have each objected to .airforce, .army and .navy being applied for by United TLD Holdco Ltd. The United States simply claims that these strings are confusingly similar to the names of specific government agencies, while both India and Australia note that words associated with the armed forces are protected in national legislation, and the applied-for TLDs could mislead users into thinking that a registrant is associated with these national armed forces.

India goes further to state that these applications have the potential to cause irreparable harm to the security and stability of the nation and suggests that the applicant should withdraw their application. The final rationale behind India's warning makes its position clear: "Allowing sovereign functions in the exclusive hands of foreign corporations whose motivations are unknown, and whose jurisdictions are not accessible for national government should NOT be allowed to happen by ICANN."

Applicants who wish to continue with their applications are advised by the Early Warning document to notify the Governmental Advisory Committee of their intended actions and when these actions will be completed. However, ICANN will still continue to process applications which do not receive a response. Conversely, if an applicant decides to withdraw their application, the applicant can receive a refund of up to 80% of the evaluation fee ($148,000).
