Most Reliable Hosting Company Sites in February 2014

Rank  Company site               OS                   Outage (hh:mm:ss)  Failed Req%  DNS    Connect  First byte  Total
1     Qube Managed Services      Linux                0:00:00            0.000        0.100  0.039    0.081       0.081
2     ServerStack                Linux                0:00:00            0.008        0.087  0.076    0.150       0.150
3     Hosting 4 Less             Linux                0:00:00            0.017        0.174  0.125    0.248       0.634
4     Datapipe                   FreeBSD              0:00:00            0.021        0.077  0.018    0.037       0.055
5     XILO Communications Ltd.   Linux                0:00:00            0.021        0.199  0.069    0.166       0.261
6     www.dinahosting.com        Linux                0:00:00            0.021        0.233  0.087    0.175       0.175
7     Server Intellect           Windows Server 2012  0:00:00            0.021        0.075  0.101    0.638       0.998
8     Pair Networks              FreeBSD              0:00:00            0.025        0.226  0.085    0.170       0.562
9     iWeb                       Linux                0:00:00            0.033        0.155  0.090    0.177       0.177
10    Anexia                     Linux                0:00:00            0.050        0.131  0.103    0.453       0.746

(DNS, Connect, First byte and Total are response times in seconds.)


London-based Qube Managed Services had February's most reliable hosting company site, www.qubenet.co.uk, which successfully responded to all requests sent. This is the second time in six months Qube has had no failed requests, having also achieved it back in September. Qube's reliability is perhaps due to the routing infrastructure it has in place at its data centres in London, New York and Zurich. Qube's carriers include Level 3 Communications and Zayo (formerly AboveNet), both of which are known for their extensive network coverage across Europe and America.

In second place is ServerStack with two failed requests. ServerStack has maintained a 100% uptime record over the past year and offers a 100% uptime service-level agreement from its data centres in Amsterdam, New Jersey and San Jose. ServerStack uses the nginx web server to serve its own website as well as some of the world's busiest websites, including a site which serves 150 million pageviews per day.

In third place with four failed requests is Hosting 4 Less. Hosting 4 Less has a 99.9% uptime guarantee and has been providing web hosting services for over 15 years. It owns and operates a Californian data centre facility which is privately peered via multiple gigabit connections to the Internet backbone.

FreeBSD powered the sites of both Datapipe (which had the lowest connection time within the top 10) and Pair Networks. Windows Server 2012 powered Server Intellect, and the remaining seven sites ran Linux, including that of first-placed Qube.

Netcraft measures and makes available the response times of around forty leading hosting providers' sites. The performance measurements are made at fifteen minute intervals from separate points around the internet, and averages are calculated over the immediately preceding 24 hour period.

From a customer's point of view, the percentage of failed requests is more pertinent than the length of outages on hosting companies' own sites, as it gives a better indication of the reliability of routing; this is why we rank the table by fewest failed requests rather than by shortest outage. Where the numbers of failed requests are equal, sites are ranked by average connection time.
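As a rough illustration of that ranking rule, the Python sketch below orders a handful of the sites from the table above by failed-request percentage, breaking ties on average connection time. The field names are illustrative, not Netcraft's internal schema.

    # Figures taken from the table above; field names are purely illustrative.
    sites = [
        {"name": "Qube Managed Services",     "failed_pct": 0.000, "connect": 0.039},
        {"name": "ServerStack",               "failed_pct": 0.008, "connect": 0.076},
        {"name": "Hosting 4 Less",            "failed_pct": 0.017, "connect": 0.125},
        {"name": "Datapipe",                  "failed_pct": 0.021, "connect": 0.018},
        {"name": "XILO Communications Ltd.",  "failed_pct": 0.021, "connect": 0.069},
    ]

    # Rank by fewest failed requests first, then by shortest average connection time.
    ranked = sorted(sites, key=lambda s: (s["failed_pct"], s["connect"]))
    for rank, site in enumerate(ranked, start=1):
        print(rank, site["name"])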

Information on the measurement process and current measurements is available.

March 2014 Web Server Survey

In the March 2014 survey we received responses from 919,533,715 sites — around half a million fewer than last month.

Apache has gained some breathing space this month. Nearly a year of strong market share growth by Microsoft had reduced the gap between Apache and Microsoft to only 5.4 percentage points, and after the gap reached its lowest point last month, Microsoft looked set to usurp Apache as the most common web server within a matter of months. However, this month's survey saw Microsoft lose 15.8 million sites while Apache gained 3.2 million. Apache's market share increased to 38.6%, putting it 7.5 percentage points ahead of Microsoft and bucking the recent trend.

Most of Microsoft's losses this month were seen at Nobis Technology Group, where more than 30 million link-farming sites stopped operating. Nobis is a private holding company that owns the DarkStar voice communications network and Ubiquity Hosting Solutions; the latter has seven data centres across the United States.

nginx gained 5 million sites this month, increasing its market share to 15.6%. The latest mainline version of nginx (1.5.10) now supports SPDY 3.1, which extends the flow control features of SPDY 3.0 by allowing different sessions within a single connection to send data at different rates. It is no surprise that SPDY 3.1 is already supported in the Google Chrome web browser; SPDY was primarily developed by Google, and is one of their trademarks. SPDY 3.1 has also been supported in Mozilla Firefox since 4th February 2014.
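As a rough sketch of how a client discovers SPDY support, the snippet below uses Python 3.4's ssl module (assuming an NPN-capable OpenSSL build) to see which application protocol a server negotiates via TLS Next Protocol Negotiation. The hostname is just an example target.

    import socket
    import ssl

    def negotiated_protocol(host, port=443):
        # Offer SPDY 3.1 first, falling back to SPDY 3 and then plain HTTP/1.1.
        ctx = ssl.create_default_context()
        ctx.set_npn_protocols(["spdy/3.1", "spdy/3", "http/1.1"])
        with ctx.wrap_socket(socket.create_connection((host, port)),
                             server_hostname=host) as conn:
            return conn.selected_npn_protocol()

    # A SPDY 3.1-enabled site should return "spdy/3.1"; others return "http/1.1".
    print(negotiated_protocol("www.cloudflare.com"))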

Content delivery network CloudFlare — which uses its own web server software based on nginx — rolled out SPDY 3.1 support for all of its customers in February. Since last month, Netcraft's SSL Survey has identified a four-fold increase in the number of HTTPS websites supporting SPDY 3.1, most of which are hosted by CloudFlare. A smaller number of these SPDY 3.1 sites are hosted by the owner of the WordPress.com blogging platform, Automattic, which was one of the sponsors of the ngx_http_spdy_module.

Mozilla has been planning to remove SPDY 2 support from Firefox since September 2013, and this looks set to happen with the release of Firefox 28. Some developers asked for SPDY 2 support to be retained, arguing that dropping it would effectively drop SPDY support for many SPDY-enabled websites. However, with nginx and CloudFlare now supporting SPDY 3.1, some of that concern is allayed.

LibreOffice, the free open source office suite bundled with Ubuntu Linux, moved its website from an Apache web server to nginx at the end of January, apparently for performance reasons. Incidentally, this further distances LibreOffice from the Apache Software Foundation: LibreOffice was forked from OpenOffice.org in 2010, before the latter was given to the ASF, where development continued under the name Apache OpenOffice.

More than 30 new generic top-level domains (gTLDs) were delegated to the Root Zone during February, making them officially part of the internet. There are now 471 top level domains in total. The new ones added in February included .flights, .wiki, .xyz, .fish and .移动 (xn--6frz82g – Chinese for "mobile").
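For the curious, the punycode form and the Unicode form of such a label are mechanically related; a one-line illustration using Python's built-in IDNA codec:

    # Decode the ASCII-compatible (punycode) label back to its Unicode form.
    print(b"xn--6frz82g".decode("idna"))  # prints the Chinese label meaning "mobile"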

Many of these new gTLDs were applied for by Donuts Inc, a US domain registry which was founded in 2011. The company's CEO and co-founder, Paul Stahura, previously founded domain name registrar eNom in 1997. Donuts raised more than $100,000,000 in its Series A financing round and applied to ICANN for more than 300 TLDs in 2012. As a registry, Donuts does not sell domain names directly to the public; instead, customers must purchase them from one of its accredited registrars.





Developer  February 2014  Percent  March 2014   Percent  Change
Apache     351,700,572    38.22%   354,956,660  38.60%    0.38
Microsoft  301,781,997    32.80%   286,014,566  31.10%   -1.69
nginx      138,056,444    15.00%   143,095,181  15.56%    0.56
Google      21,129,509     2.30%    20,960,422   2.28%   -0.02

Microsoft neck and neck with Amazon in Windows hosting

Microsoft has edged ahead of Amazon to become the largest hosting company as measured by the number of web-facing Windows computers. The pair have been neck and neck for almost nine months: Microsoft now has 23,400 web-facing Windows computers against Amazon's 22,600. Excluding companies with a large connectivity element to their businesses, such as China Telecom, Comcast, Time Warner, and Verizon, Amazon and Microsoft are the largest Windows hosting companies in the world, though the market is still fragmented, with each having just over 1% of it.

Microsoft's growth is predominantly a result of the growth of Windows Azure: Azure now accounts for close to 90% of all web-facing computers at Microsoft. Windows Azure has grown by almost 50% since May 2013; in the February 2014 Web Server Survey, Netcraft found 27,000 web-facing computers (both Windows and Linux) using the cloud computing platform. Many of Microsoft's own services are powered by Windows Azure, including Office 365, Xbox Live, Skype, and OneDrive.

The Windows Azure Web Sites service, available to the general public since June 2013, may be the driving force behind Azure's growth. This Platform as a Service allows existing applications written in ASP, ASP.NET, PHP, Node.js, or Python to be deployed on an automatically scaling platform without managing individual computers. Microsoft also provides pre-configured software packages, such as WordPress, which can be used immediately with the Web Sites service.

With over 1% of all web-facing Windows computers in the world hosted at Azure, Microsoft is now competing directly with the Windows hosting providers it still partners with, and which four years ago would have been its sole source of revenue in the hosting market.

Azure Regions

Azure's data centres are split into regions and geos: there are several regions within each larger geo (formerly major regions).

Geo            Regions
United States  US West (California), US East (Virginia), US North Central (Illinois), US South Central (Texas)
Europe         Europe West (Netherlands), Europe North (Ireland)
Asia Pacific   Asia Pacific East (Hong Kong), Asia Pacific South-East (Singapore)
Japan          Japan East (Saitama Prefecture), Japan West (Osaka Prefecture)

The two new Japanese Azure regions were made available to the general public on 25th February 2014, less than a year after they were first announced. Whilst all other Azure regions share the same starting price for virtual machines (from 2¢ per hour), the two new Japanese regions are more expensive: virtual machines start at 2.7¢ (Japan East) and 2.4¢ (Japan West) per hour. Neither Japanese region was detected in the February 2014 web server survey, which ran in mid-January.

More than half of all web-facing Azure computers are hosted within the United States. US East is the most populated US region, closely followed by US West. However, Europe West is the most populated Azure region in the world, accounting for 20% of all web-facing Azure computers. In total, 52% of Azure's web-facing computers are in the United States, 36% are in Europe, and only 12% are in Asia Pacific.

Being able to use Windows Azure in China could offer new opportunities to non-Chinese companies who wish to increase their internet presence in China, although Netcraft has previously noted a number of issues which could hold back the growth of cloud computing in China.

For additional performance when serving content to users around the globe, the Windows Azure Content Delivery Network (CDN) can be used. This allows end users to download content from one of more than 20 different CDN node locations, which is likely to be quicker than downloading the non-cached content directly.

Whilst Azure operates across the globe, certain features, such as redundancy, can only operate within the same geo. Furthermore, some Azure services are not available in all regions: for example, Azure Web Sites cannot be deployed in US South Central or Asia Pacific South-East, and the Windows Azure Scheduler is only available in one region per geo.

Operating systems

Windows Azure virtual machines exhibit the TCP/IP characteristics of the operating systems installed on them, and thus it is possible to remotely determine which operating systems are being used by Azure customers.
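The sketch below is a deliberately crude illustration of the idea, not Netcraft's fingerprinting: it guesses an operating system family purely from the TTL seen in a ping reply (Windows stacks typically start at 128, Linux at 64), and it assumes a Linux-style ping whose output contains "ttl=".

    import re
    import subprocess

    def guess_os(host):
        # Send a single ICMP echo request and read the TTL from the reply.
        out = subprocess.check_output(["ping", "-c", "1", host]).decode()
        match = re.search(r"ttl=(\d+)", out, re.IGNORECASE)
        if not match:
            return "unknown"
        ttl = int(match.group(1))
        return "probably Windows (TTL near 128)" if ttl > 64 else "probably Linux/Unix (TTL near 64)"

    print(guess_os("example.com"))  # hypothetical target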

Windows Server 2008 is the most popular operating system installed on Azure instances, although this is not necessarily a choice that is down to the customer — for example, when using the Blob storage service to expose files over HTTP/HTTPS, the user cannot choose which operating system to use.

Windows Server is used by 90% of all web-facing computers at Azure, including three computers which still appear to be running Windows Server 2003. The remaining 10% use Linux, with Ubuntu being the most commonly identified distribution.

Unsurprisingly, Microsoft IIS and Microsoft HTTPAPI are the most common web servers on the Windows Server computers at Azure; however, a few hundred websites use Apache on Windows. As expected, Apache is the most common web server for websites served from Linux machines at Azure (62%) followed by nginx (33%).

Preview services

Several Azure services are currently offered only as preview services, which means they are made available only for evaluation purposes. Some of these preview services have had well-established Amazon equivalents for several years. For example, the Windows Azure Scheduler preview service offers similar functionality to Amazon's Simple Workflow Service (SWF), which has been available for 2 years.

Microsoft's preview services also include the Azure Import/Export Service, which allows users to transfer large amounts of data into Windows Azure Blob storage. Customers can send an encrypted hard disk to Microsoft and the data on the hard disk will be uploaded directly into the Blob storage account. Microsoft currently only accepts hard disk deliveries from the United States (although the service can be used to send data to and from European and Asian cloud regions). Amazon's own Import/Export service has been available since 2010.

Blob Storage

Windows Azure Blob (Binary Large Object) Storage is Microsoft's answer to Amazon's Simple Storage Service (S3). Both allow large files such as video, audio and images to be stored, although, while Amazon imposes no storage limits, individual blobs on Azure have a storage limit of 200TB. Blobs can be mounted as drives and accessed from a web application as if they were ordinary NTFS volumes. If this is the only way a Blob is used, then the frontend computer responsible for that Blob will not be directly measurable over the internet: Netcraft measures only publicly visible computers which have corresponding DNS entries and which respond to HTTP requests.
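For blobs that are exposed over HTTP, the REST interface is straightforward. The hedged sketch below uploads a file as a block blob, assuming a Shared Access Signature (SAS) URL has already been generated for it; the account, container, blob name and token shown are placeholders, not real values.

    import urllib.request

    # Placeholder SAS URL: account, container, blob name and token are hypothetical.
    sas_url = ("https://myaccount.blob.core.windows.net/"
               "mycontainer/report.csv?sv=...&sig=...")

    with open("report.csv", "rb") as f:
        req = urllib.request.Request(sas_url, data=f.read(), method="PUT")
        req.add_header("x-ms-blob-type", "BlockBlob")  # create the blob as a block blob
        urllib.request.urlopen(req)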

Microsoft offers both locally redundant storage (replicas are held within a single region) and geo-redundant storage (replicas are held in multiple regions within a single geo). Read-Access Geo Redundant Storage is currently available as a preview service. This allows customers to have read access to a secondary storage replica so that it may still be accessed in the event of a failure in the primary storage location.

Users of Windows Azure

Some well known users of Windows Azure include the Sochi 2014 Olympic Games, luxury sports car manufacturer Aston Martin, Taiwanese electronics brand BenQ, McDonald's Happy Studio, and the Have I been pwned? website, which allows users to see whether their email addresses or usernames have been affected by any publicly released website security breaches.

Troy Hunt, the developer of haveibeenpwned.com, uses Windows Azure Table Storage to store more than 160 million records much more cheaply than a comparable relational database. In fact, one of his complaints about Windows Azure is that it is too damn fast: "The response from each search was coming back so quickly that the user wasn't sure if it was legitimately checking subsequent addresses they entered or if there was a glitch". Hunt also described how he used SQL Server on Windows Azure to analyse last year's Adobe data breach, which involved 153 million records. After downloading the breach data to a low-spec Azure virtual machine, he upgraded the virtual machine to an 8-processor system with 56 gigabytes of RAM and completed his on-demand analysis at an estimated cost of $12.

Fake SSL certificates deployed across the internet

Netcraft has found dozens of fake SSL certificates impersonating banks, ecommerce sites, ISPs and social networks. Some of these certificates may be used to carry out man-in-the-middle attacks against the affected companies and their customers. Successful attacks would allow criminals to decrypt legitimate online banking traffic before re-encrypting it and forwarding it to the bank. This would leave both parties unaware that the attacker may have captured the customer's authentication credentials, or manipulated the amount or recipient of a money transfer.

The fake certificates bear common names (CNs) which match the hostnames of their targets (e.g. www.facebook.com). As the certificates are not signed by trusted certificate authorities, none will be regarded as valid by mainstream web browser software; however, an increasing amount of online banking traffic now originates from apps and other non-browser software which may fail to adequately check the validity of SSL certificates.
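For comparison, here is a minimal sketch of the checks a well-behaved client performs, using Python 3.4's default SSL context: the certificate chain must lead to a trusted certificate authority and the certificate must match the hostname, so a fake www.facebook.com certificate of the kind described here would be rejected rather than silently accepted.

    import socket
    import ssl

    def certificate_subject(host, port=443):
        # create_default_context() enables CERT_REQUIRED and hostname checking.
        ctx = ssl.create_default_context()
        with ctx.wrap_socket(socket.create_connection((host, port)),
                             server_hostname=host) as conn:
            return conn.getpeercert()["subject"]

    # Raises ssl.SSLError / ssl.CertificateError if the certificate is untrusted
    # or does not match the requested hostname.
    print(certificate_subject("www.facebook.com"))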

Fake certificates alone are not enough to allow an attacker to carry out a man-in-the-middle attack. He would also need to be in a position to eavesdrop on the network traffic flowing between the victim's mobile device and the servers it communicates with. In practice, this means that an attacker would need to share a network and internet connection with the victim, or would need access to some system on the internet between the victim and the server. Setting up a rogue wireless access point is one of the easiest ways for an individual to carry out such attacks, as the attacker can easily monitor all network traffic as well as influence the results of DNS lookups (for example, making www.examplebank.com resolve to an IP address under his control).

Researchers from Stanford University and The University of Texas at Austin found broken SSL certificate validation in Amazon's EC2 Java library, Amazon's and PayPal's merchant SDKs, integrated shopping carts such as osCommerce and ZenCart, and AdMob code used by mobile websites. A lack of certificate checks within the popular Steam gaming platform also allowed consumer PayPal payments to be undetectably intercepted for at least 3 months before eventually being fixed.

Online banking apps for mobile devices are tempting targets for man-in-the-middle attacks, as SSL certificate validation is far from trivial, and mobile applications often fall short of the standard of validation performed by web browsers. 40% of iOS-based banking apps tested by IOActive are vulnerable to such attacks because they fail to validate the authenticity of SSL certificates presented by the server. 41% of selected Android apps were found to be vulnerable in manual tests by Leibniz University of Hannover and Philipps University of Marburg in Germany. Both apps and browsers may also be vulnerable if a user can be tricked into installing rogue root certificates through social engineering or malware attacks, although this kind of attack is far from trivial on an iPhone.

The following fake certificate for facebook.com is served from a web server in Ukraine. There are clearly fraudulent intentions behind this certificate, as browsing to the site presents a Facebook phishing site; however, the official Facebook app is safe from such attacks, as it properly validates SSL certificates and also uses certificate pinning to ensure that it is protected against fraudulently issued certificates.
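Certificate pinning of the kind the Facebook app uses can be illustrated with a short sketch: the client compares the SHA-256 fingerprint of whatever certificate the server presents against a fingerprint it already trusts, so even a fraudulently issued but CA-signed certificate would be rejected. The pinned value below is a placeholder, not Facebook's real fingerprint.

    import hashlib
    import socket
    import ssl

    # Placeholder pin; a real client would embed the known-good fingerprint.
    PINNED_SHA256 = "0" * 64

    def certificate_matches_pin(host, port=443):
        ctx = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
        ctx.verify_mode = ssl.CERT_NONE       # the trust decision is made by the pin below
        with ctx.wrap_socket(socket.create_connection((host, port)),
                             server_hostname=host) as conn:
            der_cert = conn.getpeercert(binary_form=True)
        return hashlib.sha256(der_cert).hexdigest() == PINNED_SHA256

    print(certificate_matches_pin("www.facebook.com"))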

Similarly, this wildcard certificate for *.google.com could suggest an attempted attack against a multitude of Google services. The fake certificate is served from a machine in Romania, which also hosts dozens of websites with .ro and .com top level domains. It claims to have been issued by America Online Root Certification Authority 42, closely mimicking the legitimate AOL trusted root certificates which are installed in all browsers, but the fake certificate lacks a verifiable certificate chain. Some browsers' default settings will not allow a user to bypass the resultant error message.

Not all fake certificates have fraudulent intentions, though. The KyoCast mod uses a similar wildcard certificate for *.google.com, allowing rooted Chromecast devices to intentionally send certain traffic to KyoCast servers instead of Google's. The fake certificate is issued by "Kyocast Root CA". Using the Subject Alternative Name extension, the certificate specifies a list of other hostnames for which the certificate should be considered valid:

Russia's second largest bank was seemingly targeted by the following certificate – note that the issuer details have also been forged, possibly in an attempt to exploit superficial validation of the certificate chain.

A similar technique is used in this certificate which impersonates a large Russian payment services provider. SecureTrust is part of Trustwave, a small but bona fide certificate authority.

GoDaddy's POP mail server is impersonated in the following certificate. In this case, the opportunities could be criminal (capturing mail credentials, issuing password resets, stealing sensitive data) or even state spying, although it is unexpected to see such a certificate being offered via a website. Although the actual intentions are unknown, it is worth noting that many mail clients allow certificate errors to be ignored either temporarily or permanently, and some users may be accustomed to dismissing such warnings.

Apple iTunes is currently the most popular phishing target after PayPal. In this example, the fake certificate has an issuer common name of "VeriSign Class 3 Secure Server CA - G2", which mimics legitimate common names in valid certificates; however, there is no certificate chain linking it back to VeriSign's root (so it is a forgery rather than a mis-issued certificate).

It is not always criminals who use fake certificates to intercept communications. As a final example, the following fake certificate for youtube.com was served from a machine in Pakistan, where there is a history of blocking access to YouTube. This certificate is probably part of an attempt to prevent citizens from watching videos on YouTube, as the website serves "This content is banned in Pakistan" when visited.

Netcraft's Mobile App Security Testing service provides a detailed security analysis of phone or tablet based apps. A key feature of this service is manual testing by experienced security professionals, which typically uncovers many more issues than automated tests alone. The service is designed to rigorously push the defences of not only the app itself, but also the servers it interacts with. It is suitable for commissioning, third party assurance, post-attack analysis, audit and regulatory purposes where independence and quality of service are important requirements.

GCHQ website falls after threats from Anonymous

GCHQ's website at www.gchq.gov.uk is exhibiting some noticeable performance issues today, suggesting that it could be suffering from a denial of service attack.

Last week, documents from whistle-blower Edward Snowden revealed that GCHQ carried out denial of service (DoS) attacks against communications systems used by the hacktivist group Anonymous during their own Operation Payback, which itself involved carrying out denial of service attacks against high profile websites such as MasterCard, Visa, Amazon, Moneybookers, and PostFinance.

This caused some furore amongst supporters of Operation Payback, some of whom were tried and convicted for carrying out denial of service attacks. Denial of service attacks are illegal in the UK under the Police and Justice Act 2006, yet the leaked slides suggest that GCHQ may have used such techniques against Anonymous, resulting in 80% of IRC users leaving within a month.


Part of a statement published by Anonymous on AnonNews.

Following these revelations, a statement on GCHQ's war against Anonymous was posted on the AnonNews website. The statement ends with a suggestion that some kind of retaliation could be expected: "Now that we truly know who it was who attacked us, Expect all of us."

Twitter accounts associated with Anonymous also fuelled suggestions that they could be responsible for GCHQ's website woes, with some referring to the #TheDayWeFightBack hashtag.

Curiously, a much larger amount of downtime has been observed from Netcraft's Romanian performance monitor since the leaked slides were made public. That could indicate much more extreme DDoS mitigation techniques are being applied to these requests, and this in turn suggests that if an attack is occurring, perhaps Romania is one of the countries from which the attacks are being launched.

The www.gchq.gov.uk website is served from a content delivery network run by Limelight Networks, who claim to be one of the world's largest, best performing, and most highly available content delivery networks. Although it remains hosted at the same location, the website changed its Server header from "WebServer" to "EdgePrism/4.1.2.0" earlier this week. Limelight Networks first unveiled EdgePrism in 2001, so any similarities to the name of the NSA's PRISM mass electronic surveillance program are presumably coincidental.

Are there really lots of vulnerable Apache web servers?

Apache has been the most common web server on the internet since April 1996, and is currently used by 38% of all websites. Much nefarious activity takes place on compromised servers, so just how many of these Apache servers are actually vulnerable?

The latest release in the stable 2.4 branch is Apache 2.4.7, which was released in November 2013. However, very few websites claim to be using the 2.4 branch, despite Apache encouraging users to upgrade from 2.2 and earlier versions.

Less than 1% of all Apache-powered websites feature an Apache/2.4.x server header, although amongst the top million websites, more than twice as many sites claim to be using Apache 2.4.x. Some of the busiest websites using the latest version of Apache (2.4.7) are associated with the Apache Software Foundation and run on the FreeBSD operating system, including httpd.apache.org, www.openoffice.org, wiki.apache.org, tomcat.apache.org and mail-archives.apache.org.

The most recent security vulnerabilities affecting Apache were addressed in version 2.4.5, which included fixes for the vulnerabilities described in CVE-2013-1896 and CVE-2013-2249. Depending on which Apache modules are installed, and how they are used, earlier versions may be vulnerable to unauthorised disclosure of information and disruption of service. The previous release in the 2.4 branch (2.4.4) also addressed several cross-site scripting (XSS) vulnerabilities in various modules; such vulnerabilities can severely compromise a web application by facilitating remote session hijacking and the theft of user credentials. Nonetheless, millions of websites still appear to be using vulnerable versions of Apache, including versions which are no longer supported.


Top 15 versions of Apache in February 2014, where the full version string is announced in the Server HTTP response header.
Note that no versions of the Apache 2.4 branch appear within the top 15.
Apache 1.3.41 and 2.0.63 have both reached end of life.

The Apache 2.0 branch was retired in July 2013 with the final release of Apache 2.0.65. This release addressed a few security vulnerabilities, but any vulnerabilities discovered from now on will not be addressed by official patches or further releases in the 2.0 branch. Anyone still using this branch should strongly consider updating to the latest version in the stable 2.4 or legacy 2.2 branches.

Nevertheless, 6.5 million websites claim to be using the end-of-life 2.0 branch of Apache, with the most common versions being 2.0.63 and 2.0.52. Only 12,000 sites are running the final release of this branch (2.0.65). However, it is worth noting that just over half of all Apache-powered websites hide their version numbers, so it is not always possible to accurately determine which version is installed without carrying out additional tests. Hiding software version numbers is usually a deliberate act by a server administrator: Apache 2.4.7 will reveal its full version number by default when installed on Arch Linux, and installing the apache2 package on the latest version of Ubuntu Linux will also reveal "Apache 2.4.6 (Ubuntu)" as the default Server banner.
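A simple banner check of the kind described here can be done with a HEAD request; the sketch below just reports whatever the Server header says, which may be a full version string, a bare product name, or nothing at all.

    import urllib.request

    def server_banner(url):
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req) as resp:
            return resp.headers.get("Server", "(no Server header)")

    # Example targets: the first announced a full Apache/2.4.7 banner at the time
    # of the survey, while sites that hide their version return just "Apache".
    print(server_banner("http://httpd.apache.org/"))
    print(server_banner("http://www.wikipedia.org/"))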

Due to hidden version numbers, the number of sites openly reporting to be running Apache 2.4.x could be regarded as a lower bound, but conversely, exhibiting a vulnerable version number does not necessarily mean that a server can be exploited by a remote attacker.

For example, the Red Hat Linux operating system uses a backporting approach to applying security fixes, which means that a vulnerability in Apache 2.2.3 can be patched without affecting the apparent version number of the software. From an external point of view, the server will still appear to be running Apache 2.2.3, but it might not be vulnerable to any security problems that would affect a fresh installation of Apache 2.2.3.

Red Hat 5 and 6 use Apache 2.2.3 and 2.2.15 respectively, which explains why these seemingly old versions remain so prominent today (2.2.3 was originally released in July 2006). Both are still supported by Red Hat, and providing the necessary backported patches have been applied, Red Hat Apache servers which exhibit these version numbers can be just as secure as the latest release of Apache. However, because the version numbers correspond to Apache versions which were released several years ago, it is not unusual for Red Hat powered websites to attract unfair criticism for appearing to run insecure versions of Apache.

Certain Apache vulnerabilities can also be eliminated by removing or simply not using the affected modules – a configuration which is also difficult to ascertain remotely. However, exhibiting an apparently-vulnerable version number can still have its downsides, even if there are no vulnerabilities to exploit – as well as attracting unwarranted criticism from observers who falsely believe that the server is insecure, it could also attract undesirable scrutiny from hackers who might stumble upon different vulnerabilities instead. These are both common reasons why server administrators sometimes opt to hide version information from a web server's headers. Sites which do this include wikipedia.org, www.bbc.co.uk, www.nytimes.com and www.paypal.com, all of which claim to be running Apache, but do not directly reveal which version.

A further 6.0 million websites are still using Apache 1.3.x, even though the final version in this branch was released four years ago. The release of Apache 1.3.42 in February 2010 marked the end of life for the 1.3 branch, although 2.4 million sites are still using the previous version (1.3.41), which contains a denial of service and remote code execution vulnerability in its mod_proxy module.

The busiest site still using Apache 1.3 is Weather Underground, which uses Apache 1.3.42. This currently has a Netcraft site rank of 177, which makes it even more popular than the busiest Apache 2.0.x website. It is served from a device which exhibits the characteristics of a Citrix NetScaler application delivery controller. Weather Underground also uses Apache 1.3.42 for the mobile version of its site at m.wund.com.

Amongst the million busiest websites, Linux is by far the most common operating system used to run Apache web server software. With near-ubiquitous support for PHP, such platforms make tempting targets for fraudsters. Most of the phishing sites analysed by Netcraft rely on PHP to process the content of web forms and send emails.

The Audited by Netcraft service provides a means of regularly testing internet infrastructure for similarly vulnerable web server software, faulty configurations, weak encryption and other issues which would fail to meet the PCI DSS standard. Netcraft's heuristic fingerprinting techniques can often use the behaviour of a web server to identify which version of Apache is installed, even if the server does not directly state which version is being used. These automated scans can be run as frequently as every day, and can be augmented by Netcraft's Web Application Security Testing service, which provides a much deeper manual analysis of a web application by an experienced security professional.