GCHQ website falls after threats from Anonymous

GCHQ's website at www.gchq.gov.uk is exhibiting some noticeable performance issues today, suggesting that it could be suffering from a denial of service attack.

Last week, documents leaked by whistle-blower Edward Snowden revealed that GCHQ had carried out denial of service (DoS) attacks against communications systems used by the hacktivist group Anonymous during the group's own Operation Payback, a campaign which itself involved denial of service attacks against high-profile websites such as MasterCard, Visa, Amazon, Moneybookers, and PostFinance.

This caused some furore amongst supporters of Operation Payback, some of whom were tried and convicted for carrying out denial of service attacks. Denial of service attacks are illegal in the UK under the Police and Justice Act 2006, yet the leaked slides suggest that GCHQ may have used such techniques against Anonymous, resulting in 80% of IRC users leaving within a month.


Part of a statement published by Anonymous on AnonNews.

Following these revelations, a statement on GCHQ's war against Anonymous was posted on the AnonNews website. The statement ends with a suggestion that some kind of retaliation could be expected: "Now that we truly know who it was who attacked us, Expect all of us."

Twitter accounts associated with Anonymous also fuelled suggestions that they could be responsible for GCHQ's website woes, with some referring to the #TheDayWeFightBack hashtag.

Curiously, Netcraft's Romanian performance monitor has observed a much larger amount of downtime since the leaked slides were made public. This could indicate that more aggressive DDoS mitigation techniques are being applied to requests from that location, which in turn suggests that if an attack is taking place, Romania may be one of the countries from which it is being launched.

The www.gchq.gov.uk website is served from a content delivery network run by Limelight Networks, who claim to be one of the world's largest, best performing, and most highly available content delivery networks. Although it remains hosted at the same location, the website changed its Server header from "WebServer" to "EdgePrism/4.1.2.0" earlier this week. Limelight Networks first unveiled EdgePrism in 2001, so any similarities to the name of the NSA's PRISM mass electronic surveillance program are presumably coincidental.

Are there really lots of vulnerable Apache web servers?

Apache has been the most common web server on the internet since April 1996, and is currently used by 38% of all websites. Most nefarious activity takes place on compromised servers, but just how many of these Apache servers are actually vulnerable?

The latest release in the stable 2.4 branch is Apache 2.4.7, which was released in November 2013. However, very few websites claim to be using the 2.4 branch at all, despite Apache encouraging users to upgrade from 2.2 and earlier versions.

Less than 1% of all Apache-powered websites feature an Apache/2.4.x server header, although amongst the top million websites, more than twice as many sites claim to be using Apache 2.4.x. Some of the busiest websites using the latest version of Apache (2.4.7) are associated with the Apache Software Foundation and run on the FreeBSD operating system, including httpd.apache.org, www.openoffice.org, wiki.apache.org, tomcat.apache.org and mail-archives.apache.org.

The most recent security vulnerabilities affecting Apache were addressed in version 2.4.5, which included fixes for the vulnerabilities described in CVE-2013-1896 and CVE-2013-2249. Depending on which Apache modules are installed, and how they are used, earlier versions may be vulnerable to unauthorised disclosure of information and disruption of service. The previous release in the 2.4 branch (2.4.4) also addressed several cross-site scripting (XSS) vulnerabilities in various modules; such vulnerabilities can severely compromise a web application by facilitating remote session hijacking and the theft of user credentials. Nonetheless, millions of websites still appear to be using vulnerable versions of Apache, including versions which are no longer supported.
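
As a rough illustration of how such counts can be produced from Server banners, the sketch below flags versions older than an assumed minimum patched release for each branch. It is a sketch only: the branch minimums are illustrative assumptions, and, as discussed later in this article, backported fixes and hidden version numbers mean a banner on its own proves very little.

    import re

    # Assumed minimum patched releases per branch (illustrative placeholders only).
    MINIMUM_PATCHED = {(2, 4): (2, 4, 7), (2, 2): (2, 2, 26), (2, 0): (2, 0, 65)}

    def banner_looks_outdated(server_header):
        match = re.search(r"Apache/(\d+)\.(\d+)\.(\d+)", server_header)
        if not match:
            return None  # not Apache, or the version number is hidden
        major, minor, patch = (int(part) for part in match.groups())
        minimum = MINIMUM_PATCHED.get((major, minor))
        if minimum is None:
            return True  # unknown or end-of-life branch, e.g. 1.3.x
        return (major, minor, patch) < minimum

    print(banner_looks_outdated("Apache/2.2.3 (Red Hat)"))  # True, but see the backporting caveat below
    print(banner_looks_outdated("Apache"))                  # None - version not announced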


Top 15 versions of Apache in February 2014, where the full version string is announced in the Server HTTP response header.
Note that no versions of the Apache 2.4 branch appear within the top 15.
Apache 1.3.41 and 2.0.63 have both reached end of life.

The Apache 2.0 branch was retired in July 2013 with the final release, Apache 2.0.65. This release addressed a few security vulnerabilities, but any vulnerabilities discovered from now on will not be fixed by official patches or further releases in the 2.0 branch. Anyone still using this branch should strongly consider upgrading to the latest version in the stable 2.4 or legacy 2.2 branches.

Nevertheless, 6.5 million websites claim to be using the end-of-life 2.0 branch of Apache, with the most common versions being 2.0.63 and 2.0.52. Only 12k sites are running the final release of this branch (2.0.65). However, it is worth noting that just over half of all Apache-powered websites hide their version numbers, so it is not always possible to accurately determine which version is installed without carrying out additional tests. Hiding the version number is usually a deliberate act by the server administrator – Apache 2.4.7 reveals its full version number by default when installed on Arch Linux, and installing the apache2 package on the latest version of Ubuntu Linux also reveals "Apache 2.4.6 (Ubuntu)" as the default Server banner.

Due to hidden version numbers, the number of sites openly reporting to be running Apache 2.4.x could be regarded as a lower bound, but conversely, exhibiting a vulnerable version number does not necessarily mean that a server can be exploited by a remote attacker.

For example, the Red Hat Linux operating system uses a backporting approach to applying security fixes, which means that a vulnerability in Apache 2.2.3 can be patched without affecting the apparent version number of the software. From an external point of view, the server will still appear to be running Apache 2.2.3, but it might not be vulnerable to any security problems that would affect a fresh installation of Apache 2.2.3.

Red Hat 5 and 6 use Apache 2.2.3 and 2.2.15 respectively, which explains why these seemingly old versions remain so prominent today (2.2.3 was originally released in July 2006). Both are still supported by Red Hat, and provided the necessary backported patches have been applied, Red Hat Apache servers which exhibit these version numbers can be just as secure as the latest release of Apache. However, because the version numbers correspond to Apache releases from several years ago, it is not unusual for Red Hat powered websites to attract unfair criticism for appearing to run insecure versions of Apache.

Certain Apache vulnerabilities can also be eliminated by removing or simply not using the affected modules – a configuration which is also difficult to ascertain remotely. However, exhibiting an apparently-vulnerable version number can still have its downsides, even if there are no vulnerabilities to exploit – as well as attracting unwarranted criticism from observers who falsely believe that the server is insecure, it could also attract undesirable scrutiny from hackers who might stumble upon different vulnerabilities instead. These are both common reasons why server administrators sometimes opt to hide version information from a web server's headers. Sites which do this include wikipedia.org, www.bbc.co.uk, www.nytimes.com and www.paypal.com, all of which claim to be running Apache, but do not directly reveal which version.
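
For sites which hide the version, the Server banner confirms only the vendor. The following minimal sketch (Python standard library only) simply prints what a few of the sites mentioned above announce; some sites may redirect or refuse HEAD requests, and the banners may of course have changed since this article was written.

    from urllib.request import Request, urlopen

    # Print the Server banner announced by a few well-known Apache-powered sites.
    for host in ("wikipedia.org", "www.bbc.co.uk", "www.nytimes.com"):
        request = Request("http://" + host, method="HEAD")
        with urlopen(request, timeout=10) as response:
            print(host, "->", response.headers.get("Server", "(no Server header)"))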

A further 6.0 million websites are still using Apache 1.3.x, even though the final version in this branch was released four years ago. The release of Apache 1.3.42 in February 2010 marked the end of life for the 1.3 branch, although 2.4 million sites are still using the previous version (1.3.41), which contains a denial of service and remote code execution vulnerability in its mod_proxy module.

The busiest site still using Apache 1.3 is Weather Underground, which uses Apache 1.3.42. This currently has a Netcraft site rank of 177, which makes it even more popular than the busiest Apache 2.0.x website. It is served from a device which exhibits the characteristics of a Citrix NetScaler application delivery controller. Weather Underground also uses Apache 1.3.42 for the mobile version of its site at m.wund.com.

Amongst the million busiest websites, Linux is by far the most common operating system used to run Apache web server software. With near-ubiquitous support for PHP, such platforms make tempting targets for fraudsters. Most of the phishing sites analysed by Netcraft rely on PHP to process the content of web forms and send emails.

The Audited by Netcraft service provides a means of regularly testing internet infrastructure for similarly vulnerable web server software, faulty configurations, weak encryption and other issues which would fail to meet the PCI DSS standard. Netcraft's heuristic fingerprinting techniques can often use the behaviour of a web server to identify which version of Apache is installed, even if the server does not directly state which version is being used. These automated scans can be run as frequently as every day, and can be augmented by Netcraft's Web Application Security Testing service, which provides a much deeper manual analysis of a web application by an experienced security professional.

Most Reliable Hosting Company Sites in January 2014

Rank  Company                  OS                    Outage (hh:mm:ss)  Failed Req (%)  DNS    Connect  First byte  Total
1     Datapipe                 FreeBSD               0:00:00            0.007           0.090  0.020    0.039       0.059
2     Qube Managed Services    Linux                 0:00:00            0.007           0.109  0.041    0.083       0.083
3     Hyve Managed Hosting     Linux                 0:00:00            0.007           0.257  0.064    0.126       0.127
4     Netcetera                Windows Server 2012   0:00:00            0.007           0.056  0.070    0.156       0.292
5     Swishmail                FreeBSD               0:00:00            0.007           0.133  0.073    0.144       0.189
6     Bigstep                  Linux                 0:00:00            0.011           0.314  0.065    0.137       0.228
7     www.uk2.net              Linux                 0:00:00            0.011           0.152  0.069    0.142       0.224
8     Server Intellect         Windows Server 2012   0:00:00            0.011           0.089  0.099    0.635       0.984
9     Midphase                 Linux                 0:00:00            0.015           0.262  0.120    0.243       0.444
10    Anexia                   Linux                 0:00:00            0.019           0.127  0.093    0.436       0.717

(DNS, Connect, First byte and Total are average response times in seconds.)

See full table

All of the top five hosting company sites started the year with only two failed requests each. The average connection times were used to break the tie for first place.

Managed services provider Datapipe had January's most reliable hosting company site, with two failed requests and an average connection time of 20ms, the fastest within the top 10 and the second fastest overall. Datapipe provides managed enterprise cloud services via its Stratosphere cloud computing platform, delivered from Datapipe's global network of data centres in conjunction with Amazon Web Services. Datapipe acquired Newvem in September, combining its managed AWS offering with a cloud optimisation platform. Datapipe has not had a single outage since 2006, and is only two months away from achieving 8 years of continuous 100% uptime.

In second place is last month's number one, Qube Managed Services, followed by UK-based Hyve Managed Hosting in third, with average connection times of 41ms and 64ms respectively. Qube's hosting services are delivered from data centre facilities in London, New York and Zurich, and this is its fifth consecutive month in the top 10.

Hyve Managed Hosting offers a 100% network uptime guarantee by using multiple Tier 1 backbone providers and peering with multiple networks in several global locations. In December, Hyve launched SecureShare, which allows its customers to host their own encrypted file sharing servers, protected behind the hardware firewalls, intrusion prevention systems and DDoS defence devices provided as part of its SecureCloud services.

Elsewhere in the full table this month, Codero received $8M in financing from Silicon Valley Bank (SVB) and Farnam Street Financial. Codero plans to deploy new data centres across the U.S. and Europe and to expand its hosting portfolio to serve more customers.

FreeBSD powered both Datapipe's table-topping site and that of corporate email services provider Swishmail in 5th place. The remaining hosting company sites ran Linux, except for two running Windows Server 2012: Netcetera in 4th place and Server Intellect in 8th place.

Netcraft measures and makes available the response times of around forty leading hosting providers' sites. The performance measurements are made at fifteen minute intervals from separate points around the internet, and averages are calculated over the immediately preceding 24 hour period.

From a customer's point of view, the percentage of failed requests is more pertinent than the length of outages on a hosting company's own site, as it gives a pointer to the reliability of routing; this is why we choose to rank the table by fewest failed requests rather than by shortest period of outage. If the number of failed requests is equal, sites are ranked by average connection time.
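
A minimal sketch of that ranking rule, using a handful of figures from the table above:

    # Rank hosting company sites by fewest failed requests, breaking ties on the
    # lower average connection time (figures taken from the table above, in seconds).
    sites = [
        {"company": "Hyve Managed Hosting", "failed_pct": 0.007, "connect": 0.064},
        {"company": "Datapipe", "failed_pct": 0.007, "connect": 0.020},
        {"company": "Bigstep", "failed_pct": 0.011, "connect": 0.065},
        {"company": "Qube Managed Services", "failed_pct": 0.007, "connect": 0.041},
    ]

    ranked = sorted(sites, key=lambda site: (site["failed_pct"], site["connect"]))
    for position, site in enumerate(ranked, start=1):
        print(position, site["company"])
    # 1 Datapipe, 2 Qube Managed Services, 3 Hyve Managed Hosting, 4 Bigstep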

Information on the measurement process and current measurements is available.

NIST continues using SHA-1 algorithm after banning it

The National Institute of Standards and Technology (NIST) is still using SSL certificates signed with the SHA-1 signature algorithm, despite issuing a Special Publication disallowing the use of this algorithm for digital signature generation after 2013.

"SHA-1 shall not be used for digital signature generation after December 31, 2013."
— NIST recommendation

The SSL certificate for www.nist.gov is signed using the SHA-1 hashing algorithm and was issued by VeriSign on 23 January 2014, more than three weeks after NIST's own ban came into effect. NIST's "Secure File Transfer Service" at xnfiles.nist.gov also uses a SHA-1 certificate issued this year.
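
This sort of check can be reproduced independently by fetching a site's certificate and inspecting its signature algorithm. A minimal sketch in Python, assuming the third-party cryptography package is installed:

    # Report the signature hash algorithm of a site's SSL certificate.
    # Requires the third-party "cryptography" package (pip install cryptography).
    import ssl
    from cryptography import x509
    from cryptography.hazmat.backends import default_backend

    def signature_hash(hostname, port=443):
        pem = ssl.get_server_certificate((hostname, port))
        certificate = x509.load_pem_x509_certificate(pem.encode("ascii"), default_backend())
        return certificate.signature_hash_algorithm.name  # e.g. "sha1" or "sha256"

    print(signature_hash("www.nist.gov"))  # "sha1" for the certificate discussed above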

An attacker able to find SHA-1 collisions could carefully construct a pair of certificates with colliding SHA-1 hashes: one a conventional certificate to be signed by a trusted CA, the other a sub-CA certificate able to be used to sign arbitrary SSL certificates. By substituting the signature from the CA-signed certificate into the sub-CA certificate, certificate chains containing the attacker-controlled sub-CA certificate will pass browser verification checks. This attack is, however, made more difficult by path constraints and the inclusion of unpredictable data into the certificate before signing it.

The increasing practicality of finding SHA-1 hash collisions could make it possible for a well-funded attacker to impersonate any HTTPS website. With a practical attack against SHA-1 (using cloud computing resources) estimated to cost $2.77M in 2012, falling to $700k by 2015, such an attack may soon be within the reach of government agencies.

The SSL certificate for www.nist.gov with the signature algorithm and issuance date highlighted.

Along with NIST itself, many US Government institutions have continued to generate new SSL certificates with SHA-1 signatures. Examples include the certificate for donogc.navy.mil, issued on 3 January 2014, and several United States Bankruptcy Court document filing systems (each state has its own site and uses its own SHA-1-signed SSL certificate). Despite receiving widespread criticism for a number of other security problems last year, the ObamaCare exchange at healthcare.gov also saw fit to deploy a new SSL certificate in January which uses the SHA-1 hashing algorithm.

NIST and the rest of the US government are far from alone, however, as more than 92% of all certificates issued this year use the SHA-1 hashing algorithm.

Although the recommendations from the National Institute of Standards and Technology have been prepared for US federal agencies, the cryptographic weaknesses of SHA-1 should concern anyone who relies on SHA-1 to generate or validate digital signatures. Microsoft shares these concerns and has announced plans to deprecate the use of SHA-1 in both SSL and code signing certificates by the end of 2016.

The NSA-designed SHA-2 family (which includes SHA-224, SHA-256, SHA-384 and SHA-512) now provides the only cryptographic hash functions approved by NIST for digital signature generation. Whilst SHA-2 shares some similarities with SHA-1, there are significant structural differences, and SHA-2 does not share SHA-1's known mathematical weaknesses. All of the SHA-2 algorithms also produce much longer digests: SHA-1 produces only a 160-bit digest, whereas the most common digest length for SHA-2 is 256 bits.
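
The difference in digest length is easy to see using Python's standard hashlib module:

    # Digest lengths of SHA-1 and the SHA-2 family, using only the standard library.
    import hashlib

    for name in ("sha1", "sha224", "sha256", "sha384", "sha512"):
        digest_bits = hashlib.new(name, b"example").digest_size * 8
        print(f"{name}: {digest_bits}-bit digest")
    # sha1: 160-bit digest ... sha256: 256-bit digest ... sha512: 512-bit digest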


Huge divide: SHA-256 uptake remains low, and is still only used by a handful of certificate authorities.
Other signature algorithms with negligible shares (e.g. MD5 and SHA-512) are not displayed.

In total, more than 98% of all SSL certificates in use on the Web are still using SHA-1 signatures. Netcraft's February 2014 SSL Survey found more than 256,000 of these certificates would otherwise be valid beyond the start of 2017 and, due to the planned deprecation of SHA-1, will need to be replaced before their natural expiry dates.

SHA-256 is the most commonly used signature algorithm from the SHA-2 family, but it is used by only 1.86% of the valid certificates in Netcraft's February 2014 SSL Survey; nonetheless, this share has more than doubled in the space of 4 months, suggesting that some certificate authorities are starting to take the issue seriously. So far in 2014, more than 61% of the new certificates signed with SHA-256 were issued by a single certificate authority, Go Daddy. SHA-512 is the only other SHA-2 family algorithm to be seen used in SSL certificates, albeit deployed on only 4 websites so far.

The CA/B Forum – which comprises both certificate authorities and web browser vendors – put forward Ballot 111 last year, proposing to take advantage of the deprecation of SHA-1 by accelerating the forum's planned move to shorter maximum certificate lifetimes. The deprecation alone will mean that some five-year certificates which are valid today will not be usable for their entire lifetimes.

In practice, it is likely to be Microsoft's plans to deprecate the use of SHA-1 signatures by the end of 2016 which will force the mass adoption of SHA-2 by certificate authorities, although allowing three years for this to happen might seem generous. The majority of end users are unlikely to be affected by the change, and most website administrators will probably have to renew their SSL certificates within this period anyway, but certificates which are reissued with SHA-1 signatures run the risk of being unsupported by other browsers in the future.

Cryptographic weaknesses in SHA-1 have been discussed for several years. A notable better-than-brute-force attack was announced nine years ago, demonstrating a SHA-1 hash collision that could be achieved within 2^69 calculations, as opposed to the 2^80 that would be required by a brute-force approach.

More recently, the best public cryptanalysis of SHA-1 estimated that a full collision can be achieved with a complexity of around 2^61 operations, while a near-collision can be achieved in 2^57.5. These attacks have been implemented in the HashClash framework, which provides differential path construction attacks against SHA-1, as well as chosen-prefix collisions against the even weaker MD5 algorithm. The CA/B Forum recommends that all certificate serial numbers should exhibit at least 20 bits of entropy, which would mitigate chosen-prefix collision attacks against hash functions that are not collision resistant, although such measures are not thought to be necessary for SHA-2 at the current time.
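
To put those exponents in perspective, a rough back-of-the-envelope comparison (our own arithmetic, not a figure taken from the cryptanalysis itself):

    \[ \frac{2^{80}}{2^{61}} = 2^{19} \approx 524{,}000 \]

In other words, the best public collision attack is estimated to be roughly half a million times cheaper than a brute-force search.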

Windows XP has supported SHA-256, SHA-384 and SHA-512 since the release of Service Pack 3 in 2008, and Windows Server 2003 can also support SHA-2 if the KB938397 hotfix has been installed. Deprecating SHA-1 could therefore also have some other indirect security benefits: anyone still using Windows XP before Service Pack 3 will be unable to make effective use of the web as SHA-2 certificates gain prominence.

The SHA-1 algorithm is also used in all versions of the TLS cryptographic protocol, and only the latest version (TLS 1.2) introduces SHA-256 as a replacement for the MD5/SHA-1 combination for both the pseudorandom function and the finished message hash. Microsoft's SHA-1 deprecation policy will only apply to applications which call the CertGetCertificateChain API to build and validate a certificate chain, so older browsers and hardware devices which do not yet support TLS 1.2 will be unaffected.
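
Whether a particular server will negotiate TLS 1.2 can be checked with a short probe. A minimal sketch using Python 3.7 or later (the hostname is purely an example, and the call raises ssl.SSLError if the server cannot negotiate TLS 1.2 or newer):

    # Check whether a server will negotiate TLS 1.2 or newer (Python 3.7+).
    import socket
    import ssl

    def negotiated_protocol(hostname, port=443):
        context = ssl.create_default_context()
        context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse anything older
        with socket.create_connection((hostname, port), timeout=10) as sock:
            with context.wrap_socket(sock, server_hostname=hostname) as tls:
                return tls.version()  # e.g. "TLSv1.2" or "TLSv1.3"

    print(negotiated_protocol("www.nist.gov"))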

Update 5 Feb 2014: Following the publication of this article, NIST today replaced the SHA-1 certificate on www.nist.gov with a new one which uses SHA-256 as a signature algorithm. At the time of writing, the certificate used by xnfiles.nist.gov is still signed with SHA-1.

February 2014 Web Server Survey

In the February 2014 survey we received responses from 920,102,079 sites — over 58 million more than last month.

Microsoft gained a staggering 48 million sites this month, increasing its total by 19% — most of this growth is attributable to new sites hosted by Nobis Technology Group. Along with Microsoft, nginx also made a large gain of 14 million sites, whereas Apache fell by 7 million. Unsurprisingly, these changes have had a dramatic effect on the overall market share of each web server vendor, with Microsoft's share growing by 3.38 percentage points to 32.8% (302 million sites) while Apache's has fallen by 3.41 to 38.2% (352 million sites).

Microsoft's market share is now only 5.4 percentage points lower than Apache's, which is the closest it has ever been. If recent trends continue, Microsoft could overtake Apache within the next few months, ending Apache's 17+ year reign as the most common web server. Apache is faring much better in both the active sites and top million sites datasets, however, where it is still dominating with just over half of the market share in both metrics.

Nearly 2% of the top million websites are now being served by CloudFlare's customised version of nginx (cloudflare-nginx), which it uses to serve web content via its globally distributed CDN edge nodes. This month's survey saw more than a thousand of the top million sites migrate to cloudflare-nginx from other web server software, including pizzahut.co.uk, pet-supermarket.co.uk, the image server used by the popular Cheezburger network of blogs, and the official PRINCE2 website which switched from Microsoft IIS 6.0 running on Windows Server 2003.

Overall, nginx powers 17.5% of the top million sites, including popular overclocking forum www.overclock.net, despite its server headers declaring that it is now using Microsoft IIS 4.1. Responses from the server also include an X-Powered-By header which claims the application is running on Visual Basic 2.0 on Rails (Visual Basic 2.0 is a long-deprecated language which was released more than 20 years ago, while Microsoft IIS 4.1 never actually existed). The server claimed to be running nginx during Netcraft's previous survey, and indeed, it exhibits characteristics which suggest it is still using nginx.

The number of sites using the .pw country-code top-level domain (ccTLD) grew by more than half this month, reaching 10M sites in total. This ccTLD is assigned to Palau, but the .pw registry has branded the domain as the Professional Web and allows domains to be registered by the general public, regardless of which country they are in. 97% of this month's new .pw sites are hosted in the US (87% at Nobis Technology Group alone), and 2.7 million of them are running on Windows.

The busiest .pw domain is the single-letter u.pw, which is used by viral social media site Upworthy. Other than that, the ccTLD remains relatively obscure, with less than 0.02% of the top million sites using .pw domains, while other single-letter domains have been sold for the modest sums of $8,000 each. Only two single-letter .pw domains actually appear within the top million sites, and there are no two-letter domains at all. Last year, Symantec noted an increase in spam messages containing URLs with .pw TLDs, and .pw later became the first TLD to adopt the Uniform Rapid Suspension (URS) rights protection mechanism.

Version 1.0 of the URS Technical Requirements were published by ICANN in October 2013, and are intended to make it faster and cheaper for trademark holders to seek resolution when there are obvious cases of infringement. URS is intended to complement rather than replace the existing Uniform Domain-Name Dispute-Resolution Policy (UDRP).





Developer    January 2014   Percent   February 2014   Percent   Change
Apache       358,669,012    41.64%    351,700,572     38.22%    -3.41
Microsoft    253,438,493    29.42%    301,781,997     32.80%    3.38
nginx        124,052,996    14.40%    138,056,444     15.00%    0.60
Google       21,280,639     2.47%     21,129,509      2.30%     -0.17

January 2014 Web Server Survey

In the January 2014 survey we received responses from 861,379,152 sites, an increase of 355,935 since last month.

2013 has been a year of significant change: the web has grown by more than one third, the importance of SSL has been highlighted by a series of spying revelations, Microsoft now powers just below 30% of all web sites, and Apache has lost almost 14 percentage points of market share. Additionally, nginx, the relative newcomer, saw its market share peak at 16%, just shy of Microsoft’s position at the beginning of last year.

The total number of web sites discovered has increased dramatically this year — from 630 million web sites in January 2013 to 861 million in January 2014 (+37%) — though the growth does not compare to the doubling in size during 2011.

With the revelations from the NSA documents leaked by Edward Snowden providing months of mainstream publicity, 2013 has been a bumper year for the SSL industry. Websites are increasingly being served over HTTPS: 48% more sites within the million busiest are using SSL than in January 2013. In total, there are over half a million more SSL certificates (+22%) in use on the web since January 2013. The estimated total revenue of the industry has increased even more rapidly, by 28% from September 2012 to September 2013, reflecting the increased uptake of more expensive certificates including Extended Validation, multi-domain, and wildcard certificates.

Apache remains the most commonly used web server on the internet, with 10 million more web sites using it than at this time last year; however, this growth has not been sufficient to maintain its share of a market which grew by more than 200 million web sites. As a result, Apache’s market share has fallen by almost 14 percentage points since January 2013 and now stands at 42%.

In stark contrast to Apache, Microsoft had a strong year — almost 150 million more web sites use a Microsoft web server than in January 2013. Microsoft’s share is close to 30% of the entire market and a combination of its strong growth and the corresponding lack of growth of sites using Apache has resulted in Apache’s lead shrinking by more than 26 percentage points to just 12. Microsoft's own cloud platform, Azure, has grown steadily throughout 2013 — there are 39% more web-facing computers hosted by Microsoft in January 2014 than the same time last year — and despite offering alternatives, Microsoft's IIS is by far the most common web server on Azure.

Open-source web server nginx has continued to gain acceptance, especially amongst the busiest web sites. Nginx is now used on 14% of all web sites found, up 2 percentage points since January 2013, but has fallen slightly from the peak of 16% it achieved in October. In May 2013, nginx overtook Microsoft to become the second most common web server within the top million busiest sites and now powers almost 16% of them.





Developer    December 2013   Percent   January 2014   Percent   Change
Apache       355,244,900     41.26%    358,669,012    41.64%    0.38
Microsoft    241,777,723     28.08%    253,438,493    29.42%    1.34
nginx        126,485,204     14.69%    124,052,996    14.40%    -0.29
Google       38,263,525      4.44%     21,280,639     2.47%     -1.97