Google’s POODLE affects oodles

97% of SSL web servers are likely to be vulnerable to POODLE, a vulnerability that can be exploited in version 3 of the SSL protocol. POODLE, in common with BEAST, allows a man-in-the-middle attacker to extract secrets from SSL sessions by forcing the victim's browser into making many thousands of similar requests. As a result of the fallback behaviour in all major browsers, connections to web servers that support both SSL 3 and more modern versions of the protocol are also at risk.

The Secure Sockets Layer (SSL) protocol is used by millions of websites to protect confidential data in transit across the internet using strong cryptography. The protocol was designed by Netscape in the mid 1990s and was first released to the public as SSL 2 in February 1995. It was quickly replaced by SSL 3 in 1996 after serious security flaws were discovered. SSL 3 was replaced by the IETF-defined Transport Layer Security (TLS) version 1.0 in January 1999 with relatively few changes. Since TLS 1's release, TLS 1.1 and TLS 1.2 have succeeded it and should be used in its place wherever possible.


POODLE's bark may be worse than its bite

Unlike Heartbleed, POODLE can be used to attack client-server connections and is inherent to the protocol itself, rather than any one implementation such as OpenSSL or Microsoft's SChannel. In order to exploit it, an attacker must modify the victim's network traffic, know how the targeted secret information is structured (such as where a session cookie appears) and be able to force the victim into making a large number of requests.

Each SSL connection is split into a number of chunks, known as SSL records. When a block cipher such as Triple DES is used in CBC mode, each plaintext block is XORed with the previous ciphertext block before encryption, and each record is padded to a whole number of blocks (8 bytes in the case of Triple DES). An attacker with network access can carefully manipulate the ordering of the ciphertext blocks within a record to influence the decryption and exploit the padding oracle. If the attacker has been lucky (there's a 1 in 256 chance), she will have matched the correct value for the padding length in her manipulated record and correctly guessed the value of a single byte of the secret. This can be repeated to reveal the entire targeted secret.
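The single-byte guess at the heart of the attack can be sketched as follows. This is a toy model of the decision the attacker relies on, not code from any SSL implementation; all names are illustrative:

```python
# Toy model of the POODLE padding oracle (illustrative only). The attacker
# copies a target ciphertext block into the final, padding-only block of a
# record. SSL 3 checks nothing but the last byte of the decrypted block
# (the padding length), so the server accepts the record exactly when that
# byte happens to equal the expected padding length.

BLOCK_SIZE = 8  # Triple DES block size in bytes

def record_accepted(last_decrypted_byte: int) -> bool:
    # A full block of padding means the length byte must be BLOCK_SIZE - 1;
    # any other value desynchronises the MAC check and the record is rejected.
    return last_decrypted_byte == BLOCK_SIZE - 1

# Over all 256 possible values of the last decrypted byte, exactly one is
# accepted, giving the attacker's 1-in-256 chance per attempt:
hits = sum(record_accepted(b) for b in range(256))
print(f"{hits} in 256")  # 1 in 256
```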

SSL 3's padding is particularly easy to exploit as it relies on a single byte at the end of the padding: the padding length. Consequently an attacker must force the victim to make only 256×n requests on average for n bytes of secret to be revealed. TLS 1.0 changed this padding mechanism, requiring the padding bytes themselves to have a specific value, making the attack far less likely to succeed.

The POODLE vulnerability makes session hijacking attacks against web applications reasonably feasible for a correctly-positioned attacker. For example, a typical 32-byte session cookie could be retrieved after eavesdropping just over 8,000 HTTPS requests using SSL 3. This could be achieved by tricking the victim into visiting a specially crafted web page which uses JavaScript to send the necessary requests.
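The arithmetic behind that figure is straightforward, assuming (as above) an average of 256 requests per secret byte:

```python
# Expected number of attacker-forced HTTPS requests needed to recover a
# 32-byte session cookie, at an average of 256 requests per byte.
COOKIE_BYTES = 32
AVG_REQUESTS_PER_BYTE = 256

total_requests = COOKIE_BYTES * AVG_REQUESTS_PER_BYTE
print(total_requests)  # 8192, i.e. "just over 8,000" requests
```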

Use of SSL v3

Within the top 1,000 SSL sites, SSL 3 remained very widely supported yesterday, with 97% of SSL sites accepting an SSL 3 handshake. CitiBank and Bank of America both support SSL 3 exclusively and presumably are vulnerable.


A number of SSL sites have already reacted to this vulnerability by disabling support for SSL 3, including CloudFlare and LinkedIn. On Tuesday 14th, the most common configuration within the top 1,000 SSL sites was to support SSL 3.0 all the way through to TLS 1.2, with almost two-thirds of popular sites taking this approach. One day later, this remains the most popular configuration; however, TLS 1.0 is now the minimum version for 11%.

Microsoft Internet Explorer 6 does not support TLS 1.0 or greater by default and may be the most notable victim of disabling SSL 3 internet-wide. Now 13 years old, IE6 was released in 2001 as the default browser in Windows XP and later shipped with Windows Server 2003; it will remain supported on Windows Server 2003 until July 2015. Despite its age and the end of Microsoft's support for Windows XP, IE6 remains popular, accounting for more than 3.8% of web visits worldwide, and 12.5% in China. This vulnerability may ring the death knell for IE6 and Windows XP.

However, unless SSL 3 is completely disabled on the server side, a client supporting SSL 3 may still be vulnerable even if the server supports more recent versions of TLS. An attacker can take advantage of browser fallback behaviour to force otherwise secure connections to use SSL 3 in place of TLS version 1 or above.

SSL version negotiation

At the start of an SSL connection, the server and client mutually agree upon a version of SSL/TLS to use for the remainder of the connection. The client's first message to the server includes its maximum supported version of the protocol; the server then compares the client's maximum version against its own to pick the highest mutually supported version.
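Since protocol versions are (major, minor) pairs on the wire, the negotiation reduces to taking a minimum. The sketch below models this; the names are illustrative, not from any real library:

```python
# SSL/TLS versions are (major, minor) pairs, which conveniently compare in
# the right order: SSL 3.0 < TLS 1.0 < TLS 1.1 < TLS 1.2.
SSL_3_0, TLS_1_0, TLS_1_1, TLS_1_2 = (3, 0), (3, 1), (3, 2), (3, 3)

def negotiate(client_max, server_max):
    # The server picks the highest version supported by both sides.
    return min(client_max, server_max)

# A TLS 1.2 client connecting to a server that tops out at TLS 1.0:
print(negotiate(TLS_1_2, TLS_1_0) == TLS_1_0)  # True
```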

While this mechanism protects against version downgrade attacks in theory, most browsers have an additional fallback mechanism that retries a connection attempt with successively lower version numbers until it succeeds in negotiating a connection or reaches the lowest acceptable version. This fallback has proven necessary for practical interoperability with some TLS servers and corporate man-in-the-middle devices which, rather than gracefully downgrading when presented with an unsupported version of TLS, terminate the connection prematurely.

An attacker with appropriate network access can exploit this behaviour to force a TLS connection to be downgraded by forging Handshake Alert messages. The browser will take the Handshake Alert message as a signal that the remote server (or some intermediate device) has version negotiation bugs and the browser will retry the connection with a lower maximum version in the initial Client Hello message.
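The downgrade dance can be sketched like this; a hypothetical attempt_handshake callback stands in for the browser's real connection logic:

```python
# Sketch of browser fallback behaviour under an active attacker. Forging a
# Handshake Alert for every version above SSL 3 makes each higher attempt
# appear to "fail", walking the browser down the list to SSL 3.0.

FALLBACK_ORDER = ["TLS 1.2", "TLS 1.1", "TLS 1.0", "SSL 3.0"]

def connect_with_fallback(attempt_handshake):
    for version in FALLBACK_ORDER:
        if attempt_handshake(version):  # True = handshake completed
            return version
    return None  # no acceptable version at all

# The attacked network: forge an alert for anything newer than SSL 3.
attacked_network = lambda version: version == "SSL 3.0"
print(connect_with_fallback(attacked_network))  # SSL 3.0
```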


Operation of a forced downgrade to SSL 3 against a modern browser.

The fallback mechanism was previously not considered a security issue, as it never results in the use of a protocol version that either the client or the server has disabled. However, clients that have not yet been updated to disable support for SSL 3 are relying on servers to have disabled it. What remains is a chicken-and-egg problem: modern clients retain support for SSL 3 for the sake of legacy servers, and modern servers retain support for SSL 3 for the sake of legacy clients.

There is, however, a proposed solution in the form of an indicator, a signalling cipher suite value (SCSV), included in fallback connections to inform compatible servers that the connection is a fallback, so that they can reject it unless a fallback was expected. Google Chrome and Google's web sites already support this SCSV indicator.
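On the server side, the proposed check amounts to the following sketch. The cipher suite value 0x5600 is taken from the TLS_FALLBACK_SCSV proposal; the function and parameter names are illustrative:

```python
# Sketch of the server-side TLS_FALLBACK_SCSV check. A compatible client
# adds this dummy cipher suite value to any connection that is a fallback
# retry; a compatible server then refuses the handshake if it could have
# negotiated a higher version, since a well-behaved peer talking to this
# server would not have needed to fall back.

TLS_FALLBACK_SCSV = 0x5600

def accept_handshake(client_version, client_cipher_suites, server_max_version):
    if TLS_FALLBACK_SCSV in client_cipher_suites and client_version < server_max_version:
        return False  # inappropriate fallback: likely an active downgrade
    return True

# A forced fallback to SSL 3.0 against a server that speaks TLS 1.2:
print(accept_handshake((3, 0), {TLS_FALLBACK_SCSV}, (3, 3)))  # False
```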


Browser      Fallback sequence
Firefox 32   TLS 1.2 → TLS 1.1 → TLS 1.0 → SSL 3.0
Chrome 40    TLS 1.2 (×3) → TLS 1.1 → TLS 1.0 → SSL 3.0
IE 11        TLS 1.2 → TLS 1.0 → SSL 3.0
Opera 25     TLS 1.2 (×3) → TLS 1.1 → TLS 1.0 → SSL 3.0
Safari 7.1   TLS 1.2 → TLS 1.0 → SSL 3.0

Comparison of browser fallback behaviour

We tested five major browsers with an attack based on the forged Handshake Alert method outlined above, and found that each browser has a variant of this fallback behaviour. Both Chrome and Opera try TLS 1.2 three times before trying to downgrade the maximum supported version, whereas the remainder immediately started downgrading. Curiously, Internet Explorer and Safari both skip TLS 1.1 and jump straight from TLS 1.2 to TLS 1.0.

Mitigation

Mitigation can take many forms: the fallback SCSV, disabling fallback to SSL 3, disabling SSL 3 on the client side, disabling SSL 3 on the server side, and disabling CBC cipher suites in SSL 3. Each solution has its own problems, but the current trend is to disable SSL 3 entirely.

Disabling only the CBC cipher suites in SSL 3 leaves system administrators with a dilemma: RC4 is the only other practical choice and it has its fair share of problems making it an undesirable alternative. The SCSV requires support from both clients and servers, so may take some time before it is widely deployed enough to mitigate this vulnerability; it will also likely not be applied to legacy browsers such as IE 6.

Apache httpd can be configured to disable SSL 3 as follows:

SSLProtocol +TLSv1 +TLSv1.1 +TLSv1.2 -SSLv2 -SSLv3

Microsoft IIS and nginx can also be configured to avoid negotiating SSL version 3.
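For example, nginx's equivalent setting (shown here for a server block, assuming a build with SSL support) is:

```nginx
# Negotiate only TLS; SSL 3 (and SSL 2) are excluded by omission.
ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
```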

Firefox can be configured to disable support for SSL 3 by altering security.tls.version.min from 0 (SSL 3) to 1 (TLS 1) in about:config.


Internet Explorer can also be configured to disable support for SSL 3 using the Advanced tab in the Internet Options dialogue (found in the Control Panel). In the same dialogue, IE 6 users can enable support for TLS 1.0.


Chrome can be configured to not use SSL 3 using a command line flag, --ssl-version-min=tls1.

Site Report

You can check which SSL sites are still using SSL 3 using the Netcraft Site Report.
October 2014 Web Server Survey

In the October 2014 survey we received responses from 1,028,932,208 sites, which is nearly six million more than last month.

Apache regains the lead

Microsoft lost the lead to Apache this month, as the two giants continue to battle closely for the largest share of all websites. Apache gained nearly 30 million sites, while Microsoft lost 22 million, causing Apache to be thrust back into the lead by more than 36 million sites. In total, 385 million sites are now powered by Apache, giving it a 37.45% share of the market.

A significant contributor to this change was the expiry of domains previously used for link farming on Microsoft IIS servers. The domains used by these link farms were acquired and the sites are now hosted on Apache servers at Confluence-Networks, which display Network Solutions parking notices.

A new release in the Apache 2.2 legacy branch was announced on 3 September. Apache 2.2.29 also incorporates many changes — including several security fixes — from version 2.2.28, which was never officially released. New stable and mainline versions of nginx were also released during September, including a fix for an SSL session reuse vulnerability, plus several other bugfixes.

Top million sites

The million busiest websites now represent less than 0.1% of all websites in the survey, but provide an insight into the preferences amongst the sites which are responsible for the great majority of today's web traffic.

Just over half (50.2%) of the top million sites use Apache, which is very similar to its share amongst all active sites; however, nginx's market share is skewed noticeably higher amongst the top million sites, where it powers 20.3% of sites, compared with only 14.3% of all active sites.

Computer growth

The most stable metric is the market share of web-facing computers — hundreds of thousands of websites can easily be served from a single computer (and subsequently disappear all in one go) but it is obviously far less trivial and less desirable to deploy or decommission a significant number of computers. Netcraft's survey is also able to identify distinct computers which use multiple web-facing IP addresses, which adds further stability.

Apache leads in this market with a 47.5% share, and Microsoft also performs well with 30.7%, but both have been gradually falling over the past few years as a result of nginx's strong growth. nginx gained more than 17,000 additional web-facing computers this month, helping to bring its market share up to 10.3%.

New top level domains

The relatively new .xyz domain, which showed tremendous growth over the past couple of months, has started to flatten out slightly after gaining only 33,000 sites this month (+8%). Nonetheless, this is still quite a healthy gain, albeit notably less than last month's growth of 177,000 hostnames which then boosted its total by 78%.

Other promising TLDs include .london, .hamburg and .公司, each of which had fewer than 50 sites in last month's survey, but now have 17,000, 11,000 and 10,000 sites respectively.

The internationalised .公司 (.xn--55qx5d) TLD is delegated to the Computer Network Information Center of Chinese Academy of Sciences. It means "company", making it the Chinese equivalent of .com.

Total number of websites

Web server market share

Developer    September 2014    Percent    October 2014    Percent    Change
Apache       355,925,985       34.79%     385,354,994     37.45%     2.66
Microsoft    371,406,909       36.31%     345,485,419     33.58%     -2.73
nginx        144,717,670       14.15%     148,330,190     14.42%     0.27
Google       19,499,154        1.91%      19,431,026      1.89%      -0.02

Phishing with data: URIs

A recent spate of phishing attacks has taken to using the data URI scheme for evil. Supported in most browsers, these special URIs allow the content of a phishing page to be contained entirely within the URI itself, effectively eliminating the need to host the page on a remote web server and adding an additional layer of indirection.

One of these attacks is demonstrated below, where a phishing campaign was used to herd victims to a compromised site in the US, which then redirected them to a Base64-encoded data URI. This particular example impersonates Google Docs in an attempt to steal email addresses and passwords from Yahoo, Gmail, Hotmail, and AOL customers.
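The mechanism itself is trivial to reproduce; the sketch below builds a data URI for a harmless page to show how the encoding works:

```python
# Constructing a Base64-encoded data: URI of the kind used by these
# phishing pages. The page content here is harmless and purely illustrative.
import base64

html = b"<html><body><h1>Hello</h1></body></html>"
uri = "data:text/html;base64," + base64.b64encode(html).decode("ascii")
print(uri)
# data:text/html;base64,PGh0bWw+PGJvZHk+PGgxPkhlbGxvPC9oMT48L2JvZHk+PC9odG1sPg==
```

Pasting such a URI into a supporting browser renders the embedded page directly, with no web server involved.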

Google Docs phishing site using data: URI

All of the attacks use Base64-encoded data URIs, rather than human-readable plain text, making it harder for people, simple firewalls and other content filters to detect the malicious content.

Most phishing sites are hosted on compromised websites, though some use purpose-bought domain names and bulletproof hosting packages paid for fraudulently. However, fraudsters can take advantage of open redirect vulnerabilities to "host" these malicious data URIs without the need for conventional web hosting.

This situation is ideal for scenarios such as malware delivery and social engineering attacks where no subsequent client-server interaction is required, but phishing sites still need some way of transmitting their victim's credentials to the fraudster. Most phishing attacks that use data URIs resort to the traditional method of transmitting stolen credentials, i.e. POSTing them to a script on a remote web server. However, with no obvious phishing content being hosted on the remote web server, such scripts could be more difficult for third parties to take down; and as long as they remain functional, each one can continue to be used by any number of data URI attacks.

Another interesting example which impersonated an eBay login page is shown below. If a victim is unfortunate enough to fall for this particular phishing attack, his credentials will be transmitted to a PHP script hosted on a compromised web server in Germany.

eBay phishing site using a data: URI

This demonstrates an interesting deficiency in Google Chrome: If the data URI is longer than 100,000 characters, then none of the Base64-encoded data within the URI will be displayed in the address bar. Rather than truncating the URI, Chrome's address bar will only display the string "data:".

This behaviour could make it more difficult for wary victims to report such attacks. Although the victim is viewing an eBay phishing page, if he tries to copy the URI from the address bar in Chrome, the clipboard will still only contain the string "data:".

The Netcraft Extension provides protection against the redirects used in the phishing attacks above, and Netcraft's open redirect detection service can be used to identify website vulnerabilities which would allow fraudsters to easily redirect victims to similar phishing content.

Most Reliable Hosting Company Sites in September 2014

Rank  Company                      OS       Outage    Failed  DNS    Connect  First  Total
                                            hh:mm:ss  Req%                   byte
1     Qube Managed Services        Linux    0:00:00   0.004   0.086  0.023   0.046  0.046
2     GoDaddy.com Inc              Linux    0:00:00   0.013   0.149  0.012   0.200  0.205
3     Memset                       Linux    0:00:00   0.013   0.111  0.055   0.132  0.217
4     www.dinahosting.com          Linux    0:00:00   0.013   0.242  0.080   0.159  0.159
5     Swishmail                    FreeBSD  0:00:00   0.022   0.124  0.073   0.144  0.186
6     ServerStack                  Linux    0:00:00   0.022   0.081  0.076   0.151  0.151
7     Datapipe                     FreeBSD  0:00:00   0.030   0.102  0.016   0.032  0.048
8     EveryCity                    SmartOS  0:00:00   0.030   0.083  0.054   0.107  0.107
9     Logicworks                   Linux    0:00:00   0.030   0.143  0.073   0.152  0.340
10    Pair Networks                FreeBSD  0:00:00   0.030   0.219  0.082   0.166  0.579


Qube had the most reliable company site in September with only a single failed request. This is the fourth time this year that Qube has made it to first place, nudging ahead of Datapipe's track record this year. Qube offers a Hybrid cloud service, where physical servers and equipment are integrated with its cloud hosting with a secure connection between the two networks.

The second most reliable hosting company site belonged to GoDaddy, the world's largest domain registrar, with only 3 failed requests in September. Memset and dinahosting also had only 3 failed requests, so the three were ranked by their average connection times.

In third place is Memset. Memset was last ranked in the top 10 in June 2013 when it achieved 9th place with 6 failed requests. Memset offers its customers a Perimeter Patrol service, which involves regular scanning of Memset servers to highlight security vulnerabilities.

Linux was still the most popular operating system, used by 6 of the top 10, followed by FreeBSD, which was used by 3. EveryCity, however, uses SmartOS, a community fork of OpenSolaris geared towards cloud hosting using KVM virtualisation.

Netcraft measures and makes available the response times of around forty leading hosting providers' sites. The performance measurements are made at fifteen minute intervals from separate points around the internet, and averages are calculated over the immediately preceding 24 hour period.

From a customer's point of view, the percentage of failed requests is more pertinent than outages on hosting companies' own sites, as it gives a pointer to the reliability of routing; this is why we rank our table by fewest failed requests rather than shortest periods of outage. When two sites have the same number of failed requests, they are ranked by average connection time.

Information on the measurement process and current measurements is available.

September 2014 Web Server Survey

In the September 2014 survey we received responses from 1,022,954,603 sites — nearly 31 million more than last month.

More than a billion websites

This is the first time the survey has exceeded a billion websites, a milestone achievement that was unimaginable two decades ago.

Netcraft's first ever survey was carried out over 19 years ago in August 1995. That survey found only 18,957 sites, although the first significant milestone of one million sites was reached in less than two years, by April 1997.

Fuelled by the dot-com bubble between 1997 and 2000, the survey reached nearly 10 million sites by the start of 2000. The active sites metric was added to our survey shortly afterwards, immediately showing that a significant proportion of websites were automatically generated, displaying identical tag structures, and used for activities such as holding pages, typo-squatting advertising providers, speculative domain registrants, and search-engine optimisation companies.

Rapid hostname growth has continued ever since, with the number of active sites increasing at a far gentler rate. Just under half of the hostnames in our June 2000 survey were active sites, whereas today, less than one in five are active — 178 million active sites in total.

Microsoft, Apache, and nginx

Microsoft and Apache currently take the lion's share of the web server market (just over 71% combined), with Microsoft having edged into the lead for the first time in July 2014. nginx has been steadily gaining share over the last 7 years and is now used to serve just over 14% of all hostnames.

The view by number of active sites is very different, however. While Microsoft has seen a rapid growth in their hostname market share of around 20 percentage points since September 2011, there has been almost no change in their share of the active sites in this time. Nginx overtook Microsoft in terms of active sites in 2012, and today has a market share of 14.5% – more than 2 points ahead of Microsoft, whose web server software is used by only 11.9% of active sites. However, Apache truly dominates this market, with more than half of all active sites choosing to use Apache software.

Recently nginx has seen even greater gains in terms of web-facing computers, doubling its market share in the last 2 years to just over 10% this month. Apache and Microsoft are continuing to gain web-facing computers, but their growth is often far smaller than that of nginx: this month they gained just 323 and 414 computers respectively, compared with an increase of more than 17,000 for nginx.

New top level domains

Dozens of new TLDs were added to the Root Zone during this month's survey, including .deals, .healthcare, .realtor, .auction, .yandex, .city and .lgbt. Recent additions which have now started to experience growth in the survey include .media, .services, .reisen, .pictures, .exchange and .toys. Each of these TLDs had only two or three sites last month, but all are now in their thousands.

The .xyz domain, which we mentioned last month, has outpaced all of the other new gTLDs after a Network Solutions promotion offering a free matching .xyz domain with each .com domain purchased. This month an additional 177,000 hostnames were found under this TLD, bringing the total number of .xyz sites up by 78% to 403,000. Even faster growth was seen among the .中国 (xn--fiqs8s) internationalised domain name for China, which grew by 181% to a total of 73,000 sites.

Total number of websites

Web server market share

Developer    August 2014    Percent    September 2014    Percent    Change
Microsoft    367,805,416    37.07%     371,406,909       36.31%     -0.76
Apache       346,702,990    34.94%     355,925,985       34.79%     -0.15
nginx        135,037,738    13.61%     144,717,670       14.15%     0.54
Google       20,076,890     2.02%      19,499,154        1.91%      -0.12

Most Reliable Hosting Company Sites in August 2014

Rank  Company                      OS                   Outage    Failed  DNS    Connect  First  Total
                                                        hh:mm:ss  Req%                   byte
1     EveryCity                    SmartOS              0:00:00   0.004   0.081  0.054   0.108  0.108
2     Hyve Managed Hosting         Linux                0:00:00   0.008   0.187  0.052   0.103  0.105
3     XILO Communications Ltd.     Linux                0:00:00   0.008   0.164  0.055   0.110  0.185
4     krystal.co.uk                Linux                0:00:00   0.008   0.103  0.057   0.130  0.130
5     Webair Internet Development  Linux                0:00:00   0.008   0.162  0.069   0.137  0.241
6     Server Intellect             Windows Server 2012  0:00:00   0.008   0.063  0.125   0.255  0.634
7     Qube Managed Services        Linux                0:00:00   0.013   0.073  0.021   0.043  0.043
8     Bigstep                      Linux                0:00:00   0.013   0.237  0.059   0.115  0.115
9     Host Europe                  Linux                0:00:00   0.013   0.125  0.062   0.149  0.152
10    ServerStack                  Linux                0:00:00   0.013   0.072  0.073   0.145  0.145


EveryCity had the most reliable hosting company site in August, with only one failed request. EveryCity has been in business for more than seven years, during which time it has hosted websites for many global brands, including Disney, Ikea, Lego, MTV, Skype, SoundCloud and Thomson Reuters.

Although EveryCity's site has only been monitored by Netcraft since April, it has attained 100% uptime ever since, and also ranked as the third most reliable hosting company site in both May and July. EveryCity uses the SmartOS operating system extensively, exploiting its combination of OpenSolaris and Linux KVM virtualisation technology.

With two failed requests (but a slightly faster average connection time), Hyve Managed Hosting had the second most reliable hosting company site in August. Hyve also ranked second last month, and has made it into the top ten a total of six times so far this year.

Hyve is the UK's first enterprise VMware cloud hosting provider, and its primary data centre is an enhanced tier III facility in London. This data centre can store nearly half a million litres of diesel to support 50 hours of running at full capacity in the event of a power outage. Hyve's other data centres are in New Jersey, California, Hong Kong and Shanghai.

XILO Communications came third in August, with two failed requests and an average connection time slightly longer than that of both EveryCity and Hyve. Its uptime over the past two years is 99.996%. krystal.co.uk, Webair Internet Development and Server Intellect also had two failed requests, but with longer average connection times than XILO.

Eight of August's top ten most reliable hosting company sites were served from Linux computers, while Server Intellect used Windows Server 2012 and EveryCity used SmartOS, which is a community fork of OpenSolaris designed specifically for cloud computing.
