1. NIST continues using SHA-1 algorithm after banning it

    The National Institute of Standards and Technology (NIST) is still using SSL certificates signed with the SHA-1 signature algorithm, despite issuing a Special Publication disallowing the use of this algorithm for digital signature generation after 2013.

    "SHA-1 shall not be used for digital signature generation after December 31, 2013."
    — NIST recommendation

    The SSL certificate for www.nist.gov is signed using the SHA-1 hashing algorithm, and was issued by VeriSign on 23 January 2014, more than three weeks after NIST's own ban came into effect. Also issued this year, NIST's "Secure File Transfer Service" at xnfiles.nist.gov uses a SHA-1 certificate.

    An attacker able to find SHA-1 collisions could carefully construct a pair of certificates with colliding SHA-1 hashes: one a conventional certificate to be signed by a trusted CA, the other a sub-CA certificate able to be used to sign arbitrary SSL certificates. By substituting the signature from the CA-signed certificate into the sub-CA certificate, certificate chains containing the attacker-controlled sub-CA certificate will pass browser verification checks. This attack is, however, made more difficult by path constraints and the inclusion of unpredictable data into the certificate before signing it.

    The increasing practicality of finding SHA-1 hash collisions could make it possible for a well-funded attacker to impersonate any HTTPS website. With a practical attack against SHA-1 (using cloud computing resources) estimated to cost $2.77M in 2012 and falling to $700k by 2015, it may soon be within the budget of government agencies.

    The SSL certificate for www.nist.gov with the signature algorithm and issuance date highlighted.

    Along with NIST itself, many US Government institutions have continued to generate new SSL certificates with SHA-1 signatures. Examples include the certificate for donogc.navy.mil, issued on 3 January 2014, and several United States Bankruptcy Court document filing systems (each state has its own site and uses its own SHA-1-signed SSL certificate). Despite receiving widespread criticism for a number of other security problems last year, the ObamaCare exchange at healthcare.gov also saw fit to deploy a new SSL certificate in January which uses the SHA-1 hashing algorithm.

    NIST and the rest of the US government are far from alone, however, as more than 92% of all certificates issued this year use the SHA-1 hashing algorithm.

    Although the recommendations from the National Institute of Standards and Technology have been prepared for US federal agencies, the cryptographic weaknesses of SHA-1 should concern anyone who relies on SHA-1 to generate or validate digital signatures. Microsoft shares these concerns and has announced plans to deprecate the use of SHA-1 in both SSL and code signing certificates by the end of 2016.

    The NSA-designed SHA-2 family (which includes SHA-224, SHA-256, SHA-384 and SHA-512) now provides the only cryptographic hash functions approved by NIST for digital signature generation. Whilst SHA-2 shares some similarities with SHA-1, there are significant structural differences: SHA-2 does not share SHA-1's known mathematical weaknesses. All of the SHA-2 algorithms also have much longer digests: SHA-1 produces only a 160-bit digest, whereas the most common digest length for SHA-2 is 256 bits.
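The digest lengths mentioned above can be confirmed directly with Python's standard hashlib module, which supports both SHA-1 and the SHA-2 family:

```python
import hashlib

# Digest lengths (in bits) of SHA-1 and the SHA-2 family, as reported
# by the standard hashlib module.
digest_bits = {
    name: hashlib.new(name).digest_size * 8
    for name in ("sha1", "sha224", "sha256", "sha384", "sha512")
}
print(digest_bits)  # SHA-1 is 160 bits; SHA-256 is 256 bits
```
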


    Huge divide: SHA-256 uptake remains low, and is still only used by a handful of certificate authorities.
    Other signature algorithms with negligible shares (e.g. MD5 and SHA-512) are not displayed.

    In total, more than 98% of all SSL certificates in use on the Web are still using SHA-1 signatures. Netcraft's February 2014 SSL Survey found more than 256,000 of these certificates would otherwise be valid beyond the start of 2017 and, due to the planned deprecation of SHA-1, will need to be replaced before their natural expiry dates.

    SHA-256 is the most commonly used signature algorithm from the SHA-2 family, but it is used by only 1.86% of the valid certificates in Netcraft's February 2014 SSL Survey; nonetheless, this share has more than doubled in the space of 4 months, suggesting that some certificate authorities are starting to take the issue seriously. So far in 2014, more than 61% of the new certificates signed with SHA-256 were issued by a single certificate authority, Go Daddy. SHA-512 is the only other SHA-2 family algorithm to be seen used in SSL certificates, albeit deployed on only 4 websites so far.

    The CA/B Forum – which comprises both certificate authorities and web browser vendors – put forward Ballot 111 last year, which proposes to take advantage of the deprecation of SHA-1 by accelerating the forum's planned move to shorter maximum certificate lifetimes. The deprecation alone will mean that some five-year certificates that are valid today will not be usable for their entire lifetimes.

    In practice, it is likely to be Microsoft's plans to deprecate the use of SHA-1 signatures by the end of 2016 which will force the mass adoption of SHA-2 by certificate authorities, although allowing three years for this to happen might seem generous. The majority of end users are unlikely to be affected by the change, and most website administrators will probably have to renew their SSL certificates within this period anyway, but certificates which are reissued with SHA-1 signatures run the risk of being unsupported by other browsers in the future.

    Cryptographic weaknesses in SHA-1 have been discussed for several years. A notable better-than-brute-force attack was announced nine years ago, demonstrating that a SHA-1 hash collision could be achieved within 2^69 calculations, as opposed to the 2^80 that would be required by a brute-force approach.
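The gap between those two figures comes from the birthday paradox: colliding a d-bit digest takes roughly 2^(d/2) attempts, not 2^d. A toy demonstration (not an attack on real SHA-1, which has a 160-bit digest) makes this concrete by brute-forcing a collision on a SHA-1 digest truncated to 24 bits, which falls out after only a few thousand hashes:

```python
import hashlib
from itertools import count

def truncated_sha1(data: bytes, nbytes: int = 3) -> bytes:
    # Keep only the first `nbytes` bytes of the 20-byte SHA-1 digest.
    return hashlib.sha1(data).digest()[:nbytes]

def find_collision(nbytes: int = 3):
    # Birthday search: for a d-bit digest a collision is expected after
    # roughly 2**(d/2) attempts (about 4096 here for d=24), far fewer
    # than the 2**d attempts a naive brute force would suggest.
    seen = {}
    for i in count():
        msg = b"msg-%d" % i
        digest = truncated_sha1(msg, nbytes)
        if digest in seen:
            return seen[digest], msg
        seen[digest] = msg

a, b = find_collision()
assert a != b and truncated_sha1(a) == truncated_sha1(b)
```

The same square-root scaling is why cryptanalytic shortcuts that shave the collision cost from 2^80 down towards 2^61 matter so much in practice.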

    More recently, the best public cryptanalysis of SHA-1 estimated that a full collision can be achieved with a complexity of around 2^61, while a near-collision can be achieved in 2^57.5. These attacks have been implemented in the HashClash framework, which provides differential path construction attacks against SHA-1, as well as chosen-prefix collisions against the even-weaker MD5 algorithm. The CA/B Forum recommends that all certificate serial numbers should exhibit at least 20 bits of entropy, which would mitigate chosen-prefix collision attacks against non-collision-resistant hash functions, although such measures are not thought to be necessary for SHA-2 at the current time.
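The entropy requirement works because a chosen-prefix attacker must predict the exact bytes the CA will sign; unpredictable serial numbers spoil that prediction. A minimal sketch of how a CA might generate such serials, using Python's secrets module (the helper name and the 64-bit width are illustrative choices, comfortably above the 20-bit minimum mentioned above):

```python
import secrets

def random_serial(entropy_bits: int = 64) -> int:
    # Hypothetical helper: draw a serial number from a CSPRNG so an
    # attacker cannot predict the certificate bytes before signing.
    # Setting the low bit guarantees the serial is never zero.
    return secrets.randbits(entropy_bits) | 1

serials = {random_serial() for _ in range(1000)}
```

With 64 bits of entropy, accidental collisions between serials are negligible even across millions of issued certificates.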

    Windows XP has supported SHA-256, SHA-384 and SHA-512 since the release of Service Pack 3 in 2008, and Windows Server 2003 can also support SHA-2 if the KB938397 hotfix has been installed. Deprecating SHA-1 could therefore also have some other indirect security benefits: anyone still using Windows XP before Service Pack 3 will be unable to make effective use of the web as SHA-2 certificates gain prominence.

    The SHA-1 algorithm is also used in all versions of the TLS cryptographic protocol, and only the latest version (TLS 1.2) introduces SHA-256 as a replacement for the MD5/SHA-1 combination for both the pseudorandom function and the finished message hash. Microsoft's SHA-1 deprecation policy will only apply to applications which call the CertGetCertificateChain API to build and validate a certificate chain, so older browsers and hardware devices which do not yet support TLS 1.2 will be unaffected.
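The SHA-256-based pseudorandom function that TLS 1.2 introduced is simple enough to sketch from RFC 5246 section 5: key material is produced by iterating HMAC-SHA256 over the secret and seed. This is a from-the-spec sketch (the input byte strings are placeholders, not real handshake values):

```python
import hmac
import hashlib

def p_sha256(secret: bytes, seed: bytes, length: int) -> bytes:
    # P_SHA256 data-expansion function from RFC 5246 section 5.
    out = b""
    a = seed  # A(0) = seed
    while len(out) < length:
        a = hmac.new(secret, a, hashlib.sha256).digest()  # A(i) = HMAC(secret, A(i-1))
        out += hmac.new(secret, a + seed, hashlib.sha256).digest()
    return out[:length]

def tls12_prf(secret: bytes, label: bytes, seed: bytes, length: int) -> bytes:
    # In TLS 1.2 a single SHA-256-based expansion replaces the
    # MD5/SHA-1 combination used by earlier protocol versions.
    return p_sha256(secret, label + seed, length)

key_block = tls12_prf(b"example master secret", b"key expansion",
                      b"server_random||client_random", 40)
```

Earlier TLS versions hard-code the MD5/SHA-1 construction, which is why deploying SHA-256 certificates does not require TLS 1.2, but removing SHA-1 from the protocol itself does.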

    Update 5 Feb 2014: Following the publication of this article, NIST today replaced the SHA-1 certificate on www.nist.gov with a new one which uses SHA-256 as a signature algorithm. At the time of writing, the certificate used by xnfiles.nist.gov is still signed with SHA-1.

    Posted by Paul Mutton on 4th February, 2014 in Dogfood, Security

  2. February 2014 Web Server Survey

    In the February 2014 survey we received responses from 920,102,079 sites — over 58 million more than last month.

    Microsoft gained a staggering 48 million sites this month, increasing its total by 19% — most of this growth is attributable to new sites hosted by Nobis Technology Group. Along with Microsoft, nginx also made a large gain of 14 million sites, whereas Apache fell by 7 million. Unsurprisingly, these changes have had a dramatic effect on the overall market share of each web server vendor, with Microsoft's share growing by 3.38 percentage points to 32.8% (302 million sites) while Apache's has fallen by 3.41 to 38.2% (352 million sites).
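The quoted market shares follow directly from the survey totals; recomputing them from the figures in this article (no new measurements) shows how the percentages are derived:

```python
# Figures quoted in the February 2014 survey above.
total_feb = 920_102_079
microsoft_feb = 301_781_997
apache_feb = 351_700_572

microsoft_share = 100 * microsoft_feb / total_feb  # ~32.8%
apache_share = 100 * apache_feb / total_feb        # ~38.2%
print(round(microsoft_share, 2), round(apache_share, 2))
```
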

    Microsoft's market share is now only 5.4 percentage points lower than Apache's, which is the closest it has ever been. If recent trends continue, Microsoft could overtake Apache within the next few months, ending Apache's 17+ year reign as the most common web server. Apache is faring much better in both the active sites and top million sites datasets, however, where it is still dominating with just over half of the market share in both metrics.

    Nearly 2% of the top million websites are now being served by CloudFlare's customised version of nginx (cloudflare-nginx), which it uses to serve web content via its globally distributed CDN edge nodes. This month's survey saw more than a thousand of the top million sites migrate to cloudflare-nginx from other web server software, including pizzahut.co.uk, pet-supermarket.co.uk, the image server used by the popular Cheezburger network of blogs, and the official PRINCE2 website which switched from Microsoft IIS 6.0 running on Windows Server 2003.

    Overall, nginx powers 17.5% of the top million sites, including popular overclocking forum www.overclock.net, despite its server headers declaring that it is now using Microsoft IIS 4.1. Responses from the server also include an X-Powered-By header which claims the application is running on Visual Basic 2.0 on Rails (Visual Basic 2.0 is a long-deprecated language which was released more than 20 years ago, while Microsoft IIS 4.1 never actually existed). The server claimed to be running nginx during Netcraft's previous survey, and indeed, it exhibits characteristics which suggest it is still using nginx.

    The number of sites using the .pw country-code top-level domain (ccTLD) grew by more than half this month, reaching 10M sites in total. This ccTLD is assigned to Palau, but the .pw registry has branded the domain as the Professional Web and allows domains to be registered by the general public, regardless of which country they are in. 97% of this month's new .pw sites are hosted in the US (87% at Nobis Technology Group alone), and 2.7 million of them are running on Windows.

    The busiest .pw domain is the single-letter u.pw, which is used by viral social media site Upworthy. Other than that, the ccTLD remains relatively obscure, with less than 0.02% of the top million sites using .pw domains, while other single-letter domains have been sold for the modest sums of $8,000 each. Only two single-letter .pw domains actually appear within the top million sites, and there are no two-letter domains at all. Last year, Symantec noted an increase in spam messages containing URLs with .pw TLDs, and .pw later became the first TLD to adopt the Uniform Rapid Suspension (URS) rights protection mechanism.

    Version 1.0 of the URS Technical Requirements were published by ICANN in October 2013, and are intended to make it faster and cheaper for trademark holders to seek resolution when there are obvious cases of infringement. URS is intended to complement rather than replace the existing Uniform Domain-Name Dispute-Resolution Policy (UDRP).





    Developer    January 2014   Percent   February 2014   Percent   Change
    Apache       358,669,012    41.64%    351,700,572     38.22%    -3.41
    Microsoft    253,438,493    29.42%    301,781,997     32.80%     3.38
    nginx        124,052,996    14.40%    138,056,444     15.00%     0.60
    Google       21,280,639      2.47%    21,129,509       2.30%    -0.17
    (more...)

    Posted by Netcraft on 3rd February, 2014 in Web Server Survey

  3. January 2014 Web Server Survey

    In the January 2014 survey we received responses from 861,379,152 sites, an increase of 355,935 since last month.

    2013 has been a year of significant change: the web has grown by more than one third, the importance of SSL has been highlighted by a series of spying revelations, Microsoft now power just below 30% of all web sites, and Apache has lost almost 14 percentage points of market share. Additionally, nginx, the relative newcomer, saw its market share peak at 16%, just shy of Microsoft’s position at the beginning of last year.

    The total number of web sites discovered has increased dramatically this year — from 630 million web sites in January 2013 to 861 million in January 2014 (+37%) — though the growth does not compare to the doubling in size during 2011.

    With the revelations from the NSA documents leaked by Edward Snowden providing months of mainstream publicity, 2013 has been a bumper year for the SSL industry. Websites are increasingly being served over HTTPS: 48% more sites within the million busiest are using SSL than in January 2013. In total, there are over half a million more SSL certificates (+22%) in use on the web since January 2013. The estimated total revenue of the industry has increased even more rapidly, by 28% from September 2012 to September 2013, reflecting the increased uptake of more expensive certificates including Extended Validation, multi-domain, and wildcard certificates.

    Apache remains the most commonly used web server on the internet: 10 million more web sites are using it than this time last year. However, this growth has not been sufficient to maintain its share of a market which grew by more than 200 million web sites, and as a result Apache’s market share has fallen by 14 percentage points since January 2013 and now stands at 42%.

    In stark contrast to Apache, Microsoft had a strong year — almost 150 million more web sites use a Microsoft web server than in January 2013. Microsoft’s share is close to 30% of the entire market and a combination of its strong growth and the corresponding lack of growth of sites using Apache has resulted in Apache’s lead shrinking by more than 26 percentage points to just 12. Microsoft's own cloud platform, Azure, has grown steadily throughout 2013 — there are 39% more web-facing computers hosted by Microsoft in January 2014 than the same time last year — and despite offering alternatives, Microsoft's IIS is by far the most common web server on Azure.

    Open-source web server nginx has continued to gain acceptance, especially amongst the busiest web sites. Nginx is now used on 14% of all web sites found, up 2 percentage points since January 2013, but has fallen slightly from the peak of 16% it achieved in October. In May 2013, nginx overtook Microsoft to become the second most common web server within the top million busiest sites and now powers almost 16% of them.





    Developer    December 2013   Percent   January 2014   Percent   Change
    Apache       355,244,900     41.26%    358,669,012    41.64%     0.38
    Microsoft    241,777,723     28.08%    253,438,493    29.42%     1.34
    nginx        126,485,204     14.69%    124,052,996    14.40%    -0.29
    Google       38,263,525       4.44%    21,280,639      2.47%    -1.97
    (more...)

    Posted by Netcraft on 3rd January, 2014 in Web Server Survey

  4. Most Reliable Hosting Company Sites in December 2013

    Rank  Company                      OS                   Outage (hh:mm:ss)  Failed Req %  DNS    Connect  First byte  Total
    1     Qube Managed Services        Linux                0:00:00            0.004         0.100  0.043    0.087       0.088
    2     Hosting 4 Less               Linux                0:00:00            0.004         0.171  0.124    0.245       0.627
    3     New York Internet            FreeBSD              0:00:00            0.007         0.140  0.074    0.148       0.577
    4     Pair Networks                FreeBSD              0:00:00            0.007         0.234  0.083    0.169       0.572
    5     www.dinahosting.com          Linux                0:00:00            0.007         0.245  0.087    0.177       0.177
    6     Webair Internet Development  FreeBSD              0:00:00            0.011         0.167  0.073    0.154       0.353
    7     Server Intellect             Windows Server 2012  0:00:00            0.011         0.073  0.101    0.330       0.701
    8     XILO Communications Ltd.     Linux                0:00:00            0.019         0.202  0.068    0.136       0.234
    9     Swishmail                    FreeBSD              0:00:00            0.019         0.126  0.074    0.146       0.192
    10    ServerStack                  Linux                0:00:00            0.019         0.088  0.076    0.151       0.152

    See full table

    Qube Managed Services had the most reliable hosting company site in December 2013, making December Qube's fourth consecutive month in the top ten after attaining first place in September. Qube provides managed hosting services out of data centers in London, New York and Zurich. Qube and second-placed Hosting 4 Less each saw only a single failed request over the month, with Hosting 4 Less losing out on the top spot due to its slightly longer average connection time (0.124s vs. Qube's 0.043s).

    Swishmail made an appearance at ninth in the table, taking its total number of appearances in the top ten in 2013 to nine. This means that both iWeb and Swishmail jointly hold the record for the most frequent appearances in the top ten in 2013. Swishmail provides corporate email services on a FreeBSD platform, while iWeb provides dedicated servers, managed hosting and colocation services on a Linux platform.

    To sign off the year, WebAir made its first appearance in the table for 2013, placing sixth. WebAir offers hosting services out of data centers in Los Angeles, Amsterdam, Montreal and a 'flagship' facility in New York (NY1), which WebAir claims offers 'the lowest network latency to Europe via direct feeds to Transatlantic fiber'.

    Five of the top ten most reliable hosting company sites were running Linux, while four ran FreeBSD. Server Intellect ran the single Windows Server-based site to complete the list: Windows Server makes a reappearance after not placing in the top ten in November.

    Netcraft measures and makes available the response times of around forty leading hosting providers' sites. The performance measurements are made at fifteen minute intervals from separate points around the internet, and averages are calculated over the immediately preceding 24 hour period.

    From a customer's point of view, the percentage of failed requests is more pertinent than outages on hosting companies' own sites, as it gives a pointer to the reliability of routing; this is why we choose to rank our table by fewest failed requests rather than shortest periods of outage. In the event that the numbers of failed requests are equal, sites are ranked by average connection time.
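The two-level ranking rule described above amounts to a lexicographic sort. A minimal sketch, using the first three rows of the December table as sample data:

```python
# Each entry: (company, failed request %, average connect time in seconds).
# Figures are taken from the December 2013 table above.
sites = [
    ("Hosting 4 Less", 0.004, 0.124),
    ("New York Internet", 0.007, 0.074),
    ("Qube Managed Services", 0.004, 0.043),
]

# Rank by fewest failed requests first, breaking ties on connection time.
ranked = sorted(sites, key=lambda s: (s[1], s[2]))
print([name for name, _, _ in ranked])
```

Sorting on a tuple key gives exactly the tie-breaking behaviour described: connection time only matters when failed-request percentages match.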

    Information on the measurement process and current measurements is available.

    Posted by Netcraft on 1st January, 2014 in Hosting, Performance

  5. DigitalOcean now growing faster than Amazon

    Cloud computing provider DigitalOcean is now growing faster than Amazon Web Services. Our December 2013 Web Server Survey showed a month-on-month gain of 6,514 web-facing computers at DigitalOcean; Amazon, meanwhile, grew by almost as many: 6,269 web-facing computers. Together, the two companies accounted for more than a third of the internet-wide growth in web-facing computers in December.

    DigitalOcean is now the 15th largest hosting company in terms of web-facing computers — a remarkable feat considering DigitalOcean had only 280 web-facing computers at the start of this year. Although Amazon is still the largest hosting company (by web-facing computers) and has nearly six times as many web-facing computers in total, the rapid growth at DigitalOcean may have startled those at Amazon who thought their major competitors were Microsoft Azure and Rackspace.

    Whilst DigitalOcean competes directly with Amazon EC2, there are a number of Amazon Web Services which do not have a DigitalOcean equivalent. For example, Amazon offer file storage (S3), load balancing (Elastic Load Balancing), and a Content Delivery Network (CloudFront). However, by simplifying their offering — lack of support for Microsoft Windows is notable — and not offering enterprise features, DigitalOcean appeals to users with straightforward requirements such as small businesses and developers.

    The cheapest virtual computer ("droplet") at DigitalOcean uses solid state storage and costs less than one cent per hour, about a third of the price of Amazon's cheapest on-demand instance. Unsurprisingly, such competitive pricing is attracting a large number of completely new customers as well as enticing other hosting companies' customers to switch to DigitalOcean.

    Sites migrating from Amazon to DigitalOcean

    818 existing websites migrated from Amazon to DigitalOcean this month, whereas only 88 sites moved in the opposite direction. Although the difference seems significant, the largest gains at DigitalOcean actually consist of new sites: DigitalOcean is currently hosting 490,000 websites, 120,000 of which were not present in last month's survey as per the Netcraft December 2013 Hosting Provider Switching Analysis.

    Sites which migrated from Amazon to DigitalOcean include text messaging service Phonify (plus its API at api.phonify.io), several Windshield Guru sites, and real-time crowd photo sharing site zingly.

    DigitalOcean growth trends and a timeline of events can be viewed at http://trends.netcraft.com/www.digitalocean.com

    Posted by Paul Mutton on 11th December, 2013 in Hosting

  6. December 2013 Web Server Survey

    In the December 2013 survey we received responses from 861,023,217 sites, an increase of 75.7M since last month.

    For the third consecutive month Microsoft experienced the largest growth in web server market share; an additional 51M sites boosted its market share by almost 4 percentage points. Apache had the biggest loss in market share, despite seeing an increase of 7M sites. Nginx's growth this month allowed it to regain some of the market share lost last month, resulting in a net gain of 3M sites over the last two months (a loss of 13M in November, followed by a gain of 16M this month).

    Considering only active sites, Microsoft’s growth was significantly lower (compared to its gain of 51M sites) with an increase of 478k active sites. Apache saw the largest increase of 622k active sites, and increased its market share by 2 percentage points to 54%. Google meanwhile saw a large loss of 8.6M active sites within its App Engine service.

    Microsoft's web server market share could see further gains from its own Windows Azure cloud service — Microsoft has made Azure more cost-effective with price reductions and a new enterprise plan. Microsoft also has plans to expand Azure into Brazil in 2014. Unsurprisingly Microsoft IIS is the most common Web Server seen on Azure, used by 87% of sites. Amazon followed Microsoft's price cuts by introducing price reductions on EC2 M3 Instances. At Amazon, 36% of sites are running on nginx, followed by Apache with 27% and Microsoft IIS with 14%.

    ICANN has begun delegating new generic Top-Level Domains (gTLDs) from its new gTLD expansion program, adding them into the Internet's Root Zone. The first four gTLDs (شبكة, онлайн, сайт, 游戏) were delegated on October 23rd, and a further 30 new gTLDs have been delegated throughout November. ICANN requires a mandatory 30-day sunrise period following delegation, during which registries "complete a final process built into the new gTLD Program to protect trademark rights holders"; after this has passed, the new gTLD can be made available to the general public at the registry's discretion. The شبكة dotShabaka registry was first to start its sunrise period, on 31 October; this is to be followed by an (optional) Landrush Period before general availability commences on February 4th. So far the only sites found by Netcraft have been for the NICs (Network Information Centers) themselves, such as: nic.游戏, nic.сайт, nic.онлайн, nic.شبكة.

    The first gTLDs signed agreements back in July and more are being agreed each month. Recent notable gTLDs that have been agreed include the cities .london, .berlin, .budapest, .wein (Vienna) and .tokyo.





    Developer    November 2013   Percent   December 2013   Percent   Change
    Apache       348,159,702     44.33%    355,244,900     41.26%    -3.08
    Microsoft    190,451,702     24.25%    241,777,723     28.08%     3.83
    nginx        109,968,564     14.00%    126,485,204     14.69%     0.69
    Google       37,748,743       4.81%    38,263,525       4.44%    -0.36
    (more...)

    Posted by Netcraft on 6th December, 2013 in Web Server Survey
