WikiLeaks attacked during launch of cablegate

WikiLeaks experienced some website downtime last night, coinciding with its release of the US embassy cables on its new cablegate site.

Just before the latest leak was released to the world via their new "cablegate" site last night, WikiLeaks tweeted that they were under a mass distributed denial of service attack, but defiantly stated that "El Pais, Le Monde, Speigel, Guardian & NYT will publish many US embassy cables tonight, even if WikiLeaks goes down".

Twitter user th3j35t3r claimed to be carrying out the denial of service attack against the WikiLeaks site, although in a tweet that has since been deleted, he stated that it was not a distributed attack. If WikiLeaks believed the attack to be distributed, this could suggest that other parties were also carrying out separate attacks at the same time.

th3j35t3r's Twitter profile lists his location as "Everywhere", and he describes himself as a "Hacktivist for good. Obstructing the lines of communication for terrorists, sympathizers, fixers, facilitators, oppressive regimes and other general bad guys."

th3j35t3r's Twitter feed lists dozens of other sites that have also been taken down, mainly communicated through "TANGO DOWN" messages posted via the XerCeS Attack Platform. The "tango down" phrase is used by special forces and is often heard in FPS games such as Rainbow Six and Call of Duty, where it describes a terrorist being eliminated.

Referring to the success of the attack, th3j35t3r also tweeted: "If I was a wikileaks 'source' right now I'd be getting a little twitchy, if they cant protect their own site, how can they protect a src?"

The main site appeared to bear the brunt of the attack, suffering patchy or slow availability for several hours. Last night, the site was hosted from a single IP address, but has since been configured to distribute its traffic between two Amazon EC2 IP addresses on a round-robin basis. One of these instances is hosted in the US, while the other is in Ireland.
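Round-robin DNS of this sort simply rotates the order of the A records handed to clients, spreading connections across the two instances. A minimal sketch of the idea, using hypothetical documentation-range addresses rather than the real WikiLeaks IPs:

```python
from itertools import cycle

# Two hypothetical addresses standing in for the US and Irish EC2
# instances (RFC 5737 documentation ranges, not the real IPs).
a_records = ["203.0.113.10", "198.51.100.20"]

# A round-robin resolver effectively rotates the record order, so
# successive clients are directed to alternate instances.
rotation = cycle(a_records)
assignments = [next(rotation) for _ in range(4)]
print(assignments)
```

In practice the rotation happens on the authoritative name server, but the effect on a stream of clients is the alternation shown here.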

Meanwhile, the cablegate site has so far escaped any significant downtime. It has used three IP addresses since its launch, probably in anticipation of being attacked or deluged with legitimate traffic. Two of these are at Octopuce in France, which also hosts the single IP address now used by the main WikiLeaks site. Ironically, the third IP address being used to distribute secret US embassy cables is an Amazon EC2 instance hosted in – you guessed it – the US.

Performance graphs for both sites are available.

Iraq War Logs no longer served by Amazon EC2

The Iraq War Logs site run by WikiLeaks has been showing some choppy performance since last weekend, when its remaining Amazon EC2 instance stopped responding to HTTP requests.

Over the past week, the DNS configuration for the Iraq War Logs site had been directing traffic to two IP addresses on a round-robin basis. One of these was at Octopuce in France and successfully handled half of the HTTP requests sent to the site; the remaining 50% were directed to an Amazon EC2 IP address in Ireland, which stopped accepting connections on port 80 last weekend.

WikiLeaks appeared to fix the DNS problem today (Friday) – the site is now being served from just a single IP address in France. This is in contrast to the situation a few weeks earlier, when it was served from as many as five IP addresses, presumably to make it more resilient to attacks and high demand.

November 2010 Web Server Survey

In the November 2010 survey we received responses from 249,461,227 sites.

Apache continues to gain market share, with an increase of 1.29 percentage points since last month. This is the result of 12.9M new Apache hostnames, mostly in the United States (8.1M) and the Netherlands (1.6M). As seen in previous months, other server vendors lost market share as a result, though all of the major vendors apart from Google actually gained hostnames this month.
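A percentage-point change of this kind is the difference between the two months' shares, not the relative growth in hostnames. A quick sketch of the distinction, pairing the survey's November total with hypothetical round numbers for the other figures:

```python
# The November total is the survey's figure; the October total and the
# Apache hostname counts are hypothetical round numbers for illustration.
oct_total, nov_total = 232_000_000, 249_461_227
oct_apache, nov_apache = 139_000_000, 151_900_000

share_oct = 100 * oct_apache / oct_total   # Apache share in October (%)
share_nov = 100 * nov_apache / nov_total   # Apache share in November (%)

# A percentage-point change is the simple difference of the shares.
change_pp = share_nov - share_oct
print(f"{change_pp:+.2f} percentage points")
```

Note that a vendor can gain hostnames yet still lose share if the overall survey grows faster, which is exactly what happened to the other major vendors this month.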

nginx saw an overall increase of 927k hostnames, despite a loss of 135k at China Telecom, as the resulting loss in Asia was outweighed by large growth in both EMEA and North America. The most significant changes were 213k new hostnames at BurstNet and 207k new hostnames at ServePath, both in the United States. As a result, nginx overtakes Google in this metric, although nginx still trails in terms of active sites, where Google maintains a lead of more than 4M.

At the end of September, Microsoft announced that Windows Live Spaces sites will be migrated to a new blogging platform over the next few months. The destination platform uses load-balanced hosting at Layered Technologies and Peer1, and this month both companies saw modest increases in the number of sites using nginx (60k and 48k hostnames respectively).

For the moment, Windows Live Spaces sites whose blogs have been moved remain online, redirecting users to their new location: the old address is still served by Microsoft, but when accessed it redirects to the new site, which is running nginx. In contrast, blogs hosted on their own domains will result in losses for Microsoft, as the DNS can simply be updated with no need for redirection. One site in this category switched over in the middle of October; at the time it was not clear whether this change from IIS on Windows to nginx on Linux was a deliberate move by Ray Ozzie as he prepared to step down as Microsoft's Chief Software Architect, though it now appears to be part of the wider Windows Live Spaces migration. Since the destination platform is served by nginx, we expect to see a continued increase in sites using nginx as the migration takes place.
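The redirect half of the migration can be sketched as follows: the old host keeps answering requests but sends visitors on to the blog's new home with a permanent redirect. The hostname below is an invented placeholder:

```python
# Minimal sketch of a permanent redirect of the kind described above.
# The target hostname is an invented placeholder.

def redirect_response(new_location: str) -> str:
    """Build a bare HTTP/1.1 permanent-redirect response."""
    return (
        "HTTP/1.1 301 Moved Permanently\r\n"
        f"Location: {new_location}\r\n"
        "Content-Length: 0\r\n"
        "\r\n"
    )

resp = redirect_response("http://example-blog.invalid/")
print(resp.splitlines()[0])
```

For blogs on their own domains no redirect is needed: the domain's DNS record is simply pointed at the new platform, so the old server drops out of the survey altogether.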

Despite the changes described above, Microsoft gained 3.1M hostnames this month, mostly in the United States. The largest increases were 942k hostnames at GoDaddy and 717k hostnames at Demand Media Inc.

Lighttpd gained 690k hostnames, making up for its large loss last month. The growth came as the result of a large number of new hostnames at SAVVIS Communications in Australia.

Total Sites Across All Domains, August 1995 – November 2010

Market Share for Top Servers Across All Domains, August 1995 – November 2010

Developer | October 2010 | Percent | November 2010 | Percent | Change

GitHub moves to SSL, but remains Firesheepable

Earlier this morning, GitHub announced that it had changed its revision control website to use SSL only; however, a significant flaw in the implementation means that session cookies can still be captured by Firesheep and other network sniffing tools.

Firesheep brought session hijacking to the masses when it was released last month. Ironically, its own GitHub repository includes a github.js handler, which was designed to capture unencrypted session cookies from GitHub users. This allowed novice attackers to monitor shared network traffic (such as public WiFi) and hijack those sessions.

A day after its release, Firesheep's author stated that a basic expectation of privacy should not be a premium feature, referring to the fact that, at the time, you had to pay GitHub if you wanted to use full-session SSL. GitHub's move to SSL this morning should have eliminated the session hijacking vulnerability, rendering Firesheep useless; however, the session cookies used by the site are not always handled securely.

When a user logs in to GitHub, the server sets a _gh_ses session cookie in the client browser. This cookie is not marked with the Secure flag, which means it will be transmitted unencrypted if the user subsequently visits the HTTP version of the site, even though that page immediately redirects to HTTPS. This means the site's users may still be vulnerable to sniffing tools such as Firesheep.
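The Secure attribute tells the browser never to send a cookie over plain HTTP. A sketch of the difference using Python's standard library (the cookie name matches the one described above; the value is invented):

```python
from http.cookies import SimpleCookie

# An insecure session cookie, as GitHub's _gh_ses was at the time:
# the browser will happily send this over unencrypted HTTP.
leaky = SimpleCookie()
leaky["_gh_ses"] = "opaque-session-id"  # placeholder value

# The fixed version: with the Secure flag set, the browser only
# transmits the cookie over HTTPS, defeating passive sniffers.
fixed = SimpleCookie()
fixed["_gh_ses"] = "opaque-session-id"
fixed["_gh_ses"]["secure"] = True

print(leaky.output())  # Set-Cookie: _gh_ses=opaque-session-id
print(fixed.output())  # Set-Cookie: _gh_ses=opaque-session-id; Secure
```

The flag is purely a client-side instruction, which is why an SSL-only site that forgets it remains exposed: one visit to a stale HTTP link leaks the cookie in the clear.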

Netcraft successfully hijacked a session from the GitHub site by sniffing the cookies that were sent via unencrypted HTTP. Many legacy URLs will still point to the HTTP version of the site, so an attacker may not even need to entice a victim into visiting the HTTP site. Once a session has been hijacked, the attacker can freely create repositories, delete/add email addresses and change passwords, so it looks like the sidejack prevention that GitHub implemented a week ago (which did use a Secure cookie) has been undone.

Although GitHub's move to SSL has not yet been implemented securely, it is at least a step in the right direction for Firesheep's author, Eric Butler. When he released the tool on 24 October 2010, he said:

Websites have a responsibility to protect the people who depend on their services. They've been ignoring this responsibility for too long, and it's time for everyone to demand a more secure web. My hope is that Firesheep will help the users win.

GitHub announced the SSL-only change on Twitter this morning, and is expected to publish a blog post about it soon.


Update: GitHub has since marked the session cookie as Secure. Now that it can only be transmitted over encrypted connections, the site is no longer vulnerable to Firesheep.

Most Reliable Hosting Company Sites in October 2010

Rank  Company site        OS       Outage   Failed Req%  DNS    Connect  First byte  Total
1     Virtual Internet    Linux    0:00:00  0.015        0.211  0.068    0.138       0.138
2     New York Internet   FreeBSD  0:00:00  0.019        0.159  0.082    0.173       0.464
3     INetU               FreeBSD  0:00:00  0.022        0.157  0.082    0.186       0.493
4                         Linux    0:00:00  0.030        0.157  0.065    0.351       0.642
5     Datapipe            FreeBSD  0:00:00  0.034        0.069  0.010    0.021       0.026
6     iWeb Technologies   Linux    0:00:00  0.041        0.112  0.087    0.174       0.174
7                         Linux    0:00:00  0.041        0.192  0.099    0.384       0.563
8     Swishmail           FreeBSD  0:00:00  0.049        0.316  0.070    0.140       0.363
9                         Linux    0:00:00  0.049        0.659  0.074    0.313       0.570
10    Multacom            FreeBSD  0:00:00  0.056        0.172  0.137    0.275       0.752

See full table

Top of the rankings this month is Virtual Internet, whose site responded to all but four of Netcraft's requests. Virtual Internet focuses on availability and reliability, with a high capacity data centre network throughout Europe. Its UK data centres provide high connectivity as well as redundant power and cooling, multiple fault-tolerant distribution paths and strict access controls.

In second place this month is New York Internet. The company has consistently performed well in Netcraft's most reliable hosters rankings, having been in the top five every month for the last six months. NYI has a strong commitment to network availability, maintaining upstream connectivity to multiple top tier providers, as well as its own peering points with small to medium ISPs.

Third place goes to INetU, which failed to respond to only six of Netcraft's requests in the last month. INetU has also been a regular fixture in the most reliable hosters recently, appearing in the top five eight times in the last year.

In terms of operating systems used by the most reliable hosters in October, the top ten are evenly split between Linux and FreeBSD.

Netcraft measures and makes available the response times of around forty leading hosting providers' sites. The performance measurements are made at fifteen minute intervals from separate points around the internet, and averages are calculated over the immediately preceding 24 hour period.
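A 24-hour rolling average over 15-minute samples is just the mean of the last 96 measurements. A minimal sketch with synthetic response times (the sliding-window mechanism here is an illustration, not Netcraft's actual implementation):

```python
from collections import deque
from statistics import mean

WINDOW = 24 * 4  # 96 fifteen-minute samples per 24-hour window

samples = deque(maxlen=WINDOW)  # old samples fall off automatically

def record(response_time: float) -> float:
    """Add one measurement and return the current 24h average."""
    samples.append(response_time)
    return mean(samples)

# Synthetic measurements: a steady 0.1s with one slow outlier.
for t in [0.1] * 95:
    record(t)
avg = record(1.0)
print(round(avg, 4))
```

A bounded deque is a natural fit here: once the window is full, each new sample displaces the oldest one, so the average always covers exactly the preceding 24 hours.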

From a customer's point of view, the percentage of failed requests is more pertinent than the length of outages on hosting companies' own sites, as it also gives a pointer to the reliability of routing. This is why we choose to rank the table by fewest failed requests rather than by shortest period of outage.
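Ranking by fewest failed requests can be sketched as a simple sort; the connection-time tie-breaker below is an assumption chosen for illustration, and the names and figures are invented:

```python
# Hypothetical measurements: (company, failed request %, connect time).
# Names and numbers are illustrative, not this month's real data.
results = [
    ("Host A", 0.049, 0.099),
    ("Host B", 0.015, 0.068),
    ("Host C", 0.049, 0.070),
]

# Rank by fewest failed requests first; break ties on connection time
# (an assumed tie-breaker, picked here for illustration).
ranked = sorted(results, key=lambda r: (r[1], r[2]))
print([name for name, *_ in ranked])
```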

Information on the measurement process and current measurements is available.

Yahoo! suffers global downtime

Yahoo!'s main website suffered an outage for a short period this morning.

Yahoo!'s main website is currently the 14th most visited website in the Netcraft Toolbar dataset, so even a relatively short outage like this will have affected a large number of people. The site also suffered a worldwide outage last month.

Many of Yahoo!'s websites, including its main site, are served with the YTS/1.18.5 (Yahoo! Traffic Server) header. Traffic Server was originally developed by Inktomi Corporation as a proxy cache for web traffic and streaming media; the company was acquired by Yahoo! in 2002.

Yahoo!'s widespread use of YTS was largely hidden until November 2008, when the YTS/1.17.8 server banner was seen on more than 220,000 Yahoo!-hosted sites. Prior to that time, the sites did not return a Server header at all.
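The Server banner is just a response header, which surveys like this one extract from each site's reply. A sketch of pulling it out of a raw response with the standard library (the response text is fabricated, using the YTS/1.17.8 banner quoted above):

```python
from email.parser import Parser

# A fabricated response in the style described above; YTS/1.17.8 is
# the banner quoted in the text, the other headers are invented.
raw = (
    "HTTP/1.1 200 OK\r\n"
    "Server: YTS/1.17.8\r\n"
    "Content-Type: text/html\r\n"
    "\r\n"
)

# Strip the status line, then parse the header block like an RFC 822
# message, which shares HTTP's "Name: value" header syntax.
_, _, header_block = raw.partition("\r\n")
headers = Parser().parsestr(header_block)
print(headers.get("Server"))  # YTS/1.17.8
```

A missing `Server` header, as Yahoo!'s sites returned before November 2008, would simply yield `None` here, which is why the banner's appearance made the scale of YTS deployment visible.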

Netcraft's November 2010 Web Server Survey includes nearly 1.4 million sites using YTS.