More than three years have flown by since the first new generic top-level domain (gTLD) was delegated on 23 October 2013. Today, hundreds of new gTLDs are available, giving consumers and businesses the opportunity to register domains under the likes of .science, .guru, .xyz, .expert, .ninja, .pizza, .wine, and many more.
ICANN's New gTLD Program was launched in June 2011, and it received nearly 2,000 applications when the application window eventually opened in January 2012. Guided by ICANN's 338-page gTLD Applicant Guidebook, each applicant was required to pay a $185,000 evaluation fee, which was intended to recover the costs involved in running the New gTLD Program.
The initial application fees alone have netted ICANN more than $300 million to date, so the program has arguably been worthwhile from its point of view, avoiding the need to subsidise it with ICANN's other funding sources; but with such high fees, how successful has it been for the applicants?
The fact that each applicant stumped up $185,000 per gTLD evaluation suggests that they must have had a fair degree of confidence in their own business plans before filing their gTLD applications. Applicants were required to provide financial projections, which would typically include forecast registration volumes and the associated cash inflows. Every application that passed ICANN's Initial Evaluation process implies that both the applicant and ICANN were satisfied that the operation of the new gTLD would be sustainable. Even so, profits are not necessarily expected to be instant – the applicant's demonstration of a sustainable business model does not have to reach break-even within the first three years of operation.
Success in numbers
Now, after a few years of growth, it is clear that some of the new gTLDs have been very successful indeed. Take .guru, for instance: this was launched in January 2014, and quickly became one of the most commonly purchased new gTLDs offered by its operator, Donuts Inc. It has nearly 64,000 active registrations, and more than 56,000 of these are running websites that appear in our latest survey.
This registration volume likely translates to between $1.5 million and $2.0 million in registration fees being paid by consumers each year, depending on which registrars are used. While .guru's domain registry will receive only a portion of the consumer cost of each domain, with the rest being split between ICANN and the registrar, the amounts are likely to be significant.
Beyond the initial evaluation fees, applicants are also required to pay ICANN ongoing quarterly fees; but for the majority of gTLD operators, these will be much lower than the initial application costs. It is likely that .guru in particular is making a handsome amount of profit for its operator.
Donuts is evidently a firm believer in the potential for new gTLDs. Founded by Paul Stahura, who sold the domain name registrar eNom in 2006, this start-up company raised $100m in venture funding and ploughed most of it into 307 applications for new gTLDs.
Donuts operates nearly 200 of the 1,000 or so gTLDs that have been delegated so far (i.e. introduced to the internet's authoritative Root Zone database). While Donuts' .guru gTLD quickly established itself as a favourite, it has since been overtaken by .life, .email, .today and .solutions. All of these — including .guru — were launched in 2014, giving them a head start in gaining popularity over more recently launched gTLDs. The .life gTLD is the current leader amongst Donuts' domains, with nearly 79,000 registrations.
In terms of the number of websites (rather than domains) using new gTLDs, the most common one in Netcraft's April 2017 survey is .top. This entered general availability in November 2014 and broke through one million registrations by 2016. The .top gTLD is operated from China by .Top Registry, and is now used by 160 million websites across more than 2 million unique second-level domains (e.g. anlink.top).
Many of these .top sites are nothing more than webspam, but it is the registration volume that counts when it comes to potential revenue, regardless of how interesting the websites are. However, depending on which registrar a customer uses, a .guru domain can cost roughly three times the price of a .top domain, so the higher registration volume of .top does not necessarily translate to an equivalently higher revenue. Taking Namecheap as an example, a .top domain costs $0.88 for the first year and $10.88 per year thereafter, whereas a .guru domain costs $6.88 for the first year and $24.88 per year after that.
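A quick back-of-the-envelope calculation, using the Namecheap prices quoted above and the .guru registration volume mentioned earlier, illustrates why registration volume alone is a poor proxy for revenue. These are rough, order-of-magnitude figures only; actual consumer spend depends on each registrant's choice of registrar and on promotional pricing.

```python
# Illustrative calculation based on the Namecheap prices quoted above.
# Treat the results as rough estimates, not registry accounts.

def total_cost(first_year, renewal, years):
    """Total consumer spend on one domain held for `years` years."""
    return first_year + renewal * (years - 1)

top_3yr = total_cost(0.88, 10.88, 3)    # a .top domain held for three years
guru_3yr = total_cost(6.88, 24.88, 3)   # a .guru domain held for three years

print(f".top over 3 years:  ${top_3yr:.2f}")
print(f".guru over 3 years: ${guru_3yr:.2f}")

# Rough annual consumer spend on .guru at ~64,000 registrations, assuming
# most are paying the $24.88 renewal price -- consistent with the
# $1.5-2.0 million estimate above.
guru_annual_spend = 64_000 * 24.88
print(f".guru annual spend: ~${guru_annual_spend / 1e6:.1f}M")
```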
The actual revenue being drawn from new gTLDs is not clear, as the financial projections submitted by applicants do not have to be made public; however, a presentation leaked in 2013 revealed that Famous Four Media put the potential first-year revenue for each new gTLD at almost $30 million. Famous Four Media is another prominent applicant in ICANN's New gTLD Program, having used separate limited companies to apply for 60 new gTLDs. A year ago, its .science gTLD was the most used new gTLD by hostname count, with 66 million websites across more than 160,000 unique second-level domain names (e.g. bmgathome.science).
.top might have the most websites (e.g. mail.simplegoods.top), but in terms of unique second-level domains (e.g. gen.xyz), and therefore active registrations, .xyz is the most commonly registered new gTLD in use on the web. Netcraft's latest survey shows it has a registration volume of more than 3.7 million, although many of these domains will have been given away by XYZ.COM LLC for free or at very low cost.
This time last year, much of the interest in the .xyz gTLD came from China: About 40% of all .xyz websites were hosted in China; more than half of all .xyz registrations originated from China; many of its 200,000 IDNs (internationalised domain names) were in Chinese scripts (e.g. 台北郵購網.xyz); and the single-digit domain 1.xyz sold at auction for a record $182,000 to a Chinese registrant. However, today, the United States hosts nearly 80% of all .xyz sites.
Things are evidently going well for XYZ.COM LLC, which also operates several other new gTLDs. Its CEO, Daniel Negari, notably made a $5 million offer to buy four gTLDs from Rightside Group Ltd, and has also expressed the company's desire to buy gTLDs from other registry operators, saying it is "cashed up, and ready to do deals".
The large registration volumes of .xyz, .top and .life make these gTLDs serve as flagships for their respective operators, but not all gTLDs are this popular. For example, .accountants has only 1,400 registrations, even though it has been operated by Donuts since 2014. However, this lower uptake is not too surprising, as the target registrants for this particular gTLD are professionals practising in the field of accounting and auditing. Lower registration volumes are therefore to be expected among these niche gTLDs, but the operational costs can be countered by charging more per registration – registering a new .accountants domain costs around five times more than a .guru domain (again, depending on which registrar is used).
The .accountants gTLD also has to contend with the similar—but much cheaper—.accountant gTLD, which is managed by Famous Four Media. Despite the obvious similarity and mission overlap, the .accountant gTLD was approved by ICANN and delegated in March 2015.
Netcraft's survey found more than 50 times as many domains registered under the cheaper .accountant gTLD. While there are undoubtedly more individual accountants than there are groups of accountants, the cheaper cost of .accountant domains must also play a big part in these different registration volumes.
Most obviously, cheaper domains are more likely to appeal to domain squatters and ad networks. Demonstrating this, more than half of all .accountant websites are hosted by a single company, with most of these sites being used to display monetized search links rather than anything to do with accountancy.
Nonetheless, registrations are a gTLD operator's primary source of revenue, so it is largely inconsequential to the operator what registrants end up using these domains for. Although the .accountant gTLD is aimed at accountants and related businesses, anyone can in fact register these domains. Registrants of .accountant domains are required to agree to the Registry's Abuse and Rights Protection Terms and Conditions, which include displaying an APM seal on their homepages. This measure is supposed to "augment the security and stability" of the gTLD, but the requirement does not appear to be actively enforced, as many of the spam sites using the .accountant gTLD do not display the seal at all.
Other metrics for success
Financially, it looks like the well-established new gTLDs have been successful, and many of the newer ones have similar potential; but this success has not yet manifested itself so visibly on the internet.
The most commonly registered new gTLD, .xyz, might have 3.7 million current registrations, but fewer than 2,500 of these domains appear amongst the top million websites; and .science, the most commonly used new gTLD this time last year, fares even worse, with just 22 making it into the top million. These numbers are mere drops in the ocean compared with the well-established .com, which is used by more than 403,000 unique domains within the top million sites.
Much of the early success of .xyz—relative to other new gTLDs, at least—can be put down to a Network Solutions promotion which offered a free matching .xyz domain with each .com domain purchased. Within its first ten days of operation, Network Solutions had registered nearly 100,000 .xyz domains, but many of these could not be monetized until the following year when the domains became due for renewal.
Phishers seizing new opportunities
Unsurprisingly, fraudsters have also exploited the plethora of new gTLDs by registering domains that are then used to host phishing sites. Many of the domains involved in recent attacks appear to have been registered specifically for the purpose of fraud, rather than belonging to sites that had been compromised.
While ICANN requires all gTLD registries to deal only with registrars that prohibit end-users from carrying out phishing attacks, each registry maintains its own safeguards, meaning that some are better than others at proactively defending against fraud.
With some new gTLD operators allowing domains to be registered by fraudsters, and others failing to enforce their own safeguarding policies effectively, it is clear that more could be done to make new gTLDs safer; but fraud prevention and policy enforcement often consume time and money. The availability of both of these resources depends largely on how much revenue the gTLD operator makes, and so the operator's effectiveness at wiping out fraud could, bizarrely, also serve as a metric for success.
So, are they a success?
In conclusion, most new gTLDs appear to have been successful in some way or another, whether measured in registration volumes or revenue. Many of the new gTLDs that have low registration volumes are operated by companies that also operate several other gTLDs, so even if they were to make a loss on one, it would likely be offset by their more successful gTLDs. One thing that can be said for certain is that the New gTLD Program has succeeded in its goal of giving registrants a much wider choice of domain names, whilst resulting in millions of dollars being exchanged between ICANN, the operating registries, and domain registrars.
However, there are indications of a slowdown in applications for new gTLDs: ICANN's Draft FY18 Operating Plan and Budget forecasts that its revenue from new gTLD applicant fees in FY2017 will be only $21 million, compared with $27 million (actual) the previous year, and $71 million the year before that. While this projection is unlikely to affect the revenue being made by the operators of existing new gTLDs, it suggests that the hundreds of new gTLDs in operation today may already provide more than enough choice for most consumers.
Netcraft services for new gTLD operators
New gTLD operators can confidently protect their top-level domains against phishing and malware with Netcraft's suite of services for domain registries. Taking a proactive stance against these attacks is vital, as it demonstrates to fraudsters that they are unwelcome, and thus ensures that the reputation of the new gTLD is not tarnished.
More than 600,000 web-facing computers — which host millions of websites — are still running Windows Server 2003, despite it no longer being supported.
Extended support for Windows Server 2003 ended on July 14, 2015. Crucially, this means that Microsoft will no longer be issuing security updates for any version of Windows Server 2003. US-CERT warns that these unsupported installations of Windows Server 2003 are exposed to an elevated risk of cybersecurity dangers, such as malicious attacks or electronic data loss.
Windows Server 2003 was originally launched over 12 years ago, with the latest major update being released 8 years ago in the form of Service Pack 2. This update was particularly beneficial for web servers, as it added the Scalable Networking Pack (SNP), which allowed for hardware acceleration of network packet processing.
Fifth of the internet still running Windows Server 2003
Netcraft's July 2015 Web Server Survey found 175 million websites that are served directly from Windows Server 2003 computers. These account for more than a fifth of all websites in the survey, making the potential attack surface huge.
Most of these sites (73%) are served by Microsoft Internet Information Services 6.0, which is the version of IIS that shipped with Windows Server 2003 and the 64-bit edition of Windows XP Professional; however, it is rare to see the latter being used as a web server platform.
The remaining Windows Server 2003-powered sites use a variety of web server software, with GSHD 3.0, Safedog 4.0.0, Apache 2.2.8 (Win32), kangle 3.4.8, NetBox Version 2.8 Build 4128 and nginx/1.0.13-win32 being amongst the most commonly seen Server headers. While vulnerabilities in these software products can be addressed by applying patches or updates, future vulnerabilities in the underlying Windows Server 2003 operating system may never be fixed.
14 million of the sites did not send a Server header at all, so it was not apparent whether the web server software used by these sites could be updated, but the underlying computers could still be identified as running Windows Server 2003. Netcraft determines the operating system of a remote web server by analysing the low-level TCP/IP characteristics of response packets, and so it is independent of whichever server software the site claims to be running.
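Netcraft's exact fingerprinting methodology is not public, but the general idea of passive TCP/IP fingerprinting can be sketched as follows. The signature table below is a hypothetical illustration built from commonly cited default values, not Netcraft's actual data, and real fingerprinting uses many more characteristics than these two.

```python
# A minimal sketch of passive OS fingerprinting from low-level response
# characteristics. The signature table is a hypothetical illustration
# using well-known defaults -- not Netcraft's proprietary methodology.

SIGNATURES = {
    # (initial TTL, TCP window size) -> likely operating system
    (128, 16384): "Windows Server 2003",
    (128, 65535): "Windows (other)",
    (64, 5840):   "Linux (2.x kernel)",
    (64, 65535):  "FreeBSD",
}

def guess_initial_ttl(observed_ttl):
    """Round an observed TTL up to the nearest common initial value,
    compensating for the hops the packet traversed."""
    for initial in (64, 128, 255):
        if observed_ttl <= initial:
            return initial
    return 255

def fingerprint(observed_ttl, window_size):
    """Guess the remote OS from low-level response characteristics,
    independently of whatever the Server header claims."""
    key = (guess_initial_ttl(observed_ttl), window_size)
    return SIGNATURES.get(key, "unknown")

print(fingerprint(116, 16384))  # prints "Windows Server 2003"
```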
Backend servers might also be exploitable
In addition to the 175 million websites that are served directly from Windows Server 2003 computers, a further 1.7 million sites served from other operating systems sent the Microsoft-IIS/6.0 Server header. This indicates the presence of backend Windows Server 2003 machines behind load balancers and similar devices that are not running Windows.
For example, if the TCP/IP characteristics of a web server's response indicate that it is running Linux, but the HTTP Server header reports it is using Microsoft-IIS/6.0, then the Linux machine is likely to be acting as a reverse proxy to a Windows Server 2003 machine running IIS 6.0. Although the Windows Server 2003 machine is not directly exposed to the internet, it may still be possible for a remote attacker to exploit certain Windows and IIS vulnerabilities.
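This inference can be sketched as a simple combination of the two signals: the operating system suggested by the TCP/IP fingerprint, and the software claimed by the HTTP Server header. The function and its labels are illustrative only, not Netcraft's actual classifier.

```python
# Sketch: inferring a backend Windows Server 2003 machine from a
# mismatch between the front-end's TCP/IP fingerprint and the HTTP
# Server header it relays. Labels and logic are illustrative.

def classify(fingerprinted_os, server_header):
    """Combine the TCP/IP fingerprint with the claimed Server header."""
    runs_iis6 = server_header.startswith("Microsoft-IIS/6.0")
    if fingerprinted_os == "Windows Server 2003" and runs_iis6:
        return "Windows Server 2003 directly exposed"
    if fingerprinted_os not in ("Windows Server 2003", "unknown") and runs_iis6:
        # e.g. a Linux reverse proxy or load balancer in front of IIS 6.0
        return f"likely backend Windows Server 2003 behind {fingerprinted_os}"
    return "no Windows Server 2003 indication"

print(classify("Linux", "Microsoft-IIS/6.0"))
# prints "likely backend Windows Server 2003 behind Linux"
```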
How many Windows Server 2003 installations are exposed to the web?
Netcraft has developed a technique for identifying the number of unique computers that act as web servers on the internet. The 175 million sites that use Windows Server 2003 make use of 1.6 million distinct IP addresses. However, an individual computer running Windows Server 2003 may have multiple IP addresses, which makes this an unsuitable metric for determining how many installations there are.
Further analysis of the low-level TCP/IP characteristics reveals a total of 609,000 web-facing computers running Windows Server 2003. This is over 10% of all web-facing computers, and shows the true potential cost of migration, as software licensing is typically charged on a per-machine rather than per-IP address basis.
Who's still using Windows Server 2003?
China and the United States account for 55% of the world's Windows Server 2003 computers (169,000 in China and 166,000 in the US), yet only 43% of all other web-facing computers.
Within China, more than 24,000 of these computers are hosted by Alibaba Group. Nearly half of these are hosted by HiChina, which was acquired by Alibaba in 2009, while 7,500 are hosted at its rapidly growing cloud hosting unit, Aliyun.
One of the most prominent companies still using Windows Server 2003 on the internet is LivePerson, which is best known for the live chat software that allows its customers to talk to their visitors in real time. Its main site at www.liveperson.com uses Microsoft IIS 6.0 on Windows Server 2003, and several other sites related to its live chat functionality — such as sales.liveperson.net — also appear to use IIS 6.0 on Server 2003, but are served via F5 BIG-IP web-facing devices.
Even some banks are still using Windows Server 2003 and IIS 6.0 on their main sites, with the most popular including NatWest, ANZ, and Grupo Bancolombia. These sites rank amongst the top 10,000 in the world, and hundreds of other banking sites also appear to be using Windows Server 2003.
ING Direct and Caisse d'Epargne are also using IIS 6.0, but these sites appear to be served through F5 BIG-IP or similar devices, rather than having Windows Server 2003 machines exposed directly to the internet. Even some security and antivirus software vendors are still running IIS 6.0 on public-facing sites, including Panda Security and eScan.
While Microsoft does not officially offer any support beyond the extended support period ("Once a product transitions out of support, no further support will be provided for the product"), reports suggest that some companies who have not migrated in time have arranged to pay millions of dollars for custom support deals.
PCI compliance: Automatic failure
Companies still using unsupported operating systems like Windows Server 2003 in a cardholder data environment should migrate immediately. All organisations and merchants who accept, transmit or store cardholder data must maintain a secure PCI compliant environment.
The Payment Card Industry Data Security Standard (PCI DSS) provides a baseline of technical and operational requirements designed to protect cardholder data and sensitive authentication data. PCI DSS Requirement 6.2 requires all system components and software to be protected from known vulnerabilities by installing vendor-supplied security patches. This will not be possible with Windows Server 2003, as no more security updates will be made available by Microsoft.
Additionally, merchants and service providers who handle a large enough volume of cardholder data must have quarterly security scans by a PCI SSC Approved Scanning Vendor (such as Netcraft) in order to maintain compliance. ASVs are required to record an automatic failure if the merchant's cardholder data environment uses an operating system that is no longer supported.
In some cases, the PCI SSC can allow for risks to be mitigated through the implementation of suitable compensating controls, but these are unlikely to be sufficient for an unsupported web-facing operating system – especially one which will become less secure as time goes by, as new vulnerabilities are discovered.
Consequently, many merchants still using Windows Server 2003 are likely to be noncompliant, and could face fines, increased transaction fees, reputational damage, or other potentially disastrous penalties such as cancelled accounts.
Microsoft advises that any datacenter still using Windows Server 2003 needs to protect its infrastructure by planning and executing a migration strategy. Some possible options suggested by Microsoft include switching to Windows Server 2012 R2, Microsoft Azure or Office 365. To help customers migrate, Microsoft has provided an interactive Windows Server 2003 Migration Planning Assistant, which, incidentally, is hosted on Microsoft Azure.
Finding out more
Netcraft's techniques provide an independent view with a consistent methodology on the number of web-facing computers at each hosting location worldwide. For more information, see our Hosting Provider Server Count, or contact us at email@example.com for bespoke datasets.
For more information about Netcraft's Automated Vulnerability Scanning for PCI Compliance, please contact us at firstname.lastname@example.org.
Typosquatters are cashing in by registering new .uk domains which look similar to those used by existing high-traffic .co.uk websites. By simply registering a .uk domain that ends in "co", the squatters have obtained dangerously deceptive domains such as paypalco.uk and americanexpressco.uk in an attempt to steal traffic from the real domains, paypal.co.uk and americanexpress.co.uk.
Many of these typosquatting domains are being monetized by displaying ads related to the legitimate domains they are impersonating, or by using referral schemes to redirect visitors to the corresponding legitimate site — or even driving visitors towards competing services.
However, the potential for abuse is not limited to making money through advertising and referral schemes. With the only difference being a single additional dot in the real domain name, this form of typosquatting could be exploited to make extremely potent phishing attacks.
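The pattern is mechanical enough that a brand owner could generate the deceptive variant of its own domain and check whether it has been registered. A minimal sketch of the "missing dot" transformation described above:

```python
# Sketch of the "missing dot" typosquatting pattern: dropping the dot
# between the second-level label and "co" turns example.co.uk into
# exampleco.uk. A brand owner could generate this variant to monitor
# for squatted registrations.

def dotless_uk_variant(co_uk_domain):
    """Map a .co.uk domain to its potential 'missing dot' .uk squat."""
    if not co_uk_domain.endswith(".co.uk"):
        raise ValueError("expected a .co.uk domain")
    label = co_uk_domain[: -len(".co.uk")]
    return f"{label}co.uk"

print(dotless_uk_variant("paypal.co.uk"))           # prints "paypalco.uk"
print(dotless_uk_variant("americanexpress.co.uk"))  # prints "americanexpressco.uk"
```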
First introduced in 1985, the .uk country code top-level domain (ccTLD) has only recently allowed ordinary consumers to register domains directly under .uk (such as stephenfry.uk). Before 10 June 2014, practically all UK domains had to be registered under second-level domains, which categorised the activity of the site. By far the most popular of these second-level domains is .co.uk, which is intended for commercial and general use.
To limit the most obvious potential for domain squatting, existing owners of .co.uk domains were given automatic rights to the corresponding .uk domain (for example nationalrail.uk) on 10 June 2014, provided there was no equivalent .org.uk, .me.uk, .net.uk, .ltd.uk or .plc.uk domain in existence. The reservation period runs for five years, during which time no other party can register the domain, even if the rightful party chooses not to register it.
However, these measures are inconsequential to the typosquatters, who seem to have found no barriers in registering deceptive domains such as nationalrailco.uk, barclaysco.uk and hsbcco.uk. The latter two deceptive domains are registered to a corporation in Sweden, and currently display a set of sponsored listings with titles such as "Need a New Bank Account?". Other registered domains which target high-traffic financial institutions include nationwideco.uk, lloydsbankco.uk, bankofscotlandco.uk, halifax-onlineco.uk, natwestco.uk, and westernunionco.uk.
The potential for financial fraud is immense, particularly as many online banking transactions are now carried out using mobile devices, on which typographical errors are naturally more common.
Some of the .uk typosquatting sites are clearly optimised for use on mobile devices, such as nationalrailco.uk, which displays a small form to search for train tickets. However, rather than taking users to the real National Rail website at nationalrail.co.uk, the search form uses the TradeDoubler affiliate scheme to monetize the typo-traffic by directing users to a train ticket sales website at thetrainline.com.
Flagrant typosquatting of popular sites amongst the .uk top-level domain is rife. Another brazen example is mbnaco.uk, which is clearly trying to scoop up typo-traffic from credit card provider MBNA, which uses mbna.co.uk for its main website. The typo domain presents adverts which invite visitors to apply for credit cards at various competitors, including American Express and Capital One.
Companies concerned about typosquatting attacks against their customers can use Netcraft's Fraud Detection service to pre-emptively identify fraudulent domain name registrations. Domain name registrars can use Netcraft's Domain Registration Risk service to analyse the likelihood of a new domain being used for fraudulent activity.
97% of SSL web servers are likely to be vulnerable to POODLE, a vulnerability that can be exploited in version 3 of the SSL protocol. POODLE, in common with BEAST, allows a man-in-the-middle attacker to extract secrets from SSL sessions by forcing the victim's browser into making many thousands of similar requests. As a result of the fallback behaviour in all major browsers, connections to web servers that support both SSL 3 and more modern versions of the protocol are also at risk.
The Secure Sockets Layer (SSL) protocol is used by millions of websites to protect confidential data in transit across the internet using strong cryptography. The protocol was designed by Netscape in the mid 1990s and was first released to the public as SSL 2 in February 1995. It was quickly replaced by SSL 3 in 1996 after serious security flaws were discovered. SSL 3 was replaced by the IETF-defined Transport Layer Security (TLS) version 1.0 in January 1999 with relatively few changes. Since TLS 1's release, TLS 1.1 and TLS 1.2 have succeeded it and should be used in its place wherever possible.
POODLE's bark may be worse than its bite
Unlike Heartbleed, POODLE can be used to attack client-server connections and is inherent to the protocol itself, rather than any one implementation such as OpenSSL or Microsoft's SChannel. In order to exploit it, an attacker must modify the victim's network traffic, know how the targeted secret information is structured (such as where a session cookie appears) and be able to force the victim into making a large number of requests.
Each SSL connection is split up into a number of chunks, known as SSL records. When using a block cipher, such as Triple DES in CBC mode, each plaintext block is mixed with the previous ciphertext block before being encrypted, and the record is then padded to a whole number of blocks (8 bytes in the case of Triple DES). An attacker with network access can carefully manipulate the ordering of the cipher-blocks within a record to influence the decryption and exploit the padding oracle. If the attacker is lucky (there's a 1 in 256 chance), she will have matched the correct value for the padding length in her manipulated record and correctly guessed the value of a single byte of the secret. This can be repeated to reveal the entire targeted secret.
SSL 3's padding is particularly easy to exploit as it relies on a single byte at the end of the padding, the padding length. Consequently an attacker must force the victim to make only 256×n requests to reveal n bytes of secret. TLS 1.0 changed this padding mechanism, requiring the padding bytes themselves to have a specific value, which makes the attack far less likely to succeed.
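The 1-in-256 acceptance condition can be illustrated with a toy model. This simulates only the padding-length check that makes the attack possible, not real CBC decryption or a real TLS stack; the function names are illustrative.

```python
# Toy model of the POODLE padding condition (not a real CBC/TLS
# implementation). With Triple DES, the attacker arranges for the final
# record block to be entirely padding, so the padding-length byte must
# be 7. When a target ciphertext block is substituted into that
# position, the server only accepts the record if the block happens to
# decrypt with 7 as its final byte -- a 1 in 256 chance per attempt.

import random

BLOCK = 8  # Triple DES block size in bytes

def server_accepts(decrypted_final_block):
    """SSL 3 strips `last byte + 1` bytes of padding and then checks
    the MAC; in this attack setup only a final byte of BLOCK - 1
    leaves the MAC in the right place."""
    return decrypted_final_block[-1] == BLOCK - 1

random.seed(1)
attempts = 0
while True:
    attempts += 1
    # Model each substitution as decrypting to an unpredictable block.
    block = bytes(random.randrange(256) for _ in range(BLOCK))
    if server_accepts(block):
        break  # one byte of the secret is now known

print(f"one secret byte recovered after {attempts} attempts (expected ~256)")
```

Repeating this per byte is what gives the 256×n figure above.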
Use of SSL v3
Within the top 1,000 SSL sites, SSL 3 remained very widely supported yesterday, with 97% of SSL sites accepting an SSL 3 handshake. Citibank and Bank of America both support SSL 3 exclusively and are presumably vulnerable.
A number of SSL sites have already reacted to this vulnerability by disabling support for SSL 3, including CloudFlare and LinkedIn. On Tuesday 14th, the most common configuration within the top 1,000 SSL sites was to support SSL 3.0 all the way through to TLS 1.2, with almost two-thirds of popular sites taking this approach. One day later, this remains the most popular configuration; however, TLS 1.0 is now the minimum version for 11%.
Microsoft Internet Explorer 6 does not enable TLS 1.0 or greater by default and may be the most notable victim of disabling SSL 3 internet-wide. Now 13 years old, IE6 shipped as the default browser with Windows XP in 2001 and later with Windows Server 2003, and will remain supported on Windows Server 2003 until July 2015. Despite its age and the end of Microsoft's support for Windows XP, IE6 remains popular, accounting for more than 3.8% of web visits worldwide, and 12.5% in China. This vulnerability may sound the death knell for IE6 and Windows XP.
However, unless SSL 3 is completely disabled on the server side, a client supporting SSL 3 may still be vulnerable even if the server supports more recent versions of TLS. An attacker can take advantage of browser fallback behaviour to force otherwise secure connections to use SSL 3 in place of TLS version 1 or above.
SSL version negotiation
At the start of an SSL connection, servers and clients mutually agree upon a version of SSL/TLS to use for the remainder of the connection. The client's first message to the server includes the maximum version of the protocol it supports; the server then compares this against its own maximum version to pick the highest mutually supported version.
While this mechanism protects against version downgrade attacks in theory, most browsers have an additional fallback mechanism that retries a connection attempt with successively lower version numbers until it succeeds in negotiating a connection or it reaches the lowest acceptable version. This additional fallback mechanism has proven necessary for practical interoperability with some TLS servers and corporate man-in-the-middle devices which, rather than gracefully downgrading when presented with an unsupported version of TLS, instead terminate the connection prematurely.
An attacker with appropriate network access can exploit this behaviour to force a TLS connection to be downgraded by forging Handshake Alert messages. The browser will take the Handshake Alert message as a signal that the remote server (or some intermediate device) has version negotiation bugs and the browser will retry the connection with a lower maximum version in the initial Client Hello message.
Operation of a forced downgrade to SSL 3 against a modern browser.
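The forced downgrade can be sketched as a simulation of the browser's retry loop, with the attacker forging a Handshake Alert for every offered version above SSL 3. This models the fallback behaviour only; it is not real TLS code, and the function names are illustrative.

```python
# Sketch of the browser fallback loop described above, with a
# man-in-the-middle forging Handshake Alerts. Behavioural model only.

VERSIONS = ["TLS 1.2", "TLS 1.1", "TLS 1.0", "SSL 3.0"]

def attacker_in_the_middle(offered_version):
    """Forge a Handshake Alert for anything better than SSL 3.0."""
    if offered_version != "SSL 3.0":
        raise ConnectionError("forged Handshake Alert")
    return offered_version  # let the weak handshake through

def connect_with_fallback(handshake):
    """Retry with successively lower maximum versions, as browsers do."""
    for version in VERSIONS:
        try:
            return handshake(version)
        except ConnectionError:
            continue  # assume a version-intolerant server and retry lower
    raise ConnectionError("no mutually acceptable version")

# Both endpoints support TLS 1.2, yet the attacker forces SSL 3.0:
print(connect_with_fallback(attacker_in_the_middle))  # prints "SSL 3.0"
```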
The fallback mechanism was previously not a security issue, as it could only result in the use of a protocol version that both the client and the server were willing to accept. However, clients that have not yet been updated to disable support for SSL 3 are relying on the server having disabled SSL 3. What remains is a chicken-and-egg problem, where modern clients retain support for SSL 3 for the sake of legacy servers, and modern servers retain support for SSL 3 for the sake of legacy clients.
There is, however, a proposed solution in the form of an indicator (an SCSV) in the fallback connection to inform compatible servers that this connection is a fallback and to reject the connection unless the fallback was expected. Google Chrome and Google's web sites already support this SCSV indicator.
Comparison of browser fallback behaviour
We tested five major browsers with an attack based on the forged Handshake Alert method outlined above, and found that each browser has a variant of this fallback behaviour. Both Chrome and Opera try TLS 1.2 three times before trying to downgrade the maximum supported version, whereas the remainder immediately started downgrading. Curiously, Internet Explorer and Safari both skip TLS 1.1 and jump straight from TLS 1.2 to TLS 1.0.
Mitigation can take many forms: the fallback SCSV, disabling the fallback behaviour, disabling SSL 3 on the client side, disabling SSL 3 on the server side, or disabling CBC cipher suites in SSL 3. Each solution has its own problems, but the current trend is to disable SSL 3 entirely.
Disabling only the CBC cipher suites in SSL 3 leaves system administrators with a dilemma: RC4 is the only other practical choice, and it has its fair share of problems, making it an undesirable alternative. The SCSV requires support from both clients and servers, so it may take some time before it is deployed widely enough to mitigate this vulnerability; it is also unlikely to be applied to legacy browsers such as IE 6.
Apache httpd can be configured to disable SSL 3 as follows:
SSLProtocol +TLSv1 +TLSv1.1 +TLSv1.2 -SSLv2 -SSLv3

Microsoft IIS and nginx can also be configured to avoid negotiating SSL version 3.
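For comparison, a sketch of the equivalent nginx setting (exact behaviour depends on the nginx version in use): SSL 3 is avoided simply by omitting it from the ssl_protocols directive.

```nginx
# Inside the relevant server block: list only the TLS versions,
# so SSLv3 is never negotiated.
ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
```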
Firefox can be configured to disable support for SSL 3 by altering security.tls.version.min from 0 (SSL 3) to 1 (TLS 1) in about:config.
Internet Explorer can also be configured to disable support using the Advanced tab in the Internet Options dialogue (found in the Control Panel). In a similar way, IE 6 users can also enable support for TLS 1.0.
Chrome can be configured to not use SSL 3 using a command line flag, --ssl-version-min=tls1.
You can check which SSL sites are still using SSL 3 using the Netcraft Site Report:
Apache has been the most common web server on the internet since April 1996, and is currently used by 38% of all websites. Most nefarious activity takes place on compromised servers, but just how many of these Apache servers are actually vulnerable?
The latest major release of the 2.4 stable branch is Apache 2.4.7, which was released in November 2013. However, very few websites claim to be using the stable branch of 2.4 releases, despite Apache encouraging users to upgrade from 2.2 and earlier versions.
Less than 1% of all Apache-powered websites feature an Apache/2.4.x server header, although amongst the top million websites, more than twice as many sites claim to be using Apache 2.4.x. Some of the busiest websites using the latest version of Apache (2.4.7) are associated with the Apache Software Foundation and run on the FreeBSD operating system, including httpd.apache.org, www.openoffice.org, wiki.apache.org, tomcat.apache.org and mail-archives.apache.org.
The most recent security vulnerabilities affecting Apache were addressed in version 2.4.5, which included fixes for the vulnerabilities described in CVE-2013-1896 and CVE-2013-2249. Depending on which Apache modules are installed, and how they are used, earlier versions may be vulnerable to unauthorised disclosure of information and disruption of service. The previous release in the 2.4 branch (2.4.4) also addressed several cross-site scripting (XSS) vulnerabilities in various modules; such vulnerabilities can severely compromise a web application by facilitating remote session hijacking and the theft of user credentials. Nonetheless, millions of websites still appear to be using vulnerable versions of Apache, including versions which are no longer supported.
Top 15 versions of Apache in February 2014, where the full version string is announced in the Server HTTP response header.
Note that no versions of the Apache 2.4 branch appear within the top 15.
Apache 1.3.41 and 2.0.63 have both reached end of life.
The Apache 2.0 branch was retired in July 2013 with the final release, Apache 2.0.65. This release addressed a few security vulnerabilities, but any vulnerabilities discovered from now on will not be fixed by official patches or further releases in the 2.0 branch. Anyone still using this branch should strongly consider updating to the latest version in the stable 2.4 or legacy 2.2 branches.
Nevertheless, 6.5 million websites claim to be using the end-of-life 2.0 branch of Apache, with the most common versions being 2.0.63 and 2.0.52. Only 12,000 sites are running the final release of this branch (2.0.65). However, it is worth noting that just over half of all Apache-powered websites hide their version numbers, so it is not always possible to accurately determine which version is installed without carrying out additional tests. Hiding software version numbers is usually a deliberate act by a server administrator – Apache 2.4.7 reveals its full version number by default when installed on Arch Linux, and installing the apache2 package on the latest version of Ubuntu Linux also reveals "Apache 2.4.6 (Ubuntu)" as the default Server banner.
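Administrators who do wish to hide the version number can typically do so with Apache's ServerTokens directive. A minimal sketch, assuming access to the main server configuration:

```apache
# "Prod" reduces the Server banner from e.g. "Apache/2.4.7 (Ubuntu)"
# to just "Apache", revealing no version number.
ServerTokens Prod
```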
Due to hidden version numbers, the number of sites openly reporting to be running Apache 2.4.x could be regarded as a lower bound, but conversely, exhibiting a vulnerable version number does not necessarily mean that a server can be exploited by a remote attacker.
For example, the Red Hat Linux operating system uses a backporting approach to applying security fixes, which means that a vulnerability in Apache 2.2.3 can be patched without affecting the apparent version number of the software. From an external point of view, the server will still appear to be running Apache 2.2.3, but it might not be vulnerable to any security problems that would affect a fresh installation of Apache 2.2.3.
Red Hat 5 and 6 use Apache 2.2.3 and 2.2.15 respectively, which explains why these seemingly old versions remain so prominent today (2.2.3 was originally released in July 2006). Both are still supported by Red Hat, and provided the necessary backported patches have been applied, Red Hat Apache servers which exhibit these version numbers can be just as secure as the latest release of Apache. However, because the version numbers correspond to Apache versions which were released several years ago, it is not unusual for Red Hat-powered websites to attract unfair criticism for appearing to run insecure versions of Apache.
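Such criticism typically stems from naive banner checks along the lines of the following Python sketch. A check like this can only report what the server announces, not which backported fixes are actually installed; the function name is our own, not an established tool.

```python
# Naive banner parsing: extracts the version a server *claims* to run.
# A backported Red Hat build can announce "Apache/2.2.3" while carrying
# current security fixes, so this proves nothing about exploitability.
import re

def announced_apache_version(server_header):
    """Extract the advertised Apache version, or None if it is hidden."""
    match = re.match(r"Apache/(\d+(?:\.\d+)*)", server_header)
    return match.group(1) if match else None

announced_apache_version("Apache/2.2.3 (Red Hat)")  # "2.2.3" - possibly fully patched
announced_apache_version("Apache")                  # None - version hidden
```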
Certain Apache vulnerabilities can also be eliminated by removing or simply not using the affected modules – a configuration which is also difficult to ascertain remotely. However, exhibiting an apparently-vulnerable version number can still have its downsides, even if there are no vulnerabilities to exploit – as well as attracting unwarranted criticism from observers who falsely believe that the server is insecure, it could also attract undesirable scrutiny from hackers who might stumble upon different vulnerabilities instead. These are both common reasons why server administrators sometimes opt to hide version information from a web server's headers. Sites which do this include wikipedia.org, www.bbc.co.uk, www.nytimes.com and www.paypal.com, all of which claim to be running Apache, but do not directly reveal which version.
A further 6.0 million websites are still using Apache 1.3.x, even though the final version in this branch was released four years ago. The release of Apache 1.3.42 in February 2010 marked the end of life for the 1.3 branch, although 2.4 million sites are still using the previous version (1.3.41), which contains a denial of service and remote code execution vulnerability in its mod_proxy module.
The busiest site still using Apache 1.3 is Weather Underground, which uses Apache 1.3.42. This currently has a Netcraft site rank of 177, which makes it even more popular than the busiest Apache 2.0.x website. It is served from a device which exhibits the characteristics of a Citrix NetScaler application delivery controller. Weather Underground also uses Apache 1.3.42 for the mobile version of its site at m.wund.com.
Amongst the million busiest websites, Linux is by far the most common operating system used to run Apache web server software. With near-ubiquitous support for PHP, such platforms make tempting targets for fraudsters. Most of the phishing sites analysed by Netcraft rely on PHP to process the content of web forms and send emails.
The Audited by Netcraft service provides a means of regularly testing internet infrastructure for similarly vulnerable web server software, faulty configurations, weak encryption and other issues which would fail to meet the PCI DSS standard. Netcraft's heuristic fingerprinting techniques can often use the behaviour of a web server to identify which version of Apache is installed, even if the server does not directly state which version is being used. These automated scans can be run as frequently as every day, and can be augmented by Netcraft's Web Application Security Testing service, which provides a much deeper manual analysis of a web application by an experienced security professional.
At the start of the first US Government shutdown since 1996, an SSL certificate used on barackobama.com has expired. Issued by Go Daddy in September 2012, the SSL certificate for *.barackobama.com and barackobama.com was used by Organizing for Action, a non-profit grassroots organisation aligned with Obama's political policies. Whilst not directly associated with the US Government, the expiry of the SSL certificate for barackobama.com during a US Government shutdown is nonetheless a curious coincidence.
Warning in Google Chrome when visiting a website using the SSL certificate for *.barackobama.com.
Several SSL certificates controlled by the US Government expired today and are still being used — for example, the SSL certificates used on both ui.tn.gov and webmail.coop-uspto.gov have expired and may not be replaced any time soon. Furthermore, there are at least 30 US Government sites still using SSL certificates that are scheduled to expire before Friday.
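Expiry checks of this kind are straightforward to automate: the notAfter field of a certificate, as returned by Python's ssl.SSLSocket.getpeercert(), can be compared against the clock. A minimal sketch, using an illustrative date rather than the actual expiry date of any certificate mentioned above:

```python
# A minimal sketch of an automated certificate expiry check using
# Python's standard ssl module. The notAfter string is illustrative.
import ssl
import time

def is_expired(not_after, now=None):
    """not_after uses the format found in getpeercert() output,
    e.g. 'Sep 30 23:59:59 2013 GMT'."""
    expiry = ssl.cert_time_to_seconds(not_after)
    return (now if now is not None else time.time()) > expiry

is_expired("Sep 30 23:59:59 2013 GMT")  # True: any past date reads as expired
```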