Protracted Availability Problems for AboveNet

The AboveNet web site is experiencing the latest in a series of outages, which began April 24 and have intensified since Tuesday, with the site either exhibiting very slow response times or being unreachable. The duration of the performance problems is unusual for a network provider the size of AboveNet (previously Metromedia Fiber Network). The company has not yet responded to an inquiry about its site performance.

AboveNet site performance

Our Hosting Provider Network Performance summary provides current information on the uptime for web sites of major hosting companies.

Interview with Miguel de Icaza, co-founder of Gnome, Ximian and Mono

Born in Mexico City, Miguel de Icaza was the driving force behind the creation of the Gnome free software desktop, and co-founded the open source company Ximian, bought last August by Novell. In July 2001, he helped start another ambitious project, Mono: a free implementation for GNU/Linux of Microsoft's .Net framework. He talks to Glyn Moody about Mono's progress, how Ximian was bought by Novell, and why he is so scared of Microsoft's Longhorn.

Q. How has your vision of Mono changed since you began the project, and what are the main aims of Mono today?

A. A lot of the things that Microsoft was addressing with .Net were touching on existing pain points for us. We've been using C and C++ way too much - they're nice, but they're very close to the machine, and what we wanted was to empower regular users to build applications for Linux. Windows has a lot of tools that address a particular problem, but on Linux we're kind of on our own in terms of development. So when Microsoft came out with this [.Net] thing, initially what we saw was very interesting, and that's how the project got started. But as people got together and started to work and collaborate on this effort, a couple of things happened.

The first one is that there was more and more momentum behind building APIs that were compatible with the Microsoft ones. Novell and Ximian were focused just on the core and C#; a lot of the people who came and contributed software to the project were interested in Windows Forms, or ASP.Net or Web services or databases, which were part of the Microsoft stack.

And at the same time we have organically grown a stack completely independent of the Microsoft stack, which we call the Mono stack; it includes things like tools for doing GUI development for Linux - that was one thing that we were very interested in, and we actually invested a lot of effort into that.

So today at the core we still have Mono, which is what we wanted to do, and now we've got two very healthy independent stacks: the Microsoft-compatible stack for people who want to bring their applications from Windows to Linux, and also this completely new and fresh stack of things that in some cases are portable from Linux to Windows, and in some cases are very, very Linux specific.

Q. Microsoft doesn't seem to be making so much noise about .Net these days. What's your view of .Net's progress at the moment, and how is it shaping up as a platform for writing software?

Continue reading

CrystalTech Hosting Bought by Financial Services Firm

Windows hosting specialist CrystalTech Web Hosting has been acquired by financial services firm Newtek Business Services. Both companies target the market for small and medium-sized businesses. CrystalTech, based in Phoenix, Ariz., hosts more than 30,000 active sites, including 25,000 running on Windows Server 2003.

CrystalTech President and CEO Tim Uzzanti said the pressure to reach new prospects in the price-sensitive hosting industry was a major factor in seeking an acquirer. "The problem is that marketing a single product or service line to what is a largely untapped market costs money, and those costs are generally passed on to the end user in the form of higher service fees or other add-ons." The deal allows CrystalTech's hosting services to be marketed to Newtek's base of existing customers.

Continue reading

Attackers Use SSL Exploit to Target Australian Banks

Attackers appear to be actively scanning for Windows servers running Secure Sockets Layer (SSL) that remain unpatched against the PCT security hole, with the most active efforts apparently targeting Australian banks.

Scanning of port 443 increased late last week, according to the SANS Institute, which urged administrators running Windows servers to install the patch issued by Microsoft. Port 443 is used by SSL, which encrypts sensitive information for e-commerce transactions. Several published exploits allow attackers to gain control of unpatched Windows SSL servers and any customer data stored on them.
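Administrators wanting to take stock of their exposure can start by enumerating which of their hosts accept connections on port 443 at all. The sketch below (a minimal illustration, not part of the SANS advisory; host names and the `port_open` helper are assumptions for the example) checks TCP reachability only - it does not test for the PCT vulnerability itself, so patched and unpatched servers look the same to it.

```python
import socket

def port_open(host: str, port: int = 443, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out, or host unreachable.
        return False

# Example: survey a list of internal hosts for listening SSL ports.
# hosts = ["www.example.com", "mail.example.com"]
# exposed = [h for h in hosts if port_open(h)]
```

Any host that answers here should be checked against Microsoft's patch list before attackers find it first.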

"Internet hackers based in Brazil, Germany and the Netherlands have launched attacks against some of Australia’s largest financial institutions over the Anzac Day long weekend," Internet Security Systems said in a press statement, saying the activity became pronounced Thursday evening. "By Friday 8 am the attacks had escalated significantly and by lunch time we became aware that hackers were trying to infiltrate many of Australia’s largest financial institutions," said ISS (Australia) Managing Director Kim Duffy. "Hackers have now developed and published three attack ‘tools’ and, as these tools become more widely available, it is expected that the target base will grow and include government and commercial."

Continue reading

Aplus.Net Admits Cloaking Customer Sites to Improve Search Ranking

Aplus.Net admitted Friday that it had manipulated customer web sites to try to improve its ranking in the Google search engine, inserting "hidden links" that made it appear that more than 17,000 sites were linking to Aplus.Net's home page. The technique may have helped achieve a first-place Google ranking for the term "dedicated servers."

The San Diego web hosting company said the links had been installed by a company hired to optimize Aplus.Net's search engine ranking, and that it had completely removed the hidden links from customer sites. "We didn't apply enough control over what our subcontractor was doing," said Ivan Vachovsky, CEO of Aplus.Net. "We have changed our procedures so that it never happens again." Aplus.Net used a technique known as "cloaking," detecting when Google's spider was visiting any of its customer sites, and then inserting HTML code with the terms "Web Hosting," "Dedicated Servers" and "Domain Names," all linked to Aplus.Net.
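The mechanics of this kind of cloaking are simple: the server inspects each request's User-Agent header and serves crawlers a different page than human visitors. The sketch below illustrates the general technique only - the article does not describe Aplus.Net's actual implementation, and the function and link targets here are hypothetical. Note that search engines treat cloaking as a guideline violation and penalize sites caught doing it.

```python
# Hypothetical illustration of user-agent cloaking: crawlers receive
# extra hidden links that ordinary visitors never see.

HIDDEN_LINKS = (
    '<div style="display:none">'
    '<a href="http://example.com/">Web Hosting</a> '
    '<a href="http://example.com/">Dedicated Servers</a> '
    '<a href="http://example.com/">Domain Names</a>'
    "</div>"
)

def render_page(base_html: str, user_agent: str) -> str:
    """Serve the normal page, but splice hidden links in for Google's crawler."""
    if "googlebot" in user_agent.lower():
        return base_html.replace("</body>", HIDDEN_LINKS + "</body>")
    return base_html

page = "<html><body><h1>Customer Site</h1></body></html>"
```

A human visiting with a browser User-Agent gets `page` unchanged; a request identifying itself as Googlebot gets the same page with the invisible link block appended, which is exactly the discrepancy crawl operators look for when detecting cloaking.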

Continue reading

Desperately seeking Web Search 2.0

It is a moot point whether the first Web era began with the announcement of the general availability of Tim Berners-Lee's initial code; with Mosaic, the first popular browser; or with Netscape Navigator, its commercial offspring and nemesis. But the Web only turned from an exciting technology into a mass medium once directories like Galaxy and Yahoo, and early search engines such as Lycos, the World Wide Web Worm and Webcrawler, provided ordinary users with something just as important as the browser, and complementing it: a way to find things.

Subsequent developments in the navigational field were largely a matter of scaling-up. Those around at the time will probably remember the excitement in early 1996 when Digital's Altavista first appeared, offering an unprecedented full-text search of no less than 16 million Web pages. The culmination of what might be called Web Search 1.0 was, of course, Google. Forget about the fancy algorithms: what really counted was the fact that it was just so much bigger than anything that had gone before.

Today, though, sheer size is not enough. It has been claimed that Google employs 100,000 computers for its search platform - making it the biggest and highest-profile deployment of GNU/Linux in the world. But its store of 4 billion pages is only 20 times the current number on the upstart search engine Gigablast, which runs on just eight servers, and which ultimately aims to index 5 billion pages.

Continue reading