Google recently released an inside look into its data centres for the first time. Until then, the company had kept its sites under close guard, which, according to Google, was in the interests of protecting the security and privacy of “your” data that it holds.
Other opinions indicate that the primary reason for this is that Google still views its data centre empire as one of its most important advantages over the online competition, and it’s determined to keep the latest technology hidden from rivals. So, not surprisingly, the peek that they provide of “where the Internet lives” gives no technical details away.
These data centres (collectively known as Googlenet) are widely regarded as the most advanced operation on the web, but there’s still much that isn’t disclosed, including exactly how many servers Google operates worldwide. The company owns six data centres in the U.S. and three in Europe, with four more under construction (three in Asia and one in South America). But it declines to say how many other “colocation” facilities it uses, where it shares data centre space with other outfits.
Google also tends to keep its latest technology to itself, especially the networking tech used inside its worldwide data centre empire. As Google infrastructure boss Urs Hölzle explains: “we try to be as open as possible — without giving up our competitive advantage… we will communicate the idea, but not the implementation.”
What we do know is that because Google designs its own networking equipment and servers, it’s driving a massive shift in the worldwide hardware market. Google is believed to be Intel’s fifth-largest server-chip customer (a clear sign that it’s now one of the world’s largest hardware makers), but it contracts with outside manufacturers to actually build its machines. The company is thought to use a contract manufacturer located in Canada or Mexico, or perhaps South America.
It’s also at the forefront of green cooling system development for data centres, which it’s designing from the ground up. (Images of this can be seen on Google’s Green Blog.) According to Joe Kava, Senior Director of Data Centre Operations: “By providing this view into our data centre operations, we hope to inspire other companies to rethink their approaches to data centre cooling. Building our own cooling systems means we can keep our data centres cool using a fraction of the energy used by a typical data centre chiller, and that translates to reliable, carbon-neutral services you can use for free.”
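Efficiency claims like this are usually quantified with PUE (power usage effectiveness), the industry-standard ratio of total facility energy to the energy consumed by the IT equipment alone; the closer to 1.0, the less energy is lost to cooling and overhead. A minimal sketch of the calculation (the figures below are illustrative, not Google’s published numbers):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.

    A PUE of 2.0 means the facility draws one extra unit of energy
    (cooling, power distribution, lighting) for every unit the servers use.
    """
    return total_facility_kwh / it_equipment_kwh


# Illustrative comparison (hypothetical figures):
conventional = pue(total_facility_kwh=2000, it_equipment_kwh=1000)
efficient = pue(total_facility_kwh=1100, it_equipment_kwh=1000)
print(conventional, efficient)  # 2.0 vs 1.1
```

A purpose-built cooling design shows up directly in this ratio: the smaller the gap between total and IT energy, the smaller the chiller overhead the quote refers to.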
So although Google remains understandably discreet about the details of its technology, it’s apparent that the techniques employed in developing these data centres are at the cutting edge.
If you would like to know more about these data centres, contact us now.