Microsoft’s Datacenters and the Deep Blue Sea


It’s not often that datacenter cooling is big news in the consumer media. But it was yesterday: Microsoft introduced Project Natick, a research effort that places a datacenter on the ocean floor. The story was covered everywhere.

The initial research vessel is named the Leona Philpot, after a character from the Halo video game series. It was submerged and operated on the ocean floor about one kilometer off the coast of San Luis Obispo, CA from August to November of last year.

There are several advantages to putting a datacenter in the ocean. Microsoft says that almost half of the world’s population lives near the coast; serving them from nearby reduces latency and generally builds efficiencies. The ocean can also provide a natural source of power. It cuts costs as well: since there are no people on site, the associated costs – from cafeterias to parking garages – are eliminated. Real estate costs are lower, too.

While each of those advantages is appealing – and cumulatively they add up to tremendous savings – the most important driver of the project is cooling. Natural cold-water cooling saves prodigious amounts of energy.

Bob Johnson at Jilard offers good details on what Microsoft did. The Leona Philpot contains a metal pod, eight feet across, that houses a rack of servers. The servers are surrounded by pressurized nitrogen which, the piece says, aids in removing heat. Performance is tracked by more than 100 sensors.

The on-board sensors helped the research team monitor data such as motion, humidity, and pressure inside the pod. They discovered that the pod’s fans were quieter than nearby wildlife, and that heat dissipated well enough that water temperatures were elevated only within a few inches of the pod.
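Microsoft has not published the monitoring software itself, but the kind of telemetry check described above is easy to sketch. The snippet below is a hypothetical illustration in Python; the sensor names, operating limits, and the sample reading are all assumptions for the sake of example, not details from Project Natick.

```python
# Hypothetical sketch of threshold alerting on pod telemetry.
# Sensor names and limits are illustrative assumptions, not Project Natick values.

from dataclasses import dataclass


@dataclass
class Reading:
    humidity_pct: float   # relative humidity inside the nitrogen-filled pod
    pressure_kpa: float   # internal pod pressure
    vibration_g: float    # motion / vibration


# Assumed operating ranges: (low, high) for each field of Reading.
LIMITS = {
    "humidity_pct": (0.0, 20.0),
    "pressure_kpa": (95.0, 115.0),
    "vibration_g": (0.0, 0.5),
}


def check(reading: Reading) -> list[str]:
    """Return alert messages for any readings outside their assumed range."""
    alerts = []
    for name, (low, high) in LIMITS.items():
        value = getattr(reading, name)
        if not low <= value <= high:
            alerts.append(f"{name}={value} outside [{low}, {high}]")
    return alerts


if __name__ == "__main__":
    sample = Reading(humidity_pct=12.3, pressure_kpa=104.0, vibration_g=0.02)
    print(check(sample) or "all readings nominal")
```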

It is no slam dunk: Johnson points out that the salty environment is corrosive and could lead to leaks, storms could bounce the submersible around and damage the sensitive gear, and, of course, equipment can’t be fixed or replaced. Clearly, there is potential. The question is whether those liabilities can be neutralized to the extent necessary to carry the project forward.

What Microsoft announced is the highest-profile example of using water to cool datacenter equipment. It is far from the only one, however.

Another approach is being taken by Nautilus Data Technologies. The company is using the water – but not by submerging the datacenter. Its Waterborne Data Center is mounted on a barge, which is moored in a secure location.

The system employs a secondary closed-loop heat-exchange technique that does not waste water, a considerable advantage over other implementations. Nautilus claims this can save up to 130 million gallons of water a year compared to a mid-size land-based datacenter with a water-cooling architecture.

The story says that the datacenter is 30,000 square feet, with the power source, conversion, and backup systems kept on land, and that the equivalent land-based datacenter would be 80,000 square feet. That’s a bit misleading, of course: the power and backup gear still occupies space on shore, so the only footprint that truly is saved is the cooling infrastructure.

It will be interesting to see whether Nautilus eventually creates a version of the datacenter that is fully self-contained on the barge. That would enable it to move out of the way of a storm, or to travel to where it is needed, such as the site of a natural disaster.

Whenever cool-sounding but somewhat “out there” technology comes up, Google’s name generally is mentioned. This is no different. The company was granted a patent on floating datacenters back in 2009; SEO by the Sea had the story at the time. Conceptually, the idea is similar to Nautilus’ and fairly well advanced. The floating-datacenter idea often is linked to the mysterious Google barges that sporadically pop up in the news.

Several trends make water-based datacenters likely. The first, of course, is the sheer explosion of demand for datacenter capacity. The power necessary to meet that demand is overwhelming in and of itself, and it is growing in an era in which energy efficiency and carbon neutrality are paramount. Putting datacenters on or in the water seems to be a very good fit for meeting demand in an environmentally friendly way. Finally, the parallel modularization of datacenters will generate technology and know-how that will help make these concepts commercially viable.

In any case, the technology is changing because it has to. “The traditional data center model is unsustainable,” Nautilus co-founder and CEO Arnold Magcale told Energy Manager Today. “Land-based models consume massive amounts of water and energy to power the facilities, wasting natural resources at an alarming rate. Innovative and sustainable designs are critical to ensure global data centers are able to keep up with the demand of a world increasingly dependent on technology to survive and thrive.”
