Free Cooling: the Server Side of the Story
by Johan De Gelas on February 11, 2014 7:00 AM EST
Posted in: Cloud Computing, IT Computing, Ivy Bridge EP
Data centers are the massive engines under the hood of the mobile internet economy. And it is no secret that they demand a lot of energy: with power capacities ranging from 10MW to 100MW, they can draw up to 80,000 times more power than a typical US home.
And yet, you do not have to be a genius to figure out how these enormous energy bills could be reduced. The main energy gobblers are the CRACs (Computer Room Air Conditioners) or their alternative, the CRAHs (Computer Room Air Handlers). Most data centers still rely on some form of mechanical cooling. And to an outsider it looks pretty wasteful, even stupid, that a data center consumes energy to cool servers down while the outside air in a mild climate is more than cold enough most of the time (less than 20°C/68°F).
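To make the "cold enough most of the time" claim concrete, here is a minimal sketch that counts how many hours in a day fall below a free-cooling threshold. The 20°C cutoff comes from the article; the hourly temperature readings are invented for illustration and roughly mimic a mild-climate day.

```python
# Threshold from the article: outside air below ~20°C is cold enough
# for free cooling of a typical data center.
FREE_COOLING_LIMIT_C = 20.0

def free_cooling_fraction(hourly_temps_c):
    """Fraction of hours in which outside air can be used directly."""
    eligible = sum(1 for t in hourly_temps_c if t < FREE_COOLING_LIMIT_C)
    return eligible / len(hourly_temps_c)

# Hypothetical mild-climate day: cool night, warm mid-afternoon peak.
sample_day = [12, 11, 10, 10, 9, 9, 10, 12, 14, 16, 18, 20,
              21, 22, 22, 21, 20, 18, 17, 16, 15, 14, 13, 12]

print(free_cooling_fraction(sample_day))  # 0.75 for this made-up day
```

With real hourly weather data for a site, the same calculation gives the fraction of the year an air-side economizer could run without mechanical cooling.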
There are quite a few data centers that have embraced "free cooling" completely, i.e. cooling with cold outside air. Microsoft's data center in Dublin uses large air-side economizers and makes good use of the lower temperature of the outside air.
Microsoft's data center in Dublin: free cooling with air economizers (source: Microsoft)
The air-side economizers bring outside air into the building and distribute it via a series of dampers and fans. Hot air is simply flushed outside. As mechanical cooling typically accounts for 40-50% of a traditional data center's energy consumption, it is clear that enormous energy savings are possible with "free cooling".
Air economizers in the data center
This is easy to illustrate with the most important - although far from perfect - benchmark for data centers, PUE or Power Usage Effectiveness. PUE is simply the ratio of the total amount of energy consumed by the data center as a whole to the energy consumed by the IT equipment alone. Ideally it is 1, which means that all energy goes to the IT equipment. Most data centers that host third-party IT equipment are in the range of 1.4 to 2. In other words, for each watt consumed by the servers/storage/network equipment, 0.4 to 1 watt is needed for cooling, ventilation, UPS, power conversion and so on.
The "single-tenant" data centers of Facebook, Google, Microsoft and Yahoo that use "free cooling" to its full potential are able to achieve an astonishing PUE of 1.15-1.2. You can imagine that the internet giants save massive amounts of energy this way. But as you have guessed, most enterprises and "multi-tenant" data centers cannot simply copy the data center technologies of the internet giants. According to a survey of more than 500 data centers conducted by The Uptime Institute, the average PUE rating for data centers is 1.8. There is still a lot of room for improvement.
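The gap between those PUE figures is easy to put in watts. A small sketch, using the survey average of 1.8 and the hyperscale figure of 1.15 from the text; the 10 MW IT load is an assumed example within the capacity range mentioned earlier.

```python
def facility_power_mw(it_load_mw, pue):
    """Total facility draw (MW) for a given IT load and PUE.

    PUE = total facility energy / IT equipment energy, so
    total = IT load * PUE.
    """
    return it_load_mw * pue

def overhead_per_it_watt(pue):
    """Watts of cooling/UPS/conversion overhead per watt of IT load."""
    return pue - 1.0

# Hypothetical 10 MW IT load, multi-tenant average vs. free-cooled giant:
average     = facility_power_mw(10, 1.8)   # 18.0 MW total
free_cooled = facility_power_mw(10, 1.15)  # 11.5 MW total
savings     = average - free_cooled        # 6.5 MW of overhead avoided
```

For the same IT work, the free-cooled facility draws 6.5 MW less from the grid, which is why the internet giants chase every tenth of a point of PUE.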
Let's see what the hurdles are and how buying the right servers could lead to much more efficient data centers and ultimately an Internet that requires much less energy.
lwatcdr - Thursday, February 20, 2014
Here in South Florida it would probably be cheaper. The water table is very high and many wells are only 35 feet deep.
rrinker - Tuesday, February 11, 2014
It's been done already. I know I've seen it in an article on new data centers in one industry publication or another.
A museum near me recently drilled dozens of wells under their parking lot for geothermal cooling of the building. Being large with lots of glass area, it got unbearably hot during the summer months. Now, while it isn't as cool as you might set your home air conditioning, it is quite comfortable even on the hottest days, and the only energy is for the water pumps and fans. Plus it's better for the exhibits, reducing the yearly variation in temperature and humidity. Definitely a feasible approach for a data center.
noeldillabough - Tuesday, February 11, 2014
I was actually talking about this today; the big cost for our data centers is air conditioning; what if we had a building up north (arctic) where the ground is always frozen, even in summer? Geothermal cooling for free, by pumping water through your "radiator".
Not sure about the environmental impact this would have, but the emptiness that is the arctic might like a few data centers!
superflex - Wednesday, February 12, 2014
The enviroweenies would scream about you defrosting the permafrost.
Some slug or bacteria might become endangered.
evonitzer - Sunday, February 23, 2014
Unfortunately, the cold areas are also devoid of people and therefore internet connections. You'll have to figure the cost of running fiber to your remote location, as well as how your distance might affect latency. If you go into permafrost areas, there are additional complications, as constructing on permafrost is a challenge. A data center high in the mountains but close to population centers would seem a good compromise.
fluxtatic - Wednesday, February 12, 2014
I proposed this at work, but management stopped listening somewhere between me saying we'd need to put a trench through the warehouse floor to outside the building, and that I'd need a large, deep hole dug right next to the building, where I would bury several hundred feet of copper pipe.
I also considered using the river that's 20' from the office, but I'm not sure the city would like me pumping warm water into their river.
Varno - Tuesday, February 11, 2014
You seem to be reporting on the junction temperature which is reported by most measurement programs rather than the cast temperature that is impossible to measure directly without interfering with the results. How have you accounted for this in your testing?
JohanAnandtech - Tuesday, February 11, 2014
Do you mean case temperature? We did measure the outlet temperature, but it was significantly lower than the junction temperature. For the Xeon 2697 v2, it was 39-40°C at 35°C inlet, 45°C at 40°C inlet.
Kristian Vättö - Tuesday, February 11, 2014
Google's usage of raw seawater for cooling of their data center in Hamina, Finland is pretty cool IMO. Given that the specific heat capacity of water is much higher than air's, it is more efficient for cooling, especially in our climate where seawater is always relatively cold.
JohanAnandtech - Tuesday, February 11, 2014
I admit, I somewhat ignored the Scandinavian datacenters as "free cooling" is a bit obvious there. :-)
I thought some readers would be surprised to find out that even in sunny California free cooling is available most of the year.