Google's top energy executive has offered some simple steps for making data centres more energy efficient, including raising the thermostat to 80 degrees Fahrenheit, or 27 degrees Celsius, to cut down on cooling costs. Data centre staff at some companies walk around in jackets because the buildings are kept so cold, said Bill Weihl, Google's "green energy czar," at the GreenNet conference. "In our facilities, the data centre guys are often wearing shorts and t-shirts," he said.
The tips he offered have been batted around at data centre conferences for a few years, but many companies are probably still not making use of them, at least not to the degree Google does at its own tightly run facilities. By taking fairly basic steps, most data centres could lower their PUE to 1.5, Weihl said, compared with an industry average of 2.0 or more. PUE, or Power Usage Effectiveness, measures the total energy a data centre consumes against how much actually reaches the IT equipment.
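PUE is simply a ratio:

    PUE = total facility energy / energy delivered to IT equipment

At a PUE of 2.0, a facility draws two watts from the grid for every watt that reaches the servers; at Weihl's 1.5 target, the same IT load cuts the building's total draw by a quarter.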
Cutting energy use in data centres has become critical for many businesses: capacity constraints, rising energy costs and the threat of carbon legislation have made the issue an urgent one.
The first place to look for savings is the cooling and power distribution systems, which account for about half of a data centre's energy consumption, Weihl said.
His first tip is to keep the cold air used to cool servers separate from the hot exhaust air the servers give off. This hot-aisle/cold-aisle containment is often done with a plastic roof laid over the server aisles and heavy plastic curtains, like those used in meat lockers, hung at each end for access.
Some companies are already doing this in large data centres. Google appears to do it even in smaller server rooms, judging from a photograph Weihl showed of a "small, couple-of-hundred-kilowatts" facility. The photo showed large metal plates as well as plastic curtains used to contain the aisles.
He also suggested running data centres at higher temperatures. Google typically runs its facilities at 80 degrees Fahrenheit, he said, compared with a norm of 70 degrees or lower. "Look at the rated inlet temperature for your hardware. If the server can handle 90 degrees then turn the heat up to 85, even 88 degrees," Weihl said.
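As a rough sketch of that advice, with entirely hypothetical hardware ratings, the setpoint ends up bounded by the least heat-tolerant equipment in the room, minus whatever safety margin the operator is comfortable with:

    # Illustrative only; ratings and margin are made up, not from the talk.
    # The cold-aisle setpoint is limited by the least heat-tolerant device.
    rated_inlet_f = {"web servers": 95, "storage arrays": 90, "switches": 104}
    margin_f = 2  # headroom below the rating; how much is a judgment call

    setpoint_f = min(rated_inlet_f.values()) - margin_f
    print(f"Cold-aisle setpoint: {setpoint_f}F")  # 88F, limited by storage

Under these made-up numbers the answer happens to land on the 88 degrees Weihl mentioned.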
His third tip, "giving your chillers a rest," requires a bit more work. Chillers are essentially giant air conditioners. They can be supplemented with fresh-air cooling, which uses outside air to cool the data centre, and with evaporative cooling towers, which use water evaporation to shed heat much as the human body does through perspiration.
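The payoff from fresh-air cooling depends on local climate, and a rough way to gauge it is to count the hours in a year when outside air is already below the cold-aisle setpoint. A minimal sketch, assuming an 80-degree setpoint and using made-up readings in place of real weather data:

    # Illustrative sketch: count hours when outside air alone could cool
    # the data centre, letting the chillers rest. In practice humidity and
    # filtration matter too; this only checks dry-bulb temperature.
    def free_cooling_hours(hourly_temps_f, setpoint_f=80):
        return sum(1 for t in hourly_temps_f if t <= setpoint_f)

    # Stand-in for the 8,760 hourly readings in a year of weather records.
    sample = [55, 62, 71, 79, 84, 91, 76, 58]
    print(free_cooling_hours(sample))  # 6 of these 8 hours qualify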
He also recommends virtualisation to improve server utilisation rates, turning on the power management tools that come with most equipment, and investing in newer, more power-efficient equipment. "In almost all cases it's worth your money to buy a more efficient component up front, then you'll save in energy costs over the life of the equipment," he said.
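That buy-efficient-up-front point is easy to test with back-of-the-envelope arithmetic. The figures below are entirely hypothetical; the PUE multiplier reflects the cooling and distribution overhead that rides on every watt the component draws:

    # Illustrative arithmetic only; prices, wattages and lifetime are made up.
    premium = 50.0       # extra up-front cost of the efficient component ($)
    watts_saved = 30.0   # reduction in power draw versus the cheaper part
    years = 4            # service life of the equipment
    kwh_price = 0.10     # assumed electricity price ($/kWh)
    pue = 1.5            # facility overhead per IT watt

    saving = watts_saved / 1000 * years * 8760 * kwh_price * pue
    print(f"${saving:.0f} saved over {years} years vs a ${premium:.0f} premium")
    # About $158 saved for a $50 premium under these assumptions.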
PCs, printers and other client devices account for the biggest share of emissions from IT and communications gear, Weihl said, citing The Climate Group's SMART 2020 report. "But the data centre side is growing a lot more quickly, and that's something we've got to get a handle on."