With its 7,000-square-foot datacentre nearing capacity, NetApp decided last year to squeeze more out of the space. By consolidating servers and replacing older hardware, the company created an energy-efficient, high-density facility with superior server utilisation.
That, in and of itself, is a worthy sustainable-tech project, but special kudos go to NetApp's unsung heroes who dealt with the upgraded datacentre's dirty little secret -- a whole lot of hot air.
NetApp, a Silicon Valley-based maker of storage and data management products, originally built its datacentre in 2004. To prevent servers from overheating, the basic design had racks arranged in hot-and-cold aisles for collecting and expelling hot air, as well as air handlers equipped with economisers.
"Our design in 2004 was considered somewhat leading edge," says Ralph Renne, site operations manager at NetApp, adding that wiring, cabling, and air-conditioning systems were placed near the ceiling instead of under a raised floor, since cold air falls and hot air rises.
The air-conditioning units, though, operated at an inefficient 8-degree Fahrenheit Delta T (the temperature difference between the supplied cool air and the returning hot air). The narrow 8-degree Delta T meant that the overall room temperature had to be kept at a cool 55 degrees to remove heat from the room adequately.
If the outside air was 55 degrees or below, economisers (which control the use of outside air) would kick in. If the outside air was between 55 and 65 degrees, the economisers would blend this air with cool air generated from the air conditioners to achieve 55 degrees. If the outside air was 65 degrees or higher, the cooling system would have to bypass outside air by recirculating cool air inside the datacentre -- the costliest and most energy-intensive option.
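The three-way decision described above can be sketched as a simple threshold rule. This is an illustrative reconstruction of the logic as the article describes it, not NetApp's actual control system; the function name and parameters are assumptions, with the pre-upgrade thresholds (55-degree supply target, 65-degree full-bypass point) as defaults.

```python
def cooling_mode(outside_temp_f, setpoint_f=55, bypass_f=65):
    """Pick a cooling mode based on outside air temperature (Fahrenheit).

    Hypothetical sketch of the economiser rules the article describes:
    free cooling below the setpoint, blending in between, and full
    recirculation of chilled air at or above the bypass threshold.
    """
    if outside_temp_f <= setpoint_f:
        return "outside-air"   # free cooling: outside air alone
    if outside_temp_f < bypass_f:
        return "blend"         # mix outside air with chilled air
    return "recirculate"       # costliest: recirculate chilled air only
```

Raising the room target from 55 to 67 degrees, as NetApp later did, amounts to shifting these thresholds upward, which widens the window in which the cheaper outside-air and blend modes apply.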
A dense datacentre changes the cooling rules
Transforming the facility into a high-density datacentre threw the cooling setup for a loop. The remake consisted of consolidating 343 servers to 177 via virtualisation and server retirement, and replacing 50 storage systems with 10 new ones. The footprint of the storage equipment, in particular, was reduced from nearly 25 racks to just over 5 racks. Fewer storage machines meant less demand for chilled water and electricity. Specifically, savings for this higher-density setup included 94 tons of chilled water and 494,208 kWh of electricity, or US$60,000 in annual savings.
The problem was that a higher-density design pushes out more heat in a tighter space, so NetApp needed to get better at cooling. The trick lay in segregating hot and cold air, rather than merely arranging hot and cold aisles. Aisles alone allow too much mixing, so supplied cold air quickly becomes returning hot air without having done much cooling work.
Renne and his team placed wireless temperature sensors on 388 server racks to gain a better understanding of hot and cold air spaces in the new high-density layout. They used vinyl curtains to enclose hot aisles so that hot and cold air couldn't mix as freely. They installed a cogeneration system that captures waste heat and turns it into electricity to power chillers.
No more sweaters in July
The new cooling design reduced hot and cold air mixing and increased the Delta T to nearly 22 degrees. With more efficient expulsion of hot air and better use of cold air, the overall datacentre no longer had to be kept at a chilly 55 degrees. The new room temperature target: 67 degrees.
Economisers could now use outside air in the 70-degree range. "This literally gets us through spring," Renne says. And during summer months, economisers can continue to blend outside air with cool air generated from air conditioners to achieve 67 degrees (although the datacentre will still need recirculated cool air on the hottest days).
By using economisers more often, NetApp not only keeps its new high-density datacentre cool but stands to save about 930,000 kWh annually, or 15 percent less energy than the old datacentre used. That translates into $105,000 in savings every year.
So how much did this new cooling design cost the company? Not much. Total capital expenditure, says Renne, was $167,000, before a $140,000 rebate from PG&E, the local utility, offered as an incentive for this kind of project. "ROI was in three months," Renne says.
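As a back-of-the-envelope check, the quoted figures are consistent with each other, assuming the rebate offsets the capital expenditure directly and savings accrue evenly through the year:

```python
# Sanity check of the cost and payback figures quoted in the article.
capex = 167_000           # total capital expenditure (USD)
rebate = 140_000          # PG&E utility rebate (USD)
annual_savings = 105_000  # annual energy savings (USD)

net_cost = capex - rebate                        # 27,000 out of pocket
payback_months = net_cost / (annual_savings / 12)
# about 3.1 months -- matching Renne's "ROI was in three months"
```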