Secrets of successful data centres

Neal Weinberg | July 12, 2010 | Network World US

1. Security by obscurity
You don't see flashing neon signs on today's data centres. The goal is to keep as low a profile as possible.

2. Security and biometrics go hand in hand
At Navisite's data centre in Andover, Mass., for example, everyone entering the data centre must swipe a smart card and pass a sophisticated palm reader.

3. Here comes the sun
At Emerson's new data centre in St. Louis, a 7,800-square-foot rooftop solar array can generate 100 kilowatts of power.

4. Brrrrrrrring it on
At Thomson Reuters' new data centre in Eagan, Minnesota, ambient-air cooling is used for between 3,300 and 3,500 hours, or roughly 140 days, per year.

5. Hold the HOH
If a fire breaks out in a data centre, a traditional sprinkler system would put it out. It would also put the company out of commission by destroying the servers, storage equipment and the data stored on those devices. A waterless system can fight a fire with a special gas instead.

6. Hang 'em high
If you're blowing cold air up from under the floor, consider running all of the cables through the ceiling tiles. That way the cables don't interfere with the airflow.

7. Up on the roof
Putting heat exchangers on the roof lets data centre managers save on the underground copper pipes that typically connect the air conditioners to heat exchangers located on the ground near the building.

8. Cool chips
The source of all that unwanted heat, after all, is the CPU, so if you want to tackle the problem at the source, look for the latest, more energy-efficient chips. One example is Intel's Nehalem microarchitecture.

9. Attack the rack
Sticking to the theory that attacking server-generated heat closest to the source is the most efficient approach, IBM has designed a product it calls the Rear-Door Heat eXchanger. This four-inch-wide, liquid-cooled device fits on the back of a standard server rack and passively removes an estimated 55% of the heat generated by a full rack.

10. Map it
You can't develop a comprehensive plan to reduce data centre energy costs without first analysing where your hot spots and cold spots are today.

11. Sensor overload
The key to setting up and operating a successful data centre is continually monitoring the temperature, at both the ceiling and rack levels. IBM, for example, has deployed 100 sensors in a 2,000-square-foot data centre, with all that data fed into an automated monitoring system.

12. Blowing hot and cold
One of the core concepts in today's data centre is the hot-aisle/cold-aisle architecture. Cold air is pumped up from the floor into the front of the servers, and hot air is vented out the back. The hot air rises into a venting/air-conditioning system that cools it and recirculates it up through the floor. But data centres are now getting extremely granular, using variable-speed fans linked to sophisticated sensor networks to dynamically adjust the cold-air flow based on CPU usage.

13. Redundant WAN links
If your WAN links go down, your data centre is kaput. The trick is to use multiple ISPs and to run redundant physical WAN links.

14. Back it up
Things can go wrong, so it's critical to have multiple backup systems in place: offsite storage, battery backup and backup generators.

15. Get virtual
The underlying trend in today's data centre is server consolidation brought about by virtualisation technology. Companies are drastically reducing the number of data centres and the number of physical servers. On the hardware side, blade servers are letting companies squeeze more computing power into smaller spaces.
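The thermal mapping in item 10 amounts to gridding the floor, recording a temperature per cell, and flagging cells that run too far above the cold-aisle target. A toy illustration in Python, with made-up readings and an illustrative threshold:

```python
# Toy hot-spot map: the grid values, target temperature and margin are
# made-up assumptions for illustration, not measurements from any site.
TARGET_C = 24.0   # desired cold-aisle temperature
MARGIN_C = 5.0    # how far above target counts as a hot spot

# One temperature reading per floor-grid cell (degrees Celsius).
floor_map = [
    [23.1, 24.0, 29.8],
    [22.7, 31.2, 25.4],
]

# Flag every (row, col) cell whose reading exceeds target + margin.
hot_spots = [
    (row, col)
    for row, temps in enumerate(floor_map)
    for col, t in enumerate(temps)
    if t > TARGET_C + MARGIN_C
]
```

Here the two cells reading 29.8 and 31.2 degrees would be flagged, telling you where to concentrate airflow changes before touching anything else.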
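Items 11 and 12 together describe a closed loop: rack- and ceiling-level sensors feed an automated system, which drives variable-speed fans. A minimal Python sketch of one proportional control tick; the setpoint, gain, floor speed and rack names are illustrative assumptions, not IBM's or any vendor's actual control logic:

```python
# Hypothetical sensor-driven fan control (items 11 and 12). All
# constants below are illustrative assumptions for the sketch.
def fan_speed_pct(rack_inlet_temp_c: float,
                  setpoint_c: float = 24.0,
                  gain: float = 8.0) -> float:
    """Map a rack inlet temperature to a variable-speed fan duty cycle.

    At or below the setpoint the fans idle at a floor speed; above it,
    speed rises proportionally until it saturates at 100%.
    """
    floor = 30.0  # minimum duty cycle (%) to keep air moving
    excess = rack_inlet_temp_c - setpoint_c
    return max(floor, min(100.0, floor + gain * excess))

# One control tick over a few racks' sensor readings (degrees Celsius).
readings = {"rack-01": 23.5, "rack-02": 27.2, "rack-03": 31.0}
speeds = {rack: fan_speed_pct(t) for rack, t in readings.items()}
```

A real system would adjust for CPU load as the article notes, smooth the response over time, and alarm on sensor dropout, but the shape is the same: readings in, per-rack airflow out.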
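The redundant-WAN advice in item 13 implies some form of health checking, so traffic can shift to a surviving link when one ISP fails. A rough sketch, assuming two placeholder gateway addresses and a crude TCP-connect probe; real deployments would typically rely on BGP or dedicated link-monitoring gear rather than this:

```python
# Sketch of failover across redundant WAN links (item 13). The gateway
# addresses below are documentation-range placeholders, not real ISPs.
import socket

LINKS = [
    ("isp-a", "198.51.100.1"),  # primary ISP gateway (example address)
    ("isp-b", "203.0.113.1"),   # secondary ISP gateway (example address)
]

def link_is_up(gateway: str, port: int = 443, timeout: float = 2.0) -> bool:
    """Crude reachability probe: can we open a TCP connection at all?"""
    try:
        with socket.create_connection((gateway, port), timeout=timeout):
            return True
    except OSError:
        return False

def pick_active_link(links=LINKS):
    """Return the name of the first healthy link, or None if all are down."""
    for name, gateway in links:
        if link_is_up(gateway):
            return name
    return None
```

The point of the sketch is the ordering: probe every physically separate path, prefer the primary, and treat "no link answers" as its own alarm condition rather than silently retrying.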