National Snow and Ice Data Center Greens Up How It Cools Off


Photo via Ron Weaver, NSIDC

One of a data center's biggest problems is keeping its servers cool, and that's true even for the National Snow and Ice Data Center (NSIDC), which offers researchers and businesses data products about Arctic and Antarctic snow and ice. Naturally, it doesn't want to contribute to the greenhouse gases (GHGs) that speed up ice melting, so it is undergoing a $600,000 renovation to change the way it cools its servers and reduce its total energy use. Check out how it is becoming one of the most energy-efficient data centers in the country.

The NSIDC is located at the University of Colorado at Boulder, which is already a fairly cool place. Until now, the data center did what most do: it used a lot of electricity to chill air and push it over the servers, constantly recycling the same air. Recently, though, a smarter idea was put on the table, one that more and more data centers in cooler climates are adopting as a strategy. Why not use the cold air that's already outside to cool the servers, and push the warmed air back outdoors to cool naturally?
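That free-cooling strategy is simple enough to sketch in a few lines of code. The snippet below is only an illustration of the idea described above, not NSIDC's actual control system, and the temperature threshold is an assumed value chosen for the example:

```python
# Minimal sketch of the "use cold outside air" idea (illustrative only).
# The threshold below is an assumption for the example, not a real NSIDC setpoint.

FREE_COOLING_MAX_C = 18.0  # assumed: outside air at or below this can cool the servers


def choose_cooling_mode(outdoor_temp_c: float) -> str:
    """Return the cooling mode to use for a given outdoor temperature."""
    if outdoor_temp_c <= FREE_COOLING_MAX_C:
        # Pull in cold outside air and exhaust the warmed air back outdoors.
        return "free outside air"
    # Otherwise fall back to conventional, compressor-based cooling.
    return "mechanical cooling"


if __name__ == "__main__":
    for temp_c in (-10.0, 5.0, 16.0, 30.0):
        print(f"{temp_c:6.1f} C outside -> {choose_cooling_mode(temp_c)}")
```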

National Geographic reports that the data center needs about 100 kilowatt-hours of fossil-fuel power every hour, a continuous draw of roughly 100 kilowatts, to process data on the state of the world's frozen regions. That's roughly the amount of electricity it takes to power about 80 average U.S. homes, according to the latest figures, and about half of that power is spent not to crunch data, but just to cool the equipment. With its $600,000 investment in renovating the system, the center expects a 90% reduction in cooling costs.
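Those figures are easy to sanity-check with a bit of arithmetic. The short calculation below uses the numbers quoted above; the average-household consumption figure is an assumption based on typical U.S. values:

```python
# Back-of-envelope check of the figures reported above (illustrative only).

data_center_load_kw = 100            # continuous draw reported by National Geographic
hours_per_year = 24 * 365

annual_kwh = data_center_load_kw * hours_per_year        # about 876,000 kWh per year

# Assumption: an average U.S. home uses roughly 11,000 kWh per year.
avg_home_kwh_per_year = 11_000
equivalent_homes = annual_kwh / avg_home_kwh_per_year    # works out to about 80 homes

cooling_share = 0.5                                      # "about half" goes to cooling
cooling_load_kw = data_center_load_kw * cooling_share    # roughly 50 kW just for cooling

print(f"Annual consumption: {annual_kwh:,.0f} kWh")
print(f"Equivalent homes:   {equivalent_homes:.0f}")
print(f"Cooling load:       {cooling_load_kw:.0f} kW")
```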

Through virtualization and higher-density storage, the data center has cut its server count by 60% from the original 70 servers, according to NatGeo. A 25 kW solar array on the building's roof will help reduce how much fossil-fuel power it takes to keep what remains cool. And using outside air for cooling, along with evaporative cooling on warm days, should cut energy use enough that the renovation pays for itself in just three years.
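For a rough sense of scale, here is the same kind of back-of-envelope math applied to the reductions described above. The solar capacity factor is an assumption, and treating the 90% cost reduction as a 90% energy reduction is a simplification:

```python
# Rough illustration of the reductions described above (not official NSIDC figures).

original_servers = 70
reduction_fraction = 0.60
remaining_servers = round(original_servers * (1 - reduction_fraction))  # about 28 servers

cooling_load_kw = 50              # roughly half of the 100 kW total, per the article
cooling_reduction = 0.90          # the projected 90% cut in cooling costs
hours_per_year = 24 * 365
cooling_kwh_saved = cooling_load_kw * cooling_reduction * hours_per_year

solar_array_kw = 25
assumed_capacity_factor = 0.18    # assumption: typical for fixed rooftop PV in Colorado
solar_kwh_per_year = solar_array_kw * assumed_capacity_factor * hours_per_year

print(f"Servers remaining:      {remaining_servers}")
print(f"Cooling energy avoided: {cooling_kwh_saved:,.0f} kWh/year")
print(f"Rooftop solar output:   {solar_kwh_per_year:,.0f} kWh/year (estimate)")
```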

Technology for reducing data center energy use is getting better all the time, and much of it stems from common sense. Location is one example: siting data centers in cooler climates or where renewable energy is readily available, along with how server rooms and racks are designed and even the shape of the building itself. Virtualization, which reduces the number of servers a single data center needs; solid-state storage, which cuts the energy required to keep data accessible; and software that minimizes how long servers need to run at any given time are other pieces of a very large puzzle. Our need for data centers is only going to grow, so the innovation in the industry is very encouraging.

Follow Jaymi on Twitter for more stories like this
More on Efficient Data Centers
Energy Star Hands Out First Ever Certification For a Data Center
HP Opens World's First 100% Air-Cooled Data Center
Microsoft Looking at Ultra-Modular Data Centers as Ultra-Efficient Solution
HP Turns Dairy Farm Poop into Data Center Power

Tags: Computing | Corporate Responsibility | Energy Efficiency
