Thursday, 22 December 2011 18:10

World's Data Centers Refuse to Exit Ice Age

Some data-center operators are happy to feel a chill in the air. Photo: Scootie/Flickr

There’s an old vaudeville joke about a man who walks into a doctor’s office. And it goes something like this:

“Doctor,” the man says, “it hurts when I do this.”

“Then don’t do that,” the doctor replies.

Amazon data-center guru James Hamilton alluded to this gag at a recent event in New York, using it to explain how data centers should treat power-sapping water chillers and air-conditioning units. If you’re running a data center, air conditioning hurts, he said, so you shouldn’t do it. Air conditioning is expensive, he explained, and in many cases it’s unnecessary. He advised data-center operators to run their facilities at higher temperatures and use outside air for cooling.

Traditionally, data centers have been kept very cold to protect servers and other equipment, but Hamilton is part of a growing chorus telling operators they’ve gone too far. Google has advocated raising data-center temperatures for years, and just about all the big-name web outfits — including Google, Facebook, and Microsoft — have shown the world how they’ve managed to cool their data centers with outside air.

As the internet expands and businesses bring more and more data online, it’s more important than ever that data centers keep their power consumption down. This not only saves money, it eases the burden on the environment, and raising temperatures is one rather easy way to save power. The trouble is that many data-center operators still refuse to dial up the savings. They’re afraid of damaging equipment. They’re afraid of voiding warranties. They’re afraid of change.

The Big Chill

The frigid temperatures in most data centers are a legacy of the days when computer equipment was sensitive to even moderate heat and computer rooms were routinely chilled to the 60s. “I was recently in a mid-size data center that was so cold the operators were wearing sweaters when they were inside,” says Rich Fichera, a vice president and principal analyst at market research firm Forrester Research.

Data center operators have plenty of guidance telling them that they can safely operate at higher temperatures. The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) has set a recommended range of 64.4 to 80.6 degrees Fahrenheit for the temperature of air entering computer equipment. The organization plans to expand that range early next year.

Meanwhile, many equipment makers are pushing the upper bounds of their allowable temperatures into the 90s. “Vendors have already begun to ease their warranty restrictions,” says John Stanley, an analyst at market research firm 451 Research. “Dell has begun to warrant certain server models to run hotter for limited periods of time. They allow these servers to run up to 900 hours per year at 104 degrees Fahrenheit, and up to 90 hours per year at 113 Fahrenheit.”
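
To see what warranty terms like those mean in practice, here is a minimal sketch (in Python, with an invented data format) that tallies hours of inlet-temperature excursions against yearly budgets like the Dell figures Stanley cites. The thresholds come from the article; everything else is an assumption for illustration.

```python
# Illustrative only: tally server-inlet temperature excursions against yearly
# allowances like the Dell warranty terms quoted above. The thresholds mirror
# the article; the data format and example numbers are hypothetical.

EXCURSION_BUDGETS_F = [
    (113.0, 90),   # up to 90 hours per year at 113 degrees Fahrenheit
    (104.0, 900),  # up to 900 hours per year at 104 degrees Fahrenheit
]

def excursion_report(hourly_inlet_temps_f):
    """Given one inlet-temperature reading per hour for a year, report hours
    spent above each threshold and whether the yearly budget still holds."""
    report = {}
    for threshold, budget in EXCURSION_BUDGETS_F:
        hours_above = sum(1 for t in hourly_inlet_temps_f if t > threshold)
        report[threshold] = (hours_above, budget, hours_above <= budget)
    return report

# Example: a year that sits at 75 F except for a two-week heat event near 105 F.
year = [75.0] * (8760 - 336) + [105.0] * 336
for threshold, (hours, budget, ok) in excursion_report(year).items():
    print(f"above {threshold:.0f} F: {hours} h of {budget} h allowed -> "
          f"{'within budget' if ok else 'exceeded'}")
```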

Turning up the temperature reduces the workload of a data center’s cooling equipment, which lowers its energy bill. “This is clearly the lowest hanging fruit on cost savings and energy efficiency,” says Jim Smith, CTO of Digital Realty, one of the world’s largest data center operators.

Trinity Health, a not-for-profit healthcare provider based in Michigan, runs 25 data centers across the country. The company began experimenting with raising data center temperatures a year and a half ago, and it has since been steadily increasing the temperatures at all of its centers. “When I first came here we were originally in the low to mid 60s for temperatures,” said David Filas, a data center engineer. “We have since brought all of our temperatures up to the mid to upper 70s and had no adverse effects.”

Trinity Health is saving $10,000 to $15,000 a year on air-conditioning costs at its main data center, said Filas. “It’s something that’s easy to do and it costs nothing to do it,” he said.

A simple change data centers can make to improve energy efficiency is putting walls around their hot aisles, according to Amazon’s Hamilton. Servers in data centers are typically stacked in racks, which are positioned side-by-side in long rows. These rows are oriented front-to-front and back-to-back. Cool air is pumped into the front-facing rows. The air heats up as it moves through the servers and is exhausted into the back-facing rows. Hence cold aisles and hot aisles.

Walling off hot aisles prevents hot air from spilling over the tops of the server racks to the cold aisles. This one change can improve a data center’s power usage effectiveness (PUE) rating from 3.0 down to 2.5 or even 2.0, according to Hamilton. “It’s phenomenally powerful,” he said.
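
The arithmetic behind that claim is simple: PUE is total facility power divided by the power that actually reaches the IT equipment, so every watt of cooling overhead you remove pulls the ratio down. A back-of-the-envelope sketch in Python (the wattage figures are invented for illustration):

```python
# Back-of-the-envelope PUE calculation. PUE = total facility power / IT power,
# so a PUE of 3.0 means two watts of overhead (largely cooling) for every watt
# delivered to the servers. The load figures below are invented.

def pue(it_power_kw, overhead_power_kw):
    """Power usage effectiveness: total facility power over IT power."""
    return (it_power_kw + overhead_power_kw) / it_power_kw

it_load_kw = 1000.0
print(pue(it_load_kw, 2000.0))  # 3.0 -- heavy chilling, hot and cold air mixing
print(pue(it_load_kw, 1000.0))  # 2.0 -- hot aisles walled off, less overhead
```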

Au Naturel

Another relatively simple step data center operators can take is piping outside air into the cooling system. Traditional data center cooling systems recirculate air, moving hot air exhausted from server racks into air-cooling units and then pumping the cool air back to the servers. The exhaust temperature is usually 115°F, according to Hamilton. “So even if the outside air is 100°, it’s still the right thing,” he said. “Go get it. Go bring the outside air in.”

A growing number of data centers are using outside air directly to cool computer equipment, a practice dubbed air-side economization. A higher operating temperature means there are a greater number of days in a year when data centers can use outside air.
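
The relationship is easy to quantify: every hour the outside air is at or below the allowed server-inlet temperature is an hour the chillers can idle. A hedged sketch, assuming a list of hourly outdoor temperatures (the sample year below is synthetic):

```python
import math

# Count "free cooling" hours for a year of hourly outdoor temperatures.
# Raising the allowed inlet temperature directly increases the share of the
# year a facility can run on outside air alone. The sample data is synthetic.

def economizer_hours(outdoor_temps_f, max_inlet_f):
    """Hours in the sample when outside air alone satisfies the inlet limit."""
    return sum(1 for t in outdoor_temps_f if t <= max_inlet_f)

# Synthetic year: outdoor temperature swinging smoothly between 30 F and 95 F.
year = [62.5 + 32.5 * math.sin(2 * math.pi * h / 8760) for h in range(8760)]

for limit in (68, 75, 80):
    hours = economizer_hours(year, limit)
    print(f"inlet limit {limit} F: {hours} h of free cooling "
          f"({100 * hours / 8760:.0f}% of the year)")
```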

For organizations building new data centers, the potential savings are even higher. If you design your data center to use air-side economization and you build it in the right place, you might not have to install air conditioning at all. The key is that ASHRAE and many computer equipment makers allow temperatures outside the recommended ranges for limited periods, according to 451 Research’s Stanley.

If you build a data center in a cool climate that allows you to use outside air most of the time, being able to exceed recommended temperatures for a few weeks every summer means you can skip installing mechanical cooling systems altogether. “That’s a big savings in capital costs,” says Stanley.

Fear Factor

The reason many data center operators keep temperatures low can be summed up in two words: server mortality. High temperatures can kill servers, and the fear of losing servers is a powerful force for someone whose job requirement can be summed up in one word: uptime.

Part of the problem is that there isn’t a lot of data about the effect of temperatures on server failure rates. Running an experiment is an expensive proposition, says Rhonda Ascierto, a senior analyst for energy and sustainability technology at market research firm Ovum. “Mass server failures equals multiple millions of dollars,” she says. “Oftentimes facilities run much colder than they need to just to be ‘safe’.”

Some of the resistance is fear of change, which tends to run high in risk-averse populations like data center operators. “For decades people expected to be freezing when they walk into a data center,” says Forrester’s Fichera. “Ingrained behavior is hard to change. But it’s becoming easier as power gets more and more expensive.”

Fear can also lead to faulty logic. Some data center operators keep temperatures low to give themselves time to take action in the event their cooling systems fail, says Trinity Health’s Filas. This doesn’t make sense because you’re only buying yourself 5 to 10 minutes, 15 at the most, he said. “That’s not enough time to drive in to the office,” he said. “It’s not enough time to call in an HVAC contractor. It’s not enough time to react at all.”
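
Rough physics backs Filas up. The air in a data hall has very little thermal mass compared with the heat the servers dump into it, so a colder starting point buys only a few extra minutes. A simplified, worst-case estimate in which the room air is the only heat sink (all of the room and load figures are assumptions):

```python
# Rough estimate of ride-through time after a cooling failure: how long until
# the room air warms from its starting temperature to the inlet limit if all
# the IT heat stays in the air. Every figure below is an assumption chosen
# only to show the order of magnitude Filas describes.

AIR_DENSITY_KG_M3 = 1.2   # approximate density of air
AIR_CP_J_KG_K = 1005.0    # specific heat of air, J/(kg*K)

def ride_through_minutes(it_load_kw, room_volume_m3, start_f, limit_f):
    """Minutes until room air warms from start_f to limit_f, assuming the IT
    load heats the room air alone and nothing removes the heat."""
    delta_k = (limit_f - start_f) * 5.0 / 9.0        # Fahrenheit rise -> Kelvin
    air_mass_kg = AIR_DENSITY_KG_M3 * room_volume_m3
    absorbable_j = air_mass_kg * AIR_CP_J_KG_K * delta_k
    return absorbable_j / (it_load_kw * 1000.0) / 60.0

# Hypothetical 200 kW room with roughly 3,000 cubic meters of air and a
# 95 F limit at the server inlets.
for start_f in (65, 75):
    minutes = ride_through_minutes(200, 3000, start_f, 95)
    print(f"starting at {start_f} F: about {minutes:.1f} minutes of ride-through")
```

The exact figures matter less than the gap between them: in this model, running ten degrees colder buys less than two extra minutes before the inlet limit is reached.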

There’s also not much data about data center temperatures. According to Stanley’s anecdotal evidence, many data centers run colder than they need to. About 20 percent of Digital Realty’s customers have raised their temperatures to save energy. “Our most efficient customers are ruthlessly implementing best practices and capturing 75 to 90% of the efficiency gains available,” Jim Smith says. At the other end of the spectrum, he says, there are maybe 25 percent that will resist changes “under any circumstances.”

Limits

While some cutting-edge data center operators like Amazon are pushing temperatures above the ASHRAE recommended limits, most operators who are boosting temperatures stop in the upper 70s. One reason is a measurable trade-off: there is a point where the energy saved at the facility level is consumed by increased energy drawn by server fans. “Some of our customers are telling us that this is around 74 degrees or so,” Smith says. “We haven’t seen a good quantitative study yet, but we expect that some of our sophisticated customers have modeled this and have a view of where they should be.”
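
A toy model makes the crossover visible: chiller energy falls roughly linearly as the setpoint rises, while server fan power climbs steeply once the fans ramp up (fan power scales roughly with the cube of fan speed). The coefficients below are invented purely to put the minimum near the mid-70s Smith describes; they are not measured values.

```python
# Toy model of the facility-versus-fan trade-off. Chiller energy falls as the
# setpoint rises; server fan power rises steeply once fans spin up to cope.
# All coefficients are invented so the minimum lands near the mid-70s F the
# article mentions -- this is an illustration, not a measured model.

def total_energy_kw(setpoint_f):
    chiller_kw = max(0.0, 400.0 - 4.0 * (setpoint_f - 65.0))   # falls with setpoint
    fan_speed = 1.0 + 0.02 * max(0.0, setpoint_f - 72.0)        # fans ramp above ~72 F
    fan_kw = 60.0 * fan_speed ** 3                               # roughly cubic in speed
    return chiller_kw + fan_kw

best_f = min(range(65, 86), key=total_energy_kw)
for t in range(65, 86, 5):
    print(f"{t} F: {total_energy_kw(t):6.1f} kW total")
print(f"lowest total in this model: around {best_f} F")
```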

There are also the limits of human physiology. “I’m all for hotter data centers that save energy on cooling — and for sure most data centers are run too cold — but there comes a point of human consideration,” says Ovum’s Ascierto. “People still have to work in these facilities.”
