CHICAGO - It happened during one of those bitter cold weeks last winter when temperatures dropped and stayed below freezing. A data center housed on the 27th floor was running smoothly until a water pipe on the roof froze, then burst as the outside temperature started to rise.
The broken water pipe meant no air conditioning. The servers in the data center started to cook. Within minutes, the IT staff was alerted, and with the help of facilities the group managed to bring in several portable air conditioning units. The temperature in the server room had reached 90 degrees, but the group was eventually able to bring the room back to a more moderate temperature.
Some data center managers would view this as a success story. To others, it's just a band-aid away from a major disaster.
Situations like this are all too common, said Bob McFarlane, Interport Division President, Shen, Milsom & Wilke, Inc., New York, N.Y., who spoke at last week's Data Center Decisions conference. How to keep the data center cool is a growing concern for data center managers everywhere, and they're scrambling for solutions.
The problem, said McFarlane, who spent much of the conference looking more like the Pied Piper than a data center design specialist as attendees sought him out for help with specific problems, is that vendors continue to develop smaller and smaller servers with more computing power.
"Even though it's more efficient, its total power consumption is still rising," he said. "Everyone is saying it's more efficient. Well, it is. But there's so much more of it."
In some cases, two small servers can generate as much heat as a mainframe. Put 250 of those servers in one data center and you've got yourself a major cooling problem.
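The scale of that heat problem is easy to see with a little arithmetic. As a rough sketch (the per-server wattage is an assumption for illustration, not a figure from the article), nearly every watt a server draws ends up as heat that the air conditioning must remove:

```python
# Back-of-envelope cooling load for a room of small servers.
# Assumed: each server draws ~300 W, all dissipated as heat.
WATTS_PER_SERVER = 300    # illustrative assumption
SERVER_COUNT = 250        # the article's example
BTU_PER_WATT_HR = 3.412   # 1 watt of heat = 3.412 BTU/hr
BTU_PER_TON = 12_000      # 1 ton of cooling = 12,000 BTU/hr

def cooling_load(servers: int, watts_each: float) -> tuple[float, float]:
    """Return (heat in BTU/hr, tons of cooling required)."""
    btu_hr = servers * watts_each * BTU_PER_WATT_HR
    return btu_hr, btu_hr / BTU_PER_TON

btu, tons = cooling_load(SERVER_COUNT, WATTS_PER_SERVER)
print(f"{btu:,.0f} BTU/hr -> {tons:.1f} tons of cooling")
# prints: 255,900 BTU/hr -> 21.3 tons of cooling
```

Even under these modest assumptions, 250 small servers demand roughly 21 tons of cooling, and as the article notes, delivering that capacity evenly across the room is the harder part.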
More air isn't the answer
The situation is compounded by poor design of the data center. While consolidation is a great remedy for server sprawl, it wreaks havoc on servers stuffed into a room that was probably designed for only one or two large systems.
"It's like trying to cool off your kitchen by putting an air conditioner in the living room window," said McFarlane.
Instead of being able to spread out the servers so air can flow, businesses are stacking them. Blade servers, growing in popularity, are the biggest culprits, say some experts.
"It's not a matter of getting the right temperature, it's a matter of distribution – getting the right amount of cooling equally across the room," said Camilo Trujillo, network consultant for the City of Chicago. "The problem is we're trying to fit our servers in a room that was meant for something else."
Too many managers treat the problem by simply cranking up the air conditioning.
"They get more AC," said McFarlane, "but they can't get the cooling where it's supposed to be – inside the servers."
Unfortunately, McFarlane estimates that only 1% of all managers take design into consideration when setting up a data center, and even fewer think about cooling.
"They find out [that it's a problem] afterwards."
"Heating and cooling issues are killing us," said Steve Memenga, manager of infrastructure services, Bank of Montreal.
Memenga, who will be helping to design "one or two" data centers within the next several months, said about one third of his job is dealing with physical environment issues, including heating and power.
Overheating was such an issue at one point, he said, that the company had to bring in portable fans and rig them into the data center.
"We had a massive heat problem," he said.
Lost in space
Of course, the obvious solution is to spread everything out, said McFarlane. But with space at a premium, most companies are not willing to allocate more space.
The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), an association dedicated to the advancement of heating, ventilation, air conditioning, and refrigeration in commercial and residential environments, is trying to figure out a universal solution to the problem, but continues to hit brick walls. The difficulty, said McFarlane, is that the technology is changing so rapidly.
"They're trying to establish standard guidelines and they've discovered it has become more of a task than they thought it would be," he said.
The group has been able to issue guidelines to vendors, so the industry has started to get realistic information on cooling requirements, said McFarlane, although there are still problems delivering that information.
"What they struggle with is they have learned that there is no easy answer to this."
In the meantime, the data center manager needs to spend time thoroughly understanding the data center in order to make realistic predictions.
"So much of the information is so new and partly because it's inconclusive. It's experimental and it takes serious time," he said.