Data center cooling advancements let you leave your jacket at home

Data centers that used to run at 55 degrees Fahrenheit are now comfortable at 75 degrees, which means your company is saving money, and your IT pros no longer need to bundle up to do their jobs.

“Put on your jacket, we’re going to the data center,” is a statement IT jockeys may no longer hear if a pair of industry groups get their way.

AHRI (the Air-Conditioning, Heating, and Refrigeration Institute) and ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers) both released standards last year to make data centers and cooling equipment more efficient and less like iceboxes. Now they're working to get customer adoption.

AHRI certifies cooling hardware for vendors, who in turn sell their systems to data center owners. The AHRI data communications group formed in 2012; engineers there examine everything from air flow to decibels to electrical power.

SEE: Data centers becoming more energy efficient, thanks in part to cloud computing (ZDNet)

“We started because the Department of Energy made notice they were going to start regulating,” certification engineer Justin Prosser said. He noted that data center giants such as Amazon are innovating faster than standards can keep up.

Every data center operator may dream of having the resources of a Facebook, Google, or three-letter agency, but few can achieve it. “The datacom cooling industry changes very quickly,” Prosser said. “Our members are following them, following their customers, and building whatever products meet their needs.”


“Quickly” is relative to the pace of government regulators, Prosser acknowledged. The Department of Energy is using standards from 2007; it’s unclear when or if the department will adopt AHRI’s 2016 version. Messages to a department spokesperson were not returned this week.

A similar situation exists for Ron Jarnagin, who led development of ASHRAE Standard 90.4 for data center energy efficiency. He said the committee that wrote the standard recently disbanded after completing its work, and will soon be replaced by a maintenance committee tasked with determining how the standard is working and helping it gain industry traction.

So far there’s not enough involvement from data center owners and operators, he said.

“Part of the issue when you’re trying to do this kind of stuff is you’ve got people out there like consulting engineers, vendors, and people like that [but] the people that own these data centers are pretty astute about what they’re doing, what works and doesn’t work,” Jarnagin observed. If industry bodies aren’t careful, “We could end up with a standard that really wasn’t right with them and may in fact cause problems for them.”

Vertiv, formerly Emerson Network Power and one of the giants of the data center cooling industry, recommends that modern data centers keep their air temperature in the low 70s Fahrenheit, compared to the mid-50s a decade ago. That dramatic change has come in just the past few years, explained JP Valiulis, Vertiv's vice president of product strategy and marketing. Newer products also apply machine learning, which lets cooling units coordinate with one another. This year Vertiv plans to expand access to those machine-generated insights so that data center administrators can get alerts through a mobile app, he added.

SEE: Ebook—IT leader’s guide to the automated enterprise (Tech Pro Research)

How much any of this matters to real-world data center operators is open for debate. As long as the equipment works, do they care about standards?

“They used to,” said Pete Sacco, a consultant and data center owner in Oakland, New Jersey. “It is important to them because it is the faith they can have that some governing body has said it’s safe to operate that way. That being said, how to execute that is all over the map today.”

Particularly in co-location data centers, “It’s important because the perception is that cooler is better. The answer is that’s not the case,” Sacco continued. In his own non-shared data center, “I allow it to go up to even the 80s, and I don’t care. In the 20 years I’ve been operating I’ve never had a failure due to heat.”