Running Hot and Cold: What Your Data Center Shouldn’t Do


When it comes to a data center’s power usage effectiveness (PUE), every IT manager wants to get as close as possible to the perfect score of 1.0, where every watt the facility draws goes to IT equipment. One of the biggest causes of a high PUE is inefficient physical infrastructure: how your racks, cooling, and airflow are laid out. Just like in relationships, the last thing you want is a data center running hot and cold.
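As a quick illustration of what that score means, PUE is simply total facility power divided by the power consumed by IT equipment, so every kilowatt spent on cooling and power distribution pushes the number above 1.0. The figures below are hypothetical, not measurements from any particular facility:

    # Hypothetical numbers for illustration: PUE = total facility power / IT power.
    it_load_kw = 1000       # servers, storage, and network gear
    cooling_kw = 450        # cooling plant working hard against mixed air
    power_losses_kw = 50    # UPS and distribution losses

    total_facility_kw = it_load_kw + cooling_kw + power_losses_kw
    pue = total_facility_kw / it_load_kw
    print(f"PUE = {pue:.2f}")   # 1.50 -- everything above 1.0 is pure overhead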

Read on to learn why mixing hot and cold air hurts data center efficiency, and discover techniques to improve it.

How do hot and cold air drive down efficiency?

Data center efficiency is heavily impacted by airflow. When a data center runs hot and cold, it increases the load on both IT equipment and facility systems, and hot and cold air each cause problems in their own way. Hot air entering a server’s intake forces its cooling fans to speed up to compensate, and sustained high intake temperatures can lead to internal component damage, downtime, and voided product warranties.

Cold air, on the other hand, drives up your facility’s operating costs when it is used to mask airflow problems. If there are hot spots or other ineffective setups, the building’s cooling system ends up working overtime to compensate. The cost to you? A higher PUE and higher electricity bills.
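To put rough numbers on that, here is a minimal sketch comparing an overcooled facility with one that keeps its airflow contained. The kilowatt figures and electricity rate are assumptions for illustration only:

    # Assumed figures for illustration: same IT load, two cooling scenarios.
    it_load_kw = 1000
    rate_per_kwh = 0.10          # assumed electricity price in USD
    hours_per_year = 8760

    for label, cooling_kw in [("overcooled", 500), ("contained airflow", 350)]:
        total_kw = it_load_kw + cooling_kw
        pue = total_kw / it_load_kw
        annual_cost = total_kw * hours_per_year * rate_per_kwh
        print(f"{label}: PUE {pue:.2f}, about ${annual_cost:,.0f} per year")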

What are some easy fixes to avoid this?

While hot and cold air can have a big influence on your facility, there are simple actions you can take to increase efficiency. Here are some initial steps your business can take to ensure that hot and cold air are not mixing in your data center:

  • Use a hot aisle/cold aisle row configuration. Align rows so the fronts of the racks face each other (and likewise for the rears). This is the number-one step you can take toward a more efficient data center, though it isn’t always possible if you’re retrofitting an existing facility.
  • Install blanking panels. These close off unused rack space so hot exhaust air can’t recirculate back to equipment intakes; they’re available in 1RU increments and can trim your electric bill by as much as 12 percent.
  • Use perforated tiles to deliver cool air to equipment intakes, and raised-floor grommets to seal cable cutouts so conditioned air doesn’t escape where it isn’t needed.
  • Organize your cables and remove unused or excess ones. Fewer obstructions mean cool air can circulate more freely.
  • Set cooling units to the optimum temperature. ASHRAE’s recommended IT equipment inlet temperature range is 64.4° to 80.6°F (18° to 27°C), yet many data centers operate much colder than that, overcooling equipment and creating unnecessary costs (see the sketch after this list).
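On that last point, a simple monitoring check can flag racks whose inlet temperatures drift outside the ASHRAE-recommended window. This is only a sketch: the rack names and readings below are made up, and in practice the values would come from your temperature sensors or DCIM tool:

    # Sketch only: flag inlet temperatures outside ASHRAE's recommended
    # 64.4-80.6 F (18-27 C) window. Rack names and readings are made up.
    ASHRAE_MIN_F = 64.4
    ASHRAE_MAX_F = 80.6

    inlet_readings_f = {"rack-a1": 62.0, "rack-a2": 75.5, "rack-b1": 83.1}

    for rack, temp_f in inlet_readings_f.items():
        if temp_f < ASHRAE_MIN_F:
            print(f"{rack}: {temp_f} F -- overcooled, wasting energy")
        elif temp_f > ASHRAE_MAX_F:
            print(f"{rack}: {temp_f} F -- too hot, check for hot/cold air mixing")
        else:
            print(f"{rack}: {temp_f} F -- within the recommended range")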

Taking Steps Toward Efficiency

Hot and cold air mixing can prevent your data center from performing at peak efficiency. Keep your airflow contained, whether through hot-aisle or cold-aisle containment, so your equipment and devices can run at the optimum temperature. That perfect PUE score is within sight.
