This article on cooling in data centers features Upsite as a thought leader at the forefront of the conversation on cooling. From the article:
Data center cooling technologies have become a top priority for many operators, as servers and other essential equipment depend on them to function properly. Along with deploying monitoring services, management must keep servers at the appropriate temperature and strategize power usage in the face of heightened computing demands. As server room environmental conditions become a more influential factor on the bottom line, it will be up to supervisors to ensure that energy-efficiency rules and recommended temperature standards are met on a consistent basis.
Cooling solutions can require a lot of power to stabilize the server room temperature, necessitating a thorough examination of the facility to maximize energy efficiency and area coverage. Upsite Technologies recently found that only 13 percent of assessed data centers have maximized their cooling capacity, showing the low implementation of best practices. By studying cooling infrastructure utilization, Upsite aimed to identify common practices that resulted in organizations leaving money on the table. With a best-of-breed evaluation methodology, businesses can better regulate their facility using appropriate methods and ensure that they are following standards.
“Our assessment applies our knowledge of the CCF [Cooling Capacity Factor] and AFM [airflow management] best practices, and our check-out meeting and conference call ensure that a data center fully understands the gathered information to implement the best solution to fit their cooling needs and maximize their capacity,” Upsite senior engineer Lars Strong said.
Improving modern cooling capabilities
Operators have increasingly implemented temperature monitoring and other solutions to ensure that the server room remains at the proper levels. TechTarget contributor Clive Longbottom recently noted the importance of the correct temperature in the data center, mentioning that standard computer room air conditioning (CRAC) units could be useful by running at the speed required to maintain the desired temperature. Some units may also be switched off to reduce the risk of overcooling and save maintenance costs. Hot and cold aisle containment can also be effective, keeping supply and return air streams separate so that cooled air reaches the servers directly and reducing the volume of space that requires cooling. By focusing on the cooling solutions, data centers can better ensure that the facility is provisioned with the right tools to run at optimal efficiency.
“Running standard fixed-rate CRAC units at 100 [percent] capacity can build up thermal inertia, another cooling strategy that reduces costs,” Longbottom wrote. “The data center cools considerably below its target temperature, then the CRAC units are switched off. The data center is then allowed to warm up until it reaches a defined point, and the CRAC units come back on.”
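The cooling strategy Longbottom describes is essentially hysteresis control: cool below target, switch off, let the room warm to a defined point, and switch back on. A minimal sketch of that on/off logic follows; the setpoint values are illustrative assumptions.

```python
def crac_control(temp_c, crac_on, low_setpoint=20.0, high_setpoint=25.0):
    """Hysteresis control for fixed-rate CRAC units, per the strategy
    described above: cool the room below a low setpoint, switch off, and
    let it warm to a high setpoint before switching back on.

    temp_c  -- current room temperature in degrees Celsius
    crac_on -- whether the CRAC units are currently running
    Returns the new on/off state. Setpoints are illustrative assumptions.
    """
    if crac_on and temp_c <= low_setpoint:
        return False  # cooled well below target: switch the units off
    if not crac_on and temp_c >= high_setpoint:
        return True   # warmed to the defined point: switch back on
    return crac_on    # between setpoints: hold the current state

# The dead band between 20 and 25 degrees is the "thermal inertia" phase,
# during which the units stay off and draw no power.
```

The gap between the two setpoints is what yields the savings: the wider the band the equipment can tolerate, the longer the units stay off on each cycle.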