Dave King, Product Manager at Future Facilities, today published an article in Data Center Knowledge’s Industry Perspectives column on data center simulation and how it has evolved over time. From the article:
It’s been said that CFD provides a historical view of the airflow in a data center, one that is probably out of date by the time the report is produced. This view of CFD as a snapshot of the past misses the real power of the technology: prediction. Unfortunately, this line of thinking seems to be widely held within the industry. I’ve lost count of the number of conversations I’ve had at conferences with data center operators who have said something along the lines of, “Why do I need someone to perform a CFD study to show me what my facility looked like two weeks ago? I have sensors that can tell me what’s happening right now.” This perception hasn’t come about by accident.
In the Beginning
CFD first entered data centers about 10 to 15 years ago when power densities started to rise. When IT equipment failed due to thermal problems, operators found it very difficult to understand why because they lacked the data to analyze the problem. This is where CFD came in: Operators engaged engineering consultants to model their facilities and tell them what was going wrong.
The consultant would return after about three weeks with a report on the thermal environment in the facility. Invariably, these reports contained temperature planes or temperature maps of the space. For many operators, this was the first time they had been able to visualize the facility environment and see how conditions varied within it, which offered great value in itself.
In addition, the CFD simulation allowed the source of issues to be traced, giving deep insight into how the facility was performing. The consultant would work with the operator to find a solution and then show it working in the model before being implemented, fully using the predictive power of the technology.
Developing Real-time Data
As time went on, monitoring systems that gave operators the ability to see what was happening in the data center in real time started to appear on the market. The manufacturers of these systems had to find a way of presenting the data from many (probably at least 100) individual sensors in an easy-to-digest way. They chose to use interpolation to join the dots between sensors and create temperature maps, which looked very much like the outputs from the CFD models that operators were used to seeing.
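To make the idea concrete, here is a minimal sketch of that interpolation step using SciPy’s `griddata`. The sensor positions and temperature readings below are entirely hypothetical, and real monitoring systems may use different interpolation schemes; the point is only to show how scattered sensor readings become a continuous-looking temperature map.

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical sensor readings: (x, y) positions in meters, temperatures in °C.
sensor_xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0], [10.0, 8.0], [5.0, 4.0]])
sensor_temp = np.array([22.0, 24.0, 21.0, 27.0, 23.0])

# Build a regular grid covering the room and interpolate between the sensors.
gx, gy = np.meshgrid(np.linspace(0.0, 10.0, 50), np.linspace(0.0, 8.0, 40))
temp_map = griddata(sensor_xy, sensor_temp, (gx, gy), method="linear")

# A monitoring UI would render temp_map as a colored plane. Note that the
# map can only reflect what the sensors currently see -- unlike a CFD model,
# it cannot predict conditions under a changed layout or load.
```

The key limitation is visible in the code: the map is purely a smoothed view of present-moment readings, which is why it answers “what is happening?” but not “why?” or “what if?”.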
At this point, it’s worth thinking about the primary question operators were really asking when having a CFD analysis performed: What is happening in my data center? They may have received answers to “why is this happening?” and “what will happen if we do this?” as a bonus from the CFD model, but that wasn’t the main thrust of the thought process. As far as the market was concerned, the temperature maps provided by the monitoring systems already in use could answer the primary question without the need to engage an expensive consultant. They also had the added bonus of showing what was happening right now, rather than three weeks ago.
Where We are Today
Operators that were using CFD as a tool to get a snapshot of what was happening in their facility came to the conclusion that they could get almost the same information in real time through modern monitoring technology, without the expense (even though a CFD analysis will always give you more information than a monitoring system). Thus, CFD was written off as no longer necessary.
I wouldn’t necessarily disagree.
CFD is expensive and cumbersome compared to a monitoring system if all you are using it for is getting a snapshot of the conditions in your data center. But there’s the rub: The real benefits of CFD are in its ability to answer the “whys” and “what ifs.”
The introduction of monitoring systems allowed massive improvements in data center performance because they showed operators when they were exceeding limits. Rather than providing the same data, CFD modeling adds new information to the operator’s armory. Future plans can be stress tested and optimized in a way that simply is not possible with any other technology. Doing this will allow the data center envelope to be pushed further, utilizing more capacity and squeezing every last drop of efficiency out of the cooling system without risking the IT load.
Case Study: Financial Institution
To illustrate what can be achieved, I want to share an example. The goal of the project was to rip out roughly 150 old direct-cooled, glass-fronted cabinets and replace them with a more modern hot aisle/cold aisle arrangement to make better use of the available cooling. This amounted to around 50 percent of the server cabinets in the facility. At the same time, an extra 200kW of load was being migrated into the hall from server rooms in other locations, increasing the total from 900kW to 1.1MW. The work took place over the course of 20 weekends, with the rest of the data center remaining fully functional and resilient.
To begin, we simulated the end point of each of the 20 stages up front to make sure that the plans were sound. This exercise highlighted a number of cable trays in the floor that would need to be removed as they would sit directly below the new cold aisles, affecting airflow.
However, the really interesting part was once work had begun. As was always going to be the case, work quickly deviated from the plan as applications had to be kept running when they had been scheduled to be moved. We worked on-site with the project teams to update the CFD model with the work that had actually been completed each weekend and the new plan for the upcoming weekend. After this, we ran a fresh simulation to give the migration teams safe load limits for each of the new cabinets. These weekly safe limits were often significantly less than the final design load of each cabinet.
The project was completed within the estimated timeframe and without a single thermal shutdown. This was because the migration teams knew exactly where the limits were and could approach them with confidence due to having previously simulated each situation. Without the use of simulation, this would not have been the case, and limits would either have been exceeded (causing thermal shutdowns) or less equipment would have been installed each week (extending the length of the project).
Complementary Not Competitive
The data that CFD provides can enable the same leaps in data center performance that the addition of monitoring systems has achieved over the past decade. While there are understandable reasons for the market to view CFD and monitoring as competing technologies, they are in fact completely complementary. As data center operators are asked to do more with less, they are going to need both working together to achieve their goals.