Future Facilities’ Mark Seymour recently published a white paper exploring model calibration for predictive modeling of data centers, reports Cabling Installation & Maintenance. From the article:
Model refinement and calibration for data center predictive modeling
Future Facilities, a provider of data center design and operations management software, announced that Mark Seymour, the company’s data center cooling expert and chief technical officer, has published the first in a series of white papers explaining the importance of model refinement and calibration when predictively modeling a data center’s availability, physical capacity and cooling efficiency. Aimed at owner-operators, the white paper, entitled What is a Valid Data Center Model? An Introduction to Calibration for Predictive Modeling, brings clarity to an increasingly important area of data center operations.
The white paper’s executive summary is as follows:
“For many data center owner-operators, using computational fluid dynamics (CFD) simulations to predictively model the impact that future changes will have on availability, physical capacity and cooling efficiency (ACE), or to help resolve ACE problems in a data center, is second nature. And, despite the historical connotations that CFD brings to mind – a complex and intimidating solution requiring expert knowledge to use – the reality is that predictive modeling has never been simpler or easier for a layperson to take advantage of.
But the success of predictive modeling still lies ultimately in the hands of the user. Summed up colloquially as “garbage in, garbage out”, the most pressing dangers for predictive modelers are that their computer models lack fidelity and are uncalibrated. Why? Because low-fidelity models (garbage in) lead to inaccurate results (garbage out) that bear no resemblance to reality (uncalibrated). For some, the solution to the “garbage in, garbage out” challenge is not to improve the model and calibrate it, but to lazily fix the results of the model to match what is being seen in real life.”
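To make that distinction concrete, here is a minimal, hypothetical sketch in Python (not taken from the white paper): it compares measured cabinet inlet temperatures against a model’s predictions and flags where the model inputs need revisiting, rather than overwriting the predictions to match the measurements. The cabinet names, temperature values and tolerance are illustrative assumptions.

```python
# Illustrative sketch (not from the white paper): quantify how far a model's
# predictions deviate from site measurements, instead of forcing the results
# to match. Cabinet names, values and the tolerance are assumptions.

measured_inlet_temp_c = {"cab_A1": 22.4, "cab_A2": 23.1, "cab_B7": 27.8}
predicted_inlet_temp_c = {"cab_A1": 22.0, "cab_A2": 25.9, "cab_B7": 24.2}

TOLERANCE_C = 1.5  # assumed acceptance band for calibration, in degrees C

for cabinet, measured in measured_inlet_temp_c.items():
    predicted = predicted_inlet_temp_c[cabinet]
    error = predicted - measured
    status = "OK" if abs(error) <= TOLERANCE_C else "REVISE MODEL INPUTS"
    print(f"{cabinet}: measured {measured:.1f} C, predicted {predicted:.1f} C, "
          f"error {error:+.1f} C -> {status}")
```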
“That renders the model useless,” writes Seymour. “Instead, owner-operators and consultants must exercise due diligence: review and measure the actual installation, then improve the accuracy of the model until it produces dependable results.”
So, how does one make the model dependable, and how does one calibrate it? In the paper, Seymour provides introductory answers to exactly these questions, while also covering common mistakes. “It is a fairly simple process, but one that benefits from a systematic approach,” he writes. Using real-life examples illustrated with Future Facilities’ 6SigmaDC suite of tools, the paper shows how to overcome systematic errors affecting floor tiles, grilles, cabinets, cable bundles and other common data center objects.
Seymour also provides advice on “tough modeling decisions,” including whether or not to model poorly defined obstructions “such as water pipes under a cooling unit.” Specific advice is provided for calibration of the air supply system and its component parts, with Seymour cautioning upfront, “Do not overlook the fact that it is not just the bulk airflow that matters, but also the flow distribution.”
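Seymour’s caution about flow distribution can be illustrated with a small, hypothetical check in Python (not from the paper): the total supply airflow may agree with the design value while individual floor tiles are badly off, so both the bulk figure and the per-tile spread need to be compared. The tile names and flow values below are assumptions chosen for illustration only.

```python
# Illustrative sketch: a model can match the bulk airflow while misrepresenting
# the per-tile distribution. All tile names and values below are assumed.

measured_tile_cfm = {"tile_01": 450, "tile_02": 610, "tile_03": 340, "tile_04": 600}
modeled_tile_cfm = {"tile_01": 560, "tile_02": 490, "tile_03": 460, "tile_04": 490}

# Bulk comparison: the totals agree...
bulk_measured = sum(measured_tile_cfm.values())
bulk_modeled = sum(modeled_tile_cfm.values())
print(f"Bulk airflow: measured {bulk_measured} CFM, modeled {bulk_modeled} CFM")

# ...but the per-tile distribution does not, which is what the calibration
# should catch.
for tile, measured in measured_tile_cfm.items():
    modeled = modeled_tile_cfm[tile]
    pct_error = 100.0 * (modeled - measured) / measured
    print(f"{tile}: measured {measured} CFM, modeled {modeled} CFM, "
          f"error {pct_error:+.0f}%")
```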
Future Facilities says that by reading the white paper, “the reader will not only have a sound appreciation for good, systematic calibration practice, but [will] also understand that, while the overall facility is complex, many of the individual elements can be individually assessed.” Seymour concludes, “This will make it possible to diagnose why the initial model does not adequately represent the facility — normally, it won’t!”
Read the full article at Cabling Installation & Maintenance.