Future Facilities publishes to Government Computer News on how computational fluid dynamics modeling can accelerate compliance with the Data Center Optimization Initiative (DCOI)

Aitor Zabalegui, lead consultant engineer, East Coast, for Future Facilities, recently published an article in the Industry Insight section of Government Computer News on two basic approaches to efficiency that can help agencies comply with the DCOI. From the article:

By Aitor Zabalegui Jun 07, 2017

Since August 2016, federal agencies have been actively pursuing measures to comply with the Data Center Optimization Initiative. Issued by the Office of Management and Budget, the DCOI provides categorized efficiency targets intended to bring federal sites up to modern standards of performance by the end of fiscal year 2018.

Since 2010, the federal government’s approach to data center efficiency has been to consolidate and close nearly 30 percent of its reported 10,584 facilities. Despite the $2.8 billion in cost savings achieved through fiscal year 2015, the approach appears to be losing steam. This is not a total surprise, as only four agencies accounted for 86 percent of all cost savings from consolidation efforts thus far. The imbalance may suggest that plenty of savings are being left on the table, but 10 of the 22 agencies submitting 2016 data center closure/consolidation reports to the OMB did not include any 2017 savings plans, and two agencies did not submit a report at all.

For existing data centers, the OMB sees optimization as a last resort. Dave Powner, director of IT management issues for the Government Accountability Office, has strongly implied that federal agencies should consider a transition to the cloud. Already, DCOI is off to a slow enough start that Powner has recommended that Congress extend its deadline to the end of 2020.

At the end of the day, agencies left running their own facilities will be forced to optimize by improving (that is, lowering) their power usage effectiveness (PUE). How will legacy sites reduce their energy demand to meet the DCOI mandate? One of the OMB’s requirements is to enlist the services of at least one certified data center energy practitioner (DCEP) to guide the optimization effort.
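For readers less familiar with the metric, PUE is simply total facility energy divided by the energy delivered to IT equipment, so a lower value means less overhead. The short sketch below walks through the arithmetic with hypothetical annual figures; they are illustrative only, not drawn from any federal site.

```python
# Minimal sketch of the PUE calculation. Lower is better; 1.0 is the
# theoretical ideal (no overhead). All figures below are hypothetical.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Return PUE given annual facility-wide and IT-only energy use in kWh."""
    return total_facility_kwh / it_equipment_kwh

it_kwh = 4_000_000          # hypothetical annual IT equipment energy
overhead_kwh = 2_800_000    # hypothetical cooling, power distribution and lighting
print(f"PUE = {pue(it_kwh + overhead_kwh, it_kwh):.2f}")   # prints: PUE = 1.70
```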

Assessments by DCEPs are designed to streamline the DCOI by standardizing data collection and documentation with official toolkits, process manuals and report templates. For example, the DC Pro tool offers a survey-like format for data input, then delivers a PUE diagnostic and list of industry best practices that should facilitate the site’s efforts toward PUE compliance.

I have little doubt that best practices will help federal sites immensely, but will they be enough to meet the mandated PUE targets? When put in perspective, there are two basic approaches to efficiency, and the most effective strategy will successfully implement both:

The first involves upgrading inefficient components. Whether these efforts encompass replacing air-cooled chillers with water-cooled units or retrofitting an air-side economizer, they create an additional capital-expense burden but are relatively safe. In other words, their impact on energy efficiency is known, and the effects are immediate once integrated into the system.

The second measure is to optimize inefficient systems. This involves projects like fixing hotspots, raising the air-supply temperature or turning down fans in computer room air handlers. They are generally thought of as riskier endeavors because they come with a degree of uncertainty as to how the system will respond. However, optimizations generally carry a smaller capital, energy and carbon footprint than upgrades and provide a faster payback.
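Fan turndown is a good illustration of why optimizations can pay back quickly. The rough sketch below applies the fan affinity relationship, in which fan power scales roughly with the cube of speed; the baseline wattage and turndown fraction are hypothetical.

```python
# Rough sketch: fan-energy savings from turning down CRAH fan speed.
# The affinity-law cube relationship is an approximation; numbers are hypothetical.

def fan_power_fraction(speed_fraction: float) -> float:
    """Approximate fan power as a fraction of full-speed power (affinity law)."""
    return speed_fraction ** 3

baseline_kw = 15.0    # hypothetical CRAH fan power at 100% speed
turndown = 0.8        # run the fans at 80% speed after airflow is rebalanced
new_kw = baseline_kw * fan_power_fraction(turndown)
savings = 1 - new_kw / baseline_kw
print(f"Fan power drops from {baseline_kw:.1f} kW to {new_kw:.1f} kW (~{savings:.0%} savings)")
```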

In a likely scenario, a data center may not have the budget to upgrade system components and will be left with no choice but to work with what it has. Efficiency measures for electrical and IT systems are well established and relatively simple to implement: replace inefficient hardware, remove dead load, consolidate and rebalance loads.

The HVAC system, on the other hand, is much more difficult to remediate because of its direct impact on IT thermal risk. With upwards of 37 percent of total data center power consumption tied up in cooling and airflow, efficiency initiatives need a more reliable way to unlock HVAC energy savings.
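To make the stakes concrete, the sketch below shows how trimming the HVAC share moves PUE. The load breakdown and the 25 percent reduction are hypothetical illustrations built around the 37 percent figure cited above.

```python
# Sketch: how an HVAC energy cut moves PUE, assuming IT load stays constant.
# The split of facility power below is a hypothetical illustration.

total_kw = 1000.0                # hypothetical total facility power draw
hvac_kw = 0.37 * total_kw        # cooling and airflow, per the share cited above
other_kw = 80.0                  # hypothetical distribution and lighting losses
it_kw = total_kw - hvac_kw - other_kw

def pue(it: float, hvac: float, other: float) -> float:
    """PUE = total facility power / IT power."""
    return (it + hvac + other) / it

print(f"PUE before: {pue(it_kw, hvac_kw, other_kw):.2f}")                      # 1.82
print(f"PUE after a 25% HVAC cut: {pue(it_kw, hvac_kw * 0.75, other_kw):.2f}")  # 1.65
```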

Many airflow management issues are easily solved with low-cost remediation efforts, such as adding blanking panels, reconfiguring tile placement or adding scoops to air handler units, but data center managers need visibility of airflow to gauge the effectiveness of these solutions. Blanket changes, like outfitting every cold aisle with containment, come with a large upfront cost that may not be entirely necessary.
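A simple payback comparison illustrates the point. The costs and savings below are hypothetical placeholders, intended only to show why visibility into airflow should come before blanket spending.

```python
# Sketch: simple payback comparison between a blanket containment project and
# targeted low-cost fixes. All cost and savings figures are hypothetical.

def payback_years(capital_cost: float, annual_savings: float) -> float:
    """Simple payback period: upfront cost divided by annual energy savings."""
    return capital_cost / annual_savings

options = {
    "contain every cold aisle": (120_000, 30_000),      # (upfront $, est. $/yr saved)
    "targeted blanking + retiling": (15_000, 18_000),
}

for name, (cost, savings) in options.items():
    print(f"{name}: payback ~{payback_years(cost, savings):.1f} years")
```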

To help accomplish this, there is one method that allows DCEPs to balance the two types of efficiency gains and mitigate implementation risk in a way that complements the existing assessment process. I am referring to a predictive approach to airflow management, using a computational fluid dynamics (CFD) model.

I am proposing that integrating a simple CFD model into the DCEP assessment process will fast-track and enhance the optimization of data center HVAC systems while keeping remediation costs down. The bulk of the modeling effort can take place during pre-onsite activities, with data provided from CAD drawings. A baseline CFD study can accompany the DC Pro preliminary assessment and provide additional insight into any HVAC efficiency measures being pursued. Ultimately, the efficiency action plan developed by the site team and the DCEP can be tested and iteratively improved within the CFD model.
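Conceptually, the iterative loop looks like the sketch below. The "solver" here is a toy stand-in rather than a real CFD engine, and every number is hypothetical; the point is the workflow of simulating a plan, checking thermal compliance and revising it before anything changes on the floor.

```python
# Conceptual sketch of testing an efficiency action plan in a model before
# implementation. The solver is a toy stand-in, not a real CFD tool.
from dataclasses import dataclass

ASHRAE_MAX_INLET_C = 27.0   # upper limit of the ASHRAE recommended inlet range

@dataclass
class Result:
    max_inlet_temp_c: float
    predicted_pue: float

def toy_cfd_solve(supply_temp_c: float, fan_speed: float) -> Result:
    """Stand-in for a real CFD run: a crude linear model, illustrative only."""
    max_inlet = supply_temp_c + 8.0 * (1.0 - fan_speed)   # inlets run hotter as fans slow
    pue = 1.9 - 0.02 * (supply_temp_c - 18.0) - 0.3 * (1.0 - fan_speed)
    return Result(max_inlet, pue)

# Start from an aggressive plan (warm supply air, deep fan turndown) and
# back off until the worst simulated inlet stays within the recommended range.
supply_temp_c, fan_speed = 25.0, 0.65
result = toy_cfd_solve(supply_temp_c, fan_speed)
while result.max_inlet_temp_c > ASHRAE_MAX_INLET_C:
    fan_speed = min(1.0, fan_speed + 0.05)   # revise the plan: a little more airflow
    result = toy_cfd_solve(supply_temp_c, fan_speed)

print(f"Fan speed {fan_speed:.0%}, max inlet {result.max_inlet_temp_c:.1f} C, "
      f"predicted PUE {result.predicted_pue:.2f}")
```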

The main benefit of this approach is the assurance that IT thermal compliance will remain safeguarded upon the implementation of any optimization plan. The added value is a plan that neither overlooks nor overengineers airflow management solutions.

Having a CFD model can also kick-start work on a future “investment-grade” audit, where field measurements taken during onsite activities can be used to calibrate the model. A 2017 case study, presented at DCD Converged in New York by Citigroup and NYSERDA, exemplified the broader scope of efficiency optimization offered by a calibrated site model.
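Calibration itself is straightforward to reason about: compare measured inlet temperatures against the model's predictions and quantify the mismatch. The rack names, readings and tolerance in the sketch below are hypothetical.

```python
# Sketch of the calibration check: measured vs. predicted inlet temperatures.
# Readings, rack names and the 1.5 C tolerance are hypothetical illustrations.
import math

measured_c  = {"rack_A1": 22.4, "rack_B3": 24.1, "rack_C7": 26.0}
predicted_c = {"rack_A1": 21.9, "rack_B3": 24.8, "rack_C7": 25.1}

errors = [measured_c[r] - predicted_c[r] for r in measured_c]
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
worst = max(abs(e) for e in errors)

print(f"RMSE: {rmse:.2f} C, worst rack error: {worst:.2f} C")
if worst <= 1.5:
    print("Model is within tolerance; suitable for deeper optimization studies.")
else:
    print("Refine boundary conditions and re-run before relying on the model.")
```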

With the DCOI well underway and a pool of certified DCEPs steadily growing, I encourage data center managers to consider how a simple CFD model can expedite and enhance efforts toward meeting the mandate’s requirements.

About the Author

Aitor Zabalegui is lead consultant engineer, East Coast, for Future Facilities.

Read the full article at Government Computer News