Data Center Dynamics Asks “Does the Data Center Industry Need a Capacity God?” After Future Facilities Executive Dinner


Future Facilities hosted an Executive Dinner in Santa Clara on Monday, June 16, and Data Center Dynamics reported on the event. From the article:

The divide between facilities and IT teams within the data center created some lively debate this week at DatacenterDynamics Converged Santa Clara. This time the conversation was around unused capacity, cost and risk. Judging by the thoughts of those working here on the US West Coast, the overall responsibility for managing these areas is a real ‘hot potato’ that is turning efforts to drive efficiency and reduce costs to mash.

But it appears to be the fault of no single team or technology. What it really boils down to (not to throw out another potato pun!) is a lack of education, and the absence of any obvious candidate to take on such a role. IT teams have enough on their plate without learning facilities, and facilities teams the same regarding IT. And finance, despite paying the power bill, often has other parts of the business to think about. But when things go wrong, this hot potato can cause havoc for all teams involved.

On the evening before the event, a roundtable organised by predictive modelling vendor Future Facilities, hosted by industry advisor Bruce Taylor and attended by a number of industry stalwarts and a handful of newer industry members, discussed the hindrances to capacity planning. Most agreed that the main reason we have stranded capacity in the data center is that the industry has created so many silos – teams working on individual projects inside the facility – that there is rarely anyone tasked with the bigger picture, looking at the farm from the top of the mountain.

Air flow is complicated, and Future Facilities argues that predictive modelling is the only science that can really help when deploying equipment, and then when maintaining efficiency as data center demands – and hardware – change.

Dr Jon Koomey, research fellow at the Steyer-Taylor Center for Energy Policy and Finance at Stanford University, said that only when you know the physics of the situation inside the data center, and the effect of changes you are likely to make in future, can you remove the problem of stranded capacity and, in turn, drive better efficiency through reduced power use.

“The goal, ultimately, is to match energy services demanded with those supplied to deliver information services at the total lowest cost. The only way to do that is to manage stranded capacity that comes from IT deployments that do not match the original design of the facility,” Koomey said.

He likened the situation today to Tetris, drawing on the analogy of the differently shaped blocks in the game.

“IT loads come into the facility in all different shapes, leaving spaces. Those spaces are capacity, so that 5MW IT facility you think you have bought will typically have 30% to 40% unused.”
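For illustration, here is a minimal sketch of the arithmetic behind that Tetris analogy, using only the figures quoted above (a 5MW facility with 30% to 40% of capacity unused). The function is our own illustrative construction, not a Future Facilities tool:

```python
# A minimal sketch of the stranded-capacity arithmetic in Koomey's Tetris
# analogy. The 5 MW and 30-40% figures come from the quote above; the
# function itself is illustrative.

def stranded_capacity_mw(design_mw, unused_fraction):
    """Capacity paid for but left unusable because IT loads do not tessellate."""
    return design_mw * unused_fraction

design_mw = 5.0  # nameplate IT capacity quoted in the article
for unused in (0.30, 0.40):
    stranded = stranded_capacity_mw(design_mw, unused)
    print(f"{unused:.0%} unused -> {stranded:.1f} MW stranded, "
          f"{design_mw - stranded:.1f} MW actually deliverable")
```

In other words, of the 5MW you think you have bought, somewhere between 1.5MW and 2MW may never be deliverable to IT loads.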

Despite the obvious draw of making maximum use of your data center, many attendees agreed that predictive modelling, and even data center infrastructure management (DCIM) tools that offer more clarity on the individual situation in real time, can be a difficult sell. Once again, the hot potato (of no one tasked with complete responsibility) often gets in the way.

Mark Thiele, EVP of data center technology at Switch, who has also worked for ServiceMesh, VMware and Brocade, said in most cases there is not a single person in the data center with a vision or understanding of the facility’s entire operations – from design and build to IT, facilities and even economics.

“Today 75 to 80% of all data centers don’t have a holistic person that knows and understands everything about the data center, so the target opportunity for [sale of] these tools is often someone that has no responsibility for managing this in their job description,” Thiele said.

“We also find that a majority of facilities today are still bespoke – they are designed to be repaired after they are created. These are serious thresholds that have to be overcome in the market on the whole.”

But this is a situation the industry has created for itself, according to dinner host and Future Facilities CEO Hassan Moezzi.

“If you go back to IBM, 40 years ago it dominated the mainframe market. At the time, the concept of IBM having a blank cheque from customers was a really painful thing, but everyone accepted it because it was the only way. IBM built the power, cooled the data center and provided the hardware and software, and if anything went wrong with the data center it was all put back on to IBM,” Moezzi said.

Today we have the silos and distributed systems we asked for. Anyone can buy a computer and plug it into a wall. The shackles have gone, and so too has that one throat to choke – or to sell capacity planning systems to.

The discussion continued the next day in a roundtable at DCD Converged San Francisco, featuring consultants and wholesale data center operator Sentinel Data Centers. Participants said the issue of stranded capacity can cause just as much pain for a provider offering space as it can for those taking it.

“We spend a great amount of time and energy optimizing around PUE (power usage effectiveness) and power costs, but utilisation rates and the ability to optimise around utilisation will have a much greater impact on cost,” said David Holub, owner of Hooked Communications.
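As a back-of-envelope illustration of Holub's point: PUE is total facility power divided by IT power, so the effective power cost per kW of useful IT load scales roughly with PUE divided by utilisation. The sketch below compares the two levers; all input numbers are assumptions chosen for illustration, not figures from the event:

```python
# Back-of-envelope comparison of Holub's point that utilisation moves cost
# more than PUE. Since PUE = total facility power / IT power, the cost per
# *useful* IT kW scales roughly with PUE / utilisation.
# All inputs below are illustrative assumptions, not figures from the article.

def cost_per_useful_kw(power_cost_per_kw, pue, utilisation):
    """Effective power cost per kW of IT load actually doing useful work."""
    return power_cost_per_kw * pue / utilisation

base        = cost_per_useful_kw(100.0, pue=1.8, utilisation=0.60)
better_pue  = cost_per_useful_kw(100.0, pue=1.5, utilisation=0.60)  # ~17% saving
better_util = cost_per_useful_kw(100.0, pue=1.8, utilisation=0.85)  # ~29% saving

for label, value in [("baseline", base),
                     ("PUE 1.8 -> 1.5", better_pue),
                     ("utilisation 60% -> 85%", better_util)]:
    print(f"{label}: ${value:.0f} per useful kW")
```

Under these assumed numbers, a substantial PUE improvement saves less than a comparable improvement in utilisation, which is the shape of the argument Holub was making.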

He said that for those using colo or wholesale data center space, the benefits of capacity management could go to waste if flexible contracts that allow space to be scaled up or down are not offered.

Data center end user specialist Scott Stein, of Cassidy Turley, even dismissed the benefits of capacity planning altogether.

“To me, capacity planning is the biggest amount of voodoo and uncertainty a company can have – it is typically a function for P&L but it is often instead mapped into IT teams and engineering teams. I rarely see it done accurately, and it is such a moving target,” Stein said. “It can leave you with a vast amount of unused resources.”

Sean Ivery, senior director at Cushman and Wakefield, agreed that capacity planning is near impossible. He said users of colo or wholesale space should instead focus on negotiating as much flexibility into their contracts as possible, rather than on cost.

As on the previous night, the panel members all agreed that what is really needed to get this right – capacity planning tools or not – is a new function in the data center that oversees all areas of need, including IT and facilities.

Read the full article at Data Center Dynamics.