Why is the data centre industry so scared of change? Jonathan Leppard, director at Future Facilities shares his insight.
In order to keep up with increasing demands, the data centre industry cannot stand still. A recent study by AFCOM found that over the next 12 months, organisations will renovate an average of 7.8 data centres each.
Like it or not, change is inevitable for DC operators, and how this change is managed will ultimately make or break their chances of survival.
Change can come in many forms for data centre professionals and, unsurprisingly, it is often met with resistance.
It can be a consolidation of company infrastructure following a merger or acquisition, relocation of facilities into a pre-existing building, or even construction of a new data centre in light of extra demand. In almost all cases, performance and cost are the key considerations.
But with any proposed change, risk and cost come along for the ride, and neither is appealing to those responsible for the performance of the business or the data centre that delivers it.
There is safety in the ethos of ‘if it ain’t broke, don’t fix it’, and this safety blanket is responsible for many businesses putting up roadblocks to change.
Changing minds
Making things even harder for business decision makers are the increased demands that are being placed on them.
In fact, in a recent independent survey it was found that 77% of data centre decision makers are seeing increased demands placed on their infrastructure.
These demands from new applications and use cases are putting pressure on existing data centre infrastructure as teams are asked to do more with the same amount of resources.
As demands increase, facilities managers may find themselves trying to maximise the capacity of a facility by spending more on critical power and cooling infrastructure, without proper insight into the root cause of the issue. This brings with it a host of potential issues from technical, logistical and compliance perspectives.
Success for businesses is going to lie in staying flexible to changing demands by ensuring that their data centre infrastructure is as agile as possible.
To change this perspective, businesses need to improve visibility and have a greater understanding of how their facility is set up. The danger is that if you can’t see the impact of your actions, then they can quickly spiral out of control.
The dangers of standing still
What’s more, the same research found that this underlying discomfort with change is a key reason why 85% of organisations have experienced outages or downtime in the last 12 months.
Alarmingly, in one-third of data centres temperature is managed by “rule of thumb,” with operators basing their decisions on “the way things have always been done”.
Combine this with the news that two out of five data centres have suffered outages because of human error, and it becomes clear that better, more carefully managed processes are needed.
It’s strange then that despite the vast technical capability of the data centre itself, the industry is not as careful with its planning as you would think.
This is in no small part because 74% of organisations have to compromise in their capacity planning decision making most or all of the time. Why? Because they don’t have access to quality data about IT equipment and system performance. Staggeringly, almost a third (29%) have to compromise ‘all the time’.
This is clearly a trend that can no longer continue. Businesses need to get to know their data centre. One of the most effective and powerful ways to do this is via a physics-based digital twin.
By utilising Computational Fluid Dynamics (CFD), Power Systems Simulation (PSS) and live sensor data, businesses can finally see what is happening in their own data centre and predict how changes, on both the IT and facilities side, will affect its performance.
Creating a digital twin of a data centre to safely model changes in scenarios such as ‘what-if’ analysis can help teams better prepare for worst case scenarios.
By carrying out modelling in a digital twin, teams can remove many of the roadblocks that have stood in the way of change.
Time to act
It’s clear that we’re at an inflection point, where things need to change with greater speed and accuracy.
Today, less than a third (29%) of organisations have invested in the 3D modelling tools that could solve their problems in freeing capacity and avoiding outages, and only 22% run proper simulations of their data centre in operation.
However, change is on the cards. Within the next 12 months, 67% of organisations will have implemented digital twin technologies in their data centre, which is then projected to grow to 92% within the next five years.
Staying efficient and competitive in the future is going to mean having a digital twin. With one in place, businesses can face up to the need to change, safe in the knowledge that any changes have been modelled and checked beforehand.
While the speed and necessity of change cannot be reduced, with a digital twin the risk and worry certainly can be.