Simulation: The key to powerful artificial intelligence

When it comes to crunching huge amounts of data, technology will always reign supreme, and the future of the data centre most certainly lies in the machine rather than the man. But are we ready to put so much faith in technology? Mark Fenton, product manager at Future Facilities, explores the role simulation has to play in getting AI right without the risk.

Whether we like it or not, the future will be completely defined, supported and optimised by data. As networks of smart devices and vehicles grow from conception to a mass-produced reality, data remains the driving force behind next generation technology.

However, alongside this rise in demand comes a significant influx in the amount of data that needs to be managed. For data centres to continue to support this, we must turn to machines to analyse performance, understand problems and ultimately make decisions autonomously.

Data centres are highly complex environments, where the smallest changes can have a dramatic impact on the entire facility. When monitoring the modern data centre’s ongoing performance, DC professionals need look no further than the stacks in front of them. Reams of information, from the power draw throughout the facility to ongoing temperature and humidity levels and the demand on and capacity of the cooling system, must all be carefully monitored.
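As an illustration of the kind of monitoring described above, the sketch below checks one snapshot of rack metrics against safe operating bands. The field names and thresholds are illustrative assumptions, not figures from any particular facility or standard.

```python
from dataclasses import dataclass

@dataclass
class RackReading:
    """One snapshot of the metrics a DC team watches (illustrative names)."""
    power_draw_kw: float     # power draw at the rack
    inlet_temp_c: float      # inlet air temperature
    humidity_pct: float      # relative humidity
    cooling_load_pct: float  # cooling demand as a share of capacity

def out_of_range(r: RackReading,
                 max_power_kw: float = 8.0,
                 temp_range_c: tuple = (18.0, 27.0),
                 humidity_range_pct: tuple = (40.0, 60.0),
                 max_cooling_load_pct: float = 90.0) -> list:
    """Return the names of any metrics outside their (assumed) safe bands."""
    alerts = []
    if r.power_draw_kw > max_power_kw:
        alerts.append("power")
    if not temp_range_c[0] <= r.inlet_temp_c <= temp_range_c[1]:
        alerts.append("temperature")
    if not humidity_range_pct[0] <= r.humidity_pct <= humidity_range_pct[1]:
        alerts.append("humidity")
    if r.cooling_load_pct > max_cooling_load_pct:
        alerts.append("cooling")
    return alerts
```

In practice each of these checks would feed a DCIM dashboard or alerting pipeline; the point is simply that every factor is a separate signal to track.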

The recent triumphs of machines against man only serve to illustrate that when it comes to crunching large amounts of data, tech will always come out on top. This is why businesses continue to leverage machines to draw upon the huge datasets available and, ultimately, drive day-to-day data centre operations.

To support the ever-growing amounts of data organisations produce, the future data centre must be one that is automated, self-aware and self-controlling. Whether this means algorithms that manage power and application deployment within your virtualised infrastructure, or the next generation of cooling controls, data centres will rely on machine learning and wider artificial intelligence technologies to ensure optimum performance.

Despite this, humans will always have a key role to play. It is vital for us to ensure that the algorithms in place continue to make the right decisions in the background. Any AI system relies on a well-thought-out and rigorously interrogated training set to drive its algorithms. In consumer industries, the likes of Netflix and Amazon continuously build on their AI-driven personalisation and recommendation systems by collecting huge volumes of data and using it to map out customer behaviour over time. DC professionals don’t have this luxury.

Training with a digital clone

When planning the build of a prospective data centre, businesses simply don’t have any measured data to teach their AI systems how to operate. This means that these algorithms need to learn on the job. However, teaching such a critical function in a live production environment adds a significant layer of risk to the overall data centre operation. How can we reap the rewards that AI has to offer the industry, whilst avoiding the layers of risk associated with training such complex technology on the go?

Simulation holds the key. Modern data centre professionals increasingly use a digital clone of their existing or prospective DC, through which they can test new implementations, assess risk and monitor performance. What better training ground is there for our AI implementations?

With the risk to live operations removed, we are free to create multiple training scenarios spanning a huge number of variables. Simulation is already an increasingly used tool in general data centre planning and maintenance, and this use case demonstrates just how important simulation technology will be going forward.
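One way to picture those training scenarios is as randomised combinations of the conditions a digital clone can reproduce. The sketch below generates such a batch; the variable names and value ranges are hypothetical, chosen only to show the idea of varying load, weather and failure conditions at will.

```python
import random

def make_scenario(rng: random.Random) -> dict:
    """Randomise the conditions one simulated training run might cover.
    (All names and ranges are illustrative assumptions.)"""
    return {
        "it_load_kw": rng.uniform(100.0, 500.0),   # total IT load
        "ambient_temp_c": rng.uniform(15.0, 35.0), # outside conditions
        "chillers_online": rng.randint(1, 4),      # cooling redundancy
        "failed_crac_units": rng.randint(0, 2),    # simulated failures
    }

def training_set(n: int, seed: int = 0) -> list:
    """Build n varied scenarios. In a real workflow, each would be fed to
    the digital clone and the AI scored on the simulated outcome."""
    rng = random.Random(seed)
    return [make_scenario(rng) for _ in range(n)]
```

Because the scenarios are generated rather than measured, a new build with no operational history can still give its algorithms thousands of situations to learn from.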

To demonstrate the potential value that AI can deliver to DC operators, consider the task of predicting the best location for new hardware in a rack. Variables such as power, space, cooling and networking all need to be taken into account to ensure optimal performance. This means examining various areas, including but not limited to airflow, the temperature of the rack over time, the type of rack and equipment being installed, the location in the white space, and the cooling controls in place.
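A minimal way to frame that placement task is as a weighted score over the factors just listed. The sketch below assumes each factor has already been normalised to a 0-to-1 suitability value by the simulation; the weights are hypothetical, not a published methodology.

```python
def placement_score(candidate: dict, weights: dict = None) -> float:
    """Score a candidate rack position on power, space, cooling and
    networking suitability. Each factor is assumed pre-normalised to
    0..1 (1 = ideal); the default weights are illustrative only."""
    weights = weights or {"power": 0.3, "space": 0.2,
                          "cooling": 0.35, "network": 0.15}
    return sum(weights[k] * candidate[k] for k in weights)

def best_position(candidates: list) -> dict:
    """Pick the highest-scoring slot from a list of simulated candidates."""
    return max(candidates, key=placement_score)
```

A real system would derive those suitability values from simulated airflow and temperature over time rather than accept them as inputs, but the selection step reduces to the same comparison.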

It can seem a monumental task for us to analyse the constantly moving variables and arrive at the optimal solution. Linking all of these variables to AI algorithms in a simulated environment lets DC professionals operate risk-free, rather than jeopardise their live production environment.

Ultimately, machines are a natural next step for us to build on our approach to data centre performance, given their ability to continually run and optimise their environment autonomously.

The only setback is the significant amount of data machines require to become acclimated to their environment and carry out day-to-day operations seamlessly. It is here that simulation connected to AI will become the future of optimising data centre performance, without the risk of untrained machines running rampant.
