Veeam: Meeting the Intelligent Data Management needs of 2019

Dave Russell, vice president for Product Strategy at Veeam, highlights five major trends for the year ahead that are vital for decision-makers to keep front of mind.

Data has drastically changed today's world. Every process, whether an external client interaction or an internal employee task, leaves a trail of data. Human- and machine-generated data is growing ten times faster than traditional business data, and machine data alone is growing at 50 times that rate.

With the way we consume and interact with data changing daily, innovations to enhance business agility and operational efficiency are also plentiful. In this environment, it is vital for enterprises to understand the demand for Intelligent Data Management in order to stay one step ahead and deliver enhanced services to their customers.

I’ve highlighted five trends for 2019 that decision-makers need to know. With the EMEA market in mind, here are my views:

  1. Multi-Cloud usage and exploitation will rise

A report from McKinsey & Company revealed that data flow to Asia has increased by at least 45 times since 2005. Data flows from key regions such as North America and Europe have risen drastically, to 5,000–20,000 Gbps and 1,000–5,000 Gbps respectively, from 100–500 Gbps and less than 50 Gbps in 2005.

With companies operating across borders and the reliance on technology growing more prominent than ever, an expansion in multi-cloud usage is almost inevitable.

IDC estimates that customers will spend US$554 billion on cloud computing and related services in 2021, more than double the level of 2016. On-premises data and applications will not become obsolete, but the deployment models for your data will expand into an increasing mix of on-premises, SaaS, IaaS, managed clouds and private clouds.

Over time, we expect more of the workload to shift off-premises, but this transition will take place over years, and we believe that it is important to be ready to meet this new reality today.

  2. Flash memory supply shortages will ease, and prices will improve, in 2019

According to a Gartner report from October 2018, flash memory supply is expected to revert to a modest shortage in mid-2019, with prices expected to stabilise largely due to the ramping of Chinese memory production.

Greater supply and improved pricing will lead to wider flash deployment in the operational recovery tier, which typically hosts the most recent 14 days of backup and replica data. We see this greater flash capacity leading to broader use of instant mounting of backed-up machine images (Copy Data Management).

Systems that offer Copy Data Management capability will be able to deliver value beyond availability, along with better business outcomes. Example use cases for leveraging backup and replica data include DevOps, DevSecOps and DevTest, patch testing, and analytics and reporting.

  3. Predictive Analytics will become mainstream and ubiquitous

The Predictive Analytics market is forecast to reach US$12.41 billion by 2022, growing at a CAGR of 22.1%, roughly 2.7 times its 2017 size. APAC, in particular, is projected to grow at the highest CAGR during this forecast period.
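For readers who want to check the forecast arithmetic, a quick sketch follows. The 22.1% CAGR and the US$12.41 billion 2022 figure come from the forecast above; the implied 2017 base is derived, not stated in the source.

```python
# Sanity-check the forecast arithmetic (assumes five compounding years, 2017-2022).
cagr = 0.221          # 22.1% compound annual growth rate (from the forecast)
value_2022 = 12.41    # forecast 2022 market size, US$ billions (from the forecast)

growth_multiple = (1 + cagr) ** 5          # total growth over five years
implied_2017 = value_2022 / growth_multiple  # derived 2017 base, US$ billions

print(f"growth multiple: {growth_multiple:.2f}x")          # ~2.71x overall
print(f"implied 2017 market size: ${implied_2017:.2f}bn")  # ~$4.57bn
```

That is, a 22.1% CAGR compounds to roughly a 2.7x market over the period, not a fourfold one.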

Predictive Analytics based on telemetry data, essentially Machine Learning (ML)-driven guidance and recommendations, is one of the categories most likely to become mainstream and ubiquitous.

Machine Learning predictions are not new, but we will begin to see them utilising signatures and fingerprints containing best-practice configurations and policies, allowing businesses to get more value out of the infrastructure they have deployed and are responsible for.

Predictive Analytics, or Diagnostics, will assist us in ensuring continuous operations, while reducing the administrative burden of keeping systems optimised. This capability becomes vitally important as IT organisations are required to manage an increasingly diverse environment, with more data, and with more stringent service level objectives.

As Predictive Analytics becomes more mainstream, SLAs and SLOs are rising, and businesses’ Service Level Expectations (SLEs) are higher still. This means we need more assistance, and more intelligence, to deliver on what the business expects from us.

  4. The ‘versatalist’ (or generalist) role will increasingly become the new operating model for the majority of IT organisations

While the preceding trends are technology-focused, the future of digital is still analogue: it’s people. Talent shortages, combined with the collapsing together of on-premises infrastructure, public cloud and SaaS, are producing broader technologists with backgrounds in a wide variety of disciplines, and increasingly with greater business awareness as well. For example, the Information Technology (IT) job market in Singapore continues to see high levels of recruitment.

Standardisation, orchestration and automation will accelerate this, as more capable systems allow administrators to take a more horizontal view rather than a deep specialisation. Specialisation will of course remain important, but as IT becomes ever more fundamental to business outcomes, it stands to reason that IT talent will likewise need to understand the wider business and add value across many IT domains.

Yet, while we see these trends challenging the status quo next year, some things will not change. There are always constants in the world, and we see two major factors that will remain top-of-mind for companies everywhere:

a. Frustration with legacy backup approaches and solutions.

The top three vendors in the market continue to lose market share in 2019. In fact, the largest provider in the market has been losing share for 10 years. Companies are moving away from legacy providers and embracing more agile, dynamic, disruptive vendors that offer the capabilities needed to thrive in the data-driven age. A report by Cognizant highlighted that 82% of APAC business leaders believe the future of work lies in intelligent machines.

b. The pain points of the three C’s: Cost, Complexity and Capability 

These three C’s continue to be why people in data centres are unhappy with solutions from other vendors. Broadly speaking, these are excessive cost, unnecessary complexity and a lack of capability, which manifests in backup speed, restore speed, or instant mounting of a virtual machine image. These three criteria will continue to dominate the reasons why organisations augment or fully replace their backup solutions.

  5. The arrival of the first 5G networks will create new opportunities for resellers and CSPs to help collect, manage, store and process the higher volumes of data

In early 2019 we will see the first 5G-enabled handsets unveiled at CES in the US and MWC in Barcelona. I believe 5G will be adopted most quickly by businesses, for machine-to-machine communication and Internet of Things (IoT) technology; with 4G, consumer mobile network speeds have arguably already reached a point where they are as fast as most of us need.

2019 will be more about the technology becoming fully standardised and tested, and about future-proofing devices so they can work with 5G when it becomes more widely available and Europe becomes a truly Gigabit Society.

For resellers and cloud service providers, the excitement will centre on new revenue opportunities leveraging 5G, or the infrastructure to support it. Processing higher volumes of data in real time and at greater speed, new hardware and device requirements, and new applications for managing data will all present opportunities, and will help facilitate conversations around edge computing.
