Storing data – why organisations need to rethink their approach

Peter Ruffley, Zizo’s CEO, explains why existing data centres need to be utilised better and why businesses must become savvier about how they store and move data.

According to research group IDC, the number of connected devices is forecast to reach 42 billion by 2025. With demand for the Internet of Things (IoT), automation and 5G continuing to grow, and heavily influencing businesses and supply chains over the coming years, the sheer volume of data that companies have to deal with will become increasingly overwhelming.

Whereas five to ten years ago we’d see new data centres popping up everywhere to store and move all of this data, that is no longer the case. Many cities, such as Amsterdam, have put a stop to any more data centres being built, because they drain power from the grid and force cities to invest more in the power and cooling systems needed to keep them running efficiently. This means we have to rethink data: just because businesses can store data doesn’t mean they should.

Keeping the lights on

Many companies continue to operate on the ethos that nearly all of their data, if not all of it, should be kept because there might be some value in it, either now or in the future. However, whilst the cost of storing data has dropped substantially, making it much cheaper to store more, it is not falling at the same rate as the volume of data being created is rising. The amount of data that actually proves to be mission-critical is also far smaller than organisations first thought.

So, can businesses look at ways not to accumulate data indiscriminately, but to keep only that which is necessary and valuable to their business? How do they ensure they hold data for the right period of time for regulatory reasons, but no longer, and how can they make sure they have all the data they genuinely need? Edge computing is undoubtedly going to play a substantial role in helping organisations become more efficient and make better use of their resources. Analysing data at the point where it is created and limiting the amount of data sent on to the cloud will also ease pressure on the cloud network.
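
To make that idea concrete, here is a minimal sketch of edge-side filtering, assuming a hypothetical stream of sensor readings; the names, thresholds and summary fields are illustrative, not a prescribed design. The point is simply that a compact local summary, plus any out-of-range readings, is all that needs to travel to the cloud.

```python
# Minimal sketch: summarise a batch of readings locally and forward only the
# compact result. All names and thresholds here are hypothetical examples.
from dataclasses import dataclass
from statistics import mean
from typing import List

@dataclass
class Reading:
    sensor_id: str
    value: float  # e.g. a temperature in degrees C (illustrative units)

def summarise_at_edge(readings: List[Reading], alert_threshold: float) -> dict:
    """Reduce a batch of raw readings to a small payload worth sending upstream."""
    values = [r.value for r in readings]
    alerts = [(r.sensor_id, r.value) for r in readings if r.value > alert_threshold]
    return {
        "count": len(values),
        "min": min(values),
        "max": max(values),
        "mean": round(mean(values), 2),
        "alerts": alerts,  # only readings that actually need attention travel in full
    }

if __name__ == "__main__":
    batch = [Reading("s1", 21.4), Reading("s1", 21.6), Reading("s2", 48.9)]
    print(summarise_at_edge(batch, alert_threshold=40.0))
```

A payload of a few hundred bytes replaces the full raw stream, which is where the saving in bandwidth, storage and energy comes from.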

An overarching change is required

The cloud has to date been heralded as the saviour of data storage, with businesses only using the resources they need at any one time and pushing any data that isn’t needed into low-cost, ‘third-tier’ storage. However, the cloud has proven not to be an efficient way for people to store, manipulate and manage their data, and is in fact turning out to be an expensive option for many: they pay to put data in and store it, and pay again to take it out and move it.

There is a misconception that having more servers (effectively ‘throwing more tin at the problem’) is the way forward. While more servers can hold larger volumes of data, they add to the power draw and cooling load of the data centre, and only a fraction of their capabilities is ever fully utilised. Why are businesses buying more capability than they really need and will never use? Either way, this isn’t getting to the crux of the issue. These and other small changes just aren’t enough and will not make the intended impact. Smarter initiatives must be put in place.

Gartner predicts that by 2025, 75% of enterprise-generated data will be created and processed outside a traditional centralised data centre or cloud. Whilst we are going to need a lot of edge devices to cope with the growing influx of data, they are small, compact and efficient. But how do we make that shift work?

Regulatory and environmental concerns

Many businesses cite regulatory requirements as a reason to collect and keep all of their data, but this is simply not justifiable: there is only a limited amount that industries and businesses truly need to keep for those reasons.

There’s no getting away from the fact that more data means more infrastructure to store it and move it from one location to another. Moving data is costly because of the energy used in transferring it, and there is also an environmental impact that must be considered.

The best time for businesses to decide what they are going to do with their data is when it is first collected on an edge/connected device – when it is as current, complete and meaningful as it is ever going to be. Companies need to decide at that point what they’re going to do with it – whether they’re going to keep it, change it or add to it. Simply keeping it has no value in itself; keeping a lot of data drives a lot of waste.
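
One way to picture that keep/change/discard decision is a simple triage step applied at collection time. The sketch below assumes a hypothetical retention policy; the categories, periods and field names are illustrative only.

```python
# Minimal sketch: decide at collection time whether a record is kept (with an
# explicit expiry), enriched, or discarded. The policy below is hypothetical.
from datetime import datetime, timedelta, timezone
from typing import Optional

# Assumed retention policy: categories and periods are illustrative, not advice.
RETENTION_DAYS = {"regulatory": 7 * 365, "operational": 90}

def triage_record(record: dict) -> Optional[dict]:
    """Keep, enrich or discard a record the moment it is created."""
    category = record.get("category")
    if category not in RETENTION_DAYS:
        return None  # discard: no business or regulatory reason to keep it
    expires = datetime.now(timezone.utc) + timedelta(days=RETENTION_DAYS[category])
    # Enrich while the data is still complete and meaningful, and tag it with
    # an explicit expiry so it is not kept "just in case".
    return {**record, "expires_at": expires.isoformat()}

if __name__ == "__main__":
    print(triage_record({"id": 1, "category": "operational", "payload": "..."}))
    print(triage_record({"id": 2, "category": "debug", "payload": "..."}))  # None
```

Tagging each record with an expiry at the point of collection is what prevents data from being held for longer than the regulatory period requires.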

5G is a short-range network that, whilst it will help transfer data faster, will need to be supported by a large number of redistribution points (nodes), so it will not help businesses until a full network has been rolled out. This will require significant investment from telecommunications companies and governments alike. These nodes could be smaller edge servers, providing more than just connectivity.

But how do we ensure performance at the edge? We are, in fact, seeing the large silicon vendors move away from a scale-out model towards more of a scale-up model: rather than buying more servers, we increase the power within each server. This has a knock-on effect at the edge, where we now have tremendous processing capability. Another option (with significantly better green credentials) would be to reuse older mobiles and/or laptops as lightweight edge nodes, providing a cheaper deployment option, especially in emerging markets in Africa and Asia.

Conclusion

Edge computing is going to give businesses a much easier way to quantify and understand what they are investing in when collecting, processing and moving data. It offers greater agility and a move towards truly ‘real-time’ analytics.

In many ways, the hardware constraints placed on infrastructure at the edge will put the onus on organisations to develop a more in-depth understanding of what data they need to collect, what they don’t, and what is valuable to their business.

Investing heavily in high-cost, high-overhead server infrastructure, with long contracts and a high total cost of ownership, is becoming a thing of the past, especially when little thought is given to how or when the data could be moved to other infrastructure such as the cloud.

The current economic climate is forcing businesses to review how they work, and they need a flexible business model to cope with the uncertainty the future holds – the ability to scale their operations up or down as and when they need to.
