Simplifying data management and storage in a ‘Connected Everything’ world

Article by: Eran Brown, EMEA CTO, INFINIDAT

We’re moving rapidly towards a completely connected world, with approximately 75 billion connected devices expected to be in use globally by 2025, according to Statista. Organisations are recognising the opportunities offered by the Internet of Things (IoT) and Machine to Machine (M2M) communication, a recognition driven by the commercial and operational value that can be derived from the data these technologies generate.

However, the closer we get to the Connected Everything scenario, the more data we have to deal with, putting new strain on the IT organisation’s budget and infrastructure.

Addressing the sheer scale of data

IoT generates more data, and at greater speed, than traditional applications were built to handle, resulting in a deluge of data. Because the number of edge devices and the amount of data each produces are growing simultaneously, both the volume of data sent to the core and the bandwidth required to carry it grow exponentially, as the sketch below illustrates. This volume, combined with the velocity and variety of the data generated, makes data management incredibly challenging, both operationally and financially. Adding to this challenge is the question of retention: some data needs to be retained for extended periods for long-term trend analytics, based on the intrinsic value that can be gained from it.
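
To make the compounding effect concrete, here is a minimal back-of-the-envelope sketch in Python; the device counts, per-device data rates and growth factors are all illustrative assumptions, not figures from this article or from Statista.

```python
# Back-of-the-envelope sketch: traffic to the core grows with BOTH the
# number of edge devices and the data each one produces. All figures
# below are illustrative assumptions.

def aggregate_gbps(devices: int, kbps_per_device: float) -> float:
    """Total bandwidth needed at the core, in Gbit/s."""
    return devices * kbps_per_device / 1e6  # kbit/s -> Gbit/s

# Assume device count and per-device traffic each grow ~40% per year.
devices, kbps = 1_000_000, 10.0
for year in range(5):
    print(f"Year {year}: {aggregate_gbps(devices, kbps):,.1f} Gbit/s")
    devices = int(devices * 1.4)
    kbps *= 1.4
```

Even modest per-device growth, compounded across an expanding device population, roughly doubles the core bandwidth requirement each year in this example.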

From the business’s perspective, IoT and Big Data are no different from the previous data-hungry technologies organisations have adopted: they enable organisations to reduce costs, accelerate revenue, or identify and develop new business opportunities. However, organisations also need to leverage this data quickly to stay ahead of the competition.

In all cases, there is business value in introducing a new tool to extract value from data in almost real-time, but only if the initial and ongoing costs of the data management solution are justified by an increase in revenue. Since digital transformation makes infrastructure a large share of the cost, infrastructure is often what makes or breaks these projects.

In short, the challenge is to keep the cost of this data infrastructure to the absolute minimum while making capacity and processing power quickly accessible, so that the business can leverage them as needed, particularly for digital transformation projects.

The data security conundrum

Real-time data management also poses a new challenge: the need to secure, protect and maintain the privacy of that data.

The most recent Society for Information Management (SIM) survey indicates that privacy is considered a top IT concern for most CIOs. While this is not yet reflected in IT budget spending, organisations are starting to ask the question: how will we be able to protect our infrastructure from cyberattacks that are growing more and more sophisticated?

The new tools being introduced to counter these sophisticated attacks need to analyse more data points from more sources. To do so, they gather, store and analyse huge amounts of data, making cybersecurity one of the fastest-growing data consumers in an already fast-growing data sphere.

Will more capacity sacrifice availability and performance?

Traditional storage solutions, hyperconverged infrastructure and public cloud storage platforms can all be deployed with relative ease, but each carries a high ongoing cost to the operational budget.

These environments demand higher-capacity storage, and delivering both cost reduction and ease of administration will no doubt require significant consolidation. This raises the question: can dual-controller architectures, originally designed for capacities of one to 10 terabytes, meet the resiliency requirements of environments that are orders of magnitude larger? The same question applies to the availability offered by the public clouds: is ‘four nines’ or ‘five nines’ (99.99% – 99.999%) availability good enough for business-critical applications? The short calculation below shows what those figures mean in practice.
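
As a quick sanity check, converting those availability levels into permitted downtime per year makes the difference tangible; this is straightforward arithmetic, not a claim about any particular provider.

```python
# Convert an availability percentage into allowed downtime per year.
MINUTES_PER_YEAR = 365 * 24 * 60

for availability in (0.9999, 0.99999):  # 'four nines' and 'five nines'
    downtime_min = (1 - availability) * MINUTES_PER_YEAR
    print(f"{availability:.3%} availability allows "
          f"{downtime_min:.1f} minutes of downtime per year")
```

At ‘four nines’ that is roughly 53 minutes of downtime a year; ‘five nines’ allows barely five. Whether either is acceptable depends on what a business-critical application loses per minute offline.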

Time to change to a software-centric model?

If organisations are to remain competitive in this data-centric world, it will be their infrastructure that sets them apart. Just ‘following the herd’ is a sure way to fail to create a competitive advantage in this new economy.

Now is the time to consider alternatives, as traditional approaches force companies to choose between performance, availability and cost. Modern solutions based on intelligent software don’t require a business to compromise on any of these elements. They offer reliability, simplified management, easier data recovery, strong security and improved storage density, without sacrificing performance or the capabilities of existing systems, all while making use of low-cost hardware.

The ability to virtually double an organisation’s available storage while increasing resiliency and cutting hardware requirements by a third not only results in better data management and efficiency, but also slows ever-growing infrastructure investment. The illustrative arithmetic below shows how such reduction ratios translate into hardware savings.
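
A minimal sketch, assuming the gains come from software-based data reduction such as compression and deduplication (the article does not specify the mechanism, and the ratios below are assumptions):

```python
# How a data reduction ratio translates into raw hardware needed to
# serve a fixed effective capacity. Ratios and capacity are assumptions.
effective_needed_tb = 600.0  # capacity the business must serve (assumed)

for ratio in (1.0, 1.5, 2.0):  # effective:raw data reduction ratios
    raw_tb = effective_needed_tb / ratio
    saving = 1 - raw_tb / effective_needed_tb
    print(f"{ratio}:1 reduction -> buy {raw_tb:,.0f} TB raw "
          f"({saving:.0%} less hardware)")
```

A 1.5:1 ratio cuts the hardware bought by a third; a 2:1 ratio effectively doubles the capacity of what is already on the floor.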

Offering this on a ‘capacity-on-demand’ or as-a-service consumption model means that organisations can leverage and grow capacity without delays, while deferring payment for unused capacity. They thus avoid the challenge of provisioning expensive hardware in advance based on predicted data growth, which is becoming ever harder to forecast while we are still moving towards a Connected Everything world. The hypothetical comparison below illustrates the risk of buying ahead of an uncertain prediction.
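
A minimal sketch, assuming a buyer who provisions three years ahead against a growth prediction that turns out to be too high; every figure here is a made-up assumption for illustration:

```python
# Hypothetical cost of over-provisioning against a wrong growth forecast.
# All figures are illustrative assumptions.
price_per_tb = 20.0                          # arbitrary cost units (assumed)
start_tb = 100.0
predicted_growth, actual_growth = 1.5, 1.2   # yearly factors (assumed)
years = 3

upfront_spend = start_tb * predicted_growth ** years * price_per_tb
used_tb = start_tb * actual_growth ** years
on_demand_spend = used_tb * price_per_tb     # pay only for actual usage

print(f"Provisioned upfront for the prediction: {upfront_spend:,.0f} units")
print(f"Paid on demand for actual usage:        {on_demand_spend:,.0f} units")
print(f"Capacity paid for but never used:       "
      f"{upfront_spend / price_per_tb - used_tb:,.1f} TB")
```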
