Containers meet microservices, DevOps, and the IoT
Gordon Haff, Red Hat Technology Evangelist

To cope with an increasingly networked and interconnected world, industrial automation is evolving to incorporate many of the new or repurposed technologies underpinning general-purpose computing. For example, the Internet of Things (IoT) is especially interesting today because a number of converging technology trends are making useful solutions more practical. IoT opens vast possibilities for information gathering and automation. This in turn gives rise to new opportunities to innovate, increase revenues, and gain efficiencies, says Gordon Haff, Red Hat Technology Evangelist.

One of the key technologies being deployed to enable the easy deployment and isolation of applications running in both gateway devices and back-end servers is Linux containers. Containers provide lightweight, efficient application isolation and package applications together with any components they require to run; this avoids conflicts between apps that would otherwise depend on shared components of the underlying host operating system.

According to a Forrester Consulting Thought Leadership Paper commissioned by Red Hat, container benefits cover a wide range, with higher-quality releases (31%), better application scalability (29%), and easier management (28%) cited as the top three reasons to adopt containers. Forrester notes: “That the top benefits cited are so spread out is a testament to the broad appeal of containers to businesses with various objectives.”

Containers are part and parcel of the set of technologies and practices through which new applications are being developed in industrial automation and elsewhere. The lightweight isolation provided by containers allows them to be used to package up loosely coupled services that may perform only a single, simple function such as reading a sensor, aggregating some data, or sending a message. These small services, which can be developed and operated independently of each other, are often called “microservices.”
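
To make that concrete, the following is a minimal sketch, in Python, of what such a single-purpose service might look like: it does one thing, exposing a simulated sensor reading over HTTP. The endpoint path, port, and values are illustrative assumptions rather than part of any specific product.

```python
# A minimal sketch of a single-purpose "microservice": it reports one
# (simulated) sensor reading over HTTP as JSON. The endpoint, port, and
# values are illustrative assumptions, not from any particular product.
import json
import random
from http.server import BaseHTTPRequestHandler, HTTPServer


class SensorHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/temperature":
            self.send_error(404)
            return
        # Stand-in for reading real gateway hardware.
        reading = {"sensor": "temperature",
                   "celsius": round(random.uniform(18.0, 25.0), 2)}
        body = json.dumps(reading).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # Containerising this service means packaging the Python runtime it
    # needs alongside this one file, isolated from other apps on the host.
    HTTPServer(("0.0.0.0", 8080), SensorHandler).serve_forever()
```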

Microservices can avoid many of the pitfalls of more monolithic and complex applications in that the interfaces between the different functions are cleaner and services can be changed independently. Services are, in effect, black boxes from the perspective of other services.

These clean interactions make it easier for small teams to work on individual services, test them, and release rapidly and iteratively. That, in turn, makes it easier to implement DevOps, an approach to culture, automation, and system design for delivering increased business value and responsiveness through rapid, iterative, and high-quality service delivery. Thus containers, microservices, and DevOps, while independent in principle, mutually support and enable each other to create a more flexible and efficient infrastructure.

For example, the aforementioned Forrester Consulting study also found that containers provide an “easier path to implementing DevOps,” especially in concert with additional tools. Forrester wrote that “organisations with configuration and cluster management tools have a leg up on breaking down silos within the software development life cycle.” Almost three times as many organisations using such tools (42% vs. 15%) identified themselves as aligned with DevOps, compared with organisations using containers alone.

From a technical perspective, services running in Linux containers are isolated within a single copy of the operating system running on a physical server (or, potentially, within a virtual machine). This approach stands in contrast to hypervisor-based virtualisation in which each isolated service is bound to a complete copy of a guest operating system, such as Linux.
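
A rough way to see the difference, assuming a Linux host with util-linux's unshare command available, is the sketch below: it launches a shell in its own user, UTS, PID, and mount namespaces, yet the process still runs on the host's single kernel rather than inside a separate guest operating system.

```python
# Illustrative only: use util-linux's `unshare` to run a shell in new Linux
# namespaces. Inside, the shell sees its own hostname and process list, but
# there is no guest OS -- the host kernel is shared, unlike with a hypervisor.
import subprocess

subprocess.run([
    "unshare",
    "--map-root-user",   # new user namespace, caller mapped to root inside it
    "--uts",             # private hostname namespace
    "--pid", "--fork",   # private PID namespace (the child sees itself as PID 1)
    "--mount-proc",      # private mount of /proc so `ps` reflects the namespace
    "sh", "-c", "hostname demo && hostname && ps -e",
], check=True)
```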

All the security hardening, performance tuning, reliability engineering, and certifications that apply to the virtualised world still apply in the containerised one. In fact, the operating system shoulders a greater responsibility for providing security and resource isolation than in the case where a hypervisor is handling some of those tasks.

We’re also moving toward a future in which the operating system explicitly deals with multi-host applications, serving as an orchestrator and scheduler for them. This includes modelling the app across multiple hosts and containers and providing the services and interfaces to place the apps onto the appropriate resources. In other words, Linux is evolving to support an environment in which the “computer” is increasingly a complex of connected systems rather than a single discrete server.
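
As a deliberately simplified illustration of what placement means, the toy Python sketch below assigns containerised workloads to the host with the most free memory; the host names and memory figures are invented for the example, and real orchestrators weigh many more factors such as CPU, affinity, and health.

```python
# A toy placement sketch: put each container on the host with the most free
# memory. Real schedulers consider far more than this; the hosts and sizes
# below are made-up examples.
from dataclasses import dataclass


@dataclass
class Host:
    name: str
    free_mb: int


def place(containers: dict[str, int], hosts: list[Host]) -> dict[str, str]:
    """Map each container name to a host name, greedily by free memory."""
    placement = {}
    # Place the largest workloads first.
    for name, needed_mb in sorted(containers.items(), key=lambda kv: -kv[1]):
        host = max(hosts, key=lambda h: h.free_mb)
        if host.free_mb < needed_mb:
            raise RuntimeError(f"no host can fit {name} ({needed_mb} MB)")
        host.free_mb -= needed_mb
        placement[name] = host.name
    return placement


if __name__ == "__main__":
    hosts = [Host("gateway-1", 512), Host("edge-node-2", 2048)]
    workloads = {"sensor-reader": 64, "aggregator": 256, "uplink": 128}
    print(place(workloads, hosts))
```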

This represents an ongoing abstraction of the operating system: we’re moving away from the handcrafted and hardcoded operating system instances that accompanied each application instance, just as we previously moved away from operating system instances lovingly crafted for each individual server.

In addition to its role in securing and orchestrating containerised applications in an automated way, the operating system is also important for providing consistency (and therefore portability) in other respects. For example, true container portability requires being able to deploy across physical hardware, hypervisors, private clouds, and public clouds. It requires safe access to digitally signed container images that are certified to run on certified container hosts. It requires an integrated application delivery platform built on open standards from application container to deployment target.
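
As a hedged sketch of one small piece of that trust chain, the snippet below checks a downloaded image archive's SHA-256 digest against a trusted value before allowing deployment; real container tooling verifies cryptographic signatures over image manifests, and the file path and digest here are purely illustrative.

```python
# Minimal sketch: refuse to deploy an image archive whose SHA-256 digest does
# not match a trusted, out-of-band value. Real tooling goes further and
# verifies signatures over image manifests; paths and digests are examples.
import hashlib
import sys


def sha256_of(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    image_path, expected = sys.argv[1], sys.argv[2]
    if sha256_of(image_path) != expected:
        sys.exit("digest mismatch: refusing to deploy image")
    print("digest verified: image may be deployed")
```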

Add it all together and applications are becoming much more adaptable, much more mobile, much more distributed, and much more lightweight. Their placement and provisioning are becoming more automated. They’re better able to adapt to changes in infrastructure and process flow driven by business requirements.

All of this requires the operating system to adapt as well, while building on and making use of existing security, performance, and reliability capabilities. Linux is doing so in concert with other open source communities and projects, not only to run containers but to run them in a portable, managed, and secure way.
