Micro-virtualization

Containers are here to stay, it seems. Four in five CIOs will be increasing their investment in the technology, according to a survey by tech firm Robin Systems. But what are they, and how should you use them?

Containers are a way to deploy and manage applications while shielding them from the underlying IT infrastructure. This enables CIOs to deploy highly scalable and resilient apps more easily. Because containers aren’t tied to a particular piece of hardware, they can be deployed en masse on different computers, enabling them to run in parallel. This lets CIOs scale out their applications, running many of them at once to share workloads. Then, if a single container fails for some reason, there are many others to pick up the slack.

“With this distributed cloud you’re dealing with many more nodes, and you have a different challenge,” explained Kamesh Pemmaraju, vice-president of marketing at Mirantis, which provides a distribution of the OpenStack private cloud system, alongside cloud consulting services. “Containers can isolate those services and move them around.”

Containers often go hand in hand with microservices: small pieces of software, each designed to deliver a specific business function, such as a currency converter or an ecommerce shopping cart. Microservices talk to each other over lightweight messaging buses, or directly using REST-like protocols. They are quick to write and update, which shortens the software development cycle and lets companies develop and deploy software more quickly.
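A microservice of this kind can be very small indeed. The sketch below shows what a REST-style currency converter might look like, using only Python's standard library; the endpoint path, the port and the exchange rates are invented here purely for illustration.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# Illustrative exchange rates, keyed against a USD base (not real data).
RATES = {"USD": 1.0, "EUR": 0.92, "GBP": 0.79}

def convert(amount, source, target):
    """Convert an amount between two currencies via the USD base rate."""
    return round(amount / RATES[source] * RATES[target], 2)

class ConverterHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Expected call: GET /convert?from=USD&to=EUR&amount=100
        url = urlparse(self.path)
        if url.path != "/convert":
            self.send_error(404)
            return
        query = parse_qs(url.query)
        try:
            result = convert(float(query["amount"][0]),
                             query["from"][0], query["to"][0])
        except (KeyError, ValueError):
            self.send_error(400, "missing or malformed parameters")
            return
        body = json.dumps({"result": result}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To expose the service on port 8080:
# HTTPServer(("", 8080), ConverterHandler).serve_forever()
```

In a container deployment, many identical copies of a service like this would run in parallel behind a load balancer, each in its own container, which is what makes the scale-out and failover described above possible.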

Containers differ from virtual machines in at least one crucial way: a container encapsulates the application it runs together with its dependencies, bundling the microservice with the libraries and other components it needs, while sharing the host operating system's kernel with its neighbours. A virtual machine, by contrast, carries an entire guest operating system of its own. This makes containers far smaller than virtual machines.
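That bundling is typically described in an image recipe. The hypothetical Dockerfile below sketches the idea: the application's dependencies are baked into the image, but no kernel is included, because the running container borrows the host's. The file names and base image are assumptions for illustration.

```dockerfile
# Start from a slim base image that supplies the userland,
# not a full operating system with its own kernel.
FROM python:3.12-slim
WORKDIR /app

# Bake the service's library dependencies into the image.
COPY requirements.txt .
RUN pip install -r requirements.txt

# Add the microservice itself (hypothetical file name).
COPY converter.py .
EXPOSE 8080
CMD ["python", "converter.py"]
```

Every container started from this image carries the same libraries and application code, which is why identical copies can be deployed en masse across different machines.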

The Robin Systems survey may be a little self-serving – the company sells container software, and surveys from vendors with a vested interest in the technology must always be taken with a healthy pinch of salt. Nevertheless, there's no mistaking the general interest in containers over the last couple of years.

The Robin Systems survey interviewed more than 200 respondents from various industry verticals. It found that 35 per cent of them were already using containers in production, while another quarter were experimenting with the technology. They were particularly popular for running databases, and for big data applications like Hadoop, the survey said.

The open-source container platform Docker brought container technology into the spotlight, initially with support for Linux, although it has since struck deals with Microsoft to support Docker containers on the Windows platform. Containers are not a new technology, though: Linux Containers (LXC) have been around since 2008, letting developers run their own applications in a container-based system on Linux.

The Robin Systems survey found that LXC still has lots of traction. Of those people using containers in production, 39 per cent of them used LXC, compared to 45 per cent who used Docker.


