Why container tech is the backbone of DevOps

Containerisation is one of the most exciting tech trends to emerge over the last few years. Designed to work at operating system level, it's a popular virtualisation method that allows IT professionals to deploy and distribute applications easily.

This form of application management, named after shipping containers, allows almost any application to be packaged up and moved between environments quickly. A container bundles an application with its dependencies and runs it in an isolated environment on a shared host, which removes the need to install and configure those dependencies on every machine the application runs on.

Containerisation is being used across a range of areas in technology, particularly DevOps, a practice that unifies software development and software operations. There's growing demand for new ways of managing application systems, and containers represent a big change in the industry, enabling easier migration from legacy platforms to more modern ones, including the cloud.
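As a minimal illustration of that isolation, the sketch below runs a throwaway command inside a container: the interpreter and libraries come from the image rather than from the host machine. It assumes the Docker CLI is installed and the Docker daemon is running, and the image used is simply a public example.

```python
# Minimal sketch: run a containerised command without installing its runtime on the host.
# Assumes the Docker CLI is installed and the Docker daemon is running.
import subprocess

# The Python interpreter and its libraries come from the python:3.12-slim image,
# not from the machine running this script; --rm removes the container afterwards.
subprocess.run(
    ["docker", "run", "--rm", "python:3.12-slim",
     "python", "-c", "print('hello from an isolated container')"],
    check=True,
)
```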

Reliable storage option

Based in London, StorageOS is one of the companies at the cutting edge of this technology. The firm has developed a software-based, decentralised storage platform for containers, designed specifically for developers and professionals working in DevOps. Chris Brandon, CEO of the company, says there's an increasing need for reliable storage options for teams that work on large numbers of applications every week.

"In order to meet the requirements of existing applications, and specifically with the increase of cloud native apps, we are seeing a rapid increase in the adoption of persistent storage for containers in the enterprise. This is being led by DevOps teams who, as they create and develop hundreds of new applications weekly, are reliant on the right environments to give them the agility to spin up projects with their databases quickly. Containers are ideal for these scenarios," he tells Cloud Pro.

While container technology has certainly created a lot of buzz in the industry, it's not perfect, Brandon says. He calls on companies to keep their options open and explore cloud native storage options in particular. "Even though container technology has passed the early adopter phase, organisations are still struggling to find the right storage solutions for the evolving container world," he adds.

"Organisations need to look for cloud native storage that offers persistent enterprise storage designed specifically for the container ecosystem. Containers are the engine running the DevOps industry and as the container market continues to explode, getting the storage right will allow everyone to take advantage of the technology."

Easing processes

Deploying multiple applications can be a time-consuming process, but containers are capable of making this easier for companies. Derek Weeks, of software automation specialist Sonatype, says this technology is transforming the DevOps industry in a number of ways.

"The simple fact is that adopting containers eases successful deployment of applications on premises or in the cloud. Fast development and deployment is the backbone of a DevOps culture, so it stands to reason that something that enables this, such as containers, is the backbone of the backbone. This is backed up by Docker's predictions that container pulls from DockerHub alone would double from 6 billion in 2016 to 12 billion in 2017," he says.

But there are caveats, he says, explaining that containers aren't universal and can differ in terms of quality, especially when it comes to security. Companies need to be aware of this. "Not all containers are created equal. With every organisation that adopts containers relying on a software supply chain which can include hundreds of suppliers and thousands of container versions, quality and security are paramount. But there is still some way to go; in our DevSecOps Community Survey, 88% of companies revealed that they were concerned about security, yet only 53% were using technologies to help them regularly assess their containers' security," he continues.

"To drive maximum value from containers and improve security practices, companies must continuously assess their container images for vulnerabilities and vulnerable dependencies. After all, if you can't see them all, you can't fix the ones at risk. As no container can be perfectly secure, the best organisations will use software supply chain automation to locate those vulnerable containers, recall, and update the image, and then redeploy safer versions within minutes.

"The first rule of DevOps is to never pass a known defect downstream. The second rule is to enable rapid correction when failures occur. Managing containers in an automated software supply chain is the true DevOps approach."

Boosting productivity

Jitendra Thethi, assistant vice president of technology and enterprise at software company Aricent, has worked with container technology in the past. Much of his role involves scaling cloud services and developing new software by leveraging emerging technologies. He says containers can improve efficiency and performance in DevOps.

"Containers enable applications to be scaled and can run on different environments across clouds. The developers will be responsible for packaging container images that get published on a repository.

"The operations team can then pull these container images and run them without the need for change. This makes DevOps practices much more effective and efficient, as it eliminates the need for manual input where errors can happen. It's worth bearing in mind that automation tools enable the construction of container images as part of their build cycle. Continuous Integration (CI) tools can take a source code, construct the build, run unit tests, coverage and [create] an image that becomes environment agnostic.

"Filesystems from software containerisation platforms can make the incremental additions to images much more efficient by enabling the layering of applications. New versions will affect only the application layers and the other layers, like dependencies, can be reused."

Security focus

Tim Mackey, a technical evangelist for Black Duck Software, also specialises in container solutions. His company's software helps firms locate, manage and secure open source code, and he recently gave a talk at DevSecCon about what happens when good containers go bad. Mackey says companies need to ensure they have the right security in place if container technology is to be effective.

"For all its virtues, containerisation doesn't overcome the need for application security. And since a substantial element of most application software is open source code, organisations need to have processes in place to ensure any vulnerabilities in open source, including in containers, are patched as soon as fixes are published," he says.

"When you're using commercial software, the vendor is responsible for deployment guidance, security vulnerability notification, and solutions for disclosed vulnerabilities. If you're using open source software, those responsibilities shift. When Black Duck analysed audits of over 1,000 commercial applications we found the average application included 147 unique open source components. Tracking down the fork, version, and project stability for each component is a monumental task for development teams.

"We know that containerisation has increased the pace of deployment, creating questions of trust for many administrators. The key to protecting your applications in production is by maintaining visibility into your open source components and proactively patching vulnerabilities as they are disclosed. And if you assume from the outset your containers will be compromised, you can prepare by making changes that make it much harder to mount an attack from a compromised container."

Many businesses still rely on ageing infrastructure to handle their application needs, and while container technology is relatively new, it offers considerable potential. DevOps specialists deal with heavy workloads, but by making use of containers they can streamline processes, boost productivity and speed up operations. We'll no doubt see more firms adopt the technology as it continues to evolve.

Nicholas Fearn is a freelance technology journalist and copywriter from the Welsh valleys. His work has appeared in publications such as the FT, the Independent, the Daily Telegraph, the Next Web, T3, Android Central, Computer Weekly, and many others. He also happens to be a diehard Mariah Carey fan. You can follow Nicholas on Twitter.