By now, containers are a familiar concept for your IT department, and the most agile small to midsize businesses (SMBs) and enterprises may already be using them. In application development and IT infrastructure, Linux containers are about as trendy as this kind of technology gets.
In fact, the application container market is projected to expand to $2.7 billion by 2020. Meanwhile, Gartner forecasts that more than 50 percent of global organizations will run containerized applications in 2020. From a technology standpoint, containers are the catalyzing agent of the DevOps equation: they're the convenient package through which DevOps and IT teams can quickly and consistently pass an application's code, configurations, and dependencies back and forth. But what does that actually mean for your business? This explainer lays out not only what containers are and how they work, but also the different ways in which, once you understand the technology, your organization can use containerized deployments atop your data center or cloud infrastructure to deliver quality software faster.
At their most basic level, Linux containers are aptly named for the metal shipping containers to which they're so often compared. Whether it's on a freight ship, a cargo train, or the back of a big rig truck, the container itself is the same uniform vessel for transporting goods. Businesses should simply look at containers as a new unit of work; they're all about agility.
In a complex organization, agility is about assigning responsibilities along with the freedom to deliver features. Containers give you the technology to keep that work packaged together while you still manage your responsibilities for security, availability, and regulatory compliance, all the stuff that matters.
In this way, containers are easy-to-use building blocks. They're small, pluggable units upon which you can build a microservices architecture that accounts for operational efficiency and version control. At the same time, they give DevOps and IT teams granular control over how they deploy infrastructure resources. Containers are fundamentally an operating system (OS) technology: they slice the OS into two pieces. On one side is the unit of work for the application, which packages the application's code and dependencies in a way DevOps teams can optimize, giving them the autonomy and control to make decisions when they want to rather than waiting on other teams. On the other side is the OS kernel. The kernel and the container payload together provide support for the resources and primitives you want available, such as storage, networking, and security. Because containers are an OS technology, you can run them anywhere, be it on virtual hosts or in a public cloud. That hybrid quality lets you manage any application in any environment using the same technology while still empowering DevOps teams.
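To make that split concrete, here's a minimal sketch using the Python SDK for Docker (an assumption for illustration; the concept applies to any container engine). The image carries the application payload, while the host's Linux kernel supplies the OS services underneath.

```python
import docker  # pip install docker; requires a local Docker Engine

client = docker.from_env()  # connect to the local Docker Engine

# The image bundles the application payload (code plus its dependencies);
# the host's Linux kernel supplies storage, networking, and isolation.
output = client.containers.run(
    "python:3.11-slim",  # base image with the runtime baked in
    ["python", "-c", "print('hello from a container')"],
    remove=True,         # clean up the container when it exits
)
print(output.decode())
```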
Containers are also not the same thing as virtualization; the two are complementary forces. Virtualization emulates a virtual hardware environment to run various software stacks, providing what's called an abstraction layer that gives a cloud-computing environment flexibility over how applications and data are structured and deployed. Upon a single virtualized OS kernel, you can then run multiple servers or instances, and containers are those instances.
There is still a lot of confusion around mixing containers with virtualization. Virtualization solves a different problem: it provides both the concept (the abstraction) and a full copy of the OS for each instance, whereas containers give you a similar concept without the copy. Together, they give you no measurable overhead and a ton of operational efficiency, but it can be tough to keep the two separate.
Container Landscape
The DevOps and agile principles we're talking about with containers aren't new; they go back to the concept of service-oriented architecture (SOA). But the modern Linux container era began when Docker changed the game. Docker is a few different things but, first and foremost, it's a wildly popular open-source technology developed by the Docker project in 2013 and designed for packing, shipping, and running any application as a lightweight container. In 2017, Docker added the ability to run Linux containers on Windows using Hyper-V technology.

Docker is among several open-source projects helping to shape the technology and the space. Kubernetes, originally developed by Google and now managed by the Cloud Native Computing Foundation, is an open-source system for automating container deployment, scaling, and management. Docker and Kubernetes are the two powerhouse open-source projects that hold the most sway over the development of the technology. In fact, Docker released Docker Enterprise Edition (EE) 2.0, which lets users manage and secure their workloads in Kubernetes across a multi-Linux, multi-OS, or multi-cloud environment. That flexibility reduces the chance of companies being locked in to a particular technology or infrastructure. Docker says EE 2.0 also lets companies gain more cost efficiency by managing applications from a single control interface to keep track of images, storage, and networks.
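For a sense of what that automation looks like from a developer's seat, here's a hedged sketch using the official Kubernetes Python client; the Deployment name, namespace, and replica count are hypothetical, and it assumes you have cluster credentials in ~/.kube/config.

```python
from kubernetes import client, config  # pip install kubernetes

config.load_kube_config()  # read cluster credentials from ~/.kube/config
apps = client.AppsV1Api()

# Ask Kubernetes to scale a hypothetical Deployment named "web" to three replicas;
# the control plane then starts or stops containers to match the desired state.
apps.patch_namespaced_deployment_scale(
    name="web",
    namespace="default",
    body={"spec": {"replicas": 3}},
)
```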
Dozens of other companies and projects, including Red Hat's Project Atomic (for combined Docker/Kubernetes stacks) and the Linux Foundation's Open Container Initiative, aim to create open industry standards around containers. For Docker, it was Docker images that set the development world on fire. A container image packages the code, libraries, and configuration files needed to run an application in any location. Containers had mostly been a way to put services on the same node until Docker introduced the notion of image-based deployment. Of course, when it comes to enterprise container management, Docker isn't alone in the space. Red Hat offers its own enterprise containers-as-a-service (CaaS) suite of developer tools across its Red Hat Enterprise Linux (RHEL), OpenShift, and JBoss products.
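As a rough illustration of image-based deployment, again using the Python Docker SDK and a hypothetical ./myapp directory that contains a Dockerfile, the same image can be built once and then run unchanged on any host with a container engine:

```python
import docker  # pip install docker

client = docker.from_env()

# Build an image from a hypothetical ./myapp directory containing a Dockerfile.
# The resulting image packages the code, libraries, and config files together.
image, build_logs = client.images.build(path="./myapp", tag="myapp:1.0")

# The same image runs unchanged on a laptop, a VM, or a cloud host
# (assumes the Dockerfile defines a default command to execute).
client.containers.run("myapp:1.0", detach=True)
```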
More and more big-name tech companies have been getting in on the action. Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP) have all built container orchestration and management tools into their respective cloud infrastructure-as-a-service (IaaS) platforms. Microsoft announced a partnership with Red Hat to let developers run container-based software in Azure, and around the same time IBM said it would expand its collaboration with Red Hat to enable developers to build applications on an integrated container platform. In a matter of a few short years, the container space has gotten quite crowded.
What Business Problems Do Containers Solve?
When it comes to implementing modern application architectures and DevOps principles within an enterprise, containers are the answer to a number of problems. Particularly when an organization is entrenched in legacy technology and traditional development policies, containers are an easily integrated platform underneath it all that can smooth the transition and ease the burden on the IT department. Containerization is the most practical way to introduce cloud, DevOps, and microservices into your environment, and containers integrate naturally with the technologies enterprise businesses already have.
All companies are looking to put out software faster to compete in their own markets, and that pressure often falls on an overworked IT department. Containers are a way to produce applications and services that can be changed faster, whether that means adding a new feature or shipping a critical security fix, while maintaining quality. Investing in platforms like containers is a way to ensure the organization is ready to start building out something like microservices. Microservices and containers together are where the real power is. There's nothing interesting about a single microservice; it's only in plurality that you see a cooperating network made up of discrete chunks of functionality.
Evaluating whether to invest in and adopt containers is not only about technology. For a successful transition to DevOps, which incorporates containers and microservices, you'll need the architecture, the underlying platforms, and agile processes in place. This is not just a technology decision: you need to think about whether your organization is ready, whether you have particular software-delivery problems you need to solve, and what the business drivers around automation and DevOps look like. Understand your key requirements, look at different projects' needs, and then decide what combination of cloud, application architecture, and container technology can make it happen. Below are three pieces of advice for enterprise IT departments looking at how containers fit into their organization:
1. Get Started
The combination of DevOps, agile, containers, and microservices is not just a technology change in isolation; it's a journey that leads to a pretty significant transformation in how your business operates. The first piece of advice is to get started, because your competitors already have. Waiting while early adopters put the pieces together is the wrong strategy, because you may not catch up.
2. Comprehensive Vision
You should approach containers from a more holistic viewpoint. Pick your most important goal; the ability to deliver software faster is a great starting point. Based on that one objective, think about how to bring your organization into the process and how to structure the work without risk and churn for your existing operations.
3. Ecosystem
Many enterprises are reaching the point where they're constrained by legacy architecture, processes, and platforms. You can't change the architecture without thinking about the platforms you want to rely on, but then the question becomes: Who do I work with? Who do I talk to? Experts recommend looking for companies that can help not just with the technology problems but with managing the transformation across all of these dimensions: technology, process, and the organization itself. When dealing with cloud, DevOps, containers, and microservices all together, you want to rely on an ecosystem that can help you deliver success in a short period of time and spare you from dead ends.