
Containerized applications accelerate the development workflow so you can ship code more quickly. Here’s how adopting containers increases development velocity and developer satisfaction.

1. Fully Portable Environments

The first and most obvious benefit of container-driven development is the ease with which you can transition code between environments. Once you’ve built a Docker image for your app, anyone can run it using docker run in their terminal.
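As a minimal sketch, the whole workflow is two commands; the image name `example/app` and the port mapping are placeholders for your own project:

```shell
# Build an image from the Dockerfile in the current directory,
# then start a detached container from it, mapping port 8080 on
# the host to port 80 inside the container.
docker build -t example/app:latest .
docker run -d -p 8080:80 --name app example/app:latest
```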

This makes it easier to onboard new developers and keep code in sync between machines. You can bid farewell to the days of working through long setup documents to bring up a dev environment that falls over a few hours later.

Docker virtually eliminates environmental dependencies. As long as you’ve got a Docker host available, you’ll be able to start your stack. This applies equally whether you’re on your own laptop or you’re bringing up services in a production cloud. You spend less time getting set up, applying changes, and managing infrastructure, giving you more opportunity to concentrate on development tasks.

2. Microservices and Code Reuse

Containers are self-contained, decoupled, and disposable. They’re the foundation of microservice architectures, where individual components of your stack become self-sufficient services that interlink with each other through well-defined interfaces.

You could have containers for your authentication gateway, user session provider, payment processor, and web API. Pulling these pieces apart helps avoid tight coupling between components, which can create technical debt over time. Developers get an improved experience as each team has its own project it can build, test, and deploy, without running all the other aspects of the stack.
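A Compose file is one way to sketch such a stack, with each concern isolated in its own service. The service names and directory layout below are illustrative, not prescriptive:

```yaml
# docker-compose.yml -- each service builds from its own directory,
# so teams can work on one component without touching the others.
services:
  auth-gateway:
    build: ./auth-gateway
    ports:
      - "8080:8080"
  payment-processor:
    build: ./payment-processor
  web-api:
    build: ./web-api
    depends_on:
      - auth-gateway
```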

In larger projects, it’s common to find similar code implemented slightly differently in several areas of the codebase. Containerizing this into a dedicated microservice formalizes its shared role in the system. Consuming code could then call the new service to perform required operations. Further components could hook into the service too, saving development time later in the project’s life.

3. Scalability

How many concurrent service instances do you need? Whether it’s one, 50, or 500, using containers lets you scale on-the-fly without reprovisioning infrastructure or specially configuring your code.

This goes hand-in-hand with microservice-driven approaches. Combining microservices with a container orchestrator like Kubernetes lets you dynamically scale individual parts of your stack. This gives you the flexibility to run extra API servers during periods of peak demand, or spin up additional payment handlers during an unexpected flood of orders.
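With Kubernetes, scaling a component like this is a one-line operation. The Deployment name `api-server` is hypothetical:

```shell
# Scale a hypothetical "api-server" Deployment up for peak demand,
# then back down once traffic subsides. No code changes involved.
kubectl scale deployment/api-server --replicas=10
kubectl scale deployment/api-server --replicas=2
```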

Scalability might sound like it’s part of the operations team’s remit, but it still has a direct effect on development velocity. Services that you can scale in seconds by changing configuration files, and which discover each other automatically through a mesh, can be implemented without needing to touch any code. The mesh handles the intricacies of working out which containers to direct traffic to.

As far as the codebase is concerned, it connects to mysql-service to store data, and stripe-service for payments. The DNS names actually get resolved to one of the possible service replicas each time a request is made, but the code doesn’t need to know this. Developers can home in on functionality without worrying about architectural topologies.
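In Kubernetes terms, that stable name comes from a Service object. A minimal sketch, assuming a set of Pods labeled `app: mysql`, might look like this:

```yaml
# A Service gives a group of replicas one stable DNS name;
# "mysql-service" matches the name the application code connects to.
apiVersion: v1
kind: Service
metadata:
  name: mysql-service
spec:
  selector:
    app: mysql
  ports:
    - port: 3306
      targetPort: 3306
```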

4. Automated Deployments

Containers have revolutionized deployment workflows by offering an improved approach to automated rollouts via CI/CD pipelines. Once you’ve packaged your app as a container image, you can capitalize on the portability advantage by launching your image onto any available Docker host.

Automating deployments can have a transformative impact on engineering teams. Code releases become quick and easy, letting improvements and fixes reach customers far more rapidly than manual release flows allow.

Deployments typically occur each time you merge code into your main branch. Developers can get straight back to coding instead of tagging releases and running build scripts on their machines.
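A pipeline for this pattern can be very short. This GitHub Actions sketch builds and pushes an image on every merge to main; the registry address and image name are placeholders:

```yaml
# .github/workflows/deploy.yml -- minimal sketch of build-on-merge.
# Tagging with the commit SHA makes every build traceable.
name: Deploy
on:
  push:
    branches: [main]
jobs:
  build-and-push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: docker build -t registry.example.com/app:${{ github.sha }} .
      - run: docker push registry.example.com/app:${{ github.sha }}
```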

Although it’s possible to automate deployments without containers, the fact that containers are fully self-sufficient and independent of their operating environment makes them the most suitable contender for hosting your workloads. If you need to roll back a release, simply revert to an older image tag.
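On Kubernetes, that rollback is a single command. The Deployment name, container name, and registry path here are hypothetical:

```shell
# Point the "app" container in the "app" Deployment back at a
# known-good image tag; Kubernetes rolls the change out gradually.
kubectl set image deployment/app app=registry.example.com/app:v1.4.2
```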

There are several ways to combine containers with CI/CD pipelines to create a deployment system. It’s common to deploy to an orchestrator like Kubernetes that can fetch your image, create multiple replicas, and handle the release’s lifecycle. You could use Docker Swarm mode or even plain Docker Compose over SSH as simpler alternatives.

5. Keep Everything As Code

In the same vein as how containers facilitate automated deployments, they also enable Infrastructure as Code methodologies. The act of writing a Dockerfile creates a declarative representation of your app’s build routine and dependencies.
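Even a short Dockerfile captures the build routine declaratively. This Node.js example is a sketch; the base image and commands will differ for your own stack:

```dockerfile
# Every dependency is stated explicitly, so the build is reproducible
# on any Docker host.
FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
CMD ["node", "server.js"]
```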

Adding a docker-compose.yml or a set of Kubernetes resource manifests lets you describe how to launch your stack into a ready-to-run state. These files not only facilitate automation but act as self-documenting information repositories for others to learn how the system works.

Less time spent searching docs and running manual setup scripts tightens the development loop and helps keep everyone on the same page. Many developers welcome GitOps approaches where everything related to a project is stored as version-controlled text files. Docker’s compatible with this methodology by default, keeping you within familiar development workflows when you’re adjusting build steps and deployment routines.

6. Make Your App Cloud-Native

Containers are an integral part of the cloud-native paradigm. This refers to maximizing your stack’s ability to benefit from the opportunities offered by public cloud platforms.

Effective adoption of a cloud-native mentality increases the velocity of the whole software development lifecycle. It creates a tighter loop by lowering the barriers between development stages. Cloud-native systems are usually automated, observable, and predictable, enabling inferences to be made about the system’s state. This reduces the work developers must do to understand the cause of an issue and the remediation that’s required.

Containers are so important to cloud-native because they possess these traits themselves. You define image builds as a set of programmatic steps, monitor running instances with centralized logging tools, and benefit from a highly predictable environment where all dependencies are clearly defined in a Dockerfile. At cloud-native scale, a container orchestrator can aggregate metrics and logs from multiple containers and distribute traffic across them, using unified interfaces to surface and supply information automatically.

7. Narrowing the Dev-Ops Divide

Containers help to narrow the developer-operations divide, facilitating a unified DevOps workflow. This can have far-reaching impacts on your work as it advocates tight-knit collaboration, breaks down information silos, and formalizes the relationship between developers, traditionally seen as implementors, and operations teams, previously seen as managers.

In practice the two roles have many overlaps and regularly feed into – or block – each other. A container-based development model means everyone can use the same set of tools, whether they’re building the application or rolling it out to production. Less time is wasted “handing off” code for release – developers can simply build a new container image and push it into the shared repository.
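The “handoff” can be as simple as publishing to the team’s shared registry; the registry address and tag below are placeholders:

```shell
# Build a release image and push it where operations can pull it.
docker build -t registry.example.com/app:1.2.0 .
docker push registry.example.com/app:1.2.0
```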

As containers are portable across systems, personal preferences can still be accommodated. Developers may want to use a Linux workstation while operations staff use Windows. Regardless of platform choice, both groups can install Docker, pull the application image, and get a new stack live.

Conclusion: Containers Accelerate Development

Containers have become one of the most popular developer tools of the past decade. They solve several common pain points, including “it works on my machine” and the question of how to construct scalable architectures formed from microservices.

Transitioning to a containerized development workflow can be time-consuming, especially when you’re working with legacy applications. It’s best to think of the process as a long-term investment that will accelerate development velocity and create a more satisfying experience. Even if you don’t see all the benefits straightaway, producing a working Docker image of your development environment will make it easier to bring your next new starter onboard.

James Walker
James Walker is a contributor to How-To Geek DevOps. He is the founder of Heron Web, a UK-based digital agency providing bespoke software development services to SMEs. He has experience managing complete end-to-end web development workflows, using technologies including Linux, GitLab, Docker, and Kubernetes.