
Kubernetes has become one of the most popular ways to run containerized workloads in production. It simplifies deploying, scaling, and maintaining the containers that run your service.

Kubernetes has arguably been less impactful in the arena of local development. It’s common for developers to build and test new services using plain Docker containers, perhaps arranged into stacks with Docker Compose. Running a local Kubernetes cluster is often seen as an extra layer of complexity and overhead.

In this article, we’ll explore some of the reasons why you should use Kubernetes for local development, as well as in your staging and production environments. We’ll also look at some of the tools you can use to implement a local Kubernetes development workflow.

What Makes a Good Development Environment?

Effective local development environments should closely mimic production infrastructure while offering a tight feedback loop that enables rapid iteration. These two objectives sit at opposite ends of a spectrum and need to be balanced against each other to achieve maximum throughput.

On the one hand, optimizing for maximum replication of production will give you the greatest chance of eliminating environment-specific bugs. However, deploying to real production-like infrastructure can be a time-consuming process that requires a CI pipeline run and new cloud resources to be provisioned. Waiting for these procedures to complete after each change would slow down development.

Conversely, focusing only on iteration speed can cause development to deviate from how production works. This can cause users to experience issues that the engineering team never encounters.

How Development Kubernetes Clusters Help

Containerization is already a powerful technology for balancing production environment similarity with ease of iteration. Running containers in both development and production guarantees the application environment and its filesystem are consistent each time they’re deployed.

Using Kubernetes in production adds a new technology to your stack. It brings its own concepts, best practices, and potential incompatibilities. Although individual containers remain the same, you have an extra layer handling inbound traffic, networking between services, and peripheral concerns such as configuration and storage.

Running your development environment in Kubernetes lets you replicate these differences as you build your solution. A cluster running on your local machine might still not exactly replicate your production infrastructure but it’ll be a closer match. This makes testing more realistic, reducing the number of blind spots.

Shortening the Feedback Cycle

Kubernetes in development shortens the iteration feedback cycle. This is one of the aspects of a good development experience described above.

You’ll be better able to replicate user issue reports when your test environment runs the same technologies as production. It means you can experiment, iterate, and debug without deploying to a live environment each time you make a change.

The effects of each revision can be quickly observed in your local Kubernetes cluster, permitting greater throughput. Developers can rapidly test theories and evaluate new solutions, even if the problem lies in something specific to Kubernetes such as connections between services. This wouldn’t be possible if Kubernetes were reserved for production use.

Narrowing the Gap With Operations

Using Kubernetes as a development tool narrows the gap between engineering and operations. Developers obtain first-hand experience of the tools and concepts that the operations team uses to maintain production workloads.

Awareness of the bigger picture can help developers preempt issues in production. They’ll also gain an understanding of the challenges associated with operating the service. If operators suffer from logs and metrics that are incomplete or tricky to retrieve, developers will now encounter the issue too as part of their own work.

Setting Up a Local Kubernetes Environment

You have several options for setting up a Kubernetes cluster for development use. One approach is to create a new cluster in your existing cloud environment. This provides the best consistency with your production infrastructure. However, it can reduce efficiency as your development operations will be running against a remote environment.

Developers often prefer to run their own cluster locally for enhanced ease of use. Projects such as Minikube and MicroK8s simplify deploying Kubernetes clusters on your own hardware. As an example, you can start a MicroK8s cluster from its Snap package by running a single command:

$ sudo snap install microk8s --classic

Once MicroK8s has started up, you can interact with your cluster using the bundled version of kubectl:

$ sudo microk8s kubectl apply -f my-pod.yaml
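The contents of my-pod.yaml aren’t shown in this article, but a minimal manifest for local testing might look something like the following sketch (the Pod name, labels, and image are illustrative, not taken from the original):

```yaml
# Minimal Pod manifest for local testing; the name, label, and
# image below are illustrative placeholders
apiVersion: v1
kind: Pod
metadata:
  name: my-pod
  labels:
    app: demo
spec:
  containers:
    - name: web
      image: nginx:1.25
      ports:
        - containerPort: 80
```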

You may find Kubernetes is already available within your containerization platform. Docker Desktop for Windows and Mac includes a built-in Kubernetes cluster that you can activate inside the application’s settings. Rancher Desktop is another utility that combines plain container management with an integrated Kubernetes cluster.

Whichever solution you use, you should configure your cluster so it matches your production environment as closely as possible. Consistently use the same Kubernetes release to avoid unexpected incompatibilities and mismatched API versions.
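One quick way to spot release drift is to compare the Kubernetes versions your local and production clusters report. Assuming kubectl is configured with a context for each environment, a check could look like this (context names are hypothetical, and the output format varies between kubectl releases):

```shell
# Print the client and server versions for the local cluster
kubectl version --context my-local-cluster

# Repeat against the production context and compare the
# reported server versions; they should match
kubectl version --context my-production-cluster
```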

The Limitations

Development environments will always differ from production in some ways. That’s alright as long as the discrepancies are acknowledged and understood.

When you’re running Kubernetes across all your environments, you’re most likely to come across these limitations:

  • Limited resources – Most development clusters will be running locally, on a single machine. A production cluster could span dozens of Nodes and support hundreds or thousands of Pods. This can make it difficult to accurately gauge how well your application scales.
  • Configuration discrepancies – When each developer provisions their own cluster, they could change settings that make it behave differently to other environments. It’s important to standardize on a single distribution – such as Minikube or MicroK8s – and a specific Kubernetes release to avoid these problems.
  • Vendor differences – Local Kubernetes deployments such as Minikube and MicroK8s have some inherent differences compared to cloud clusters deployed from vendors like Amazon EKS, Azure AKS, and Google GKE. Storage and networking integrations all work differently across providers, so there’s still opportunity for incompatibilities to creep in.
  • Complexity and learning curve – Using Kubernetes in development adds inevitable complexity. Although we’ve shown the benefits of the approach, it’s important to also acknowledge the burden imposed on developers – particularly newcomers to a team – when they have to learn Kubernetes and stay updated on its regular release cadence.
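To address the configuration-discrepancy point above, both Minikube and MicroK8s let you pin the Kubernetes release when creating a cluster, so every developer on the team runs the same version. The specific version numbers below are illustrative:

```shell
# Minikube: request a specific Kubernetes release at cluster creation
minikube start --kubernetes-version=v1.28.3

# MicroK8s: install from a version-specific snap channel instead of
# the default track, which follows the latest stable release
sudo snap install microk8s --classic --channel=1.28/stable
```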

Most of the issues above can be resolved by providing an internal development cluster that’s centrally managed by a DevOps admin. You can use Kubernetes namespaces and RBAC controls to set up isolated areas for each developer to work in. While this guarantees standardization of Kubernetes distribution, version, and resource availability, it can reduce developer autonomy as they no longer own their cluster. It can also create bottlenecks when many engineers are waiting for new changes to be deployed inside the shared cluster.
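As a sketch of that shared-cluster setup, a per-developer area might pair a Namespace with a RoleBinding that grants Kubernetes’ built-in `edit` ClusterRole inside that namespace only. The namespace and user names here are illustrative:

```yaml
# Isolated workspace for one developer; names are illustrative
apiVersion: v1
kind: Namespace
metadata:
  name: dev-alice
---
# Bind the built-in "edit" ClusterRole to the developer, scoped to
# their namespace, so they can deploy without cluster-wide access
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: alice-edit
  namespace: dev-alice
subjects:
  - kind: User
    name: alice
    apiGroup: rbac.authorization.k8s.io
roleRef:
  kind: ClusterRole
  name: edit
  apiGroup: rbac.authorization.k8s.io
```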

Conclusion

Using Kubernetes in development lets you test software in an environment equivalent to staging and production. This can help you catch issues earlier, before they’re found by your users. Adopting Kubernetes as a development tool also gives engineers familiarity with how your application is deployed to production.

Switching to Kubernetes can seem daunting. A good grasp of container fundamentals will help you understand what Kubernetes adds and how it works. There are now several good options for deploying a local Kubernetes cluster on a development workstation. Using this kind of solution means you don’t need to wait for test deployments to roll out to remote infrastructure.

Tools that improve consistency through the software development process benefit everyone involved. We’ve already seen this in the foundations of the container movement and Docker’s widespread adoption. Running a local Kubernetes cluster lets you develop closer to production, shortening feedback cycles and improving collaboration across teams.

James Walker
James Walker is a contributor to How-To Geek DevOps. He is the founder of Heron Web, a UK-based digital agency providing bespoke software development services to SMEs. He has experience managing complete end-to-end web development workflows, using technologies including Linux, GitLab, Docker, and Kubernetes.