
How Docker can transform your development teams

Matthew Heusser | Sept. 1, 2015
Docker’s capability to speed up software testing should make it a no-brainer for any development team. Here’s why.

Instead of running each application inside its own guest operating system, the way a virtual machine does, Docker runs containers directly on the host’s Linux kernel. In other words, containers are just processes the operating system already knows how to schedule. Sharing the host kernel does limit Docker to modern Linux kernels, on both the host machine and inside the container, but it also massively simplifies the operating system’s task switching. It eliminates many redundancies that typical VMs carry (one kernel is needed in total, not one per container) and means that Docker containers do not “boot up,” because the kernel they need is already running.

All this combines to make Docker an incredibly fast way to create machines – machines that are exact copies of what will go into production, built from disk images rather than patches applied to an existing server.
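One quick way to see the shared-kernel point for yourself is to compare the kernel version a container reports with the host’s. Here is a minimal sketch using the Docker SDK for Python; the alpine image and the SDK calls are illustrative assumptions, not something the article prescribes:

```python
import platform
import docker  # Docker SDK for Python (pip install docker)

client = docker.from_env()

# A container reports the *host's* kernel, because containers share it
# rather than booting their own.
container_kernel = client.containers.run(
    "alpine", "uname -r", remove=True
).decode().strip()

print("host kernel:     ", platform.release())
print("container kernel:", container_kernel)  # same value on a Linux host
```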

The ability to stop a container and save it in its broken state, then debug it later, makes debugging much easier under Docker. If the debugging destroys the environmental conditions, or “dirties” the environment in some way, restoring the broken state is trivial. Docker is also capable of running any application on any Linux server; the quick startup and disposable nature of containers make it fantastic for things like batch processing.
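As a rough illustration of that stop-and-save workflow, here is a sketch with the Docker SDK for Python; the container name, image tag and log path are hypothetical, and `commit` is used to freeze the broken filesystem as a reusable image:

```python
import docker

client = docker.from_env()

# Suppose a test run left this container in a broken state.
broken = client.containers.get("failing-test-run")  # hypothetical container name
broken.stop()

# Freeze the broken state as an image so it can be restored later,
# even if the debugging session "dirties" the environment.
broken.commit(repository="debug/failing-test-run", tag="broken")

# Later: start a fresh copy of the broken state and inspect it.
snapshot = client.containers.run(
    "debug/failing-test-run:broken", "sleep 3600", detach=True
)
result = snapshot.exec_run("cat /var/log/app.log")  # hypothetical log path
print(result.output.decode())
snapshot.remove(force=True)
```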

There are several tools that help you configure, and even simulate, entire infrastructures with Docker containers, making life easier for the team. The most popular is Docker Compose, which can reduce what used to be an ultra-complex setup process to a single command.
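Compose itself is driven by a YAML file and a single command; the kind of wiring it automates, done by hand with the Docker SDK for Python, looks roughly like this (the network name, images and environment values are illustrative assumptions):

```python
import docker

client = docker.from_env()

# What a Compose file declares, done manually: a private network plus two
# services that can reach each other by name.
client.networks.create("demo-net", driver="bridge")

db = client.containers.run(
    "postgres:9.4", detach=True, name="db", network="demo-net",
    environment={"POSTGRES_PASSWORD": "example"},
)
web = client.containers.run(
    "my-web-app:latest",  # hypothetical application image
    detach=True, name="web", network="demo-net",
    environment={"DATABASE_URL": "postgres://postgres:example@db:5432/postgres"},
    ports={"8000/tcp": 8000},
)
```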

Docker in production 

Docker on your local machine and a couple of cloud servers is one thing; making it production-ready is a different matter entirely. The early days of Docker were like the Wild West when it came to production. The commonly used term is “container orchestration”: the practice of taking Dockerized apps and services and scheduling them onto clusters of compute resources. The point is that organizations don’t care where the containers are running, just that they’re running and serving the right requests, whether that be web traffic, internal services and databases, or messaging queues.

Today’s big players in orchestration are AWS EC2 Container Service, Docker Swarm and Mesos. Typically, orchestration services manage containers well, but they may also come with other bells and whistles like blue/green deploys, container healing, load balancing, service discovery and inter-container networking.

When evaluating Docker for production, there are certainly other challenges, such as logging and environment variable configuration. One great place to start, and to see whether you are ready to move toward Docker, is to check how close your application is to an optimal 12 Factor App.
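One of those twelve factors is keeping configuration in environment variables rather than baking it into the image, which is also how Dockerized services are usually configured. A minimal Python sketch, with illustrative variable names:

```python
import os

# Twelve-factor style: configuration comes from the environment, so the same
# container image runs unchanged in dev, staging and production.
DATABASE_URL = os.environ["DATABASE_URL"]              # required; fail fast if unset
LOG_LEVEL = os.environ.get("LOG_LEVEL", "INFO")        # optional, with a default
WORKER_COUNT = int(os.environ.get("WORKER_COUNT", "4"))

print("db=%s workers=%d log=%s" % (DATABASE_URL, WORKER_COUNT, LOG_LEVEL))
```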

Don Taylor’s tutorial on Docker at CodeMash walked the audience through installing Docker on a Linux machine, creating a container and executing commands in that container. Best of all, the labs are on GitHub for you to follow along.
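The tutorial works through the docker command line, but the same create-a-container-and-run-commands-in-it loop can be sketched with the Docker SDK for Python; the ubuntu image and the commands are illustrative:

```python
import docker

client = docker.from_env()

# Start a long-lived container to experiment in.
sandbox = client.containers.run("ubuntu:14.04", "sleep 3600", detach=True)

# Execute commands inside it, as `docker exec` would.
for cmd in ("uname -a", "cat /etc/os-release", "ls /"):
    result = sandbox.exec_run(cmd)
    print("$ " + cmd)
    print(result.output.decode())

sandbox.stop()
sandbox.remove()
```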

So install a Linux virtual machine, put Docker inside it, explore how to create containers, and decide for yourself if this is a technology worth using in your organization. 

Jared Short contributed to this article.

 
