Docker rolls out an orchestration engine. Because what customers want, customers get

You can't blame a vendor for doing what its customers ask for. Even better when it gives said vendor a clear path into the future.

Ever since it became obvious that Docker was onto something pretty special, there have been questions about how the company would parlay its rapidly increasing venture-backed valuation into monetization and, by extension, what that would mean for the significant ecosystem of third-party software vendors and service providers that Docker has built around its eponymous movement.

Indeed, there have been times over the past few years when Docker has acquired ecosystem players or introduced platform functionality that competed with one or another member of its ecosystem. Those moves were met with a degree of concern. Over time, however, the ecosystem has matured and has come to accept that Docker has no option but to extend its functional footprint, and that it will do so in a broadly open way. And while there will certainly be some casualties among the ISVs around Docker, the company's approach of making its own technology "swappable" for third-party tools gives some of these players a bit of an out.

Given this context and Docker's increasing production adoption, it is interesting to see Docker announcing version 1.12 of the Docker Engine today at its annual DockerCon conference and, with the release, introducing built-in orchestration features that allow users to define and manage complex Dockerized applications across the application life cycle. The new release adds decentralized orchestration that supports multi-container applications running across multiple hosts. Users can opt in to Swarm mode, turning each Docker Engine into a decentralized building block of a self-organizing, self-healing Swarm.

Key highlights of the new version, according to the briefing materials, include:

  • Swarm mode, an optional capability built into every Engine that lets Docker Engines automatically discover one another and form a coordinated, decentralized Swarm.
  • A Service Deployment API that defines services, along with their attached storage, networking and compute, and scales them to ensure high performance, consistency and resiliency.
  • A routing mesh that provides out-of-the-box multi-host overlay networking with built-in container-aware load balancing.
  • Secure by default -- cryptographic node identity enables end-to-end encryption and other security features.
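
To give a sense of what opting in looks like in practice, here is a rough command-line sketch of initializing a Swarm and deploying a replicated service. The addresses, service names and images are illustrative, not taken from Docker's briefing materials:

    # On the first node, opt in to Swarm mode and advertise its address
    docker swarm init --advertise-addr 192.168.1.10

    # On each additional node, join using the token printed by the init command
    docker swarm join --token <worker-token> 192.168.1.10:2377

    # Declare a replicated service; the Swarm schedules and heals it across nodes
    docker service create --name web --replicas 3 -p 80:80 nginx:alpine

    # Inspect and scale the service
    docker service ls
    docker service scale web=5

Because orchestration is built into the Engine itself, no separate scheduler or external key-value store has to be stood up for this flow, and the published port is reachable on every node in the Swarm via the new routing mesh.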

In espousing the benefits of built-in orchestration, Docker points to the consistency of the user environment, the scalability and robustness of the engine, and the security that an integrated approach brings. Of course, a complex series of considerations goes into a decision like this, and to that end I spent time before the announcement with David Messina from Docker to discuss what it means for Docker, its customers and the ecosystem.

Messina began by explaining that Docker's users have been driving the company toward delivering orchestration; the release answers a long-standing demand for a native orchestration offering. That said, Messina was quick to remind me that Docker has a "batteries included but swappable" strategy, meaning that even when Docker itself offers a specific functional tool, that tool can be swapped out for a third-party alternative.

"We have a very wide ecosystem of 450 companies," Messina said. "A select few of those have some technology around third-party orchestration. Broadly for the ecosystem, however, this introduction is a positive thing. Everyone within the ecosystem is looking for ways to accelerate the process of moving from a few teams using containers in production to widespread adoption. To do that, orchestration has to be streamlined and democratized. Some orchestration solutions out there are highly specialized, and there are some use cases [where the use of a third-party orchestration tool] will make sense. The most important areas when it comes to the ecosystem are the peripheral ones -- monitoring, networking and storage, etc."

Messina reiterated the value of the "batteries included but swappable" approach, explaining that Docker is committed to its APIs allowing continuing swappability of plugins. Organizations, he says, have invested in plugins and hence want to know that Docker will integrate well with them. Docker has, for example, an integrated networking stack, but customers like the fact that they can use third-party networking tools without having to refactor their applications.
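
As a concrete illustration of the swappability Messina describes, the networking stack can be chosen per network at creation time. The sketch below is only illustrative: the overlay driver is Docker's built-in option, while the third-party driver name and the application image are placeholders for whatever plugin an organization has invested in:

    # Built-in batteries: create a multi-host overlay network (on a Swarm manager)
    # and attach a service to it
    docker network create --driver overlay app-net
    docker service create --name api --network app-net myorg/api:1.0

    # Swapped batteries: the same command shape with a third-party network driver
    # (driver name is a placeholder; consult the plugin's own documentation)
    docker network create --driver <third-party-driver> app-net-alt

The application definition itself does not change, which is the point customers keep making: they can adopt Docker's defaults or a third-party plugin without refactoring.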

We discussed some potential moves that Docker could make with orchestration, going beyond simply orchestrating containers and further into the infrastructure space. Could Docker orchestration work with virtual machines as well as containers?

Messina explained that Docker doesn't categorize itself as a container-focused company per se, but rather as a company focused on a set of tooling options centered on the application itself. A proof point he brought up was a recent demo of Docker tooling administering unikernels.

I asked Messina about serverless architectures, one of the exciting new areas in infrastructure. Messina was circumspect, explaining that there are always exciting points on the horizon, but that Docker is focusing on the here and now. That said, if serverless becomes something critical to developers, he can imagine Docker tools supporting it.

More important, however, is the "Dockerization" of legacy applications -- taking monolithic applications and putting them into containers. He explained that 40% of users are already Dockerizing traditional databases.

He pointed to one case study that will be presented at DockerCon, that of payroll vendor ADP, which wanted to take advantage of agile and the new velocity seen within development shops. The ADP approach is to take existing applications and put them into containers. That achieves the velocity ADP needs but also has the secondary effect of giving ADP a platform that lends itself to iteration on ever-smaller pieces of code.
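
For readers wondering what that first "lift and shift" step looks like, here is a minimal sketch, assuming a traditional PostgreSQL database is the legacy component being moved into a container; the image, volume and port choices are illustrative rather than drawn from the ADP case study:

    # Run a legacy-style database in a container, keeping its data on a named volume
    docker run -d --name payroll-db \
        -v payroll-data:/var/lib/postgresql/data \
        -p 5432:5432 \
        postgres:9.5

    # The existing application can then point at the containerized database while
    # its own components are broken out and containerized over time.

The payoff Messina describes comes in the second step: once the monolith is running in containers, it becomes far easier to iterate on ever-smaller pieces of it.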

I'm more relaxed about Docker's introduction of orchestration than I would have been a year or two ago. The ecosystem has matured, and Docker's intentions and its approach to co-opetition are more of a known quantity. This is an organization that is quickly maturing (witness the tight partnership with HPE announced last week) and delivering on customer demands.

Of course, there is still that monster valuation to justify, but that is a problem for another day. For today, Docker will simply enjoy the glory of its annual conference.
