Docker is emerging as the future of application delivery

This is a discussion on the role of Docker in software development and how it scores over virtual machines. As it becomes increasingly popular, let’s look at what the future holds for Docker.

We all know that Docker is simple to get up and running on our local machines. But seamlessly transitioning our honed application stacks from development to production is problematic.

Docker Cloud makes it easy to provision nodes from existing cloud providers. If you already have an account with an Infrastructure-as-a-Service (IaaS) provider, you can provision new nodes directly from within Docker Cloud.

For many hosting providers, the easiest way to deploy and manage containers is via Docker Machine drivers (a provisioning example follows the list below). Today we have native support for nine major cloud providers:

  • Amazon Web Services
  • Microsoft Azure
  • DigitalOcean
  • Exoscale
  • Google Compute Engine
  • OpenStack
  • Rackspace
  • IBM SoftLayer
  • Packet.net
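
For example, provisioning a new host with a Docker Machine driver is a single command. The following is a minimal sketch assuming a DigitalOcean account; the machine name and the $DO_TOKEN variable holding your API token are illustrative placeholders.

    # Provision a Docker-ready host on DigitalOcean.
    docker-machine create --driver digitalocean \
        --digitalocean-access-token $DO_TOKEN \
        example-host

    # Point the local Docker client at the new host and verify it.
    eval $(docker-machine env example-host)
    docker info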

AWS is the biggest cloud-hosting service on the planet and offers support for Docker across most of its standard EC2 machines. Google’s container hosting and management service is underpinned by Kubernetes, its own open source project that powers many large container-based infrastructures. More are likely to follow soon, and you may be able to use the generic driver for other hosts.

Docker Cloud provides a hosted registry service with build and testing facilities for Dockerised application images, tools to help you set up and manage host infrastructure, and application life cycle features to automate deploying (and redeploying) services created from images. It also allows you to publish Dockerised images on the Internet either publicly or privately. Docker Cloud can also store pre-built images, or link to your source code so it can build the code into Docker images, and optionally test the resulting images before pushing them to a repository.
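
The registry side of this workflow is the standard Docker build-tag-push cycle. Here is a minimal sketch; the image name, tag and user account are illustrative.

    # Build an image locally, tag it for your account and push it to the registry.
    docker build -t myapp .
    docker tag myapp myuser/myapp:1.0
    docker push myuser/myapp:1.0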

Virtual machines (VMs) vs Docker

Some of the companies investing in Docker and containers are Google, Microsoft and IBM. But just because containers are extremely popular, that doesn’t mean virtual machines are out of date. Which of the two is selected depends entirely on the specific needs of the end user.

Virtual machines (VMs) run on top of a hypervisor with a fully virtualised and totally isolated OS. They take up a lot of system resources and are also very slow to move around. Each VM runs not just a full copy of an operating system, but a virtual copy of all the hardware that the operating system needs to run. This quickly adds up to a lot of RAM and CPU cycles. And yes, containers can enable your company to pack a lot more applications into a single physical server than a VM can. Container technologies such as Docker beat VMs at this point in the cloud or data centre game.

Virtual machines are based on the concept of virtualisation, which is the emulation of computer hardware: the CPU, RAM, I/O devices and so on. The software emulating this hardware is called a hypervisor. Every VM interacts with the hypervisor through the operating system installed on the VM, which could be a typical desktop or laptop OS. There are many products that provide virtualised environments, such as Oracle VirtualBox, VMware Player, Parallels Desktop, Hyper-V and Citrix XenClient.

Docker is based on the concept of containerisation. A container runs in an isolated partition inside the shared Linux kernel running on top of the hardware. There is no concept of emulation or a hypervisor in containerisation. Linux namespaces and cgroups enable Docker to run applications inside the container. In contrast to VMs, all that a container requires is enough of an operating system, supporting programs and libraries, and system resources to run a specific program. This means that, practically, you can put two to three times as many applications on a single server with containers than you can with a VM. In addition, with containers you can create a portable, consistent operating environment for development, testing and deployment. That’s a winning triple whammy.
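
You can see this isolation first-hand with one command. The following minimal sketch uses the public Alpine Linux image; because of PID namespaces, ps inside the container sees only the container's own processes, not the host's.

    # Run a throwaway container that shares the host kernel; it starts in
    # milliseconds because no hardware or guest OS is emulated. The output
    # lists only the container's own processes.
    docker run --rm alpine ps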


Why Docker instead of VMs?

  • Faster delivery of applications.
  • Greater portability and easier scaling.
  • Higher density, so you can run more workloads on the same hardware.
  • Faster deployment, which leads to easier management.

Docker features

  • Virtual environments (VEs) based on LXC.
  • Portable deployment across machines.
  • Versioning: Docker includes Git-like capabilities for tracking versions of a container.
  • Component reuse: It allows the building or stacking of already created packages. You can create ‘base images’ and then run more machines based on that image (see the Dockerfile sketch after this list).
  • Shared libraries: There is a public registry, Docker Hub, with thousands of ready-made images.
  • Docker containers are very lightweight.
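
The ‘base image’ idea is easiest to see in a Dockerfile. The following is a minimal sketch; the base image, file names and start command are illustrative.

    # Build on a pre-built base image and layer the application on top.
    # Each instruction creates a tracked, reusable image layer, which is
    # what gives Docker its Git-like versioning (see 'docker history').
    FROM node:8-alpine
    WORKDIR /app
    COPY . .
    RUN npm install
    CMD ["node", "server.js"]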

Who uses Docker and containers?

Today, many industries and companies have shifted their infrastructure to containers, or use containers in some other way.

The leading industries using Docker are energy, entertainment, financial, food services, life sciences, e-payments, retail, social networking, telecommunications, travel, healthcare, media, e-commerce, transportation, education and technology.

Some of the companies and organisations using Docker include The New York Times, PayPal, Business Insider, Cornell University, Indiana University, Splunk, The Washington Post, Swisscom, GE, Groupon, Yandex, Uber, Shopify, Spotify, New Relic, Yelp, Quora, eBay, BBC News, and many more. Many other companies are planning to migrate their existing infrastructure to containers.

Integration of different tools

With the integration of various major tools now available in the market, Docker allows developers and IT operations teams to collaborate to build more software, faster, while remaining secure. It integrates with service provider tools, dev tools, official repositories, orchestration tools, systems integration tools, service discovery tools, Big Data platforms, security tools, monitoring and logging tools, configuration management tools and continuous integration systems.

Continuous integration (CI) is another big area for Docker. Traditionally, CI services have used VMs to create the isolation you need to fully test a software app. Docker’s containers let you do this without using a lot of resources, which means your CI and your build pipeline can move more quickly.

Continuous integration and continuous deployment (CD) have become one of the most common use cases among Docker's early adopters. CI/CD merges development with testing, allowing developers to build code collaboratively, submit it to the master branch and check for issues. This lets developers not only build their code, but also test it in any type of environment, and as often as possible, to catch bugs early in the application development life cycle. Since Docker integrates with tools like Jenkins and GitHub, developers can submit code to GitHub, test it and automatically trigger a build with Jenkins; once the image is complete, it can be added to a Docker registry. This streamlines the process and saves time on build and set-up work, while allowing developers to run tests in parallel and automate them so that they can continue working on other projects while tests run. Since Docker works in cloud and virtual environments, and supports both Linux and Windows, enterprises no longer have to deal with inconsistencies between different environments – which is perhaps the most widely known benefit of the Docker CaaS (Containers as a Service) platform.
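
At its core, such a pipeline reduces to a few Docker commands. The following is a hedged sketch of a CI build step; the image name is a placeholder, and $BUILD_NUMBER stands in for the build identifier that a CI server such as Jenkins provides.

    #!/bin/sh
    set -e
    # Build the candidate image, tagged with the CI build number.
    docker build -t myuser/myapp:$BUILD_NUMBER .
    # Run the test suite inside the freshly built container.
    docker run --rm myuser/myapp:$BUILD_NUMBER npm test
    # Push to the registry only if the tests passed ('set -e' aborts on failure).
    docker push myuser/myapp:$BUILD_NUMBER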

Drone.io is a Docker-native CI service, and the big automation players all have Docker integration anyway, including Jenkins, Puppet, Chef, SaltStack, Packer and Ansible; so it will be easy to find and incorporate Docker into your process.

Adoption of Docker

Docker is probably the most talked-about infrastructure technology of the past few years. A study by Datadog, covering around 10,000 companies and 185 million containers in real-world use, provides one of the largest and most detailed reviews of Docker adoption to date. The following highlights of this study should answer most of your questions.

i) Docker adoption has increased 40 per cent in one year

At the beginning of March 2016, 13.6 per cent of Datadog’s customers had adopted Docker. One year later, that number had grown to 18.8 per cent – almost 40 per cent growth in adoption in 12 months. Figure 7 shows the growth of Docker adoption. Based on this, we can say that companies are adopting Docker very fast.

ii) Docker now runs on 15 per cent of the hosts

This is an impressive fact. Two years ago, Docker had about 3 per cent market share, and now it’s running on 15 per cent of the hosts Datadog monitors. The graph in Figure 8 illustrates that the Docker growth rate was somewhat variable early on, but began to stabilise around the fall of 2015. Since then, Docker usage has climbed steadily and nearly linearly, and it now runs on roughly one in every six hosts that Datadog monitors.

iii) Larger companies are leading adoption

Larger companies tend to be slower to move. But in the case of Docker, larger companies have been leading the way since the first edition of Datadog’s report in 2015. The more hosts a company uses, the more likely it is to have tried Docker. Nearly 60 per cent of organisations running 500 or more hosts are classified as Docker dabblers or adopters.

While previous editions of this report showed organisations with many hosts clearly driving Docker adoption, the latest data shows that organisations with mid-sized host counts (100–499 hosts) have made significant gains. Adoption rates for companies with medium and large host counts are now nearly identical. Docker first gained a foothold in the enterprise world by solving the unique needs of large organisations, but is now being used as a general-purpose platform in companies of all sizes.

iv) Orchestrators are taking off

As Docker increasingly becomes an integral part of production environments, organisations are seeking out tools to help them effectively manage and orchestrate their containers. As of March 2017, roughly 40 per cent of Datadog customers running Docker were also running Kubernetes, Mesos, Amazon ECS, Google Container Engine, or another orchestrator. Other organisations may be using Docker’s built-in orchestration capabilities, but that functionality did not generate uniquely identifiable metrics that would allow us to reliably measure its use at the time of this report.

Among organisations running Docker and using AWS, Amazon ECS is a popular choice for orchestration, as would be expected — more than 35 per cent of these companies use ECS. But there has also been significant usage of other orchestrators (especially Kubernetes) at companies running AWS infrastructure.

v) Adopters quintuple their container count within nine months

The average number of running containers that Docker adopters have in production grows fivefold between their first and tenth month of usage. This internal-usage growth rate is quite linear, and shows no signs of tapering off after the tenth month. Another indication of the robustness of this trend is that it has remained steady since Datadog’s previous report last year.

vi) Top technologies running on Docker

The most common technologies running in Docker are listed below.

  1. NGINX: Docker is being used to containerise a lot of HTTP servers, it seems. NGINX has been a perennial contender on this list since Datadog began tracking image use in 2015.
  2. Redis: This popular key-value data store is often used as an in-memory database, message queue, or cache.
  3. Elasticsearch: Full-text search continues to increase in popularity, cracking the top three for the first time.
  4. Registry: Eighteen per cent of companies running Docker are using Registry, an application for storing and distributing other Docker images. Registry has been near the top of the list in each edition of this report.
  5. Postgres: The increasingly popular open source relational database edges out MySQL for the first time in this ranking.
  6. MySQL: The most widely used open source database in the world continues to find use in Docker infrastructure. Adding the MySQL and Postgres numbers, it appears that using Docker to run relational databases is surprisingly common.
  7. etcd: The distributed key-value store is used to provide consistent configuration across a Docker cluster.
  8. Fluentd: This open source ‘unified logging layer’ is designed to decouple data sources from backend data stores. This is the first time Fluentd has appeared on the list, displacing Logspout from the top 10.
  9. MongoDB: This is a widely-used NoSQL datastore.
  10. RabbitMQ: This open source message broker finds plenty of use in Docker environments.

vii) Docker hosts often run seven containers at a time

The median company that adopts Docker runs seven containers simultaneously on each host, up from five containers nine months ago. This finding seems to indicate that Docker is in fact commonly used as a lightweight way to share compute resources; it is not solely valued for providing a knowable, versioned runtime environment. Bolstering this observation, 25 per cent of companies run an average of 14+ containers simultaneously.

viii) Containers’ churn rate is 9x faster than VMs

At companies that adopt Docker, containers have an average lifespan of 2.5 days, while across all companies, traditional and cloud-based VMs have an average lifespan of 23 days. Container orchestration appears to have a strong effect on container lifetimes, as the automated starting and stopping of containers leads to a higher churn rate. In organisations running Docker with an orchestrator, the typical lifetime of a container is less than one day. At organisations that run Docker without orchestration, the average container exists for 5.5 days.

Containers’ short lifetimes and increased density have significant implications for infrastructure monitoring. They represent an order-of-magnitude increase in the number of things that need to be individually monitored. Monitoring solutions that are host-centric, rather than role-centric, quickly become unusable. We thus expect Docker to continue to drive the sea change in monitoring practices that the cloud began several years ago.

Top 5 Node.js frameworks for developers

JavaScript developers faced some major changes last year. If you are a developer, you will know that ECMAScript 6 was finally standardised and published, and that compilers as well as web browsers are working hard to adopt the latest changes and rules. It is well worth immersing yourself in a strong, systematic guide that covers all the facets of the latest standard.

Handpicked Node.js frameworks just for you

With Node.js, JavaScript technology has come into the mainstream. JavaScript was already a popular programming language, widely used by millions of developers in browsers. With Node.js, however, the language has found its way into server-side applications, removing the difficulty of using two different languages on the same platform. Its single-threaded event loop and asynchronous, non-blocking I/O processing distinguish it from other runtimes.

The opportunity is growing by leaps and bounds, with many valuable contributions from the developer community and technology specialists. We have seen many performance-driven frameworks built on the primary principles and approaches of Node.js. Here, we discuss some of the top-rated Node.js frameworks that extend its core functionality and build new features on top of it.

Hapi.js

Hapi.js is one of the most powerful Node.js web frameworks for building application programming interfaces (APIs), and it is also good for building other software applications. The framework has a strong plugin system and many features, including input validation, configuration-based functionality, logging and error handling.

Hapi.js is widely used by thousands of developers to design useful applications, and it is the preferred technology behind many large-scale websites such as PayPal and Walmart.

Features of Hapi.js:

  • Especially good for passing around a DB connection
  • Known as a go-to technology for both startups and enterprises
  • Works as a stable, secure solution and helps with deploying applications
  • Provides a good structure for working in a team
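
To give a feel for the framework, here is a minimal sketch of a hapi.js server, assuming the modern @hapi/hapi package; the port and route are illustrative.

    // A minimal hapi.js server with a single route returning plain text.
    const Hapi = require('@hapi/hapi');

    const init = async () => {
      const server = Hapi.server({ port: 3000, host: 'localhost' });

      server.route({
        method: 'GET',
        path: '/',
        handler: () => 'Hello from hapi!'
      });

      await server.start();
      console.log('Server running at', server.info.uri);
    };

    init();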

Socket.io

Socket.io is a powerful Node.js server framework for building real-time web applications. As a strong JavaScript library, it allows developers to build event-driven, bi-directional communication channels between web clients and servers quickly and easily.

The framework not only enables real-time concurrency for document collaboration and data exchange, but also offers key features such as asynchronous I/O processing and binary streaming.

Features of Socket.io:

  • Enables real-time interaction between a browser and a Node.js server
  • Works over standard HTTP
  • Well suited to building real-time web applications
  • Runs as a client-side library in the browser and as a server-side library for Node.js
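
Here is a minimal sketch of a Socket.io echo server, assuming Socket.io v3 or later; the port and the 'message' event name are illustrative.

    // A Socket.io server that echoes every 'message' event back to the sender.
    const http = require('http');
    const { Server } = require('socket.io');

    const httpServer = http.createServer();
    const io = new Server(httpServer);

    io.on('connection', (socket) => {
      console.log('client connected');
      socket.on('message', (data) => socket.emit('message', data));
    });

    httpServer.listen(3000);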

Express.js

Express.js is among the most popular and important web frameworks for Node.js development. Many developers consider it a useful minimalist framework for building web hosts and mobile applications, and it is helpful for building application programming interfaces (APIs). Many applications have used Express.js, including MySpace, Segment.io and Geekli.st.

This Node.js framework offers a wide range of features, such as template engines and database integration.

Features of Express.js:

  • Powerful, minimalist framework for creating web hosts and mobile applications
  • Good support for template engines and database integration
  • Effectively the standard server framework for Node.js
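
The following is a minimal sketch of an Express.js application; the route and port are illustrative.

    // A minimal Express.js app: one route, listening on port 3000.
    const express = require('express');
    const app = express();

    app.get('/', (req, res) => {
      res.send('Hello from Express!');
    });

    app.listen(3000, () => console.log('Listening on port 3000'));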

Mojito

Mojito is a JavaScript MVC framework based on Yahoo! Cocktails, a powerful mobile application development platform produced by the Yahoo Developer Network.

The best part of using Mojito is that it runs on both the client and server side – in the browser and on Node.js. Many developers use this framework because both the client and server components are written in JavaScript. It is a powerful model-view-controller framework, offering a range of features.

Features of Mojito:

  • Good for convenient data fetching
  • Provides a handy local development environment and tools
  • Designed for integrated unit testing
  • Simplifies library internationalisation and localisation

Meteor

Meteor is a very useful open source MVC (model-view-controller) Node.js framework used for building websites, and web and mobile applications. Many developers these days use this Node.js framework for a variety of purposes, including web application development.

This framework supports macOS, Linux and Windows. Its reactive programming model helps you build applications with far less JavaScript code. Further, it is a highly popular framework for creating real-time applications.

Features of Meteor:

  • Good for rapid prototyping and produces cross-platform code
  • Can be used with any JS UI
  • Supports macOS, Linux and Windows
  • Quickly integrated with MongoDB
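
As a taste of Meteor's data model, here is a minimal server-side sketch; the collection and method names are illustrative.

    // Define a MongoDB-backed collection and a method that clients can call.
    import { Meteor } from 'meteor/meteor';
    import { Mongo } from 'meteor/mongo';

    export const Tasks = new Mongo.Collection('tasks');

    Meteor.methods({
      'tasks.insert'(text) {
        return Tasks.insert({ text, createdAt: new Date() });
      }
    });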

The web and application development landscape is changing rapidly. Developers in different parts of the world are shifting to Node.js frameworks for easy, clean and quick project delivery.

One of the biggest advantages of using Node.js frameworks is the high-level output and structure they provide. You can easily focus on scaling your application instead of spending effort on creating and defining the basics.