Docker refers to a couple of different things within the development community. Firstly, it’s an open-source community technology project for containerization. Secondly, and slightly confusingly, Docker Inc. is also the name of the company that is the biggest contributor to the open-source project.
Docker is essentially a tool for managing containers, which can be thought of as lightweight virtual machines. Containers can be created, duplicated, and moved between environments in a repeatable manner, making them well suited to cloud environments or to moving legacy build environments onto new hardware.
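That repeatability starts with an image definition. As a minimal sketch of a Dockerfile for a hypothetical Python web app (the base image, port, and file names here are illustrative assumptions, not from this article):

```dockerfile
# Build a small, repeatable image for a hypothetical Python web app.
FROM python:3.12-slim
WORKDIR /app
# Install dependencies first so this layer is cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# The port the app is assumed to listen on.
EXPOSE 8000
CMD ["python", "app.py"]
```

Building this with `docker build` produces an image that can be duplicated (`docker tag`) and moved between environments (`docker push` / `docker pull`) without ever rebuilding it by hand.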
How does Docker work?
So how does the Docker open-source technology actually work? Docker uses features of the Linux kernel, most notably namespaces and cgroups, to segregate processes so that they run completely independently of each other. Roughly speaking, namespaces control what a process can see (its own process IDs, network interfaces, and mount points), while cgroups control what it can use (CPU time, memory, and I/O).
This independence is what makes Docker so important, allowing developers to run multiple applications on the same infrastructure without worrying about them interfering with one another.
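On any Linux machine you can peek at the kernel machinery described above, with no Docker required; the kernel exposes both namespaces and cgroups through `/proc`:

```shell
# Every process belongs to a set of namespaces, shown as symlinks
# under /proc/<pid>/ns. A containerized process gets fresh namespace
# IDs here, so it cannot see the host's processes, mounts, or network.
ls -l /proc/self/ns/

# cgroup membership governs resource limits (CPU, memory, I/O);
# Docker places each container's processes in their own cgroup.
cat /proc/self/cgroup
```

Running the same two commands inside a container shows different namespace IDs and a different cgroup path, which is exactly the segregation Docker relies on.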
Why use Docker?
Docker allows developers to work in a standardized environment: what runs locally runs the same way in production. These workloads can easily scale from your laptop, to a testing virtual machine, up to enterprise data centers or cloud platforms.
Due to its lightweight nature and process segregation, Docker is a fast-deploying alternative to a full hypervisor setup. Traditionally, setting up the hardware could take days; with Docker the underlying hardware is largely irrelevant.
Standardization means that developers from any project or company can easily work within such a system without first having to get up to speed on how the underlying infrastructure works.
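That standardization can be captured in a Compose file, so every developer and every environment starts the identical stack. A sketch, where the service names and image tags are illustrative assumptions:

```yaml
# docker-compose.yml — a hypothetical two-service stack that runs the
# same way on a laptop, a test VM, or a cloud host.
services:
  web:
    image: myapp:1.0        # pinned application image
    ports:
      - "8000:8000"
    depends_on:
      - db
  db:
    image: postgres:16      # pinned database version, same everywhere
    environment:
      POSTGRES_PASSWORD: example
```

With this file checked into the repository, `docker compose up` brings up the same stack wherever Docker is installed, so new developers need no infrastructure knowledge to get started.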