It's 3 a.m. and a client calls support complaining that their customers cannot complete orders. The complaint reaches a developer on the team, who discovers a backend API error. The endpoint works in the development environment, so the developer is forced to test in production. After turning on verbose logging and exercising the failing endpoint, the developer finds that an invalid method is being called on a specific class. The language's API and the installed packages all check out against the documentation, so the developer decides to check the versions. The language itself had been upgraded, and that upgrade broke the API endpoint. The developer fixes the issue by reverting the upgrade and communicates the fix back to the client.
Next month, it happens again. The developer recognizes the issue, knows where to start, reverts the upgrade, and all is fixed. Although the issue keeps getting resolved, it does not go unnoticed by the CEO, who holds a meeting to discuss it. The team discovers that IT has been deploying security and OS upgrades. These upgrades bump the language's version, deprecating the method and breaking the API endpoint. The CEO tells the team that development and IT need to communicate and create a shared process. The two groups have dramatically different opinions, but they pick a tool that can accommodate both parties. That tool is Docker.
Docker provides an integrated technology suite that enables development and IT operations teams to build, ship, and run distributed applications anywhere.
Docker eases the development and IT process through containers: small, fast, repeatable pieces of the environment. `docker build` produces images from which these containers are run, and the same images can be replicated on development and production servers.
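As a minimal sketch of what that looks like, a Dockerfile describes the environment in a handful of lines (the base image, file names, and port here are illustrative assumptions, not details from the story):

```dockerfile
# Hypothetical Dockerfile: pins the language runtime so upgrades are explicit
FROM node:16-alpine

WORKDIR /app

# Install dependencies first so Docker can cache this layer
COPY package.json package-lock.json ./
RUN npm ci --only=production

# Copy the application source
COPY . .

EXPOSE 3000
CMD ["node", "server.js"]
```

Running `docker build -t myapp .` against this file produces an image, and `docker run -p 3000:3000 myapp` starts a container from it, the same way on a laptop or a production server.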
Once the images are built, there are two options for releasing them: pushing or automating. `docker push` has the benefit of uploading only the incremental changes. It's great for development, but when working with teams and pushing to production, automation is preferable. In the automated approach, the Dockerfile is committed to a source repository linked to Docker Hub; Docker Hub sees the change and automatically triggers a new build of the image. Both methods get the images into the cloud, making them easy for developers and production environments to use.
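In practice, the two release paths look something like this (the image name, tag, and branch are placeholders):

```shell
# Manual path: tag and push the image yourself; only changed layers are uploaded
docker build -t myuser/myapp:1.2.0 .
docker push myuser/myapp:1.2.0

# Automated path: commit the Dockerfile to a repo linked to Docker Hub;
# Docker Hub detects the push and triggers a new image build
git add Dockerfile
git commit -m "Pin runtime version"
git push origin main
```

The manual path is fast for a single developer iterating locally; the automated path guarantees that what runs in production was built from a committed, reviewable Dockerfile rather than from someone's machine.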
Creating Dockerfiles gets the environment into code. The development and IT teams can review pull requests before deploying any changes to development or production. This means environment changes can go through the same rigorous process the software goes through. Upgrades no longer go unnoticed by the team, and breaking API endpoints get caught in development before they make it to production.
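For instance, the upgrade from the opening story would surface in a pull request as a one-line, reviewable change to the Dockerfile (the runtime and version numbers here are illustrative):

```diff
-FROM node:latest
+FROM node:16-alpine
```

A floating tag like `latest` is exactly how a version bump slips in silently; pinning the version in code means the next upgrade arrives as a diff the whole team can see, discuss, and test before it reaches production.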
Docker gives development and IT operations the ability to work together on building and shipping environments. When developers need new packages, they can add them to the Dockerfile. When IT wants to apply upgrades or security measures, they can add those to the Dockerfile. In both cases, the environment change can be reviewed by the team before it is released. Docker eases the pain between development and IT operations by creating a process for communicating environment changes. This truly allows the teams to build, ship, and run distributed applications anywhere.
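A single Dockerfile can carry both kinds of change side by side (the base image and package choices here are hypothetical):

```dockerfile
FROM node:16-alpine

# IT's change: apply OS security patches during the image build
RUN apk upgrade --no-cache

# Development's change: a new package the application needs
RUN npm install --global pm2
```

Either team can open a pull request against this one file, and the other team reviews it before it ships, which is the shared process the CEO asked for.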
- Getting started may seem daunting but you can start with the development environment then move towards production.
- Docker puts environment into code. Terraform puts infrastructure into code.