We received overwhelming feedback on the 0.0.2 release of our Docker solution. The feedback had one common theme: only R&D use-cases are supported. Here I show the new docker-compose.yaml in the 0.0.3 release. We are now able to support both Dev and Test use-cases in our company.
What was the problem?
Some of our developers and all of Test use PyCharm for code updates, debugging, running test suites, automated type-checking, automated coverage, automated formatting, and automated reviews.
However, for many Dev and most Test use-cases, the round-trip time for debugging and unit testing was considerably slower after the 0.0.2 Docker solution release.
What use-cases did our previous Docker package cover?
The initial use-case coverage of our 0.0.2 Docker package is documented in the earlier article, "A Docker Solution For the Enterprise," which tells the story of our Docker and Docker-Compose rollout in our company.
We discussed that every developer had their unique sandbox in which they developed (played). If the developer had to recover their development environment, it would take hours. Test would take a day or more to create a Dev environment that could pass Dev's unit tests. Test spent much time looking at source files to untangle import statements. Only once Test had a working environment were they able to create and run integration tests and acceptance tests.
The workflow of our environment
The developer performs unit testing. The code must pass unit testing before being committed to the GitHub repository. Other Dev groups perform code reviews, some integration testing, and merges into the GitHub repository.
The 0.0.2 alpha release was successful because all the Dev groups deployed the Docker images to Test. They shared their work with others by git-cloning a URL. The repository has the Dockerfile and Docker image in a sub-directory labeled docker/. The developer put customization notes and build instructions in the README.md. The minimal set of Python packages that Dev requires is listed in the requirements.txt file.
The Dev workflow, while simple, was used successfully by all six Dev teams and the three Test teams. More importantly, Test finds that the requirements.txt file documents what Dev used and what the Test, Stage, and Production runtime environments require.
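As a concrete sketch, a minimal requirements.txt pins the handful of packages Dev actually imports; the package names and versions below are illustrative assumptions, not our actual list:

```text
# minimal set of Python packages Dev requires, pinned for reproducibility
numpy==1.18.1
pandas==1.0.1
pytest==5.3.5
```

Pinning exact versions is what lets Test, Stage, and Production reproduce the same runtime environment from this one file.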
Test is thrilled by the ease with which they can reproduce an environment that passes Dev's unit tests. Unfortunately, Test rejected the 0.0.2 release: the 0.0.2 Docker solution was too slow for many of their other use-cases.
Note: The groups called Test actually do acceptance testing. They perform testing to determine whether an application/service is acceptable for Stage and then Production.
The 0.0.3 alpha release of Docker images
Test (and Dev) mostly used PyCharm, with a minority using Visual Studio and Emacs. We also suspect that there are closet Vim-ers who need a faster Docker image.
We found that the major consumer of memory, and the majority of the launch time, of the 0.0.2 Docker image is the Jupyter notebook. A straightforward solution was a 0.0.3 Docker container without the Jupyter notebook executables.
I would not go so far as to say that Test is happy now. But I think Test is even less unhappy with the 0.0.3 Docker solution than they were with the 0.0.2 release. (They keep us from resting on our laurels.)
Directory Structure for Dev for the 0.0.3 Alpha Release
We have a shared disk where anybody can copy the template files to their local directory structure.
NOTE: 0.0.3 eliminated the docker directory and simplified our release directory structure.
Test 0.0.3 is able to use the 0.0.2 PROJECTS directory structure unchanged.
|--- <ln> requirements.txt

Test reviews <project-name>/docker/requirements.txt for the difference in Python packages they do and don't require. Then Test symbolically links requirements.txt (into the <project-name>/docker directory) with the command:

# macOS and Linux variants
$ ln -s ../requirements.txt requirements.txt
The only differences in the Test directory are the file requirements.txt, the files in directory docker/, and the files in directory test/. Test moves this directory structure to the Stage server and 3 redundant Production servers. The server farm is internal.
The Docker command and Dockerfiles
Docker is used to manage an individual container image of your application.
The Dockerfile for Test is:
RUN python -m pip install --upgrade pip
COPY requirements.txt ./requirements.txt
RUN python -m pip install -r requirements.txt
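Building and smoke-testing the image follows the usual Docker commands; the tag test_image and the test/ invocation below are illustrative assumptions, not our actual names:

```shell
# build the Test image from the directory holding the Dockerfile
$ docker build -t test_image .
# run the unit-test suite inside a throw-away container
$ docker run --rm test_image python -m pytest test/
```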
0.0.2 released a Dockerfile template that enabled Dev and Test to customize each Docker image. The 0.0.3 Dockerfile is simplified, and the resulting Docker container images run faster because all references to Jupyter are deleted.
The Docker-Compose command and docker-compose.yaml file
Docker-Compose is used to manage several containers at the same time for the same application. This tool offers the same features as Docker but allows you to have more complex applications.
Docker-Compose can manage more than one Docker container as a single runtime application. Currently, we use Docker-Compose for only one Docker image.
Some of us are of the opinion that PyCharm and Git are being used as a simple, mish-mash CI/CD tool. With several Docker solutions under our belt, we think a great improvement to the product lifecycle process is to automate some of the steps. Some of these steps are semi-automated with bash scripts.
We want a future Docker architecture to accommodate a possible CI/CD framework such as Jenkins, CircleCI, or any of the other open-source CI/CD frameworks available. In this future, we will need Docker-Compose when our environment needs more than one Docker container image.
The docker-compose.yml for Dev is:

      - ../../.:/docker
    working_dir: /docker
The docker-compose.yml for Test is:

      - ../../.:/docker
    working_dir: /docker
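The volumes and working_dir fragments above come from larger files. A minimal complete docker-compose.yml consistent with them might look like this sketch; the service name is an illustrative assumption, and the image tag is the one shown in the PyCharm interpreter setup:

```yaml
version: "3"
services:
  test:
    image: test_notebook:latest   # tag shown in the PyCharm interpreter setup
    volumes:
      - ../../.:/docker           # mount the project tree into the container
    working_dir: /docker
```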
The Docker container image is built, run, and shut down with the following commands:

# compose all containers (services) into one image
$ docker-compose build
# run the docker-compose image in the background
$ docker-compose up &
# shut down the docker-compose image
$ docker-compose down
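The `&` form relies on the shell's job control; for reference, docker-compose also has a detached mode that is daemonized by Docker itself, a possible alternative:

```shell
# start services detached (no shell job control needed)
$ docker-compose up -d
# follow the service logs
$ docker-compose logs -f
```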
PyCharm Config for Docker
I describe how to set up a Docker image as the PyCharm project interpreter.
1. Choose the PyCharm project interpreter at the ribbon at top-left by selecting PyCharm > Preferences..., which pops up the following window.
2. In the window shown above, choose Project Interpreter. Then click on + at the bottom of the frame, which will pop up the various interpreters available.
The screen capture above shows the choice of the Remote Python 3.7.6 Docker (test_notebook:latest) Docker image. 0.0.3 has a faster launch time with the elimination of the Jupyter notebook executables.
Test Docker image

1-unit-test, run: 2.66 sec
1-unit-test, debug: 5.59 sec
100-unit-test, run: 5.68 sec
100-unit-test, debug: 14.72 sec

Dev Docker image

1-unit-test, run: 4.18 sec
1-unit-test, debug: 8.34 sec
100-unit-test, run: 11.66 sec
100-unit-test, debug: 25.59 sec
The 0.0.3 Docker image for Test is about 2x faster for the execution of these unit tests than the 0.0.2 release with the Jupyter notebook.
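The Dev image still carries the heavier package set, so the Dev-to-Test ratios in the timings above give only a rough proxy for that speedup; they can be computed directly from the listed numbers:

```shell
# Dev/Test speedup for each (run, debug) timing pair listed above
for pair in "4.18 2.66" "8.34 5.59" "11.66 5.68" "25.59 14.72"; do
  set -- $pair
  awk -v dev="$1" -v test="$2" 'BEGIN { printf "%.2fx\n", dev / test }'
done
# prints 1.57x, 1.49x, 2.05x, 1.74x
```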
Many useful resources explain how and why Docker is currently the leading platform for virtualization. I list just a few of these references:

- "Docker is an open platform for developing, shipping, and running applications. Docker enables you to separate your…"
- Docker Tutorial for Beginners: What is, Architecture, Install, Commands — "Earlier, the process for deploying a service was slow and painful. First, the developers were writing code; then the…"
- "Today, a lot of applications are services deployed in the cloud, on infrastructure of cloud providers such as Amazon…"
Test rejected the 0.0.2 Docker solution because it was too slow for many of their other use-cases.
I detailed how we designed and implemented the faster 0.0.3 Docker solution in our company. I showed the directory structure and the Docker code for Dev and Test.
As a bet on the future, where we need to glue multiple Docker container images together, we use Docker-Compose as part of our solution.
All code shown in this article is here.
There will be future updates on our progress. I hope these articles on enhancements to our product lifecycle are beneficial.