Improving Development Workflows with Docker
- Anthony Wall

- Oct 18
- 3 min read
No matter the programming language, every developer will eventually have to battle with changing tools, environments or development machines.
Working on this problem requires a strict approach to documentation or the use of clunky virtual machines and build servers to ensure the correct tools are always available.
Today though we can leverage the power of Docker and containers to not only create lightweight, distributed development environments but also ones that are tightly coupled to the sources.

The Problem
Imagine you are working in a team of three developers to create an embedded MCU application. The MCU requires a vendor-specific C compiler which is very sensitive to version changes.
In the last sprint your team added OpenSSL support to the application, which required you to compile OpenSSL from source with the vendor's C compiler.
How do you ensure that everyone is using the right C compiler and has the correct version of OpenSSL available for linking? What if a new developer joins the team?
The Traditional Solution
Managing this problem usually requires several levels of effort. Firstly, project leads must strictly enforce any critical tool changes across the team, including any Continuous Integration (CI) pipelines or agents.
Good documentation must also be created to track the versions of tools and the setup or on-boarding processes.
To lighten the burden, teams may create scripts to aid in checking, installing or updating tools and environments, or leverage virtual machines to create a shared development environment. For example, a script might try to run the vendor C compiler and install it from a company server if not found, or check for the presence of the custom OpenSSL library and attempt to compile it.
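A minimal sketch of such a check-and-install helper is below. The tool name, company CDN URL and install paths are all hypothetical placeholders, not a real toolchain:

```shell
#!/bin/sh
# Hypothetical check-and-install helper for a vendor toolchain.
# "vendor-c" and the company CDN URL are illustrative placeholders.

ensure_tool() {
    # $1 = command name, $2 = tarball URL to fetch when the tool is missing
    if command -v "$1" >/dev/null 2>&1; then
        echo "found $1"
        return 0
    fi
    echo "missing $1, installing from company server"
    curl -fsSL "$2" -o "/tmp/$1.tar.gz" &&
        mkdir -p "/opt/$1" &&
        tar -xzf "/tmp/$1.tar.gz" -C "/opt/$1" &&
        "/opt/$1/install.sh"
}

# Example usage (hypothetical URL):
# ensure_tool vendor-c https://cdn.my-company.com/vendor/c-compiler.tar.gz
```

Scripts like this work, but note they embody exactly the maintenance burden described above: every tool gets its own check, and the script itself must be kept in sync with the environment.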
These solutions do work, and for small, highly proficient teams they can be effective. But they come with a significant effort cost, a high risk of documentation becoming outdated and a reliance on senior direction.
The other glaring issue with this approach is that it's very difficult to build sources from older releases unless documentation and environments are carefully archived.
The Docker Solution
This is where Docker comes in: by encapsulating the development environment into a container image, it can be deployed at a moment's notice and provide a fully functional development environment.
Let's break the original problem down into a Dockerfile. First, we need to set up our container base and install some utilities:
# For this environment we will be running on Ubuntu 22.04
FROM ubuntu:jammy
# Install some developer friendly utilities
RUN apt-get update && \
    apt-get install -y build-essential curl ninja-build cmake
Next we can install our vendor's C compiler. For this example we're assuming it's hosted by our company.
# Download the vendor compiler from our company server
WORKDIR /opt
RUN curl -O https://cdn.my-company.com/vendor/c-compiler.tar.gz && \
    mkdir -p /opt/c-compiler && \
    tar -xvzf c-compiler.tar.gz -C /opt/c-compiler && \
    ./c-compiler/install.sh && \
    rm -f c-compiler.tar.gz
# Test the compiler works
RUN vendor-c --version
After the compiler is ready, we can install our previously compiled OpenSSL build.
# Download the OpenSSL binaries and headers and copy them into our compiler
RUN curl -O https://cdn.my-company.com/project/openssl.tar.gz && \
    mkdir -p /opt/openssl && \
    tar -xvzf openssl.tar.gz -C /opt/openssl && \
    cp -a ./openssl/. /opt/c-compiler/ && \
    rm -f openssl.tar.gz
Once this is done, we can package it into an image with docker build.
docker build -t my-project:latest .
Now we have a Docker image that contains everything the team needs to build and compile the MCU application. All they have to do is pull and run it:
docker pull my-project:latest
docker run --rm -it -v ~/my-project-src:/workspace my-project:latest /bin/bash
This creates a terminal that is fully ready to build the MCU application, and is identical for every developer who runs the container image, including CI/CD pipelines.
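Because the image is identical everywhere, your CI pipeline can run builds inside it too. As a rough sketch, a GitHub Actions job might look like this (the build command and the assumption that CI can pull the image are hypothetical):

```yaml
# Hypothetical CI job that compiles the MCU application inside our image
jobs:
  build:
    runs-on: ubuntu-latest
    container:
      image: my-project:latest   # assumes the image is published where CI can pull it
    steps:
      - uses: actions/checkout@v4
      # The vendor compiler and OpenSSL are already baked into the image
      - run: vendor-c --version
```

The same pattern works with most CI systems that support container-based jobs; the key point is that CI uses the exact image developers use, so "works on my machine" failures disappear.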
Be sure to deploy your container image somewhere the team can access it, either through a private container registry or Docker Hub.
Commercial Benefits
Docker driven development environments are something I use heavily in my development work to both technical and commercial benefit.
I've taken projects that required over 3 hours of setup before the sources could even be compiled, down to just the time taken to install Docker and download the container image.
This same strategy enabled CI automation to be run on the project, something that was previously deemed too expensive due to the complex build environment being deployed to Azure VMs.

Final Thoughts
Docker driven development environments not only improve developer quality of life but also rapidly decrease on-boarding times, while improving time to market and release stability.
While this is a very basic introduction to the concepts of using Docker for development environments, it's the basis of technologies such as VS Code's Dev Containers, which offer a fully featured in-editor workflow.
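For a taste of where that leads, a minimal devcontainer.json reusing the image from this article might look like the sketch below. The field names follow the Dev Containers specification, and /workspace mirrors the mount used earlier:

```json
{
  // Hypothetical Dev Container config reusing our prebuilt image
  "name": "my-project",
  "image": "my-project:latest",
  "workspaceFolder": "/workspace"
}
```

With this file in the repository root, VS Code can open the project directly inside the container, so even the editor tooling runs against the team's shared environment.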
If your team is struggling with complicated build environments or struggles to deploy to CI/CD tools, why not see if a Dockerised approach works for you?
