Docker changed the way applications are deployed and run, but to get the most out of it you need to optimize your containers. Here are ten best practices that improve Docker container performance, organized by the Build, Ship, and Run phases:
Build Phase
Use Official and Verified Base Images
Starting with official and verified base images gives your container a solid foundation of security and performance.
Example:
FROM nginx:latest
Use Specific Image Versions
Always pin a specific version of the base image to avoid unexpected changes and keep builds consistent.
Example:
FROM nginx:1.27.0
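For fully reproducible builds, a tag can additionally be pinned to an image digest. The digest below is a placeholder, not a real value; look up the actual sha256 with docker images --digests or on Docker Hub:

```dockerfile
# Pinning by digest guarantees byte-identical pulls even if the tag is moved
# (<digest> is a placeholder; substitute the real sha256 for your image)
FROM nginx:1.27.0@sha256:<digest>
```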
Use Small Sized Official Images
Smaller base images, like Alpine, reduce the attack surface and improve build times.
Example:
FROM nginx:1.27.0-alpine
Use Multi-stage Builds
Multi-stage builds are a way to create smaller, more efficient Docker images. They isolate the build environment from the runtime environment, so only the necessary artifacts end up in the final image, dramatically reducing its size and its potential security vulnerabilities.
Example:
FROM node:22 AS build
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build
FROM node:22.5-alpine
WORKDIR /app
COPY --from=build /app/dist ./dist
COPY package*.json ./
RUN npm install --production
CMD ["npm", "start"]
Minimize Layer Count
Every instruction in a Dockerfile adds a new layer. Fewer layers generally mean faster builds and smaller images. Combine related commands using the && operator and clean up in the same RUN instruction.
Example:
RUN apt-get update && \
    apt-get install -y python3 python3-pip && \
    apt-get clean && \
    rm -rf /var/lib/apt/lists/*
Leverage Docker Cache
Docker caches intermediate layers and reuses them on subsequent builds. Order your Dockerfile instructions from least to most frequently changing to maximize cache hits and reduce build times.
Example:
COPY package.json package-lock.json ./
RUN npm install
COPY . .
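For contrast, here is a sketch of the ordering to avoid: copying the entire source tree before installing dependencies invalidates the npm install layer on every code change, forcing a full reinstall each build:

```dockerfile
# Anti-pattern: any change to the source tree busts the cache,
# so npm install reruns on every build
COPY . .
RUN npm install
```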
Use .dockerignore
A .dockerignore file prevents unwanted files from being copied into the build context, reducing both build time and image size.
Example .dockerignore:
node_modules
npm-debug.log
Dockerfile
.git
.gitignore
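.dockerignore also supports exception patterns with a leading !, so you can exclude a whole class of files while re-including specific ones, for example:

```
# Exclude all markdown files from the build context except the README
*.md
!README.md
```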
Ship Phase
Scan Your Images for Security Vulnerabilities
Scan Docker images regularly against known vulnerabilities with tools like Docker Scout, Trivy, or Snyk.
Example:
docker run -it \
  -e DOCKER_SCOUT_HUB_USER=<your Docker Hub user name> \
  -e DOCKER_SCOUT_HUB_PASSWORD=<your Docker Hub PAT> \
  docker/scout-cli <command>
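If you prefer Trivy, a typical scan (assuming Trivy is installed locally; the image name is just an example) looks like:

```shell
# Report only high and critical CVEs, and fail with a non-zero
# exit code if any are found (useful in CI pipelines)
trivy image --severity HIGH,CRITICAL --exit-code 1 nginx:1.27.0-alpine
```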
Run Phase
Use the Least Privileged User
Run your applications as a non-root user to limit the damage an attacker can do if the container is compromised.
Example:
FROM node:22.5-alpine
WORKDIR /app
RUN adduser -D myuser
COPY --chown=myuser:myuser . .
USER myuser
CMD ["npm", "start"]
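As a shortcut worth knowing, the official Node images already ship with a built-in non-root user named node, so you can often skip creating your own:

```dockerfile
FROM node:22.5-alpine
WORKDIR /app
# The official Node images include a non-root "node" user out of the box
COPY --chown=node:node . .
USER node
CMD ["npm", "start"]
```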
Implement Resource Limits
Set memory and CPU limits for your containers to avoid contention for shared resources and keep performance consistent across environments.
Example docker-compose.yml:
version: '3'
services:
  app:
    image: myapp:latest
    deploy:
      resources:
        limits:
          cpus: '0.50'
          memory: 512M
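The same limits can be applied to a single container started from the command line; the flags below are standard docker run options:

```shell
# Cap the container at half a CPU core and 512 MiB of memory
docker run --cpus="0.5" --memory="512m" myapp:latest
```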
The best practices above will help you create leaner, fitter Docker containers with better performance. Optimization is a continuous process, so keep reviewing your Dockerfiles and container configurations as requirements change to keep performance at its peak.
As a developer, these practices have been invaluable in both my personal and professional projects, helping me build applications that are lighter, faster, and more reliable, even as they grow more complex. I encourage you to experiment with these techniques and see how they improve your Docker workflow.