How to Optimize Docker Builds with Cache Management in GitHub Actions

Introduction

In this article we share some practical insights into managing Docker build caches within GitHub Actions, which can noticeably speed up CI/CD workflows. The examples are adapted from sample README files into a more digestible, ready-to-use form. Here's a rundown of the cache strategies we can leverage:

Steps we will cover in this article:

- GitHub Actions Cache
- Inline Cache
- Registry Cache
- Cache Mounts
- Local Cache

GitHub Actions Cache

This method uses the GitHub Actions Cache API, so it only works inside GitHub Actions workflows. It stores cache blobs that persist across jobs and workflow runs, but the backend is still experimental, so we should test it carefully before relying on it for critical builds.

.github/workflows/build.yml
name: Docker Build

on:
  push:

jobs:
  docker:
    runs-on: ubuntu-latest
    steps:
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Login to Docker Hub
        uses: docker/login-action@v3
        with:
          username: ${{ vars.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      - name: Build and push
        uses: docker/build-push-action@v6
        with:
          push: true
          tags: user/app:latest
          cache-from: type=gha
          cache-to: type=gha,mode=max
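
One optional tweak that isn't in the example above: the gha backend accepts a scope attribute, which we could use to keep caches for different branches from evicting each other. A minimal sketch, assuming we want one cache scope per branch:

      - name: Build and push
        uses: docker/build-push-action@v6
        with:
          push: true
          tags: user/app:latest
          # Scope the cache by branch so feature branches don't overwrite each other's layers
          cache-from: type=gha,scope=${{ github.ref_name }}
          cache-to: type=gha,mode=max,scope=${{ github.ref_name }}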

Inline Cache

The inline cache exporter is the simplest option and works well for most cases. It embeds the cache metadata directly in the image itself, so nothing extra needs to be pushed. However, it only supports "min" cache mode: only the layers of the final image are cached, not layers from intermediate build stages. If we want "max" cache mode, we'll need to push the cache separately using the registry cache exporter.

.github/workflows/build.yml
name: Docker Build

on:
  push:

jobs:
  docker:
    runs-on: ubuntu-latest
    steps:
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Login to Docker Hub
        uses: docker/login-action@v3
        with:
          username: ${{ vars.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      - name: Build and push
        uses: docker/build-push-action@v6
        with:
          push: true
          tags: user/app:latest
          cache-from: type=registry,ref=user/app:latest
          cache-to: type=inline
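
As a side note, if a pipeline is still on an older setup without the cache-to input, inline cache metadata can also be enabled through the BUILDKIT_INLINE_CACHE=1 build argument. A rough sketch of the same step using that approach (the image names are placeholders):

      - name: Build and push
        uses: docker/build-push-action@v6
        with:
          push: true
          tags: user/app:latest
          cache-from: type=registry,ref=user/app:latest
          # Equivalent to cache-to: type=inline on older setups
          build-args: |
            BUILDKIT_INLINE_CACHE=1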

Registry Cache

When more aggressive caching is needed (max mode), we can export the cache to a registry under its own reference. In max mode, layers from all intermediate stages are cached as well, not just those in the final image, and the cache metadata is stored separately from the image itself, which helps with larger or more complex multi-stage builds.

.github/workflows/build.yml
name: Docker Build

on:
  push:

jobs:
  docker:
    runs-on: ubuntu-latest
    steps:
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Login to Docker Hub
        uses: docker/login-action@v3
        with:
          username: ${{ vars.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      - name: Build and push
        uses: docker/build-push-action@v6
        with:
          push: true
          tags: user/app:latest
          cache-from: type=registry,ref=user/app:buildcache
          cache-to: type=registry,ref=user/app:buildcache,mode=max
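
The cache ref doesn't have to live on Docker Hub. As a variation on the example above (not part of the original), the same pattern can point at GitHub Container Registry using the built-in GITHUB_TOKEN; the image names here are illustrative:

      - name: Login to GHCR
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Build and push
        uses: docker/build-push-action@v6
        with:
          push: true
          tags: ghcr.io/${{ github.repository }}:latest
          cache-from: type=registry,ref=ghcr.io/${{ github.repository }}:buildcache
          cache-to: type=registry,ref=ghcr.io/${{ github.repository }}:buildcache,mode=max

Note that pushing to ghcr.io this way requires the workflow's GITHUB_TOKEN to have packages: write permission.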

Cache Mounts

BuildKit doesn't include the contents of RUN --mount=type=cache mounts in any of the exported caches, and on ephemeral GitHub-hosted runners they're gone after each run. However, we can use an external GitHub Action to work around this limitation by extracting the mount contents and restoring them through actions/cache. This is especially useful for language-level caches, like Go's build and module caches.

Dockerfile
FROM golang:1.21.1-alpine AS base-build

WORKDIR /build

# Point the Go module cache at the same directory as the build cache,
# so a single cache mount covers both go mod download and go build.
RUN go env -w GOMODCACHE=/root/.cache/go-build

COPY go.mod go.sum ./
RUN --mount=type=cache,target=/root/.cache/go-build go mod download

COPY ./src ./src
RUN --mount=type=cache,target=/root/.cache/go-build go build -o /bin/app /build/src
.github/workflows/build.yml
name: Docker Build

on:
  push:

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v3

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Docker meta
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: YOUR_IMAGE
          tags: |
            type=ref,event=branch
            type=ref,event=pr
            type=semver,pattern={{version}}
            type=semver,pattern={{major}}.{{minor}}

      - name: Go Build Cache for Docker
        uses: actions/cache@v4
        with:
          path: go-build-cache
          key: ${{ runner.os }}-go-build-cache-${{ hashFiles('**/go.sum') }}

      - name: Inject go-build-cache into Docker
        uses: reproducible-containers/buildkit-cache-dance@4b2444fec0c0fb9dbf175a96c094720a692ef810 # v2.1.4
        with:
          cache-source: go-build-cache

      - name: Build and push
        uses: docker/build-push-action@v6
        with:
          cache-from: type=gha
          cache-to: type=gha,mode=max
          file: build/package/Dockerfile
          push: ${{ github.event_name != 'pull_request' }}
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          platforms: linux/amd64,linux/arm64
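
The same dance isn't Go-specific. As a rough sketch (the names below are ours, not from the original example), the pattern could be reused for an npm cache by pairing a RUN --mount=type=cache,target=/root/.npm npm ci line in the Dockerfile with steps like these, using the action's cache-target input to point at the mount path instead of the default Go build cache location:

      - name: npm cache for Docker
        uses: actions/cache@v4
        with:
          path: npm-cache
          key: ${{ runner.os }}-npm-cache-${{ hashFiles('**/package-lock.json') }}

      - name: Inject npm-cache into Docker
        uses: reproducible-containers/buildkit-cache-dance@4b2444fec0c0fb9dbf175a96c094720a692ef810 # v2.1.4
        with:
          cache-source: npm-cache
          # Must match the target= path of the cache mount in the Dockerfile
          cache-target: /root/.npm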

Local Cache

This method exports the build cache to a local directory on the runner and persists it with actions/cache. The downside is that old cache entries aren't deleted automatically, so the cache keeps growing over time. The current workaround is to write the new cache to a separate directory and swap it into place after each build, as the Move cache step below does.

.github/workflows/build.yml
name: Docker Build

on:
  push:

jobs:
  docker:
    runs-on: ubuntu-latest
    steps:
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Cache Docker layers
        uses: actions/cache@v4
        with:
          path: /tmp/.buildx-cache
          key: ${{ runner.os }}-buildx-${{ github.sha }}
          restore-keys: |
            ${{ runner.os }}-buildx-

      - name: Login to Docker Hub
        uses: docker/login-action@v3
        with:
          username: ${{ vars.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      - name: Build and push
        uses: docker/build-push-action@v6
        with:
          push: true
          tags: user/app:latest
          cache-from: type=local,src=/tmp/.buildx-cache
          cache-to: type=local,dest=/tmp/.buildx-cache-new,mode=max

      - # Temp fix
        # https://github.com/docker/build-push-action/issues/252
        # https://github.com/moby/buildkit/issues/1896
        name: Move cache
        run: |
          rm -rf /tmp/.buildx-cache
          mv /tmp/.buildx-cache-new /tmp/.buildx-cache
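
One small variation worth considering (our addition, not part of the original example): including the branch name in the cache key keeps long-lived branches from constantly restoring each other's layer caches, while still falling back to whatever cache is available:

      - name: Cache Docker layers
        uses: actions/cache@v4
        with:
          path: /tmp/.buildx-cache
          # Prefer an exact per-commit hit, then the current branch, then any branch
          key: ${{ runner.os }}-buildx-${{ github.ref_name }}-${{ github.sha }}
          restore-keys: |
            ${{ runner.os }}-buildx-${{ github.ref_name }}-
            ${{ runner.os }}-buildx-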

Conclusion

Leveraging the right caching strategy can dramatically cut Docker build times in GitHub Actions. The inline, registry, and GitHub Actions cache backends each trade simplicity against how much of the build they can cache, while cache mounts and the local cache cover the cases they miss, such as language-level dependency caches and self-managed storage. Picking the combination that fits your project, especially for larger builds with many dependencies, is what turns these examples into real CI/CD speedups.