Manual image optimization doesn't scale. When multiple developers are pushing changes daily, it's easy for unoptimized images to slip into production. The solution? Automate image compression in your CI/CD pipeline. This guide shows you how.
## Why Automate Image Compression?
Automating image optimization in your pipeline provides several benefits:
- Consistency - Every image gets optimized, every time
- Developer productivity - No manual optimization steps
- Quality gates - Block deployments with oversized images
- Versioning - Track optimization alongside code changes
- Speed - Parallel processing of multiple images
## Choosing Your Approach
There are two main strategies for CI/CD image optimization:
### 1. Build-Time Optimization

Optimize images during the build process, before deployment:

```text
[Push] → [Build] → [Optimize Images] → [Deploy]
```
Pros:
- Images optimized before reaching production
- Can fail builds if optimization fails
- Works with any hosting
Cons:
- Increases build time
- Requires compute resources
### 2. Pre-Commit Optimization

Optimize images before they enter the repository:

```text
[Optimize] → [Commit] → [Push] → [Build] → [Deploy]
```
Pros:
- Faster builds
- Smaller repository size
- Images always optimized in repo
Cons:
- Requires developer tooling setup
- Less control over pipeline
## GitHub Actions Implementation

### Basic Image Optimization Workflow

Create `.github/workflows/optimize-images.yml`:
```yaml
name: Optimize Images

on:
  push:
    paths:
      - '**.jpg'
      - '**.jpeg'
      - '**.png'
      - '**.gif'
      - '**.webp'

jobs:
  optimize:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0  # full history, so the git diff below can compare commits

      - name: Find changed images
        id: changed-images
        run: |
          # Collapse the list onto one line; multi-line values need
          # heredoc syntax in $GITHUB_OUTPUT
          IMAGES=$(git diff --name-only ${{ github.event.before }} ${{ github.sha }} \
            | grep -E '\.(jpg|jpeg|png|gif|webp)$' | tr '\n' ' ' || true)
          echo "images=$IMAGES" >> "$GITHUB_OUTPUT"

      - name: Optimize images
        if: steps.changed-images.outputs.images != ''
        env:
          OCTOSQUEEZE_API_KEY: ${{ secrets.OCTOSQUEEZE_API_KEY }}
        run: |
          for image in ${{ steps.changed-images.outputs.images }}; do
            echo "Optimizing: $image"
            curl -X POST https://api.octosqueeze.com/v1/compress \
              -H "Authorization: Bearer $OCTOSQUEEZE_API_KEY" \
              -F "file=@$image" \
              -F "format=webp" \
              -o "${image%.*}.webp"
          done

      - name: Commit optimized images
        run: |
          git config user.name github-actions
          git config user.email [email protected]
          git add -A
          git commit -m "chore: optimize images [skip ci]" || exit 0
          git push
```
### Advanced Workflow with Quality Gates
```yaml
name: Image Quality Gate

on:
  pull_request:
    paths:
      - 'public/images/**'
      - 'assets/images/**'

jobs:
  check-images:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Check image sizes
        id: check-sizes
        run: |
          MAX_SIZE=500000  # 500KB
          FAILED=0
          for file in $(find . -type f \( -name "*.jpg" -o -name "*.png" -o -name "*.webp" \)); do
            SIZE=$(stat -f%z "$file" 2>/dev/null || stat -c%s "$file")
            if [ "$SIZE" -gt "$MAX_SIZE" ]; then
              echo "::error file=$file::Image exceeds 500KB limit ($SIZE bytes)"
              FAILED=1
            fi
          done
          if [ "$FAILED" -eq 1 ]; then
            echo "::error::Some images exceed the size limit. Please optimize them."
            exit 1
          fi

      - name: Verify WebP versions exist
        run: |
          for img in $(find public/images -name "*.jpg" -o -name "*.png"); do
            webp="${img%.*}.webp"
            if [ ! -f "$webp" ]; then
              echo "::warning file=$img::Missing WebP version"
            fi
          done
```
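The same gate is handy locally, before a PR even opens. A minimal sketch, mirroring the limit and extensions from the workflow above:

```shell
# Local version of the CI size gate: list images over the limit.
MAX_SIZE=500000  # 500 KB, matching the workflow above

oversized_images() {
  # $1: directory to scan
  find "$1" -type f \( -name "*.jpg" -o -name "*.png" -o -name "*.webp" \) |
  while read -r file; do
    # stat -f%z is the BSD/macOS form; -c%s is the GNU/Linux fallback
    SIZE=$(stat -f%z "$file" 2>/dev/null || stat -c%s "$file")
    if [ "$SIZE" -gt "$MAX_SIZE" ]; then
      echo "$file ($SIZE bytes)"
    fi
  done
}

# Example: scan the current tree
oversized_images .
```

Run it from the repository root; any output means the CI gate would fail.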
## GitLab CI Implementation

Create `.gitlab-ci.yml`:
```yaml
stages:
  - optimize
  - build
  - deploy

optimize-images:
  stage: optimize
  image: alpine:latest  # curlimages/curl has no git, so install both tools
  before_script:
    - apk add --no-cache curl git
  rules:
    - changes:
        - "**/*.{jpg,jpeg,png,gif,webp}"
  script:
    - |
      for image in $(git diff --name-only HEAD~1 | grep -E '\.(jpg|jpeg|png|gif|webp)$'); do
        echo "Optimizing: $image"
        mkdir -p "optimized/$(dirname "$image")"
        curl -X POST https://api.octosqueeze.com/v1/compress \
          -H "Authorization: Bearer $OCTOSQUEEZE_API_KEY" \
          -F "file=@$image" \
          -F "format=webp" \
          -o "optimized/${image%.*}.webp"
      done
  artifacts:
    paths:
      - optimized/
    expire_in: 1 hour

build:
  stage: build
  dependencies:
    - optimize-images
  script:
    - cp -r optimized/* public/images/
    - npm run build
```
## Jenkins Pipeline

Create a `Jenkinsfile`:
```groovy
pipeline {
    agent any

    environment {
        OCTOSQUEEZE_API_KEY = credentials('octosqueeze-api-key')
    }

    stages {
        stage('Checkout') {
            steps {
                checkout scm
            }
        }

        stage('Optimize Images') {
            when {
                // Multiple changeset conditions are ANDed by default,
                // so wrap them in anyOf to trigger on any image change
                anyOf {
                    changeset "**/*.jpg"
                    changeset "**/*.png"
                    changeset "**/*.webp"
                }
            }
            steps {
                script {
                    def images = sh(
                        script: "git diff --name-only HEAD~1 | grep -E '\\.(jpg|png|webp)\$' || true",
                        returnStdout: true
                    ).trim().split('\n')

                    images.each { image ->
                        if (image) {
                            // \$OCTOSQUEEZE_API_KEY is expanded by the shell,
                            // keeping the secret out of Groovy string interpolation
                            sh """
                                curl -X POST https://api.octosqueeze.com/v1/compress \
                                  -H "Authorization: Bearer \$OCTOSQUEEZE_API_KEY" \
                                  -F "file=@${image}" \
                                  -F "format=webp" \
                                  -o "${image.replaceAll('\\.\\w+$', '.webp')}"
                            """
                        }
                    }
                }
            }
        }

        stage('Build') {
            steps {
                sh 'npm run build'
            }
        }
    }
}
```
## Pre-Commit Hook Setup
For local optimization before commits, use a pre-commit hook.
### Using Husky (Node.js)

Install Husky:

```bash
npm install --save-dev husky
npx husky init
```

Create `.husky/pre-commit`:
```bash
#!/bin/sh

# Find staged images
STAGED_IMAGES=$(git diff --cached --name-only | grep -E '\.(jpg|jpeg|png|gif)$' || true)

if [ -n "$STAGED_IMAGES" ]; then
  echo "Optimizing staged images..."
  for image in $STAGED_IMAGES; do
    if [ -f "$image" ]; then
      # Optimize with the OctoSqueeze CLI, then re-stage the result
      npx octosqueeze compress "$image" --format webp --replace
      git add "$image"
    fi
  done
fi
```
### Using Git Hooks Directly

Create `.git/hooks/pre-commit` and make it executable with `chmod +x`:
```bash
#!/bin/bash

STAGED=$(git diff --cached --name-only --diff-filter=ACM | grep -E '\.(jpg|jpeg|png)$' || true)

if [ -n "$STAGED" ]; then
  for file in $STAGED; do
    # Call the optimization script, then re-stage the result
    ./scripts/optimize-image.sh "$file"
    git add "$file"
  done
fi
```
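The hook delegates to `./scripts/optimize-image.sh`, which isn't shown above. One possible sketch, reusing the same (assumed) API endpoint and `OCTOSQUEEZE_API_KEY` variable as the earlier workflows:

```shell
#!/bin/sh
# scripts/optimize-image.sh: compress one image via the OctoSqueeze API.
# Usage: ./scripts/optimize-image.sh path/to/image.jpg

derive_output() {
  # photos/cat.jpg -> photos/cat.webp
  echo "${1%.*}.webp"
}

if [ -n "$1" ] && [ -f "$1" ]; then
  out=$(derive_output "$1")
  curl -s -X POST https://api.octosqueeze.com/v1/compress \
    -H "Authorization: Bearer $OCTOSQUEEZE_API_KEY" \
    -F "file=@$1" \
    -F "format=webp" \
    -o "$out"
  echo "Optimized: $1 -> $out"
fi
```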
## Parallel Processing for Speed

When dealing with many images, process them in parallel:

### Bash Parallel Processing
```bash
#!/bin/bash

optimize_image() {
  local image=$1
  curl -s -X POST https://api.octosqueeze.com/v1/compress \
    -H "Authorization: Bearer $OCTOSQUEEZE_API_KEY" \
    -F "file=@$image" \
    -F "format=webp" \
    -o "${image%.*}.webp"
  echo "Optimized: $image"
}
export -f optimize_image
export OCTOSQUEEZE_API_KEY

# Process 4 images in parallel
find . \( -name "*.jpg" -o -name "*.png" \) | \
  xargs -P 4 -I {} bash -c 'optimize_image "$@"' _ {}
```
### Node.js Concurrent Processing
```javascript
const { promisify } = require('util');
const exec = promisify(require('child_process').exec);
const glob = require('glob');
const pLimit = require('p-limit');

const limit = pLimit(4); // 4 concurrent requests

async function optimizeAll() {
  const images = glob.sync('**/*.{jpg,png}', {
    ignore: ['node_modules/**', 'dist/**'],
  });

  const promises = images.map((image) =>
    limit(() => optimizeImage(image))
  );

  await Promise.all(promises);
}

async function optimizeImage(path) {
  const output = path.replace(/\.\w+$/, '.webp');
  await exec(`
    curl -s -X POST https://api.octosqueeze.com/v1/compress \
      -H "Authorization: Bearer ${process.env.OCTOSQUEEZE_API_KEY}" \
      -F "file=@${path}" \
      -F "format=webp" \
      -o "${output}"
  `);
  console.log(`Optimized: ${path} → ${output}`);
}

optimizeAll();
```
## Caching for Faster Builds

Avoid re-optimizing unchanged images:

### Content-Addressable Caching
```yaml
# GitHub Actions with caching
- name: Cache optimized images
  uses: actions/cache@v3
  with:
    path: .image-cache
    key: images-${{ hashFiles('**/*.jpg', '**/*.png') }}

- name: Optimize uncached images
  run: |
    mkdir -p .image-cache
    for image in $(find . -name "*.jpg" -o -name "*.png"); do
      HASH=$(md5sum "$image" | cut -d' ' -f1)
      CACHED=".image-cache/${HASH}.webp"
      if [ -f "$CACHED" ]; then
        cp "$CACHED" "${image%.*}.webp"
      else
        # Optimize and cache
        curl ... -o "${image%.*}.webp"
        cp "${image%.*}.webp" "$CACHED"
      fi
    done
```
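Broken out of the workflow step, the lookup is just two small shell functions. A sketch, assuming the same `.image-cache` directory cached above:

```shell
# Content-addressed cache helpers for the caching step above.
CACHE_DIR=.image-cache

cache_key() {
  # Hash file contents, so renames don't invalidate the cache
  md5sum "$1" | cut -d' ' -f1
}

cached_webp() {
  # Print the cached path if an optimized copy exists; print nothing otherwise
  key=$(cache_key "$1") || return 1
  if [ -f "$CACHE_DIR/$key.webp" ]; then
    echo "$CACHE_DIR/$key.webp"
  fi
}
```

Keying on content rather than filename means moving or renaming an image still hits the cache.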
## Monitoring and Alerts
Track optimization results in your pipeline:
```yaml
- name: Report optimization stats
  run: |
    ORIGINAL_SIZE=$(du -sb original-images | cut -f1)
    OPTIMIZED_SIZE=$(du -sb optimized-images | cut -f1)
    SAVINGS=$((100 - (OPTIMIZED_SIZE * 100 / ORIGINAL_SIZE)))

    echo "## Image Optimization Report" >> "$GITHUB_STEP_SUMMARY"
    echo "- Original size: $(numfmt --to=iec $ORIGINAL_SIZE)" >> "$GITHUB_STEP_SUMMARY"
    echo "- Optimized size: $(numfmt --to=iec $OPTIMIZED_SIZE)" >> "$GITHUB_STEP_SUMMARY"
    echo "- Savings: ${SAVINGS}%" >> "$GITHUB_STEP_SUMMARY"
```
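The savings figure is pure integer arithmetic (note that `$((...))` truncates, so the reported percentage can be off by up to one point). Pulled out as a function, it is easy to sanity-check:

```shell
# Integer percentage saved, given original and optimized byte counts
savings_pct() {
  # $1: original size in bytes, $2: optimized size in bytes
  echo $((100 - ($2 * 100 / $1)))
}
```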
## Conclusion
Automating image compression in CI/CD ensures:
- Consistent optimization across all team members
- Quality gates to prevent oversized images in production
- Faster development with no manual steps
- Reliable deployments with predictable image sizes
Start with a simple workflow that optimizes changed images, then add quality gates and caching as your needs grow. Your pipeline—and your users—will thank you.