Prerequisites
Before integrating Jenkins with Docker on Debian, ensure you have a Debian system with sudo privileges and a working Docker installation. If Docker is not yet installed, run:

sudo apt update && sudo apt install -y docker.io && sudo systemctl start docker && sudo systemctl enable docker

to install and start Docker.

Step 1: Deploy Jenkins in a Docker Container
The most common way to integrate Jenkins with Docker on Debian is to run Jenkins itself inside a Docker container. This provides isolation, portability, and easy management.
sudo docker pull jenkins/jenkins:lts
Then run the Jenkins container, mounting the host's Docker socket (/var/run/docker.sock) to allow Jenkins to control Docker from within the container:

sudo docker run -d --name jenkins -p 8080:8080 -p 50000:50000 -v jenkins_home:/var/jenkins_home -v /var/run/docker.sock:/var/run/docker.sock jenkins/jenkins:lts
Once the container is running, open http://<Debian-IP>:8080 in a browser. Retrieve the initial admin password with:

sudo docker exec jenkins cat /var/jenkins_home/secrets/initialAdminPassword

and enter it on the unlock page.

Step 2: Configure Jenkins to Use Docker
To allow Jenkins to execute Docker commands (e.g., building, running containers), you need to grant it access to the Docker daemon.
The -v /var/run/docker.sock:/var/run/docker.sock flag in the docker run command links the host's Docker socket to the container, enabling Jenkins to communicate with the Docker daemon. Note that the official jenkins/jenkins:lts image does not ship with the Docker CLI; install it inside the container (for example, sudo docker exec -u root jenkins sh -c 'apt-get update && apt-get install -y docker.io') before running docker commands in builds. To use containers as build agents, install the Docker plugin in Jenkins, add a Docker cloud under Manage Jenkins, point it at the mounted socket (unix:///var/run/docker.sock), specify an agent image (e.g., ubuntu:latest), and test the connection. If successful, Jenkins can now spin up containers as agents.

Step 3: Create a Docker-Based Jenkins Pipeline
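As a quick end-to-end check before writing a full pipeline, a minimal Jenkinsfile can run a stage inside a disposable container. This is a sketch; it assumes the Docker Pipeline plugin is installed, and the stage name is illustrative:

```groovy
pipeline {
    // Jenkins pulls the image and runs every stage inside a fresh container
    agent {
        docker { image 'ubuntu:latest' }
    }
    stages {
        stage('Smoke Test') {
            steps {
                // Confirms the step really executed inside the Ubuntu container
                sh 'cat /etc/os-release'
            }
        }
    }
}
```

If the build log shows the Ubuntu release details, the Docker integration is working.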
Use a Jenkinsfile (stored in your source code repository) to define a pipeline that leverages Docker for building, testing, and deploying applications. Below is an example pipeline for a Java application:
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                checkout scm // Pull code from Git
            }
        }
        stage('Build Docker Image') {
            steps {
                script {
                    // Build a Docker image from the current directory (a Dockerfile must exist)
                    docker.build("my-app:${env.BUILD_ID}")
                }
            }
        }
        stage('Run Tests in Docker') {
            steps {
                script {
                    // Run tests inside a container created from the built image
                    docker.image("my-app:${env.BUILD_ID}").inside {
                        sh 'mvn test' // Example: run Maven tests
                    }
                }
            }
        }
        stage('Push to Docker Registry') {
            steps {
                script {
                    // Push the image to Docker Hub (requires stored credentials;
                    // note that Docker Hub expects a namespaced tag, e.g. <username>/my-app)
                    withDockerRegistry([credentialsId: 'docker-hub-creds', url: 'https://index.docker.io/v1/']) {
                        // Double quotes are required so Groovy interpolates ${env.BUILD_ID}
                        sh "docker push my-app:${env.BUILD_ID}"
                    }
                }
            }
        }
        stage('Deploy to Production') {
            steps {
                script {
                    // Deploy the image to a production server (example: local Docker host)
                    sh 'docker stop my-app || true' // Stop the existing container if running
                    sh 'docker rm my-app || true'   // Remove the existing container if present
                    sh "docker run -d -p 8081:8080 --name my-app my-app:${env.BUILD_ID}"
                }
            }
        }
    }
}
Key Docker Pipeline steps used above:

- docker.build(): creates a Docker image from a Dockerfile in the workspace.
- docker.image().inside(): runs a container from the built image and executes commands inside it.
- withDockerRegistry(): authenticates with a Docker registry (e.g., Docker Hub) to push/pull images.

Step 4: Optional - Use Docker Compose for Complex Setups
For applications requiring multiple containers (e.g., a web app + database), use Docker Compose to define and manage the environment.
Example docker-compose.yml file:

version: '3'
services:
  app:
    build: .
    ports:
      - "8081:8080"
    depends_on:
      - db
  db:
    image: postgres:13
    environment:
      POSTGRES_PASSWORD: mysecretpassword
Then run docker-compose commands in a pipeline stage to build and start the multi-container environment:

stage('Deploy with Docker Compose') {
    steps {
        script {
            sh 'docker-compose up -d --build' // Build and start containers in the background
            sh 'docker-compose logs'          // Print logs to verify the deployment ('-f' would block the build)
        }
    }
}
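When Compose is used to bring up an ephemeral test environment (rather than a long-lived deployment), a post block can tear the stack down after every run so stale containers do not accumulate. A minimal sketch, assuming docker-compose is available on the agent:

```groovy
post {
    always {
        // Stop and remove the Compose stack regardless of build result
        sh 'docker-compose down --remove-orphans'
    }
}
```

Placed at the end of the pipeline block, this runs whether the build succeeds or fails.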
Best Practices
- Persist Jenkins data in a named volume (-v jenkins_home:/var/jenkins_home) to avoid data loss when the container is recreated.
- Avoid running builds as root: use the --user flag with docker run (e.g., --user jenkins) or create a dedicated Jenkins user in the container.
- Use lightweight base images (e.g., alpine, ubuntu:jammy) in Dockerfiles to reduce build and runtime overhead.
- Never hard-code registry credentials in a Jenkinsfile; store them in the Jenkins credentials store and reference them by credentialsId.
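The credentials guidance can also be expressed with the Docker Pipeline API instead of withDockerRegistry(). A sketch, assuming a Jenkins credential with ID 'docker-hub-creds' already exists:

```groovy
stage('Push Image') {
    steps {
        script {
            // The credential is resolved by ID at runtime; nothing secret lives in the Jenkinsfile
            docker.withRegistry('https://index.docker.io/v1/', 'docker-hub-creds') {
                docker.image("my-app:${env.BUILD_ID}").push()
            }
        }
    }
}
```

Either form keeps secrets out of source control while letting the pipeline authenticate on demand.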