Run Multiple Commands in Docker Compose (4 Methods + Examples & Best Practices)

Docker Compose allows you to run multiple commands inside containers, but choosing the right method depends on your use case—whether it’s sequential execution, parallel tasks, or complex workflows. Using the correct approach helps avoid failures, improves maintainability, and ensures predictable container behavior. In this guide, you’ll learn practical ways to run multiple commands efficiently with real-world scenarios.

Quick cheat sheet: Docker Compose multiple commands

| Method | Syntax | Execution Type | Best Use Case |
|---|---|---|---|
| Sequential (`&&`) | `command: sh -c "cmd1 && cmd2"` | Sequential | Dependent commands (build → start) |
| Multiline (`\|`) | YAML multiline block | Sequential (readable) | Long multi-step commands |
| docker-compose run | `docker-compose run service cmd` | One-off | Ad-hoc tasks / debugging |
| docker-compose exec | `docker-compose exec service cmd` | On running container | Debug or maintenance |
| Shell script | `command: /script.sh` | Sequential | Complex workflows |
| Supervisor | Process manager | Parallel | Multiple long-running processes |

Quick decision guide:

  • Use && → when commands depend on each other
  • Use multiline (|) → when you want readability
  • Use exec → when container is already running
  • Use scripts → when logic becomes complex
  • Use Supervisor → when running multiple services in one container

Method 1: Run multiple commands sequentially using &&

How && controls execution flow

The && operator allows you to run multiple commands sequentially, ensuring that each command executes only if the previous one succeeds. This is useful when commands are dependent on each other.

Example:

```yaml
command: sh -c "npm install && npm run build && npm start"
```

In this case:

  • npm install must succeed before npm run build runs
  • npm run build must succeed before npm start runs

If any command fails, the remaining commands are skipped, which helps prevent inconsistent application states.

When to allow failure and continue execution

Sometimes you may want later commands to run even if an earlier one fails. Separating commands with `;` instead of `&&` runs the next command regardless of the previous exit status:

```yaml
command: sh -c "command1 && command2 ; command3"
```

Here `command2` still requires `command1` to succeed, but `command3` runs no matter what happened before it.

You can also force the container to report success even when a command fails:

```yaml
command: sh -c "command1 && command2 ; exit 0"
```

The trailing `exit 0` makes the shell, and therefore the container, exit with status 0 regardless of earlier failures. Use this approach carefully, as it may hide failures and make debugging harder.
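The difference between `&&` and `;` is easy to verify in any POSIX shell (the commands below are illustrative placeholders, using `false` to simulate a failing command):

```shell
# With '&&', the command after a failure is skipped;
# with ';', the next command runs regardless of the previous exit status.
sh -c "false && echo 'after &&' ; echo 'after ;'"
# Only "after ;" is printed, because 'false' fails and short-circuits '&&'.
```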


Method 2: Run multiple commands using multiline (|) syntax

Writing clean multi-line commands in docker-compose.yml

For better readability, especially when commands become long, you can use YAML multiline syntax:

```yaml
command:
  - /bin/sh
  - -c
  - |
    echo "Installing dependencies"
    npm install
    echo "Starting application"
    npm start
```

This keeps your configuration clean and easier to maintain compared to long single-line commands.

Although commands are written on separate lines, they are executed sequentially inside the container. Unlike `&&`, this approach does not automatically stop execution on failure unless explicitly handled (for example with `set -e`).

Output is displayed line by line, making it easier to debug multi-step workflows.
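If you want the multiline block to stop at the first failure, the way `&&` does, add `set -e` as its first line. A minimal sketch of the behavior, runnable in any POSIX shell (`false` stands in for a failing command):

```shell
sh <<'EOF'
set -e                # exit immediately when any command fails
echo "step 1"
false                 # simulated failure: the script stops here
echo "step 2"         # never reached because of set -e
EOF
```

Only "step 1" is printed, and the shell exits with a non-zero status, so the container would stop instead of running the remaining steps.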


Method 3: Use shell scripts for complex workflows

When scripts are better than inline commands

When your workflow involves multiple steps, conditional logic, or loops, shell scripts are a better choice than inline commands.

Example:

```yaml
command: /app/start.sh
```

Example script:

```sh
#!/bin/sh
echo "Running migrations"
npm run migrate

echo "Starting application"
npm start
```

Scripts improve readability, reusability, and make debugging easier.
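One way to wire such a script into Compose (the file name and mount path here are assumptions for illustration) is to mount it as a volume and invoke it through `sh`, which works even if the file lacks the executable bit:

```yaml
services:
  app:
    image: node:18
    volumes:
      - ./start.sh:/app/start.sh   # mount the script from the host
    command: sh /app/start.sh      # 'sh <script>' avoids needing chmod +x
```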

Managing environment variables and arguments

Shell scripts make it easier to work with environment variables and pass arguments:

```sh
#!/bin/sh
echo "Environment: $ENV"
npm run build -- --env="$ENV"
npm start
```

You can define variables in docker-compose.yml:

```yaml
environment:
  - ENV=production
```

This approach provides flexibility and is ideal for production-grade workflows.
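Outside Compose you can see the same mechanism with a plain shell: a variable set in the process environment is visible inside the command, exactly as `environment:` makes `ENV` visible to the script (the value `production` mirrors the example above):

```shell
# Simulates 'environment: - ENV=production' by setting the variable
# for the duration of the command.
ENV=production sh -c 'echo "Environment: $ENV"'
# Prints: Environment: production
```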


Method 4: Manage multiple processes using Supervisor

Running multiple services inside one container

Supervisor is a process manager that allows you to run and manage multiple processes inside a single container. This is useful when your application requires multiple long-running services such as a web server, worker, and scheduler.

Example setup:

```dockerfile
# Dockerfile
FROM alpine
RUN apk add --no-cache supervisor
COPY supervisord.conf /etc/supervisord.conf
CMD ["supervisord", "-c", "/etc/supervisord.conf"]
```

Example Supervisor configuration:

```ini
[supervisord]
nodaemon=true

[program:web]
command=npm start

[program:worker]
command=npm run worker
```

This ensures both processes run simultaneously and are managed properly.
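The container above can then be wired into Compose like any other service (the service name and build context are placeholders):

```yaml
services:
  app:
    build: .   # builds the Dockerfile with Supervisor shown above
    # No 'command:' needed: the image's CMD starts supervisord,
    # which in turn launches the web and worker programs.
```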

When Supervisor is required vs overkill

Use Supervisor when:

  • You must run multiple long-running processes in one container
  • Processes need automatic restart or monitoring
  • You are handling tightly coupled services

Avoid Supervisor when:

  • You can split services into separate containers (recommended Docker practice)
  • You only need to run sequential commands
  • Simpler methods like && or scripts are sufficient

Practical example: Running multiple commands in Docker Compose

Below is a real-world example where a Node.js application performs multiple steps before starting:

  • Install dependencies
  • Run database migrations
  • Start the application

```yaml
# docker-compose.yml
version: '3'
services:
  app:
    image: node:18
    working_dir: /usr/src/app
    volumes:
      - .:/usr/src/app
    ports:
      - "3000:3000"
    command: >
      sh -c "npm install &&
             npm run migrate &&
             npm run build &&
             npm start"
```

How this works

  • npm install → installs dependencies
  • npm run migrate → prepares database schema
  • npm run build → builds application assets
  • npm start → starts the application

👉 Each command runs sequentially using &&, ensuring the next step only runs if the previous one succeeds.

What happens if a command fails

If npm run migrate fails:

  • npm run build and npm start will NOT execute
  • The container exits immediately with a non-zero status

This is the recommended behavior in production, as it prevents running a broken application.

If you want to debug issues before removing or restarting containers, you can check the output with `docker-compose logs app` or start a throwaway shell in a new container with `docker-compose run app sh`.


Frequently Asked Questions

1. How do I run multiple commands in Docker Compose?

You can run multiple commands in Docker Compose using operators like && for sequential execution, multiline commands with |, shell scripts, or process managers like Supervisor.

2. What is the difference between docker-compose run and exec?

docker-compose run creates a new container to run a one-off command (for example, `docker-compose run app npm test`), while docker-compose exec runs a command inside an already running container (for example, `docker-compose exec app sh`).

3. Can I run commands in parallel in Docker Compose?

Yes, but not with multiline syntax alone, which still runs commands sequentially. For true parallelism, background commands with the shell's `&` operator (for example, `sh -c "npm run worker & npm start"`) or use a process manager like Supervisor.

4. What is the best method to run multiple commands in Docker Compose?

It depends on the use case. Use && for simple sequential execution, scripts for complex workflows, and Supervisor for managing multiple processes.

  • Use && for simple sequential execution
  • Use multiline syntax (|) for readability
  • Use scripts for complex workflows
  • Use Supervisor only when managing multiple long-running processes
  • Prefer single-purpose containers whenever possible

Choosing the right method depends on your use case, complexity, and environment. Start simple and move to advanced approaches only when required.


Deepak Prasad

R&D Engineer

Founder of GoLinuxCloud with over a decade of expertise in Linux, Python, Go, Laravel, DevOps, Kubernetes, Git, Shell scripting, OpenShift, AWS, Networking, and Security. With extensive experience, he excels across development, DevOps, networking, and security, delivering robust and efficient solutions for diverse projects.