How to Connect Django and Postgres Using Docker and Docker Compose


Author: Natalia Denkiewicz
June 4, 2025

Hi everyone!

In this post, we will explain how to connect a Django application running in a Docker container with a Postgres database, also containerized. To manage both containers easily, we’ll use Docker Compose.

At the time of writing, we are using the following versions:

  • Django 5.2.2
  • Docker 28.1.1

We’ll try to keep this post updated if anything changes in future versions.

Feel free to skip the content that you already know and go to the sections that are more useful for you.

You can find the code for this post on our GitHub.

Getting started: creating a base Django project and installing dependencies

Let’s create a simple Django project that we’ll use to set up the connection with Postgres.

First, we’ll create a virtual environment and install Django. We are using python3-venv and pip, but you can use any environment and package manager you prefer.

# create a base folder to use as the root of this project
mkdir django_postgres_docker
cd django_postgres_docker
python3 -m venv env
source env/bin/activate
pip install django

Next, we are going to create a requirements.txt file:

pip freeze > requirements.txt
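
For reference, the generated requirements.txt should look roughly like this; the exact versions depend on when you run pip freeze, so treat them as illustrative:

# requirements.txt (versions are illustrative; yours may differ)
asgiref==3.8.1
Django==5.2.2
sqlparse==0.5.3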

Now let’s create a small Django project and run the development server to check that everything is working as expected. You should see the default Django welcome page.

django-admin startproject djangopg
cd djangopg
python manage.py runserver

After checking that the application is working at localhost:8000, you can stop the server.

Preparing a Dockerfile for the Django Project

Now, we are going to containerize our Django project. To do that, we need to create a Dockerfile.

In the root folder django_postgres_docker, create a file named Dockerfile and add the following lines:

FROM python:3.13-slim

# Ensure that Python output is sent straight to the terminal (useful for debugging)
ENV PYTHONUNBUFFERED=1

# Create a working directory and copy the Django project into it
RUN mkdir /djangopg
WORKDIR /djangopg
COPY ./djangopg /djangopg

# Update the package list (not strictly required here, but useful if you later need to apt-get install system packages)
RUN apt-get update

# Upgrade pip and install database dependencies
RUN pip install --upgrade pip \
	&& pip install -U psycopg2-binary

# Copy and install Python dependencies
COPY ./requirements.txt /requirements.txt
RUN pip install -r /requirements.txt

Let’s build and run the image to check that everything is working as expected.

First, we’ll build the Docker image. We’re assigning it the name djangopg-img using the --tag (or -t) flag. This way, it’s easier to refer to the image later when we want to run or manage it.

docker build . --tag djangopg-img
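
If you want to confirm that the image was created, you can list it:

docker image ls djangopg-img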

And then, we are going to run the container from the image. We are naming it djangopg_docker by using the flag --name.

docker run --name djangopg_docker -p 8000:8000 djangopg-img python manage.py runserver 0.0.0.0:8000

You should be able to see Django running at localhost, port 8000.

Now you can stop the container:

docker stop djangopg_docker

Updating settings.py to connect to a Postgres database

The next step is to update the database configuration in Django settings so it connects to a Postgres database instead of connecting to the default SQLite database.

We'll begin by creating a .env file to store the database credentials. Then, we'll configure Django to load these variables and use them in the database settings.

At this moment, you should have the following files and folders:

# folders tree
django_postgres_docker
    |__ env
    |__ djangopg
        |   |__ djangopg
        |          |__ __init__.py
        |          |__ asgi.py
        |          |__ settings.py
        |          |__ urls.py
        |          |__ wsgi.py
        |__ manage.py
    |__ Dockerfile
    |__ requirements.txt
    |__ .env

# .env
POSTGRES_DB="postgres_db"
POSTGRES_USER="your_user"
POSTGRES_PASSWORD="your_password"
PG_HOST="postgres_db"
PG_PORT=5432

Note that PG_HOST points to postgres_db, the name of the Postgres service we will define in docker-compose.yml; inside the Compose network, containers can reach each other by service name.

Next, update the DATABASES setting in settings.py so it reads these values from the environment. Docker Compose will inject the variables from the .env file into the containers (via env_file), so os.getenv() can read them at runtime:

# settings.py

import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.getenv('POSTGRES_DB'),
        'USER': os.getenv('POSTGRES_USER'),
        'PASSWORD': os.getenv('POSTGRES_PASSWORD'),
        'HOST': os.getenv('PG_HOST'),
        'PORT': os.getenv('PG_PORT'),
    }
}

Make sure to keep your .env file out of version control by adding it to your .gitignore file if you're using Git.
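
For example, a minimal .gitignore for this project could look like the following (adjust it to your own setup):

# .gitignore
.env
env/
__pycache__/
*.pyc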

Adding a Docker Compose file

Now we are going to add a docker-compose.yml file to bring together the Django application and the Postgres database. This setup will allow us to manage both containers easily and define how they interact with each other.

In the root folder django_postgres_docker, create a docker-compose.yml file and add the following lines to it:

services:
  django:
    build:
      context: .
      dockerfile: ./Dockerfile
    container_name: "djangopg"
    ports:
      - "8000:8000"
    volumes:
      - djangopg:/djangopg
    command: python manage.py runserver 0.0.0.0:8000            
    env_file:
      - .env
    restart: unless-stopped
    depends_on:
      postgres_db:
        condition: service_healthy
  postgres_db:
    image: postgres:16-alpine
    container_name: "postgres_db"
    restart: always
    env_file:
      - .env
    volumes:
      - postgres-data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U $$POSTGRES_USER -d $$POSTGRES_DB"]
      interval: 5s
      timeout: 5s
      retries: 5
volumes:
  djangopg:
  postgres-data:

This Docker Compose file includes:

  • a Django service (built from the custom Dockerfile)
  • a Postgres service (using the official image)
  • shared environment variables from the .env file
  • a healthcheck and a depends_on condition to ensure Postgres is ready to accept connections before Django starts.

After adding docker-compose.yml, your project structure should look like this:

# folders tree
django_postgres_docker
    |__ env
    |__ djangopg
        |   |__ djangopg
        |          |__ __init__.py
        |          |__ asgi.py
        |          |__ settings.py
        |          |__ urls.py
        |          |__ wsgi.py
        |__ manage.py
    |__ Dockerfile
    |__ docker-compose.yml
    |__ requirements.txt
    |__ .env
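
Before building, you can ask Docker Compose to print the fully resolved configuration; this is a quick way to catch YAML or .env mistakes:

docker compose config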

To build the images defined in the docker-compose.yml file, run:

docker compose build

Finally, to create and start the containers based on those images, run:

docker compose up -d

The -d flag starts the containers in the background (detached mode).

Django should now be running at localhost:8000.
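
To check that both containers are up and to follow the Django logs, you can run:

docker compose ps
docker compose logs -f django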

Testing that everything is working as expected: running Django migrations from the container

To confirm that Django and Postgres are working together as expected, we’ll run the default migrations that come with any Django project. If everything is set up correctly, Django should create a series of tables in the Postgres database.

First, we are going to apply the migrations by running manage.py inside the Django container:

docker exec djangopg python manage.py migrate

And now, let’s access the Postgres container and check if the tables were created:

docker exec -it postgres_db sh

You should now have a shell inside the Postgres container. From there, you can connect to the database with:

psql -U your_user -d postgres_db

And then, list the tables by typing:

\dt

If everything went well, you’ll see the default tables that Django creates, like auth_user, django_migrations, and others.
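
If you also want to verify the connection from the Django side, you can run an optional check from a Django shell. Open a shell inside the running container with docker exec -it djangopg python manage.py shell, and then run:

# Ask Postgres for its version through Django's database connection
from django.db import connection

with connection.cursor() as cursor:
    cursor.execute("SELECT version();")
    print(cursor.fetchone()[0])

You should see a string starting with "PostgreSQL 16", which matches the postgres:16-alpine image used in the Compose file.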

That’s all for this post! We hope it was helpful and gave you a clearer idea of how to connect Django and Postgres using Docker and Docker Compose.

But what if your database is running locally instead of inside a container? How can you connect it with a dockerized Django app? We’ll cover that in an upcoming post on our blog!

If you have any questions, suggestions, or just want to share how it went for you, feel free to reach out. We’d love to hear from you!




About the author

Natalia Denkiewicz

Biologist, programmer, and mother of 2- and 4-legged children. She loves nature and spends her time trying to understand small pieces of it.