Load Balancing with Nginx and Docker: A Step-by-Step Guide
In today’s web development landscape, ensuring high availability and reliability of web applications is crucial. One way to achieve this is through load balancing, which distributes incoming traffic across multiple servers. In this blog post, we’ll walk you through setting up a simple load balancer using Nginx and Docker containers, each serving a basic HTML file.
Prerequisites
Before we begin, ensure you have the following installed on your machine:
Docker: To create and manage containers.
Docker Compose: To manage multi-container applications easily.
Project Setup
Start by creating a project directory to hold all our files:
mkdir nginx-load-balancing
cd nginx-load-balancing
Creating Simple Web Pages
Next, we will create two Docker containers, each serving a simple index.html file. Create directories for each container:
mkdir container1 container2
Inside each container directory, create an index.html file with the following commands:
echo "Welcome from Container 1" > container1/index.html
echo "Welcome from Container 2" > container2/index.html
These files will display different welcome messages when accessed.
Dockerfile Configuration
Now, let's create a Dockerfile for each container to set up a lightweight web server using Nginx. Create the following Dockerfile in each container directory:
For Container 1 (container1/Dockerfile):
FROM nginx:alpine
COPY index.html /usr/share/nginx/html/index.html
For Container 2 (container2/Dockerfile):
FROM nginx:alpine
COPY index.html /usr/share/nginx/html/index.html
These Dockerfiles will create Nginx-based images that serve the respective index.html files.
Building Docker Images
With our Dockerfiles in place, let's build the Docker images for both containers:
docker build -t web-container1 container1
docker build -t web-container2 container2
Running Docker Containers
Next, run each container and map it to its own host port:
docker run -d --name container1 -p 8081:80 web-container1
docker run -d --name container2 -p 8082:80 web-container2
Here, Container 1 is accessible on port 8081 and Container 2 on port 8082.
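Before wiring up the load balancer, it's worth confirming each backend answers on its own. The helper below is a small sketch (it assumes curl is installed and the two containers above are running):

```shell
#!/bin/sh
# Smoke test for one backend: fetch its page and compare it to the
# expected welcome message.
check_container() {
  port="$1"
  expected="$2"
  body=$(curl -s "http://localhost:${port}/")
  if [ "$body" = "$expected" ]; then
    echo "port ${port}: OK"
  else
    echo "port ${port}: unexpected response: ${body}" >&2
    return 1
  fi
}
```

For example, `check_container 8081 "Welcome from Container 1"` should print `port 8081: OK`, and the same check against port 8082 should report Container 2's message.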
Nginx Load Balancer Configuration
Now, we will configure Nginx as a load balancer. Create an nginx.conf file in the nginx-load-balancing directory:
touch nginx.conf
Add the following configuration to nginx.conf:
events {}

http {
    upstream backend {
        server container1:80;
        server container2:80;
    }

    server {
        listen 80;

        location / {
            proxy_pass http://backend;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
        }
    }
}
This configuration defines an upstream block that points at both containers, and Nginx distributes incoming requests between them using its default round-robin strategy. Note that the hostnames container1 and container2 are resolved by Docker's built-in DNS, so this only works when the load balancer shares a Docker network with the containers — which Docker Compose will set up for us in the next section.
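Round-robin is the default, but the upstream block accepts other balancing policies. As a variant sketch (not required for this guide), you could weight the servers or switch to least-connections:

```nginx
upstream backend {
    least_conn;                      # prefer the server with the fewest active connections
    server container1:80 weight=2;   # receives roughly twice the share of traffic
    server container2:80;
}
```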
Docker Compose Setup
To simplify our deployment, we will use Docker Compose. Create a docker-compose.yml file in the nginx-load-balancing directory:
version: '3'
services:
  nginx:
    image: nginx:alpine
    container_name: nginx_load_balancer
    ports:
      - "80:80"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro
    depends_on:
      - container1
      - container2
  container1:
    image: web-container1
    container_name: container1
    expose:
      - "80"
  container2:
    image: web-container2
    container_name: container2
    expose:
      - "80"
This docker-compose.yml file defines our Nginx service and the two web container services. Note that the web containers use expose rather than ports, so their port 80 is reachable only by other containers on the Compose network (such as the load balancer), not directly from the host.
Starting the Load Balancer
Now that everything is set up, we can start the load balancer and the web servers using Docker Compose. If the test containers from the earlier step are still running, remove them first, since Compose reuses the same container names:
docker rm -f container1 container2
Then bring the stack up:
docker-compose up -d
This command will pull the nginx:alpine image if it isn't already cached, reuse the two web images we built earlier, and start all three containers in detached mode.
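Nginx may take a moment to come up after `docker-compose up -d`. The polling helper below is a sketch (it assumes curl is installed) that waits until the load balancer answers:

```shell
#!/bin/sh
# Poll a URL until it returns success, up to a number of attempts,
# sleeping one second between tries.
wait_for_http() {
  url="$1"
  tries="${2:-10}"
  i=1
  while [ "$i" -le "$tries" ]; do
    if curl -sf -o /dev/null "$url"; then
      echo "up after ${i} attempt(s)"
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "gave up after ${tries} attempts" >&2
  return 1
}
```

For example, `wait_for_http http://localhost/ 15` returns as soon as Nginx responds.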
Testing the Load Balancer
Once everything is up and running, open your web browser and navigate to http://localhost. You should see either:
"Welcome from Container 1"
"Welcome from Container 2"
Refresh the page a few times, and you'll notice the message alternates between the two containers: Nginx distributes requests round-robin by default. (If your browser caches the response, a hard refresh or a command-line client such as curl will show the alternation more reliably.)
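A scripted check is more deterministic than refreshing a browser. The helper below is a sketch (it assumes curl and the running stack from this guide) that tallies which backend answered each request:

```shell
#!/bin/sh
# Send N requests to a URL and count how many times each distinct
# response body appeared.
tally_backends() {
  url="$1"
  n="${2:-10}"
  i=1
  while [ "$i" -le "$n" ]; do
    curl -s "$url"
    i=$((i + 1))
  done | sort | uniq -c
}
```

Running `tally_backends http://localhost/ 10` against the round-robin setup should show roughly five hits for each container's welcome message.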
Conclusion
In this tutorial, we demonstrated how to set up a simple load balancing solution using Nginx and Docker. This setup not only improves the reliability of your web application but also allows you to scale easily as traffic increases. You can extend this setup by adding more containers or integrating additional features such as SSL termination or monitoring.
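As one hypothetical sketch of the "adding more containers" idea: instead of two fixed services, you could define a single web service (dropping container_name, which must be unique), point the upstream at the service name, and let Compose run several replicas:

```yaml
# Hypothetical variant of docker-compose.yml (a sketch, not part of the guide):
services:
  web:
    image: web-container1     # one shared image instead of two
    expose:
      - "80"
# nginx.conf would then use:  server web:80;
# and you could run:          docker-compose up -d --scale web=3
```

Docker's embedded DNS resolves the service name to all replicas; note that open-source Nginx resolves upstream addresses when it starts, so restart the load balancer after scaling.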