Developing Flask App Locally with Docker: Part I

Arash Soheili
Published January 08, 2017

When developing a new app, one of the trickiest parts is setting up your local development environment. It's usually difficult to mimic your production server locally, and because of that you run into issues when deploying. That's why it's ideal to start your local development with an environment as close to production as possible. This also gives you the advantage of figuring out how to set up your production environment. But how exactly do you do that, especially if production is going to be a distributed system? This is where Docker and Docker Compose can help.

At this point most people have heard of containers and Docker, which popularized them. Docker uses containers to package your application into a standard unit. If you'd like to learn more, visit the official site. We're going to use Docker and Docker Compose to build a Flask server locally. The advantage is that through this process we'll also figure out most of our production setup, so you can feel confident developing locally and deploying to production.

First, install Docker and Docker Compose on your system. We're going to set up a Flask app with Apache and mod_wsgi on Ubuntu 16.04. Docker will allow us to build our containers, and Docker Compose will help us coordinate them. I will use this blog's setup as an example; the code can be found on my GitHub page.

The first thing we'll need is a Dockerfile to set up our web server container. A Dockerfile is essentially a set of instructions for building the container, equivalent to what you would do on a production server. One thing to note is that inside Docker containers you're always running as root, so you won't need sudo. It's still good practice to set up a user on the server, though. Because we're using Ubuntu, we're going to create an ubuntu user to match what we'll have in production.

Although you can run all your commands in the Dockerfile, I prefer to use bash scripts. They're more readable and more flexible. I also think it's good practice to separate your server setup from your application code; that way you have a script to set up your production server as well. Let's get to it. Below is the Dockerfile I'm using for this blog.

FROM ubuntu:16.04
ADD . /home/ubuntu/python-blog
WORKDIR /home/ubuntu/python-blog
RUN bash scripts/install_docker_ubuntu.sh
CMD bash scripts/docker_start.sh

Let's analyze this line by line.

FROM ubuntu:16.04

The FROM instruction means we're going to start from an image hosted on Docker Hub. Ubuntu has official images, and we're starting with 16.04. Here ubuntu refers to the image and :16.04 is the tag. If you want the latest image you can use the latest tag. If the image is not available on your local machine, Docker will download it automatically; afterwards it will use the local copy.

ADD . /home/ubuntu/python-blog

The ADD instruction lets us copy files from our host machine into the container. In this case we're copying the current directory, using ., to /home/ubuntu/python-blog. This gives us all of our repo files inside the container.

WORKDIR /home/ubuntu/python-blog

By setting our working directory, we'll be able to use relative paths in the instructions that follow.
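WORKDIR behaves like a persistent cd: every later instruction resolves relative paths against it. The same idea can be sketched locally in plain shell (the demo directory name is made up, not part of the repo):

```shell
# WORKDIR acts like a persistent cd; relative paths resolve against it.
# Hypothetical demo directory -- not part of the repo.
mkdir -p /tmp/python-blog-demo/scripts
cd /tmp/python-blog-demo              # analogous to WORKDIR /home/ubuntu/python-blog
echo 'echo hello from the script' > scripts/demo.sh
bash scripts/demo.sh                  # relative path works because of the cd
```

This is exactly why the RUN and CMD lines below can say scripts/... instead of spelling out the full /home/ubuntu/python-blog path.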

RUN bash scripts/install_docker_ubuntu.sh

The RUN instruction allows us to execute commands inside the container at build time. In this case we're executing the install_docker_ubuntu.sh script from our repository with bash. This script sets up the server with all the needed dependencies.
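The script's contents aren't shown here, but a minimal sketch of what an install script for this setup might do looks like the following. The package names are assumptions for Ubuntu 16.04 with Apache and mod_wsgi; the real install_docker_ubuntu.sh is in the repo.

```bash
#!/bin/bash
# Hypothetical sketch of an install script -- see the repo for the real one.
set -e

apt-get update
# Apache, mod_wsgi, and pip for Python (package names are assumptions)
apt-get install -y apache2 libapache2-mod-wsgi python-pip

# Create the ubuntu user to match production (containers run as root by default)
useradd --create-home ubuntu

# Install the application's Python dependencies
pip install -r requirements.txt
```

Because this runs at build time, everything it installs is baked into the image, so the container starts with its dependencies already in place.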

CMD bash scripts/docker_start.sh

The CMD instruction is special: it tells Docker what to run when the container starts. There can be only one CMD in a Dockerfile; if there is more than one, only the last will be used. In this instance we're using CMD to run docker_start.sh, the script that launches our application.
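Again, the script itself lives in the repo, but a start script for an Apache/mod_wsgi setup typically just runs Apache in the foreground so the container stays alive. A hedged sketch:

```bash
#!/bin/bash
# Hypothetical sketch of a start script -- the real docker_start.sh is in the repo.
# Apache must run in the foreground, otherwise the container exits immediately.
set -e

source /etc/apache2/envvars
exec apache2 -D FOREGROUND
```

The exec matters: it makes Apache the container's main process, so stopping the container stops Apache cleanly.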

These are the basics of a Dockerfile; you can find more info and a full reference on the official page. In the following posts we'll cover the script files and how to put it all together using Docker Compose. In the meantime you can check out the scripts here. Check back soon for Part II of this post.
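As a preview of where Part II is headed, a minimal docker-compose.yml for a single web container might look like this. The service name, port mapping, and volume are assumptions for illustration, not taken from the repo:

```yaml
version: '2'
services:
  web:
    build: .            # uses the Dockerfile above
    ports:
      - "80:80"         # expose Apache on the host
    volumes:
      - .:/home/ubuntu/python-blog   # live-edit code during local development
```

With a file like this in place, docker-compose up --build builds the image and starts the container in one step.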