In this article, we'll show you how to make a Docker image of a Laravel application, push it to the registry, and use Docker in Continuous Delivery.

Laravel is a fast, user-friendly, and very well-documented framework that you can use to build any type of web application. As its official introduction puts it,

Laravel is a web application framework with expressive, elegant syntax. We believe development must be an enjoyable, creative experience to be truly fulfilling. Laravel attempts to take the pain out of development by easing common tasks used in the majority of web projects, such as authentication, routing, sessions, and caching.


Docker, on the other hand, is a virtualization method that eliminates "works on my machine" problems when collaborating on code with other users. It takes a different approach to virtualization than traditional virtual machines, which makes it:

  • faster and lighter on resources than regular VMs
  • easier to set up and modify
  • easy to reuse: you can pick an existing Docker image and install any missing libraries and packages (similar to class inheritance)
  • shareable: you can push images to Docker registries and pull them on other machines, just like you push and pull code from Git

Give your app a boost with Docker

How to dockerize a Laravel application?

For the purpose of this guide, we'll use a simple calculator written with the Laravel framework. Just fork from our GitHub profile and run it with php artisan serve.

We have covered the process of writing and running the calculator in this guide.

We'll run the calculator in a Docker container. For that, we need a Dockerfile describing the Docker image from which the container will be launched. Sounds complicated? Fear not, we'll walk you through the process step by step.

Install Docker

Begin by installing Docker for your operating system:

Write a Dockerfile

The Dockerfile is a text file that describes the contents of our Docker image and how to launch it. Add a new file to the repository and name it Dockerfile. When done, copy the following content and paste it into the file:

FROM php:7
RUN apt-get update -y && apt-get install -y openssl zip unzip git
RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
RUN docker-php-ext-install pdo mbstring
WORKDIR /app
COPY . /app
RUN composer install
CMD php artisan serve --host=0.0.0.0 --port=8181
EXPOSE 8181

Dockerfile details

Line 1:

FROM php:7

Just like we mentioned in the introduction, Docker images can be built on top of other images. In this case, we build on the official image of PHP in version 7.

Docker images are shared through image registries, the most popular of which is Docker Hub. Using official Docker images guarantees that the base is well written and maintained. You can browse the list of official Docker images on Docker Hub.

Lines 2-4:

RUN apt-get update -y && apt-get install -y openssl zip unzip git
RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
RUN docker-php-ext-install pdo mbstring

These lines install the packages missing from the base image: OpenSSL, Zip, Unzip, and Git, which are required to install Laravel's dependencies. Next, we install Composer and enable the PHP extensions PDO and mbstring.
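If your application talks to a database, you may need additional extensions on top of these. For example, a MySQL-backed app would also need the MySQL PDO driver; a sketch of the extra Dockerfile line (the calculator itself doesn't need it):

```
# install the MySQL PDO driver in addition to the extensions above
RUN docker-php-ext-install pdo_mysql
```

docker-php-ext-install is a helper script shipped with the official PHP images, so it's only available when building FROM php.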

Lines 5-7:

WORKDIR /app
COPY . /app
RUN composer install

These lines set the working directory in the container to /app, copy the application files into it, and run composer install to fetch Laravel's dependencies.

Line 8:

CMD php artisan serve --host=0.0.0.0 --port=8181

Use CMD to set the command that runs when the container is launched. In our case, we serve the application with php artisan on port 8181.
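As a side note, Docker also accepts the exec form of CMD, which runs the process without a wrapping shell so it receives stop signals directly. An equivalent sketch:

```
# exec form: the PHP process becomes PID 1 and receives SIGTERM on docker stop
CMD ["php", "artisan", "serve", "--host=0.0.0.0", "--port=8181"]
```

Either form works for this guide; the exec form just makes `docker stop` shut the app down more gracefully.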

Line 9:
EXPOSE 8181


Expose port 8181 outside the launched container. This is the same port our application serves on.

Build the Docker image

With the Dockerfile in place, we're ready to create the Docker image. Fire up the terminal, go to the directory with the Dockerfile and run:

$ docker build -t my-first-image .

The -t parameter sets the tag of the Docker image. The dot at the end of the command sets the build context to the current directory. In other words, the files that will be copied to the /app directory in the container come from the directory where you run the build (in this case: the repository directory).
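The whole build context is sent to the Docker daemon before the build starts, so it's worth excluding files the image doesn't need. A minimal .dockerignore sketch (the entries are typical for a Laravel project; adjust them to yours):

```
# .dockerignore – keep the build context small
.git
vendor
node_modules
storage/logs
```

Excluding vendor is safe here because composer install recreates it inside the image.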

Run the Docker image

Docker containers are launched with docker run. Don't forget the -p parameter, which maps the container port to a port on your machine:

$ docker run -p 8181:8181 my-first-image

The application is now available in your browser at http://localhost:8181.

Sharing images through Docker registry

Once created, Docker images can be pushed to a Docker registry and shared with other users. You can host a private registry on your own server, or use a cloud-hosted solution such as Docker Hub.

In this part of the guide we'll show you how to add an image to Docker Hub so you can use it on another computer:

  1. Create a profile at Docker Hub
  2. Build the Docker image and tag it with your Docker Hub username

     $ docker build -t [USERNAME]/my-first-image .
  3. Log in to Docker Hub with your username and password:

     $ docker login
  4. Push the image to the registry:

     $ docker push [USERNAME]/my-first-image

    Logging in to Docker Hub

That's it. From now on you can pull and run the image on any computer with Docker installed:

$ docker run -p 8181:8181 [USERNAME]/my-first-image

Docker image on Docker Hub

Of course, the image needs to be downloaded first, so if you're running a business from a secret underwater cave in Antarctica with a slow internet connection, you may consider setting up a private registry for better performance.

Docker in action: Use cases

So far we've covered the technical process of building and sharing Docker images. Here are some concrete examples of how Docker is used in practice.

Docker in development

Every developer knows that before you can start working on an application you first need to prepare your work environment. This involves a series of pains:

  • install PHP in version X
  • install database in version Y
  • install library in version Z

The more advanced the project, the more dependencies and documentation it involves, which considerably prolongs the process of delivering the application. Not to mention the dependency hell that awaits when you're trying to develop two applications with different requirements on one computer.

With Docker, you don't need to write documentation because the whole work environment is defined in the Dockerfile and safely stored in Git. This means you can go back to any revision of the file in the repository to see how your project evolved. Next, Docker is all that you need to install on your computer. Everything else – frameworks, libraries, dependencies – is stored in the Docker image. What you do is run git clone, docker build, and docker run and the app is already running on your computer. What's more, you can create a Dockerfile that will serve the application directly from the directory where you code, so that you won't have to rebuild the Docker image on every change (we shall cover that in a separate guide).

Docker in QA

"Works on my machine" – one of the most popular phrases in discussions between testers and developers.

Every developer knows that phrase

The thing is, 99% of such problems result from compatibility issues between the different environments in which the application is run. One developer codes the application on Linux, another on Mac, and QA runs the tests on Windows. With Docker, the environment is exactly the same across the whole team: the app is handed over to QA as a Docker image pre-configured in the Dockerfile.

Docker for dev/stage/production

Usually, the development process is divided into three consecutive environments:

  • Development – where the actual coding takes place
  • Staging – where the application is reviewed and tested
  • Production – where the final version of the application is hosted

Each of these environments is assigned a separate branch in the repository. When the work in the development branch is done, it is merged into the staging branch, where QA runs its tests. Finally, the staging branch is merged into the master (production) branch and the application is ready to be reviewed by the client.

With Docker, each environment is described by a separate Dockerfile on its own branch. If you change something in the DEV branch, the STAGE branch remains in its original state until the changes have been merged. This keeps the environments consistent and prevents changes from leaking between them.
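For example, a development variant of the image might add debugging tools on top of the production one. A hedged sketch, assuming the production image was pushed as [USERNAME]/my-first-image and that you want Xdebug in development (neither is part of the calculator project; exact Xdebug versions depend on your PHP version):

```
# Dockerfile.dev – hypothetical development variant built on the image above
FROM [USERNAME]/my-first-image
# extra: install and enable Xdebug for step debugging (dev only)
RUN pecl install xdebug && docker-php-ext-enable xdebug
```

The CMD and EXPOSE instructions are inherited from the base image, so the dev Dockerfile only has to declare what differs.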

Continuous Delivery with Buddy

Buddy is a tool that lets you automate building, pushing, and running Docker images on a single push to the repository. You can create a separate pipeline for each environment.

Now, we'll add a pipeline that will build and push the image, and then run it on the selected server.

Continuous Delivery pipeline for Docker

The configuration process is pretty simple:

  1. Create a new pipeline and set the trigger mode to On every push: Pipeline configuration

  2. Add the Docker image build action: Docker action roster

  3. Add a Run Docker image action and select Use docker image built in previous action from the dropdown. This action lets you test the newly built image: Run Image action details

  4. Once you’ve built and tested your image, you can push it to the registry: Push Image action details

  5. Add an SSH action and enter the script that will run the image on the selected server: SSH action details

Now make a push and Buddy will perform all of the above steps one by one. Additionally, each pipeline can be freely expanded with extra actions. For example, you can run unit tests before building the image, and Selenium tests once the app is already running on the server. Only then can you be sure your application is ready to be delivered to production.