Run Elasticsearch On Docker

When I started to learn about Elasticsearch I had to find an environment to play with. After a quick search on the net, I found that the simplest solution is to run Elasticsearch in a Docker container.

In this article, we will walk through a simple scenario of running Elasticsearch on Docker. Before we dive in, I would like to note that this is not an in-depth article on either Elasticsearch or Docker. Both products are well documented, and you can find many comprehensive courses on the net for each of them. This article aims to give you a quick start on working with these two products together.

First things first

First, you need to have a Docker Engine.
According to the official Docker documentation:

Docker Engine is a client-server application with these major components:

  • A server which is a type of long-running program called a daemon process (the dockerd command).
  • A REST API which specifies interfaces that programs can use to talk to the daemon and instruct it what to do.
  • A command line interface (CLI) client (the docker command).

The CLI uses the Docker REST API to control or interact with the Docker daemon through scripting or direct CLI commands. Many other Docker applications use the underlying API and CLI.

The daemon creates and manages Docker objects, such as images, containers, networks, and volumes. 

Image: Docker Engine components (source: https://docs.docker.com/engine/docker-overview/)
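If Docker is already installed on your machine, you can see both of these components with a single command:

docker version

The output has a Client section (the CLI) and a Server section (the daemon), each reporting its own version.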

In order to run a Docker Engine you have two major options:

  • Install and run Docker on your local machine.
  • Run Docker on a cloud provider.

Each of these options has its pros and cons. In this article, I choose to concentrate on running Docker on your local machine. I find this option simpler and more intuitive for development purposes: you have full control over your environment and you don’t need internet access. If you choose to run your Docker Engine in the cloud, then aside from the Docker installation process, the other steps are the same.

Image: Docker architecture (source: https://docs.docker.com/engine/docker-overview/)

Interacting with your Docker

It doesn’t matter which installation option you choose; you will interact with Docker through the Docker client (CLI).

The Docker client (docker) is the primary way that many Docker users interact with Docker. When you use commands such as docker run, the client sends these commands to dockerd, which carries them out. The docker command uses the Docker API. The Docker client can communicate with more than one daemon. – Docker official documentation

If you are new to Docker, I encourage you to read this overview; it will give you a basic understanding of the core Docker concepts.
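A simple way to see the client and the daemon working together is the classic hello-world image:

docker run hello-world

The CLI sends the run request to the daemon, the daemon pulls the hello-world image from Docker Hub (if it is not already local), creates a container from it and runs it; the container prints a short greeting and exits.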

Create your Elasticsearch container

Latest instructions – check the official Elastic documentation for the latest Elasticsearch version; the examples in this article use 7.0.0.

In order to create your container, you first need to obtain the Elasticsearch image.
Run this command in the CLI:

docker pull docker.elastic.co/elasticsearch/elasticsearch:7.0.0

This command goes to the registry and pulls the image.

A Docker registry stores Docker images. Docker Hub is a public registry that anyone can use, and Docker is configured to look for images on Docker Hub by default. You can even run your own private registry.
An image is a read-only template with instructions for creating a Docker container. Often, an image is based on another image, with some additional customization. For example, you may build an image which is based on the ubuntu image, but installs the Apache web server and your application, as well as the configuration details needed to make your application run. – Docker official documentation
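Once the pull finishes, you can confirm that the image is available locally:

docker images docker.elastic.co/elasticsearch/elasticsearch

This should list the image with the 7.0.0 tag and its size.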

Create a network

docker network create somenetwork
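You can verify that the network was created:

docker network ls

docker network inspect somenetwork

The inspect output shows the network driver (bridge by default) and, once containers are attached, which containers are connected to it.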

Create and run your container:

A container is a runnable instance of an image. You can create, start, stop, move, or delete a container using the Docker API or CLI. You can connect a container to one or more networks, attach storage to it, or even create a new image based on its current state. – Docker official documentation

docker run -d --name myelasticsearch --net somenetwork -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" docker.elastic.co/elasticsearch/elasticsearch:7.0.0
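It is worth unpacking the flags in this command:

  • -d – run the container in the background (detached mode).
  • --name myelasticsearch – give the container a name you can reuse in later commands.
  • --net somenetwork – attach the container to the network created above.
  • -p 9200:9200 -p 9300:9300 – publish the HTTP port (9200) and the transport port (9300) to the host.
  • -e "discovery.type=single-node" – tell Elasticsearch to form a single-node cluster instead of looking for other nodes.

Once the container is up, you can follow its startup output with:

docker logs -f myelasticsearch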

Run on a different port:

docker run -d --name myelasticsearch --net somenetwork -p 9500:9200 -p 9600:9300 -e "discovery.type=single-node" docker.elastic.co/elasticsearch/elasticsearch:7.0.0
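Note that only the host side of the mapping changes: inside the container Elasticsearch still listens on 9200 and 9300, but from your machine you would now reach it on port 9500, for example:

curl http://localhost:9500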

Get a list of all running containers

docker ps

Get a list of all containers (running/stopped)

docker ps -a

Stop container – stop a running container

docker stop myelasticsearch 

Start container – if you already created the container but it is stopped, restart it with all its previous changes

docker start myelasticsearch

Access your Elasticsearch container

Once your Elasticsearch container is up and running, you can access it through the port that you declared in the “docker run” command:

http://localhost:9200
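If curl is available on your machine, you can also hit this endpoint from the command line:

curl http://localhost:9200

Elasticsearch should answer with a small JSON document containing the node name, the cluster name, the version (7.0.0 in this case) and the “You Know, for Search” tagline.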

Useful Docker commands

You can find the documentation for all commands here

docker images – list all images on your engine

docker pull “image name” – pull an image or a repository from a registry

docker run “image name” – create a new container and immediately run it

docker ps -a – list all containers

docker rm “container id” – delete a container

docker rmi “image id” – delete an image

docker network create somenetwork – create a new network

docker network ls – list all networks

docker network inspect “network id” – inspect a specific network

docker start “container id” – restart an existing container
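When you are done experimenting, a typical cleanup sequence (using the names from this article) looks like this:

docker stop myelasticsearch

docker rm myelasticsearch

docker network rm somenetwork

docker rmi docker.elastic.co/elasticsearch/elasticsearch:7.0.0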

Next Steps

This was a quick start for running Elasticsearch on Docker. From here you can enjoy playing with your local Elasticsearch.
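For example, assuming the container from this article is running on port 9200, you can index a document and search for it straight from the command line (the index name myindex and the document below are just arbitrary examples):

curl -X PUT "http://localhost:9200/myindex/_doc/1" -H "Content-Type: application/json" -d '{"title": "hello elasticsearch"}'

curl "http://localhost:9200/myindex/_search?q=title:hello"

The first call creates an index called myindex and stores a single document in it; the second runs a simple query and should return that document in the hits section of the response.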


2 thoughts on “Run Elasticsearch On Docker”

  1. Hey Royi Benita, I was reading one of your articles on medium. This one to be precise: https://betterprogramming.pub/implementing-a-generic-repository-pattern-using-nestjs-fb4db1b61cce
    I wanted to ask, why did you have to use both entities and schemas in your implementation of the repository pattern?
    Was it a necessity or a preference? I am trying to implement the repository pattern in one of my projects and so far that part has been really confusing for me because I already do have schemas but now I am thinking if I also need entities. I would appreciate hearing back from you as soon as possible

    Like

    1. Hey Victor,
      The reason to create both entity and schema is to follow the clean architecture paradigm and separate the business (entities) from the schema(framework). It is your decision if you need them both. Creating only schema will couple your project with mongoose for instance and make the replacement of it harder. This is an architecture tradeoff decision. Hope it answers your question

      Like
