Setting Up a Single-Node Dockerized ELK Cluster on Your Local Machine

Zeeshan Malik
3 min read · Jul 2, 2020

Steps of Installation

Elasticsearch Installation

  1. Install Elasticsearch with Docker

Elasticsearch is available as Docker images. The images use centos:7 as the base image and are free to use under the Elastic license.

2. Pulling the Image

Obtaining Elasticsearch for Docker is as simple as issuing a docker pull command against the Elastic Docker registry.

docker pull docker.elastic.co/elasticsearch/elasticsearch:7.7.1

Alternatively, you can download other Docker images that contain only features available under the Apache 2.0 license.
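Once the pull finishes, you can confirm that the image is available locally:

docker images docker.elastic.co/elasticsearch/elasticsearch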

3. Create a local elasticsearch.yml file

Write the following two lines in your elasticsearch.yml file:

cluster.name: "docker-cluster"
network.host: 0.0.0.0
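If you like, you can create the file straight from the shell (a quick sketch; any text editor works just as well):

cat > elasticsearch.yml <<'EOF'
cluster.name: "docker-cluster"
network.host: 0.0.0.0
EOF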

4. Starting a single-node Elasticsearch service with Docker

docker run -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" \
  -v $PWD/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml \
  docker.elastic.co/elasticsearch/elasticsearch:7.7.1
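If you would rather not keep a terminal attached, the same container can also be started in the background. The -d flag and the container name elasticsearch are my own additions here and are optional:

docker run -d --name elasticsearch -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" \
  -v $PWD/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml \
  docker.elastic.co/elasticsearch/elasticsearch:7.7.1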

5. To test the successful installation of Elasticsearch

curl http://localhost:9200

You should see a response similar to the one below.
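The exact values, such as the node name and cluster_uuid, will differ on your machine, but the JSON is shaped roughly like this:

{
  "name" : "...",
  "cluster_name" : "docker-cluster",
  "cluster_uuid" : "...",
  "version" : {
    "number" : "7.7.1",
    ...
  },
  "tagline" : "You Know, for Search"
}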

HURRAY, ELASTICSEARCH IS INSTALLED. NOW MOVING TO THE SECOND MAJOR STEP.

Kibana Installation

  1. Install Kibana with Docker

Docker images for Kibana are available from the Elastic Docker registry. The base image is centos:7. A list of published Docker images and tags is available at www.docker.elastic.co. The source code is on GitHub.

2. Pull the Docker image For Kibana

docker pull docker.elastic.co/kibana/kibana:7.7.1

3. Create a local kibana.yml file

Write the following lines in your kibana.yml file:

server.name: kibana
server.host: "0"
elasticsearch.hosts: ["http://docker-container-id:9200"]
monitoring.ui.container.elasticsearch.enabled: true

4. Running a Docker Container for Kibana From an Image

docker run --link docker-container-id \
  -v $PWD/kibana.yml:/usr/share/kibana/config/kibana.yml \
  -p 5601:5601 docker.elastic.co/kibana/kibana:7.7.1

Note: Replace docker-container-id with the ID of your running Elasticsearch Docker container, which in my case is cd491e8552df.
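If you do not have the ID handy, docker ps lists your running containers; the ID shown below is from my run and yours will differ:

docker ps --format "{{.ID}}  {{.Image}}"
cd491e8552df  docker.elastic.co/elasticsearch/elasticsearch:7.7.1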

Now, to test the successful installation of Kibana linked with the Elasticsearch service:

5. Open your browser and type

http://localhost:5601

You should see the Kibana home page.
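If you prefer the command line, Kibana also exposes a status endpoint you can query (the shape of the response varies between versions):

curl http://localhost:5601/api/status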

HURRAY, KIBANA IS SUCCESSFULLY INSTALLED AND LINKED WITH THE ELASTICSEARCH SERVICE ON YOUR LOCAL MACHINE. NOW MOVING TO THE THIRD MAJOR STEP.

Logstash Installation

Docker images for Logstash are available from the Elastic Docker registry. The base image is centos:7. A list of all published Docker images and tags is available at www.docker.elastic.co. The source code is on GitHub.

  1. Pulling the Image

Obtaining Logstash for Docker is as simple as issuing a docker pull command against the Elastic Docker Registry.

docker pull docker.elastic.co/logstash/logstash:7.7.1

2. Create a local logstash.yml file.

http.host: "0.0.0.0"
xpack.monitoring.elasticsearch.hosts: ["http://docker-container-id:9200"]

3. Create a local logstash.conf file and add the following content to it

input {
  file {
    type => "json"
    path => "/pipeline/*"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["docker-container-id:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
  stdout { codec => json }
}
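For a quick test later on, a file such as pipeline/sample.json (my own example name and data) with one JSON object per line is the kind of input this configuration expects: the file input reads each line into the message field and the json filter parses it.

{"user": "alice", "action": "login", "timestamp": "2020-07-02T10:00:00Z"}
{"user": "bob", "action": "logout", "timestamp": "2020-07-02T10:05:00Z"}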

4. Running the Docker Container from the Logstash Image

docker run -it --link docker-container-id \
  -v $PWD/logstash.yml:/usr/share/logstash/config/logstash.yml \
  -v $PWD/pipeline/:/pipeline/ \
  -v $PWD/logstash.conf:/usr/share/logstash/pipeline/logstash.conf \
  docker.elastic.co/logstash/logstash:7.7.1

Note: Replace docker-container-id with the ID of your running Elasticsearch Docker container, which in my case is cd491e8552df.

5. Test your Logstash Installation

Once you see that Logstash is running successfully, copy the data you want to index into your pipeline folder and indexing into Elasticsearch will start immediately.
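To confirm that documents are actually arriving, you can list the indices in Elasticsearch and look for one named logstash-<date>:

curl "http://localhost:9200/_cat/indices?v"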
