Docker Compose

This article explains how to collect Docker logs and propagate them to an EFK (Elasticsearch + Fluentd + Kibana) stack. The example uses Docker Compose to set up multiple containers.


Elasticsearch is an open-source search engine known for its ease of use. Kibana is an open-source web UI that makes Elasticsearch user-friendly for marketers, engineers and data scientists alike.

NOTE: Since v7.11, these products have been distributed under a non-open-source license (dual-licensed under the Server Side Public License and the Elastic License).

By combining these three tools into EFK (Elasticsearch + Fluentd + Kibana), we get a scalable, flexible and easy-to-use log collection and analytics pipeline. In this article, we will set up four (4) containers:

- Apache HTTP Server (httpd)
- Fluentd
- Elasticsearch
- Kibana

All the logs of httpd will be ingested into Elasticsearch + Kibana, via Fluentd.

Prerequisites: Docker

Please download and install Docker / Docker Compose. Well, that's it :)

Step 0: Create docker-compose.yml

Create docker-compose.yml for Docker Compose. Docker Compose is a tool for defining and running multi-container Docker applications.

With the YAML file below, you can create and start all the services (in this case, Apache, Fluentd, Elasticsearch, Kibana) by one command:
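A minimal sketch of such a docker-compose.yml follows. The image tags, port numbers and service names here are assumptions for illustration; pin the Elasticsearch/Kibana versions that match your environment:

```yaml
version: "3"
services:
  web:
    image: httpd
    ports:
      - "80:80"
    links:
      - fluentd
    logging:
      driver: "fluentd"
      options:
        fluentd-address: localhost:24224
        tag: httpd.access

  fluentd:
    build: ./fluentd
    volumes:
      - ./fluentd/conf:/fluentd/etc
    links:
      - "elasticsearch"
    ports:
      - "24224:24224"
      - "24224:24224/udp"

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.0
    environment:
      - "discovery.type=single-node"
    expose:
      - "9200"
    ports:
      - "9200:9200"

  kibana:
    image: docker.elastic.co/kibana/kibana:7.17.0
    links:
      - "elasticsearch"
    ports:
      - "5601:5601"
```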

The logging section (see the Docker Compose documentation) of the web container specifies the Docker Fluentd logging driver as its default container logging driver. All the logs from the web container will automatically be forwarded to the host:port specified by fluentd-address.

Step 1: Create Fluentd Image with your Config + Plugin

Create fluentd/Dockerfile with the following content, using the Fluentd official Docker image; and then, install the Elasticsearch plugin:
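A sketch of such a Dockerfile is shown below. The base-image tag is an assumption; check the official image for the current tag, and consider pinning an explicit fluent-plugin-elasticsearch version:

```dockerfile
# fluentd/Dockerfile
FROM fluent/fluentd:v1.16-debian-1
USER root
RUN ["gem", "install", "fluent-plugin-elasticsearch", "--no-document"]
USER fluent
```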

Then, create the Fluentd configuration file fluentd/conf/fluent.conf. The forward input plugin receives logs from the Docker logging driver, and the elasticsearch output plugin forwards these logs to Elasticsearch.
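A minimal sketch of such a configuration, assuming the Elasticsearch container is reachable under the hostname elasticsearch (the index prefix and flush interval are illustrative):

```
# fluentd/conf/fluent.conf
<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>

<match *.**>
  @type copy
  <store>
    @type elasticsearch
    host elasticsearch
    port 9200
    logstash_format true
    logstash_prefix fluentd
    logstash_dateformat %Y%m%d
    include_tag_key true
    tag_key @log_name
    flush_interval 1s
  </store>
  <store>
    @type stdout
  </store>
</match>
```

The copy output duplicates each event to stdout as well, which is handy for debugging the pipeline with docker logs.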

NOTE: For details on the parameters used with @type elasticsearch, see the Elasticsearch parameters section and, furthermore, fluent-plugin-elasticsearch.

Step 2: Start the Containers

Let's start the containers:
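For example, from the directory containing docker-compose.yml (older installations use the standalone docker-compose binary instead of the docker compose subcommand):

```shell
$ docker compose up --detach
```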

Use the docker ps command to verify that the four (4) containers are up and running:
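For example:

```shell
$ docker ps
```

You should see one entry each for the httpd, Fluentd, Elasticsearch and Kibana containers, all with an "Up" status.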

Step 3: Generate httpd Access Logs

Use the curl command to generate some access logs like this:
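For example, repeat a request a few times against the port mapped to the httpd container (port 80 here is an assumption matching a typical setup):

```shell
$ curl http://localhost:80/
$ curl http://localhost:80/
$ curl http://localhost:80/
```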

Step 4: Confirm Logs from Kibana

Browse to http://localhost:5601/app/discover#/ and create a data view.

Specify fluentd-* as the Index pattern and click Save data view to Kibana.

Then, go to the Discover tab to check the logs. As you can see, the logs are properly collected into Elasticsearch via Fluentd and can be browsed in Kibana.


Learn More

If this article is incorrect or outdated, or omits critical information, please let us know. Fluentd is an open-source project under the Cloud Native Computing Foundation (CNCF). All components are available under the Apache 2 License.
