Docker Logging with EFK (Elasticsearch + Fluentd + Kibana) and Docker Compose

This article explains how to collect Docker logs into an EFK (Elasticsearch + Fluentd + Kibana) stack. The example uses Docker Compose to set up multiple containers.

Elasticsearch is an open source search engine known for its ease of use. Kibana is an open source Web UI that makes Elasticsearch user-friendly for marketers, engineers, and data scientists alike.

By combining these three tools into EFK (Elasticsearch + Fluentd + Kibana), we get a scalable, flexible, and easy-to-use log collection and analytics pipeline. In this article, we will set up four containers: Apache HTTP Server (httpd), Fluentd, Elasticsearch, and Kibana.

All of httpd's logs will be ingested into Elasticsearch via Fluentd and visualized in Kibana.

Prerequisites: Docker

Please download and install Docker / Docker Compose. Well, that's it :)

Step 0: Prepare docker-compose.yml

First, please prepare docker-compose.yml for Docker Compose. Docker Compose is a tool for defining and running multi-container Docker applications.

With the YAML file below, you can create and start all the services (in this case, Apache, Fluentd, Elasticsearch, Kibana) by one command.
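Here is a sketch of such a docker-compose.yml. The image tags, port numbers, and the fluentd build directory are assumptions for illustration; adjust them to your environment:

```yaml
version: "3"
services:
  web:
    image: httpd                      # Apache HTTP Server
    ports:
      - "80:80"
    links:
      - fluentd
    logging:
      driver: "fluentd"               # send container logs to Fluentd
      options:
        fluentd-address: localhost:24224
        tag: httpd.access

  fluentd:
    build: ./fluentd                  # built from the Dockerfile in Step 1
    volumes:
      - ./fluentd/conf:/fluentd/etc   # mount the config from Step 1
    links:
      - "elasticsearch"
    ports:
      - "24224:24224"
      - "24224:24224/udp"

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.0
    environment:
      - "discovery.type=single-node"  # single-node dev setup, not for production
    ports:
      - "9200:9200"

  kibana:
    image: docker.elastic.co/kibana/kibana:7.17.0
    links:
      - "elasticsearch"
    ports:
      - "5601:5601"
```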

The logging section (see the Docker Compose documentation) of the web container specifies the Docker Fluentd logging driver as its default container logging driver. All logs from the web container will be automatically forwarded to the host:port specified by fluentd-address.

Step 1: Prepare Fluentd image with your Config + Plugin

Then, please prepare fluentd/Dockerfile with the following content, to use Fluentd's official Docker image and additionally install the Elasticsearch plugin.
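A minimal Dockerfile along these lines should work; the image tag is an assumption, so pin whichever Fluentd version you actually use:

```dockerfile
# fluentd/Dockerfile
FROM fluent/fluentd:v1.16-debian-1

# The official image runs as the "fluent" user; switch to root to install gems
USER root
RUN gem install fluent-plugin-elasticsearch --no-document
USER fluent
```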

Then, please prepare Fluentd's configuration file fluentd/conf/fluent.conf. The in_forward plugin is used to receive logs from the Docker logging driver, and out_elasticsearch forwards the logs to Elasticsearch.
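A sketch of such a configuration is below. The host name elasticsearch resolves to the Elasticsearch container via Compose networking, and logstash_prefix fluentd produces the fluentd-* indices used later in Kibana:

```
# fluentd/conf/fluent.conf
<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>

<match *.**>
  @type copy

  <store>
    @type elasticsearch
    host elasticsearch
    port 9200
    logstash_format true
    logstash_prefix fluentd
    logstash_dateformat %Y%m%d
    include_tag_key true
    tag_key @log_name
    flush_interval 1s
  </store>

  <store>
    @type stdout   # also print events to the Fluentd container's stdout for debugging
  </store>
</match>
```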

Step 2: Start Containers

Let's start all of the containers, with just one command.
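From the directory containing docker-compose.yml (depending on your Docker version, the command may be docker-compose instead of docker compose):

```shell
$ docker compose up --detach
```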

You can check that all four containers are running with the docker ps command.
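For example:

```shell
$ docker ps
# Expect four entries: the httpd, fluentd, elasticsearch, and kibana containers
```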

Step 3: Generate httpd Access Logs

Let's access httpd to generate some access logs. The curl command is always your friend.
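For example, the following sends ten requests to the web container (assuming it is published on port 80 as in the Compose file above):

```shell
$ for i in $(seq 10); do curl -s http://localhost:80/ > /dev/null; done
```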

Step 4: Confirm Logs from Kibana

Please go to http://localhost:5601/ with your browser. Then, you need to set up the index name pattern for Kibana. Please enter fluentd-* in Index name or pattern and press the Create button.

Then, go to the Discover tab to browse the logs. As you can see, the logs are properly collected into Elasticsearch and Kibana, via Fluentd.

Conclusion

This article explained how to collect logs from Apache into an EFK (Elasticsearch + Fluentd + Kibana) stack. The example code is available in this repository.

Learn More

If this article is incorrect or outdated, or omits critical information, please let us know. Fluentd is an open source project under the Cloud Native Computing Foundation (CNCF). All components are available under the Apache 2 License.
