Docker Compose
This article explains how to collect Docker logs and propagate them to EFK (Elasticsearch + Fluentd + Kibana) stack. The example uses Docker Compose for setting up multiple containers.
Elasticsearch is an open-source search engine known for its ease of use. Kibana is an open-source Web UI that makes Elasticsearch user-friendly for marketers, engineers and data scientists alike.
NOTE: Since v7.11, these products have been distributed under a non-open-source license (dual-licensed under the Server Side Public License and the Elastic License).
By combining these three tools, EFK (Elasticsearch + Fluentd + Kibana), we get a scalable, flexible, and easy-to-use log collection and analytics pipeline. In this article, we will set up four (4) containers:

- Apache HTTP Server (httpd)
- Fluentd
- Elasticsearch
- Kibana

All the logs of httpd will be ingested into Elasticsearch + Kibana, via Fluentd.

Prerequisites: Docker

Please download and install Docker / Docker Compose. Well, that's it :)

Step 0: Create docker-compose.yml

Create docker-compose.yml for Docker Compose. Docker Compose is a tool for defining and running multi-container Docker applications.
With the YAML file below, you can create and start all the services (in this case, Apache, Fluentd, Elasticsearch, Kibana) with a single command:
```yaml
version: "3"
services:
  web:
    image: httpd
    ports:
      - "80:80"
    links:
      - fluentd
    logging:
      driver: "fluentd"
      options:
        fluentd-address: localhost:24224
        tag: httpd.access

  fluentd:
    build: ./fluentd
    volumes:
      - ./fluentd/conf:/fluentd/etc
    links:
      - "elasticsearch"
    ports:
      - "24224:24224"
      - "24224:24224/udp"

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.13.1
    container_name: elasticsearch
    environment:
      - "discovery.type=single-node"
    expose:
      - "9200"
    ports:
      - "9200:9200"

  kibana:
    image: docker.elastic.co/kibana/kibana:7.13.1
    links:
      - "elasticsearch"
    ports:
      - "5601:5601"
```
The logging section (see the Docker Compose documentation) of the web container specifies the Docker Fluentd logging driver as the container's logging driver. All logs from the web container will automatically be forwarded to the host:port specified by fluentd-address.
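For a single container outside of Compose, the same logging configuration can be expressed with docker run flags (a standalone sketch, not part of this tutorial's setup; it assumes a Fluentd instance is already listening on localhost:24224):

```shell
# Run httpd with the Fluentd logging driver; every stdout/stderr line
# of the container is forwarded to localhost:24224 with tag httpd.access
docker run -d -p 80:80 \
  --log-driver=fluentd \
  --log-opt fluentd-address=localhost:24224 \
  --log-opt tag=httpd.access \
  httpd
```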

Step 1: Create Fluentd Image with your Config + Plugin

Create fluentd/Dockerfile with the following content. It uses the official Fluentd Docker image as a base and installs the Elasticsearch plugin:
```dockerfile
# fluentd/Dockerfile

FROM fluent/fluentd:v1.12.0-debian-1.0
USER root
RUN ["gem", "install", "fluent-plugin-elasticsearch", "--no-document", "--version", "5.0.3"]
USER fluent
```
Then, create the Fluentd configuration file fluentd/conf/fluent.conf. The forward input plugin receives logs from the Docker logging driver, and the elasticsearch output plugin forwards them to Elasticsearch.
```
# fluentd/conf/fluent.conf

<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>

<match *.**>
  @type copy

  <store>
    @type elasticsearch
    host elasticsearch
    port 9200
    logstash_format true
    logstash_prefix fluentd
    logstash_dateformat %Y%m%d
    include_tag_key true
    type_name access_log
    tag_key @log_name
    flush_interval 1s
  </store>

  <store>
    @type stdout
  </store>
</match>
```
NOTE: For details on the parameters used with @type elasticsearch, see the Elasticsearch parameters section and, for the full list, the fluent-plugin-elasticsearch documentation.
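Once the stack is running (Step 2), you can exercise the forward input directly with fluent-cat, a small utility bundled with the fluentd gem (this sketch assumes fluentd is also installed on the host; debug.log is an arbitrary tag):

```shell
# Send one hand-crafted JSON event to the forward input on localhost:24224.
# Because of the copy output, it lands in both Elasticsearch and
# the fluentd container's stdout.
echo '{"message": "hello from fluent-cat"}' | fluent-cat debug.log
```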

Step 2: Start the Containers

Let's start the containers:
```shell
$ docker-compose up --detach
```
Use the docker ps command to verify that the four (4) containers are up and running:
```shell
$ docker ps
CONTAINER ID   IMAGE                                                  COMMAND                  CREATED         STATUS         PORTS                                                                                                    NAMES
60a8c3c8fcab   httpd                                                  "httpd-foreground"       6 minutes ago   Up 6 minutes   0.0.0.0:80->80/tcp, :::80->80/tcp                                                                        fluentd-elastic-kibana_web_1
43df4d266636   fluentd-elastic-kibana_fluentd                         "tini -- /bin/entryp…"   6 minutes ago   Up 6 minutes   5140/tcp, 0.0.0.0:24224->24224/tcp, 0.0.0.0:24224->24224/udp, :::24224->24224/tcp, :::24224->24224/udp   fluentd-elastic-kibana_fluentd_1
6a63ad1ddef1   docker.elastic.co/kibana/kibana:7.13.1                 "/bin/tini -- /usr/l…"   6 minutes ago   Up 6 minutes   0.0.0.0:5601->5601/tcp, :::5601->5601/tcp                                                                fluentd-elastic-kibana_kibana_1
6168bd075497   docker.elastic.co/elasticsearch/elasticsearch:7.13.1   "/bin/tini -- /usr/l…"   6 minutes ago   Up 6 minutes   0.0.0.0:9200->9200/tcp, :::9200->9200/tcp, 9300/tcp                                                      elasticsearch
```
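Before moving on, it can be worth confirming that Elasticsearch is actually answering on the published port (the root endpoint and _cat/indices are standard Elasticsearch APIs):

```shell
# Elasticsearch returns cluster info as JSON if it is up
curl http://localhost:9200
# List indices; a fluentd-YYYYMMDD index will appear once logs arrive
curl 'http://localhost:9200/_cat/indices?v'
```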

Step 3: Generate httpd Access Logs

Use the curl command to generate some access logs like this (the [1-10] range makes curl issue ten requests):
```shell
$ curl http://localhost:80/[1-10]
<html><body><h1>It works!</h1></body></html>
<html><body><h1>It works!</h1></body></html>
<html><body><h1>It works!</h1></body></html>
<html><body><h1>It works!</h1></body></html>
<html><body><h1>It works!</h1></body></html>
<html><body><h1>It works!</h1></body></html>
<html><body><h1>It works!</h1></body></html>
<html><body><h1>It works!</h1></body></html>
<html><body><h1>It works!</h1></body></html>
<html><body><h1>It works!</h1></body></html>
```
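You can also confirm from the command line that these requests were indexed, without opening Kibana (the _count and _search endpoints are standard Elasticsearch APIs; the fluentd-* pattern matches the daily indices created by logstash_prefix fluentd):

```shell
# Count documents in the fluentd-* indices (should be >= 10 after the curl above)
curl 'http://localhost:9200/fluentd-*/_count?pretty'
# Show a couple of raw log records
curl 'http://localhost:9200/fluentd-*/_search?size=2&pretty'
```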

Step 4: Confirm Logs from Kibana

Browse to http://localhost:5601/app/management/kibana/indexPatterns and set up an index pattern for Kibana. Specify fluentd-* as the index pattern name and click Create index pattern.
Then, go to the Discover tab to check the logs. As you can see, logs are properly collected into Elasticsearch via Fluentd and can be explored with Kibana.
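If you prefer to script this step, the index pattern can also be created through Kibana's saved objects API (available in Kibana 7.x; the kbn-xsrf header is required for write requests):

```shell
# Create the fluentd-* index pattern via the Kibana saved objects API;
# @timestamp is the time field added by logstash_format true
curl -X POST 'http://localhost:5601/api/saved_objects/index-pattern' \
  -H 'kbn-xsrf: true' \
  -H 'Content-Type: application/json' \
  -d '{"attributes": {"title": "fluentd-*", "timeFieldName": "@timestamp"}}'
```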
(Screenshot: the Kibana Discover view showing the collected httpd logs)


If this article is incorrect or outdated, or omits critical information, please let us know. Fluentd is an open-source project under Cloud Native Computing Foundation (CNCF). All components are available under the Apache 2 License.