Free Alternative To Splunk

Splunk is a great tool for searching logs, but its high cost makes it prohibitive for many teams. In this article, we present a free and open-source alternative to Splunk that combines three open-source projects: Elasticsearch, Kibana, and Fluentd.

Elasticsearch is an open-source search engine well known for its ease of use. Kibana is an open-source web UI that makes Elasticsearch user friendly for marketers, engineers, and data scientists alike.

By combining these three tools (Fluentd + Elasticsearch + Kibana), we get a scalable, flexible, and easy-to-use log search engine with a great web UI: an open-source alternative to Splunk, all for free.

In this guide, we will go over the installation, setup, and basic use of this combined log search solution. This article was tested on Ubuntu 24.04. If you're not familiar with Fluentd, please read the Fluentd Architecture overview and the Get Started guide first.

Prerequisites

  • Java runtime (OpenJDK - JRE 21)

You can install Fluentd via the major packaging systems; see Downloading Fluentd.

Java for Elasticsearch

Please confirm that Java version 21 or higher is installed:

$ java --version
openjdk 21.0.5 2024-10-15
OpenJDK Runtime Environment (build 21.0.5+11-Ubuntu-1ubuntu124.04)
OpenJDK 64-Bit Server VM (build 21.0.5+11-Ubuntu-1ubuntu124.04, mixed mode, sharing)
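If a suitable Java runtime is not found, one way to install it on Ubuntu 24.04 is the distribution package (openjdk-21-jre-headless is assumed to be available in the default repositories):

$ sudo apt-get install -y openjdk-21-jre-headless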

Set Up Elasticsearch

To install Elasticsearch, please download and extract the Elasticsearch package as shown below:

$ curl -O https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-8.17.1-linux-x86_64.tar.gz
$ tar -xf elasticsearch-8.17.1-linux-x86_64.tar.gz
$ cd elasticsearch-8.17.1

Once the installation is complete, start Elasticsearch:

$ ./bin/elasticsearch
  • You can create an enrollment token for Kibana with ./bin/elasticsearch-create-enrollment-token -s kibana.

  • You can reset the password for the elastic user with ./bin/elasticsearch-reset-password -u elastic.
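To confirm that Elasticsearch is up, you can query it with the elastic user over HTTPS. This is only a quick sanity check; the -k flag skips certificate verification, matching the demonstration-only TLS setup used throughout this article:

$ curl -k -u elastic https://localhost:9200

A JSON response containing the cluster name and version number indicates that the node is running.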

Set Up Kibana

Download and extract the Kibana package in the same way:

$ curl -O https://artifacts.elastic.co/downloads/kibana/kibana-8.17.1-linux-x86_64.tar.gz
$ tar -xf kibana-8.17.1-linux-x86_64.tar.gz
$ cd kibana-8.17.1-linux-x86_64

Once the installation is complete, start Kibana with ./bin/kibana. You can modify its configuration via config/kibana.yml if necessary.
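For instance, the listen address and port are controlled by the standard server.host and server.port settings; a minimal sketch (optional, since the defaults are fine for this local test):

# config/kibana.yml
server.host: "localhost"
server.port: 5601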

$ ./bin/kibana

Access http://localhost:5601 in your browser.

Set Up Fluentd (fluent-package)

You can install Fluentd via major packaging systems.

Next, install the Elasticsearch plugin for Fluentd: fluent-plugin-elasticsearch.
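With fluent-package, the plugin can typically be installed with the bundled fluent-gem command (use whichever gem command matches your installation method):

$ sudo fluent-gem install fluent-plugin-elasticsearch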

We'll configure fluent-package (Fluentd) to interface properly with Elasticsearch. Please modify /etc/fluent/fluentd.conf as shown below:

# get logs from syslog
<source>
  @type syslog
  port 42185
  tag syslog
</source>

# get logs from fluent-logger, fluent-cat or other fluentd instances
<source>
  @type forward
</source>

<match syslog.**>
  @type elasticsearch
  host localhost
  user elastic
  password (ELASTIC_USER_PASSWORD_HERE)
  logstash_format true
  scheme https
  ssl_verify false
  include_timestamp true
  <buffer>
    flush_interval 10s # for testing
  </buffer>
</match>

In this article, TLS certificate verification for Elasticsearch is disabled explicitly (ssl_verify false) for demonstration purposes only. Do not disable verification in production.

fluent-plugin-elasticsearch comes with a logstash_format option that allows Kibana to search through the stored event logs in Elasticsearch.
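Before starting the service, you can optionally ask Fluentd to validate the configuration without launching plugins by using its standard --dry-run option (the path below assumes fluent-package's default configuration location):

$ sudo fluentd --dry-run -c /etc/fluent/fluentd.conf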

Once everything has been set up and configured, start fluentd:

$ sudo systemctl start fluentd
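To verify that the service came up cleanly, check its status and tail its log. The log path below is fluent-package's default; adjust it if your installation differs:

$ sudo systemctl status fluentd
$ sudo tail -f /var/log/fluent/fluentd.log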

Set Up rsyslogd

The final step is to forward the logs from your rsyslogd to Fluentd. Please create /etc/rsyslog.d/90-fluentd.conf with the following line, which forwards the local syslogs over UDP to Fluentd on port 42185; Fluentd in turn will forward the logs to Elasticsearch.

*.* @127.0.0.1:42185

Please restart the rsyslog service once the modification is complete:

$ sudo systemctl restart rsyslog

Store and Search Event Logs

Once Fluentd receives some event logs from rsyslog and has flushed them to Elasticsearch, you can view, search and visualize the log data using Kibana.

For starters, let's access http://localhost:5601 and click the Set up index patterns button in the upper-right corner of the screen.

Kibana will start a wizard that guides you through configuring the data sets to visualize. If you want a quick start, use logstash-* as the index pattern, and select @timestamp as the time-filter field.

After setting up an index pattern, you can view the system logs as they flow in.

To manually send logs to Elasticsearch, please use the logger command:

$ logger -t test foobar
When debugging your Fluentd configuration, the filter_stdout plugin is useful: the following configuration prints the matched syslog events to Fluentd's own log in addition to sending them to Elasticsearch.

<filter syslog.**>
  @type stdout
</filter>

<match syslog.**>
  @type elasticsearch
  host localhost
  user elastic
  password (ELASTIC_USER_PASSWORD_HERE)
  logstash_format true
  scheme https
  ssl_verify false
  include_timestamp true
  <buffer>
    flush_interval 10s # for testing
  </buffer>
</match>
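Back in Kibana's Discover view, the test record should appear once the buffer has been flushed. A KQL query can narrow it down; the field names below come from Fluentd's syslog parser (ident holds the logger tag), so adjust them to your data:

ident:test and message:foobar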

Conclusion

This article introduced the combination of Fluentd, Elasticsearch, and Kibana as a free alternative to Splunk for storing and searching machine logs. The example configurations provided in this article have not been tuned.

If you will be using these components in production, you may want to modify some of the configurations (e.g. JVM, Elasticsearch, Fluentd buffer, etc.) according to your needs.
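For example, the <buffer> section of the elasticsearch <match> could be switched to a file buffer with larger limits. The parameter names below are standard Fluentd buffer options, but the values are placeholders to adapt rather than recommendations:

  <buffer>
    @type file
    path /var/log/fluent/buffer/elasticsearch
    flush_interval 10s
    chunk_limit_size 8MB
    total_limit_size 4GB
    flush_thread_count 4
    overflow_action block
  </buffer>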

Learn More

You can also install Elasticsearch (and Kibana) using RPM/DEB packages. For details, please refer to the official installation instructions.

To install Kibana, download it from the official website and extract it. Kibana is an HTML/CSS/JavaScript application (see the download page). Use the binary for 64-bit Linux systems.

See the Plugin Management article for how to install fluent-plugin-elasticsearch in your environment.

For more detail on how to use Kibana, please read the official Kibana manual.

When debugging your Fluentd configuration, using filter_stdout will be useful. All the logs, including errors, can be found in /var/log/fluent/fluentd.log.

If this article is incorrect or outdated, or omits critical information, please let us know. Fluentd is an open-source project under the Cloud Native Computing Foundation (CNCF). All components are available under the Apache 2 License.
