Kinesis Stream

Last updated 5 years ago

This article explains how to use Fluentd's Amazon Kinesis Output plugin (out_kinesis) to import Apache logs into Amazon Kinesis and aggregate semi-structured logs in real-time. The Kinesis plugin is developed and published officially by Amazon Web Services.

Background

Amazon Kinesis is a platform for streaming data on AWS, offering powerful services to make it easy to load and analyze streaming data, and also providing the ability for you to build custom streaming data applications for specialized needs.

Mechanism

In this example, Fluentd does 3 things:

  1. It continuously "tails" the access log file.

  2. It parses the incoming log entries into meaningful fields (such as ip, path, etc.) and buffers them.

  3. It periodically writes the buffered data to Amazon Kinesis.

Install

For simplicity, this article will describe how to set up a one-node configuration. Please install the following software on the same node.

  • Apache (with the Combined Log Format)

You can install Fluentd via major packaging systems (Debian package, RPM package, Ruby gem, or from source).

Install Kinesis Plugin

Since the Amazon Kinesis plugin is not bundled with the td-agent package, please install it manually.

$ sudo td-agent-gem install fluent-plugin-kinesis

Configuration

Let's start configuring Fluentd. If you used the deb/rpm package, Fluentd's config file is located at /etc/td-agent/td-agent.conf. Otherwise, it is located at /etc/fluentd/fluentd.conf.

Tail Input

For the input source, we will set up Fluentd to track the recent Apache logs (typically found at /var/log/apache2/access_log). The Fluentd configuration file should look like this:

<source>
  type tail
  format apache2
  path /var/log/apache2/access_log
  pos_file /var/log/td-agent/apache2.access_log.pos
  tag kinesis.apache.access
</source>

Please make sure that your Apache outputs logs in the default 'combined' format; `format apache2` cannot parse custom log formats. Please see the in_tail article for more information.

Let's go through the configuration line by line.

  1. type tail: The tail Input plugin continuously tracks the log file. This handy plugin is included in Fluentd's core.

  2. format apache2: Uses Fluentd's built-in Apache log parser.

  3. path /var/log/apache2/access_log: The location of the Apache log. This may be different for your particular system.

  4. tag kinesis.apache.access: kinesis.apache.access is used as the tag to route the messages within Fluentd.

That's it! You should now be able to output a JSON-formatted data stream for Fluentd to process.
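To see what the parser produces, here is a rough sketch of the kind of regular expression the apache2 format applies to a combined-format line. This is an approximation for illustration, not Fluentd's exact parser:

```python
import re

# Approximation of the combined log format that `format apache2` understands.
COMBINED = re.compile(
    r'^(?P<host>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<code>\d+) (?P<size>\d+|-) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"$'
)

line = ('127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] '
        '"GET /apache_pb.gif HTTP/1.0" 200 2326 "-" "ab/2.3"')

record = COMBINED.match(line).groupdict()
print(record["host"], record["path"], record["code"])
# → 127.0.0.1 /apache_pb.gif 200
```

Each named group becomes a field in the resulting JSON event, which is what gets tagged kinesis.apache.access and routed downstream.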

Amazon Kinesis Output

The output destination will be Amazon Kinesis. The output configuration should look like this:

<match **>
  # plugin type
  type kinesis

  # your kinesis stream name
  stream_name <KINESIS_STREAM_NAME>

  # AWS credentials
  aws_key_id <AWS_KEY_ID>
  aws_sec_key <AWS_SECRET_KEY>

  # AWS region
  region us-east-1

  # Use random value for the partition key
  random_partition_key true

  # Frequency of ingestion
  flush_interval 5s

  # Parallelism of ingestion
  num_threads 16
</match>

The match section specifies the pattern used to match tags. If a log's tag matches the pattern, the configuration inside <match>...</match> is applied (i.e. the log is routed accordingly). In this example, the kinesis.apache.access tag (generated by the tail input) always matches.

The ** in a match pattern matches zero or more period-delimited tag parts; for example, match.** matches match, match.a, and match.a.b.
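The tag-matching rule can be illustrated with a small sketch. This is a simplified model of the ** glob semantics, not Fluentd's implementation:

```python
def tag_matches(pattern, tag):
    """Simplified model of Fluentd's `**` tag matching."""
    if pattern == "**":
        return True                      # matches any tag, as in <match **>
    if pattern.endswith(".**"):
        base = pattern[:-3]
        # `a.**` matches `a` itself and anything nested under it.
        return tag == base or tag.startswith(base + ".")
    return tag == pattern                # exact match otherwise

print(tag_matches("**", "kinesis.apache.access"))   # → True
print(tag_matches("match.**", "match.a.b"))         # → True
print(tag_matches("match.**", "other.a"))           # → False
```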

flush_interval specifies how often the data is written to Kinesis.
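random_partition_key true in the configuration above assigns each record a randomly generated key. A simplified sketch of how Kinesis maps a partition key onto shards (assuming evenly split shard hash-key ranges; uuid4 is used here purely for illustration):

```python
import hashlib
import uuid

def shard_for(partition_key, num_shards):
    """Map a partition key to a shard index, mimicking Kinesis's
    MD5-over-the-128-bit-key-space scheme with equal shard ranges."""
    h = int(hashlib.md5(partition_key.encode()).hexdigest(), 16)
    return h * num_shards // 2 ** 128

# With random keys, records spread roughly evenly over the shards.
counts = [0] * 4
for _ in range(1000):
    counts[shard_for(str(uuid.uuid4()), 4)] += 1
print(counts)  # roughly even split across the 4 shards
```

This is why random partition keys matter for throughput: each shard's processing power is limited, and random keys keep the load balanced across all shards.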

Test

First, please restart the td-agent process to apply the configuration changes.

$ sudo /etc/init.d/td-agent stop
$ sudo /etc/init.d/td-agent configtest
$ sudo /etc/init.d/td-agent start

To test the configuration, just send a few requests to your Apache server. This example uses the ab (Apache Bench) program.

$ ab -n 100 -c 10 http://localhost/

The random_partition_key true option generates the partition key via UUID (v3). A Kinesis stream consists of shards, and the processing power of each shard is limited; the partition key is used by Kinesis to determine which shard a given record is assigned to. For additional configuration parameters, please see the Amazon Kinesis Output plugin README.

For those who are interested in security: all communication between Fluentd and Amazon Kinesis is done via HTTPS. If you do not want to keep AWS keys in the configuration file, IAM Role based authentication is also available for EC2 nodes.

FAQs

Why do we need Fluentd when Kinesis also offers client libraries?

A lot of people use Fluentd + Kinesis simply because they want more choice of inputs and outputs. For inputs, Fluentd has many more community-contributed plugins and libraries. For outputs, Fluentd can send data not only to Kinesis but also to multiple destinations such as Amazon S3 or local file storage.

Conclusion

Fluentd is an advanced open-source log collector originally developed at Treasure Data, Inc. Because Fluentd can collect logs from various sources, Amazon Kinesis is one of the popular destinations for its output. As this article has shown, Fluentd + Amazon Kinesis makes real-time log collection simple, easy, and robust.

Learn More

  • Fluentd Architecture

  • Fluentd Get Started

  • Amazon Kinesis Output Plugin (made by Amazon Web Services)

If this article is incorrect or outdated, or omits critical information, please let us know. Fluentd is an open source project under the Cloud Native Computing Foundation (CNCF). All components are available under the Apache 2 License.
