Fluentd 0.12
stdout

The filter_stdout filter plugin prints events to stdout (or to the log file when Fluentd is launched in daemon mode). This plugin is useful for debugging.

Example Configurations

filter_stdout is included in Fluentd's core. No installation required.

<filter pattern>
  @type stdout
</filter>

A sample output is as follows:

2015-05-02 12:12:17 +0900 tag: {"field1":"value1","field2":"value2"}

Here the first part shows the output time, the second part shows the tag, and the third part shows the record. Note that the time printed is the **output** time, not the time attribute of the event, the same behavior as `out_stdout`.
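A minimal end-to-end configuration pairing the filter with an input and an output may make the routing clearer. The tag test.log, the in_http source, and the out_null sink here are illustrative choices, not requirements:

```
<source>
  @type http
  port 8888
</source>

<filter test.log>
  @type stdout
</filter>

<match test.log>
  @type null
</match>
```

With this setup, events posted to the HTTP endpoint with tag test.log are printed by the filter and then discarded by the null output.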

Parameters

type (required)

The value must be stdout.

format

The format of the output. The default is stdout.

output_type

This option belongs to the stdout format. It configures the format of the record part (the third part). Any formatter plugin can be specified. The default is json.

out_file

Output the time, tag, and JSON record, separated by a delimiter:

time[delimiter]tag[delimiter]record\n

Example:

2014-06-08T23:59:40[TAB]file.server.logs[TAB]{"field1":"value1","field2":"value2"}\n

out_file format has several options to customize the format.

delimiter SPACE   # Optional, SPACE or COMMA. "\t"(TAB) is used by default
output_tag false  # Optional, defaults to true. Output the tag field if true.
output_time true  # Optional, defaults to true. Output the time field if true.

For this format, the following common parameters are also supported.

  • include_time_key (Boolean, Optional, defaults to false): If true, the time field (as specified by the time_key parameter) is kept in the record.

  • time_key (String, Optional, defaults to "time"): The field name for the time key.

  • time_format (String, Optional): By default, the output format is iso8601 (e.g. "2008-02-01T21:41:49"). A custom format can be specified with this parameter.

  • include_tag_key (Boolean, Optional, defaults to false): If true, the tag field (as specified by the tag_key parameter) is kept in the record.

  • tag_key (String, Optional, defaults to "tag"): The field name for the tag key.

  • localtime (Boolean, Optional, defaults to true): If true, local time is used; otherwise, UTC is used. This parameter is overridden by the utc parameter.

  • timezone (String, Optional): By setting this parameter, one can format the time value in the specified timezone. The following formats are accepted:

    1. [+-]HH:MM (e.g. "+09:00")

    2. [+-]HHMM (e.g. "+0900")

    3. [+-]HH (e.g. "+09")

    4. Region/Zone (e.g. "Asia/Tokyo")

    5. Region/Zone/Zone (e.g. "America/Argentina/Buenos_Aires")

    The timezone set in this parameter takes precedence over localtime; e.g., if localtime is set to true but timezone is set to +0000, UTC is used.
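As a plain-Python illustration (not Fluentd's actual implementation), the out_file layout above can be sketched like this; times are rendered in UTC here for determinism, whereas Fluentd defaults to local time:

```python
import json
from datetime import datetime, timezone

def out_file_line(unix_time, tag, record, delimiter="\t",
                  output_time=True, output_tag=True):
    """Sketch of the out_file layout: time[delimiter]tag[delimiter]record\\n."""
    parts = []
    if output_time:
        # Default time representation is iso8601, e.g. "2014-06-08T23:59:40".
        parts.append(datetime.fromtimestamp(unix_time, tz=timezone.utc)
                     .strftime("%Y-%m-%dT%H:%M:%S"))
    if output_tag:
        parts.append(tag)
    parts.append(json.dumps(record))  # the record part is serialized as JSON
    return delimiter.join(parts) + "\n"

print(out_file_line(1402271980, "file.server.logs",
                    {"field1": "value1", "field2": "value2"}))
```

Setting output_tag or output_time to false drops the corresponding part, mirroring the options shown above.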

json

Output the JSON record without the time or tag fields:

{"field1":"value1","field2":"value2"}\n

For this format, the same common parameters as listed above for the out_file format (include_time_key, time_key, time_format, include_tag_key, tag_key, localtime, timezone) are also supported.

hash

Output the record as a Ruby hash without the time or tag fields:

{"field1"=>"value1","field2"=>"value2"}\n

For this format, the same common parameters as listed above for the out_file format (include_time_key, time_key, time_format, include_tag_key, tag_key, localtime, timezone) are also supported.

ltsv

Output the record as LTSV (Labeled Tab-Separated Values):

field1[label_delimiter]value1[delimiter]field2[label_delimiter]value2\n

The ltsv format supports the delimiter and label_delimiter options.

format ltsv
delimiter SPACE   # Optional. "\t"(TAB) is used by default
label_delimiter = # Optional. ":" is used by default
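The LTSV line construction can be sketched in plain Python (an illustration, not Fluentd's code):

```python
def ltsv_line(record, delimiter="\t", label_delimiter=":"):
    """Join each field as label[label_delimiter]value; fields separated by delimiter."""
    return delimiter.join(
        f"{k}{label_delimiter}{v}" for k, v in record.items()
    ) + "\n"

print(ltsv_line({"field1": "value1", "field2": "value2"}))
```

Swapping the delimiters via the options shown above changes only the separators, not the label/value pairing.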

For this format, the same common parameters as listed above for the out_file format (include_time_key, time_key, time_format, include_tag_key, tag_key, localtime, timezone) are also supported.

single_value

Output the value of a single field instead of the whole record. This format is often used in conjunction with in_tail's format none.

value1\n

The single_value format supports the add_newline and message_key options.

add_newline false    # Optional, defaults to true. Set to false if the value already ends with "\n".
message_key my_field # Optional, defaults to "message". The value of this field is output.

csv

Output the record as CSV/TSV:

"value1"[delimiter]"value2"[delimiter]"value3"\n

csv format supports the delimiter and force_quotes options.

format csv
fields field1,field2,field3
delimiter \t   # Optional. "," is used by default.
force_quotes false # Optional. Defaults to true. If false, values are not wrapped in quotes.
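The delimiter and force_quotes behavior can be mimicked with Python's csv module (an illustrative sketch; when force_quotes is false, this falls back to minimal quoting, i.e. quoting only values that contain the delimiter or quote character):

```python
import csv
import io

def csv_line(record, fields, delimiter=",", force_quotes=True):
    """Render the listed fields of a record as one CSV/TSV line."""
    buf = io.StringIO()
    writer = csv.writer(
        buf,
        delimiter=delimiter,
        quoting=csv.QUOTE_ALL if force_quotes else csv.QUOTE_MINIMAL,
        lineterminator="\n",
    )
    writer.writerow([record.get(f, "") for f in fields])
    return buf.getvalue()

print(csv_line({"field1": "value1", "field2": "value2"}, ["field1", "field2"]))
```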

For this format, the same common parameters as listed above for the out_file format (include_time_key, time_key, time_format, include_tag_key, tag_key, localtime, timezone) are also supported.

stdout

This format is intended for use by stdout plugins.

Output time, tag and formatted record as follows:

time tag: formatted_record\n

Example:

2015-05-02 12:12:17 +0900 tag: {"field1":"value1","field2":"value2"}\n

The stdout format has the following option to customize the format of the record part:

output_type json # Optional, defaults to "json". The format of formatted_record. Any formatter plugin can be specified.

For this format, the same common parameters as listed above for the out_file format (include_time_key, time_key, time_format, include_tag_key, tag_key, localtime, timezone) are also supported.

log_level option

The log_level option allows the user to set a different level of logging for each plugin. The supported log levels are: fatal, error, warn, info, debug, and trace. Please see the logging article for further details.
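For example, to have only this filter emit debug-level logs while the rest of Fluentd stays at the default level:

```
<filter pattern>
  @type stdout
  log_level debug
</filter>
```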

Learn More

  • Filter Plugin Overview

If this article is incorrect or outdated, or omits critical information, please let us know. Fluentd is an open source project under the Cloud Native Computing Foundation (CNCF). All components are available under the Apache 2 License.