parser

The filter_parser filter plugin "parses" a string field in event records and mutates the event record with the parsed result.

Example Configurations

filter_parser has been included in Fluentd's core since v0.12.29, so no installation is required. If you want to use filter_parser with older Fluentd versions, you need to install fluent-plugin-parser.

filter_parser supports the same format and time_format parameters as in_tail:

<filter foo.bar>
  @type parser
  format /^(?<host>[^ ]*) [^ ]* (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^ ]*) +\S*)?" (?<code>[^ ]*) (?<size>[^ ]*)$/
  time_format %d/%b/%Y:%H:%M:%S %z
  key_name message
</filter>

filter_parser uses the built-in parser plugins, as well as your own customized parser plugins, so you can re-use pre-defined formats such as apache2 and json. See the Parser Plugin Overview for more details.
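For example, a field holding an Apache access log line can be parsed with the pre-defined apache2 format instead of a custom regexp (a minimal sketch; the message field name is an assumption):

<filter foo.bar>
  @type parser
  format apache2   # pre-defined Apache access log parser
  key_name message # assumed field holding the raw log line
</filter>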

Parameters

format

This is a required parameter. Specify the parser format or regexp pattern.

key_name

This is a required parameter. Specify the name of the field in the record to parse.

reserve_data

Keeps the original key-value pairs in the parsed result. Default is false.

<filter foo.bar>
  @type parser
  format json
  key_name log
  reserve_data true
</filter>

With the above configuration, the result is:

# input data:  {"key":"value","log":"{\"user\":1,\"num\":2}"}
# output data: {"key":"value","log":"{\"user\":1,\"num\":2}","user":1,"num":2}

Without reserve_data, the result is:

# input data:  {"key":"value","log":"{\"user\":1,\"num\":2}"}
# output data: {"user":1,"num":2}

suppress_parse_error_log

If true, the plugin suppresses the "pattern not match" warning log. Default is false.

This parameter is useful when parsing mixed logs in which you want to ignore non-target lines.
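For example, a minimal sketch that suppresses warnings when non-JSON lines pass through the filter (the log field name is an assumption):

<filter foo.bar>
  @type parser
  format json
  key_name log
  suppress_parse_error_log true # do not warn on lines that fail to parse
</filter>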

ignore_key_not_exist

Ignores the "key not exist" log. Default is false.

The use case is the same as for suppress_parse_error_log.
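A minimal sketch, assuming records may arrive without the log field:

<filter foo.bar>
  @type parser
  format json
  key_name log
  ignore_key_not_exist true # skip records that have no "log" key
</filter>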

replace_invalid_sequence

If true, an invalid string is replaced with safe characters and re-parsed. Default is false.
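A minimal sketch for logs that may contain invalid byte sequences (the log field name is an assumption):

<filter foo.bar>
  @type parser
  format json
  key_name log
  replace_invalid_sequence true # replace invalid bytes with safe characters, then re-parse
</filter>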

inject_key_prefix

Stores parsed values with the specified key name prefix. Default is nil.

<filter foo.bar>
  @type parser
  format json
  key_name log
  reserve_data true
  inject_key_prefix data.
</filter>

With the above configuration, the result is:

# input data:  {"log": "{\"user\":1,\"num\":2}"}
# output data: {"log":"{\"user\":1,\"num\":2}","data.user":1,"data.num":2}

hash_value_field

Stores parsed values as a hash in the specified field. Default is nil.

<filter foo.bar>
  @type parser
  format json
  key_name log
  hash_value_field parsed
</filter>

With the above configuration, the result is:

# input data:  {"log": "{\"user\":1,\"num\":2}"}
# output data: {"parsed":{"user":1,"num":2}}

time_parse

If false, time parsing is disabled in the parser. Default is true.
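For example, to keep the original event time instead of the time parsed out of the record (a minimal sketch):

<filter foo.bar>
  @type parser
  format json
  key_name log
  time_parse false # keep the event's original timestamp
</filter>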

emit_invalid_record_to_error

Emits invalid records to the @ERROR label. Default is false. Invalid cases are:

  • the key does not exist

  • the format does not match

  • an unexpected error occurs

You can rescue unexpectedly formatted logs in the @ERROR label, as shown below.
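A sketch of rescuing invalid records to a file via the @ERROR label (the tag pattern and output path are illustrative):

<filter app.**>
  @type parser
  format json
  key_name log
  emit_invalid_record_to_error true
</filter>

<label @ERROR>
  # records that failed to parse are routed here
  <match app.**>
    @type file
    path /var/log/fluent/parse-errors
  </match>
</label>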

Learn More

  • Filter Plugin Overview

  • Parser Plugin Overview

If this article is incorrect or outdated, or omits critical information, please let us know. Fluentd is an open source project under the Cloud Native Computing Foundation (CNCF). All components are available under the Apache 2 License.