regexp

The regexp parser plugin parses logs by a given regexp pattern. If the parameter value starts and ends with "/", it is considered a regexp. The regexp must have at least one named capture group (?&lt;NAME&gt;PATTERN). If the regexp has a capture named time (this is configurable via the time_key parameter), it is used as the time of the event. You can specify the time format using the time_format parameter.

format /.../ # regexp parser is used
format json  # json parser is used
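In Fluentd v0.12 the parser is configured through the format parameter of an input plugin such as in_tail. Below is a minimal sketch; the path, pos_file, and tag values are hypothetical and only for illustration:

<source>
  @type tail
  # hypothetical path, position file, and tag, for illustration only
  path /var/log/app/access.log
  pos_file /var/log/td-agent/app-access.log.pos
  tag app.access
  # value enclosed in "/", so the regexp parser is used;
  # the named capture "time" becomes the event time
  format /^(?<time>[^ ]+) (?<level>[^ ]+) (?<message>.*)$/
  time_format %Y-%m-%dT%H:%M:%S%z
</source>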

Parameters

time_key

Specify the field for event time. Default is time.

time_format

Specify the time format for time_key.

See Time#strptime for additional format information.
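For example, if the event time is written in the Apache common-log style inside a field named logtime (a hypothetical field name), the corresponding settings would be:

time_key logtime
time_format %d/%b/%Y:%H:%M:%S %z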

keep_time_key

If you want to keep the time field in the record, set this to true. Default is false.
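A minimal sketch (the field name logtime is hypothetical): with keep_time_key true the original logtime string stays in the parsed record in addition to setting the event time.

time_key logtime
time_format %Y-%m-%d %H:%M:%S %z
keep_time_key true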

types

Although every parsed field has type string by default, you can specify other types. This is useful when filtering particular fields numerically or storing data with sensible type information.

The syntax is

types <field_name_1>:<type_name_1>,<field_name_2>:<type_name_2>,...

e.g.,

types user_id:integer,paid:bool,paid_usd_amount:float

As demonstrated above, "," is used to delimit field-type pairs while ":" is used to separate a field name from its intended type.

Unspecified fields are parsed as the default string type.

The list of supported types is shown below:

  • string

  • bool

  • integer ("int" would NOT work!)

  • float

  • time

  • array

For the time and array types, there is an optional third field after the type name. For the "time" type, you can specify a time format like you would in time_format.

For the "array" type, the third field specifies the delimiter (the default is ","). For example, if a field called "item_ids" contains the value "3,4,5", types item_ids:array parses it as ["3", "4", "5"]. Alternatively, if the value is "Adam|Alice|Bob", types item_ids:array:| parses it as ["Adam", "Alice", "Bob"].

Example

format /^\[(?<logtime>[^\]]*)\] (?<name>[^ ]*) (?<title>[^ ]*) (?<id>\d*)$/
time_key logtime
time_format %Y-%m-%d %H:%M:%S %z
types id:integer

With this config:

[2013-02-28 12:00:00 +0900] alice engineer 1

This incoming log is parsed as:

time:
1362020400 (2013-02-28 12:00:00 +0900)

record:
{
  "name" : "alice",
  "title": "engineer",
  "id"   : 1
}

FAQ

How to debug my regexp pattern?

fluentd-ui's in_tail editor helps with regexp testing. Another way, Fluentular is a great website to test your regexp for Fluentd configuration.

NOTE: You may hit an Application Error at Fluentular due to the Heroku free plan limitation. Retry a few hours later or use fluentd-ui instead.

If this article is incorrect or outdated, or omits critical information, please let us know. Fluentd is an open source project under the Cloud Native Computing Foundation (CNCF). All components are available under the Apache 2 License.
