Logstash supports a wide range of input plugins for collecting data from different sources. This section covers the most commonly used input plugins and how to configure them.
1. File Input Plugin
The File plugin is used to read log files from the file system.
Basic Configuration
```conf
input {
  file {
    path => "/var/log/*.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
```
Important Parameters
- path: File path to read, supports wildcards
- start_position: Starting position (beginning or end)
- sincedb_path: File path to record read position
- type: Add type identifier to events
- tags: Add tags to events
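As an illustrative sketch, the type and tags parameters can be combined with the basic file input (the paths and labels here are hypothetical):

```conf
input {
  file {
    path => "/var/log/app/*.log"   # hypothetical application log path
    start_position => "end"
    type => "app-log"              # type identifier for later filtering
    tags => ["production", "app"]  # tags added to every event
  }
}
```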
Advanced Configuration
```conf
input {
  file {
    path => ["/var/log/apache/*.log", "/var/log/nginx/*.log"]
    exclude => ["*.gz", "*.zip"]
    start_position => "beginning"
    sincedb_path => "/var/lib/logstash/sincedb"
    discover_interval => 15   # seconds between scans for new files
    stat_interval => 1        # seconds between checks of known files
    mode => "read"            # read files to completion instead of tailing
    file_completed_action => "log_and_delete"
    file_completed_log_path => "/var/log/logstash/completed.log"
  }
}
```

Note that file_completed_log_path only takes effect when file_completed_action is "log" or "log_and_delete".
2. Beats Input Plugin
The Beats plugin is used to receive data from Beats shippers such as Filebeat and Metricbeat.
Basic Configuration
```conf
input {
  beats {
    port => 5044
  }
}
```
Important Parameters
- port: Listening port
- host: Bind address
- ssl: Enable SSL/TLS
- client_inactivity_timeout: Seconds of inactivity before an idle client connection is closed
SSL Configuration
```conf
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/path/to/cert.pem"
    ssl_key => "/path/to/key.pem"
    ssl_certificate_authorities => ["/path/to/ca.pem"]
    ssl_verify_mode => "force_peer"
  }
}
```
3. Kafka Input Plugin
The Kafka plugin is used to consume data from Kafka topics.
Basic Configuration
```conf
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["logs"]
    group_id => "logstash-consumer"
  }
}
```
Important Parameters
- bootstrap_servers: Kafka server addresses
- topics: List of topics to consume
- group_id: Consumer group ID
- consumer_threads: Number of consumer threads
- decorate_events: Add Kafka metadata to events
Advanced Configuration
```conf
input {
  kafka {
    bootstrap_servers => "kafka1:9092,kafka2:9092"  # comma-separated string, not an array
    topics => ["app-logs", "system-logs"]
    group_id => "logstash-group"
    consumer_threads => 4
    fetch_min_bytes => 1
    fetch_max_wait_ms => 100
    max_partition_fetch_bytes => 1048576
    session_timeout_ms => 10000
    auto_offset_reset => "latest"
    enable_auto_commit => false
    decorate_events => true
    codec => "json"
  }
}
```
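When decorate_events is enabled, the Kafka input stores topic, partition, offset, and other metadata under [@metadata][kafka]. As a sketch (the index naming scheme here is illustrative), a downstream output can reference that metadata to route events per topic:

```conf
# Illustrative only: routes events to per-topic indices using the
# metadata added by decorate_events.
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logs-%{[@metadata][kafka][topic]}-%{+YYYY.MM.dd}"
  }
}
```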
4. HTTP Input Plugin
The HTTP plugin receives data over an HTTP endpoint.
Basic Configuration
```conf
input {
  http {
    port => 8080
    codec => "json"
  }
}
```
Important Parameters
- port: Listening port
- host: Bind address
- codec: Codec used to decode the request body (for example json)
- ssl: Enable SSL
Authentication Configuration
```conf
input {
  http {
    port => 8080
    user => "admin"
    password => "secret"
    ssl => true
    ssl_certificate => "/path/to/cert.pem"
    ssl_key => "/path/to/key.pem"
  }
}
```
5. TCP/UDP Input Plugin
The TCP and UDP plugins receive data over raw network connections.
TCP Configuration
```conf
input {
  tcp {
    port => 5000
    codec => "json_lines"
    mode => "server"
  }
}
```
UDP Configuration
```conf
input {
  udp {
    port => 5001
    codec => "json"
    workers => 2
  }
}
```
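The tcp input can also run in client mode, connecting out to a remote endpoint instead of listening locally. A sketch, assuming a hypothetical relay address:

```conf
input {
  tcp {
    host => "log-relay.example.com"  # illustrative remote address
    port => 5000
    mode => "client"                 # connect out instead of listening
    codec => "json_lines"
  }
}
```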
6. Syslog Input Plugin
The Syslog plugin receives system logs.
Basic Configuration
```conf
input {
  syslog {
    port => 514   # ports below 1024 require elevated privileges
    type => "syslog"
  }
}
```
Advanced Configuration
```conf
input {
  syslog {
    port => 514
    host => "0.0.0.0"
    proto => "tcp"
    grok_pattern => "<%{POSINT:priority}>%{SYSLOGLINE}"
    timezone => "UTC"
  }
}
```
7. JDBC Input Plugin
The JDBC plugin reads data from databases.
Basic Configuration
```conf
input {
  jdbc {
    jdbc_driver_library => "/path/to/mysql-connector.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    schedule => "* * * * *"   # cron syntax: runs every minute
    statement => "SELECT * FROM logs WHERE created_at > :sql_last_value"
  }
}
```
Important Parameters
- jdbc_driver_library: JDBC driver path
- jdbc_driver_class: JDBC driver class name
- jdbc_connection_string: Database connection string
- schedule: Execution schedule (cron expression)
- statement: SQL query statement
- use_column_value: Use column value tracking
- tracking_column: Tracking column name
- last_run_metadata_path: Metadata storage path
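A sketch of how the tracking parameters above fit together for incremental loading (the table and column names are illustrative):

```conf
input {
  jdbc {
    jdbc_driver_library => "/path/to/mysql-connector.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    schedule => "*/5 * * * *"                  # every five minutes
    use_column_value => true                   # track a column value instead of the run timestamp
    tracking_column => "id"                    # illustrative numeric primary key
    tracking_column_type => "numeric"
    last_run_metadata_path => "/var/lib/logstash/jdbc_last_run"
    statement => "SELECT * FROM logs WHERE id > :sql_last_value ORDER BY id"
  }
}
```

With use_column_value enabled, :sql_last_value holds the last tracked column value, so each run fetches only rows added since the previous run.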
8. Redis Input Plugin
The Redis plugin reads data from Redis.
Basic Configuration
```conf
input {
  redis {
    host => "localhost"
    port => 6379
    data_type => "list"
    key => "logstash"
  }
}
```
Data Types
- list: List type
- channel: Pub/Sub channel
- pattern_channel: Pattern matching channel
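For example, a Pub/Sub subscription (the channel name is illustrative) looks like:

```conf
input {
  redis {
    host => "localhost"
    port => 6379
    data_type => "channel"   # subscribe instead of popping from a list
    key => "logstash-events" # illustrative channel name
  }
}
```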
Multiple Input Configuration
Multiple input plugins can be configured simultaneously:
```conf
input {
  file {
    path => "/var/log/app/*.log"
    type => "app-log"
  }
  beats {
    port => 5044
    type => "beats-log"
  }
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["system-logs"]
    type => "kafka-log"
  }
}
```
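The type values set on each input can then drive conditional processing later in the pipeline. A sketch (the grok and json choices here are illustrative):

```conf
# Illustrative filter stage: branch on the type set by each input.
filter {
  if [type] == "app-log" {
    grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
  } else if [type] == "kafka-log" {
    json { source => "message" }
  }
}
```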
Best Practices
- Choose start_position deliberately: production deployments usually use "end" so only new data is read
- Configure sincedb_path: prevents files from being re-read after Logstash restarts
- Use types and tags: simplifies filtering and routing later in the pipeline
- Enable SSL/TLS: protects data in transit
- Monitor input performance: use Logstash's monitoring metrics to watch input plugin throughput