Logstash supports a variety of output plugins for sending processed data to different target systems. This section covers the most commonly used output plugins and how to configure them.
1. Elasticsearch Output Plugin
Elasticsearch is the most commonly used output target for Logstash.
Basic Configuration
```conf
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
```
Important Parameters
- hosts: List of Elasticsearch node addresses
- index: Index name, supports date patterns
- document_type: Document type (deprecated; Elasticsearch removed mapping types in 8.0)
- document_id: Document ID
- action: Operation type (index, create, update, delete)
- pipeline: ES pipeline name
Advanced Configuration
```conf
output {
  elasticsearch {
    hosts => ["http://es1:9200", "http://es2:9200"]
    index => "app-logs-%{[service]}-%{+YYYY.MM.dd}"
    document_id => "%{[@metadata][_id]}"
    action => "update"
    doc_as_upsert => true
    pipeline => "timestamp_pipeline"
    retry_on_conflict => 3
    # Performance note: the old flush_size / idle_flush_time options were
    # removed from this plugin; batching is now controlled by
    # pipeline.batch.size and pipeline.batch.delay in logstash.yml
    # SSL configuration
    ssl => true
    cacert => "/path/to/ca.crt"
    user => "elastic"
    password => "changeme"
  }
}
```
Conditional Indexing
```conf
output {
  if [type] == "error" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "error-logs-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "access-logs-%{+YYYY.MM.dd}"
    }
  }
}
```
2. File Output Plugin
The File plugin writes data to the file system.
Basic Configuration
```conf
output {
  file {
    path => "/path/to/output.log"
  }
}
```
Important Parameters
- path: Output file path
- codec: Codec
- flush_interval: Flush interval
- gzip: Enable gzip compression
Advanced Configuration
```conf
output {
  file {
    path => "/var/log/logstash/%{type}-%{+YYYY-MM-dd}.log"
    codec => line { format => "%{message}" }
    flush_interval => 5
    gzip => true
    file_mode => 0644
    dir_mode => 0755
  }
}
```
3. Kafka Output Plugin
The Kafka plugin publishes events to a Kafka topic.
Basic Configuration
```conf
output {
  kafka {
    bootstrap_servers => "localhost:9092"
    topic_id => "processed-logs"
  }
}
```
Important Parameters
- bootstrap_servers: Kafka server addresses
- topic_id: Topic name
- codec: Codec
- compression_type: Compression type (none, gzip, snappy, lz4, zstd)
Advanced Configuration
```conf
output {
  kafka {
    bootstrap_servers => "kafka1:9092,kafka2:9092"  # comma-separated string, not an array
    topic_id => "processed-logs"
    codec => "json"
    compression_type => "snappy"
    acks => "all"
    retries => 3
    batch_size => 16384
    linger_ms => 10
    buffer_memory => 33554432
    # SSL configuration
    security_protocol => "SSL"
    ssl_keystore_location => "/path/to/keystore.jks"
    ssl_keystore_password => "password"
    ssl_truststore_location => "/path/to/truststore.jks"
    ssl_truststore_password => "password"
  }
}
```
Dynamic Topics
```conf
output {
  kafka {
    bootstrap_servers => "localhost:9092"
    topic_id => "%{[service]}-logs"
  }
}
```
4. Redis Output Plugin
The Redis plugin sends data to Redis.
Basic Configuration
```conf
output {
  redis {
    host => "localhost"
    port => 6379
    data_type => "list"
    key => "logstash"
  }
}
```
Data Types
- list: List type (each event is RPUSHed onto the list)
- channel: Pub/Sub channel (each event is PUBLISHed to the channel)
Note that these are the only two values the plugin accepts; Redis sets are not supported.
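As a sketch of the Pub/Sub variant, the same output can publish each event to a channel instead of pushing onto a list (the channel name logstash-events is illustrative):

```conf
output {
  redis {
    host => "localhost"
    port => 6379
    data_type => "channel"   # PUBLISH instead of RPUSH
    key => "logstash-events" # interpreted as the channel name
  }
}
```

Subscribers (e.g. `SUBSCRIBE logstash-events` in redis-cli) receive events as they are published; unlike the list type, events are not queued for later consumption.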
Advanced Configuration
```conf
output {
  redis {
    host => "redis.example.com"
    port => 6379
    data_type => "list"
    key => "logstash-%{[type]}"
    codec => "json"
    db => 0
    password => "secret"
    timeout => 5
    reconnect_interval => 2
  }
}
```
5. HTTP Output Plugin
The HTTP plugin sends data to an HTTP endpoint.
Basic Configuration
```conf
output {
  http {
    url => "http://example.com/api/logs"
    http_method => "post"
    format => "json"
  }
}
```
Important Parameters
- url: Target URL
- http_method: HTTP method (post, put, patch)
- format: Data format (json, form, message)
- headers: HTTP request headers
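The form and message formats are not shown in the examples below; as a minimal sketch of form encoding (the endpoint URL is hypothetical):

```conf
output {
  http {
    url => "http://example.com/ingest"
    http_method => "post"
    format => "form"   # event fields sent as application/x-www-form-urlencoded
  }
}
```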
Advanced Configuration
```conf
output {
  http {
    url => "http://api.example.com/v1/logs"
    http_method => "post"
    format => "json"
    headers => {
      "Content-Type" => "application/json"
      "Authorization" => "Bearer %{[api_token]}"
    }
    mapping => {
      "timestamp" => "%{@timestamp}"
      "message" => "%{message}"
      "level" => "%{[log_level]}"
    }
    pool_max => 50            # the plugin's option is pool_max, not pool_size
    pool_max_per_route => 25
    keepalive => true
    retry_non_idempotent => true
  }
}
```
6. Stdout Output Plugin
The Stdout plugin outputs data to standard output, commonly used for debugging.
Basic Configuration
```conf
output {
  stdout {
    codec => rubydebug
  }
}
```
Codec Options
- rubydebug: Pretty-printed output (most readable for debugging)
- json: JSON format
- json_lines: One JSON object per line
- dots: Prints a dot per event (useful for gauging throughput)
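For instance, when load-testing a pipeline, the dots codec avoids flooding the console while still confirming that events are flowing:

```conf
output {
  stdout {
    codec => dots   # prints one "." per event
  }
}
```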
7. Multiple Output Configuration
Multiple output plugins can be configured simultaneously:
```conf
output {
  # Output to Elasticsearch
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "logs-%{+YYYY.MM.dd}"
  }
  # Simultaneously write a backup to file
  file {
    path => "/backup/logs-%{+YYYY-MM-dd}.log"
  }
  # Send error logs to Kafka
  if [level] == "ERROR" {
    kafka {
      bootstrap_servers => "localhost:9092"
      topic_id => "error-logs"
    }
  }
}
```
8. Conditional Output
Use conditional statements to control data flow:
```conf
output {
  if [type] == "apache" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "apache-%{+YYYY.MM.dd}"
    }
  } else if [type] == "nginx" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "nginx-%{+YYYY.MM.dd}"
    }
  } else {
    file {
      path => "/var/log/other-logs.log"
    }
  }
}
```
Best Practices
- Batch Writing: Tune pipeline.batch.size and pipeline.batch.delay in logstash.yml to optimize throughput (the older per-plugin flush_size and idle_flush_time options were removed)
- Error Handling: Configure retry mechanisms and error logging
- Index Strategy: Design index naming and sharding strategies deliberately
- Security Configuration: Use SSL/TLS to protect data transmission
- Monitor Metrics: Monitor performance metrics of output plugins
- Backup Strategy: Configure multiple output targets for important data
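As one concrete form of the error-handling practice above, Logstash's dead letter queue can capture events that the elasticsearch output rejects (e.g. on mapping errors) instead of dropping them; a minimal logstash.yml sketch (the path is illustrative):

```yaml
# logstash.yml -- settings file, not pipeline config
dead_letter_queue.enable: true
path.dead_letter_queue: /var/lib/logstash/dlq
```

Captured events can later be replayed through the dead_letter_queue input plugin for inspection or reprocessing.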