Centralized log management is essential for troubleshooting, security monitoring, and compliance in modern infrastructure. The ELK Stack — Elasticsearch, Logstash, and Kibana — is the most popular open-source solution for collecting, processing, and visualizing log data.
Architecture Overview
- Elasticsearch: Distributed search and analytics engine for storing and querying logs
- Logstash: Data processing pipeline for ingesting, transforming, and forwarding logs
- Kibana: Visualization and exploration platform for Elasticsearch data
- Filebeat: Lightweight log shipper installed on source servers
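The four components form a one-way pipeline; the ports shown are the defaults:

```
Filebeat ──5044──▶ Logstash ──9200──▶ Elasticsearch ◀──9200── Kibana (UI on 5601)
```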
Installing Elasticsearch
# Add Elastic repository (apt-key is deprecated on newer Debian/Ubuntu; use a keyring file)
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elastic.gpg
echo "deb [signed-by=/usr/share/keyrings/elastic.gpg] https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-8.x.list
sudo apt update
sudo apt install elasticsearch
# Configure
sudo nano /etc/elasticsearch/elasticsearch.yml
# cluster.name: my-log-cluster
# network.host: 0.0.0.0
# discovery.type: single-node
# xpack.security.enabled: true
# Note: 8.x enables security by default; the elastic superuser password is printed
# during package install and can be reset with
# /usr/share/elasticsearch/bin/elasticsearch-reset-password -u elastic
sudo systemctl enable --now elasticsearch
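Once the service starts, a quick health check confirms the node is reachable (shown in the same API style used later in this guide; with security enabled, add -u elastic:<password> if you query with curl instead). A single-node cluster normally reports yellow rather than green, because replica shards cannot be assigned:

```
# Expect "status": "green" or "yellow" on a healthy single node
GET _cluster/health
```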
Installing Kibana
sudo apt install kibana
# Configure
sudo nano /etc/kibana/kibana.yml
# server.host: "0.0.0.0"
# elasticsearch.hosts: ["http://localhost:9200"]
# With security enabled, Kibana also needs credentials: either paste an enrollment token
# (generated with /usr/share/elasticsearch/bin/elasticsearch-create-enrollment-token -s kibana)
# into the Kibana UI on first start, or set elasticsearch.serviceAccountToken here
sudo systemctl enable --now kibana
# Access at http://server-ip:5601
Installing Logstash
sudo apt install logstash
# Create pipeline configuration
sudo nano /etc/logstash/conf.d/syslog.conf
Logstash Pipeline Configuration
# /etc/logstash/conf.d/syslog.conf
input {
  beats {
    port => 5044
  }
}

filter {
  if [fields][log_type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    }
    date {
      # Traditional syslog pads single-digit days with a space, hence the double space in the first format
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
  if [fields][log_type] == "nginx" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    geoip {
      # Assumes classic field names; with ECS compatibility enabled (the Logstash 8
      # default) COMBINEDAPACHELOG puts the client IP in [source][address] instead
      source => "clientip"
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[fields][log_type]}-%{+YYYY.MM.dd}"
  }
}
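As a sanity check on the grok pattern above, here is a rough pure-shell sketch (not Logstash itself) of the same field split, applied to a hypothetical syslog line:

```shell
# Hypothetical syslog line (note the two-space padding before a single-digit day)
line='Mar  1 12:34:56 web01 sshd[4321]: Connection closed by 10.0.0.5'

ts=${line:0:15}        # SYSLOGTIMESTAMP -> "Mar  1 12:34:56"
rest=${line:16}        # drop the separating space
host=${rest%% *}       # SYSLOGHOST      -> "web01"
prog_msg=${rest#* }
prog=${prog_msg%%:*}   # program (+pid)  -> "sshd[4321]"
msg=${prog_msg#*: }    # GREEDYDATA      -> "Connection closed by 10.0.0.5"

echo "$ts | $host | $prog | $msg"
```

Grok does the same slicing declaratively, and `_grokparsefailure` tags in Kibana will show you which lines fell through the pattern.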
Installing Filebeat on Source Servers
sudo apt install filebeat
# /etc/filebeat/filebeat.yml
# Note: the "log" input type is deprecated in recent Filebeat releases in favor of "filestream"
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/syslog
      - /var/log/auth.log
    fields:
      log_type: syslog

  - type: log
    enabled: true
    paths:
      - /var/log/nginx/access.log
    fields:
      log_type: nginx

output.logstash:
  hosts: ["elk-server:5044"]
sudo systemctl enable --now filebeat
Kibana Dashboard Creation
- Create data views (called index patterns before Kibana 8) matching your log indices
- Use Discover to explore raw log data
- Create visualizations: bar charts, pie charts, maps, timelines
- Build dashboards combining multiple visualizations
- Set up saved searches for common queries
Useful Kibana Queries (KQL)
# Filter by severity
log.level: "error" or log.level: "critical"
# Search a specific service (nginx access logs carry "response", not "syslog_program")
fields.log_type: "nginx" and response: 500
# Time-based filtering
@timestamp >= "2026-03-01" and @timestamp < "2026-03-02"
# Free text search
message: "connection refused"
Index Lifecycle Management
# Create ILM policy via API
PUT _ilm/policy/logs-policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": { "max_size": "50GB", "max_age": "1d" }
        }
      },
      "warm": {
        "min_age": "7d",
        "actions": { "shrink": { "number_of_shards": 1 } }
      },
      "delete": {
        "min_age": "30d",
        "actions": { "delete": {} }
      }
    }
  }
}
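A policy by itself does nothing: indices only pick it up through an index template. A minimal sketch, assuming the syslog-*/nginx-* index names produced by the Logstash output above; note that the rollover action expects writes to go through an alias, so with date-stamped index names you may prefer to drop rollover or switch the output to a data stream:

```
# Attach the policy to new log indices via an index template
PUT _index_template/logs-template
{
  "index_patterns": ["syslog-*", "nginx-*"],
  "template": {
    "settings": {
      "index.lifecycle.name": "logs-policy",
      "index.lifecycle.rollover_alias": "logs"
    }
  }
}
```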
Best Practices
- Use Filebeat instead of shipping logs directly to Logstash
- Implement index lifecycle management to control storage costs
- Secure the stack with TLS encryption and authentication
- Monitor ELK stack health with Metricbeat
- Create alerting rules for critical log patterns
- Back up Elasticsearch indices regularly
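For the backup bullet, use the snapshot API rather than copying data directories, which is not safe on a running cluster. A sketch with a shared-filesystem repository; the location path here is an assumption and must also be listed under path.repo in elasticsearch.yml:

```
# Register a snapshot repository
PUT _snapshot/logs_backup
{
  "type": "fs",
  "settings": { "location": "/mnt/backups/elasticsearch" }
}

# Snapshot all log indices
PUT _snapshot/logs_backup/snapshot-1?wait_for_completion=true
{
  "indices": "syslog-*,nginx-*"
}
```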
A well-configured ELK stack transforms your log data from scattered files into a powerful, searchable knowledge base. Start with basic syslog collection and gradually expand to application logs, security events, and custom metrics.