Log Management with ELK Stack: Elasticsearch, Logstash, and Kibana Guide


Centralized log management is essential for troubleshooting, security monitoring, and compliance in modern infrastructure. The ELK Stack — Elasticsearch, Logstash, and Kibana — is the most popular open-source solution for collecting, processing, and visualizing log data.

Architecture Overview

  • Elasticsearch: Distributed search and analytics engine for storing and querying logs
  • Logstash: Data processing pipeline for ingesting, transforming, and forwarding logs
  • Kibana: Visualization and exploration platform for Elasticsearch data
  • Filebeat: Lightweight log shipper installed on source servers

Installing Elasticsearch

# Add Elastic repository
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elasticsearch-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/elasticsearch-keyring.gpg] https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-8.x.list

sudo apt update
sudo apt install elasticsearch

# Configure
sudo nano /etc/elasticsearch/elasticsearch.yml
# cluster.name: my-log-cluster
# network.host: 0.0.0.0
# discovery.type: single-node
# xpack.security.enabled: true

sudo systemctl enable --now elasticsearch
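With the service running, it is worth confirming the node responds before moving on. Since security is enabled in 8.x, set a password for the built-in elastic user first; 8.x also enables TLS on the HTTP layer by default, hence the https URL and the -k flag for the self-signed certificate (use plain http if you disabled TLS):

```shell
# Set (or reset) the built-in elastic user's password
sudo /usr/share/elasticsearch/bin/elasticsearch-reset-password -u elastic

# Query the node; -k skips certificate verification for the self-signed cert
curl -k -u elastic https://localhost:9200
```

A healthy node answers with a small JSON document containing the cluster name, node name, and Elasticsearch version.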

Installing Kibana

sudo apt install kibana

# Configure
sudo nano /etc/kibana/kibana.yml
# server.host: "0.0.0.0"
# elasticsearch.hosts: ["http://localhost:9200"]

sudo systemctl enable --now kibana
# Access at http://server-ip:5601
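With security enabled in 8.x, Kibana also needs credentials to talk to Elasticsearch. One supported path is an enrollment token generated on the Elasticsearch host and consumed by the kibana-setup tool (replace the placeholder with the token you generated):

```shell
# On the Elasticsearch host: generate an enrollment token for Kibana
sudo /usr/share/elasticsearch/bin/elasticsearch-create-enrollment-token -s kibana

# On the Kibana host: enroll using that token
sudo /usr/share/kibana/bin/kibana-setup --enrollment-token <token>
```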

Installing Logstash

sudo apt install logstash

# Create pipeline configuration
sudo nano /etc/logstash/conf.d/syslog.conf

Logstash Pipeline Configuration

# /etc/logstash/conf.d/syslog.conf
input {
  beats {
    port => 5044
  }
}

filter {
  if [fields][log_type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}: %{GREEDYDATA:syslog_message}" }
    }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
  
  if [fields][log_type] == "nginx" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    geoip {
      source => "clientip"
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[fields][log_type]}-%{+YYYY.MM.dd}"
  }
}
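Before starting the service, Logstash can validate the pipeline for you; the -t flag (short for --config.test_and_exit) parses the configuration files without processing any events:

```shell
# Validate pipeline configuration syntax
sudo -u logstash /usr/share/logstash/bin/logstash \
  --path.settings /etc/logstash -t

# Start Logstash once the config checks out
sudo systemctl enable --now logstash
```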

Installing Filebeat on Source Servers

sudo apt install filebeat

# /etc/filebeat/filebeat.yml
filebeat.inputs:
  - type: filestream
    id: syslog-input
    enabled: true
    paths:
      - /var/log/syslog
      - /var/log/auth.log
    fields:
      log_type: syslog

  - type: filestream
    id: nginx-input
    enabled: true
    paths:
      - /var/log/nginx/access.log
    fields:
      log_type: nginx

output.logstash:
  hosts: ["elk-server:5044"]

sudo systemctl enable --now filebeat
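Filebeat ships with built-in self-checks that catch most misconfigurations before any log data is lost:

```shell
# Validate filebeat.yml syntax
sudo filebeat test config

# Verify connectivity to the configured Logstash endpoint (elk-server:5044)
sudo filebeat test output
```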

Kibana Dashboard Creation

  1. Create data views (formerly index patterns) matching your log indices
  2. Use Discover to explore raw log data
  3. Create visualizations: bar charts, pie charts, maps, timelines
  4. Build dashboards combining multiple visualizations
  5. Set up saved searches for common queries

Useful Kibana Queries (KQL)

# Filter by severity
log.level: "error" or log.level: "critical"

# Search specific service
syslog_program: "nginx" and response: 500

# Time-based filtering
@timestamp >= "2026-03-01" and @timestamp < "2026-03-02"

# Free text search
message: "connection refused"
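KQL is a Kibana-side convenience; the same filters can be sent directly to Elasticsearch as Query DSL. A rough equivalent of the severity filter above, assuming the elastic user and the syslog index naming from the Logstash output section:

```shell
curl -k -u elastic -X GET "https://localhost:9200/syslog-*/_search" \
  -H 'Content-Type: application/json' -d'
{
  "query": {
    "bool": {
      "should": [
        { "match": { "log.level": "error" } },
        { "match": { "log.level": "critical" } }
      ],
      "minimum_should_match": 1
    }
  }
}'
```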

Index Lifecycle Management

# Create ILM policy via API
PUT _ilm/policy/logs-policy
{
  "policy": {
    "phases": {
      "hot": { "actions": { "rollover": { "max_size": "50GB", "max_age": "1d" } } },
      "warm": { "min_age": "7d", "actions": { "shrink": { "number_of_shards": 1 } } },
      "delete": { "min_age": "30d", "actions": { "delete": {} } }
    }
  }
}
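A policy does nothing until indices reference it. One way to attach it is an index template that applies the policy (and a rollover alias) to newly created log indices; the template and alias names here are illustrative:

```
PUT _index_template/logs-template
{
  "index_patterns": ["syslog-*", "nginx-*"],
  "template": {
    "settings": {
      "index.lifecycle.name": "logs-policy",
      "index.lifecycle.rollover_alias": "logs"
    }
  }
}
```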

Best Practices

  1. Use Filebeat instead of shipping logs directly to Logstash
  2. Implement index lifecycle management to control storage costs
  3. Secure the stack with TLS encryption and authentication
  4. Monitor ELK stack health with Metricbeat
  5. Create alerting rules for critical log patterns
  6. Back up Elasticsearch indices regularly
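The last point can be implemented with the snapshot API. A minimal shared-filesystem sketch, assuming the location is listed under path.repo in elasticsearch.yml (the repository name and path are illustrative; production setups often use S3 or another object store instead):

```shell
# Register a shared-filesystem snapshot repository
curl -k -u elastic -X PUT "https://localhost:9200/_snapshot/logs_backup" \
  -H 'Content-Type: application/json' -d'
{ "type": "fs", "settings": { "location": "/mnt/backups/elasticsearch" } }'

# Take a snapshot of all indices and wait for it to finish
curl -k -u elastic -X PUT "https://localhost:9200/_snapshot/logs_backup/snapshot-1?wait_for_completion=true"
```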

A well-configured ELK stack transforms your log data from scattered files into a powerful, searchable knowledge base. Start with basic syslog collection and gradually expand to application logs, security events, and custom metrics.

About the Author

Mikkel Sørensen is a UX/UI-focused software developer with a strong background in Java-based application development.