Redis is an in-memory data store used as a cache, message broker, and database. Its sub-millisecond response times make it the go-to solution for improving application performance. This guide covers everything you need to implement Redis caching effectively.
Installation
# Ubuntu/Debian
sudo apt update
sudo apt install redis-server
# Start and enable
sudo systemctl enable --now redis-server
# Verify
redis-cli ping
# Response: PONG
Basic Redis Commands
# String operations
SET user:1:name "John Doe"
GET user:1:name
SET session:abc123 "user_data" EX 3600 # Expires in 1 hour
TTL session:abc123
# Hash operations (structured data)
HSET user:1 name "John" email "john@example.com" age 30
HGETALL user:1
HGET user:1 name
# List operations
LPUSH notifications:user1 "New order received"
LRANGE notifications:user1 0 -1
# Set operations
SADD tags:article1 "python" "automation" "linux"
SMEMBERS tags:article1
# Sorted sets
ZADD leaderboard 100 "player1" 200 "player2" 150 "player3"
ZRANGEBYSCORE leaderboard 0 200 WITHSCORES
Caching Strategies
Cache-Aside (Lazy Loading)
import json
import redis

r = redis.Redis(decode_responses=True)  # decode_responses returns str instead of bytes

def get_user(user_id):
    # Check cache first
    cached = r.get(f"user:{user_id}")
    if cached:
        return json.loads(cached)
    # Cache miss - fetch from database
    user = db.query("SELECT * FROM users WHERE id = %s", (user_id,))
    # Store in cache with a 1-hour TTL
    r.setex(f"user:{user_id}", 3600, json.dumps(user))
    return user
Write-Through
def update_user(user_id, data):
    # Update the database first
    db.execute("UPDATE users SET name = %s WHERE id = %s", (data["name"], user_id))
    # Then refresh the cache so reads never see stale data
    # (r is a redis-py client, e.g. redis.Redis(decode_responses=True))
    r.setex(f"user:{user_id}", 3600, json.dumps(data))
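An alternative to write-through is write-invalidate: update the database, then delete the cached entry so the next read repopulates it. This avoids caching data that may never be read again. A minimal sketch (the db/cache parameters and update_user_invalidate name are illustrative, not from a specific library):

```python
def update_user_invalidate(db, cache, user_id, data):
    """Write-invalidate: update the database, then drop the cached copy
    so the next cache-aside read fetches fresh data."""
    db.execute("UPDATE users SET name = %s WHERE id = %s", (data["name"], user_id))
    cache.delete(f"user:{user_id}")
```

This pairs naturally with the cache-aside read path above: the delete forces the next get_user call to miss and reload from the database.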
Configuration for Production
# /etc/redis/redis.conf
# Memory limit
maxmemory 256mb
maxmemory-policy allkeys-lru
# Persistence
save 900 1 # Snapshot if at least 1 key changed in 900 seconds
save 300 10 # Snapshot if at least 10 keys changed in 300 seconds
save 60 10000 # Snapshot if at least 10000 keys changed in 60 seconds
# Security
requirepass your-strong-password
bind 127.0.0.1
protected-mode yes
# Performance
tcp-keepalive 300 # Probe idle connections every 300 seconds
timeout 0 # Never disconnect idle clients
Eviction Policies
- allkeys-lru: Remove least recently used keys (recommended for caching)
- volatile-lru: Remove least recently used keys among those with an expiry set
- allkeys-lfu: Remove least frequently used keys
- noeviction: Return errors when memory limit is reached
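The allkeys-lru semantics above can be illustrated with a toy in-process cache: when the size limit is exceeded, the least recently *accessed* key is evicted, and a read refreshes a key's recency. This is only a sketch of the idea, not how Redis implements it (Redis uses an approximated LRU over sampled keys):

```python
from collections import OrderedDict

class ToyLRU:
    """Toy illustration of allkeys-lru: evict the least recently used key
    on overflow; any access refreshes a key's recency."""
    def __init__(self, maxsize):
        self.maxsize = maxsize
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # access refreshes recency
        return self._data[key]

    def set(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.maxsize:
            self._data.popitem(last=False)  # evict least recently used
```

For example, with maxsize=2, setting a, b, reading a, then setting c evicts b rather than a, because the read kept a "fresh".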
Monitoring
# Redis CLI monitoring
redis-cli INFO stats
redis-cli INFO memory
redis-cli INFO keyspace
# Key metrics to track
# hit rate = keyspace_hits / (keyspace_hits + keyspace_misses)
# memory usage vs maxmemory
# connected clients
# evicted keys
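The hit-rate formula above is easy to compute from redis-py's INFO output, which r.info("stats") returns as a dict. A small helper (the function name is mine; the keyspace_hits/keyspace_misses fields are standard INFO stats):

```python
def cache_hit_rate(stats):
    """Compute hit rate from a Redis INFO stats dict, e.g. r.info('stats').

    hit rate = keyspace_hits / (keyspace_hits + keyspace_misses)
    Returns 0.0 when no lookups have happened yet.
    """
    hits = stats.get("keyspace_hits", 0)
    misses = stats.get("keyspace_misses", 0)
    total = hits + misses
    return hits / total if total else 0.0
```

A sustained hit rate well below ~0.8 for read-heavy data usually means TTLs are too short, the eviction pressure is too high, or the wrong data is being cached.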
Best Practices
- Always set a TTL on cached data so stale entries expire on their own
- Use meaningful key naming conventions like entity:id:field
- Set an appropriate maxmemory and eviction policy
- Monitor cache hit rates and adjust TTLs accordingly
- Use pipelining for bulk operations
- Implement cache invalidation on data updates
- Secure Redis with authentication and network restrictions
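The pipelining tip above can be sketched with redis-py: a pipeline queues many commands and sends them in one round trip instead of one network round trip per command. A hedged sketch (bulk_cache is an illustrative name; pipeline(), setex(), and execute() are standard redis-py calls):

```python
def bulk_cache(client, items, ttl=3600):
    """Cache many key/value pairs in a single round trip.

    client: a redis-py client (or anything exposing pipeline()).
    items: dict of key -> string value.
    Returns the list of per-command results from execute().
    """
    pipe = client.pipeline()
    for key, value in items.items():
        pipe.setex(key, ttl, value)  # queued locally, not yet sent
    return pipe.execute()            # one round trip for all commands
```

For thousands of keys this is dramatically faster than looping over individual SET calls, since each unpipelined command pays a full network round trip.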
Redis caching can dramatically improve your application response times. Start with a cache-aside strategy for read-heavy data and expand your caching layer as you identify performance bottlenecks.