Rails Caching Strategies — From Low-Level Cache to Russian Doll
Caching is how Rails apps go from “works fine with 10 users” to “works fine with 10,000 users” without rewriting everything. Rails ships a complete caching toolkit — low-level key-value cache, fragment caching, HTTP caching, and the Russian doll pattern for nested view caching — all configurable to use Redis, Solid Cache, Memcached, or in-memory storage. Knowing which tool to reach for and when is what separates a cache that helps from a cache that occasionally serves wrong data and is impossible to debug.
The Cache Store
Before caching anything, configure where the cache data lives. Rails supports multiple backends:
Setup:
# config/environments/production.rb
# Redis — most common, battle-tested
config.cache_store = :redis_cache_store, { url: ENV["REDIS_URL"], expires_in: 1.hour }
# Solid Cache (Rails 8, database-backed — no Redis needed)
config.cache_store = :solid_cache_store
# Memcached
config.cache_store = :mem_cache_store, "localhost:11211"
# In-memory (development only — per-process, not shared)
config.cache_store = :memory_store, { size: 64.megabytes }
For development, enable caching with bin/rails dev:cache (toggles a file that development config checks). This lets you test cache behavior locally without a separate Redis instance.
Low-Level Caching
Rails.cache.fetch is the foundation: read from cache, and if it’s a miss, execute the block and store the result.
Example:
def expensive_report_data
  Rails.cache.fetch("report/#{Date.today}", expires_in: 6.hours) do
    Report.generate_summary # expensive database aggregation
  end
end
On first call: generates the report, stores it. On subsequent calls within 6 hours: returns the cached value immediately.
Example:
# Fetch with dynamic key and forced refresh
def user_dashboard_stats(user)
  Rails.cache.fetch(["dashboard", user.id, user.updated_at.to_i], expires_in: 30.minutes) do
    {
      order_count: user.orders.count,
      total_spent: user.orders.sum(:total),
      recent_items: user.recent_items(limit: 5)
    }
  end
end
Including user.updated_at.to_i in the cache key means the cache auto-invalidates when the user record changes. No manual cache busting needed — the key changes, the old entry expires naturally.
Fragment Caching in Views
Fragment caching stores rendered HTML snippets. Wrap expensive view partials in cache blocks:
Example:
<%# app/views/products/_product.html.erb %>
<% cache product do %>
  <div class="product">
    <h2><%= product.name %></h2>
    <p><%= product.description %></p>
    <%= render product.reviews %>
  </div>
<% end %>
cache product uses the model’s cache_key_with_version — which includes the updated_at timestamp — as the cache key. When the product is updated, the key changes and the fragment is regenerated. No explicit expiry needed.
You can also cache a whole collection as a single fragment; the key combines every record's cache key, so a change to any product regenerates the entire block:
Example:
<% cache @products do %>
  <%= render @products %>
<% end %>
Or more commonly, cache each item individually:
Example:
<%= render partial: "product", collection: @products, cached: true %>
cached: true tells Rails to use fragment caching for each item in the collection — it batches the cache reads in a single call rather than one per item.
Russian Doll Caching
Russian doll caching is fragment caching where outer fragments contain inner fragments. When an inner record changes, only that fragment regenerates — the outer fragment stays cached:
Example:
<%# app/views/orders/show.html.erb %>
<% cache @order do %>
  <h1>Order #<%= @order.id %></h1>
  <% @order.line_items.each do |item| %>
    <% cache item do %>
      <div class="line-item">
        <%= item.product.name %> × <%= item.quantity %>
        <span>$<%= item.subtotal %></span>
      </div>
    <% end %>
  <% end %>
<% end %>
If one line item changes, its inner fragment regenerates. But the outer @order cache key includes @order.updated_at — if only a line item changes without updating the order’s timestamp, the outer cache is stale.
The fix: use touch to propagate changes up the association:
Example:
class LineItem < ApplicationRecord
  belongs_to :order, touch: true
end
touch: true updates the order’s updated_at when a line item changes, invalidating the outer fragment.
HTTP Caching with ETags and Last-Modified
HTTP caching reduces server load by telling browsers and CDNs when content hasn’t changed:
Example:
# app/controllers/products_controller.rb
def show
  @product = Product.find(params[:id])

  if stale?(@product)
    respond_to do |format|
      format.html
      format.json { render json: @product }
    end
  end
end
stale? checks If-Modified-Since and If-None-Match headers. If the content hasn’t changed, it returns 304 Not Modified without executing the block — saving rendering time and bandwidth. Rails sets ETag and Last-Modified headers automatically from the model.
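Conceptually, the ETag side of that handshake looks like the following toy sketch in plain Ruby. This is illustrative only, not Rails' implementation: Rails derives a weak ETag from the model's cache key rather than hashing the rendered body.

```ruby
require "digest"

# The server fingerprints the response; the client echoes the fingerprint
# back on its next request via the If-None-Match header.
body = %({"id": 1, "name": "Widget"})
etag = %("#{Digest::MD5.hexdigest(body)}")

if_none_match = etag # what the browser cached from the previous response
status = if_none_match == etag ? 304 : 200 # 304: skip re-sending the body
```

If the fingerprints match, the server answers 304 with an empty body and the browser reuses its local copy.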
For index pages with multiple records:
Example:
def index
  @products = Product.active.order(:created_at)
  fresh_when(etag: @products, last_modified: @products.maximum(:updated_at))
end
Pro-Tip: Prefer cache_key_with_version (which Rails uses internally for fragment caching) over rolling your own key based on ID and timestamp. It handles edge cases — records with nil timestamps, version counters — that custom keys miss. When you do write low-level cache keys manually, include enough context to make the key unique: model class, ID, relevant timestamp, and any scope parameters that affect the output.
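When a manual key is unavoidable, a hypothetical helper might look like this (the name and parameters are illustrative, not a Rails API; Rails joins array elements into a single key string on write):

```ruby
# Hypothetical helper: compose a low-level cache key as an array of
# model class, id, timestamp, and any scope that changes the output.
def manual_cache_key(model_name, id, updated_at, scope)
  [model_name.downcase, id, updated_at.to_i, scope]
end

key = manual_cache_key("Product", 42, Time.at(1_700_000_000), "currency:eur")
```

Because the timestamp is part of the key, updating the record yields a new key and the stale entry simply ages out.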
Cache Invalidation
The hard part. Common strategies, from simplest to most involved:
Time-based expiry — simplest, occasional staleness acceptable. Use for data that changes infrequently and where brief staleness is fine.
Key-based invalidation — include updated_at in the key. Automatic, no explicit clearing needed. Only works when you know which records affect the cached value.
Explicit deletion — Rails.cache.delete(key) in callbacks or service objects. Use when you need immediate consistency and can pinpoint the key.
Namespace versioning — increment a version key; all keys with the old version become cache misses. Useful for bulk invalidation (deploy a new version and clear all product caches).
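Namespace versioning is easiest to see with a toy store: below, a plain hash stands in for Rails.cache, purely as a sketch of the mechanism.

```ruby
# Toy cache demonstrating namespace versioning: bumping the version
# prefix orphans every old entry without deleting anything.
class VersionedCache
  def initialize
    @store = {}
    @version = 1
  end

  def fetch(key)
    versioned = "v#{@version}/#{key}"
    @store.key?(versioned) ? @store[versioned] : (@store[versioned] = yield)
  end

  # All keys under the old version become misses on the next fetch.
  def bump!
    @version += 1
  end
end

cache = VersionedCache.new
first = cache.fetch("products/1") { "old data" }
cache.bump!
second = cache.fetch("products/1") { "new data" } # old entry is orphaned
```

The orphaned entries are never read again and eventually expire via TTL or eviction, which is why this pattern suits bulk invalidation on deploy.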
Conclusion
Rails caching is most effective when you start with a specific bottleneck — a slow query, an expensive template render, a repeating API call — and apply the appropriate tool: low-level fetch for computed data, fragment caching for views, Russian doll for nested content, HTTP caching for CDN-friendly endpoints. Adding caching speculatively without profiling first adds complexity without measured benefit. Profile, identify the hotspot, cache it specifically, and verify the improvement.
FAQs
Q1: How do I know what’s in the cache?
Rails.cache.read(key) reads a specific key. For Redis backends, redis-cli keys "*" lists all keys (not safe in production with large caches — use SCAN instead). In development, Rails logs cache hits and misses once caching is enabled with bin/rails dev:cache.
Q2: Should I cache at the database level or Rails level?
Both, for different things. Database-level caching (query cache, index optimization, connection pooling) is the foundation. Rails-level caching layers on top for expensive aggregations, rendered HTML, and external API responses. They complement each other.
Q3: What’s the difference between expires_in and expires_at?
expires_in: 1.hour sets expiry relative to now. expires_at: tomorrow.noon sets an absolute expiry time. Use expires_in for rolling expiry (cache stays alive as long as it’s accessed); use expires_at when you need content to refresh at a specific time (nightly reports, daily digests).
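As a hedged sketch of the contrast (assuming Rails 7.1 or later, which added the expires_at option, and app-defined compute_stats and digest_html):

```ruby
# Relative: entry expires one hour after it was written.
Rails.cache.fetch("stats/today", expires_in: 1.hour) { compute_stats }

# Absolute: entry expires at noon tomorrow no matter when it was written;
# suited to nightly reports and daily digests.
Rails.cache.write("digest/today", digest_html, expires_at: Date.tomorrow.noon)
```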
Q4: How do I cache across multiple servers?
Use a shared cache store — Redis, Memcached, or Solid Cache with a shared database. In-memory store is per-process and doesn’t share across app servers. A shared cache is essential for multi-server production deployments.
Q5: Can caching cause data consistency issues?
Yes. Stale caches serve outdated data. The touch: true association option propagates invalidation up relation chains. after_commit callbacks (not after_save) for explicit cache clearing ensure the database write has committed before invalidating. Test cache invalidation paths the same way you test business logic.
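A sketch of that callback pattern (the Product model and cache key are illustrative, not from the article):

```ruby
class Product < ApplicationRecord
  # after_commit fires only once the transaction has committed; an
  # after_save would run inside the still-open transaction, so another
  # process could re-populate the cache with data that later rolls back.
  after_commit :bust_summary_cache

  private

  def bust_summary_cache
    Rails.cache.delete("product_summary/#{id}")
  end
end
```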