Add a memory cache local to the thread to reduce Redis load
Loading `ApplicationSetting` from Redis was responsible for at least 50% of the CPU load of the Redis cluster on GitLab.com. Since these values generally don't change very often, we can load them from the database and cache them in memory, skipping Redis altogether. We use `ActiveSupport::Cache::MemoryStore` as a drop-in replacement for `RedisCacheStore`, even though we probably don't need synchronized access within `Thread.current`.

Closes https://gitlab.com/gitlab-org/gitlab-ce/issues/63977
parent df0be8b226
commit 978647c6cb
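For context, here is a minimal standalone sketch of the behaviour this commit relies on (only `Gitlab::ThreadMemoryCache` and `ActiveSupport::Cache::MemoryStore` come from the diff below; the cache key and value are made up): each thread lazily builds its own `MemoryStore`, so repeated reads within one thread stay in process memory and never touch Redis, while another thread simply starts with an empty cache.

require 'active_support'
require 'active_support/cache'

module Gitlab
  class ThreadMemoryCache
    THREAD_KEY = :thread_memory_cache

    # Same shape as the class added in this commit: one MemoryStore per thread.
    def self.cache_backend
      Thread.current[THREAD_KEY] ||= ActiveSupport::Cache::MemoryStore.new
    end
  end
end

# Writes and reads for this thread stay in process memory.
Gitlab::ThreadMemoryCache.cache_backend.write('application_setting', { signup_enabled: true }, expires_in: 60) # seconds; cache! in the concern uses 1.minute
p Gitlab::ThreadMemoryCache.cache_backend.read('application_setting') # => {:signup_enabled=>true}

# A different thread gets its own, initially empty store.
Thread.new do
  p Gitlab::ThreadMemoryCache.cache_backend.read('application_setting') # => nil
end.join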
@@ -272,4 +272,12 @@ class ApplicationSetting < ApplicationRecord
     # We already have an ApplicationSetting record, so just return it.
     current_without_cache
   end
+
+  # By default, the backend is Rails.cache, which uses
+  # ActiveSupport::Cache::RedisStore. Since loading ApplicationSetting
+  # can cause a significant amount of load on Redis, let's cache it in
+  # memory.
+  def self.cache_backend
+    Gitlab::ThreadMemoryCache.cache_backend
+  end
 end
@@ -36,7 +36,7 @@ module CacheableAttributes
     end

     def retrieve_from_cache
-      record = Rails.cache.read(cache_key)
+      record = cache_backend.read(cache_key)
       ensure_cache_setup if record.present?

       record
@@ -58,7 +58,7 @@ module CacheableAttributes
     end

     def expire
-      Rails.cache.delete(cache_key)
+      cache_backend.delete(cache_key)
     rescue
       # Gracefully handle when Redis is not available. For example,
       # omnibus may fail here during gitlab:assets:compile.
@@ -69,9 +69,13 @@ module CacheableAttributes
       # to be loaded when read from cache: https://github.com/rails/rails/issues/27348
       define_attribute_methods
     end
+
+    def cache_backend
+      Rails.cache
+    end
   end

   def cache!
-    Rails.cache.write(self.class.cache_key, self, expires_in: 1.minute)
+    self.class.cache_backend.write(self.class.cache_key, self, expires_in: 1.minute)
   end
 end
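To make the indirection in the two changes above concrete, here is a stripped-down, hypothetical sketch (the `SettingsCache` and `ThreadLocalSettingsCache` classes and their keys are invented; a plain `MemoryStore` stands in for the Redis-backed `Rails.cache` so the sketch runs on its own): every read, write, and delete goes through a single `cache_backend` class method, so overriding just that method, as `ApplicationSetting.cache_backend` does in the first hunk, is enough to move the whole cache off Redis.

require 'active_support'
require 'active_support/cache'

class SettingsCache
  CACHE_KEY = 'settings'

  # Default backend; in GitLab this is Rails.cache (Redis-backed). A local
  # MemoryStore stands in here so the example is self-contained.
  def self.cache_backend
    @default_backend ||= ActiveSupport::Cache::MemoryStore.new
  end

  def self.retrieve_from_cache
    cache_backend.read(CACHE_KEY)
  end

  def self.cache!(record)
    cache_backend.write(CACHE_KEY, record, expires_in: 60)
  end

  def self.expire
    cache_backend.delete(CACHE_KEY)
  end
end

class ThreadLocalSettingsCache < SettingsCache
  # Overriding this one method redirects every cache operation above to a
  # per-thread store, mirroring what ApplicationSetting.cache_backend does.
  def self.cache_backend
    Thread.current[:settings_cache] ||= ActiveSupport::Cache::MemoryStore.new
  end
end

ThreadLocalSettingsCache.cache!(signup_enabled: true)
p ThreadLocalSettingsCache.retrieve_from_cache # => {:signup_enabled=>true}
ThreadLocalSettingsCache.expire
p ThreadLocalSettingsCache.retrieve_from_cache # => nil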
@@ -0,0 +1,5 @@
+---
+title: Add a memory cache local to the thread to reduce Redis load
+merge_request: 30233
+author:
+type: performance
@@ -0,0 +1,3 @@
+# frozen_string_literal: true
+
+Gitlab::ThreadMemoryCache.cache_backend
@@ -0,0 +1,15 @@
+# frozen_string_literal: true
+
+module Gitlab
+  class ThreadMemoryCache
+    THREAD_KEY = :thread_memory_cache
+
+    def self.cache_backend
+      # Note ActiveSupport::Cache::MemoryStore is thread-safe. Since
+      # each backend is local per thread we probably don't need to worry
+      # about synchronizing access, but this is a drop-in replacement
+      # for ActiveSupport::Cache::RedisStore.
+      Thread.current[THREAD_KEY] ||= ActiveSupport::Cache::MemoryStore.new
+    end
+  end
+end
@@ -139,6 +139,8 @@ RSpec.configure do |config|
     allow(Feature).to receive(:enabled?)
       .with(:force_autodevops_on_by_default, anything)
       .and_return(false)
+
+    Gitlab::ThreadMemoryCache.cache_backend.clear
   end

   config.around(:example, :quarantine) do
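Since the new store lives in `Thread.current` and RSpec examples ordinarily run on the same thread, the `clear` added to the spec setup above (presumably a per-example `before` hook, given the surrounding `Feature` stubs) keeps one example's cached values from leaking into the next. A hypothetical spec (the description and key are invented) illustrating that isolation:

require 'spec_helper'

RSpec.describe 'thread-local cache isolation' do
  it 'caches a value within a single example' do
    Gitlab::ThreadMemoryCache.cache_backend.write('example_key', 'cached')

    expect(Gitlab::ThreadMemoryCache.cache_backend.read('example_key')).to eq('cached')
  end

  it 'does not see values written by earlier examples' do
    # Passes because spec_helper clears the backend before each example.
    expect(Gitlab::ThreadMemoryCache.cache_backend.read('example_key')).to be_nil
  end
end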