Software Architecture Interview Questions
Question

Explain what Cache Stampede is.

Answer

Cache Stampede, also known as the thundering herd problem, occurs in systems that rely on caching to improve performance and reduce load on underlying data sources such as databases or external APIs. It arises when many concurrent requests look up the same piece of data and all find it missing or expired at the same moment. Every one of those requests then bypasses the cache and hits the underlying data source directly, causing a sudden surge in load. The result is significant performance degradation and increased latency, and in extreme cases the data source can be overwhelmed, leading to system instability or failure.

The primary causes of Cache Stampede include:

  1. Cache Expiration: When cached data has a fixed expiration time or a time-to-live (TTL) value, all the requests that arrive after the expiration have to fetch the data from the data source[2].
  2. Cache Cold Start: When the cache is empty or has not been populated yet, all the requests have to fetch the data from the data source[1].
  3. Cache Invalidation: When cached data becomes stale or inconsistent due to changes in the data source, all the requests that need the updated data have to fetch it from the data source[2].
  4. High Contention: In scenarios where multiple concurrent requests are accessing the same resource, such as a frequently accessed cache entry, the likelihood of cache stampede increases[2].

To mitigate the Cache Stampede problem, various strategies can be employed:

  1. Locks: A lock ensures that only one request can access or update a shared resource at a time. In the context of caching, a per-key lock prevents multiple requests from fetching the same data from the data source simultaneously[1] (see the first sketch after this list).
  2. Probabilistic Early Expiration: Each process makes an independent probabilistic decision to recompute the cache value before it expires, spreading the recomputation requests out over time instead of concentrating them at the expiry instant[3][4] (see the second sketch after this list).
  3. Stale-While-Revalidate: This technique serves the existing (stale) cached data to users while asynchronously refreshing it in the background, so requests never pile up waiting for a synchronous recomputation (see the third sketch after this list).
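
A minimal per-key locking sketch in Python, assuming an in-process dict cache and a placeholder fetch_from_source function (both hypothetical names). A real deployment would typically use a distributed lock (for example in Redis) rather than threading.Lock, but the shape is the same: only the lock holder recomputes the value, everyone else waits and then reads the repopulated entry.

```python
import threading
import time

cache = {}                 # key -> (value, expires_at); hypothetical in-process cache
_key_locks = {}            # key -> threading.Lock
_key_locks_guard = threading.Lock()

def _lock_for(key):
    # Lazily create one lock per cache key.
    with _key_locks_guard:
        return _key_locks.setdefault(key, threading.Lock())

def fetch_from_source(key):
    # Placeholder for the expensive call (database query, external API, ...).
    time.sleep(0.5)
    return f"value-for-{key}"

def get(key, ttl=60):
    entry = cache.get(key)
    if entry and entry[1] > time.time():
        return entry[0]                      # cache hit

    # Cache miss: only one thread per key is allowed to hit the data source.
    with _lock_for(key):
        # Re-check: another thread may have repopulated the entry while we waited.
        entry = cache.get(key)
        if entry and entry[1] > time.time():
            return entry[0]
        value = fetch_from_source(key)
        cache[key] = (value, time.time() + ttl)
        return value
```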
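A sketch of probabilistic early expiration in the style of the XFetch approach described in the literature, again with hypothetical cache and recompute names: a request that finds a still-valid entry occasionally decides to refresh it early, with a probability that grows as expiry approaches and as the recomputation cost (delta) grows.

```python
import math
import random
import time

cache = {}  # key -> (value, delta, expires_at); delta = seconds the last recompute took

def recompute(key):
    # Placeholder for the expensive recomputation.
    time.sleep(0.2)
    return f"value-for-{key}"

def get(key, ttl=60, beta=1.0):
    now = time.time()
    entry = cache.get(key)
    if entry:
        value, delta, expires_at = entry
        # -delta * beta * log(x), with x in (0, 1], is a positive, randomly sized
        # head start: the longer the recompute takes (delta), the earlier we tend
        # to refresh. If "now plus head start" is still before expiry, serve the
        # cached value; otherwise fall through and recompute early.
        if now - delta * beta * math.log(1.0 - random.random()) < expires_at:
            return value
    start = time.time()
    value = recompute(key)
    delta = time.time() - start
    cache[key] = (value, delta, start + ttl)
    return value
```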
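A stale-while-revalidate sketch under the same assumptions: stale entries are served immediately, and at most one background thread per key refreshes the value. Only the very first request for a key ever waits on the data source.

```python
import threading
import time

cache = {}            # key -> (value, fresh_until)
_refreshing = set()   # keys currently being refreshed in the background
_refresh_guard = threading.Lock()

def fetch_from_source(key):
    # Placeholder for the expensive call.
    time.sleep(0.5)
    return f"value-for-{key}"

def _refresh(key, ttl):
    try:
        value = fetch_from_source(key)
        cache[key] = (value, time.time() + ttl)
    finally:
        with _refresh_guard:
            _refreshing.discard(key)

def get(key, ttl=60):
    entry = cache.get(key)
    if entry:
        value, fresh_until = entry
        if fresh_until <= time.time():
            # Entry is stale: kick off at most one background refresh,
            # but keep serving the stale value immediately.
            with _refresh_guard:
                if key not in _refreshing:
                    _refreshing.add(key)
                    threading.Thread(target=_refresh, args=(key, ttl), daemon=True).start()
        return value
    # First-ever request for this key still pays the cost synchronously.
    value = fetch_from_source(key)
    cache[key] = (value, time.time() + ttl)
    return value
```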

Suggested interview questions

expert: Where DTO should be implemented, in a Domain Layer or in an Application Service Layer? Explain

middle: What is a Model in DDD?

junior: What is meant by the KISS principle?

Comments

No comments yet