#6 Caching Basics & Strategies – Why Cache? Cache Invalidation, Write-Through, Write-Back

The Slow Website Problem

Aditi was excited about her new blog. She posted high-quality content, but visitors complained about slow loading times.

Her developer explained the issue: Every time someone visited, the website fetched data from the database. That took time.

The solution? Caching. By storing frequently accessed data closer to the user, responses became nearly instant.

What is Caching?

Caching is the process of storing copies of frequently accessed data in a fast-access location.

Instead of fetching data from a slow database or remote server, the system serves data from the cache, improving response times.

Why Cache?

  1. Speed – Reduces response time significantly.

  2. Reduced Load – Relieves database and server pressure.

  3. Cost Efficiency – Minimizes expensive database queries.

  4. Scalability – Handles high traffic efficiently.

Without caching, every request queries the database, slowing everything down.

Types of Caching

1. Client-Side Caching

Stored in the user's browser (e.g., images, CSS, JavaScript).

Example: A website logo loads instantly because it's cached locally.
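Client-side caching is usually driven by HTTP headers the server sends. As a rough sketch (the Flask app and the static/logo.png path here are just placeholders, not a prescribed setup), the server can tell the browser it may reuse the logo for 24 hours instead of re-downloading it:

```python
from flask import Flask, send_file

app = Flask(__name__)

@app.route("/logo.png")
def logo():
    # Serve the logo and tell the browser it may reuse it for 24 hours
    response = send_file("static/logo.png")
    response.headers["Cache-Control"] = "public, max-age=86400"
    return response
```

On repeat visits the browser serves the file from its local cache and never hits the server at all.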

2. Server-Side Caching

Stored on the backend to reduce database queries.

Example: Frequently accessed user profiles are cached in memory.
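A minimal sketch of this idea, assuming a hypothetical fetch_profile_from_db() helper: profiles are kept in an in-process dictionary, so repeated lookups for the same user skip the database entirely.

```python
profile_cache = {}  # user_id -> profile dict, held in application memory

def fetch_profile_from_db(user_id):
    # Stand-in for a real (slow) database query
    return {"id": user_id, "name": f"user-{user_id}"}

def get_profile(user_id):
    if user_id in profile_cache:               # cache hit: no database round trip
        return profile_cache[user_id]
    profile = fetch_profile_from_db(user_id)   # cache miss: query the database
    profile_cache[user_id] = profile           # store it for the next request
    return profile
```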

3. Distributed Caching

Data is cached across multiple servers for scalability.

Example: Netflix caches trending movies across data centres.
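The next post covers this in depth, but as a quick taste using the redis-py client (the hostname and key name below are placeholders), any application server that connects to the same Redis instance sees the same cached value:

```python
import redis

# Every app server pointing at this Redis host shares the same cache
r = redis.Redis(host="cache.example.internal", port=6379, decode_responses=True)

r.setex("trending:movies", 300, "movie-1,movie-2,movie-3")  # expires after 5 minutes
print(r.get("trending:movies"))
```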

Cache Invalidation – Keeping Data Fresh

Caching is great, but what if data changes? Cache Invalidation removes stale data.

1. Time-Based Expiry (TTL – Time to Live)

  • Cached data expires after a set time.

  • Example: Weather data refreshes every 30 minutes.
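A minimal sketch of time-based expiry, assuming a hypothetical fetch_weather() call: each cached entry carries a deadline, and anything past its deadline is refetched.

```python
import time

TTL_SECONDS = 30 * 60   # keep weather data for 30 minutes
weather_cache = {}      # city -> (data, expires_at)

def fetch_weather(city):
    # Stand-in for a real weather API call
    return {"city": city, "temp_c": 21}

def get_weather(city):
    entry = weather_cache.get(city)
    if entry and entry[1] > time.time():    # still fresh: serve from cache
        return entry[0]
    data = fetch_weather(city)              # missing or expired: refetch
    weather_cache[city] = (data, time.time() + TTL_SECONDS)
    return data
```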

2. Write Invalidation

  • The cache entry is updated or evicted whenever the underlying data is modified in the database.

  • Example: A user updates their email, so the cache updates too.
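A minimal sketch, assuming a hypothetical update_email_in_db() helper: the stale cache entry is evicted in the same code path as the database write, so the next read repopulates it with fresh data.

```python
user_cache = {}   # user_id -> cached user record

def update_email_in_db(user_id, new_email):
    # Stand-in for the real UPDATE statement
    pass

def update_email(user_id, new_email):
    update_email_in_db(user_id, new_email)   # write the source of truth first
    user_cache.pop(user_id, None)            # evict the stale entry; next read refills it
```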

3. Manual Invalidation

  • Developers clear caches manually when needed.

  • Example: Clearing product listings cache after a big sale.
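A minimal sketch of a manual flush, reusing the in-memory cache idea from above (the key prefix is illustrative): an operator or deploy script wipes every product entry after bulk price changes.

```python
product_cache = {}   # "product:<id>" -> cached product details

def clear_product_cache():
    # Run manually (e.g., from an admin command) after a big sale goes live
    for key in list(product_cache):
        if key.startswith("product:"):
            del product_cache[key]
```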

Caching Strategies

1. Write-Through Caching – Immediate Sync

  • Data is written to both the cache and the database simultaneously.

  • Ensures consistency but can slow down writes.

Example: A banking app updates account balances in both the cache and the database.

✔ Pros: Data is always fresh. ✖ Cons: Slower write operations.
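A minimal write-through sketch, assuming a hypothetical save_balance_to_db() helper: every write goes to both the database and the cache before the call returns, so reads never see a stale balance.

```python
balance_cache = {}   # account_id -> balance

def save_balance_to_db(account_id, balance):
    # Stand-in for the real database write
    pass

def set_balance(account_id, balance):
    save_balance_to_db(account_id, balance)   # write the database...
    balance_cache[account_id] = balance       # ...and the cache in the same operation

def get_balance(account_id):
    return balance_cache.get(account_id)      # reads are served straight from the cache
```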

2. Write-Back Caching – Faster Writes

  • Data is first written to the cache, then asynchronously updated in the database.

  • Improves write speed but risks data loss if the cache fails.

Example: Online gaming leaderboards store scores in cache before updating the database.

✔ Pros: Faster write performance. ✖ Cons: Risk of data loss if the cache crashes before changes are flushed to the database.
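A minimal write-back sketch, assuming a hypothetical flush_scores_to_db() batch writer: score updates only touch memory, and a background thread periodically pushes the accumulated changes to the database. If the process dies between flushes, those scores are lost, which is exactly the trade-off noted above.

```python
import threading
import time

score_cache = {}   # player_id -> latest score; written first, persisted later
dirty = set()      # player ids whose scores haven't reached the database yet

def flush_scores_to_db(scores):
    # Stand-in for a batched database write
    pass

def record_score(player_id, score):
    score_cache[player_id] = score   # fast path: update memory only
    dirty.add(player_id)

def flush_loop(interval_seconds=5):
    while True:
        time.sleep(interval_seconds)
        if dirty:
            pending = {p: score_cache[p] for p in dirty}
            flush_scores_to_db(pending)   # asynchronous, batched persistence
            dirty.clear()

threading.Thread(target=flush_loop, daemon=True).start()
```

A production version would need locking around the dirty set and a shutdown flush; this sketch only shows the write-path ordering.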

Real-World Use Cases

1. E-Commerce Websites

Caching product details reduces database calls, making pages load instantly.

2. Social Media Feeds

Feeds are cached so users don’t wait for fresh content every time.

3. Streaming Services

Popular videos are cached close to users to reduce buffering.

Conclusion

Caching speeds up applications, reduces database load, and enhances scalability.

Key techniques for keeping cached data accurate include cache invalidation, write-through caching, and write-back caching.

Next, we’ll explore Distributed Caching – Redis, Memcached, CDN-based caching.
