
Reducing Our Cache by 98.741% – Proof in Real Screenshot

If you’re like me, dealing with slow loading times on your website is incredibly frustrating. One of the biggest contributors can be an excessive amount of cache stored on your server. In my case, I was able to cut our cache by a whopping 98.741%, drastically improving our website’s performance. In this tutorial, I’ll walk you through the steps I took to achieve that reduction.

Step 1: Analyze your current cache situation
The first step in cutting down your cache is to understand exactly how much cache is currently being stored on your server. There are various tools available online that can help you with this, but I personally used a tool called Google PageSpeed Insights. This tool provides valuable insights into your website’s performance, including information about your cache size.
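If you’d rather script the audit than click through the web UI, PageSpeed Insights also exposes an HTTP API. Here’s a minimal sketch of building a request URL for it — the v5 endpoint and parameter names are my assumption to verify against Google’s documentation, and the API key is your own:

```javascript
// Sketch: build a PageSpeed Insights v5 API request URL so the audit
// can run from a script. "strategy" is "mobile" or "desktop".
function pagespeedUrl(siteUrl, apiKey, strategy = "mobile") {
  const params = new URLSearchParams({
    url: siteUrl,
    key: apiKey,
    strategy,
  });
  return `https://www.googleapis.com/pagespeedonline/v5/runPagespeed?${params}`;
}
```

Fetching that URL returns a JSON report you can diff over time, which is handy once you start making changes in the later steps.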

Step 2: Identify unnecessary cache
Once you have a clear picture of how much cache is being stored on your server, the next step is to identify any unnecessary cache that can be safely removed. This can include outdated files, duplicate files, or files that are no longer being used on your website.
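As a rough illustration of that triage, here’s a sketch that flags entries which are either idle past a cutoff or duplicates of content already cached. The entry shape (`key`, `lastAccess`, `hash`) is hypothetical — substitute whatever metadata your cache actually records:

```javascript
// Sketch: identify cache entries that are safe to evict, assuming each
// entry records its last access time and a hash of its content.
function findRemovableEntries(entries, now, maxIdleMs) {
  const seenHashes = new Set();
  const removable = [];
  for (const entry of entries) {
    if (now - entry.lastAccess > maxIdleMs) {
      removable.push(entry.key); // stale: not accessed recently
    } else if (seenHashes.has(entry.hash)) {
      removable.push(entry.key); // duplicate of content already kept
    } else {
      seenHashes.add(entry.hash); // fresh and unique: keep it
    }
  }
  return removable;
}
```

Running a pass like this before deleting anything gives you a reviewable list instead of a blind purge.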

Step 3: Update your caching settings
Next, you’ll want to review your caching settings and make sure they are optimized for performance. This can include setting expiration dates for your cache files, enabling compression, and using browser caching to store static files locally on your visitors’ devices.
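Expiration policies usually differ by asset type, so one way to organize this is a small helper that picks a `Cache-Control` header per path. This is a sketch — the max-age values are illustrative, not recommendations, and it assumes your static assets are fingerprinted so they can be cached aggressively:

```javascript
// Sketch: choose a Cache-Control header based on the requested path.
function cacheControlFor(path) {
  if (/\.(css|js|png|jpg|jpeg|gif|svg|woff2?)$/.test(path)) {
    // Fingerprinted static assets can be cached for a long time.
    return "public, max-age=31536000, immutable";
  }
  if (path.endsWith(".html") || !path.includes(".")) {
    // HTML should be revalidated so visitors always see fresh content.
    return "no-cache";
  }
  return "public, max-age=3600"; // everything else: a modest default
}
```

You would wire this into whatever server or middleware sets your response headers; the point is that one blanket expiration rule rarely fits every file.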

Step 4: Implement lazy loading
Lazy loading is a technique that delays the loading of non-essential resources on your website until they are actually needed. This can help reduce the amount of cache being stored on your server and improve loading times for your visitors. There are many plugins available for popular content management systems like WordPress that can help you implement lazy loading on your website.
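If you’re not using a plugin, modern browsers support native lazy loading via the `loading="lazy"` attribute on images. As a rough sketch of what such a plugin does during rendering — a regex rewrite like this is only illustrative and wouldn’t survive every edge case in real HTML:

```javascript
// Sketch: add native lazy loading to <img> tags in an HTML string,
// skipping tags that already declare a loading attribute.
function addLazyLoading(html) {
  return html.replace(/<img (?![^>]*loading=)/g, '<img loading="lazy" ');
}
```

A production implementation would use a real HTML parser, but the effect is the same: below-the-fold images wait until the visitor scrolls near them.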

Step 5: Clean up your database
In addition to cleaning up your cache files, it’s also important to regularly clean up your website’s database. This can help reduce the amount of unnecessary data being stored on your server and improve overall performance. There are many plugins available that can help you clean up your database, or you can manually remove unused data from your database using phpMyAdmin.
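For WordPress specifically, expired transients are a common source of database bloat. Here’s a sketch of building the cleanup SQL you could run in phpMyAdmin — it assumes the default `wp_` table prefix, and you should back up the database before running any DELETE:

```javascript
// Sketch: cleanup SQL for expired WordPress transient timeout rows.
// Underscores are escaped because "_" is a wildcard in SQL LIKE.
function expiredTransientsSql(prefix, nowEpoch) {
  return (
    `DELETE FROM ${prefix}options ` +
    `WHERE option_name LIKE '\\_transient\\_timeout\\_%' ` +
    `AND option_value < ${nowEpoch};`
  );
}
```

If you’re unsure, prefer a maintenance plugin over hand-written queries; this is only meant to show what those plugins do under the hood.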

Step 6: Monitor and optimize
Once you’ve implemented these steps, it’s important to regularly monitor your website’s performance and make adjustments as needed. Keep an eye on your website’s loading times, cache size, and overall performance to ensure that your optimizations are having the desired effect.
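A simple number to watch is the cache hit ratio: if it falls sharply after a cleanup, you probably evicted entries that were still earning their keep. A minimal sketch of tracking it:

```javascript
// Sketch: a tiny counter for cache effectiveness. Log the ratio
// periodically and compare it before and after each optimization.
class CacheStats {
  constructor() {
    this.hits = 0;
    this.misses = 0;
  }
  record(hit) {
    hit ? this.hits++ : this.misses++;
  }
  hitRatio() {
    const total = this.hits + this.misses;
    return total === 0 ? 0 : this.hits / total;
  }
}
```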

By following these steps, you can greatly reduce the amount of cache being stored on your server and improve your website’s performance. In my case, cutting our cache by 98.741% made a huge difference in our website’s loading times and overall user experience. I hope this tutorial helps you achieve similar results on your own website!

48 Comments
@sahilaggarwal2004
3 months ago

This seems much better than mutating the standard fetch API. I think Vercel should make this a standard for caching in Next.js

@gr-lf9ul
3 months ago

that moment when you guesstimated instead of piping through wc -c… it hurt

@rijkvanwel
3 months ago

Also, when you have to go through this much trouble to use their (paid) cache functionality — why not at that point just use Redis or sth

@AaronHarding-u6q
3 months ago

11:39 how do you get next.js to cache components? 😳

@Seedwreck
3 months ago

Uh, no.
Cache API is great.

@Strammeiche
3 months ago

Ah yes, complex unstable solutions for simple problems

@NotZeldaLive
3 months ago

This does have the unintended consequence of using the SWR (Stale-While-Revalidate) model though. The first time there is a cache miss, that request will be given the old data, and it will not wait on the new fetch. Only subsequent users will get the updated data once the fetch is completed successfully.

@DarkChaosMC
3 months ago

Happy to see a lack of hate comments from DarkVipers community.

@rikschaaf
3 months ago

So:
– if you just want to add caching, add a simple fetch cache. No need to optimize yet.
– if you notice your caching impacts your bandwidth (or wallet) too much, optimize your caches, as explained in this video
That, or just use the @Cached annotation on the method in question that handles all this for you. JS/TS has that functionality, right? Right?

@bartekmajster6734
3 months ago

where does nextjs store cached data?

@thatguy2567
3 months ago

Whatever Junior dev actually implemented this is probably crying themselves to sleep tonight feeling called out in this vid, lol oops

@masterflitzer
3 months ago

i mean caching only what you need is pretty obvious, the question is why is the right thing still unstable while everybody recommends a shitty solution?

@thatguy2567
3 months ago

let varName = ""; and then setting it from within a loop with a ton of ifs is a HUGE code smell

@HyproTube
3 months ago

IMO this is what we get for using "magic" frameworks and libraries that abstract away all of the details of what's going on. Sure, enable caching by importing a library and adding the cache annotation to a function or whatever "for free". I miss the days when developers had to think about what was actually going on when they used someone else's code. I guess at most places it may not be worth the time spent understanding, which, if true, would be sad 🤷

@akashthoriya
3 months ago

Can you create a video on "Maestro: Netflix’s Workflow Orchestrator"

@ws_stelzi79
3 months ago

Oh, JS devs are a funny bunch of people! Wondering why, when caching the full response, they use GIGABYTES for just a silly non-core function that wraps npm calls!

@quemediga
3 months ago

this is a great win. Now we have low code issues to solve 👏👏👏

@pokalen
3 months ago

Don't you own the package you're querying version info about? Couldn't you instead add a github workflow that stores these versions somewhere whenever you publish the package? It could even just be setting them as ENV vars for your backend, removing the need for a cache completely.

@neofox2526
3 months ago

I agree with the others. Make a Lambda and set it to go every 24 hours and put it in the DB or another store.

@evansmaina9902
3 months ago

why not use upstash redis