
Reducing Our Cache by 98.741% – Proof in Real Screenshot


If you’re like me, slow loading times on your website are incredibly frustrating. One of the biggest contributors can be an excessive amount of cache stored on your server. In my case, I was able to cut our cache by a whopping 98.741%, drastically improving our website’s performance. In this tutorial, I’ll walk you through the steps I took to achieve that reduction.

Step 1: Analyze your current cache situation
The first step in cutting down your cache is to understand exactly how much is currently being stored on your server. There are various tools available online that can help with this; I personally used Google PageSpeed Insights, which provides valuable insight into your website’s performance, including caching-related audits such as static assets served without an efficient cache policy.
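
PageSpeed Insights tells you how caching affects your visitors, but it won’t tell you how many bytes are sitting in your server-side cache. For that raw number, a small script like the sketch below can total the size of a cache directory. This is only a sketch: the .next/cache default path is an assumption, so point it at whatever directory your stack actually writes to.

```ts
// measure-cache.ts: totals the size of a cache directory.
// Run with: npx tsx measure-cache.ts <cacheDir>
import { promises as fs } from "fs";
import path from "path";

// Recursively sum the size of every file under a directory.
async function dirSize(dir: string): Promise<number> {
  let total = 0;
  for (const entry of await fs.readdir(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    total += entry.isDirectory()
      ? await dirSize(full)
      : (await fs.stat(full)).size;
  }
  return total;
}

// ".next/cache" is an assumed default; pass your real cache path as an argument.
const cacheDir = process.argv[2] ?? ".next/cache";

dirSize(cacheDir)
  .then((bytes) =>
    console.log(`${cacheDir}: ${(bytes / 1024 / 1024).toFixed(2)} MB`)
  )
  .catch((err) => console.error(`Could not read ${cacheDir}:`, err));
```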

Step 2: Identify unnecessary cache
Once you have a clear picture of how much cache is being stored on your server, the next step is to identify any unnecessary cache that can be safely removed. This can include outdated files, duplicate files, or files that are no longer being used on your website.
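
One way to surface candidates is to look for cache entries that haven’t been modified in a long time. The sketch below lists files older than a configurable threshold; the 30-day cutoff and the .next/cache default path are assumptions, and it only reports files rather than deleting them, so you can review the list first.

```ts
// find-stale.ts: lists cache files that have not been modified recently.
// Files are only reported, not deleted, so they can be reviewed first.
import { promises as fs } from "fs";
import path from "path";

const MAX_AGE_DAYS = 30; // arbitrary threshold, tune it for your site

async function findStale(dir: string, cutoff: number): Promise<string[]> {
  const stale: string[] = [];
  for (const entry of await fs.readdir(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      stale.push(...(await findStale(full, cutoff)));
    } else if ((await fs.stat(full)).mtimeMs < cutoff) {
      stale.push(full); // candidate for removal
    }
  }
  return stale;
}

const cutoff = Date.now() - MAX_AGE_DAYS * 24 * 60 * 60 * 1000;

findStale(process.argv[2] ?? ".next/cache", cutoff)
  .then((files) => {
    console.log(`${files.length} files older than ${MAX_AGE_DAYS} days:`);
    files.forEach((f) => console.log(f));
  })
  .catch(console.error);
```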

Step 3: Update your caching settings
Next, you’ll want to review your caching settings and make sure they are optimized for performance. This can include setting expiration dates for your cache files, enabling compression, and using browser caching to store static files locally on your visitors’ devices.
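
As a concrete example, here is roughly what long-lived browser caching for static assets can look like in a Next.js project. This sketch assumes a Next.js version recent enough to support next.config.ts (on older versions the same shape goes in next.config.js), and the /static/:path* route pattern is just an example.

```ts
// next.config.ts: long-lived browser caching for static assets plus compression.
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  compress: true, // gzip responses (Next.js enables this by default)
  async headers() {
    return [
      {
        source: "/static/:path*", // example pattern for fingerprinted assets
        headers: [
          // Let browsers keep these files locally for a year without revalidating.
          {
            key: "Cache-Control",
            value: "public, max-age=31536000, immutable",
          },
        ],
      },
    ];
  },
};

export default nextConfig;
```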

Step 4: Implement lazy loading
Lazy loading is a technique that delays the loading of non-essential resources on your website until they are actually needed. This reduces how much your visitors have to download up front and improves loading times. There are many plugins available for popular content management systems like WordPress that can help you implement lazy loading on your website.
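
If you are on Next.js rather than a CMS, a common approach is to lazy load a heavy, non-essential component with next/dynamic so its code is only fetched when it is actually rendered. In the sketch below, the CommentSection component and its import path are made-up names for illustration.

```tsx
// pages/post.tsx: the comment widget is split into its own bundle and only
// loaded when the page renders it, instead of shipping with the initial page.
import dynamic from "next/dynamic";

// CommentSection and its path are illustrative, not real files in this project.
const CommentSection = dynamic(() => import("../components/CommentSection"), {
  loading: () => <p>Loading comments…</p>,
  ssr: false, // skip server-side rendering for this non-essential widget
});

export default function PostPage() {
  return (
    <main>
      <article>{/* post body */}</article>
      <CommentSection />
    </main>
  );
}
```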

Step 5: Clean up your database
In addition to cleaning up your cache files, it’s also important to regularly clean up your website’s database. This can help reduce the amount of unnecessary data being stored on your server and improve overall performance. There are many plugins available that can help you clean up your database, or you can manually remove unused data from your database using phpMyAdmin.
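
What the cleanup looks like depends entirely on your schema, but as a rough sketch, here is a Node script using mysql2/promise that deletes old rows from a hypothetical analytics table. The page_views table, the created_at column, and the 90-day window are all made-up examples; back up your database before running anything like this.

```ts
// cleanup-db.ts: deletes stale rows from a hypothetical analytics table.
// Table and column names are examples only; back up the database first.
import mysql from "mysql2/promise";

async function main() {
  const conn = await mysql.createConnection({
    host: "localhost",
    user: "app",
    password: process.env.DB_PASSWORD,
    database: "mysite",
  });

  // Remove analytics rows older than 90 days; they only bloat the database.
  const [result] = await conn.execute(
    "DELETE FROM page_views WHERE created_at < NOW() - INTERVAL 90 DAY"
  );
  console.log("Delete result:", result);

  await conn.end();
}

main().catch(console.error);
```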

Step 6: Monitor and optimize
Once you’ve implemented these steps, it’s important to regularly monitor your website’s performance and make adjustments as needed. Keep an eye on your website’s loading times, cache size, and overall performance to ensure that your optimizations are having the desired effect.
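
A simple way to keep an eye on cache growth is to re-run the size check from Step 1 on a schedule and warn when it crosses a threshold. The sketch below does exactly that; the 500 MB limit, the hourly interval, and the .next/cache path are assumptions to adjust for your own setup.

```ts
// monitor-cache.ts: re-checks cache size on an interval and warns over a limit.
import { promises as fs } from "fs";
import path from "path";

const CACHE_DIR = ".next/cache"; // assumed path, adjust for your stack
const LIMIT_MB = 500; // assumed budget
const CHECK_EVERY_MS = 60 * 60 * 1000; // hourly

async function dirSize(dir: string): Promise<number> {
  let total = 0;
  for (const entry of await fs.readdir(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    total += entry.isDirectory()
      ? await dirSize(full)
      : (await fs.stat(full)).size;
  }
  return total;
}

async function check() {
  const mb = (await dirSize(CACHE_DIR)) / 1024 / 1024;
  const stamp = new Date().toISOString();
  if (mb > LIMIT_MB) {
    console.warn(`[${stamp}] cache is ${mb.toFixed(1)} MB, over the ${LIMIT_MB} MB budget`);
  } else {
    console.log(`[${stamp}] cache is ${mb.toFixed(1)} MB`);
  }
}

check().catch(console.error);
setInterval(() => check().catch(console.error), CHECK_EVERY_MS);
```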

By following these steps, you can greatly reduce the amount of cache being stored on your server and improve your website’s performance. In my case, cutting our cache by 98.741% made a huge difference in our website’s loading times and overall user experience. I hope this tutorial helps you achieve similar results on your own website!


48 Comments
@sahilaggarwal2004
24 days ago

This seems much better than mutating the standard fetch API. I think Vercel should make this a standard for caching in Next.js.

@gr-lf9ul
24 days ago

that moment when you guesstimated instead of piping through wc -c… it hurt

@rijkvanwel
24 days ago

Also, when you have to go through this much trouble to use their (paid) cache functionality — why not at that point just use Redis or sth

@AaronHarding-u6q
24 days ago

11:39 how do you get next.js to cache components? 😳

@Seedwreck
24 days ago

Uh, no.
Cache API is great.

@Strammeiche
24 days ago

Ah yes, complex unstable solutions for simple problems

@NotZeldaLive
24 days ago

This does have the unintended consequence of using the SWR (Stale-While-Revalidate) model though. The first time there is a cache miss, that request will be given the old data, and it will not wait on the new fetch. Only the subsequent users will get the updated data once the fetch is completed successfully.

@DarkChaosMC
24 days ago

Happy to see a lack of hate comments from DarkVipers community.

@rikschaaf
24 days ago

So:
– if you just want to simply add cache, add a simple fetch cache. No need to optimize yet.
– if you notice your caching impacts your bandwidth (or wallet) too much, optimize your caches, as explained in this video
That, or just use the @Cached annotation on the method in question that handles all this for you. JS/TS has that functionality, right? Right?

@bartekmajster6734
24 days ago

Where does Next.js store cached data?

@thatguy2567
24 days ago

Whatever Junior dev actually implemented this is probably crying themselves to sleep tonight feeling called out in this vid, lol oops

@masterflitzer
24 days ago

I mean, caching only what you need is pretty obvious; the question is why the right thing is still unstable while everybody recommends a shitty solution?

@thatguy2567
24 days ago

let varName = ""; and then setting it from within a loop with a ton of ifs is HUGE code smell

@HyproTube
24 days ago

IMO this is what we get for using "magic" frameworks and libraries that abstract away all of the details of what's going on. Sure, enable caching by importing a library and adding the cache annotation to a function or whatever "for free". I miss the days when developers had to think about what was actually going on when they used someone else's code. I guess at most places it may not be worth the time spent understanding, which, if true, would be sad 🤷

@akashthoriya
24 days ago

Can you create a video on “Maestro: Netflix’s Workflow Orchestrator”?

@ws_stelzi79
24 days ago

Oh, JS devs are a funny bunch of people! Wondering why caching the full response uses GIGABYTES for just a silly non-core function that wraps npm calls!

@quemediga
24 days ago

this is a great win. Now we have low code issues to solve 👏👏👏

@pokalen
24 days ago

Don't you own the package you're querying version info about? Couldn't you instead add a github workflow that stores these versions somewhere whenever you publish the package? It could even just be setting them as ENV vars for your backend, removing the need for a cache completely.

@neofox2526
24 days ago

I agree with the others. Make a Lambda and set it to go every 24 hours and put it in the DB or another store.

@evansmaina9902
24 days ago

why not use upstash redis
