Nope, we solve it the same way, it's a pain. I did a presentation once for devs where I unraveled the up-to-9 different layers of caching between an end user and our website (when you take into account the browser and any tiered CDN caching)
It's a pest of a problem but pre-deploying static assets is the best answer.
On several of my modern projects, there isn't a single piece of static data that can't be cached forever in a CDN. That's because server-side code is now getting really good at managing the initial build of static assets and the delivery of their URLs.
Exactly, you’re incentivized to make your website as static as possible. I attach standard HTTP cache headers to most of the server-rendered stuff so that the responses get cached in Vercel’s CDN and, once again, the backend isn't invoked super often.
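A minimal sketch of the header approach (the values and helper name are illustrative, not Vercel-specific API): `s-maxage` applies to shared caches like a CDN but not to browsers, and `stale-while-revalidate` lets the CDN keep serving a stale copy while it refetches in the background.

```javascript
// Build a Cache-Control value that lets a CDN cache a server-rendered
// response. s-maxage targets shared caches only; stale-while-revalidate
// allows serving stale content while revalidating.
function cdnCacheHeader(sharedMaxAge = 60, staleWhileRevalidate = 86400) {
  return `public, s-maxage=${sharedMaxAge}, stale-while-revalidate=${staleWhileRevalidate}`;
}

// e.g. in a Node handler: res.setHeader("Cache-Control", cdnCacheHeader());
```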
If you weren't caching with CDN, would serving those static assets as efficiently as possible be a good reason to keep using (eg) nginx, do you think?
Oh, I guess load balancing (once you scale to multiple hosts) is another good reason; if you don't have Heroku doing it for you, nginx is a convenient way to do it just fine.
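For reference, the basic nginx setup is small; a sketch (host names and ports are placeholders):

```nginx
upstream app_servers {
    # round-robin by default; add least_conn or weight= for other strategies
    server app1.internal:3000;
    server app2.internal:3000;
}

server {
    listen 80;
    location / {
        proxy_pass http://app_servers;
    }
}
```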
Browsers could safely pull a list of very commonly requested, content-addressable resources from various CDNs and pre-cache them (independently of any request). That would even help with first-request latency, and for mobile (where bandwidth is expensive) you could do the pre-caching on Wi-Fi.
> just stick your content on a static host and you're done
1. One (relatively small) advantage is not needing to control the server, just the domain. The server then polls IPNS and pulls in the latest content.
2. There's the standardized (and perhaps more meaningful) cache-busting aspect for assets. E.g. the IPFS Companion browser extension takes all /ipfs/&lt;CID&gt;/filename.ext URLs and serves them from the local datastore instead of making network requests. This way, assets aren't re-requested from a remote each time when they're identical across multiple websites. This helps since browsers no longer share a common cache across sites, so even CDN-hosted assets get re-downloaded per site. Though, it might come with the same privacy risks.
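The interception described above boils down to recognizing the CID in the path. An illustrative sketch (not IPFS Companion's actual code; the CID below is a made-up placeholder):

```javascript
// Extract the CID from an /ipfs/<CID>/filename.ext URL so the resource can
// be served from a local datastore instead of fetched remotely.
function ipfsCid(url) {
  const m = new URL(url).pathname.match(/^\/ipfs\/([^/]+)\//);
  return m ? m[1] : null;
}

ipfsCid("https://example.com/ipfs/bafyexamplecid123/app.js"); // -> "bafyexamplecid123"
ipfsCid("https://example.com/other/app.js"); // -> null (not content addressed)
```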
https://www.stefanjudis.com/notes/say-goodbye-to-resource-ca...
> it's hard to embed and resource-hungry, which hampers adoption
It is cheaper for me to split my app into an API and a front end and put the React front end behind a CDN than to have my site be entirely uncacheable.
I can also cache certain endpoints behind the cdn that are mostly invariant for users.
And the network egress of JSON is much smaller than the egress of markup.
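A toy illustration of the egress point (data and markup are made up): the server-rendered response repeats the template on every request, while the JSON response carries only the data.

```javascript
// Same payload as bare JSON vs. as server-rendered markup.
const data = { title: "Hello", items: ["a", "b", "c"] };
const json = JSON.stringify(data);
const html =
  `<article><h1>${data.title}</h1><ul>` +
  data.items.map((i) => `<li>${i}</li>`).join("") +
  `</ul></article>`;
// The markup is larger because the template tags ship with every response.
```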
Typically CDN caches are cleared after X amount of time, or are cleared manually. This solution generates your entire website as static HTML and viewer requests never actually reach the backend.
I've spent a fair amount of time (over) optimizing https://starthq.com and would say that network latency needs to be at the top of the list of things to look at.
The solution has been to serve static HTML, i.e. the index page of the single-page-app part, with a one-hour cache expiration, with links to other static resources (JavaScript, CSS, images etc.) decorated with their ETag and loaded from a CDN. For non-static pages the setup is the same, but the caching time is lower. The static resources have far-off (one year or so) expiration headers, so they're effectively cached permanently.
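The two-tier policy above can be sketched as a simple rule (function name, extension list, and the `immutable` directive are my additions; the TTLs are the ones described):

```javascript
// HTML entry points get a short TTL; fingerprinted static assets get a
// ~one-year TTL, which is safe because their URLs change with their content.
function cachePolicy(path) {
  const longLived = /\.(js|css|png|jpg|svg|woff2?)$/.test(path);
  return longLived
    ? "public, max-age=31536000, immutable" // ~1 year
    : "public, max-age=3600"; // 1 hour
}
```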
By using a CDN for all your assets you can reduce a 200ms roundtrip to 8ms for all users worldwide, bringing the page loading time to way below 200ms with an empty cache and well below 100ms with a primed cache since you still need to do an XHR to check the login status.
Small time-saving tip: if you go with CloudFront, go all out and use all edge locations - it's cheap. I tried using Europe only at first, only to eventually find out that I was still being routed to a US edge location despite being in Finland.