
Nope, we solve it the same way, and it's a pain. I did a presentation once for devs where I unraveled the up to nine different layers of caching between an end user and our website (when you take into account the browser and any tiered CDN caching).

It's a pest of a problem but pre-deploying static assets is the best answer.




On several of my modern projects, there's not a single piece of static data that can't be cached forever in a CDN. That's because server-side code is now getting really good at managing the initial build of static assets and the delivery of their URLs.

No, not a dev at SO. I'm guessing it's a fairly standard use of a CDN (hosting static assets, caching them geographically).

What you're saying is probably right.


I avoid this problem entirely by keeping static content static--and hosted via CDN.

The same approach goes for dynamic (database-backed) content--cache the ever-loving shit out of it.


This doesn't seem like an issue. You can store images / other assets on the CDN, and serve the real time stuff from the origin without caching.

I don't see why they would be more of a hassle. Why would they even care if the HTML is being rendered on demand or pre-rendered and stored?

Regarding cache, I don't agree - having your site cached by the edge nodes of a CDN is quite important nowadays.


Can't they just cache, say, those 1000 marketing, sales, and customer support pages in the CDN then?

Exactly, you’re incentivized to make your website as static as possible, and I attach standard http cache headers to most of the server rendered stuff so that their responses get cached in Vercel’s CDN, and once again not invoked super often.
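For anyone wondering what those headers look like, here's a minimal sketch (plain Python, not Vercel-specific; the function name and directive values are illustrative) of the Cache-Control values that separate CDN-cacheable server-rendered responses from fingerprinted assets and per-user data:

```python
# Illustrative Cache-Control headers for three kinds of responses.
# The exact max-age values are assumptions, not recommendations.

def cache_headers(kind: str) -> dict:
    """Return example response headers for a given content kind."""
    if kind == "static-asset":
        # Fingerprinted assets: safe to cache "forever" everywhere.
        return {"Cache-Control": "public, max-age=31536000, immutable"}
    if kind == "server-rendered":
        # Let the CDN (a shared cache) hold it briefly via s-maxage,
        # while browsers revalidate on every request (max-age=0).
        return {"Cache-Control":
                "public, max-age=0, s-maxage=60, stale-while-revalidate=300"}
    # Truly dynamic, per-user responses: never cache.
    return {"Cache-Control": "private, no-store"}
```

The key directive is `s-maxage`, which applies only to shared caches like a CDN edge, so the edge can absorb repeat traffic without browsers holding stale copies.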

If you weren't caching with CDN, would serving those static assets as efficiently as possible be a good reason to keep using (eg) nginx, do you think?

Oh, I guess load balancing (once you scale to multiple hosts) is another good reason; if you don't have Heroku doing it for you, nginx is a convenient way to do it just fine.


Nope, spot on. You either cache extremely aggressively with a CDN if you can, or otherwise you eat the data-out charges for dynamic data.

You could use webhooks on deploy to call your CDN's API to purge the cache. Otherwise, I guess just keep it time-based.
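A minimal sketch of that deploy hook, assuming a Cloudflare-style purge endpoint; the zone ID and token are placeholders, and the function only builds the request so the deploy script can send it with whatever HTTP client it already uses:

```python
# Sketch of a purge-on-deploy call. The URL shape follows Cloudflare's
# zone purge endpoint; zone_id and api_token are placeholders.

def build_purge_request(zone_id: str, api_token: str, urls=None) -> dict:
    """Build the HTTP request for a CDN cache purge; the caller sends it."""
    # Purge specific URLs if given, otherwise purge the whole zone.
    body = {"files": urls} if urls else {"purge_everything": True}
    return {
        "method": "POST",
        "url": f"https://api.cloudflare.com/client/v4/zones/{zone_id}/purge_cache",
        "headers": {
            "Authorization": f"Bearer {api_token}",
            "Content-Type": "application/json",
        },
        "json": body,
    }
```

Purging only the URLs that actually changed keeps the rest of the edge cache warm, which matters if your deploys are frequent.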

Won't Cloudflare help by caching static stuff and providing the CDN?

Browsers could safely pull a list of very commonly requested, content-addressable resources from various CDNs and pre-cache them (independently of any request). That would even help with first-request latency, and on mobile (where bandwidth is expensive) you could do the pre-caching on Wi-Fi.

You could have CloudFront in front of your application servers, though, and it could be doing passthrough requests for non-cacheable assets.

Not that I'd ever want this, because CloudFront is dog slow compared to other CDNs, but yeah..


> just stick your content on a static host and you're done

1. One (relatively small) advantage is not needing to control the server, just the domain. The server then polls IPNS and updates things with the latest content.

2. There's the standardized (and perhaps more meaningful) cache-busting aspect for assets, i.e. the IPFS Companion browser extension takes all /ipfs/<CID>/filename.ext URLs and serves them from the local datastore instead of making requests. This way, assets aren't requested from a remote each time when they're the same on multiple websites. This helps since CDNs don't use common caches anymore. Though, it might come with the same privacy risks. https://www.stefanjudis.com/notes/say-goodbye-to-resource-ca...
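The non-IPFS equivalent of that content addressing is ordinary asset fingerprinting: derive each asset's filename from a hash of its bytes, so the URL changes exactly when the content does and every old URL can be cached forever. A minimal sketch (the helper name is made up):

```python
import hashlib

# Content-addressed cache busting: "app.js" becomes "app.<hash>.js",
# where the hash is derived from the file's bytes.

def fingerprinted_name(filename: str, content: bytes) -> str:
    """Return the filename with a short content hash spliced in."""
    digest = hashlib.sha256(content).hexdigest()[:12]
    stem, dot, ext = filename.rpartition(".")
    # Files without an extension just get the hash appended.
    return f"{stem}.{digest}.{ext}" if dot else f"{filename}.{digest}"
```

With names like these, the assets themselves can carry `max-age=31536000, immutable` headers, and deploys never need to purge them; only the HTML that references them has to stay fresh.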

> it's hard to embed and resource-hungry, which hampers adoption

yes.


Put your site behind a CDN like Cloudflare and cache it aggressively.

It's really not rocket science.


At the cost of reducing cacheability, though, no?

It's cheaper for me to put my React app behind a CDN and split it into an API and a front end than to have my site be entirely uncacheable.

I can also cache certain endpoints behind the CDN that are mostly invariant across users. And the network egress for JSON is much smaller than the egress for markup.


The next blog post will be about the speedup obtained with serving static assets via CDN?

Typically CDN caches are cleared after X amount of time, or are cleared manually. This solution generates your entire website as static HTML and viewer requests never actually reach the backend.

I've spent a fair amount of time (over) optimizing https://starthq.com and would say that network latency needs to be at the top of the list of things to look at.

The solution has been to serve static HTML, i.e. the index page of the single-page-app part, with a one-hour cache expiration and links to other static resources (JavaScript, CSS, images, etc.) decorated with their ETag and loaded from a CDN. For non-static pages the setup is the same, but the caching time is lower. The static resources have far-off (one year or so) expiration headers, so they're effectively cached permanently.

By using a CDN for all your assets you can reduce a 200ms roundtrip to 8ms for all users worldwide, bringing the page loading time to way below 200ms with an empty cache and well below 100ms with a primed cache since you still need to do an XHR to check the login status.

Small time-saving tip: if you go with CloudFront, go all out and use all edge locations - it's cheap. I tried Europe-only at first, only to eventually find out that I was still being given an edge location in the US despite being in Finland.

