
It seems like sticking a CDN in front of the static map API could be used to reduce costs quite a bit there as well, especially if you forced it to cache more aggressively than the cache control headers returned by Google indicate. Has anyone tried this?
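For illustration, here's a minimal sketch of what that could look like as a Cloudflare Worker (assuming @cloudflare/workers-types; the upstream URL handling and TTLs are placeholders, and note that overriding Google's headers like this may run afoul of their Terms of Service):

    // Cloudflare Worker: proxy the Static Maps API and cache aggressively,
    // ignoring the shorter TTLs in Google's Cache-Control response headers.
    export default {
      async fetch(request: Request): Promise<Response> {
        const url = new URL(request.url);
        // Placeholder upstream; a real deployment would also attach the API key.
        const upstream = "https://maps.googleapis.com/maps/api/staticmap" + url.search;

        const response = await fetch(upstream, {
          cf: {
            cacheEverything: true,        // cache despite upstream headers
            cacheTtl: 60 * 60 * 24 * 30,  // pin the edge TTL to 30 days
          },
        });

        // Rewrite Cache-Control so browsers cache aggressively too.
        const cached = new Response(response.body, response);
        cached.headers.set("Cache-Control", "public, max-age=2592000, immutable");
        return cached;
      },
    };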



No, I'm not a dev at SO. I'm guessing it would be a fairly standard use of a CDN (hosting static assets and caching them geographically).

What you're saying is probably right.


At the cost of reducing cacheability though, no?

It's cheaper for me to put my React app behind a CDN and split it into an API and a front end than to have my site be entirely uncacheable.

I can also cache certain endpoints behind the CDN that are mostly invariant across users. And the network egress for JSON is much smaller than the egress for markup.
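As a concrete sketch of that split (the endpoint names here are hypothetical), with Express: endpoints whose output is the same for every user get an s-maxage so the CDN holds them, while per-user endpoints opt out:

    import express from "express";

    const app = express();

    // Mostly-invariant JSON: identical for every user, so the CDN may cache it.
    // s-maxage governs shared caches (the CDN); max-age keeps browser copies short.
    app.get("/api/countries", (_req, res) => {
      res.set("Cache-Control", "public, max-age=60, s-maxage=3600");
      res.json([{ code: "US" }, { code: "DE" }, { code: "JP" }]);
    });

    // Per-user data must never be shared between users at the CDN.
    app.get("/api/me", (_req, res) => {
      res.set("Cache-Control", "private, no-store");
      res.json({ id: 42 });
    });

    app.listen(3000);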


I think that depends on the user. Some use CDNs to improve performance via already-cached files; others use them to offload traffic and benefit from the CDN's geo-balancing.

Personally I don't think a user will have anything cached other than jQuery from Google, which in this case I removed to follow the rule of eating my own dogfood.


But with a CDN you'd probably have caching, which would eliminate the need for this kind of optimization.

Why, pray tell? Isn't the Google CDN:

- faster for people in different geographic areas

- at least as reliable as your infrastructure

- already cached (this would be nullified by your proposal)

I ask because normally I wouldn't think twice about using the Google CDN libraries.


A CDN is most useful for static sites, no? Since the page content rarely changes, it's easy to cache.

The stack I describe in the post is only for map tiles. Map tiles are a good fit for CDNs because the input space is small (just Z/X/Y coordinates on a square grid) and thus very cacheable.
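To make that concrete, here's the standard "slippy map" tile addressing (the tile host below is a placeholder): every lon/lat/zoom triple maps to integer Z/X/Y coordinates, so each tile has one stable, CDN-friendly URL:

    // Web Mercator tile addressing: lon/lat at zoom Z -> integer X/Y.
    function lonLatToTile(lon: number, lat: number, z: number): { x: number; y: number } {
      const n = 2 ** z;
      const x = Math.floor(((lon + 180) / 360) * n);
      const latRad = (lat * Math.PI) / 180;
      const y = Math.floor(
        ((1 - Math.log(Math.tan(latRad) + 1 / Math.cos(latRad)) / Math.PI) / 2) * n
      );
      return { x, y };
    }

    // e.g. central London at zoom 12 resolves to a single immutable tile URL
    const { x, y } = lonLatToTile(-0.1276, 51.5074, 12);
    console.log(`https://tiles.example.com/12/${x}/${y}.png`);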

Geocoding is a very different problem because the input space - human language - is much, much larger, and answering queries quickly to support features like autocomplete really requires a server with hot data in memory.

One of my favorite projects in this space is Pelias (https://pelias.io), an open source auto-completing geocoder based on OSM plus other open data. It's backed by a great team that also runs a business, Geocode Earth (https://geocode.earth).


Couldn't they set up their own CDN, or is redistribution of Google Maps cache files disallowed under the existing Terms of Service?

Nope, spot on. You either cache extremely aggressively with a CDN if you can, or you eat the data-egress charges for dynamic data.

I have never made that work. With Akamai we resort to lower TTLs for cache expiry. In practice, the tag-based cache architecture doesn't help, given the constraints imposed by CDN providers.
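For readers unfamiliar with the pattern: in tag-based caching, each response declares the tags it depends on, and updates purge by tag rather than by URL. A rough sketch (Fastly reads the Surrogate-Key header; Akamai's equivalent is Edge-Cache-Tag; the route here is hypothetical):

    import express from "express";

    const app = express();

    // The response advertises its cache tags; a later deploy or data change
    // purges by tag instead of enumerating URLs. How fast and reliable
    // purge-by-tag is varies by CDN provider, which is the constraint
    // described above.
    app.get("/products/:id", (req, res) => {
      res.set("Surrogate-Key", `product-${req.params.id} catalog`);
      res.set("Cache-Control", "public, s-maxage=86400");
      res.json({ id: req.params.id });
    });

    app.listen(3000);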

Nope, we solve it the same way, and it's a pain. I once did a presentation for devs where I unraveled the up to nine different layers of caching between an end user and our website, once you take into account the browser and any tiered CDN caching.

It's a pest of a problem, but pre-deploying static assets is the best answer.


Don't Google have a CDN service?

Edit: yes (https://cloud.google.com/compute/docs/load-balancing/http/cd...). But it's more of a CloudFlare competitor: a distributed caching reverse proxy with a 4 MB object-cacheability limit. It costs $0.008/GB, which is cheap compared to a real CDN but expensive compared to CloudFlare's "free."


Won't Cloudflare help by caching static stuff and providing the CDN?

Yes, I totally agree. Sorry if that was confusing; I was trying to simplify the diagrams and perhaps oversimplified.

You would of course still use a CDN for any static assets. Only the HTML that changes would be served by FastBoot, and you'd probably want to cache certain responses from that as well.


Not the same as static but you can get pretty good results by caching in a CDN. It’s also pretty easy to set up these days.

Specifically for the JavaScript frameworks, using a popular CDN increases the chance that the browser already has the asset in its cache. The browser cache is a huge win for load times.

Static file hosting via a managed CDN is a fairly reliable option, better than many companies can build on their own.


Using a centralized CDN would increase the chance of a cache hit.

Isn't using the Google CDN (quite possibly already cached by the user) with a fallback to a local copy the best of both worlds?
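That's the classic pattern, sketched here as a loader function (the jQuery URL and local path are just examples): try the shared CDN first, and if it fails or the library never appears, load the self-hosted copy:

    // Load a script from a shared CDN, falling back to a self-hosted copy.
    function loadWithFallback(cdnUrl: string, localUrl: string, loaded: () => boolean): void {
      const loadLocal = (): void => {
        const fallback = document.createElement("script");
        fallback.src = localUrl;
        document.head.appendChild(fallback);
      };

      const script = document.createElement("script");
      script.src = cdnUrl;
      script.onload = () => { if (!loaded()) loadLocal(); };
      script.onerror = loadLocal;
      document.head.appendChild(script);
    }

    loadWithFallback(
      "https://ajax.googleapis.com/ajax/libs/jquery/3.7.1/jquery.min.js",
      "/js/jquery.min.js",
      () => typeof (window as any).jQuery !== "undefined"
    );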

On several of my modern projects, there's not a single piece of static data that can't be cached forever in a CDN. That's because server-side code is now getting really good at managing the initial build of static assets and the delivery of their URLs.
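The usual mechanism is content fingerprinting, which bundlers like webpack and Vite do automatically; here is a hand-rolled sketch of the idea (file names are examples):

    import { createHash } from "node:crypto";
    import { copyFileSync, readFileSync } from "node:fs";

    // Name each asset after a hash of its content. The URL then never changes
    // meaning, so the CDN can serve it with a year-long, immutable Cache-Control;
    // a new build simply emits a new URL.
    function fingerprint(path: string): string {
      const hash = createHash("sha256").update(readFileSync(path)).digest("hex").slice(0, 12);
      const hashed = path.replace(/(\.\w+)$/, `.${hash}$1`);
      copyFileSync(path, hashed);
      return hashed;
    }

    // e.g. "app.css" -> "app.3f2a9c1b4d0e.css", referenced from the rendered HTML
    console.log(fingerprint("app.css"));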
