defer and async are about eliminating round-trip latency by opening multiple simultaneous HTTP connections. Multiple TCP streams are sometimes faster than one. (I say sometimes, because negotiating a new connection has an overhead.)
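Roughly what I mean, as a sketch (file names are made up):

    <!-- async: fetched without blocking the parser, executed as soon as it arrives (order not guaranteed) -->
    <script async src="/js/analytics.js"></script>

    <!-- defer: fetched without blocking the parser, executed in document order once parsing finishes -->
    <script defer src="/js/app.js"></script>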
I think that depends on the use case. Some use CDNs to improve performance via already-cached files; others use them to offload traffic and benefit from the CDN's geo-balanced performance.
Personally I don't think a user will have anything cached other than jQuery from Google, which in this case I removed to follow the rule of eating my own dog food.
It kinda depends on the statistics; I'd like to see some figures about CDN cache hit rates like that. I know the theory is sound, but I've also gathered that one usually only gets around 20% cache hits on websites with caching enabled, or something in that range.
Last time I tried this, I ran into two problems: 1) unbounded URL length breaking down in old browsers, routers, etc., and 2) hobbled caching. (It also goes against the grain of REST.)
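For clarity, the pattern I mean looked roughly like this (the endpoint name is made up):

    <!-- one request instead of three, but the URL grows with every file added,
         and every distinct combination of files is a separate cache entry -->
    <script src="/combine?files=jquery.js,underscore.js,app.js"></script>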
I'd be interested if anyone has actually done something like this successfully. Did you have the issues I did? Was it worth it?
I don't think getting "batch" resources is against REST, Fielding himself stated that it's not a problem as long as the URL for the batch is consistent -- the batches are just separate resources, although they might have special semantics to the client.
In my opinion Google Hosted Libraries[1] are the best way to go if you're serving "big" commonly used JavaScript libraries to folks.
One of the major upsides is that it's so heavily used that a user probably already has it cached in their browser. At least that's the idea; I'm not sure of the actual numbers.
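A rough sketch of the usual pattern, with a self-hosted fallback in case the CDN is unreachable (version and local path are just examples):

    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.0/jquery.min.js"></script>
    <script>
      // if the CDN copy didn't load, fall back to a local copy
      window.jQuery || document.write('<script src="/js/jquery-1.11.0.min.js"><\/script>');
    </script>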
I agree with you, but to pull a number out of thin air: I would honestly be surprised if 50 Mbps+ LTE mobile internet is even 1% of the connected internet.
I'm very wary about loading JS from random CDNs. In my opinion, the negative aspects outweigh the benefits by far:
- The CDN gets to decide *what* code is delivered to *which* users. Could be a prime target for, say, another FERRETCANNON.
- If the CDN is compromised, so is your site.
- If an attacker on a local network manages to inject poisoned cache data into requests for said CDN, your site is compromised.
- All of your visitors are disclosed to the CDN owner.
- If the CDN goes down, your site goes down with it. Note that the inverse doesn't apply: the CDN's superior availability has no positive effect on your site's availability.
- Loading from another host may cause an unnecessary DNS lookup and will cause an unnecessary TLS connection.
What would be cool is <script sha="2afdb28d" name="angular.js" version="1.2.10" src="xxx"></script>
This would mean that the browser can essentially cache the exact version of the script from any source, verify it with a hash, and still have a fallback URL to download it from.
The name attribute would be purely aesthetic, so there's no reason for a version attribute. If SHA-1 is good enough for git's content-addressed storage, it's good enough for browsers.
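To spell out the behaviour I'm imagining (purely hypothetical; no browser does this today):

    <!-- hypothetical markup; the hash value and URL are just placeholders -->
    <script sha="2afdb28d..." name="angular.js"
            src="https://example.com/js/angular-1.2.10.min.js"></script>

    <!-- sketch of what the browser would do:
         1. look the hash up in a content-addressed cache shared across all origins
         2. on a hit, run the cached bytes without touching the network
         3. on a miss, fetch src, hash the response, and refuse to run it if the hash doesn't match -->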
[1] https://developer.mozilla.org/en-US/docs/Web/HTML/Element/sc...