You also obviously have to allow the standard CDNs and such, but the result is that 90% of the web then "just works", without most of the tracking or problems.
If you're not using a CDN, you can just enable it for your own site. They explained in the article why they didn't think enough sites would do this to make it worthwhile.
Literally no significant extra effort to download the files and put them with your website. If you fetch them from a CDN with a fixed version you're not getting updates anyway.
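To give a sense of how little work self-hosting is, here's a minimal sketch. It assumes a hypothetical page that loads a pinned jQuery build from a CDN; the filenames and paths are illustrative:

```shell
# Vendor a pinned library instead of loading it from a CDN.
mkdir -p vendor
cat > index.html <<'EOF'
<script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>
EOF
# One-time download when vendoring (commented out here to stay offline):
# curl -o vendor/jquery-3.6.0.min.js https://code.jquery.com/jquery-3.6.0.min.js
# Point the page at the local copy so no third-party request is made:
sed -i 's|https://code.jquery.com/|/vendor/|' index.html
cat index.html
```

Since the version is pinned either way, the self-hosted copy behaves identically to the CDN one, minus the third-party request.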
CDNs are server-side trackers. Even when they're not serving up browser-side tracking code that makes yet more requests (which they often do), they're collecting location and other data from response times and headers. Google's gv2 beacons, New Relic, Cloudflare, etc.
This is why tools like Decentraleyes exist. Unfortunately, tools like that actually introduce additional unique identifiers.
Even with JavaScript disabled, web fonts are tracking you. Even with JavaScript and web fonts disabled, images and CSS are tracking you. Even first-party images and CSS track you, even over VPNs and Tor.
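The font case is easy to see in a stylesheet: even a first-party CSS file can fire a request to a third-party host, which then sees your IP address and request headers, no JavaScript required. A minimal sketch (fonts.example is a placeholder, not a real host):

```css
/* Loading this first-party stylesheet still triggers a request to a
   third-party font host; that host sees your IP, User-Agent, and Referer. */
@font-face {
  font-family: "Tracked Sans";
  src: url("https://fonts.example/tracked.woff2");
}
body {
  font-family: "Tracked Sans", sans-serif;
}
```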
First-class web experiences that can't track you are only possible by routing all traffic through a trusted, privacy-by-design, pre-caching remote hash table, e.g. via a trusted IPFS node. Then you also need to transact with the non-web service (e.g. server hosting) only over a routeless protocol.
The latter requires the service to support non-custodial routeless protocols for payment and control. This is something dApps do well.
It sounds complicated, but every Web3.0 (not to be confused with Web3.js) service already supports this by necessity, even in outdated browsers, simply by choosing a trusted node and updating your browser or device's DNS and proxy settings. No additional software required.
Yes, it would. Actually, so many uses of CDNs are nonsensical. I literally block those requests and sites still work OK. (For my definition of OK: not seeing auto-loading, auto-playing video is already a big plus.)
Your box running your web server is far less complicated than using a CDN and worrying about countless additional points of failure. Network problems are only a minor risk.
It's unfeasible, but also completely unnecessary. CDNs can give you some speedup, but they don't magically make a site smaller and less resource-intensive to use.
This is what CDNs are designed for: they allow a few companies to get a monopoly on your browsing data. This is why I prefer uMatrix. It blocks all third-party requests; a lot of stuff breaks, but it breaks because they're tracking you.
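A deny-by-default policy of the kind uMatrix applies can be expressed roughly like this, in its own "source destination type action" rule syntax (the exact defaults and any per-site exceptions will vary):

```
* * * block
* 1st-party * allow
```

Everything third-party is blocked; you then whitelist specific hosts per site as things break.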