I ran this on my work laptop (Windows 10 / Firefox, mostly default settings). The "Details" tab shows a full breakdown of each fingerprinting component with a "similarity ratio" (the percentage of fingerprints that share the same value).
Every entry is 20% or greater, most of them around 50%, except for timezone (4%), canvas (1%), and user agent (0.12%).
So, yeah, the user agent is still the largest source of entropy by a wide margin for fingerprinting scripts.
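Back-of-the-envelope, those similarity ratios translate into bits of entropy as -log2(ratio). A rough TypeScript sketch (the ratios are just the approximate ones from my own results, not anything official):

    // Rough sketch: convert a "similarity ratio" (fraction of fingerprints
    // sharing your value) into the bits of entropy that component contributes.
    // Ratios below are the approximate ones from my own amiunique results.
    const similarityRatios: Record<string, number> = {
      timezone: 0.04,    // 4%
      canvas: 0.01,      // 1%
      userAgent: 0.0012, // 0.12%
    };

    const bits = (ratio: number): number => -Math.log2(ratio);

    for (const [component, ratio] of Object.entries(similarityRatios)) {
      console.log(`${component}: ~${bits(ratio).toFixed(1)} bits`);
    }
    // timezone: ~4.6 bits, canvas: ~6.6 bits, userAgent: ~9.7 bits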
>With the fancy new anti-fingerprinting Safari on macOS Mojave I get just over 14.5 bits of entropy with the most entropic source being my canvas fingerprint (1 in 600).
That's actually pretty good, considering Tor Browser (which has resistFingerprinting enabled) at the default window size (1000x1000) has 14.82 bits of entropy.
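(For anyone mixing up the two ways these numbers get reported: "bits of entropy" and "one in N" are the same thing, N = 2^bits. A quick sketch:)

    // Quick sketch: bits of entropy <-> anonymity set size ("1 in N").
    const oneInN = (bitsOfEntropy: number): number => 2 ** bitsOfEntropy;
    const bitsFor = (n: number): number => Math.log2(n);

    console.log(Math.round(oneInN(14.5)));  // ~23170 -> "1 in ~23,000"
    console.log(Math.round(oneInN(14.82))); // ~28900 -> "1 in ~29,000"
    console.log(bitsFor(600).toFixed(1));   // canvas "1 in 600" is ~9.2 bits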
Unfortunately, the many, many browser capabilities have given adtech enough entropy to create globally unique fingerprints. You can look up yours, with an estimate of its uniqueness, here: https://amiunique.org/fp
I wonder how many bits of entropy that fingerprint has, though. 8 bits would make for an impressive, scary-looking demo, for example, but for ad tracking it would be useless (8 bits only distinguishes 2^8 = 256 values).
I was saying that the number of people who install a plugin that varies the user agent is tiny, and even then there are other ways to detect you (just look at the list of other signals collected: screen sizes, etc.).
And the big picture is, regardless of these small imperfections, a digital fingerprint is over 95% accurate! So that's very valuable!
That's great, but once you factor in all the people who downloaded this plugin and are otherwise slightly harder to fingerprint, it's still 99% accurate!
Yup, I agree with you about this. It’d be interesting to do a deep dive into a library like FingerprintJS and see which components carry the most weight in terms of uniqueness. Maybe getImageData is worth blocking, but perhaps other APIs contribute even more entropy.
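For reference, the getImageData technique such libraries lean on looks roughly like this (a minimal sketch of the general approach, not FingerprintJS's actual code):

    // Minimal sketch of a getImageData-based canvas fingerprint: render some
    // text and shapes, read the pixels back, and hash them. Tiny rendering
    // differences between GPUs, drivers, and font stacks make the hash
    // distinctive across machines.
    async function canvasFingerprint(): Promise<string> {
      const canvas = document.createElement("canvas");
      canvas.width = 200;
      canvas.height = 50;
      const ctx = canvas.getContext("2d")!;
      ctx.textBaseline = "top";
      ctx.font = "16px Arial";
      ctx.fillStyle = "#f60";
      ctx.fillRect(10, 10, 100, 30);
      ctx.fillStyle = "#069";
      ctx.fillText("how quickly daft jumping zebras vex", 2, 2);

      const pixels = ctx.getImageData(0, 0, canvas.width, canvas.height).data;
      const digest = await crypto.subtle.digest("SHA-256", pixels);
      return Array.from(new Uint8Array(digest))
        .map((b) => b.toString(16).padStart(2, "0"))
        .join("");
    }

Blocking or noising getImageData breaks the pixel read-back step, which is why it tends to be the usual target.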
This one overestimates uniqueness because it doesn't consider stability (e.g. it uses your current battery charge level as a uniqueness measure, which is obviously not stable minute-to-minute let alone day-to-day).
Probably true. However, is it feasible for anti-fingerprinting technology to be standardized enough that website authors can tell "oh, they're using anti-fingerprinting" but not derive any more detail than that?
If a piece of anti-fingerprinting software hides more information than it reveals, it's a net positive. If it does the opposite, it's actively harmful. There's probably a nice formulation of this in terms of entropy, but I can't quite state it rigorously, so hopefully this makes sense; a back-of-the-envelope attempt is below.
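Roughly (a sketch under the assumption that the tool is detectable, so its mere presence reveals about -log2(adoption rate) bits):

    // Back-of-the-envelope version of "hides more than it reveals":
    // a detectable anti-fingerprinting tool reveals roughly -log2(adoption)
    // bits just by being detected, so it only helps if the entropy it
    // removes from other signals exceeds that.
    const bitsRevealedByDetection = (adoptionRate: number): number =>
      -Math.log2(adoptionRate);

    const netBits = (bitsHidden: number, adoptionRate: number): number =>
      bitsHidden - bitsRevealedByDetection(adoptionRate);

    // e.g. a tool that hides 10 bits but is used by 1 in 10,000 people:
    console.log(netBits(10, 1 / 10000).toFixed(1)); // ~ -3.3 -> net harmful
    // the same tool at 10% adoption:
    console.log(netBits(10, 0.1).toFixed(1));       // ~ +6.7 -> net positive

Which lines up with the usual advice that these defenses only really help once enough people run them with identical settings, which is the Tor Browser approach.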
So what. Removing one of the largest sources of entropy available for fingerprinting users is important. We shouldn't maintain the terrible long-term effects[1] of tracking everything just to help you have an easier time debugging.