
The trade-off for extracting maximum performance from a user's hardware is that the hardware becomes much easier to fingerprint. Judging by the history of the web, this is a trade-off that probably isn't worth making.



There's generally a trade-off between usability and performance on one side and resistance to fingerprinting on the other. If your browser has WebGL enabled, the machine (not just the browser) can be fingerprinted. If it caches resources, adversaries can discover your browsing history.

There's so much fingerprinting that can't really be disabled. Think about it:

Performance

- Single-threaded CPU performance

- Multi-threaded CPU performance

- WebGL performance

- Video performance

- Network performance (how long does it take to transfer data to various locations, what's the lag, is the lag consistent, etc.)

- (Maybe) Time it takes to execute certain JavaScript functions (a rough sketch follows after these lists)

User behavior

- How does the user use their mouse when navigating web pages?

- Not at all?

- Jerky movements?

- Smooth movements?

- If the user uses the keyboard, do they appear to be advanced keyboard users, do they have an IME, etc.

- Does the user press the X button on tiny, annoying popups that wouldn't otherwise interfere with browsing the page?

- Does the user appear to block access to certain resources? (ad blocker)

- Does the user's workplace/country/etc. appear to block anything?
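
A minimal sketch of what one of these probes might look like, say for single-threaded CPU performance (a hypothetical illustration, not any particular tracker's code): run a fixed workload and record how long it takes. A vector of such timings across different workloads characterizes the hardware:

    // Hypothetical illustration: time a fixed CPU-bound workload.
    // Repeating this across different workloads (math-heavy, memory-heavy,
    // crypto) yields a timing vector that characterizes the hardware.
    function timeWorkload(iterations: number): number {
      const start = performance.now();
      let acc = 0;
      for (let i = 0; i < iterations; i++) {
        acc += Math.sin(i) * Math.sqrt(i + 1); // arbitrary busy work
      }
      if (acc === Infinity) console.log(acc); // keep the loop from being elided
      return performance.now() - start;
    }

    // Multi-threaded throughput can be probed the same way by running the
    // loop in navigator.hardwareConcurrency Web Workers simultaneously.
    const singleThreadMs = timeWorkload(5_000_000);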


The whole article seems to be based on this study:

https://www.doc.ic.ac.uk/~maffeis/331/EffectivenessOfFingerp...

And sees it as an indicator that preventing fingerprinting is possible:

    Only a third of users had unique fingerprints,
    despite the researchers’ use of a comprehensive
    set of 17 fingerprinting attributes.
To me, the 17 attributes do not seem comprehensive at all. For example, they don't make use of the user's IP, from which so much can be derived: carrier, approximate location, etc. They also don't use the local IP, which is leaked via WebRTC:

https://browserleaks.com/webrtc
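
A minimal sketch of how that leak works (assuming script access; note that modern browsers increasingly mask host candidates behind mDNS ".local" names, so results vary):

    // Gather ICE candidates and pull addresses out of them. Historically
    // this exposed the local IP without any permission prompt.
    function gatherCandidateAddresses(): Promise<string[]> {
      return new Promise((resolve) => {
        const addresses = new Set<string>();
        const pc = new RTCPeerConnection({ iceServers: [] });
        pc.createDataChannel(""); // a channel is needed to start gathering
        pc.onicecandidate = (event) => {
          if (event.candidate === null) { // null marks end of gathering
            pc.close();
            resolve([...addresses]);
            return;
          }
          // Candidate strings look like "candidate:... 192.168.1.7 ... typ host"
          const m = event.candidate.candidate.match(
            /([0-9]{1,3}(?:\.[0-9]{1,3}){3})/
          );
          if (m) addresses.add(m[1]);
        };
        pc.createOffer().then((offer) => pc.setLocalDescription(offer));
      });
    }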

They also don't seem to measure the performance of the CPU, RAM, and GPU when performing different tasks.

But yes: browsers should do more to prevent fingerprinting. It seems they have no inclination to do so, though. That they don't plug the WebRTC hole that leaks the local IP is a strong indicator to me that privacy is low on browser vendors' priority lists, or maybe not on the list at all.


Browser fingerprinting, for one, is something that can be avoided.

BLAS implementations have historically relied on a wide range of microarchitecture-specific optimizations to get the most out of each processor generation. An ideal solution would be for the browser to provide that capability to the application in such a way that it is difficult to fingerprint.

See also the history of ATLAS, GotoBLAS, Intel MKL, etc.
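
One way to picture that suggestion (purely hypothetical, no such API exists): the browser exposes a high-level primitive and does the microarchitecture-specific dispatch internally, so the page never observes which kernel ran:

    // Hypothetical browser-provided BLAS-style primitive (NOT a real API).
    // The browser would pick the fastest kernel for the local hardware;
    // the page sees only the result. Completion timing would also need to
    // be coarsened, or it becomes a fingerprinting signal itself.
    declare function browserMatMul(
      a: Float32Array, b: Float32Array,
      m: number, k: number, n: number,
    ): Promise<Float32Array>;

    declare const a: Float32Array, b: Float32Array; // hypothetical 512x512 inputs

    // C = A x B without the page learning anything about the hardware.
    browserMatMul(a, b, 512, 512, 512).then((c) => console.log(c.length));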


Security implications are always good to point out, though this is not unique to WebGPU at all. Some fingerprinting techniques don't even require JavaScript to be enabled.

Those fingerprinting browsers are doing far more than looking at the User-Agent.

There are already so many ways to fingerprint a browser, it's really not something they need.

>Most browser "fingerprinting" methods have a pretty short half-life. The last assessment I read said something like half of the fingerprints were lost within 24 hours.

That's absolutely not my experience. Maybe if you use some weird research-level fingerprinting technique, but most fingerprints are just regular old boring stuff: screen and browser viewport size, installed plugins and fonts, browser UA and settings, hardware/GPU quirks, etc. [1] And it doesn't have to be 100% reliable, just reliable enough to track your activity and show you some ads.

[1] As a privacy-conscious individual, I'm fully aware that viewport size alone is enough to almost uniquely fingerprint me. I use my laptop screen with the Sidebery extension, the browser tab bar hidden by user CSS, and Sway in tabbed mode. My second computer is less unique: I "just" use Sway and Firefox with the minimal tab size (which for some reason is hidden and must be unlocked in about:config, so it's very rarely used).
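
A minimal sketch of the "boring old stuff" approach described above: concatenate a handful of cheap signals and hash them. Real trackers combine far more signals and fuzzier matching:

    // Hash a few stable browser properties into a fingerprint string.
    // (crypto.subtle requires a secure context, i.e. HTTPS.)
    async function basicFingerprint(): Promise<string> {
      const signals = [
        navigator.userAgent,
        navigator.language,
        navigator.hardwareConcurrency,
        screen.width, screen.height, screen.colorDepth,
        window.innerWidth, window.innerHeight, // viewport, often very telling
        Intl.DateTimeFormat().resolvedOptions().timeZone,
      ].join("|");
      const digest = await crypto.subtle.digest(
        "SHA-256", new TextEncoder().encode(signals),
      );
      return [...new Uint8Array(digest)]
        .map((b) => b.toString(16).padStart(2, "0")).join("");
    }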


I feel like fingerprinting is inevitable with any hardware access, including WebGL or WebGPU. It's one of my big concerns about Chrome exposing more and more of the hardware it runs on in its goal of becoming a Web-based OS.

That said, fingerprinting is not as big a risk as what I was thinking of, which is one process being able to peer into another's data on the GPU. There are various takes on isolation on the GPU, but they tend to have strong caveats attached.


Another interesting technique for fingerprinting users online is GPU fingerprinting [1] (2022).

Codenamed 'DrawnApart', the technique relies on WebGL to count the number and speed of the execution units in the GPU, measure the time needed to complete vertex renders, handle stall functions, and more.

________________

1. https://www.bleepingcomputer.com/news/security/researchers-u...
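
A rough sketch of the general idea (not the actual DrawnApart code, which uses more elaborate stall and dispatch tricks, and timer coarsening limits precision): draw a fixed vertex workload repeatedly and time it.

    // Rough illustration: time a fixed vertex workload. Differences in
    // execution-unit count and scheduling show up as small, repeatable
    // timing differences between otherwise identical GPU models.
    function timeVertexWorkload(iterations = 20): number[] {
      const gl = document.createElement("canvas").getContext("webgl");
      if (!gl) throw new Error("WebGL unavailable");

      const compile = (type: number, src: string): WebGLShader => {
        const s = gl.createShader(type)!;
        gl.shaderSource(s, src);
        gl.compileShader(s); // error handling omitted for brevity
        return s;
      };
      const prog = gl.createProgram()!;
      gl.attachShader(prog, compile(gl.VERTEX_SHADER,
        "attribute vec2 p; void main() { gl_Position = vec4(p, 0.0, 1.0); }"));
      gl.attachShader(prog, compile(gl.FRAGMENT_SHADER,
        "void main() { gl_FragColor = vec4(1.0); }"));
      gl.linkProgram(prog);
      gl.useProgram(prog);

      // One large buffer of random vertices keeps the vertex units busy.
      const verts = new Float32Array(2 * 90000).map(() => Math.random() * 2 - 1);
      gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
      gl.bufferData(gl.ARRAY_BUFFER, verts, gl.STATIC_DRAW);
      const loc = gl.getAttribLocation(prog, "p");
      gl.enableVertexAttribArray(loc);
      gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);

      const timings: number[] = [];
      for (let i = 0; i < iterations; i++) {
        const start = performance.now();
        gl.drawArrays(gl.TRIANGLES, 0, 90000);
        gl.finish(); // block until the GPU finishes this draw
        timings.push(performance.now() - start);
      }
      return timings;
    }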


You can't stop fingerprinting. I wish you could, but you can't.

But you CAN make it enough of a pain to cut down on the number of half-qualified Web monkeys who try to use the information in ham-handed ways. You can stop just handing the information over for free to people who might want to casually exploit you. You might even make it harder for some more sophisticated and/or committed actors to do it; for example, if I'm an ISP running a middlebox and trying to fingerprint all the traffic that runs through me, I can't use JavaScript. And you can save some bandwidth in the process.


There does seem to be a fingerprinting angle on it, but I wrote about using CPU, battery, memory, etc. as considerations for how much JavaScript you load for your users: https://umaar.com/dev-tips/242-considerate-javascript/#load-...

How do you kill fingerprinting, though? It's impossible without limiting what web developers can do with their sites.

I once talked over a beer with the head of web performance at company X. He said that you can use the web performance profile alone, without looking at any other browser identifiers, to do browser fingerprinting pretty accurately. When you additionally look at interaction speed within a web app, you can even profile different users.

I would expect browser fingerprinting and other similar techniques would be more likely to be used.

I did a little bit of research on browser fingerprinting years ago and even tried to write my own library to do it. Ever since, I've questioned how useful it actually is.

Using the list of measurement points on https://amiunique.org as a guide, most of the things that are constant about my computer, like platform, browser, or requested language, are not really unique to me and are shared by a large percentage of the other users who have visited the site.

On the other hand, most of the data points that are unique to my machine change semi-frequently. User agent and version change on browser updates, timezone changes when I travel, screen size and resolution change when I plug into my external monitor, new fonts will slowly be installed over time, and even things like how the canvases are rendered can change slightly depending on how much strain my GPU is under at the time I get fingerprinted.

Just plugging into my external monitor was enough to get amiunique to treat me as a different user. (If you want to try it, be sure to clear your local storage and cookies between visits, as the site saves a UUID there and will serve you your previous results if it finds one.)

I'm sure there's some magic formula that gives different weights to different data points that can give a decent guess at who you are, but I doubt it can say with 100% accuracy that you are who it thinks you are.

It seems to me all it would take to defeat fingerprinting is a browser extension that modifies the browser APIs to randomly, slightly alter the reported data (add a random font to the list, add some nonsense to the user agent, etc.). Sure, the fingerprint would still be unique, but it would be unique on every visit, which would defeat the ability to track a user across visits. A sketch of the idea follows below.
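
A tiny sketch of that idea (a real extension would inject this from a content script before any page script runs; the properties chosen here are just examples):

    // Shadow a fingerprintable getter on the instance so each page load
    // reports a slightly different value. The jitter is fixed per load,
    // so repeated reads within a visit stay consistent.
    const realHeight = screen.height;
    const jitter = Math.floor(Math.random() * 3) - 1; // -1, 0, or +1 px
    Object.defineProperty(screen, "height", {
      get: () => realHeight + jitter,
    });

    // The same trick works for navigator.userAgent, font probes, etc.
    const realUA = navigator.userAgent;
    const noise = " x/" + Math.random().toString(36).slice(2, 6);
    Object.defineProperty(navigator, "userAgent", {
      get: () => realUA + noise,
    });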

*I'm not an expert on this subject at all, so if I got something wrong, please correct me.


So any user on Edge can be hardware fingerprinted easily? I can see why other browsers stay far away.
