
DNT (Do Not Track) just became another fingerprinting data point, so it ended up being better to just set it to whatever the critical mass is using (which is typically off).



I had to switch it to sensitive mode for the same reason. It's also more useful as a relative metric than as a specific number.

They probably reduce the precision to give some plausible deniability in the face of people who have their network connections monitored.

CERN does lots of data analysis where measurement precision is not essential, at least in the evaluation phase. Switching to higher precision may be needed only for the final calculations. It's also possible to scale and normalize data to lower precision.
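A minimal sketch of that last idea, assuming NumPy and synthetic data (none of the names below come from the comment): normalize in float64 so the scaling itself stays exact enough, then store the normalized values in float32 for the exploratory phase.

```python
import numpy as np

# Hypothetical raw measurements with a wide dynamic range (synthetic data).
raw = np.random.default_rng(0).normal(loc=1e6, scale=5e3, size=1_000_000)

# Normalize in float64 first, so the scaling step doesn't lose precision...
lo, hi = raw.min(), raw.max()
normalized = (raw - lo) / (hi - lo)

# ...then downcast the normalized values for the evaluation/exploration phase.
compact = normalized.astype(np.float32)

print(raw.nbytes, compact.nbytes)          # half the memory
print(np.abs(normalized - compact).max())  # worst-case rounding error from the downcast
```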

Also, thanks for the thought re: digit precision. I am tracking it here: https://github.com/tokio-rs/console/issues/224

There are precision issues with -1...1. 0...1 is actually the right choice IMO.
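Whichever endpoint convention is right for a given use, the underlying issue is that float resolution is very nonuniform across these ranges. A quick NumPy sketch (not from the thread) showing float32 spacing at a few points:

```python
import numpy as np

# np.spacing(x) is the gap to the next representable float32 above x,
# i.e. the local resolution of the format at that value.
for x in [0.0, 1e-6, 0.01, 0.5, 1.0]:
    print(f"resolution near {x:>8}: {np.spacing(np.float32(x)):.3e}")

# Near 0 the spacing is ~1e-45 (subnormals); near 1.0 it is ~1.2e-7.
# In a symmetric -1...1 range, a large share of representable values
# crowd into a tiny neighborhood of zero, while the resolution at the
# endpoints is no better than what 0...1 already gives you.
```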

I've stopped using it as I found it far too sensitive to small differences.

Ah fair, I should've known. I suppose the precision is still required for scientific purposes. Thankfully, ML workloads now get lower, more appropriate precision in exchange for a speed increase.
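A rough, CPU-only illustration of that trade, assuming NumPy (real ML stacks use float16/bfloat16 on accelerators, which this doesn't show): the same matrix multiply in float32 is typically around twice as fast as float64 and touches half as much memory.

```python
import time
import numpy as np

rng = np.random.default_rng(0)
a64 = rng.standard_normal((2000, 2000))   # float64 by default
a32 = a64.astype(np.float32)

def bench(x):
    t0 = time.perf_counter()
    _ = x @ x                              # square matrix multiply, result discarded
    return time.perf_counter() - t0

print(f"float64 matmul: {bench(a64):.3f}s")
print(f"float32 matmul: {bench(a32):.3f}s")  # usually ~2x faster, half the memory
```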

It's still better than using centiinches per millihours.

Dropping numerical precision seems to have done that to some degree?

Empirically, it doesn't seem to work very well to try for that level of precision.

Maybe it is using single precision.

That's great! All we need to do is negate the output and it will be more accurate.

You might be thinking of Selective Availability (which limited GPS accuracy on purpose) being turned off in 2000.

Because it's hard to dose correctly, and has a very slim error margin?

Does the ONNX thing actually work?

I'm super skeptical of these interchanges, because it seems very difficult to avoid train/test skew. Any difference in detail between the two implementations is a potential problem. I can imagine a different order of operations throwing some values off by 0.1%, causing a 1% eventual loss of accuracy.
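To make the order-of-operations point concrete, here is a generic float32 sketch (nothing ONNX-specific): summing the same values in different orders already disagrees in the low-order digits, and a graph exporter/importer is free to reassociate far more than a simple sum.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.random(100_000).astype(np.float32)

s_fwd = np.cumsum(x)[-1]          # naive left-to-right accumulation
s_rev = np.cumsum(x[::-1])[-1]    # same values, reverse order
s_pair = np.sum(x)                # NumPy's pairwise summation

print(s_fwd, s_rev, s_pair)
print(f"relative spread: {abs(float(s_fwd) - float(s_rev)) / float(s_pair):.1e}")
```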


Something like that. The point was accuracy, not throughput.

Right, but it can sync arbitrary ranges sooner, which is also awful for consistency.

More precision requires more data.

My understanding is that nuclear doesn’t do variability very well.
