
You can get a rough idea of the required precision by checking if the dose is weight dependent. If it isn't, 2% won't matter.



In that case you might still want to test that it's correct within a certain degree of precision.
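A minimal sketch of such a tolerance check in Python, using math.isclose; the 2% relative tolerance and the dose numbers are just illustrative:

```python
import math

def dose_is_close_enough(measured_mg: float, expected_mg: float, rel_tol: float = 0.02) -> bool:
    """Return True if the measured dose is within rel_tol (2% here) of the expected dose."""
    return math.isclose(measured_mg, expected_mg, rel_tol=rel_tol)

print(dose_is_close_enough(98.5, 100.0))  # 1.5% off -> True
print(dose_is_close_enough(95.0, 100.0))  # 5% off   -> False
```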

It depends on what you consider “accurate”. For home/DIY usage, +/-10% may be absolutely sufficient as long as the error is stable (i.e. it's always a similar offset). For lab/industrial usage, 0.1% may be too far off. There's a large gap in engineering effort that drives up the prices of more precise sensors.
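A rough sketch of why a stable offset is workable: one measurement of a known reference lets you subtract the error back out. The reference value, the sensor reading, and the helper name are made up for illustration:

```python
# One-point calibration: if the sensor's error is a stable offset,
# measuring a known reference once lets you subtract it out.
reference_value = 100.0                 # known-good reference (made up)
sensor_reading_of_reference = 108.0     # what the cheap sensor reports (made up)

offset = sensor_reading_of_reference - reference_value  # +8.0

def calibrated(raw_reading: float) -> float:
    return raw_reading - offset

print(calibrated(58.0))  # raw 58.0 -> corrected 50.0
```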

I'd be more concerned about the phony precision. Not 250%, not 270%, no -- 273%.

I think there's a real danger to underestimating small % improvements over time, depending on how you're measuring.

Especially if you're looking at error rates; 99% accuracy is very different from 95% accuracy, which is very different from 90%, and so on.
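In error-rate terms the gaps are bigger than the accuracy numbers suggest; a tiny illustration:

```python
# Accuracy differences look small; the corresponding error-rate ratios do not.
for accuracy in (0.99, 0.95, 0.90):
    error_rate = 1 - accuracy
    print(f"{accuracy:.0%} accuracy -> {error_rate:.0%} errors "
          f"({error_rate / 0.01:.0f}x the errors of a 99% model)")
# 99% -> 1% errors (1x), 95% -> 5% errors (5x), 90% -> 10% errors (10x)
```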


Hmm, I take this back; the calculation should be 0.13 * 0.89 + 0.87 * 0.5 ≈ 55% total accuracy.
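The same calculation spelled out; the 13%/87% split and the per-group accuracies are the ones from the comment:

```python
# Weighted accuracy across two groups of cases:
share_a, share_b = 0.13, 0.87   # fraction of cases in each group
acc_a, acc_b = 0.89, 0.50       # accuracy within each group

total_accuracy = share_a * acc_a + share_b * acc_b
print(f"{total_accuracy:.2%}")  # -> 55.07%
```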

Can you even measure 3% accurately and repeatedly?

I may have used ‘exactly’ sloppily here. What I mean is: when the context doesn't say otherwise, one should assume accuracy or precision as rounding to the least significant digit given. I.e. by default, 1 kg may indicate a mass between 500 g and 1,499 g. If the accuracy is about 10 g, you should write 1.00 kg.
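A small sketch of that default-rounding convention; the helper is hypothetical and the values are the ones above:

```python
# "1 kg" with nothing else stated implies rounding to the last digit given,
# i.e. a roughly +/-500 g window; "1.00 kg" implies a roughly +/-5 g window.
def implied_window(value: float, decimals: int) -> tuple[float, float]:
    half_step = 0.5 * 10 ** (-decimals)
    return value - half_step, value + half_step

print(implied_window(1.0, 0))  # (0.5, 1.5)      -> plain "1 kg"
print(implied_window(1.0, 2))  # (0.995, 1.005)  -> "1.00 kg"
```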

Even for 0% accuracy?

We're not trying to hit a comet with a rocket here. 1 significant figure is more than sufficient for an initial consultation. Any additional accuracy required would be billable follow-on work.

Is 90% accuracy good enough to be useful?

If you expect bit-for-bit reproducible results, then yea, you'd have to know about the nitty-gritty details. The values should usually still correspond to the same thing in common real world precision though.

it doesn't matter if it's accurate, it matters if it's precise, and consistent. kind of like the scale you stand on, or weigh your food with.

Is 90% correct rate considered good enough for this kind of use?

Seems like 1/10 wrong would be bad, how does that compare with a doctor doing it?


Agreed. 20% margin of error isn't really that relevant to me either.

I don't really care that much if I'm at 70 mg/dl or 100 mg/dl if the line is horizontal. Both values are, roughly, in the "OK Zone", so I just don't care about better precision.

Even if I'm going low, the difference between 50 and 70 isn't nearly as relevant as the rate of change. If I'm going down but the slope is gentle enough, I know I probably don't need much to get back to normal, while if it's dropping steeply, I probably need to eat something regardless of whether I'm at 50, 70, or 80.
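A rough sketch of that rule of thumb; the thresholds, the window, and the trend calculation are illustrative assumptions, not real CGM logic:

```python
def probably_need_carbs(readings_mg_dl: list[float], minutes_apart: float = 5.0) -> bool:
    """Illustrative only: weigh the trend, not just the latest value.

    `readings_mg_dl` is assumed to be recent CGM values, oldest first;
    the 70/90 levels and the -1.5 mg/dl-per-minute cutoff are arbitrary.
    """
    current = readings_mg_dl[-1]
    elapsed_min = minutes_apart * (len(readings_mg_dl) - 1)
    slope = (readings_mg_dl[-1] - readings_mg_dl[0]) / elapsed_min  # mg/dl per minute

    falling_fast = slope < -1.5
    return current < 70 or (current < 90 and falling_fast)

print(probably_need_carbs([100, 90, 80, 70]))  # steep drop to 70 -> True
print(probably_need_carbs([72, 71, 71, 70]))   # 70 but nearly flat -> False
```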


As long as your measurement is a random sample, everything is okay. Even if it is not, it is much more information than you had before. You just need to keep that in mind when evaluating conclusions. We are not talking about drug trials here, and no one dies if the measurement is not 100% accurate.

I am able to measure everything with 100% accuracy. But this is really, really expensive. It's a trade-off.


It always surprised me to see them reporting results to a tenth of a percent precision (e.g. "2.3%"). I'd assumed that the best they could do on an individual was a couple of orders of magnitude less precise (e.g. 10%, 20%).

It'll depend on your needs; you have to compute precision against recall to decide what a good cutoff is for your application.
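A minimal sketch of sweeping a cutoff and computing precision/recall at each point, with toy scores and labels:

```python
def precision_recall(scores, labels, cutoff):
    predicted = [s >= cutoff for s in scores]
    tp = sum(p and l for p, l in zip(predicted, labels))
    fp = sum(p and not l for p, l in zip(predicted, labels))
    fn = sum(not p and l for p, l in zip(predicted, labels))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

scores = [0.95, 0.80, 0.70, 0.60, 0.40, 0.30, 0.20]      # toy model scores
labels = [True, True, False, True, False, True, False]   # toy ground truth

for cutoff in (0.25, 0.50, 0.75):
    p, r = precision_recall(scores, labels, cutoff)
    print(f"cutoff {cutoff:.2f}: precision {p:.2f}, recall {r:.2f}")
```

Raising the cutoff trades recall for precision; where to stop depends on which kind of mistake costs you more.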

You should indeed try and target a 50% accuracy rate.

Being precisely inaccurate isn't optimal. There needs to be a balance between precision and accuracy for a measurement that's close to reality. Of course, there are going to be tradeoffs between the two as well.
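One way to picture the difference, with made-up numbers for the "true" value and two hypothetical sensors:

```python
import statistics

true_value = 100.0  # hypothetical ground truth

precise_but_biased = [108.1, 107.9, 108.0, 108.2, 107.8]  # tight cluster, wrong place
accurate_but_noisy = [92.0, 111.0, 97.0, 105.0, 95.0]     # scattered, right on average

for name, readings in (("precise/biased", precise_but_biased),
                       ("accurate/noisy", accurate_but_noisy)):
    bias = statistics.mean(readings) - true_value
    spread = statistics.stdev(readings)
    print(f"{name}: bias {bias:+.1f}, spread {spread:.1f}")
# precise/biased: bias +8.0, spread 0.2
# accurate/noisy: bias +0.0, spread 7.8
```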
