Hacker News

The test positivity rate is 1% or below. Guess what the false positive rate of the actual test is?

What's more likely: that the virus really has an Rt of exactly 1.00, or that there's a systematic error introduced by humans?




I read that the false positive rate is extremely low (below 1%).

False positives are not a huge problem when you are only testing people with symptoms. False positives would only become a problem when you are doing a large number of tests on people who have no symptoms (e.g. 1% of 1 million people when close to none have the virus).

I think your numbers are the detection rate: false negative rate is high (up to 40% for early tests).


It's important to remember that the positivity rate is a measure of how much testing you're doing, not a measure of how much virus is around. Unless you're reading articles about the politics of rapid tests or the like, you should mostly just ignore the pos rate, it doesn't tell you anything useful otherwise.

A 50% false positive rate means that, since most people don't have the virus, about 50% of all tests will come back positive. You might as well just flip a coin instead.

So, what's the rate of false positives, given that only 1 in 1000 is a true positive?

If you test 1000 people with a test that has a 1% false positive rate (let's ignore false negatives), but only one person in the 1000 is really infected, you get about 11 positive test results, of which 10 are wrong.
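The arithmetic above can be checked directly. A minimal sketch, using the comment's figures (1-in-1000 prevalence, 1% false positive rate, no false negatives):

```python
# Expected test outcomes for 1000 people, 1 truly infected,
# 1% false positive rate, no false negatives (as assumed above).
population = 1000
infected = 1
uninfected = population - infected                   # 999
false_positive_rate = 0.01

true_positives = infected                            # 1 (no false negatives)
false_positives = uninfected * false_positive_rate   # 999 * 0.01, about 10

total_positives = true_positives + false_positives
wrong_fraction = false_positives / total_positives

print(round(total_positives, 2))  # ~11 positives in expectation
print(round(wrong_fraction, 2))   # ~0.91: roughly 10 of 11 are wrong
```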

I imagine that with many tests, even at 0.1% you'll eventually get some false positives due to human error or other things going wrong.

Tests like this have false positive rates that are significant. The test may be a useful tool when we already reasonably expect someone to test positive (showing symptoms plus contact with a known case or travel from an affected area).

However, if you have no reason to believe you are infected, a false positive might easily be more likely than a true positive. And in fact, even a very modest false positive rate would still make it orders of magnitude more likely than a true positive.

Consider that even if you assume the worst about the state of the virus in the US, it's probably a few thousand people. If you randomly tested every person in the US, and the test had a false positive rate of only 1%, you would have thousands of times more positive results than real cases. Someone else in this thread mentioned an existing test that might have a 40% false positive rate.


What's the false-positive rate?

> If you test positive, there is a __% chance it's a false positive.

This percentage is based on both the test and the real infection rate.
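That dependence on both the test and the infection rate is just Bayes' theorem. A sketch with illustrative numbers (not from any specific test):

```python
def p_false_given_positive(prevalence, fpr, fnr=0.0):
    """Probability that a positive result is a false positive,
    given prevalence and the test's error rates (Bayes' theorem)."""
    true_pos = prevalence * (1 - fnr)    # truly infected, flagged positive
    false_pos = (1 - prevalence) * fpr   # uninfected, flagged positive
    return false_pos / (true_pos + false_pos)

# The same 1% false positive rate gives very different answers:
print(round(p_false_given_positive(0.001, 0.01), 2))  # low prevalence -> 0.91
print(round(p_false_given_positive(0.30, 0.01), 2))   # high prevalence -> 0.02
```

At 0.1% prevalence most positives are false; at 30% prevalence almost none are, even though the test itself is unchanged.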


Okay - I'm going to reveal my statistics ignorance here - but if they did get a false positive, how does 0.0% fall within the range of possibilities? (Or is that allowing for the possibility that it's 0.0499...% or lower?)

What is the false positive rate though?

What's the False Positive Rate?

At the latest low point in the epidemic, some areas were seeing test positivity rates of under 0.5%. If false positives were a large portion of positives, this would seem to be impossible.

What is the false positive rate, though?

I don't know, but the problem is worse at low infection rates. Suppose you have 3% false positives and 3% false negatives, and 0.1% of people are infected.

The false negatives will be 3% of the 0.1%; the false positives will be 3% of the 99.9%. You exaggerate your infection rate by about 30x, even though the test is equally inaccurate in both directions.
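Working the comment's numbers through (3% error both ways, 0.1% true prevalence):

```python
prevalence = 0.001  # 0.1% of people truly infected
fpr = 0.03          # false positive rate
fnr = 0.03          # false negative rate

# Fraction of all tests that come back positive:
# true positives detected, plus false positives among the uninfected.
measured = prevalence * (1 - fnr) + (1 - prevalence) * fpr

print(round(measured, 4))               # ~0.0309, i.e. about 3.1% positive
print(round(measured / prevalence, 1))  # ~31x the true infection rate
```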


Supposedly, the false positive rate is 1 in a trillion.

You are incorrect. False positive rates in practice appear to be about 20-50%, even with a p-value of 0.05.

A test with a 50% false positive rate, administered to 100 million people who aren't infected, would say 50 million people were infected.

False positive rate is a confusing term; it means the percentage of truly negative cases that the test reports as positive.

I think the term for what you're probably thinking of (% of positive results that are false positives) is the false discovery rate.
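The distinction can be made concrete with a confusion matrix. Illustrative counts, assuming 10,000 people tested, 100 truly infected, a 1% false positive rate, and no false negatives:

```python
# Confusion-matrix counts (hypothetical, for illustration):
true_pos = 100    # infected, test positive
false_neg = 0     # infected, test negative
false_pos = 99    # 1% of the 9,900 uninfected
true_neg = 9801   # the rest of the uninfected

# False positive rate: share of truly negative cases flagged positive.
fpr = false_pos / (false_pos + true_neg)
# False discovery rate: share of positive results that are wrong.
fdr = false_pos / (false_pos + true_pos)

print(round(fpr, 2))  # -> 0.01: a "1% false positive rate" test...
print(round(fdr, 2))  # -> 0.5: ...where half the positives are wrong
```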


> unless 100% of the population is administered a test with no risk of false negative or positives.

Actually, the "risk" of false positives or false negatives depends on prevalence; 100% coverage is not always required. If you have a test that is wrong in 5 of 100 cases (i.e. correct 95% of the time), and nobody tested is actually infected, you could incorrectly conclude that 5% of the population is infected when in fact nobody is, meaning such a conclusion would be completely false.

But if the population is actually already, say, 50% infected, the same test can "lie" by only about 5 percentage points, giving you 47% or 53% but still being "mostly true" (from an engineering point of view).
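The prevalence dependence described above can be sketched as follows (considering only false positives, for simplicity, with the 5% error rate from the comment):

```python
def measured_rate(prevalence, fpr=0.05):
    """Apparent positive rate if the test wrongly flags 5% of truly
    negative people (false positives only, for simplicity)."""
    return prevalence + (1 - prevalence) * fpr

print(round(measured_rate(0.0), 3))  # -> 0.05: looks like 5% infected
print(round(measured_rate(0.5), 3))  # -> 0.525: close to the true 50%
```

At zero prevalence the result is 100% wrong in relative terms; at 50% prevalence the same absolute error barely matters.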

So it is important to be skeptical of test results whenever the reported positivity rate is close to the test's false positive rate, which it was in a lot of the antibody tests done so far.

Also, false positives and false negatives can lead to wrong handling of individual cases, but that's another topic.

