That's just wrong. This is ML 101

False positive rate = (machine says positive but ground truth is negative) / (machine says positive)

In English, a false positive is when the machine declares something as positive but it is actually negative.

This passes a sanity check because roughly 50% of people who pass through a detector do not get patted down. The truth is closer to 10%. Even then, a true positive probably includes someone with two pennies in their pocket.
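
To make the arithmetic concrete, a minimal sketch in Python with made-up detector numbers (note that, as the replies point out, the ratio this computes is the false discovery rate, not the standard false positive rate):

  # Made-up airport-detector numbers, purely illustrative.
  flagged = 100         # machine says positive (alarm sounds)
  truly_metal = 10      # of those flagged, ground truth is positive
  false_positives = flagged - truly_metal   # flagged but harmless
  ratio = false_positives / flagged         # the ratio described above
  print(ratio)  # 0.9 -- FP / predicted positives, i.e. the FDR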



https://en.m.wikipedia.org/wiki/False_positive_rate

In statistics and ML it's FPR = FP / (FP + TN), i.e. (false positives) / (actual negative samples). This is by far the most common definition.

Here is ML using the common definition; see "fall-out": https://en.m.wikipedia.org/wiki/Precision_and_recall
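
A minimal sketch of that definition, with invented confusion-matrix counts:

  # Invented counts, just to illustrate the standard FPR formula.
  fp = 45               # machine says positive, ground truth negative
  tn = 55               # machine says negative, ground truth negative
  fpr = fp / (fp + tn)  # fall-out: fraction of actual negatives flagged
  print(fpr)  # 0.45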


You are right that this is ML 101, but you are wrong. Someone else has linked the wiki page, and I've posted it elsewhere as well; I would strongly recommend you read it, as your interpretation is incorrect.

> In English, a false positive is when the machine declares something as positive but it is actually negative.

Yes, and the rate is the frequency with which this happens in your negative set.

>This fits a sanity test because ~50% of people that pass through a detector do not get patted down.

The 50% figure may be wrong, but my definition of the false positive rate is not.


You're thinking of the FDR: false discovery rate. It equals Σ false positives / Σ predicted condition positives, i.e. FP / (FP + TP).

See https://en.m.wikipedia.org/wiki/Precision_and_recall for a table of different terms.
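
To show how the two rates diverge, a quick sketch on the same invented confusion matrix:

  # One invented confusion matrix, two different rates.
  tp, fp, tn, fn = 5, 45, 950, 0
  fdr = fp / (fp + tp)  # false discovery rate: 45/50 = 0.9
  fpr = fp / (fp + tn)  # false positive rate: 45/995 ~= 0.045
  print(fdr, fpr)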



