In this context of using them in public spaces, it's the false positives that matter most. The complaint is that the machines basically always alarm, which means the only thing they can be used for is making up a bogus excuse to manually search or harass someone based on no real suspicion.
But according to the number that was presented, they don't always alarm. They alarm twice as often as they should. That's not good, and I certainly wouldn't want these in public spaces.
But you can't just look at the false positive rate and at how much extra work and how many unpleasant, unnecessary searches they cause. You have to take the false negative rate into account to get a complete picture. Manufacturers and the people who want these machines will always err on the side of too many false positives rather than too many false negatives.
Again, I personally think that these machines have no place in public spaces, no matter their hit/miss rate.
I think this is a misunderstanding. From Wikipedia [1]:
> A false positive error, or in short a false positive, commonly called a "false alarm", is a result that indicates a given condition exists, when it does not.
So a 50% false positive rate also means that in 50% of the alarms the condition was actually true.
What you mean is a 50% alarm/trigger rate, which, depending on how many people actually do have something on them that should trigger the alarm, would result in a far, far higher false positive rate.
No, I don't. It's just that the true positive rate is close enough to zero that the false positive rate and the alarm rate are roughly the same number.
Nevertheless, you're still not understanding "false positive" correctly. If 5 out of 100 people are true positives and you catch every single one of them, but you also have a 54% false positive rate, you flag about 51 of the remaining 95 people as well, and >90% of your picks are incorrect.
And I'm pretty sure the true positive rate is well under 5% to start with, which just makes it worse.
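A quick sketch of that arithmetic, with assumed numbers (100 people screened, 5 of them genuine positives, every one of them detected, 54% false positive rate):

    # Assumed numbers for illustration: 100 people screened, 5 genuinely
    # carrying something, all 5 detected, false positive rate of 54%.
    people = 100
    carriers = 5
    clean = people - carriers                       # 95 people with nothing on them
    false_positive_rate = 0.54

    true_alarms = carriers                          # all 5 caught
    false_alarms = false_positive_rate * clean      # 0.54 * 95 = 51.3
    total_alarms = true_alarms + false_alarms

    print(round(false_alarms, 1))                   # ~51 false alarms
    print(round(false_alarms / total_alarms, 2))    # 0.91 -> over 90% of alarms are wrong

Lower the assumed 5% carry rate and the share of wrong alarms only gets worse.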
"In Germany, the false positive rate was 54 percent, meaning that every other person who went through the scanner had to undergo at least a limited pat-down that found nothing."
This is wrong, and even your own quote shows it. The false positive rate is how often the machine triggers when the true result is negative.
> A false positive error, or in short a false positive, commonly called a "false alarm", is a result that indicates a given condition exists, when it does not.
So a 50% false positive rate means that 50% of the people who should not be flagged are flagged.
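To make that concrete with illustrative numbers (1000 people screened, 10 of them actually carrying something, all of them detected):

    # Illustrative numbers: 1000 people screened, 10 actually carrying
    # something, all 10 detected, false positive rate of 50%.
    screened = 1000
    carrying = 10
    clean = screened - carrying                     # 990 people with nothing on them

    false_positive_rate = 0.50                      # share of clean people who get flagged
    false_alarms = false_positive_rate * clean      # 495
    true_alarms = carrying                          # 10

    alarm_rate = (true_alarms + false_alarms) / screened                 # 0.505
    false_share_of_alarms = false_alarms / (true_alarms + false_alarms)  # ~0.98

    print(alarm_rate, round(false_share_of_alarms, 2))

With a base rate that low, the alarm rate and the false positive rate land on nearly the same number (which is why the two keep getting conflated upthread), yet almost every individual alarm is still false.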