Says more about you than anything else. They have completely different business models, so I don't get the shock, let alone the connection. Facebook can't scale if it has to do human moderation at the level that would actually stop the pedos. It's also just facially incorrect: Amazon does hire warehouse workers; Facebook does not hire enough content monitors.
> It's also just facially incorrect: Amazon does hire warehouse workers; Facebook does not hire enough content monitors.
This is the whole point. Amazon had to hire hundreds of thousands of warehouse workers to scale; they now have 1.5 million employees. Facebook is capable of doing the same. The idea that they "can't scale" if they have to stop offloading their negative externalities is absurd. Amazon scaled while hiring 1.5 million employees; Meta can scale and do the same.
Amazon actually delivers things. Facebook still has a pedo problem, so no, I don't think you're right, and it's certainly not a good example for your point anyway, since there is no similarity at all. It's pretty clear Facebook's business model doesn't work when you have to actually sift through everything it spreads. Amazon has nothing to do with it.
Facebook still has a pedo problem because they still haven't hired 1 million employees to deal with it. Amazon would still have a logistics problem if they hadn't hired 1 million employees to move boxes around.
The idea that this is literally an unsolvable problem is just absurd. If every Amazon warehouse worker instead became a Facebook pedo monitor, achieving a >95% decrease would be pretty trivial.
I don't think we can actually know this unless Meta tries it. There are two main open questions:
1. With aggressive, noisy referrals to prosecution, and bans for people who report others in bad faith, can you get these people to stop approaching kids on the platform? Can you get the human review burden down to a tractable level because the rates of real issues and false reports are sufficiently low?
2. Can better moderation / safety measures _facilitate_ growth because people won't be scared or disgusted away from your product? Plenty of people's advice is "don't let your kids use their products unsupervised", and assuming you don't have the free time to _watch_ your kids use their product, that quickly turns into "don't let your kids use their products". A safe platform that people _believe_ is safe might experience faster growth.
1. That presupposes the problem is bad-faith referrals, or that pedos aren't sufficiently aware they can get popped on FB. I don't think either is likely true.
2. I don't think the scalability issues are related to the size of the social network, so I don't think this is ever a relevant question, at least from my perspective. My point is that it would not be commercially reasonable for Meta to actually employ the number of people required to run down, verify and then forward reports.
The website that's one of the hardest to use anonymously, that won't let you use an account without verifying a phone number, and won't even let you view content when not logged in, is an "oasis for pedos"?
Sorry, but from my point of view, they serve pedos to police on a silver platter. If the police don't take action, that's not Facebook's fault.
>Sorry, but from my point of view, they serve pedos to police on a silver platter. If the police don't take action, that's not Facebook's fault.
That's a bit of a strawman. I've never seen it suggested that the problem is that governments don't prosecute enough of what Facebook reports, and that this is why so much of it happens on Facebook. I certainly wasn't making that suggestion. My point is that a lot of child solicitation does happen on Facebook, despite phone verification, so I'm not sure what point you are really making. It seems more like you are coming at it from an abstract privacy perspective, which is valid, but not what you are claiming.

Facebook is an oasis for pedos. They are all over Facebook and Instagram trying to interact with kids. There are plenty of articles about it, and about how Meta takes few if any simple precautionary steps and sometimes even connects these people through its social algorithms. You are acting like children hang out on the dark web or something. They don't. They are on Facebook. They are on Instagram, on YouTube, and in video games.
> I've never seen it suggested that the problem is that govts do not prosecute enough of what Facebook reports
How odd, I wonder if there's a reason for that.
I remember that in one transparency report, FB itself sent over 12 million referrals to NCMEC, yet we don't see stories about all of those people being rounded up for justice.