The problem with this is that a lot of people have already fallen into these misinformation echo chambers. No longer recommending them may prevent more people from becoming unmoored from reality, but it does nothing for those currently caught up in it. Only removing the channel helps with that.


Algorithms that reverse the damage by surfacing opposing viewpoints could be implemented.


Why would Google ever do that? People are likely to leave YouTube for some other entertainment, and then they won't see more ads.


I agree. My point was that it is possible. Google would never do it without being forced.


I don't think those people caught up in it are suddenly like "oop that YouTuber is banned, I guess I don't believe that anymore". They'll seek it out elsewhere.


If anything, these people see the removal of their "favorite" videos as validation - if a video is removed, it must be because it was especially truthful and THEY didn't like that...


It's actually been shown many times that deplatforming significantly reduces the number of followers an influencer has. Many viewers watch out of habit or convenience, and won't follow the creator to a platform with less moderation.



