lucas_v's comments

> superhuman persuasion and means of manipulating human urges such that, if it wanted to, it could entrap anyone it wanted into complete subservience

Wow, this potential/theoretical danger sounds a lot like the already-existing attention economy: we're already manipulated at scale by "superhuman" algorithms designed to hijack our "human urges," create "cult-like" echo chambers, and sell us things 24/7. That future is already here.


Which is true. People are scammed by very low-quality schemes (like the French woman who sent 1 million to scammers claiming to be Brad Pitt, who for some reason needed money for medical procedures).

Humans generally have a natural wariness, a kind of mental immune response, to these things, however, and for 99% of people the attention economy is nowhere near able to completely upend their life or get them to send all their money under an agreement made in bad faith by the other party. I don't see why, if some AI were to possess persuasion powers to a superhuman degree, it would be able to cause 100x the damage when directed at the right marks.


Though there might be nothing beneficial about being right or getting upvotes, and it is easy to be wrong, an important thing on a forum like this is the spread of new ideas. While someone might be hesitant to share something because it's half-baked, they might have the start of an idea that could have some discussion value.


Or you could just "git push" responsibly...

