
If you look at just the potential impact of current AI, or of the slightly better models that are around the corner...

If you see cognitive work as a distribution, it's not far-fetched to think that AI can take out the bottom half, and after that ever more. It's naive to think that this bottom half will all become super AI coders; that's what the top half might do, or more likely the top 10%.

Every current threat in the digital domain (misinformation, polarization, cyber crime, and the like) might be multiplied by 100 or 1,000. Are we ready for that?

Uniquely human values such as creativity, and even simple communication, are on the chopping block too. Is it worth it? What remains of us? Biological prompters? Why not wire it directly into our brains, then, and complete the Matrix scenario?

What happens to truth? Or to culture? Does it even matter anymore? And what about a single private company being in charge of scenarios of this magnitude?

We're talking existential impact here. This is like dropping a nuke without knowing what a nuke is or what it does.

It's pathetic that private individuals have to write this letter. Don't you think that in an exponentially disruptive world we should have some kind of formal oversight? Instead of an 80-year-old guy asking, 10 years after the blast, what Wi-Fi is?


