
I interpret 3 as meaning the chatbot requires no prompt from a human. Imagine the chatbot wakes you up in the middle of the night saying, "Hey, sorry to disturb you, but I was thinking... wouldn't eliminating all humans help to reverse the climate crisis more quickly? Just a thought." That is what I think of as number 3 (not that it actually wants to kill off humans, but that it is thinking on its own, without any prompts, and then choosing to share what it has been thinking about with a human).


Don’t we already have this? If you prompt GPT to “care” about a human and then keep sending it observations, it does a pretty good job.
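
Rough sketch of what I mean, using the OpenAI Python client (the model name, the prompt wording, and "Alex" are all made up for illustration):

  import time
  from openai import OpenAI

  client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

  # Initial prompt: tell the model to "care" about a specific person.
  messages = [
      {"role": "system",
       "content": "You care about Alex's wellbeing. When you receive an "
                  "observation about Alex, decide whether it is worth "
                  "checking in, and if so, write a short caring message."}
  ]

  def send_observation(observation: str) -> str:
      """Append an observation and return the model's reaction."""
      messages.append({"role": "user", "content": observation})
      response = client.chat.completions.create(
          model="gpt-4o",  # placeholder model name
          messages=messages,
      )
      reply = response.choices[0].message.content
      messages.append({"role": "assistant", "content": reply})
      return reply

  # Keep feeding it observations about the person.
  for obs in ["Alex skipped lunch today.", "Alex has been awake since 3am."]:
      print(send_observation(obs))
      time.sleep(1)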


But there is still an initial prompt. I think number 3 is all about no initial prompt. The AI reaches out, not because it has been programmed or prompted to do so, but because it wants to do so.



