
>> Isn’t hallucination just the result of speaking out loud the first possible answer to the question you’ve been asked?

>No.

Not literally, but it's certainly comparable.

>There is no reason to believe that LLMs should be compared to human minds

There is plenty of reason to do that. They are not the same, but that doesn't mean it's useless to look at the similarities that do exist.


