That wouldn't prevent hallucination. An LLM doesn't know what it doesn't know: it will always try to produce a response that sounds plausible, whether or not it actually has the relevant knowledge.

