Hacker News
renjimen on Sept 14, 2024 | on: LLMs Will Always Hallucinate, and We Need to Live ...
Post-training includes mechanisms that teach LLMs to recognize areas where they should exercise caution in answering. It's not as simple as you say anymore.
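The "mechanisms" in question vary (refusal training, RLHF, calibration tuning), and the comment doesn't name one. As a minimal toy sketch only, one way "exercising caution" can be operationalized is confidence-thresholded abstention: answer when the model's top candidate is sufficiently probable, otherwise decline. The function, threshold, and candidate probabilities below are all hypothetical illustrations, not any real system's API:

```python
# Toy sketch of confidence-thresholded abstention (hypothetical, not a real API).
# `candidates` maps candidate answers to model-assigned probabilities.

def answer_or_abstain(candidates, threshold=0.6):
    """Return the most probable answer, or abstain if confidence is low."""
    best, prob = max(candidates.items(), key=lambda kv: kv[1])
    if prob < threshold:
        # Below the calibrated threshold: decline rather than guess.
        return "I'm not sure."
    return best

print(answer_or_abstain({"Paris": 0.92, "Lyon": 0.05}))   # high confidence: answers
print(answer_or_abstain({"1879": 0.35, "1880": 0.33}))    # low confidence: abstains
```

This only illustrates the shape of the idea; real post-training bakes such behavior into the model's weights rather than applying an external threshold.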