> Do you genuinely think it’s worse that someone makes a decision, whether good or bad, after consulting with GPT versus making it in solitude?

Depending on the decision, yes. An LLM can confidently hallucinate incorrect information and misinform the person asking, which is worse than simply not knowing.