
Easy, you ask them to talk you through their reasoning.

LLMs are fucking terrible at this for anything nontrivial.



> LLMs are fucking terrible at this for anything nontrivial.

They seem pretty solid at it in my experience; in fact, if they shine at anything, it's expository blobs of text.


Not my experience at all. I would say humans are much worse at this.


> Easy, you ask them to talk you through their reasoning.

> LLMs are fucking terrible at this for anything nontrivial.

Amusingly, one of the common pieces of advice for limiting hallucinations is to ask the LLM to walk through its reasoning.
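
For what it's worth, a minimal sketch of that kind of prompt, assuming an OpenAI-style Python client; the model name and the wording of the system prompt are placeholders, not anything official:

    # Ask the model to reason step by step before answering.
    # Assumes the openai Python package (>=1.0) and OPENAI_API_KEY in the environment.
    from openai import OpenAI

    client = OpenAI()

    question = "A train leaves at 3pm travelling 60 km/h..."  # any nontrivial question

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "Walk through your reasoning step by step, then state a final answer."},
            {"role": "user", "content": question},
        ],
    )
    print(response.choices[0].message.content)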


While you say that, millions of dollars a month are being passed around to do exactly that, right now in the USA.



