Hacker News

LLMs are acting like humans; humans also show biases when you ask them to do something random :)

On a more serious note, you could always raise the sampling temperature so the outputs are more random.
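To make the temperature point concrete, here is a minimal sketch of softmax sampling with a temperature knob, assuming you have raw logits to sample from (the function name and toy values are illustrative, not any particular API):

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0):
    """Sample a token index from logits scaled by temperature.

    Higher temperature flattens the distribution (more random);
    temperature near 0 approaches greedy argmax.
    """
    scaled = [l / temperature for l in logits]
    # Softmax with max-subtraction for numerical stability.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(range(len(probs)), weights=probs, k=1)[0]
```

At a very low temperature the sampler almost always picks the highest-logit token; at a high temperature the choices spread out toward uniform, which is the "behave more randomly" effect.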


