Hacker News

My assumption is not that the LLM would be isolated; my assumption is that the LLM would be incapable of interacting in any meaningful way on its own (i.e., unless triggered by direct input from a programmer).
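To illustrate the constraint I mean, here is a minimal, hypothetical sketch (the `GatedModel` name and structure are mine, not from any real system): the model has no event loop, no timers, and no outbound channels, so it can only produce output when a person explicitly calls it.

```python
# Hypothetical sketch: a model wrapper that cannot act "on its own".
# The only way it runs is a synchronous call made by a human operator;
# there are no background threads, callbacks, or network/tool access.

class GatedModel:
    def __init__(self, model_fn):
        # model_fn is a pure function: prompt -> text (a stand-in for
        # whatever inference backend is actually used).
        self._model_fn = model_fn

    def ask(self, prompt: str) -> str:
        # Sole entry point. Nothing inside this call can schedule
        # future work or reach outside the process.
        return self._model_fn(prompt)


if __name__ == "__main__":
    echo = GatedModel(lambda p: f"echo: {p}")
    print(echo.ask("hello"))  # runs only because we invoked it here
```

The point of the sketch is that "not isolated" and "cannot initiate action" are different properties: the wrapped model may see arbitrary input, but it has no mechanism to do anything unprompted.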


