
>It may help to remember that current LLMs would require an infinity of RAM to be even computationally complete right now.

Anything that is computationally complete needs an unbounded amount of RAM. This is not unique to LLMs or even to machine learning.
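
To illustrate the point, here is a minimal sketch (my own hypothetical example, not from the parent comment): a trivial unbounded binary counter whose "tape" grows without limit. Any machine that can run this for arbitrarily many steps needs unbounded storage in principle; a device with finite RAM is, strictly speaking, a finite automaton rather than Turing complete.

    # Hypothetical sketch: an unbounded binary counter on a growing tape
    # (least-significant bit first). The tape can grow without bound,
    # which is exactly what finite RAM cannot provide.
    def increment(tape):
        i = 0
        while True:
            if i == len(tape):
                tape.append(1)   # tape grows: needs more memory
                return
            if tape[i] == 0:
                tape[i] = 1
                return
            tape[i] = 0          # carry to the next bit
            i += 1

    tape = [0]
    for _ in range(10):
        increment(tape)
    print(tape)  # 10 in binary, LSB first: [0, 1, 0, 1]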


