
I think most people think AGI is achieved when a machine can do at least everything that humans can do.

Like not necessarily physical things, but mental or digital things.

Humans will create a better LLM (say, GPT-5) than all the LLMs that currently exist.

If you tasked any current LLM with creating a GPT-5-level LLM that is better than itself, could it do it? If not, then it's probably not AGI and has shortcomings that make it not general or intelligent enough.


