Hacker News

In 10 years most serious users of AI will be using local LLMs on insanely powerful devices, with no ads. API based services will have limited horizon.


Some will, but you're underestimating the burning desire to avoid IT and sysadmin work. Look at how much companies overspend on cloud just to not have to do IT work. They'd rather pay 10X-100X more to not have to admin stuff.


>"Look at how much companies overspend on cloud just to not have to do IT work."

I think they are doing it for different reasons. Some are legit, like renting a supercomputer for a day, and some are just "everybody else is doing it." I am friends with a small company owner; they have a sysadmin who picks his nose and does nothing, and then they pay a fortune to Amazon.


I'm talking about prepackaged, offline, local-only, on-device LLMs.

What you are describing, though, will almost certainly happen even sooner, once AI tech stabilizes and investing in powerful hardware no longer means you will quickly become out of date.


It is just downloading a program and using it.


Ok, but there will be users on even more insanely powerful datacenter computers who will be able to out-AI the local AI users.


Nvidia/Apple (hardware companies) are the only winners in this case.



