
Don't need them. Ever since Llama 3.1 came out I can control my data and not give it to OpenAI.


Do you host your own 405B-parameter model?


I've tried it on Lambda but it doesn't seem any better than 70B or even 8B for most tasks.
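For most of what I do, the 8B Instruct model running locally is plenty. Roughly what that looks like, as a minimal sketch with Hugging Face transformers (assumes a GPU with enough VRAM, the accelerate package for device_map, and that you've accepted Meta's license for meta-llama/Llama-3.1-8B-Instruct on the Hub):

    import torch
    from transformers import pipeline

    # Load the 8B instruct model locally; weights stay on your own hardware.
    pipe = pipeline(
        "text-generation",
        model="meta-llama/Llama-3.1-8B-Instruct",
        torch_dtype=torch.bfloat16,
        device_map="auto",
    )

    messages = [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize the text I paste; nothing leaves this machine."},
    ]

    # The chat-format pipeline returns the full message list; the last entry is the reply.
    out = pipe(messages, max_new_tokens=256)
    print(out[0]["generated_text"][-1]["content"])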


This is kind of an HN Dropbox comment. Very few non-programmers would go through that.


Thing is, I don't think OpenAI has that Dropbox mindset of 'do what the user wants and make it super easy' either.

Sure, they have a great chat interface... But they have tech that can do so much more, and they aren't getting it directly into the hands of users... For example, where is "OpenAI translate"? Where is "OpenAI photo backup"? Where is "OpenAI phoneOS"?

With a burn rate of $1B/month, they could hire tens of thousands of regular developers and rebuild a whole suite of AI-powered tools.


True, but enterprises wanting to make LLMs available to their non-technical employees would.


They won't have to because a thousand developers like me will offer them simple solutions. Facebook absolutely destroyed anyone's "moat" in this area.
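And the "simple solution" part really is simple. A sketch of the kind of thing I mean, hiding a locally hosted model behind one function a web form could call (assumes an Ollama server on localhost:11434 with llama3.1 pulled; the request shape is Ollama's /api/chat API):

    import requests

    def ask_local_llm(prompt: str) -> str:
        # Send the prompt to the local Ollama server; nothing leaves the box.
        resp = requests.post(
            "http://localhost:11434/api/chat",
            json={
                "model": "llama3.1",
                "messages": [{"role": "user", "content": prompt}],
                "stream": False,
            },
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()["message"]["content"]

    print(ask_local_llm("Rewrite this paragraph in plain English: ..."))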



