Hacker News

I think the strategy is more to prevent competitors from monetizing.


That's a huge reason to do it, but it also makes sense if you have researchers and developers improving the engine of something that powers your product. The moat / competitive advantage at FB is their network, not so much the proprietary underlying tech.


People often say this, but having interviewed ~200 Facebook engineers over the years, I can say their scaling tech, both software and hardware, is pretty impressive.


Yeah, I guess it's a competitive advantage when a competitor (Twitter) has been shown to have technical problems operating at global scale with a smaller team. Their scale is not trivial by any means. But people aren't going to go to FB because it has the best LLM, so it makes sense to offload that development to the open-source community.


You still need to build real-time serving infrastructure on top of LLaMA/Vicuna/Alpaca in order to compete with ChatGPT/OpenAI, so it's not going to be done by that many companies, and OpenAI already has a mindshare/first-mover advantage.


When you use ChatGPT you are leasing their GPU infrastructure and their proprietary model. This opens the possibility of leasing GPU infrastructure from another company and using an open-source model. You don't necessarily need to do the hard parts yourself; you can hire them out to competing companies.
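To illustrate the point: a chat-completion call is just an HTTP request with a JSON body, so "switching providers" can reduce to changing the base URL and model name. A minimal sketch, assuming an OpenAI-style `/v1/chat/completions` endpoint; the GPU-host URL and model names below are hypothetical examples, not real endpoints:

```python
import json

def chat_request(base_url, model, prompt):
    """Build the (url, payload) pair for an OpenAI-style chat completion call."""
    url = f"{base_url}/v1/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(payload)

# Proprietary hosted model:
openai_req = chat_request("https://api.openai.com", "gpt-3.5-turbo", "Hi")

# Same code path, open-source model served on leased GPUs
# (llm.example-gpu-host.com is a placeholder for any hosting provider):
hosted_req = chat_request("https://llm.example-gpu-host.com", "llama-13b", "Hi")
```

The application code stays identical either way; only the infrastructure behind the URL differs.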


Sure, but it's extra work slowing you down while your competitor is surfing the wave at full speed. Moreover, you'd be relying on an older LLM while OpenAI keeps developing newer versions of theirs, maintaining its competitive advantage. Even Google, which has the infra, has a ridiculously bad LLM in the competition.


Yes, commoditize the competition.



