
The calculations here seem to depend on a few false assumptions:

1. That all datacenter GPUs being purchased are feeding AI. You might be able to argue that some are, or even that most are, but you can't tell how many just by looking at Nvidia's sales numbers. I know of at least two projects deploying rows of GPU-filled cabinets in datacenters for non-AI workloads.

2. That pay-for-an-API is the only AI business model. What we now call "AI" has been driving Google's search and ad businesses for nearly a decade, so does that mean AI is already doing $300B/yr in revenue? There's no way for this guy to quantify how much AI is solving problems that aren't packaged as SaaS. (See the sketch below for how much the attribution assumption alone moves the numbers.)
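To make point 1 concrete, here's a toy back-of-envelope in Python. Every number and parameter in it is made up for illustration (the $50B capex figure, the 2x overhead multiplier, the 50% margin), not taken from the article; the only point is that the "revenue AI needs to generate" estimate scales directly with whatever fraction of GPU spend you assume is actually AI.

    # Toy model: implied AI revenue needed to pay back GPU capex,
    # as a function of how much of that capex you attribute to AI.
    # All inputs are hypothetical placeholders, not the article's figures.

    def implied_ai_revenue_needed(gpu_capex_b, ai_fraction,
                                  overhead_multiplier=2.0,
                                  target_gross_margin=0.5):
        """Revenue (in $B) AI products would need to cover the
        AI-attributable share of GPU capex plus datacenter overhead,
        at a given gross margin."""
        ai_capex = gpu_capex_b * ai_fraction          # only the AI-attributable spend
        total_cost = ai_capex * overhead_multiplier   # power, networking, facilities...
        return total_cost / target_gross_margin       # revenue needed to cover it

    for ai_fraction in (1.0, 0.8, 0.5):
        needed = implied_ai_revenue_needed(gpu_capex_b=50, ai_fraction=ai_fraction)
        print(f"AI share of GPU spend {ai_fraction:.0%}: ~${needed:.0f}B revenue needed")

Drop the AI share from 100% to 50% and the "required" revenue halves, before you even argue about margins or overhead.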

David Chan, if you are reading this, feel free to email me if you want a fact check for what will surely be the third installment in the series.



Yes, but Google hasn't needed GPUs to power search. It ran ML inference and training on CPUs even before deep learning.



