Hacker News | new | past | comments | ask | show | jobs | submit | login

Yes, and no. If you already have a lot of experience building apps that run on servers, there's a learning curve to switching to serverless. Is it huge? Not really. But there are certainly pitfalls and best practices to learn about. The costs can be harder to predict, especially when starting out. And the tooling is different (and much less mature). So now you have a bunch of stuff to learn about or consider, or you can just go do the same thing you already know how to do with minimal friction. It's possible that the cost savings of not overprovisioning servers is worth it, but I don't think it's that straightforward of an answer, and if your server costs aren't massive, you might be better off spending your time building a great product than learning a new way to build.
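The overprovisioning trade-off above comes down to simple arithmetic: a back-of-envelope break-even between a flat monthly server bill and per-invocation serverless pricing. A minimal sketch, with all prices and parameters being illustrative assumptions rather than real quotes:

```python
# Back-of-envelope break-even: always-on server vs. serverless.
# All numbers below are illustrative assumptions, not real price quotes.

SERVER_MONTHLY = 40.0                  # hypothetical always-on box, $/month
SERVERLESS_PER_M_REQUESTS = 0.20       # hypothetical $ per 1M invocations
SERVERLESS_PER_GB_SECOND = 0.0000166667  # hypothetical $ per GB-second

def serverless_monthly_cost(requests_per_month, avg_seconds=0.1, mem_gb=0.128):
    """Estimate monthly serverless spend: invocation fee + compute time."""
    invocation = requests_per_month / 1_000_000 * SERVERLESS_PER_M_REQUESTS
    compute = requests_per_month * avg_seconds * mem_gb * SERVERLESS_PER_GB_SECOND
    return invocation + compute

# Light traffic: serverless costs pennies, far under the flat server bill.
# Heavy, sustained traffic (the "boxes at 100%" case): the flat bill wins.
light = serverless_monthly_cost(100_000)        # ~ $0.04
heavy = serverless_monthly_cost(500_000_000)    # ~ $207
```

The shape of the result is the point: below some traffic threshold the pay-per-use model is nearly free, and above it the always-on server is cheaper, which is why "it depends on your utilization" is the honest answer here.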


I think, and I'm replying to both above comments, that there's a non-trivial cost reduction + DevOps reduction potential here.

Of course, the overarching rule is always "if it works, don't touch it".


How? His boxes are running at 100%. He might be able to save some money by switching to a hosted DB and maybe running his webservers serverless (and creating some keep-alive triggers).

Nontrivial cost reduction would be switching to a different host instead of AWS.
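The "keep-alive triggers" mentioned above are just a scheduled ping that hits the function's endpoint before the platform spins it down, so requests don't pay the cold-start penalty. A minimal sketch, where the URL and interval are assumed values (in practice you'd use the platform's own scheduler, e.g. a cron-style rule, rather than a long-running script):

```python
# Hypothetical keep-alive pinger: hit a health endpoint on a fixed
# interval so the serverless runtime keeps a warm instance around.
# PING_URL and INTERVAL_SECONDS are illustrative assumptions.
import time
import urllib.request

PING_URL = "https://example.com/health"  # hypothetical warm-up endpoint
INTERVAL_SECONDS = 300                   # ping before the idle timeout hits

def ping(url=PING_URL, timeout=5):
    """Send one lightweight GET and return the HTTP status code."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.status

if __name__ == "__main__":
    while True:
        try:
            ping()
        except OSError:
            pass  # a failed ping is fine; the next interval retries
        time.sleep(INTERVAL_SECONDS)
```

Note the caveat implicit in the thread: a keep-alive only warms one instance, so it helps latency for light traffic but does nothing for the cost picture under sustained load.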



