This is one of the most cleanly straightforward and mature posts I've read on HN in a while. And quite practical too, particularly that last portion. Also, agreed.
The original tradeoff was mega-cap FAANG companies trying to offload processing power to the client. There never was an organic open source push for SPAs or front-end JS frameworks. They add a ton of tech debt and degrade the UX. Premature optimization and an anti-pattern for everyone but a handful of companies, imo.
The old world was a complex web stack that included strange templating languages hacked onto languages that were sometimes invented before HTML was even a thing (see: Python), spitting out a mix of HTML and JavaScript.
Then there was the fact that state lived on both the client and the server and could (would...) easily get out of sync, leading to a crappy user experience, or even lost data.
Oh and web apps of the era were slow. Like, dog slow. However bloated and crappy the reddit app is, the old Slashdot site was slower, even on broadband.
> They add a ton of tech debt and degrade the UX.
They remove a huge portion of the tech stack: no longer do you have a backend managing data, a backend generating HTML+JS, and a frontend that is JS.
Does no one remember that jQuery was used IN ADDITION TO server-side rendering?
And for what it's worth, modern frameworks like React are not that large. A fully featured complex SPA with fancy effects, animations, and live DB connections with real-time state updates can weigh in at under a megabyte.
Time to first paint is another concern, but that is a much more complicated issue.
If people want to complain about anything, I'd say complain about ads. The SINGLE 500KB bundle being streamed down from the main page isn't what's taking 5 seconds to load. (And good sites will split the bundle into parts and prioritize delivering the code needed for initial first use, so the real cost is however long 100KB takes to transfer nowadays.)
> Oh and web apps of the era were slow. Like, dog slow. However bloated and crappy the reddit app is, the old Slashdot site was slower, even on broadband.
Just those that attempted to realize every minuscule client-side UI change by performing full-page server-side rendering. Which admittedly were quite a few, but by far not all of them.
The better ones were those that struck a good balance between doing stuff on the server and on the client, and those were blazingly fast. This very site, HN, would probably qualify as one of those, albeit a functionally simple example.
SPAs are just a capitulation in the face of the task to strike this balance. That doesn't mean that it is necessarily the wrong path - if the ideal balance for a particular use case would be very client side heavy (think a web image editor application) then the availability of robust SPA frameworks is a godsend.
However, that does not mean it would be a good idea to apply the SPA approach to other cases in which the ideal balance would be to do much more on the server side - which in my opinion applies to most of the "classic" types of websites that we are used to since the early days, like bulletin boards, for example.
> Oh and web apps of the era were slow. Like, dog slow. However bloated and crappy the reddit app is, the old Slashdot site was slower, even on broadband.
Which reddit app are you talking about, the redesign or old.reddit.com? I ask because the old version of reddit itself certainly wasn't slow on the user side; iirc reddit moved to the new SPA because their server-side code was nigh unmaintainable and slow because of bad practices of the time.
> Time to first paint is another concern, but that is a much more complicated issue.
That's the thing though: with static sites where jQuery is used only on updates to your data, the initial rendering is fast. Browsers are really good at rendering static content, whereas predicting what JS is going to do is really hard.
The new reddit site on desktop is actually really nice. Once I understood that it is optimized around content consumption I realized how it is an improvement for certain use cases. Previously opening comments opened either a new tab, or navigated away from the current page, which meant when hitting back the user lost their place in the flow of the front page. The new UI fixes that.
Mobile sucks, I use RIF instead, or old.reddit.com if I am roaming internationally and want to read some text only subreddits.
> That's the thing though: with static sites where jQuery is used only on updates to your data, the initial rendering is fast. Browsers are really good at rendering static content, whereas predicting what JS is going to do is really hard.
Depends how static the content is. For a blog post? Sure, the content should be delivered statically and the comments loaded dynamically. Let's ignore how many implementations of that are absolutely horrible (disqus) and presume someone at least tries to do it correctly.
But we're all forgetting how slow server-side rendering was. 10 years ago, before SPAs, sites took forever to load not because of slow connections (I had a 20mbps connection back in 1999; by 2010 I was up to maybe 40, and not much has changed in the last 10 years) but because the server side was slow.
If anything more content (ads, trackers..) is being delivered now in the same amount of time.
New reddit makes it easier to push ads; any other motivation for its implementation is an afterthought. There's plenty of valid criticism that can be levied against the claim that the redesign is "superior" by default. And I think often we confuse amount of information with quality of information exchange. Due (mostly) to the ever increasing amounts of new users that it desires, you could easily make the point that the quality of content on reddit has nosedived. Optimizing for time on site is not the same thing as optimizing for time well spent.
Reddit as a company obviously wants more users; a design that lets people scroll on through images ad nauseam is certainly better than a design that is more information dense, so if that's something you'd cite as an example of "better in certain use cases" then I agree, otherwise there's plenty of reasons to use old.reddit.com from an end user's perspective.
Even if everything you said was true (it's definitely not!) that doesn't explain why the web is bogged down with entirely static content being delivered with beefy JavaScript frameworks.
10 years ago it was static content being delivered by ASPX, JSP and PHP, with a bunch of hacked together JS being delivered to the client for attempts at a rich user experience.
It still sucked. It just sucked differently. I'll admit it was better for the user's battery life, but even the article shows that it was not any faster.
I don't know where this misconception came from - XMLHttpRequest was invented by Microsoft for use in Outlook Web Access, Gmail was essentially a copy of that.
The first web versions of Outlook were plenty fast and usable on my then workstation (PIII 667 MHz w/ 256 meg). In fact, a lot of the web applications made 15 years ago were fast enough for comfortable use on Pentium 4 and G4 CPUs, because most used combinations of server-side rendering and vanilla JS. It was painful to develop, sure, but the tradeoff in place now is severely detrimental to end users.
Anything we evolve to desire signals genetic fitness.
Genetic fitness means anything that results in more grandchildren in the end. Physical health is a big factor, sure, but so is intelligence and social skills.
Maybe this is in part a linguistic disagreement. By "attractive" I don't just mean "looks good in a photo", but any factor that attracts a partner.
As an arts/law graduate, who has since moved into a senior, highly technical IT role, I can honestly say that had I not moved into the IT field, my existing and already large university tuition debt would never have been repaid within my lifetime.
There is such a huge glut of humanities graduates in Australia now, that unless you are graduating within the very very top percentile, you are guaranteed to have wasted both time and money for no return.
Rust is much too low level for most glue/web services. Unless you have a specific high performance requirement (and Go doesn't meet this), there's no real strong case for migration here.
Rust itself is not inherently "low level" per se. But others are probably right that the whole web services ecosystem for Rust is rather half-baked at this time, the OP notwithstanding.
But I don’t? The borrow checker takes care of that? String vs &str is trivial to get ones head around, usually it’s really easy to decide whether you’d like to pass a reference or ownership, and worst comes to worst, sprinkling some copy/clone etc to get things sorted quickly still yields a binary that’s faster and more robust than something I can whip up in Python...
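For what it's worth, the ownership decision described above can be sketched like this (the function names are made up for illustration, not from the thread): borrow with `&str` when you only read, take a `String` when the callee keeps the value, and reach for `.clone()` as the quick escape hatch:

```rust
// Hypothetical sketch of the String vs &str decision: borrow for
// read-only access, take ownership when the callee keeps the value.

// Borrows: the caller keeps ownership of its String.
fn shout(s: &str) -> String {
    s.to_uppercase()
}

// Takes ownership: the value moves into the returned Vec.
fn keep(s: String) -> Vec<String> {
    vec![s]
}

fn main() {
    let name = String::from("hello");
    let loud = shout(&name);  // `name` is only borrowed here...
    let stored = keep(name);  // ...so it can still be moved afterwards.
    // If `name` were needed again below this point, `keep(name.clone())`
    // would be the "sprinkle a clone" fix, at a small runtime cost.
    println!("{loud} / {stored:?}");
}
```

The borrow checker's role is exactly to reject the version where `name` is used again after the move, at compile time rather than at runtime.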
The borrow checker only checks, it does not solve the problem. In other languages the problem does not even exist to begin with.
It is not a trivial problem to solve (contrary to what you claim); otherwise we would never have needed the borrow checker to avoid memory bugs, nor higher-level languages to speed up development by avoiding the problem altogether.
If you are going to end up sprinkling clones, heap allocating and reference counting, then you could have used C#, Java or JS to begin with which are plenty fast with their JIT/VMs and not think about memory at all.
Finally, comparing against Python is a very, very low bar for performance.
> In other languages the problem does not even exist to begin with
I am going to disagree here, because I've run into my share of memory issues in Python and C#/F#, and I'm sure by this point, everyone is well acquainted with Java's memory issues.
> It is not a trivial problem to solve (as you claim), otherwise we would have never needed the borrow checker to avoid memory bugs, nor higher level languages to speed up development by avoiding the problem altogether.
I'm not claiming that memory management is a trivial problem, I'm saying the borrow checker takes care of enough and the compiler/clippy hints when I do something wrong help me fix it easily enough. I write code slightly slower than I would in Python, but at the end, what I get from the Rust code is something that is more robust and more hardware efficient.
> If you are going to end up sprinkling clones, heap allocating and reference counting, then you could have used C#, Java or JS to begin with which are plenty fast with their JIT/VMs and not think about memory at all.
Rust's type system is enough to make me want to use it over dotnet. JS is a language with...some issues...that is fortunate enough to have a nice JIT; I don't consider it a serious choice for doing anything except web front-ends. I find C# needlessly convoluted and I dislike all the implicit mutability, but those complaints are very subjective.
The difference is that even if I have some clones and ref counts, they're rare, and the resulting binary is still outrageously fast, and has clear indicators of where to come back to and improve so as to not need the clone/reference counting/etc.
> Finally, comparing against Python is a very, very low bar for performance.
I compare against Python because that's the other language I do most of my work in.
You were talking about the borrow checker, which is mainly about memory safety, not memory limit issues.
In Python, C#, Java, JS... you are memory safe without dealing with memory management nor a borrow checker.
There are many languages running on top of those VMs for all kinds of tastes (OOP, functional, strict type systems, loose ones...). Claiming Rust leads to more robust software than any of those is an exceptional claim, but even if that were true, the key is the development cost.
A typed language is a typed language; there are other languages that are easy to get performance out of. I'm not a Rust dev and I'm highly skeptical it will be used outside of Firefox and a few niche projects after this initial hype train dies off. What other features would make me pick Rust over golang or one of the interpreted languages?
> a typed language is a typed language
Well yeah, but not every type system is equal. For example I vastly prefer Rust's type system to C's because of Options instead of null and enums as sum types.
> what other features would make me pick rust over golang
Generics, iterators, pattern matching, etc. There's lots of features Rust has that golang doesn't; that's not necessarily a good thing, but for what I do it is. IMO the only good things about golang's featurelessness are the compile times and the standard library.
As for interpreted languages, IMO it's just better to be able to catch errors at compile time.
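A small sketch of what those features buy you in practice (the types here are invented for illustration): an enum as a sum type that must be matched exhaustively, with `Option` instead of null so the compiler forces the empty case to be handled:

```rust
// Illustrative only: a sum type, pattern matching, and Option.

enum Shape {
    Circle { radius: f64 },
    Rect { w: f64, h: f64 },
}

// The match must cover every variant, or the code will not compile.
fn area(s: &Shape) -> f64 {
    match s {
        Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
        Shape::Rect { w, h } => w * h,
    }
}

// Option<f64> instead of null: a caller cannot forget the empty case.
fn largest_area(shapes: &[Shape]) -> Option<f64> {
    shapes.iter().map(area).fold(None, |best, a| match best {
        Some(b) if b >= a => Some(b),
        _ => Some(a),
    })
}

fn main() {
    let shapes = [
        Shape::Rect { w: 2.0, h: 3.0 },
        Shape::Circle { radius: 1.0 },
    ];
    assert_eq!(largest_area(&shapes), Some(6.0));
    assert_eq!(largest_area(&[]), None);
}
```

Adding a third `Shape` variant later turns every non-exhaustive `match` into a compile error, which is the kind of mistake an interpreted language would only surface at runtime.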
> Well yeah, but not every type system is equal. For example I vastly prefer Rust's type system to C's because of Options instead of null and enums as sum types.
Fair enough. But I'm not advocating using C here either.
> As for interpreted languages, IMO it's just better to be able to catch errors at compile time.
Just because you have a garbage collector doesn't mean you don't have to worry about memory management. I've seen too many problems pop up because people don't understand how memory is managed in their GC'd language.
This isn’t true; in all languages with one you can ignore the garbage collector and still get work done. It may not be the most efficient, but you still get work done. Let’s get a fresh-out-of-a-code-bootcamp grad in here and throw two languages in front of them if you want to test this.
You may be able to get work done, but I've seen actual bugs because people didn't understand how memory was managed. For example, not realizing that passing an object to a function was passing a reference, and not a copy. These are things that are explicit in Rust.
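To illustrate the point above (the example is made up, not from the thread): in Rust, whether a callee can mutate or consume your value is visible at the call site, so the reference-vs-copy confusion described here can't happen silently:

```rust
// Hypothetical example: mutation and copying are explicit at the call site.

// Mutates the caller's value; requires an explicit `&mut` to call.
fn add_tag(tags: &mut Vec<String>) {
    tags.push(String::from("new"));
}

// Consumes its argument; the caller must move it in, or clone explicitly.
fn count(tags: Vec<String>) -> usize {
    tags.len()
}

fn main() {
    let mut tags = vec![String::from("old")];
    add_tag(&mut tags);           // `&mut` signals mutation to the reader
    let n = count(tags.clone());  // `.clone()` makes the copy explicit
    assert_eq!(n, 2);
    assert_eq!(tags.len(), 2);    // original untouched by `count`
    // Calling `count(tags)` without the clone would move `tags`, and any
    // later use of it would be a compile error, not a silent aliasing bug.
}
```

In a GC'd language both calls would look identical at the call site, which is exactly how the "I thought this was a copy" class of bug slips through.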
This won't result in any security-related bug; you'd be updating the referenced version instead of a copied version. Both testing and use of the written code will show this "bug", if it's in fact a bug for this specific codebase. So now the question is: does Rust's difficult learning curve warrant removing this "maybe" bug? There are other things to consider as well: memory fragmentation, performance, etc. Have you measured the performance of code that both copies and updates?
But this is still using a hammer to screw in a nail. Rust is a systems language; it’s a junior dev move to force it into a web server. Use Go or TypeScript for this, not Rust. Just like I wouldn’t write C++ for a backend unless I’m trying to shave off some nanoseconds.
Actually the OP only wrote that the current state of the ecosystem is surprisingly mature, but he doesn't recommend writing anything serious in it yet.
Personally I don't see the point of implementing a typical web application in Rust - the performance improvements you get will be lost on IO-bound applications, but you'll still be saddled with the complexity of memory management. I'd rather suggest rewriting VS Code or the Slack client in Rust (i.e. apps which currently use Web technologies on the desktop) - those would definitely benefit more from increased performance and reduced memory footprint...
> the performance improvements you get will be lost on IO-bound applications
Performance starts mattering even in IO-bound applications as soon as you're trying to seriously scale out. Especially when running on a cloud-based platform. As for "the complexity of memory management", people like to bring this up about Rust but OP suggests that it's not a huge concern with the language.
I do agree that rewriting stuff like Electron-based apps should be a priority, and that Rust can help this via easy bindings to native OS and GUI platforms.
If you remove the specific technical choice from mind (a great engineering team can move mountains with any data layer, something I've also learnt working in numerous contract teams at multi-billion-dollar companies), the simple act of rewriting and consolidating huge amounts of data/infra (and stacks) can yield enormous gains.
I will be tracking their technical reporting more closely over the next year.
"I will be tracking their technical reporting more closely over the next year."
Just curious, but what specifically do you mean? ie. I'd love to keep an eye on this also, but have no idea where to find these types of (public) reports.