This performance conundrum reeks of excessive (incidental) complexity. I've warmed up to JavaScript over the past 15 years, but forcing website authors to concern themselves with such details becomes less and less appealing and grows ever more ridiculous. I hope the future of HTML+JS involves a better way.
I thoroughly enjoyed the spiritual essence and relative purity of HyperCard back in the 1990s.
Edit: @ly: Apologies, I should have been more specific and clear.
I was referring to the complexity matrix of what may or may not cause page rendering to block at any given moment. Complexity at this level all but ensures you will make a suboptimal decision during development unless you run every release candidate through Google PageSpeed.
The general trend is to make new things non-blocking, and discourage use of the old blocking things. (Hence stuff like <script async> and ES 'import' declarations.) If you’re writing greenfield, you don’t really have to worry about any of this: new developers who aren’t working on legacy code could be taught about only the non-blocking way of doing things and never even know these problems exist.
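For illustration, a quick sketch of the blocking vs. non-blocking variants (filenames are placeholders):

```html
<!-- Classic blocking form: parsing halts until this is fetched and executed -->
<script src="legacy.js"></script>

<!-- async: fetched in parallel, runs as soon as it arrives (order not guaranteed) -->
<script async src="analytics.js"></script>

<!-- defer: fetched in parallel, runs after parsing finishes, in document order -->
<script defer src="app.js"></script>

<!-- Module scripts are deferred by default and support import declarations -->
<script type="module">
  import { init } from './app.mjs';
  init();
</script>
```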
Hopefully with wasm, combined with a new DOM at some point, we can have the same situation on the web as we do with native code, where we’re free to use any language we want. With fair competition I expect JS would slowly fade.
This is a good point, though for the sake of discussion imagine that JS didn't exist, and had no ecosystem at the time Node came on the scene. Do you assume it would have been conceived for that purpose?
Arguably Node took off because it allowed an already-existing huge ecosystem of libraries, tools, and developers to be leveraged for the back end, with all of those resources in turn coming from the browser.
I'd argue that Node took off because it was a greenfield environment where nearly everything was encouraged to use the event loop for asynchronous programming, so everything worked nicely together.
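Roughly the style I mean, as it looked in early Node (a minimal sketch; import syntax modernized, filenames illustrative):

```ts
import http from 'http';
import fs from 'fs';

http.createServer((req, res) => {
  // Non-blocking read: the event loop keeps serving other requests meanwhile.
  fs.readFile('./index.html', (err, data) => {
    if (err) {
      res.writeHead(500);
      res.end('error');
      return;
    }
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end(data);
  });
}).listen(3000);
```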
> Node took off because it allowed an already-existing huge ecosystem of libraries, tools
It didn't. JS didn't have a huge set of existing tools and libraries when Node showed up. And the ones that did exist certainly didn't work in Node, considering they were written for the browser without a care in the world for other environments.
Why is Node so popular server-side? I believe JS is one of the easiest languages to step into. Its standard syntax is super easy to read, its standard lib is very small, and it performs extremely well; it's one of the most performant scripting languages there is, helped by heavily optimized browser rendering engines and JavaScript engines. Nah, there is a reason why JS is still with us.
No, the sole reason why JS became popular server-side is because it’s the same language as used in the browser client-side. A large number of people started out learning programming in the browser, and having been “primed” with JS that way, it’s what they’re used to and then also want to use server-side.
This. Nobody liked JS in the server-side world before Node came along.
And many people still hate JS even after years of server-side development, especially those who have worked with far sounder languages like Python and Ruby on frameworks like Django or RoR. There is no quality framework for JS in that space, even though many startups are using Node as a backend. Django's popularity is rising again these days in hiring.
One big reason I use JS on the backend is because Prisma can generate types for my API endpoints which I can use for my client code. Having confidence that my code works as expected is a huge deal and having end to end type safety is a great first step towards that.
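Roughly this pattern (a sketch, assuming a Prisma schema with a User model keyed by an Int id; names are illustrative):

```ts
import { PrismaClient, User } from '@prisma/client';

const prisma = new PrismaClient();

// The return type flows from the Prisma-generated User model...
export async function getUser(id: number): Promise<User | null> {
  return prisma.user.findUnique({ where: { id } });
}

// ...so client code can derive the exact response shape instead of guessing:
// type UserResponse = Awaited<ReturnType<typeof getUser>>;
```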
Do you see a new DOM being realistic? Not a sarky comment. I ask because I don't see the problem with a new <bellend> tag, meaning that in 2023 browsers start supporting both an <html> and a <bellend> tag, representing an end to legacy and a line to the future.
One silver lining of the otherwise (IMHO) mostly-bad centralization of browsers is that it is conceivable, through our standards groups, that the major browser vendors could agree to experiment with a replacement.
From my perspective the problem is even in the name itself, the D in DOM stands for document, which is exactly the problem. The DOM was never designed as an application framework but we've shoehorned it into that role.
I find both this comment and the original article super weird. Conventional wisdom has been against the use of document.write since before many of today's webdevs were even born.
It's still in the browser, of course, because it has to be. But it's not as if people are being taught to use document.write and then having the rug pulled out from under them in a "sike!" moment when it's revealed that this often-used fixture of the programmer's toolbox is actually problematic. What strikes me as odd about the article is that we've reached a point where not only are people not being taught to use document.write anymore, they're not being taught not to use it anymore, either. That's how much of a non-issue this is. So what's with the article belaboring the point?
Nobody is forcing anyone not to use document.write(). The author voluntarily ran a Lighthouse test on their website, which makes this suggestion because the page doesn't conform to the Lighthouse "Best Practices".
The runtimes for Node and the browser engines are different enough that they are really different dialects. JavaScript is like Lisp in this way. It was simpler when JavaScript wasn’t supposed to be a write-once-run-anywhere language.
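For example, the ambient globals alone are enough to make code non-portable between the two (sketch):

```ts
// browser.ts: only runs in a browser, where the DOM globals are ambient.
document.title = 'hello';
localStorage.setItem('seen', '1');
```

```ts
// node.ts: only runs in Node, with its own built-ins and no DOM at all.
import os from 'os';
console.log(process.version, os.platform());
```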
All of these complexities have either existed or evolved in the browser completely independent of Node. And while I have plenty of criticism for Node particularly around its dependency resolution, it didn’t introduce new complexity in this case (other than what it inflicted on itself to allow ESM/CJS to coexist, which doesn’t extend to browsers at all.)
As far as write once/run anywhere goes, Deno has been pretty good at showing this is still limiting, but it could be much less painful if the server runtime shared more applicable APIs instead of doing idiosyncratic stuff ad hoc.
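e.g. code like this runs unchanged in Deno precisely because it leans on the web's own APIs (sketch; URL illustrative):

```ts
// fetch, URL, Request, etc. are the same web APIs the browser exposes,
// so no runtime-specific import is needed.
const res = await fetch(new URL('/status', 'https://example.com'));
console.log(res.status);
```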
I’ll definitely agree that JS is a bit Lisp-like, but not due to runtimes. It’s been a build target with build-time semantics for a long while. Unfortunately it’s unlike Lisp in that those semantics are part of arbitrary tooling which doesn’t always agree but has been forced to coexist, rather than semantics of the language itself.
> The whole reason I’m writing this post is that I have a client at the moment who is using document.write() late in the <head>. As we now know, this pushes both the fetch and the execution on the main thread.
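For anyone who hasn't run into it, the pattern in question looks something like this (URL illustrative):

```html
<head>
  <script>
    // Written this way, the preload scanner can't see the URL ahead of time,
    // so the fetch and the execution both serialize on the blocked main thread.
    document.write('<script src="https://third-party.example/tag.js"><\/script>');
  </script>
</head>
```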
Are there any tools you can use to help detect stuff like this, in order to do performance optimizations?
Off-topic: Apparently the URL flavor this month is infected with the .dev TLD proliferation g-corp agenda (as is golang.org -> golang.dev). Do cool URLs ever really change?
I’m not defending the usage of `document.write()`, but my understanding of why 3rd-party scripts might like using it is to prevent element flicker on load. If you own the site, there are various tricks available to avoid element flashing, but fewer tools are available as a 3rd party. Still, it’s not a good choice, and as a 3rd party you probably shouldn’t be trying to make those design decisions, but I think that is why its usage is more common in 3rd-party scripts.
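A sketch of the trade-off as I understand it (class names illustrative):

```ts
// Synchronous write: the widget markup exists before first paint, so the
// slot never flashes empty, at the cost of blocking the parser.
document.write('<div class="widget">Loading…</div>');

// The non-blocking alternative a 3rd party could use instead; the slot may
// briefly paint empty before this script runs.
const slot = document.createElement('div');
slot.className = 'widget';
document.currentScript?.insertAdjacentElement('beforebegin', slot);
```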
Is that faster though? Can your Ruby/PHP/Python running on the server really do the conditional rendering faster than the JavaScript running in the browser?
Your server is already running a scripting language. Putting a script tag on the page necessarily means the browser then needs to parse and interpret some JavaScript, which it simply wouldn't otherwise need to do. If we assume the server and the client both execute an if statement at the same speed, yes, it's purely adding overhead to run that on the client (not to mention the potential for the additional cost of transmitting the eliminated code).
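A sketch of the difference (Express-style server, an assumed dependency; the premium flag is a stand-in for a real user check):

```ts
import express from 'express';

const app = express();

app.get('/', (req, res) => {
  // Decided server-side: the losing branch never reaches the client at all.
  const premium = req.query.premium === '1';
  const promo = premium ? '' : '<aside id="promo">…</aside>';
  res.send(`<main>${promo}<article>…</article></main>`);
});

app.listen(3000);

// The client-side equivalent ships both branches plus the script to pick one:
//   document.getElementById('promo').hidden = userIsPremium;
```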
Sorry, I thought it was obvious. The overhead is practically zero compared to the amount of html you can comment out.
You can have a cached static html document with really huge (conditional) comments with tiny performance loss. display:none for example still does all kinds of things. Inserting html with js is even worse.
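If I'm reading this right, something like (illustrative):

```html
<!-- One cached static document; the branch that doesn't apply is left inside
     an HTML comment. The parser skips commented-out markup entirely: no DOM
     nodes, no image fetches. display:none, by contrast, still builds, styles,
     and loads everything it hides. -->
<main>
  <section>…free content…</section>
  <!--
  <section>…premium content…</section>
  -->
</main>
```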