> would significantly reduce the engine's performance (a slowdown of anywhere from 1.5× to 10× or more on computationally intensive tasks, depending on the workload).
And the downside being?
Seriously, JS was never meant to be performant. In the real world, it's very rarely used for anything computationally intensive.
If you mean “wasn’t originally meant”, that might be true. But it’s been meant to be performant for quite a long time now, with huge investments behind realizing that intent.
It’s fine if you have nostalgia for whatever you think was the original vision behind JS. But that hasn’t been the operating vision for it for many years.
Very curious what your unique definition of 'computationally intensive' is that manages not to include one of the most significant computational workloads worldwide, both in absolute volume and in impact on human productivity. Namely, web browser rendering.
Huh? The rendering is part of the browser itself. It's not like JS has to run every frame to put the pixels onto the screen.
Making AJAX requests and doing things to the DOM tree isn't a "computational workload", because there's hardly any computation happening in that JS code.
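To be concrete, here's a toy sketch of the kind of front-end code I mean (the endpoint, element id, and function name are all invented for illustration):

```typescript
// Toy sketch: one request, one DOM update. The CPU work in this JS is
// negligible; the expensive parts (network I/O, layout, paint) happen
// outside it, inside the browser engine.
// ("/api/users" and "user-list" are made-up examples.)
async function refreshUserList(): Promise<void> {
  const res = await fetch("/api/users");               // I/O-bound wait
  const users: { name: string }[] = await res.json();  // tiny parse
  const list = document.getElementById("user-list");
  if (list === null) return;
  list.textContent = "";                               // clear old entries
  for (const u of users) {
    const li = document.createElement("li");
    li.textContent = u.name;                           // cheap DOM glue
    list.appendChild(li);
  }
}
```

Nothing in there would even notice a 1.5–10× interpreter slowdown; the JS is just orchestrating work that happens elsewhere.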
Rendering pages server-side and avoiding 5-megabyte bundles might help with that. JIT and other browser-side performance optimizations just delay the problem anyway. The culture around web development needs to change.