No kidding. I'm on a >1 Gbit fibre line and most of my page-load time still goes to downloading recursive, pointless JavaScript dependencies, or to firing a billion CORS OPTIONS requests and waiting for the responses. DNS latency doesn't even register in the end-user experience; it's dominated entirely by front-end design decisions.
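For the unfamiliar: any cross-origin request that isn't "simple" (say, one carrying an Authorization header) forces the browser to send an OPTIONS preflight and wait for the answer before the real request goes out. A rough sketch of what triggers that, with a made-up origin and endpoint:

    // Hypothetical call from a page on https://app.example.com.
    // The Authorization header makes this a "non-simple" request, so the
    // browser first sends a preflight on its own:
    //
    //   OPTIONS /v1/items
    //   Origin: https://app.example.com
    //   Access-Control-Request-Method: GET
    //   Access-Control-Request-Headers: authorization
    //
    // Only after the server answers with matching Access-Control-Allow-*
    // headers does the actual GET go out -- a full extra round trip,
    // unless the response sets Access-Control-Max-Age so the browser can
    // cache the preflight result for a while.
    const res = await fetch("https://api.example.com/v1/items", {
      headers: { Authorization: "Bearer <token>" },
    });
    console.log(res.status);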
If this were a concern, it would be an admission that JavaScript developers have already optimized to the best of their ability. Which is just sad.
When I first saw the CORS headers that large sites send with every effin' request, I thought I had gone mad, only to learn that it's actually the encouraged practice... I remember the days of monstrously sized cookies; this is the same story, except this one won't go away with the rise of server-side sessions.
Would I recommend uMatrix to my non-technical friends? Absolutely not.
For technical people it's a matter of self-selection. It's worse for those of us who can't be bothered to check which scripts break a page and decide whether to enable them, and it's great for the rest. Personally, I can't imagine using the web without uMatrix (and uBlock Origin). For a site that really breaks no matter what, if I really need it, I open it in a Vivaldi private window (that browser only has uBlock), and if all else fails I start Chrome and close it right after I've done what I had to do.
But it's compensated for by the sites that actually work better once you strip the non-local JS out of them. I've lost track of the number of times I've come to the HN comments and read how unreadable the page was because of all the popups and ads and modal dialogs, when I had just read the text without any of that. You're absolutely not wrong that some sites get worse, but it's not one-sided in that direction.
uBlock Origin is probably the better middle ground for most people. It accomplishes most of what you describe, yet breakage is the exception rather than the rule.
Many inexperienced developers don't understand the difference in latency between a local in-memory call and a remote HTTP API call, and so treat the two as architecturally equivalent.
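To make the gap concrete, here's a rough sketch (TypeScript, run as an ES module on Node 18+; the endpoint is made up): an in-process map lookup finishes in microseconds, while the same lookup over HTTP pays for DNS, TCP, TLS and server time, easily tens to hundreds of milliseconds.

    // Rough comparison of an in-memory lookup vs. a remote HTTP lookup.
    // Needs Node 18+ (global fetch); api.example.com is a stand-in.
    const cache = new Map<string, string>([["user:42", "Alice"]]);

    function timeLocal(): number {
      const start = performance.now();
      cache.get("user:42"); // in-process memory access
      return performance.now() - start;
    }

    async function timeRemote(): Promise<number> {
      const start = performance.now();
      await fetch("https://api.example.com/users/42"); // full network round trip
      return performance.now() - start;
    }

    console.log(`local:  ${timeLocal().toFixed(4)} ms`);          // typically well under 0.01 ms
    console.log(`remote: ${(await timeRemote()).toFixed(1)} ms`); // typically tens to hundreds of ms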
It's been a truism for many years now: "slow" is often not your side of the pipe, especially with high-bandwidth connections.
>If this were a concern, it would be an admission that JavaScript developers have already optimized to the best of their ability. Which is just sad.
I mean, sure, but "is this the best you can do?" requires knowing what the goals are, and I suspect the issues raised in these comments are hardly on any of those lists. New Relic and its third-party cool-usage-graphing friends are way, way higher priority.