I mean, all of that is grand, but you remain far, far from the metal in all these instances, and even in assembly you cannot get below several layers of abstraction with modern architectures
in the days of 6502 and Z80 compatible microprocessors, you could go down to the NAND gate level, even to the transistor level - it was doable, not that it made much sense for most people to go there, but it was nearly a requirement to know how many cycles each instruction took, and to understand locking conditions such as bus access contention when dealing with different parts of memory, etc
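As an illustration of what that cycle-level awareness looked like, here is a minimal sketch (in Python, purely for readability) that tallies the cost of a classic 6502 delay loop from a per-instruction cycle table; the cycle values are the commonly documented base counts, and the loop itself is a made-up example rather than anything from a specific program.

```python
# Rough sketch: tallying cycles for a tiny 6502 delay loop.
# Cycle counts are the commonly documented base values; branches
# take an extra cycle when taken (and another on a page crossing),
# which this toy model only partially accounts for.

CYCLES = {
    "LDX #imm": 2,   # load X with an immediate value
    "DEX":      2,   # decrement X
    "BNE":      2,   # branch not taken; +1 if taken, +1 on page cross
    "RTS":      6,   # return from subroutine
}

def delay_loop_cycles(count: int) -> int:
    """Estimate cycles for: LDX #count / loop: DEX / BNE loop / RTS."""
    total = CYCLES["LDX #imm"]
    # DEX and BNE execute `count` times; BNE is taken on all but the last pass.
    total += count * (CYCLES["DEX"] + CYCLES["BNE"])
    total += (count - 1) * 1          # +1 cycle for each taken branch
    total += CYCLES["RTS"]
    return total

if __name__ == "__main__":
    # At ~1 MHz, cycles map almost directly to microseconds.
    print(delay_loop_cycles(255), "cycles, roughly that many microseconds at 1 MHz")
```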
there were essentially no hard black boxes then, whereas now there are many at the hardware level, and even the OS level has become impenetrable; not even the CTOs of major OS companies and initiatives have a complete understanding of the OS they produce - perhaps a select few have a functional understanding with only 2 or 3 black boxes involved (specific driver magic, hardware detail that sits below the C and assembly optimisation layer, etc., which typically outsourced specialists deal with)
GPUs alone these days are more complex than the entirety of the systems back then, and the implications of this are two-fold:
- the culture on the producer end is that software, machines and accessories are sold as black boxes, both by expectation and by enforcement from manufacturers, who have no incentive whatsoever to expose the internals of their products and many reasons not to, since the competition might use them; these dynamics are extending to repairability as well
- the culture on the consumer end is that internals are impossible to deal with and best taken for granted, and the curiosity just isn't there anymore; also, if things break you just get new things, which in turn disincentivises producing anything meant to last, since the assumption is that it will be obsolete soon anyway, or the system will simply stop working - reinforcing that dynamic and cheapening the value of the work
In essence, the hardware hasn't changed a lot. There was less focus on efficiency and much more on price - computers were expensive. In the (early) 80s, memory came at a premium and most micros had no permanent storage (except audio cassettes), so the CPU, peripherals and OS abstraction weren't thought of as the bottleneck.
Given only kilobytes of memory, the problem basically came down to: how much can you get your computer to do, useful or fun, within those RAM limits. I think that perspective left room for (and indeed required) learning about and understanding every level of everything inside the piece of hardware you bought. Abstraction was still relevant, but detailed documentation on the actual guts was much more necessary then than it is today.
To exaggerate: why would you need to understand how to bitshift/add/compare to perform multiplication or division if you never need to think beyond python 3/typescript/java syntax?!
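For what it's worth, here is a minimal sketch of the shift/add/compare technique being referred to, written in Python for readability; on a 6502 or Z80 you would do the same thing with carry flags and rotate instructions, but the algorithm is the same.

```python
def shift_add_multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers using only shifts, adds and compares,
    the way an 8-bit CPU without a MUL instruction would do it."""
    result = 0
    while b:
        if b & 1:            # if the low bit of b is set...
            result += a      # ...add the current shifted value of a
        a <<= 1              # shift a left (a doubles)
        b >>= 1              # shift b right (examine the next bit)
    return result

def shift_sub_divide(dividend: int, divisor: int) -> tuple[int, int]:
    """Restoring-style division with shifts, subtracts and compares.
    Returns (quotient, remainder)."""
    if divisor == 0:
        raise ZeroDivisionError
    quotient = 0
    remainder = 0
    for bit in range(dividend.bit_length() - 1, -1, -1):
        remainder = (remainder << 1) | ((dividend >> bit) & 1)
        quotient <<= 1
        if remainder >= divisor:   # the "compare" step
            remainder -= divisor
            quotient |= 1
    return quotient, remainder

assert shift_add_multiply(13, 11) == 143
assert shift_sub_divide(143, 11) == (13, 0)
```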
there have been many developments that have been strongly anti-user
partly because of the limitations of those days, there was a strong incentive to expose every detail of a product to consumers - details that weren't abstracted away from professionals or developers at any level either
the abstractions of the day were solidly grounded in the nature of the underlying hardware, whereas nowadays we've used the freedom and leeway afforded by more powerful hardware to build intermediate abstractions that are more transient and throwaway
for instance, everybody doing amateur-level computing in the mid-80s had a solid grasp of the concept of interpretation vs compilation, understood binary coding, base conversion, a decent amount of information theory and boolean logic, addressing modes, fundamental data structures... those were all basic requirements, and a lot of that is knowledge that won't fully go obsolete per se (see the small base-conversion sketch below)
whereas now, with none of that being a requirement, people are often just given explicit instructions to do something and it just "automagically" happens, and this comes pretty much by definition with an increasingly "smart" interface and a "dumb" user, who is usually defeated if he or she tries to think outside the box (and thus doesn't)
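As promised above, here is a hedged little sketch of one piece of that once-universal knowledge - converting a number between bases by repeated division, the way it was routinely done by hand - again in Python purely for readability; the digit alphabet and function names are my own choices for illustration.

```python
DIGITS = "0123456789ABCDEF"   # digit alphabet, enough for bases up to 16

def to_base(n: int, base: int) -> str:
    """Convert a non-negative integer to its textual representation in `base`
    by repeated division, collecting remainders from least to most significant."""
    if not 2 <= base <= len(DIGITS):
        raise ValueError("unsupported base")
    if n == 0:
        return "0"
    out = []
    while n:
        n, r = divmod(n, base)
        out.append(DIGITS[r])
    return "".join(reversed(out))

def from_base(s: str, base: int) -> int:
    """Inverse operation: accumulate digits most-significant first."""
    n = 0
    for ch in s:
        n = n * base + DIGITS.index(ch.upper())
    return n

assert to_base(49152, 16) == "C000"     # a familiar 6502-era address
assert from_base("C000", 16) == 49152
assert to_base(200, 2) == "11001000"
```

Nothing in it is clever; the point is simply that this sort of mechanical translation between representations was once assumed background knowledge rather than something an interface hides from you.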