Ask HN: What was better in the past in tech?
74 points by rixed on Feb 21, 2022 | 138 comments
Not that progress should be denied entirely, but I often think that the past gets discarded too quickly and many good ideas get thrown out with the bathwater. I'm certainly not the only one to feel that way.

So I'm wondering: what good ideas that have since disappeared do other HNers remember?

I'll start:

In the golden age of Sun workstations, the BIOS was written in Forth, and the ROM contained a Forth interpreter. Not only was every extension card's ROM interpreted, making all extension cards architecture independent, but you were also given a Forth REPL to tinker with the boot process, or in fact at any later point once the system had started, via a special key combination.

That was in my opinion way ahead of the modern BIOSes, even taking into account OpenBIOS.

Your turn?



Programs were sold, not subscribed to. They continued to work long after the company folded.

Being a mischievous teenage hacker didn't land you in prison.

Software makers gave a shit about how much RAM and how many CPU cycles they were using. Disk space was sacred.

The technocrats weren't always a given. At least the hacker-hippies who resented corporate control gave us an alternative; we could be living in a completely proprietary world. Compilers and even languages used to cost money; I shudder now when I see a proprietary language.

Once upon a time, computers did what you told them to without reporting you to the stasi, Google, or a number of marketing firms. Now that kind of freedom is obscure and hard to access for most people.

Software used to not hide all the options to "protect me from myself".

Computers used to be bigger. I love my office computer, but you gotta admit, fridge size and even room size 1401 style computers are pretty damn cool. I'm planning to buy a fiberglass cooling tower for a big-ish computer for a project this summer...

There used to be killer apps and amazing innovations, but now it's just ads and single-function SaaS leases. At least open source projects are still incredible. There are a few amazing commercial software products too. It's the future, after all.

That's enough grumpy ranting for the minute; I'm sure I'll have to append to this.

(P.S. remember when computers didn't have an out-of-band management system doing God knows what in the background?)


>> Software used to not hide all the options to "protect me from myself".

I agree this is not great for us power users. However, ten years ago I was the free PC support for extended family and distant friends, constantly removing five toolbars from IE and all sorts of adware, spyware, and pre-installed crap... Now it's been years since I had to resolve anything. So at least this "protecting people from themselves" is really working out great.


Not that long ago I built a gaming computer for my girlfriend's nephew. He asked me to help him with a problem, and when I opened up Chrome to google it, the browser defaulted to Yahoo search. He said it had been doing that for a while and he didn't know why. Those days are unfortunately not over.


The in-laws' computer had a similar issue: it was getting a bunch of shady popups piling up from the bottom-right corner (Windows notification popups), eventually covering a quarter of the screen. I tracked it down to Chrome and the sites that had been allowed to send push notifications. It took a minute to purge the allow list and turn off the prompt asking for notification permissions (non-technical people tend to click "OK" or "Yes" on most dialog boxes).

But overall much easier than the old days of having to manually remove malware and/or reinstall the OS.


>Being a mischievous teenage hacker didn't land you in prison.

On this note: hacking, cracking and phreaking were done purely for curiosity's sake. Even virus developers did it just because they could.

There was a sense in the community of pushing the technology to its limits.

Nowadays hacking, cracking and all the 'black arts' are for pure profit, be it private or government...


>Once upon a time, computers did what you told them to without reporting you to the stasi, Google, or a number of marketing firms. Now that kind of freedom is obscure and hard to access for most people.

Remember when you could note the number of sent/received packets in your network connection properties before going to bed, and wake up in the morning and see exactly the same number?
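(If you want to recreate that ritual today, here's a minimal sketch in Python; it assumes the third-party psutil package, and the interface names are just whatever your machine reports:)

    import psutil

    # Snapshot per-interface packet counters, e.g. before going to bed.
    before = psutil.net_io_counters(pernic=True)

    input("Press Enter in the morning to compare... ")

    after = psutil.net_io_counters(pernic=True)
    for nic, now in after.items():
        old = before.get(nic)
        if old is None:
            continue  # interface appeared overnight; nothing to compare against
        sent = now.packets_sent - old.packets_sent
        recv = now.packets_recv - old.packets_recv
        print(f"{nic}: {sent} packets sent, {recv} received while you slept")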


> Once upon a time, computers did what you told them to without reporting you to the stasi, Google, or a number of marketing firms. Now that kind of freedom is obscure and hard to access for most people.

Varoufakis answers the question here[^1]. Use subtitles, translation is acceptable.

Yanis pointed out that any new "successful common" (e.g. building societies) gets privatized. The most widely known example he points at is the internet.

He also points out that this is nothing new; it happened in the 19th century as well.

[^1]: https://youtu.be/JfGgRf0JPr8?t=1761

[^2]: https://en.wikipedia.org/wiki/Kratos_(mythology)


The first computer that was mine (completely mine) was a Game Boy. Like, the original, classic Game Boy, whatever its name is. We also had an 80286 PC, and compared to that it was 'small'. This was back in 1991 or so. I also had a 'smartwatch' which had a Nintendo game (one. Yes, one game): Tetris. You also had Palm computers back then. Everyone has a 'small' portable/mobile computer with networking these days, but back then portable/mobile computers existed as well. They just didn't have networking, were very limited, etc. Piracy existed then as well. Back in the day you could study a chip by putting it under a microscope; the Soviet Union made VAX clones that way. Today, with EUV, I don't think that's possible anymore.


> Being a mischievous teenage hacker didn't land you in prison.

This is describing a time before many readers of HN were alive, let alone programming. The Hacker Crackdown (https://en.wikipedia.org/wiki/The_Hacker_Crackdown) documents when the US started cracking down in the 80s.

The UK had a similar incident in which a Prestel mailbox belonging to Prince Philip was hacked in the mid-80s, resulting in the Computer Misuse Act 1990.


And the downsides:

* Getting information and help was difficult. At best there was c.l.c and such if you had a modem, but there was so much gatekeeping going on that it was hard to get a straight answer on anything.

* Source code was hard to come by. Everyone was so damn bent on keeping their precious code secret that you could only learn best practices if you happened to be in a job with good leadership.

* Hobbyist embedded systems were all but impossible unless you rolled your own. Working on an embedded platform meant using crappy tools, expensive and clunky in-circuit-emulators, and proprietary toolchains. Otherwise it was time for a homebrew etching tank.

* Storage, backups, and code versioning were a problem. Sure, we had CVS and eventually SVN and SourceSafe, but man did they suck!

* Hardware was super expensive. Software was super expensive. Getting anything done on a tight budget required a lot of creative thinking.

* Spending all day squinting at a small monitor sucked.

* Software (especially development environments) was chock full of sharp edges and required arcane knowledge and incantations. They were called UNIX wizards for a reason.

* Data communications and interchange formats were TERRIBLE (and always proprietary)

* Multilingual support was an exercise in madness.

* Bash was one of the nastiest undead languages ever invented. Oh wait, it still is...

* "If it was hard to write, it should be hard to read and understand" was the mantra of the day.


Regarding "getting information was difficult"

I've worked in game development for 25 years so YMMV: a few years ago I had an internet outage for a week or so and had to work on the engine just using my library of books. It was definitely slower and more cumbersome BUT it was a much more rewarding experience. I produced substantially better ideas with better documentation and with MUCH better execution. I don't think it was just because of the quality of my library. I think it was because I was afforded the space to think, something that had not happened for quite some time.

I now make a point to investigate my library before blogs, forums and the pitiful Stack Overflow. Slower learning has led to deeper thinking.


"Instant information is not for me. I prefer to search library stacks because when I work to learn something, I remember it." -Harper Lee


The sheer insanity that is shell scripting is something I don't see talked about often enough.

Bash is absolutely wonderful as a CLI driver. It's a truly awful scripting language.

And yet, I see many, many projects with huge, enormous shell scripts. It beggars belief.


"I can use the shell" != "I can write shell scripts"

Bash is really difficult to write well (if at all) and too many people don't appreciate this. Of course this leads to instability in shell scripts. Offender number 1? Dockerfiles!


That's not a shell problem; that's due to Docker not having start- and end-layer flags, so you end up with stupidly long cmd && cmd && cmd sequences to ensure it's all in one layer.


Bash truly shouldn't be used for a bunch of scripting tasks.

Quite simply, I don't think you should ever start using shell scripts in your code base in any capacity.

Eventually you're going to reach something you just don't want to do in Bash, and suddenly now you've got two scripting languages in your codebase?

I don't think you want that trouble. Start with a scripting language with decent control structures on day one and never have to worry about refactoring.
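For instance, a chore that often starts life as a shell one-liner (finding oversized log files under a directory) stays manageable in plain Python with just the standard library. A sketch, with the directory and threshold made up purely for illustration:

    from pathlib import Path

    # Hypothetical inputs, purely for illustration.
    LOG_DIR = Path("/var/log")
    THRESHOLD = 10 * 1024 * 1024  # 10 MiB

    big_files = []
    for path in LOG_DIR.rglob("*.log"):
        try:
            size = path.stat().st_size
        except OSError:
            continue  # file vanished or is unreadable; skip it cleanly
        if size > THRESHOLD:
            big_files.append((size, path))

    # Real data structures and control flow, no word-splitting or quoting pitfalls.
    for size, path in sorted(big_files, reverse=True):
        print(f"{size:>12}  {path}")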


> and clunky in-circuit-emulators…

I wrote software for in-circuit emulators from Applied Microsystems back in the late 80s and early 90s. I was fortunate to be able to work with some really talented people.

Our high-end ICEs were expensive, maybe $40K at the time. I can think of at least one company that bought 120 of them from us, although most of our customers only bought a few.

A lot of NRE went into building such systems. And yes, I think they probably were clunky yet capable of some amazing things when it came to debugging embedded systems.


> Getting information and help was difficult

As time goes on, I don't see this as the benefit I used to. When Google supplanted AltaVista as my go-to engine for technical docs, and then when Stack Overflow came on the scene, I initially thought these were great developments. But I was wrong.

Yes, it took much longer to both obtain and read the Intel/Microsoft/VESA/Cisco/IETF/etc documentation, but at the end of that process I always had a very solid understanding of what I was doing, why I was doing it, and where to look if X or Y goes wrong. Engineering coworkers and managers understood the process, understood the time delays involved, and accepted what were typical turnaround times for both programming and debugging iterations.

Nowadays? I solve the problem much more quickly, but within a range of zero to very little understanding of why it works, what is above and below me in the abstraction hierarchy, what the hardware is actually doing, what the performance tradeoffs are, etc. And then to speak to engineering culture, if you ever step out to do it the old-fashioned way, impatience often sets in and you're told to just Google it, bro.

If something goes horribly wrong, would you prefer your engineering team to have developed a deep understanding of what's actually going on under the hood, or would you rather have a pile of stackoverflow answers patched together?

I think the old way was better. My brain is dumber and lazier than before, and I'm uncomfortable with this feeling. It's just sad.


Exactly, that is my personal feeling as well. I'd rather spend the extra time understanding the solution instead of just copy-pasting and adapting the first Stack Overflow answer.

In fact, one of the most striking things I've encountered in GitHub issues, Stack Overflow, etc. is the number of people who simply try something, find that it works for them, and share it like some sacred solution, but when asked why it works, they shrug and cannot provide an answer. And there is a sea of validated answers just like that on those platforms.

My fear is that we are slowly rolling down a hill at the bottom of which IT becomes less of an engineering discipline and everything comes down to trying things until they work (without proper understanding), because it's all just magic.

Yes, sometimes (if not most of the time) developers are in a rush and have no extra time to spend on understanding a solution, so it is much easier to slam in copy-pasted code and move on. Still, this should not be the way to go...


> Sure, we had CVS
> […]
> Data communications and interchange formats were TERRIBLE (and always proprietary)

Not always. We had Kermit (https://en.wikipedia.org/wiki/Kermit_(protocol)) and NCSA Telnet (https://en.wikipedia.org/wiki/NCSA_Telnet) years before we had CVS (https://en.wikipedia.org/wiki/Concurrent_Versions_System) (1981, 1986, and 1990, respectively).


Yeah, documentation required effort. There was good documentation, but you had to know where to find it. You had to network to find it, too.

Software-wise, everything was enabled by default. You had to disable services yourself, which meant your machine was remotely exploitable right after install.


>* Getting information and help was difficult. At best there was c.l.c and such if you had a modem, but there was so much gatekeeping going on that it was hard to get a straight answer on anything.

Just to clarify, by c.l.c, you mean comp.lang.c on Usenet, right?


Yes, usually accessed via FidoNet if you weren't at a university.


> * Spending all day squinting at a small monitor sucked.

To be fair, the pixels were bigger back then.


Rapid Application Development, as in classic Visual Basic and Borland Delphi. I remember, as a college student, being able to create GUI apps with relative ease.

Today, even electron is reserved for the experienced web dev. See roadmap.sh

I believe we lost a lot when we transitioned from desktop apps to web and mobile apps and JavaScript won.

Even ActionScript was way ahead of JS during its time!


Low-Code tools like Retool allow you to do this. They have the same limitations though (like inflexible layouts that don't adapt to different screen sizes), which hurts especially on mobile.


There were similar low code/no code systems at the time, they were called “App Makers” or “App Generators”.

Something in the class of VB6 or early Delphi is not commonly available. Everything today is comparatively clunky and bureaucratic. They can do a ton more, but take a lot more effort to produce the first usable thing.

(Similarly, the simplicity of the old ROM BASICs in micros is now unavailable. You can do a lot more, and more easily, but you can't just 10 PRINT "HELLO" / 20 GOTO 10.)


Xojo does a pretty good job (Like VB6 for Mac/Win/Linux/Web)


Responsive layouts for varying screen sizes didn't exist in the 90s, so it's not a fair comparison; "responsive web design" was only coined in 2010. And RAD apps aren't tied to a database, unlike Retool.

With RAD apps, the barrier to entry was low. You installed the IDE from this thing called a CD-ROM, dragged some components, and added some code to make it reactive (yes, we had reactive components in 1998!). Compile and run.


If it were not for VB and Delphi, I wouldn't have kept my interest in programming. Mucking around with C on the command line took my teenage self only so far before I lost interest.

If I had free time and enough dedication I'd re-learn Pascal just to write desktop apps with Lazarus. Alas, that will probably never happen.


> Today, even electron is reserved for the experienced web dev. See roadmap.sh

I am a recent graduate, but I think UI development has always been hard, whether mobile, web or desktop. The web seems to have had the lowest barrier to _entry_ (because of HTML and JS, I think).

I have had classmates who tried Flutter tell me that it's so much easier than doing the same thing on the web. While it was UI builders then, UI-as-code is the new trend. There are some exciting directions now too (Flutter, Jetpack Compose, Svelte with its first-class reactivity).


Now to put on my grumpy pants...

Programming nowadays feels more like an exercise in importing other people's code correctly. I feel like it's mostly writing something that takes data format X, converts it to format Y for library Z and then 'just works'[sic]. The 'heavy lifting' is usually not happening in your code, but in one of the libraries/full-on programs being pulled in, and your code just happens to be calling it with the right parameters.

This isn't all bad, in that it allows ideas to be tested and products to be created in a testable (and sometimes even shippable) form in a ridiculously fast amount of time, but then you're also far more prone to discover some library in your stack had a breaking change a few versions ago and you suddenly don't know when your upgrade will be delivered because you don't know if it's just a 'legacy=true' parameter that needs to be passed in or they redid their core somehow.

Or you try to profile your code just to find 97% of the execution is from the single 'load_thing_from_internet()' call and you have no idea if you want to fork and maintain a branch of that thing, switch it out for something else, or try to write your own. And you probably have dozens of these in the code you don't even know about because the libraries you import are just the same thing.

I think this whole process makes for sloppy, difficult-to-understand, and slightly scary applications -- and this is basically all applications running today.


> Programming nowadays feels more like an exercise in importing other people's code correctly. I feel like it's mostly writing something that takes data format X, converts it to format Y for library Z and then 'just works'[sic]. The 'heavy lifting' is usually not happening in your code, but in one of the libraries/full-on programs being pulled in, and your code just happens to be calling it with the right parameters.

AKA "plumbing". Yes I agree this is pretty much all I've been doing the last few years.

> This isn't all bad, in that it allows ideas to be tested and products to be created in a testable (and sometimes even shippable) form in a ridiculously fast amount of time, but then you're also far more prone to discover some library in your stack had a breaking change a few versions ago and you suddenly don't know when your upgrade will be delivered because you don't know if it's just a 'legacy=true' parameter that needs to be passed in or they redid their core somehow.

I know it's a buzzword but it shows how important DevOps is in modern development. It's something I need to spend more time on...


The web, circa 2001.

It was easy enough to make a website that anyone could do it. There were services to build a page (Geocities, Angelfire), and if you wanted a bit more control you could host something on a shared server as simply as FTP'ing some HTML files to a remote directory. Expectations were low. People rarely criticised. That meant some truly wacky and creative things got built. Taking payments online was relatively hard work. That meant no one really expected to make much money online. Even ads were only just starting, and most people didn't bother. People made fan sites for things they were passionate about, just to say they had a website. It was never a "side hustle"; it was just a hobby. That was nice.

It was also the era of Flash, which led to some brilliant and creative sites.

Languages like Perl and PHP were taking hold of server-side generation, so real SaaS businesses were starting to take shape as well.

I miss it a little. I have no doubts that the web of today is better, especially given the fact it's the basis of my 25 year (so far) career, but there are definitely aspects of it that I'd bring back if I could. The web should be more fun.


>It was easy enough to make a website that anyone could do it. There were services to build a page (Geocities, Angelfire), and if you wanted a bit more control you could host something on a shared server as simply as FTP'ing some HTML files to a remote directory.

Just for what it's worth, this web still exists! Free hosting sites are still around, as is (S)FTP and plain ol' HTML.

Just a couple of weeks ago, I wrote a python script that generates + uploads webpages to help my wife with her work using nothing more than print() statements and an SFTP command.
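In case anyone wants the flavor of it, here's a minimal sketch of that approach (not my actual script; the host, remote path and page content below are placeholders, and it assumes OpenSSH's sftp client is installed):

    import subprocess

    # Build a trivial page the old-fashioned way: just write out HTML.
    items = ["Widget A", "Widget B", "Widget C"]  # placeholder data
    with open("index.html", "w") as f:
        print("<html><body><h1>Inventory</h1><ul>", file=f)
        for item in items:
            print(f"<li>{item}</li>", file=f)
        print("</ul></body></html>", file=f)

    # Upload it with a single sftp batch command read from stdin.
    subprocess.run(
        ["sftp", "-b", "-", "user@example.com"],
        input="put index.html public_html/index.html\n",
        text=True,
        check=True,
    )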


> Just for what it's worth, this web still exists!

> I wrote a python script that generates + uploads webpages...

This is the problem in a nutshell. I'm aware that it still technically exists, but my own experience means that as soon as I start, I think "I'll be doing this more than once, I'll automate it!" and I dive into a thorny bush of build scripts and dependencies and questions like "How will I notify myself if the script fails?", which takes a lot of the fun away.

My expectations of what a website is, and how it should work, and my awareness of the pitfalls of the Old Ways means I can't really go back to how it was before. I don't want to. I miss the good bits, but I know there were lots of bad bits as well.


I much prefer the web now, than the web then.

Mainly because now you can run your own VPS cheaply and install whatever exotic tech stack you want on it.


Though the VPS is the cloud, not the web. Your exotic tech stack would be the equivalent of a set of .html files in Notepad.


Similar to https://news.ycombinator.com/item?id=30414068:

FoxPro

Imagine everything that today's "low-code" apps pretend to do, but done better, faster, and far more powerfully. And under DOS, with FoxPro 2.6.

You could do it all: forms, reports, apps, utilities, and coding the database (queries, stored procedures, etc.), all with the SAME language and zero impedance mismatch.

And the form/report builder rivaled what Delphi did later.

---

My dream is to resurrect the spirit of it (https://tablam.org). One thing that sets these tools apart from the "low-code" of today: they were made to run on YOUR devices, not in a "cloud" where you are at the mercy not only of overlords but of latency. This kind of tool feels way faster than modern ones for one little reason: running locally is way faster!

So, if I could find a way to dedicate myself to it, this could be its major advantage: local first, cloud capable.


FoxPro was a fantastic improvement over dBase. It was rumored that Microsoft bought it to put it to pasture because it was competing with MS Access. Seeing how quickly Visual FoxPro stalled, I tend to agree.


> It was rumored that Microsoft bought it to put it to pasture because it was competing with MS Access

And certainly, Fox was the more capable. Also, MS Access as a UI builder was fine (a few things were nicer here and there), but scripting it with VB was a joke. It's clear that the Fox language was the better fit, and that is NOT just "IMHO"!

But that was not the only competition: SQL Server. With Fox around, the need for SQL Server was near zero for a lot of small businesses.

By contrast, the Access engine is not only a joke, it is actively bad (prone to corruption), so very quickly you need a "proper" DB!

---

I have known a good replacement was needed since MS ended support, but I kept waiting for one, thinking it was a no-brainer product to make. How foolish...


You're completely right, it was to push SQL Server. Small businesses could get away with an Access database, but they were more likely to just use Excel.

Maybe I mixed the two because FoxPro could service both markets, but SQL Server wasn't a good fit for small businesses, and Access was supposed to fill that niche.


Consistent graphical user interfaces drawn by the operating system with standard widgets that automatically respected user selected color and font themes.


100% this.

I use Linux with GTK, and if there is a choice of applications I'll try the GTK one first because of that consistency.

Which is fine as far as it goes, but I can't do much about a lot of the proprietary apps I have to use for work.

IMO the classic WIMP desktop peaked with Win2000 from a UI/UX point of view (and I say that as someone who has run Linux since that era...)


Same, but with Qt. KDE has done a remarkable job at making GTK applications look native too though.


Wait, that was a thing?


Well, yes and no.

In the 1990s, a lot of desktop software was Windows-only, and used the Windows toolkit directly, so they supported the OS theming [1] including high-contrast options.

The idea you'd avoid the OS widgets, or that you'd want a touchscreen-but-no-keyboard UI on desktop software for consistency with a mobile version simply hadn't become mainstream. (Java had started testing the waters with cross-platform applets embedded in web pages but that's another story)

Of course, this was in the days of 1024x768 so large fonts were for the partially sighted, not like today's "high DPI mode" for people with 13" 4k screens. And most developers didn't test with different OS themes, so large-font settings could mean labels too big for their widgets, vital buttons getting pushed off screen and suchlike.

Other applications (well, MP3 players at least) would come with their own 'skinning' system allowing you to style them completely independently of the OS. And many games had their own full-screen menu systems that ignored OS theming, just like they do to this day.

[1] https://blog.codinghorror.com/a-tribute-to-the-windows-31-ho...


> In the 1990s, a lot of desktop software was Windows-only, and used the Windows toolkit directly, so they supported the OS theming [1] including high-contrast options.

Windows wasn't alone in providing this. Most desktop operating systems of the day did, including MacOS 8/9, BeOS, OS/2 Warp...

Of course, you're right that there never was a utopia of consistency where everything looked and worked the same and always respected user prefs, but the mid to late 90s was about as close as we ever got.


Got you, thanks for the link! I've been meaning to write a small native GUI app for fun but all the current options (GTK, Qt) are so large, it's a bit daunting. So I was full of hope when reading the original comment ^^'


> I've been meaning to write a small native GUI app for fun

Windows + Visual Studio Community + C# + WinForms = you can drag a button on to a panel, double-click it and edit the code behind, and start gluing a native app together in minutes.

The install is large, and the limitation of WinForms is that it's not dpi-aware and tends not to be resizable either, but the development experience is still pretty great. And if you somehow like the Windows 3.1 experience, all the C code from that era should still compile and run on a 2022 system. You can learn what a window procedure is; I believe people call this "immediate mode" GUI these days.

Oh, and as always, you can have native XOR portable.


I'd like to keep my workflow as Windows independent as possible but I'll maybe check that out if all else fails.


I've dipped my toes into GUI development just enough to know you can have "simple," or you can have "native," but not both. There are smaller projects like nanogui[0] and microgui[1] out there, but of course they're only as small as they are because they don't use native widgets.

[0]https://github.com/wjakob/nanogui/tree/master/src

[1]https://github.com/ryankurte/micro-gui


You could go with Lazarus[0]. It is dead simple to use (hope you like Pascal, though) and can use native widgets from your choice of Win32, Cocoa, GTK+ or QT.

[0] https://www.lazarus-ide.org/


Alright, thanks. It's pretty sad that we've mostly handed cross-platform development to the web. I'll see if I find something great along the way.


Cross-platform desktop development never really worked properly, partly because the platforms were competitors and had no interest in making it easy. The web is the "neutral zone" where platform-specific stuff has mostly been fought off, at least since we killed IE and Flash.

If you want to make a native app, the first question has to be "native to what platform?"


True enough. I'll still hold onto a dream of cross-compatible GUIs which work on most platforms. Maybe new advances in VM stuff will make that possible?


I think one big reason the web works as a GUI platform is that it wasn't designed with programmers in mind, but around a much simpler document and box-model paradigm with separate DSLs for layout, style and execution (HTML, CSS and JS respectively) - only one of which is (arguably) a programming language, and a simple one at that. It would be much different if anyone wanting to write an HTML document or GUI application had to do so in C++ using web APIs directly. Most GUI frameworks just aren't at a high enough level of abstraction to be as easy to work with as the web.

Also, obviously, because browsers and the web stack are already ubiquitous. Electron is basically just shipping a web app with a standardized runtime (a Chromium instance) which is exactly what many other apps do (eg, most game frameworks and anything written in Python, Java, etc,) but Electron still needs a platform specific binary to distribute. TANSTAAFL. At the end of the day, nothing is truly cross platform.


I agree with what you wrote, but at the same time I dislike how energy-intensive the web is. It doesn't _have_ to be, but it is nowadays. Going back to somewhat native GUI's is a step in the right direction in my opinion, at least in some aspects.


I'm gonna go with an emphatic NO on this one:

- Buying software, especially computer games for Windows 95/98 was a complete gamble. Maybe one third just worked, one third required Direct X/Soundcard Driver/Graphic/Bios fiddling to get something running (though crashes were frequent), one third just outright never worked at all.

- Web Development was an endless nightmare of IE6 compatibility, tables with eight separate pngs around an element to create a drop shadow, clearfixes and floatfixes, polyfills and fallbacks.

- SVN/TortoiseSVN version control was constantly corrupted or in some dodgy state. Different line endings or upper/lowercase filenames could get files stuck and unrecoverable.

- So much (and I mean SOOO MUCH) money was spent on buying and maintaining the serverroom (no cloud, remember) - so you had to buy and amateurishly maintain all this super expensive hardware - usually on a standard, off-the shelf T1 line connection to the internet.

- CD-ROMs got scratched, files got corrupted if your computer crashed during saving

- You spent eight hours a day in front of a giant, non-flatscreen cathode monitor that blasted your eyes with light and radiation, making you look properly stoned when you came home.

... I could go on and on... but honestly, stuff got soo much better over time, especially in tech. Sure, there are downsides (lootboxes/micro-transactions in games, expensive software subscriptions for what are essentially static products (I'm looking at you, my $59/month Adobe CC subscription), etc.), but overall, tech is much, much better today than it was 20 or 30 years ago.

Now - when it comes to social interactions and interpersonal communication, I am less sure...


> I'm gonna go with an emphatic NO on this one:

The point of the post is that it's two steps forward, one step back. Not that everything used to be better back in the day.

There are countless examples of why the tech world is better today. HTTPS alone makes a MITM attack more difficult, for example, but it's not a panacea.

> Buying software, especially computer games for Windows 95/98 was a complete gamble. Maybe one third just worked, one third required Direct X/Soundcard Driver/Graphic/Bios fiddling to get something running (though crashes were frequent), one third just outright never worked at all.

That's why I worked from references: I'd read a review in a magazine, or learn from a friend which hardware was stable. For example, I used a Plextor CD-RW drive which used SCSI. This led to massively fewer buffer underruns.

> You spent eight hours a day in front of a giant, non-flatscreen cathode monitor that blasted your eyes with light and radiation, making you look properly stoned when you came home.

Radiation from CRT monitors was harmful?


> - So much (and I mean SOOO MUCH) money was spent on buying and maintaining the serverroom (no cloud, remember) - so you had to buy and amateurishly maintain all this super expensive hardware - usually on a standard, off-the shelf T1 line connection to the internet.

This is not a thing of the past; migrating everything to the cloud is not advantageous for everyone. And even when it is, you're still left with (edge) servers to maintain.


I kind of agree with most of this. The question was not "don't you think the past was better", but rather "If you could revive something from the past, what would you pick?"


Documentation on system-level details was available, with DEC’s servers and workstations (PDP, VAX, MIPS, Alpha) and the Motorola computer products (e.g., the MVME board series for 68/88k and PPC) as great examples.

The availability of documentation enabled the porting of Linux and BSD systems in the 1990s without wasting a lot of time on reverse engineering the hardware details from the original OS.


What I find interesting is that this used to be the case on PCs too. For example, I have a manual from my first PC's "ATI Graphics Solution" graphics card that came with full information on BIOS calls, memory layout, etc. and on how to program it, including BASIC and C source code on the floppy disk.


The flip-side is that producing that manual was the work of 10 people.

Writing those things must have been an enormous endeavor.


This extends to videogame manuals.


Human interface guidelines. We had them and we respected them and it meant every application had a consistent set of basic behavior. By comparison web app usability is anarchy, and not the good kind.

A lot of other people have mentioned that software wasn't scraping the bottom of the revenue barrel by spying on your every move but I'll say it again because I think it's so important. This has been a huge shift in how software is designed and in the incentive structures behind app development. It has pushed the user far, far down the priority ladder, and I look forward to when this period in software history is over.

Oh, and ironically, software was faster.


The worst UI pattern the modern web has normalized is UI reflow. Things can move out from under your mouse/finger, and sometimes you end up touching the wrong element, or a notification.


30 years ago, information technology was made with the expectation that users would tinker with it. Subsequently, most products were designed with that in mind, and the documentation was written with tinkering in mind.

Today's consumer products are made with the expectation that they should always "just work". The result is that users' reactions to problems tend to range from frustration to panic and rage, rather than an inquisitive curiosity aimed at solving the problem.

I guess you could still tinker with most products, even if some brands make it increasingly hard. The real difference is in the expectations of the users.


>The result is that users' reactions to problems tend to range from frustration to panic and rage, rather than an inquisitive curiosity aimed at solving the problem.

To be fair, a lot more people now (many of whom are not tech nerds) are forced to use computers in order to do things that didn't require a computer 30 years ago.

I'm sure the average HN user's reaction would be closer to frustration than inquisitive curiosity if they were forced to navigate a complex and banal social system in order to do something like sign up for a phone plan or watch a movie.


Hell, my reaction is frustration when I am forced to navigate a complex and banal stack of broken technologies in order to do something like make my light switch work, and I'm a developer.

Technology is terrible, it barely works, and when you work at the edges of it (like we do), you hit the "barely" very often.


> To be fair, a lot more people now (many of whom are not tech nerds) are forced to use computers in order to do things that didn't require a computer 30 years ago.

This is definitely a contributing factor. Forcing non-technical people to use computers is a rather daft idea, but it saves money since the majority know just about enough about computing to manage to do things after a bit of trial and error. Any bump in the road is sure to cause tears and shouting though.


The difference is that only people who accepted the consequences of tinkering bought those products back then. We knew that if we messed up and bricked it, that was on us.

Consumer devices are made for consumers - people who don't know how to create. If they follow some messed-up tutorial on YouTube and brick their device, they don't see that that's their fault. They complain to the manufacturer and expect a replacement or a refund.

So manufacturers have to protect themselves from these people, by locking the device down so you can't mess it up. It's not to stop us from tinkering with it. It's to stop the consumers from breaking their stuff "accidentally".


VAX VMS Clustering: a single CPU architecture on VAX HW meant that micros and minis could cluster, and your process could be hosted anywhere in the cluster. It's taking a long, long time to reinvent that particular wheel with a combination of microservices and containers.

Xerox PARC Alto workstation: GUI, Ethernet networking and the Smalltalk-72 OO programming system, all in 1973! This wheel has been reinvented in part many times since by Apple, MS and others. How much real progress has there been in the last ~50 years?


Re. the VMS clustering: pardon my complete ignorance but how did orchestration work? How was the host of a new process chosen? Could processes be migrated from host to host? And how do you connect to a given service if you don't know where it's running?

Any link to an overview of how this worked would be greatly appreciated.


I miss machines that didn't require UEFI just to boot. Phooey to you Microsoft.

I miss repairable machines. Replacing a keyboard in a modern laptop is all hidden screws, sticky tape and one time use plastic studs you have to reglue because they have to be snapped off to remove the broken keyboard. Phooey to you Acer.

Like others, I miss the whole rapid application development movement. VB6 and Delphi just rocked at CRUD apps. Now we've got nothing but dependency hell, trying to get Electron started after you upgrade some library to pick up a bug fix and it cascades into a complete rebuild, or trying to remember what dark-wizard CLI incantation you use to add a page to Angular or React.

I miss being able to catch an exception in VisualWorks, fix the code, and hit continue.


The magic of having the TV blink on every press of the membrane keyboard while copying programs out of the manual, which had a funny smell[0]. Once my brother and I got Commodore 64's, we could see the golden age receding -- the Super Expander[1] clearly did not want us to suffer enough for our own good, and GEOS[2] embraced the blasphemy of XEROX. By the time I bought an Amiga instead of a car, my retired neighbor was already looking up from his Tandy[3] to will me off his damned lawn.

Seriously though, there have been a lot of Golden Ages already and I hope there will be a lot more.

My nostalgia for the time I started out is strong, but when I look at it objectively I see two things I think genuinely were better in the 80's, from a cultural if not an economic perspective:

1) There were many competing hardware and software ecosystems, even paradigms, and it was absolutely not obvious what we'd all be using five or ten years down the road. We live with an impoverished imagination of personal computing now.

2) People working in tech out of pure nerdy love outnumbered the people in it for the money by about 20:1. Now that's reversed, or worse.

[0]: https://www.timexsinclair.com/computers/sinclair-zx80/

[1]: https://www.c64-wiki.com/wiki/Super_Expander_64

[2]: https://en.wikipedia.org/wiki/GEOS_(8-bit_operating_system)

[3]: https://en.wikipedia.org/wiki/TRS-80


Applications worked offline, and they made good use of available system resources, in exactly the way that modern "web apps" do not.


The whole language standard of the Borland C++ compiler was written down in a telephone-book-sized manual that was delivered with the compiler itself. No Stack Overflow, no loading hundreds of libraries to do the most basic things. You just learned the language.


Back in the day, everything wasn't a means to sell ads or collect rent or promote yourself. Tech wasn't about disrupting the marketplace or the industry but about solving problems.


I agree with your statement. But I do have a (genuine, absolutely not sarcastic) question: as products and services were not meant for the purposes you mentioned, how was money made? Did everything have a price upfront, and was nothing 'free'?


An amazing amount was free and usable. Usenet was used for the stuff that is now broken up among Facebook (for non-private use), Twitter, Stack Exchange, etc. Free/libre software did exist, and though limited in scope, where it existed it was often better. GCC wasn't as good a compiler as some vendors', but it was better than some in code generation, and better than all in standards compliance, error messages and general UI. Emacs and vi were the best editors around.

NCSA Mosaic and the NCSA server (which popularized the web) were free. The web was free, and a huge percentage of it was informative, a labor of love free from commercial interest.

There was a lot of free/gratis but not free/libre software as well.

But you also had to pay for a lot more things than you do today. WYSIWYG word processors and spell checkers, for example, were expensive, and there was no free alternative (TeX was free and superior in quality, but with an abysmal UI).

I would gladly return the net, and possibly the software world, back to that state. A lot of people assume that without ads the net cannot exist - but it did, it was informative, and useful.


Quite a lot of it was, as always, pirated.

There obviously was a mainstream tech industry that charged for its products. Often these were unaffordable to amateurs. But, since SaaS was infeasible, the software had to be shipped on physical media and run on isolated computers, which meant that it could be pirated. That got a lot of people into the industry.

There was also the hybrid model of "shareware", where you were encouraged to make copies and give to your friends - but if you liked it you were asked to mail some money to the author. Possibly the last great shareware product was the original DOOM - the demo version fit on four floppy disks, and you could mail off for the release version with more levels/weapons.


There was of course payment for products and services, but there were also demos, shareware, and the equivalents of today's "social media" and "metaverse" -- and none of it was cynically designed to extract money from people.


I started to code 35 years ago on a Commodore 64. The beauty of those machines was that you literally had to write code to even load a game (LOAD, RUN, etc.).

As such, the barriers to start exploring programming on your own were low: the development environment was already there, you were familiar with the interface, and it booted in an instant. I haven't seen any modern day technology replicate that ease of access for beginning coders.


I always consider myself extremely lucky to have started learning tech when I did, because the devices were so simple then. I could code in Assembler on my BBC micro with no problem, because there were only 3 registers and everything was simple.

But the beauty of that was that there was nothing more complex. There are simple devices and environments around today, but they pale in comparison to the more complex environments. You can totally code a game up in a BBC emulator, for example, but it looks shit in comparison to the Unity tutorial game.

I was forced to learn Assembler because it was the only way of writing a video game on the BBC micro. And there was nothing better out there. If I was 14 again and trying to create video games to sate my urge now, I'd be learning Unity. Even though there are simpler environments available.


While I wasn't a fan of Windows 95/98 per se, there were some fun hacks where you would strip it down to only a few megabytes and run it from a RAM disk. And that was a serious feat with the RAM sizes back then.

I guess my point is that it was more easily understandable and hackable, and it wasn't a 10 GB install if all you needed was SOME version of Windows to run your games on.

Also zero phone home or mandatory/dark patterned Microsoft accounts.


This is true -- I managed to get Win95 running in a 16MB hardware RAMdisk. It was an interesting exercise, but it did work.

There was some terrible product I reviewed that let you run multiple versions of Win9x side by side on a PC with gigs of RAM (mid-1990s): it booted DOS, created a big RAM disk, copied your entire OS into the RAM drive, switched drive letters, and then your PC ran very fast...

But you had to do the special shutdown procedure, or you lost everything. And Win9x tended to crash relatively easily.

It was like someone came up with the most dangerous possible way to use lots of RAM to make Win9x go faster.

I gave it such a negative review that my editor refused to run it and wrote his own. Never happened before or since.


hehe, I don't remember all the details - but I just did it following some random tutorial on the internet, and obviously you should have all your "stuff" on some d: partition, but I actually ran it for a while - the only downside was the same as with every disk re-imaging program like Norton Ghost: if you wanted to change so much as your desktop wallpaper and wanted it to stick, you had to do a special boot into 'save changes' mode.

But I'm pretty sure a lot of things were handled with some autostart-link to a special batch file on d: :P As I said, it was fun - not sure I would've wanted this kind of stuff for a work machine, yeah.


I could give a dozen. Let me try to restrict myself...

Single function operating systems.

Cisco's PIX OS ran firewalls, and it did nothing else.

NetWare 2 & 3 shared files and printers, and nothing else.

As a result, NetWare (for instance) was relatively small and simple enough to understand completely, top to bottom, and the performance was blistering.

When you use the same handful of general purpose OSes for everything, on all hardware, the inevitable result is that they become huge to try to cover every conceivable base. Result: vast OSes which are vastly complex, so much so that no individual can totally understand the whole thing.

A functional OS for one specific task should be able to fit into no more than a double-digit number of megabytes of code, and one human should be able to read that entire codebase top to bottom in a comparable number of weeks and understand it completely.

If it can't fit into a normal human's head, then it can't be completely debugged and optimised except by stochastic methods. That's bad.

It's normal now and everyone expects it, but it's still bad.


In the old days, if the system couldn't satisfy a memory allocation request, the malloc function in the C library would return NULL and the application would respond accordingly. These days, if the system can't satisfy a memory request, it perversely returns a pointer to nowhere in the hope that the application won't try to use it. There's no portable way for an application to distinguish between usable and unusable pointers, and even if there were, the system could decide later that it has given away too much memory and therefore must repossess it by killing applications at random. (Don't get me started on "memory safety" in languages that ignore finiteness of the heap.) Yes, I know it's possible to configure a kernel like the old days, but then it better be on a dedicated server because otherwise desktop applications that have come to rely on this behavior would get flaky.
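If you want to poke at the difference yourself, here's a rough Linux-only sketch using Python's ctypes against glibc; which branch you land in depends on the kernel's overcommit settings, and the 1 TiB figure is just an arbitrarily huge request:

    import ctypes

    libc = ctypes.CDLL("libc.so.6", use_errno=True)
    libc.malloc.restype = ctypes.c_void_p
    libc.malloc.argtypes = [ctypes.c_size_t]
    libc.free.argtypes = [ctypes.c_void_p]

    request = 1 << 40  # ask for 1 TiB, far more than most machines have

    ptr = libc.malloc(request)
    if not ptr:
        # The "old days" contract: failure is reported here, at the call site.
        print("malloc returned NULL; the application can degrade gracefully")
    else:
        # With overcommit, the pointer is handed out and the pages only become
        # real when touched; writing through them is what can summon the OOM
        # killer much later, long after this call "succeeded".
        print("malloc 'succeeded'; the reckoning is deferred until the memory is used")
        libc.free(ptr)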


Usenet was amazing in its heyday. NewsWatcher and its derivatives were an amazingly efficient way to scan groups to find (or avoid) the flames, or whatever you were interested in.


The golden age for me was also on Sun workstations. I could leave my SPARCstation 20 running the C compiler for half an hour, go for a small walk around the factory floor, go and get some coffee and actually think about the next problem I was tackling and scribble some notes on paper.

Edit: also no YAML!


Magnetic media was for the people. I don't mean spinning hard drives but audio cassettes and floppies provided more freedom than any other format.


Audio cassettes are making a comeback!

It's a tiny niche right now, but that's what people said about LPs not so long ago.

Some of my favorite bands now release cassette tapes. I don't have a player at the moment, but soon!

https://theconversation.com/audio-cassettes-despite-being-a-...


I'm aware of /r/cassetteculture and other communities like it, but cassettes won't be back until literally any brand starts manufacturing and selling Type II cassettes, which are needed for archival work / four-track recording.


Wow, TIL... sealed TDK SA 90's cost $25 each on eBay!

Apparently this is due to a shortage of raw materials as well as market size?

I wonder if the problem could be solved by using other tape formats that might be more readily available, i.e. record audio to proper "good" tape in some other format and then figure out the rest later.


I definitely have some nostalgia for audio cassettes. I remember listening to the radio with the tape paused on record, waiting for a song to come on to make a mixtape. Floppies I can't say I miss much. They were always getting corrupted. The only good thing I can think of is that AOL sent out free ones.


Ha, youngsters.

We listened to the radio with the tape paused to record computer programs!


UK?


NL, UK equipment ( Acorn Electron )


It was much easier to understand everything about a system.

I remember working on an e-commerce system in the late 90s. I knew how Apache worked, I knew which modules I had compiled in, I knew how our proprietary code worked, I knew how Linux worked. Maybe to a lesser extent I knew how our Oracle DB worked in a deep sense, but I knew enough about RDBMSs, our PL/SQL, query plans, indexes, etc. that it was sufficient.

Whenever we had an issue, there wasn't a part of the system I wasn't deeply familiar with, and I could just dive in wherever.

I don't even know if that's possible today.


One thing I figure was advantageous is that kids in the 80s had to learn assembly on, say, an Apple II to make real games. The machine was so approachable at the time that they could wrap their heads around the whole architecture.


Manuals. Nothing can replace a good manual.


>Manuals. Nothing can replace a good manual.

"One of the questions that comes up all the time is: How enthusiastic is our support for UNIX? Unix was written on our machines and for our machines many years ago. Today, much of UNIX being done is done on our machines. Ten percent of our VAXs are going for UNIX use. UNIX is a simple language, easy to understand, easy to get started with. It's great for students, great for somewhat casual users, and it's great for interchanging programs between different machines. And so, because of its popularity in these markets, we support it. We have good UNIX on VAX and good UNIX on PDP-11s.

It is our belief, however, that serious professional users will run out of things they can do with UNIX. They'll want a real system and will end up doing VMS when they get to be serious about programming. With UNIX, if you're looking for something, you can easily and quickly check that small manual and find out that it's not there. With VMS, no matter what you look for -- it's literally a five-foot shelf of documentation -- if you look long enough it's there.

That's the difference -- the beauty of UNIX is it's simple; and the beauty of VMS is that it's all there." -- Ken Olsen, president of DEC, DECWORLD Vol. 8 No. 5, 1984

[0] http://www.anvari.org/fortune/Miscellaneous_Collections/3696...


Everything was so fragile.

You'd break things, learn, fix them for the time being, until the next thing broke.

Computing today is far less stressful. It somehow reminds me of how vehicles today take next-to-zero mechanical knowledge to operate, whereas not all that long ago you needed to know a mechanic and be reasonably handy yourself to keep them running smoothly.

Now things just work. That's broadly better. But I have to wonder how many of us would be in tech if we hadn't tinkered with tech growing up.


I think we will look back and see this as a golden age.

Supply-chain attacks are about to destroy our current model of trust in FOSS. Much as the internet was never designed with security in mind, and so has had to have security layered on top of it, badly, our current model of trusting code from the internet is about to get badly broken.

In 10 years' time we'll look back and get all nostalgic about being able to just pull a package off Git__b without worry.


> Supply-chain attacks are about to destroy our current model of trust in FOSS

If you are referring to the various NPM style issues that cropped up in the past couple of years, I would say that non-curated public repositories are definitely _NOT_ "the current model of trust in FOSS".

The current model of trust is a curated restricted store where trusted people publish signed versions of the software, namely the individual distributions' repositories. The public versions of these, something like the AUR and PPAs, always come with caveats and trust disclaimers. The fact that there are programming languages communities that want to circumvent this model does not mean that they represent the status quo.


Contrarian post :)

Firstly, you don't really mention when the golden age of tech is for you. The answer can be quite different depending on the technological niche you are thinking of[1]. I'd argue that it's not over yet, and we are probably only at the beginning.

Although I agree that many lessons were learned then forgotten to the past, and we keep re-discovering those.

Only this time open source and free software is a thing, hopefully this will help us build on a common and expanding base, instead of reinventing everything all the time.

[1] especially as tech is so vague. The golden age of siege engines was probably the time of the Roman Empire. We keep building on previous technological bases though, so technology is moving forward, you'd also have to define "golden age".


The question was vague on purpose. I do not believe in any specific golden age, actually, but I do believe that good ideas from the past have been forgotten. I evoked the microcosm of Sun workstations, but others might remember good original ideas from the Amiga, BeOS, VAX, Alphas... Isn't it good practice to remind ourselves that alternatives did exist, and that they were sometimes better? Maybe some might even be revived one day, who knows.


"golden age of sun stations, the BIOS was written in forth"

And your user interface was all in PostScript!


Wait, did Sun workstations use some sort of NeWS implementation?


Yes, I used NeWS & HyperNeWS for a few very happy years in the early 90s.


Cool, I'll have to add one of those to my collection.


Some stuff about HyperNeWS and related things:

https://donhopkins.medium.com/hyperlook-nee-hypernews-nee-go...


In the golden age of the internet, we defined interoperable protocols to exchange information and you were free to use different service providers or clients. Now services are built as closed gardens with little interoperability between them.


Giving two <beeps> about memory. We had to be very, very careful about allocating and then recovering memory; we treated it as more precious than gold. That's all lost, or at least no one pays as much attention now, IMO...


It's always good when a constraint is lifted.


4:3 laptop screens, which some are working to revive with new motherboards in classic Thinkpad chassis, https://www.xyte.ch/2021/11/09/t700-part-1-preparation/. There's no technical reason we can't have 4:3 HiDPI laptop screens today, since iPad Pros already ship these in volume.

Some tech stacks from the past have returned, e.g. Yahoo Groups was reborn as https://groups.io.


I miss the time when writing a simple game or software package, or basically anything by yourself, was both doable and a great achievement.

Everything is so ridiculously complex now, and weighed down by so much compliance and need for security (because bad actors are everywhere). You need a whole team to even get started on anything worthwhile.

I dunno, I just liked the early days when we were naive enough to assume malicious actors didn’t exist, and everything somehow worked out alright.


The quality of tech tyrants has definitely declined. Zuckerberg is a pouty, flouncing fop. Clive Sinclair and Alan Sugar were proper gangstas.


Much simpler environment. In the mainframe days, you had a no-choice environment for coding. It was the same no matter where you went.

Training was solid and documentation was professional. You had to be on the right side of management to spend the money to get either, though.

Some things were better. Not enough to make it worth changing back, though.


Lots of nostalgia and comments about the lost culture. This is understandable, I guess, but I think it would be more interesting to focus on actual technologies, be it hardware or software. If only because we can build replicas of past things, but not replicas of past cultures.


Anything was better: the "new things" were really new things and everyone aspired to progress and research, while the old things coexisted peacefully with the new ones.

Now everything is fake and marketing, empty words and profiling, pure and constant consumerism, and "research" is just a constant justification for the staggering costs of marketing departments that have to sell "new things" to someone.

Once we were the freaks; now everyone is a master of everything while no one knows how anything really works anymore, because of layers upon layers upon layers of crap.

...I'm very tired. I'd just like to know nothing more about anything and do something else with the little time I have left (time that I will spend anyway infinite-scrolling through empty things).


The gap between software and hardware is now so great. Developers today don't even have a grasp of the simple concepts, so when you try to engage on why code is buggy, those conversations require dumbing things down to simple terms. I feel like my soul dies each day doing this.


The further back you go, the further above average you needed to be just to touch a computer.


Mine would be that software was simple and rather bad-looking in general, so as a hobbyist developer that level of sophistication seemed achievable. (Not that I ever did achieve it, but it was not discouraging like it is today.)


Do you think that the current trend of making things look 'pretty' has its cause in the fact that a product must appeal to a larger audience?


I think it's both the effect and the cause. Even personally, I'd rather use a program that looks nice compared to one that does a little bit more but is ugly. So better-looking ones will have larger audiences, and thus to get a large audience you need to invest in looks.

In terms of simplicity a lot has to do with the capabilities of the hardware. The computers were much more limited, so the ceiling was lower as compared to today.

Example would be something like making a game. When I started coding, having just text on screen was far from being the best, but it was not totally out of place. These days 3D or nice looking pixel art is basically the baseline.


Virtual communities aka BBSes.


There is a definite upper limit to the size of a working community before self-censorship starts failing. I reckon it's less than 100 active users.


Command line interfaces and making applications scriptable.


Televisions didn't have a startup sound, nor did they boot up.


If you're old enough to remember CRT televisions, they definitely had a distinct startup sound from the high-voltage power supplies, and they sometimes took a moment to warm up.


Undo and Save. Every application allowed you to reverse or abandon a change.


No cookie banners


If there is a "golden age of tech", it is right here, right now. I would never in a million years go back to the tech I've used in the past.

This kind of empty nostalgia is the epitome of intellectual laziness.


The OP is asking for things that were actually better in the past, not for empty nostalgia. To be sure, you'll get a lot of that too :)



