Oldest software system in continuous use (guinnessworldrecords.com)
209 points by ZeljkoS on Nov 25, 2022 | 104 comments


Some of the modernisation efforts for these sorts of systems are fascinating.

The US DOD have built a COBOL to Java transpiler that emits nearly-idiomatic Java. As far as I can remember they expect it to essentially be reviewed like any other Java contribution from an engineer and receive minor tweaks, and they're expecting to translate millions of lines of code with it.

There's also MicroFocus, a company that builds JVM COBOL, with all of the implementation/CPU-specific bugs preserved, so that you can move everything onto the JVM (or maybe even the CLR, I think?) and then start replacing pieces with Java/C#.

It's not unreasonable for a company or organisation that's been around since the 60s to have this sort of stuff lying around; however, with all the modern options available today, it's pretty unreasonable for them not to have a convincing modernisation plan.


Are there any reasonably fast emulators for the System/360? It seems like ridding yourself of vendor lock-in would be goal one. Then add an interface wrapper that logs inputs and outputs for several years to build validation tests for any ports or upgrades.

It seems like a truly insane effort to port 20e6 lines of assembly and expect to get identical behavior. I doubt the current behavior is even fully documented. Ensuring identical outputs for a lot of recorded data seems at least slightly more valid than trying to go back to documentation or requirements.
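
A sketch of what the validation side of that record-and-replay harness could look like (in Java; the capture file format and the portedSystem stand-in are hypothetical):

    import java.nio.file.*;

    // Characterization ("golden master") testing: replay inputs captured from the
    // legacy system through the port and diff against the recorded outputs.
    public class GoldenMasterCheck {
        public static void main(String[] args) throws Exception {
            int mismatches = 0;
            // captured_io.tsv is a hypothetical capture: one "input<TAB>expectedOutput" per line
            for (String line : Files.readAllLines(Path.of("captured_io.tsv"))) {
                String[] parts = line.split("\t", 2);
                String input = parts[0], expected = parts[1];
                String actual = portedSystem(input);
                if (!expected.equals(actual)) {
                    mismatches++;
                    System.err.printf("MISMATCH for %s: %s != %s%n", input, expected, actual);
                }
            }
            System.out.println(mismatches == 0 ? "port matches legacy" : mismatches + " divergences");
        }

        // Stand-in for the real ported logic under test.
        static String portedSystem(String input) {
            return input.toUpperCase();
        }
    }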

Perhaps the simplest way to migrate the IRS code is to make the laws simple enough you don't need 20e6 lines of assembly to implement it anymore?


System 360 is very well documented by the Principles Of Operation: https://dl.acm.org/doi/pdf/10.5555/1102026

So much better written than Intel's Software Developer Manuals, where anything complicated is just "look at these three pages of C-like pseudocode we probably pulled from our tests, good luck!"

IBM themselves have had to maintain backwards compatibility for > 50 years, and there is a long history of compatible machines: https://www.computerhistory.org/revolution/mainframe-compute...

So all the strange behaviours are well documented.


If you're looking for an open-source 360/370/390 emulator, Hercules is probably what you want: http://www.hercules-390.org/


Provided the 360 CPU is well-enough documented, I don't see the problem here?


With enough saved input/output data, maybe some parts could even be replaced by an ML model?

Not 100% serious, but then again it seems like serious resources are being put into this problem, so maybe it's worth a try.


An AI-aided translation might make more sense, since at least you could review that for correctness and it'd likely be easier+cheaper to run the result.


I wonder if an AI system for the IRS would use double float or big decimal types to do its math.
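
Probably the latter; binary floating point is the wrong tool for tax math. A minimal Java illustration (the figures are made up):

    import java.math.BigDecimal;
    import java.math.RoundingMode;

    public class TaxMath {
        public static void main(String[] args) {
            // doubles cannot represent most decimal fractions exactly:
            System.out.println(0.1 + 0.2);   // prints 0.30000000000000004

            // BigDecimal keeps exact decimal semantics, which money math requires:
            BigDecimal income = new BigDecimal("52310.77");
            BigDecimal rate   = new BigDecimal("0.22");
            BigDecimal tax = income.multiply(rate).setScale(2, RoundingMode.HALF_UP);
            System.out.println(tax);         // prints 11508.37
        }
    }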


> it's pretty unreasonable for them to not have a convincing modernisation plan

The planning isn't the issue, it's the conviction, resolve, determination...whatever you want to call it.

COBOL modernization is quite expensive and fraught with major risk. You're talking about multi-year efforts where the key players who initiate the effort are often not around to see it through. Limited legacy resources, poor-to-absent documentation, mission-critical systems...it's a mess no one internally wants to touch or take the fall for. MicroFocus, Blu Age (now part of AWS), TSRI, and plenty of other companies are happy to take your money and tell you it can be done, and it certainly can be done, technically.

The reality is that the cost benefit often lies years down the line, so keeping the right people around to sustain the effort and see it through financially and technically is the tallest of enterprise orders. So it's the old two-step: 1) Pay IBM for another three years of support and licensing, and 2) Go through the motions of migration planning, then repeat Step 1.


Yes, and: my observation is that (poor) data quality is as big a risk as, perhaps even bigger than, porting & modernizing code.

Of course, bad code and bad data are cofactors.


Actually what the IRS needs is just funding.


> The US DOD have built a COBOL to Java transpiler

I think it would be a hard challenge to create such a beast. But did they also have to create a huge framework around it, so that the Java code could interact with the same environment as the COBOL code?

That seems like a hard challenge as well.


I worked at AA/Sabre in the mid 90s, straight out of college, although I was working on the PC side, on the software they put in travel agencies. I left just after AA spun Sabre off into a separate public company. The mainframe software was done in TPF (https://en.wikipedia.org/wiki/Transaction_Processing_Facilit...) and I believe to this day a large part of it still is. IBM created a C compiler some time in the late 90s, and this was considered a major advance for the platform. I believe that the system still runs on TPF with some parts written in C. Those programming jobs weren't filled by typical CS grads; most people simply had a HS degree or some college and could pass a logic test. The company had to provide its own training because there was no other way to gain experience with that environment.

On the subject of emulation, Sabre had also acquired a company called Agency Data Systems, whose product was written for a Data General minicomputer using a language developed in-house. The guy who invented the language was named Hugh, so internally this was called "HUBOL" (the actual name was something else that only Hugh could remember). Some time in the 80s they decided to port it to a PC architecture, but instead of porting ADS to a more modern language, they decided to build a DG emulator to run on a PC. When I was there, they were still updating the HUBOL source code (Y2K was a big deal at the time, plus with changes in the travel business, the updates were constant) but running on the homegrown DG emulator on an MS-DOS system. Hugh was quite a character. Back then we still had to wear a shirt and tie to work, and Hugh's shirts all had breast pockets stuffed with odd slips of paper. The joke was that he had run every project he'd worked on in the previous 30 years out of his shirt pocket; given the way things operated, there was probably some truth to that.

Somebody linked to Adam Fletcher's talk. Adam co-founded ITA which was eventually acquired by Google and probably is the basis for Google Flights. I saw a demo in probably early 1997 when they came around the startup I worked at looking for customers. Their software was written in Lisp and ran on a PC and was completely jaw-dropping. I never realized how bad Sabre's pricing software was until I saw what they were doing. ITA absolutely was the best pricing engine around at the time. In retrospect I probably should have quit my job and begged those guys to hire me.


It's possible that the reason there was no C for TPF is similar to why C arrived very, very late to CICS on S/390 (or z architecture).

Namely, the C standard library was not reentrant-safe, with considerable global variables et al. - and the code on CICS (and I suspect TPF) had to be fully reentrant, because it effectively ran inside green threads.

So, unless IBM wanted to advertise a C that didn't include the standard library, modules written in C would possibly be playing dangerously with global side effects. Relatively recently this was solved by a common runtime component whose name eludes me at this time, which underlies the standard runtimes for C/C++ and I think Java, at least inside CICS.
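
For anyone who hasn't been bitten by this class of bug, here's a rough Java analogue (nothing CICS-specific, just the same shared-global-state hazard under concurrency):

    import java.text.SimpleDateFormat;
    import java.util.Date;
    import java.util.concurrent.*;

    public class ReentrancyHazard {
        // SimpleDateFormat keeps mutable state internally, like much of the old C stdlib
        static final SimpleDateFormat FMT = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

        public static void main(String[] args) throws Exception {
            ExecutorService pool = Executors.newFixedThreadPool(8);
            for (int i = 0; i < 1_000; i++) {
                // Concurrent format() calls can garble output or throw, even though
                // each call looks perfectly correct in isolation.
                pool.submit(() -> System.out.println(FMT.format(new Date())));
            }
            pool.shutdown();
            pool.awaitTermination(10, TimeUnit.SECONDS);
        }
    }

Whether a given run visibly corrupts anything is down to scheduling luck, which is exactly what makes this class of bug so nasty.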


Due to my experience with C on Unix, I once did a contract to program in C on IBM/370 systems. It was extremely painful, especially editing C code on a 3270 terminal. C on the 370 was driven by marketing, to entice the porting of Unix-hosted software to the mainframe. AFAIK it didn't gain traction.


> Adam co-founded ITA which was eventually acquired by Google and probably is the basis for Google Flights

This is an absolutely wild pair of claims. Adam was nothing like (and does not claim to be) a co-founder, and Google Flights is driven entirely by ITA's QPX product. ITA was a very good place to work.


My current contract position is with an airline (not American, but a partner), working on modernizing their check-in. We talk to SABRE a lot, through a middleware layer, so we just do JSON/REST. BUT the endpoints are just an insane mess. There are at least three different ways to get flight information, all requiring different request parameters (though usually airline, flight #, departure date & origin are enough), and one will return some subset of the data, another a different but somewhat overlapping subset.

It's kind of a joke to me that we're modernizing a small part of the whole system and still using 1960s-era backend systems. Polishing a turd.


I worked in their HR department in 2000 and 2001, building various systems to support compensation, performance review, etc. A significant part of the role involved working with DBAs who pulled data from those systems into our RDBMS, as we were using the web languages of the day.


It's funny how if you ask people about forward-looking systems today, they have either learned mistruths or haven't learned enough about the history of computing to picture much that's reasonable.

"Java!" used to be talked about as a good way to create software that you could run in the future, but anyone who has had to keep around old laptops / old VMs to run ancient Java to handle KVMs or device management tools knows how ridiculous an expectation about the stability of Java can be.

"Windows!" is funny, because I can't tell you the number of places that have an ancient Windows PC that has either been repaired or carefully replaced with new enough, yet old enough hardware to still run Windows 2000 or XP, because copy protection is stupid, and / or because drivers for specific hardware can't be updated or run on newer Windows, and / or because the software can't be compiled and run on newer Windows without tons of work.

On the other hand, you can take large, complicated programs from the 1980s written in C and compile them on a modern Unix computer without issues, even when the modern computer is running architectures which hadn't even been dreamt of when the software was written...


> "Java!" used to be talked about as a good way to create software that you could run in the future, but anyone who has had to keep around old laptops / old VMs to run ancient Java to handle KVMs or device management tools knows how ridiculous an expectation about the stability of Java can be.

Hard disagree. In my 20 years of experience, Java is very, very, extremely stable and backwards-compatible.

I would absolutely expect Java bytecode compiled in 1996 on Java 1.1 to have a very good chance to run without errors on a brand new Java 20 JVM, and source code from that time to compile with some very minor adjustments such as changing identifiers that clash with newly introduced keywords. Ironically, the older the code, the less likely it is to have problems with the post-Java 8 breaking changes.
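
To illustrate, a 1.1-era snippet like this (hypothetical, but idiomatic for the time) still compiles and runs unchanged on a current JDK, modulo raw-type warnings:

    import java.util.Enumeration;
    import java.util.Vector;

    public class OldButGold {
        public static void main(String[] args) {
            Vector v = new Vector();      // raw types: a warning today, not an error
            v.addElement("still");        // pre-Collections-framework idiom
            v.addElement("works");
            for (Enumeration e = v.elements(); e.hasMoreElements();) {
                System.out.println(e.nextElement());
            }
        }
    }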

But you talk about "handling KVMs or device management tools" - that's hardware stuff not covered by Java's standard API, so it will involve native code. That will bring you compatibility problems, not Java. Admittedly, that could be seen as Java avoiding the hard problems rather than solving them.

> On the other hand, you can take large, complicated programs from the 1980s written in C and compile them on a modern Unix computer without issues, even when the modern computer is running architectures which hadn't even been dreamt of when the software was written...

I want some of what you're smoking. Toy programs, sure. But nothing that does any kind of hardware interfacing (notice a theme?) or uses any but the most trivial syscalls (this would be pre-POSIX), or makes any of a myriad of common assumptions about hardware architecture (see https://catb.org/jargon/html/V/vaxocentrism.html). So how many large, complicated programs does that really leave?


Good point about Java, even though talking to KVMs over the network and displaying a GUI isn't really hardware stuff.

But I think you're forgetting how much code is written to C89. How much of bash, for instance, is ancient, with updates that avoid newer toolchain features so that it can be as portable as possible?

Yes, people don't often write stuff with portability as a goal at the beginning, but once something is relatively portable, it stays that way. Lots of code that wasn't poorly written made it from all-the-world's-a-VAX to i386, from i386 to amd64, and now ARM and aarch64, with a minimum of extra effort. There just had to be a little effort to NOT program like a jerk, which, as funny as it is, is still an issue now.

I'm running Pine as my daily email program, which was written in 1989 and hasn't had any real updates since 2005. New architecture? Compile it. Lots of modern software started out as C software from the 1980s.


> Good point about Java, even though talking to KVMs over the network and displaying a GUI isn't really hardware stuff.

If it's over the network (using IP sockets), Java can do it just fine, and what I wrote about backwards compatibility applies. Same if it's a Swing GUI, which is part of the standard API. Admittedly, Swing is the area where I once witnessed a real bug caused by version incompatibility; IIRC the code made some assumptions about undocumented behaviour in event handling, which caused an endless loop on a newer JVM.

As for portable C, I have very limited experience really working with C, but remember trying to get some tarballs to compile and having big problems with fragile autoconf scripts.


Windows is a particularly funny one, considering how quickly the advice for most issues turns to doing a fresh install. It's a very brittle black box once you make any amount of changes to the default settings.


?

That's not remotely true. Saying that as a Windows user for the last 35 years.


Every time I've ever had Windows issues, a fresh install has been suggested within the first three steps of the troubleshooting process.

Usually something like:

1. Run these handful of commands that never actually fix anything but take hours to complete (sfc/dism/chkdsk/etc).

2. Reseat RAM/GPU/CMOS.

3. Out of ideas, may as well just fresh install.


I would say it is Jacquard's loom, which uses punched-card programs to weave patterns. It was invented 200 years back [0], and I could find one video about a 100-year-old machine used commercially [1].

[0]: https://www.youtube.com/watch?v=MQzpLLhN0fY

[1]: https://www.youtube.com/watch?v=y5uX143hx38


I would say that RNA is a lot older than Jacquard's Loom.

And it is still actively used.


Biology isn't software. I say this as a biologist.

Software-like things can be done with it, but this is no different than treating a bunch of pipes, valves, and a water source as a computer. (Edit to add: Which would make our sewers the oldest hardware system. And I'm sure there are supercentenarians still pumping software through those systems.)

I imagine that many physicists feel about physics envy the same way I feel about imagining biology as software. https://en.wikipedia.org/wiki/Physics_envy

Maybe someday we'll have a grand unified theory of everything imaginable and real. But even then distinctions should be made between levels of emergent phenomena. A habitable environment allows biology to exist. Biology allows intelligent beings to exist. Intelligent beings create tools that allow them to make even more tools that do predictable, or otherwise defined, sequences based on various inputs. Hardware is two levels above biology, and software is a level above that.


>Hardware is two levels above biology, and software is a level above that.

From a conceptual point of view that can be grasped by a human mind, this all makes perfect sense.

However, these hierarchies tell us more about the way we handle sense data than they reveal about the actual structure of the universe, if they match anything relevant beyond our thoughts at all.


I'm not normally a hierarchical or categorical thinker, though I do like thinking of things as built up from other things.

It's important to recognize what thinking modality best fits a particular idea. You could instead describe this as a line of entropy. A prerequisite chain. A web of interconnections. Whatever. And certainly it's grossly simplistic to describe it the way I did as being "two levels above", when these levels are based only on ad hoc categorical conveniences.

I agree with you. My describing it this way was a matter of convenience, to describe an underlying truth of required precursors. When we have von Neumann machines capable of evolving their code, the link between software and the DNA/RNA system will be a better analogy than it is now. But an analogy is not an equivalence.


Are the same patterns punched into the cards as 100 years ago? This record is about software.


Sabre announced a 10-year deal with Google to migrate to the cloud. Coming soon, containerized microservices running System/360 assembler?

https://www.sabre.com/insights/releases/sabre-forges-10-year...


Wasn't that just System/360? It made a fairly high level of isolation between services possible.


It seems we've come... full circle.


Yea, it was pretty dope for its time.


I'm skeptical because Guinness World Records is a marketing agency masquerading as a ratings house / record keeper.

I googled "oldest software still in use" and the rest of the internet thinks it's MOCAS, the US DOD's contract management software, launched in 1958, two years earlier than Guinness' earliest guess for either of the options mentioned in the article: https://fossbytes.com/mocas-worlds-oldest-computer-program/

Sorry to be "that guy" but I'm just really cynical about Guinness and records in general. Plenty of their records really are just fun, and I can't find a profit angle for this article for either the IRS or whoever manages SABRE, so I guess I'm just being a snark.


> Guinness World Records is a marketing agency masquerading as a ratings house / record keeper

In case this is news to anybody, the basic shape of the grift is that while anyone can theoretically apply for a record and submit proof, the requirements are fairly stringent and complex. And wouldn't you know it, Guinness World Records offers various levels of "consulting" on setting/breaking a record, including defining the record to break, setting up an event, and flying out an adjudicator to witness the record being broken.


To clarify: "defining the record to break" includes making up new categories, so that you automatically get the world record by being the only competitor. There are also records like "First Rubik's Cube", which are impossible to beat.


It was invented by a beer company to help settle debates in a pub. Guinness records are just a fun thing to appreciate even if not perfect.


I for one appreciate your "but actually" comment, the relentless pursuit of facts over fluff.

As an aside, I always thought "snark" was a real word, but apparently it's a neologism meaning "snide or sarcastic remark". It's also the name of various fictional creatures, including The Hunting of the Snark by Lewis Carroll, as well as in A Song of Ice and Fire.

https://en.wikipedia.org/wiki/Snark


Generally I feel the "Guinness World Records" is "valuable" in that "somebody somewhere put some level of thought and rigour into some record I would never have even realized existed".

Not an absolute definitive agency, and when there EXISTS a more official agency, take those; but it's a fun read of "these are reasonably close to the extreme in this obscure area I never thought of"


You're right to be cynical.

I have had 'continuously running' software systems in house as old as the ones they reference. From first-hand experience, I'd imagine it's not something folks are crowing about.

Your reference is older than what I had in house.


It could be PR for the IRS? A bit of public attention to make a play for funding to replace or upgrade this system, etc?


Agreed. I would trust them regarding the world’s largest burrito or sourest lemon, but not this.


Makes me think of the Ship of Theseus.[1] How many changes can you make to legacy code and still consider it “legacy”?

[1] https://en.m.wikipedia.org/wiki/Ship_of_Theseus


In many places, infinitely many; legacy is just a convenient word meaning "we know it's old and bad, but we're keeping it."


Legacy doesn't always mean bad. It may have an older architecture or some constraints that aren't always a problem. I work on a few different product designs, and they almost all come from a legacy product. It's not a bad legacy design; it does what it's supposed to within the environment it was designed for. Customers still buy the legacy design. It's cheaper and doesn't have the flexibility or add-on features of the newer designs, so if they don't want those or can't use them, they'll buy the legacy product.


>”we know it’s old…”

I think the point I was trying to make is: if a substantial amount has been changed, on what basis is it still considered "old"?

If every plank and nail of the ship is replaced during its voyage, is it still the same "old" ship that disembarked?


Philosophical Gedankenspiele aside, there are differences between a ship and software. I can best speak to the software side, as I'm not a man of the sea. For one, a ship has to carry its own weight, so you cannot load more and more onto it without limit, or replace parts with heavier parts. With software "evolving" (rather poor but common terminology, I know) in tandem with new hardware, on the other hand, programmers get more space and faster processing every year, so they are less constrained and can postpone radical but scary/risky refactoring. There is no force to radically take out old code; you can comment out a bit here and there and add lots of new fixes and extensions, increasing complexity and technical debt every year, every decade. The OP also mentioned underfunding of the organization: typically the budgets only permit small incremental changes to "keep the ship running", and there is no possibility of starting a fresh/modern implementation with a separate team in parallel, due to the mind-boggling cost.

It would be interesting to run a diff of snapshots of any software in 1960 and its 2060 version, if e.g. SABRE or the IRS system may last that long.


And it's worked for 60 years. I would take that over anything else.


Even if you replace every board in the ship, it's still constrained by its original design. These software systems could be similar in that even if every line of code were replaced, architectural patterns, file formats, etc. that were created for the original system will continue to influence the current design.


Or how ribosomes in every living thing translate mRNA into protein. Over three billion years of genetic evolution, the ribosome's amino acids have come to differ considerably between far-apart species, yet they retain their function.

Before DNA sequencing became efficient, a particular protein inside the ribosome was used to study genetic divergence.


>Even if you replace every board in the ship, it's still constrained by its original design.

Is it though? Or are you assuming it has to be replaced like-for-like? If so, why don't we apply the same to software? I don't think that constraint is a given with hardware. Is a remodeled house constrained by its original design? It seems to come down to how much you want to invest in "refactoring" the old house.

Your comment made me think of the evolution of the F/A-18. The new variants are utterly different than their original, yet are still technically the same airframe.


one word: 737MAX


Care to elaborate? I'm pretty familiar with the issue, but not sure what through-line you're drawing.


> The new variants are utterly different than their original, yet are still technically the same airframe.

This is what Boeing had in the 737MAX. It was a different aircraft, but it looked like the 737-800.


The MAX was under an amended type certification, so they were acknowledging the changes. Boeing didn’t follow their own rules regarding safety design (like redundancy on critical items identified on their hazard analysis)


The wiki history of SABRE is an interesting read: https://en.wikipedia.org/wiki/Sabre_(travel_reservation_syst...

There is also an entry on the IRS Individual Master File, but it is short on information: https://en.wikipedia.org/wiki/Individual_Master_File


That brings back memories. In the late 90s, I travelled a lot, and managed to get an "Eaasy SABRE" account, which allowed me to book flights essentially like a travel agent, with a UI only slightly better than the one they used.

I remember spending whole evenings piecing together flights for a round-the-world trip at lowest cost. There was little or no error checking. I once almost booked a flight to Sydney, Nova Scotia, instead of the one down under.

Then, tragically, Travelocity shut down the service, leaving their crappy web-based frontend as the only online booking option until ITA came along.


wow, sounds neat. how was payment handled for booking?


IIRC, the service cost a small monthly fee, so you had to have a credit card on file with them. It may have been possible to just charge the booking fees to that credit card. However, I seem to recall there being a payment dialog at the end of the process, where you were prompted to enter a credit card number for payment.


For those viewing the comments before the article, be advised that it is not the case—as my not-yet-awake brain first thought on parsing this headline—that the software system belonging to the Guinness World Records is the oldest in continuous use.


>For those viewing the comments before the article

... any confusion is obviously your own fault


OP is talking TO people reading comments before the article - not saying s/he did so. Seems you fell into the same confusion hole as OP.


No, I didn't misread - I was just being less accommodating than GP. If those people are confused, they only have themselves to blame.


If you could pick any language to migrate these programs to, which one would you pick and why?

I've never used Java professionally, but that's probably what I'd pick. Seems to hit the sweet spot between time-tested, widely-used, enterprise-proven, performant, future-proof/portable, and well-understood. Seems from another comment that's what US DOD is betting on, as well.


How about the oldest software in common use? There are probably some parts of the 1973 C rewrite of Unix still around in modern Solaris; idk how common that is. tcsh also dates back to the '70s and is also falling into disuse.

Maybe TeX, written in 1982? Though pdfTeX displaced the classic TeX DVI compiler long ago, and all the major TeX distributions translate the Pascal source code to C. I don't know whether that means it's a different program from the one Knuth wrote 40 years ago.


BLAS is pretty old, with a first release in 1979. Even though modern computers likely don't use any of the old BLAS code, they still have implementations that respect the original API.

https://en.m.wikipedia.org/wiki/Basic_Linear_Algebra_Subprog...


Oh wow, I was clicking this link wondering if it was going to be exactly this. Karsten Nohl has a great CCC talk all about this [1] (I'll just copy their description of the talk):

Becoming a secret travel agent.

Travel booking systems are among the oldest global IT infrastructures, and have changed surprisingly little since the 80s. The personal information contained in these systems is hence not well secured by today's standards. This talk shows real-world hacking risks from tracking travelers to stealing flights.

Airline reservation systems grew from mainframes with green-screen terminals to modern-looking XML/SOAP APIs to access those same mainframes.

The systems lack central concepts of IT security, in particular good authentication and proper access control.

We show how these weaknesses translate into disclosure of traveler's personal information and would allow several forms of fraud and theft, if left unfixed.

[1] https://www.youtube.com/watch?v=vjRkpQever4


This is a great talk on it, from Systems We Love: https://www.youtube.com/watch?v=8_t41xvPp1w


"this code is specific to the System/360 Architecture, and so cannot be run on anything other than an IBM mainframe." This is not true. Emulators are pretty effective and can have passthoughs to new capabilities. Like the ability to play gameboy link cable games over the internet for example


There's a very complete System/360 through modern zSeries emulator [1] [2] that runs basically every IBM mainframe OS. IBM does not license their recent operating systems or languages to run on it. Though you can of course run Linux, or older versions of MVS or VM. Lack of OS and tool licensing is a showstopper for many users who might otherwise transition away from IBM hardware through emulation.

[1] https://en.wikipedia.org/wiki/Hercules_(emulator)

[2] https://sdl-hercules-390.github.io/html/


Are modern versions of IBM systems even popular compared to Linux? The reason these systems still exist is because they use legacy versions of software that could be covered by emulation. Any migration whatsoever would be contingent on the cost.


I'm sure z/OS is not nearly as popular as Linux servers, but there is a niche market for what it provides; just because it is old does not mean it is inferior.


Doesn't z/Arch natively support S/360 binaries as well?


Imagine congress passed a law funding software upgrades for the IRS that came along with a bunch of tax code changes to "streamline" the upgrade process.


Oh man, I can remember product owners asking for something that wasn’t what they really wanted because they thought what they wanted would be hard to implement, but of course what they really wanted was easier to do than what they asked for.


This happens all the time.

A decade ago, a family friend with a band wanted help with burning their CDs.

Timidly, "hat in hand, eyes down", they asked if I could maybe, possibly, add 2 seconds of silence in front of their recorded tracks? It's OK if I couldn't, they'd understand.

"I finished it while we were talking"

"Really?? Amazing!! That's awesome! Didn't know you could do that!!

Now, can you also remove John's guitar from this song?"

They really had no clue that removing an instrument from a finished mix is HARD, as opposed to adding 2 seconds of silence (and for context this was years before the current deluge of "AI Powered" apps that might conceivably actually do that).
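
For contrast, here's roughly all the first request takes: a minimal Java sketch assuming 16-bit signed PCM WAV input (file names hypothetical):

    import javax.sound.sampled.*;
    import java.io.*;

    public class PrependSilence {
        public static void main(String[] args) throws Exception {
            AudioInputStream src = AudioSystem.getAudioInputStream(new File("track.wav"));
            AudioFormat fmt = src.getFormat();

            // Two seconds of zeroed frames; zero bytes are silence for signed PCM.
            int silenceFrames = (int) (fmt.getFrameRate() * 2);
            byte[] silence = new byte[silenceFrames * fmt.getFrameSize()];
            AudioInputStream quiet = new AudioInputStream(
                    new ByteArrayInputStream(silence), fmt, silenceFrames);

            // Concatenate silence + original and write the result out.
            AudioInputStream joined = new AudioInputStream(
                    new SequenceInputStream(quiet, src), fmt,
                    silenceFrames + src.getFrameLength());
            AudioSystem.write(joined, AudioFileFormat.Type.WAVE, new File("track-padded.wav"));
        }
    }

Removing John's guitar, on the other hand, means separating overlapping sources in a finished mix, which is a research problem, not an afternoon favor.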


Reminds me of this: https://xkcd.com/1425/


> As much as 20 million lines of the IMF's code is reportedly written in Assembler – a major obstacle to any modernization

That's literally jobs for life for some (un)lucky team.


Preamble - I will be downvoted for sure, but let me still post it.

The oldest software system known to us and in continuous use would be 'religion'. The hardware in this case is obviously humans. If you observe, the computer and its I/O are all derived from hardware already known to humans, and that hardware is the human itself.

If you see, the hardware features rarely changed, but the software in this case updated and adapted to all situations and has been running for many centuries now. It can probably be likened to Stuxnet, in that it affects only certain types of hardware. I know this is not what the original post is about, but it just came out as a thought.


Pretty sure sex came before religion.


No. Sex, culture, and another one that's much easier to grasp: tribes/herds.


Early in my career, I was offered a job writing a word processor system in IBM 360 Assembler.

I'm fairly glad I declined.


Shit I’d be willing to work in mainframe assembler. Not like my career was going anywhere anyway, but most mainframe jobs either want established domain experts (people who have been there for decades and are intimately familiar with all the common mainframe technologies) or new grads they can underpay.


I suspect I was the latter, but I had already had quite a bit of experience with machine code and assembly (8085, 6800, 6502), by then.


It is more fun than dealing with AT&T x86 Assembly.


I think we simply underestimate the cost of software maintenance because we ignore software maintenance.

Migration to a new language is nothing but an extreme form of maintenance. And it's not even that expensive. We're just used to scales of a dozen or so programmers at best.

As a very conservative estimate, translation from assembly to, say, Java, should proceed at the pace of maybe 100 lines per day per developer on average. At that rate, a team of 100 developers gets through roughly 2.5 million lines a year, so a 20M-line application is most of a decade of work. Of course, there's also the effort to create tooling and the framework of the new application, but it's certainly doable.


That's a hopelessly optimistic estimate. Just like 9 women cannot have a baby in a month, 100 of you cannot work 100x as fast as one of you. I'd be surprised if the overall speed-up were anywhere more than 5-6x.

It might even be better if your team was smaller.

There’s a lot of overhead spent on communication. Probably even more so when converting legacy software.


I am willing to rewrite ALL the irs software in a language of their choice in return for ability to add a few lines of my own logic for when MY taxes are being processed. IRS, call me ;)


There was a protocol document which showed that at least the data structures of the original IBM 7074 survive. I wouldn't be surprised at all if some of the machine code survived as well, easily emulated.


Isn’t that tide prediction analog computer implemented with ropes and pulleys and invented before electricity still running in some places?


While that might count as “a computer”, it definitely doesn’t count as “software”.


There I was thinking my 90s code at Megabank that’s still running was an achievement. Fwiw it was and still is a simple time series db.


> the low level of funding the IRS has had for modernization

The IRS has been trying to modernize for decades.

Here's something from way back in the Clinton admin:

https://clintonwhitehouse4.archives.gov/pcscb/rmo_irs.html

But you'll never hear anyone there actually be responsible for the failures. They'll always whine about needing even more money.


The failures are due to the fact that the people who can get contracts don't have good programmers, and the people who have good programmers can't get contracts.


Some nonsense about having to run old assembler on IBM hardware. There are simulators these days, which would run many times faster on modern hardware.


This is downvoted; is it wrong? It would be very surprising to me if it weren't emulated.


I don't know, but just because it can be emulated does not mean it is equivalent. I have a friend who uses DOS software on his modern laptop and laments that it was faster running natively on a 486.


But we're talking old IBM software. That ran on a refrigerator with discrete logic. It had lights on the front that showed every register value. Under 1MHz if I recall.

An emulator could certainly beat the performance of something you could watch execute on incandescent lightbulbs.


It must have some hard-coded wait time in it or something. Emulators on modern systems running DOS are absolutely many times faster at running old code than that 486 in 99.9% of cases, unless there is some weird timing/resource-sharing issue.


There might be licensing issues, see https://news.ycombinator.com/item?id=33744285.


Why can't they just.. nah, I'll stop there. :)


The last people managing this kind of old legacy software will inevitably die. The younger generation probably won't replace them across the entire industry, and from that point onward we'd have pretty massive rot, with the forgotten knowledge rippling through the entire society.



