Gene Amdahl has died (nytimes.com)
740 points by andrewbinstock on Nov 13, 2015 | 111 comments


When an IBM salesman saw a coffee cup with the Amdahl logo at a customer's site, he was instructed to cut one million dollars off his price offer. That was the one-million-dollar coffee mug.

http://dealwhisperers.blogspot.co.il/2015/07/a-million-dolla...

Here is a picture of it: http://www.deadprogrammer.com/fud-you


Amdahl sales people telling customers "this mug will get you a $1m discount if you ask for it" is not the same as IBM instructing sales people to discount $1m if they see the mug.

Obviously, knowing that your client is considering a rival company would make you consider dropping your price, but unless you have a better source I'm going to assume it was just nice spin around that fact from Amdahl people, not an actual IBM rule.


> I casually rested on his desk the brochure for the Y Model car from the dealership next door.

I'm so doing that next time I buy a new car.


You have to wonder about the management at IBM that made him unhappy enough to leave in the first place. That kind of guy is going to do what he's going to do no matter what. Whether he does it for you or in competition against you is the only question.

There are very few lengths management shouldn't have gone to in order to keep a guy like that happy at IBM.


He disagreed with their technical staff about how to design a mainframe.

The technical staff said that the operating system should run on microcode to abstract away the hardware. That way it would be easier for customers to migrate to new hardware as it became available. And they could easily add a new instruction if they needed to.

Gene said that it would be an order of magnitude faster if it ran directly on the hardware, and it wasn't that hard to support that API going forward.

Both proved right. Gene built computers that were massively faster than IBM's and perfectly compatible. IBM then added an instruction in micro-code and made all of their software use it. Gene's installed base all crashed on IBM's new code, while IBM's was fine. The US government launched an anti-trust lawsuit, which wound up binding IBM's hands for many years after.

IBM mainframes today still run on micro-code. And it still makes them massively slower than they need to be, but with better backwards compatibility. The mainframe world depends on a lot of programs from the 1960s and 1970s that run, unchanged, today. Everyone else is using native instructions and runs faster.


I think you imply this, but to make it explicit: IBM mainframes run some instructions in microcode, others are native. The ones that are native are not slowed down by the existence of the microcode engine.

IBM analyzed programs-in-use to figure out which instructions bottlenecked real programs. These were the ones prioritized into native hardware. Others, where the bottleneck wasn't the central processor--like moving a block of memory from one location to another--were executed in microcode (at least back in the day: I was one of the engineers on the S/390 microcode engine.)


A good opportunity to mention the Popek-Goldberg criteria. If Amdahl's machines had been able to efficiently trap on the use of the new instruction (and execute a workaround using their existing instruction set) they might have been OK. Depends what the instruction was used for. Popek-Goldberg discusses the case where you want to virtualise your hardware: instruction set design becomes really important.
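
A toy sketch of that trap-and-emulate idea, in Python with made-up opcodes (nothing like the real S/370 instruction set): known instructions run on the "native" path, and an unknown opcode traps to a handler that synthesizes the result out of instructions the clone already implements.

    # Toy illustration of trap-and-emulate; opcodes are hypothetical, not real S/370.
    class IllegalInstruction(Exception):
        pass

    def native_execute(regs, op, a, b, dst):
        # Instructions the clone hardware implements directly.
        if op == "ADD":
            regs[dst] = regs[a] + regs[b]
        elif op == "SUB":
            regs[dst] = regs[a] - regs[b]
        else:
            raise IllegalInstruction(op)

    def emulate(regs, op, a, b, dst):
        # Workaround for the vendor's newly added instruction, built from old ones.
        if op == "ABSDIFF":  # hypothetical new opcode: dst = |a - b|
            native_execute(regs, "SUB", a, b, dst)
            if regs[dst] < 0:
                regs[dst] = -regs[dst]  # stand-in for a conditional-negate sequence
        else:
            raise IllegalInstruction(op)

    def run(program, regs):
        for op, a, b, dst in program:
            try:
                native_execute(regs, op, a, b, dst)  # fast path
            except IllegalInstruction:
                emulate(regs, op, a, b, dst)         # trap, then emulate

    regs = {"r0": 3, "r1": 10, "r2": 0}
    run([("ADD", "r0", "r1", "r2"), ("ABSDIFF", "r0", "r1", "r2")], regs)
    print(regs["r2"])  # 7

Whether something like this is viable is exactly the Popek-Goldberg question: the new instruction has to trap cleanly, and the emulation has to be cheap enough that customers don't notice.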


I wonder what the modern equivalents are. (Also, how to get an Amdahl mug today)


We at Cumulus have been told by our customers that having one of our rocket-turtle stickers on their laptop results in much faster and easier negotiations about discounts with their Cisco sales rep. An Arista sticker probably works too.

It is one of the reasons we made the stickers. =)


I once put a vendor X sticker on my laptop before a sales meeting with vendor Y. It was the only sticker, so they didn't fail to notice it.


"In 1946 he married Marian Quissell, who grew up on a farm four miles from his.

He received his bachelor’s degree in 1948 from South Dakota State University, in Brookings, where his wife worked as a secretary. She had dropped out of Augustana College in Sioux Falls, S.D., after her freshman year to go to work to help pay for her husband’s education."

- What a nice story of love, sacrifice and reward given they appeared to have stayed together till the end.


Unless you personally know either of the people involved, I don't think you are in a position to make a judgement as to whether it was "love, sacrifice and reward" or "prevailing cultural pressures at the time that were always regretted later"


How does that make a difference? We're supposed to discount his wife's sacrifice because of potential social pressures?

If you really wanted to be a cold-hearted bastard about it, you could have done a cost-benefit analysis of the potential earnings of a theoretical physicist vs. a secretary in the early 1950s before ruling out that it was a financial decision, but then you couldn't have made an opportunistic political statement out of a man's death.


I thought that the prevailing cultural pressures at the time were about men working and women staying at home.


Well, the thrust of it was that the man's career is important, and the woman's isn't. So assuming they had no children at the time, it was entirely consistent with that for her to quit college and take a job with no career path in order to fund his education. (If they had children, of course, the ethic would have been that she should stay home to care for them... indeed, she would likely have had no alternative.)

I think frossie has a good point, actually. We can hope that it was as jusben1369 suggests, but we can't really know that. Maybe being married to Gene gave Marian all she wanted in life, and maybe it didn't.


Yeah, this comment shouldn't be downvoted so much even if it is offensive to some people.


It's being downvoted not (necessarily) because people think it's offensive, but rather because it's hypocritical.


Died? I'm pretty sure he's just been swapped into a context that we don't have access to and is running in parallel with us.

RIP


I have faith we'll eventually be able to restore all those important processes. It's really hard to destroy information. We just need a way to read it back.



I do not perceive myself to be Nikolai, but I do not trust my memories enough to claim I could not possibly be him, provided someone had the needed technologies to interfere with this timeline and my memories of it.


Thank you for this reference.

Explains a lot of Hannu Rajaniemi's "Quantum Thief" trilogy background ("Great Common Task" for those who read it):

https://en.wikipedia.org/wiki/The_Quantum_Thief


“By sheer intellectual force, plus some argument and banging on the table, he maintained architectural consistency across six engineering teams,” said Frederick P. Brooks Jr.

This is one of the more impressive things to me. Keeping that many people on the same page is an amazing feat.


Brooks's interview at the Computer History Museum mentions an argument over byte size: "Gene and I each quit once that week, quit the company, and Manny Piore got us back together." (http://archive.computerhistory.org/resources/access/text/201...)


Fred Brooks might've been thinking of him when he wrote "Thinkers are rare; doers are rarer; and thinker-doers are rarest." [1]

[1] Brooks, "The Mythical Man Month"


Indeed, combining the intellectual prowess with the necessary banging on the table is a valuable mix.


I met Gene Amdahl once, when I took a summer course in computer architecture at UCSC. We got a tour of the Amdahl plant. They had a huge prototype CPU built out of TTL, which they built before making custom ICs. Each board in the prototype became one IC. Each row of cards in the prototype became one board. Each rack became one row of cards. That was how you debugged hardware designs in 1975.


> That was how you debugged hardware designs in 1975.

Soul of a New Machine spends a couple of chapters discussing debugging the Data General machine as well. Pretty interesting...


If anyone new to the industry (software or hardware) hasn't read this yet -- you really, really should. We had an MV-8000 at college and it performed quite well, considering the loads we threw at it (I recall it running Ada, COBOL, and Pascal compilers all at the same time).

http://www.amazon.com/Soul-New-Machine-Tracy-Kidder/dp/03164...


In 1985 we were still prototyping with TTL packages.

By 1988 we were using Altera PALs, a big improvement in density. (PALs might have been available sooner, but I was working on microwave systems in the intervening years).


1985 as well, as it's how the Atari ST prototype was built; a board per chip, with the boards arranged in a circle around a core. It looked sort of like a little Cray-1.



https://en.wikipedia.org/wiki/Fear,_uncertainty_and_doubt

FUD was first defined with its specific current meaning by Gene Amdahl the same year, 1975, after he left IBM to found his own company, Amdahl Corp.: "FUD is the fear, uncertainty, and doubt that IBM sales people instill in the minds of potential customers who might be considering Amdahl products."


Gene Amdahl was a remarkable person, both as an engineer and an entrepreneur.

The Computer History Museum has a very interesting interview with Gene Amdahl as part of its Oral History Collection - http://www.computerhistory.org/collections/catalog/102702492 - that is, quite frankly, more interesting than anything I could come up with right now.


The oral history is quite interesting and in depth. It really helps to get to know him. Thank you for sharing.


Honest question for those who received CS/Engineering degrees in the last 10 years: did Amdahl's Law come up?

I've seen programming courses move away from assembler/C/C++ over the years to Java and then scripting languages at a couple of local schools.

I wonder if the history/theory side changed as well.


I published a paper this year that used it directly and even proposed a modification of it to model job time for a distributed execution environment with a centralized master governing scheduling, task serialization, etc.

When you talk scale-out of any large system, or manycore whatever, Amdahl's law is absolutely fundamental, and when you hear someone say "perfect horizontal scalability," Amdahl's law governs when that will no longer be true.

People who do theory on distributed algorithms use it even if they don't realize it.
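
As a rough sketch of that last point (the 5% below is a made-up number, not anything from the paper): if a fraction s of the job stays serial on the centralized master (scheduling, task serialization and so on), no number of workers gets you past a 1/s speedup.

    # Amdahl's law: with serial fraction s, speedup on n workers is 1 / (s + (1 - s) / n).
    def amdahl_speedup(s, n):
        return 1.0 / (s + (1.0 - s) / n)

    for n in (1, 8, 64, 1024):
        print(n, round(amdahl_speedup(0.05, n), 2))
    # The speedup climbs toward, but never reaches, the 1/0.05 = 20x ceiling,
    # no matter how "horizontally scalable" the parallel part is.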


Amdahl's Law was taught in my alma mater's "computer architecture" course. The course was primarily about CPU operation, and the law was specifically brought up in regard to instruction mixes and the CPU execution units provided, but was then generalized.

This was the 2nd in the series of four required "low level" courses (assembly programming, computer architecture, operating systems, compilers).

I am a systems software developer and Amdahl's Law is one of my standard interview questions.


A current undergrad and TA for CS courses at Cal (UC Berkeley) checking in here. The course I TA for, CS 61C Machine Structures (inst.eecs.berkeley.edu/~cs61c/), introduces the concept in our unit on parallelism. This is a required course for all undergraduate CS majors and is the third course in the introductory series.


Yes, I'm taking CS now and I saw it last semester in "Advanced Computer Architecture", a year-3 core module for CS and CE students at NTU, Singapore.


It showed up in the context of a grad school seminar. I was shocked no one mentioned such a fundamental concept during my undergrad or regular classes. I've used the "law" to make arguments professionally several times.


Amdahl's Law definitely came up in my undergrad CE/CS courses at USC. I'm not sure how you could teach a computer architecture or processor design class and not mention it.

And for what it's worth, we still covered C/C++ and assembly (MIPS + x86), in addition to Java.


I did a degree in High Performance Computing and Amdahl's Law came up there. It never came up back when I was working on my undergrad C.S degree, but I didn't finish my degree either, so maybe it would have eventually.


CS degree. Never heard of it. I graduated about 5 years ago.


Thanks. It's always interesting to see what people with comparable degrees encounter.


I'm not a CS/Eng person, rather an astrophysicist who spends a reasonable amount of my time working on simulations. Amdahl's law has been mentioned during lectures at week-long schools on numerical methods.


I finished undergrad 2 years ago.

It never really came up until I took the Parallel Computing course late into my degree.


An interesting blog post from Cleve @ TMW: http://blogs.mathworks.com/cleve/2013/11/12/the-intel-hyperc...


I did a master's in HPC at the University of Edinburgh. Amdahl's Law was mentioned from day 1.


For me it did, yes. I first saw Amdahl's Law in a parallel computing course. Many of my professors were of the old guard though, and they made sure we learned the fundamentals. I'm grateful for it.


I came across Amdahl's law repeatedly in my CS course. Interestingly, I also came across it when studying for an MBA, in the context of operations management and speeding up business processes.


EE degrees: yes, definitely. I wish CS students learnt it too. Every fancy language du jour eventually has to generate machine code, after all.


Yes, I'm about to finish up my undergrad CS degree, and Amdahl's Law came up in at least three separate classes, most notably my computer architecture course and my operating systems course. Side note, we were assigned to read Soul of a New Machine and the Pentium Chronicles in the computer architecture course, and most everyone enjoyed the books.


Australian National University - was discussed extensively in our 4th year Parallel Systems course, but not really anywhere else. Might have scored a mention in our 2nd year core Concurrent and Distributed Systems class, but I wouldn't remember.


I'm just outside of this range as I graduated with a BS in CS back in 2004, but learned about it when I took a computer architecture class in 2002, and I still enjoy seeing examples of it in real life.


Only did a few CS courses but it did come up in both the freshman sequence of courses and again in the Intro to Computer Architecture course. It was most certainly still being taught at my Java school.


Yes, in Operating Systems and in a parallel programming course. And on a few other occasions it was described informally; I don't remember if the name was actually mentioned then.


It comes up repeatedly in Computer Science and Computer Engineering coursework at University of Illinois. Assembly, C and C++ are unavoidable in both programs.


Definitely at CSULA. It's taught in OS, Multiprocessing in CS undergraduate, and in Multimedia and Networking in EE/CE at the grad level.


CS degree at Mackenzie, Brazil; the law was mentioned in the Parallel Computing course. I took that class last year.

We have used C/C++ with MPI and OpenMP.


I graduated CS just a few months ago. It was mentioned a few times, thankfully! The concept encouraged me to think about parallelism more.


Finished my degree about a year ago. Amdahl's Law came up in two courses (one of them mandatory for first year students).


In 2006 I saw it in my operating systems/architecture textbook (Bryant/O'Hallaron).


It definitely came up in CMU's 15-418 Parallel Architecture class.


I went to Maryland and yeah it came up in the Concurrency course


Yes, Computer Architecture course for EE at UAEU.


I have a dissenting opinion: Amdahl's law is obvious and therefore it doesn't need to be explicitly taught.


I agree with your (apparently unpopular) opinion. I'd never heard of this as a named law before, but it's just a restatement of critical path analysis in a computing context.


> I agree with your (apparently unpopular) opinion. I'd never heard of this as a named law before, but it's just a restatement of critical path analysis in a computing context.

I suspect it's one of those things that seems obvious in hindsight or in the context of how computers work today. But when it was first discussed, it may not have been as obvious.


I understand it was used as a marketing punchline. Further, the gripe that systems wouldn't scale turned out to be a challenge rather than a death sentence.


For me, yes.


At university, we used an IBM 370 for most of our programming classes. The 370 assembler was interesting to program on after learning 6502 assembler in high school and using 6809 for an EE class. I still have my banana book.

Flandreau, S.D., is also the home of an Indian boarding school. It's not exactly a big town, even by local standards.

[edit] love the quote “He’s always been right up there with Seymour Cray or Steve Wozniak,”

We lost Cray too early (f'n drunk driver) much like Jay Miner. I guess I should be grateful that unlike a lot of other professions, we walk the earth at the same time as our legends.


The purpose of Amdahl's 1967 paper was to dissuade customers from the idea that multiprocessors were cost-effective (read: more expensive for him to produce at Amdahl Corp). In a certain sense, he was correct. Ironically, however, that same paper is now the most quoted in the parallel processing literature. Moreover, he never wrote down the "law" that is now attributed to him---a marketing coup.


> Moreover, he never wrote down the "law" that is now attributed to him---a marketing coup.

https://en.wikipedia.org/wiki/Stigler's_law_of_eponymy strikes again.


Stigler's law - that is, the fact that eponyms often deny credit for an invention to the many people who were relevant to it - is one of the reasons why I try to avoid eponymy where possible.

The other reason is that eponymy is often a symptom of laziness. Rather than finding a good, descriptive name for something, you end up with an eponym that is not at all evocative of the meaning behind it. For example, and without putting too much thought into it, Amdahl's law could have been called something like law of scaling or law of bottlenecks.


> The other reason is that eponymy is often a symptom of laziness. Rather than finding a good, descriptive name for something, you end up with an eponym that is not at all evocative of the meaning behind it.

The tongue-in-cheek flip side of this coin is that, when discovering something, one should give it a descriptive but complex name. That way, people are more likely to refer to it by the names of the discoverers than by the clumsy descriptive title the discoverers chose. :)


I completely agree with nhaehnle's law.


Most people don't grok that

(IBM).1960s+1970s = (Apple+Microsoft+Google+Dell+OpenSource).today

for many reasons, but mostly because of the 360 Series, upon which I and many people my age got started.

I don't miss the COBOL ENVIRONMENT DIVISION or the punch cards much, but I do miss the technical brilliance which captured the industry.

RIP Sweet Genius


For those who may not have heard of him: https://en.wikipedia.org/wiki/Amdahl%27s_law


Amdahl's law is such a powerful idea. I use it to prioritize development items every week if not every day.

Always sad to see a story like this on hacker news. I'll be talking about his ideas for the rest of my career, guaranteed.
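
To make that concrete with made-up numbers: if profiling says a component is 40% of total runtime and you can make it 4x faster, the overall win is 1/(0.6 + 0.4/4) ≈ 1.43x, while a 10x speedup of something that is only 5% of runtime buys just 1/(0.95 + 0.05/10) ≈ 1.05x. The fraction of the whole that a change touches usually matters more than how dramatic the change itself is, which is why the law is so useful for prioritizing.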


> With funding from Fujitsu, he formed the Amdahl Corporation, setting up offices in Sunnyvale, Calif.

I didn't realize that the tradition of "avoid SV VC and get funding for cheap from an overseas operating company" went so far back.


It also ruined the company. They spent a lot of money on a potential merger with StorageTek, only for it to go down the drain when Fujitsu claimed that its legal right to a percentage on products sold would apply to the complete combined company as well (source: I work with a bunch of ex-StorageTek guys).


IBM totally dominated the computer industry. You couldn't get American funding to take on IBM.


We really need to rid the world of Alzheimer's as soon as possible. The bittersweet thing is that today's golden-era titans of innovation may not live long enough for these anti-aging treatments to be deployed.


I highly suspect those titans of innovation - in many areas - just happen to be among the first to get to the relatively low-hanging fruit in a new area... This pattern repeats in many fields.

Not to diminish Amdahl's achievements, of course.


Sure. I mean, I expect to read similar obituaries one day for the computer science giants of today, like Jeff Dean (hopefully not for a very long time!).

Amdahl's bio makes it sound like he was an effective engineer and good at business too, but that his career was somewhat empty of success after he left Amdahl. That should not detract from his earlier achievements. Very few people have an unbroken track record of sustained success across their whole career.


He really invented a lot of good mainframe technology. He worked at IBM and then left to start his own company, making IBM mainframe clones that ran faster and cost less than the IBM originals.

I grew up in the era that used IBM 360/370 mainframes and I learned them in college for FORTRAN and COBOL and JCL with DOS/VSE. I don't think colleges teach mainframe technology anymore now that PCs have taken over.

Things moved from PCs to mobile devices so quickly as well.

Edit: Typo


I wonder if there will ever be some kind of statue or memorial for one of those founding parents. Ada Lovelace, Alonzo Church, Grace Hopper, John McCarthy, Haskell Curry, Seymour Cray, Gene Amdahl... I'm sure that list is too short.


There's a small memorial for Grace Hopper in Arlington: http://parks.arlingtonva.us/locations/grace-murray-hopper-pa...


> Amdahl also benefited from antitrust settlements between IBM and the Justice Department, which required IBM to make its mainframe software available to competitors.

Anybody here who knows more about this? Did IBM have to sell the software for the same price, or did they have to give it away for free?


It meant that Amdahl's (and other manufacturers') customers could buy IBM software and run it on non-IBM hardware.


I still don't know the answer:

https://en.wikipedia.org/wiki/History_of_IBM#1969:_Antitrust...

"After the unbundling, IBM software was divided into two main categories: System Control Programming (SCP), which remained free to customers, and Program Products (PP), which were charged for."

So what were Amdahl's customers able to get? Didn't they get System Control Software from IBM (1)? Was it free for them?

1) I've found the Computerworld 1 Nov 1976 article mentioning IBM's MVT 21.8 running on Amdahl's 470.


SCP remained bundled with the hardware, so you only got it if you bought the hardware, but if you bought the software it belonged to you so you could run it on whatever hardware you liked. I suppose if you bought both an IBM and a non-IBM mainframe you could take the SCP from IBM and run it on a non-IBM box, but I'm not sure if that was addressed in the ruling. The point was to decouple software sales from hardware sales when it came to commercial (sold) software.


Very old versions of IBM OSes (including MVS and VM) are available for free, much like Sun would subsequently give away SunOS.


I wonder if it's worth it for someone else to try to make a mainframe clone. I remember some company selling the Hercules emulator on an HP IA64 machine (PSI, I think).

It would be fun to try to make an FPGA version.

Oh yeah, PSI: http://www.theregister.co.uk/2008/07/02/ibm_buys_psi/


Fond memories. I was working for a financial services company when we went from an IBM mainframe complex to an Amdahl one. Besides being responsible for data center automation I was facilities manager, so I was in charge of the project. It only took 3 weeks to complete the entire migration. Thank you Gene.



One minute of zeroes.


I think he deserves an HN black bar tribute.


It would seem that HN agrees. The black bar is now up.


The black bar looks strange on my tablet, not like I remember it from last time. The black is now about 1/7th of the bar and the remainder is orange. I seem to recall that the entire bar was black last time they did it, which I think worked much better.


Best bit is the implementation: spacer gif in a table cell like it's 1999.


Pretty sure it hasn't changed.


Some evidence it hasn't changed (as far as I can see): https://www.flickr.com/photos/turoczy/5886505173


Please consider making the black bar a link. When I first saw it, it wasn't clear why there's a black bar. Especially later if the post goes to the second page.


Might be a side effect of the mobile adaptation they did a few weeks ago.


Agreed, one of the giants.


Would it make sense to pin the relevant post to the top of the page when HN does a Black Bar tribute?


I wonder if it would be feasible to do this automatically when the top story contains the word "died" or synonyms thereof.


"Java is dead"-like posts would trigger it.

Also, I don't think automation of remembrance gestures is a good thing.


Hmm, you're right about hyperbolic deaths.

But I rather like the idea of the black bar being effectively triggered by a consensus of HN users rather than waiting for an admin.




