When an IBM salesman saw a coffee mug with the Amdahl logo at a customer's site, he was instructed to cut one million dollars off his price offer. That was the one-million-dollar coffee mug.
Amdahl sales people telling customers "this mug will get you a $1m discount if you ask for it" is not the same as IBM instructing sales people to discount $1m if they see the mug.
Obviously, knowing that your client is considering a rival company would make you consider dropping your price, but unless you have a better source I'm going to assume it was just nice spin around that fact from Amdahl people, not an actual IBM rule.
You have to wonder about the management at IBM that made him unhappy enough to leave in the first place. That kind of guy is going to do what he's going to do no matter what. Whether he does it for you or in competition against you is the only question.
There are very few lengths management shouldn't have gone to in order to keep a guy like that happy at IBM.
He disagreed with their technical staff about how to design a mainframe.
The technical staff said that the operating system should run on microcode to abstract away the hardware. That way it would be easier for customers to migrate to new hardware as it became available. And they could easily add a new instruction if they needed to.
Gene said that it would be an order of magnitude faster if it ran directly on the hardware, and it wasn't that hard to support that API going forward.
Both proved right. Gene built computers that were massively faster than IBM's and perfectly compatible. IBM then added an instruction in micro-code and made all of their software use it. Gene's installed base all crashed on IBM's new code, while IBM's was fine. The US government launched an anti-trust lawsuit, which wound up binding IBM's hands for many years after.
IBM mainframes today still run on micro-code. And it still makes them massively slower than they need to be, but with better backwards compatibility. The mainframe world depends on a lot of programs from the 1960s and 1970s that run, unchanged, today. Everyone else is using native instructions and runs faster.
I think you imply this, but to make it explicit: IBM mainframes run some instructions in microcode, others are native. The ones that are native are not slowed down by the existence of the microcode engine.
IBM analyzed programs-in-use to figure out which instructions bottlenecked real programs. These were the ones prioritized into native hardware. Others, where the bottleneck wasn't the central processor--like moving a block of memory from one location to another--were executed in microcode (at least back in the day: I was one of the engineers on the S/390 microcode engine.)
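That profiling rationale is essentially Amdahl's Law applied to instruction mixes: hardwiring an instruction only pays off in proportion to the fraction of total execution time it accounts for. A minimal C sketch of the arithmetic, with the function name and all frequencies/speedups invented purely for illustration:

    #include <stdio.h>

    /* Amdahl's Law: overall speedup from accelerating one part of a
     * workload, where f is the fraction of total time spent in that
     * part and s is how much faster that part alone becomes. */
    static double amdahl(double f, double s) {
        return 1.0 / ((1.0 - f) + f / s);
    }

    int main(void) {
        /* Hypothetical profile: an instruction accounting for 40% of
         * cycles, made 10x faster when hardwired... */
        printf("hot instruction:  %.2fx overall\n", amdahl(0.40, 10.0));
        /* ...versus a block-move at 2% of cycles: even a 10x local
         * speedup barely moves the needle, so it can stay in microcode. */
        printf("rare instruction: %.2fx overall\n", amdahl(0.02, 10.0));
        return 0;
    }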
A good opportunity to mention the Popek-Goldberg criteria. If Amdahl's machines had been able to efficiently trap on the use of the new instruction (and execute a workaround using their existing instruction set) they might have been OK. Depends what the instruction was used for. Popek-Goldberg discusses the case where you want to virtualise your hardware: instruction set design becomes really important.
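For anyone who hasn't run into it: Popek-Goldberg hinges on sensitive or unimplemented instructions trapping, so a monitor can catch them and run a software workaround. A toy C sketch of that trap-and-emulate dispatch idea, with entirely invented opcodes (nothing here reflects the real S/360 instruction set):

    #include <stdio.h>

    /* Invented opcodes for illustration only. */
    enum opcode { OP_ADD, OP_LOAD, OP_NEW_FANCY };

    /* Software workaround for an instruction this machine lacks. */
    static void emulate(enum opcode op) {
        printf("trap: emulating opcode %d in software\n", (int)op);
    }

    static void run(const enum opcode *prog, int n) {
        for (int i = 0; i < n; i++) {
            switch (prog[i]) {
            case OP_ADD:  printf("native ADD\n");  break; /* fast path */
            case OP_LOAD: printf("native LOAD\n"); break; /* fast path */
            default:
                /* An unimplemented instruction must trap rather than
                 * silently misbehave; the monitor then emulates it. */
                emulate(prog[i]);
            }
        }
    }

    int main(void) {
        enum opcode prog[] = { OP_ADD, OP_NEW_FANCY, OP_LOAD };
        run(prog, 3);
        return 0;
    }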
We at Cumulus have been told by our customers that having one of our rocket-turtle stickers on their laptop results in a much faster and easier negotiations about discounts with their Cisco sales rep. An Arista sticker probably works too.
"In 1946 he married Marian Quissell, who grew up on a farm four miles from his.
He received his bachelor’s degree in 1948 from South Dakota State University, in Brookings, where his wife worked as a secretary. She had dropped out of Augustana College in Sioux Falls, S.D., after her freshman year to go to work to help pay for her husband’s education."
- What a nice story of love, sacrifice and reward, given that they appear to have stayed together till the end.
Unless you personally know either of the people involved, I don't think you are in a position to make a judgement as to whether it was "love, sacrifice and reward" or "prevailing cultural pressures at the time that were always regretted later"
How does that make a difference? We're supposed to discount his wife's sacrifice because of potential social pressures?
If you really wanted to be a cold-hearted bastard about it, you could have done a cost-benefit analysis on the potential earnings of a theoretical physicist vs. a secretary in the early 1950s before you ruled it out as a financial decision, but then you couldn't have made an opportunistic political statement out of a man's death.
Well, the thrust of it was that the man's career is important, and the woman's isn't. So assuming they had no children at the time, it was entirely consistent with that for her to quit college and take a job with no career path in order to fund his education. (If they had children, of course, the ethic would have been that she should stay home to care for them... indeed, she would likely have had no alternative.)
I think frossie has a good point, actually. We can hope that it was as jusben1369 suggests, but we can't really know that. Maybe being married to Gene gave Marian all she wanted in life, and maybe it didn't.
I have faith we'll eventually be able to restore all those important processes. It's really hard to destroy information. We just need a way to read it back.
I do not perceive myself to be Nikolai, but I do not trust my memories enough to claim I could not possibly be him, provided someone had the needed technologies to interfere with this timeline and my memories of it.
“By sheer intellectual force, plus some argument and banging on the table, he maintained architectural consistency across six engineering teams,” said Frederick P. Brooks Jr.
This is one of the more impressive things to me. Keeping that many people on the same page is an amazing feat.
I met Gene Amdahl once, when I took a summer course in computer architecture at UCSC. We got a tour of the Amdahl plant. They had a huge prototype CPU built out of TTL, which they built before making custom ICs. Each board in the prototype became one IC. Each row of cards in the prototype became one board. Each rack became one row of cards. That was how you debugged hardware designs in 1975.
If anyone new to the industry (software or hardware) hasn't read this yet -- you really, really should. We had an MV-8000 at college and it performed quite well, considering the loads we threw at it (I recall it running Ada, COBOL, and Pascal compilers all at the same time).
In 1985 we were still prototyping with TTL packages.
By 1988 we were using Altera PALs, a big improvement in density. (PALs might have been available sooner, but I was working on microwave systems in the intervening years).
1985 as well, as it's how the Atari ST prototype was built; a board per chip, with the boards arranged in a circle around a core. It looked sort of like a little Cray-1.
FUD was first defined with its specific current meaning by Gene Amdahl the same year, 1975, after he left IBM to found his own company, Amdahl Corp.: "FUD is the fear, uncertainty, and doubt that IBM sales people instill in the minds of potential customers who might be considering Amdahl products."
Gene Amdahl was a remarkable person, both as an engineer and an entrepreneur.
The Computer History Museum has a very interesting interview with Gene Amdahl as part of its Oral History Collection - http://www.computerhistory.org/collections/catalog/102702492 - that is, quite frankly, more interesting than anything I could come up with right now.
I published a paper this year that used it directly and even proposed a modification of it to model job time for a distributed execution environment with a centralized master governing scheduling, task serialization, etc.
When you talk scale-out of any large system, or manycore whatever, Amdahl's law is absolutely fundamental, and when you hear someone say "perfect horizontal scalability," Amdahl's law governs when that will no longer be true.
People who do theory on distributed algorithms use it even if they don't realize it.
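To make the "perfect horizontal scalability" point above concrete: if the serial fraction is (1 - p), Amdahl's Law caps speedup at 1/(1 - p) no matter how many machines you add. A quick C sketch, with an assumed 95%-parallel workload:

    #include <stdio.h>

    int main(void) {
        double p = 0.95; /* assumed parallelizable fraction */
        /* Amdahl's Law: speedup on n workers = 1 / ((1-p) + p/n).
         * As n grows, the serial 5% dominates and the curve flattens
         * toward the 1/(1-p) = 20x ceiling. */
        for (int n = 1; n <= 4096; n *= 8)
            printf("n=%5d  speedup=%6.2fx\n", n, 1.0 / ((1.0 - p) + p / n));
        return 0;
    }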
Amdahl's Law was taught in my alma mater's "computer architecture" course. The course was primarily about CPU operation, and the law was specifically brought up in regard to instruction mixes and the CPU execution units provided, but was then generalized.
This was the 2nd in the series of four required "low level" courses (assembly programming, computer architecture, operating systems, compilers).
I am a systems software developer and Amdahl's Law is one of my standard interview questions.
A current undergrad and TA for CS courses at Cal (UC Berkeley) checking in here. The course I TA for, CS 61C Machine Structures (inst.eecs.berkeley.edu/~cs61c/) introduces the concept in our unit on parallelism. This is a required course for all undergraduate CS majors and is the third course in the introductory series.
It showed up in the context of a grad school seminar. I was shocked no one mentioned such a fundamental concept during my undergrad or regular classes. I've used the "law" to make arguments professionally several times.
Amdahl's Law definitely came up in my undergrad CE/CS courses at USC. I'm not sure how you could teach a computer architecture or processor design class and not mention it.
And for what it's worth, we still covered C/C++ and assembly (MIPS + x86), in addition to Java.
I did a degree in High Performance Computing and Amdahl's Law came up there. It never came up back when I was working on my undergrad CS degree, but I didn't finish that degree either, so maybe it would have eventually.
I'm not a CS/Eng person, rather an astrophysicist who spends a reasonable amount of my time working on simulations. Amdahl's law has been mentioned during lectures at week-long schools on numerical methods.
For me it did, yes. I first saw Amdahl's Law in a parallel computing course. Many of my professors were of the old guard though, and they made sure we learned the fundamentals. I'm grateful for it.
I came across Amdahl's law repeatedly in my CS course.
Interestingly, I also came across it when studying for a MBA, in the context of operations management and speeding up business processes.
Yes, I'm about to finish up my undergrad CS degree, and Amdahl's Law came up in at least three separate classes, most notably my computer architecture course and my operating systems course. Side note, we were assigned to read Soul of a New Machine and the Pentium Chronicles in the computer architecture course, and most everyone enjoyed the books.
Australian National University - was discussed extensively in our 4th year Parallel Systems course, but not really anywhere else. Might have scored a mention in our 2nd year core Concurrent and Distributed Systems class, but I wouldn't remember.
I'm just outside of this range as I graduated with a BS in CS back in 2004, but learned about it when I took a computer architecture class in 2002, and I still enjoy seeing examples of it in real life.
Only did a few CS courses but it did come up in both the freshman sequence of courses and again in the Intro to Computer Architecture course. It was most certainly still being taught at my Java school.
Yes, in Operating Systems and in a parallel programming course. And on a few other occasions it was described informally, don't remember if the name actually was mentioned then.
It comes up repeatedly in Computer Science and Computer Engineering coursework at University of Illinois. Assembly, C and C++ are unavoidable in both programs.
I agree with your (apparently unpopular) opinion. I'd never heard of this as a named law before, but it's just a restatement of critical path analysis in a computing context.
> I agree with your (apparently unpopular) opinion. I'd never heard of this as a named law before, but it's just a restatement of critical path analysis in a computing context.
I suspect it's one of those things that seems obvious in hindsight or in the context of how computers work today. But when it was first discussed, it may not have been as obvious.
I understand it was used as a marketing punchline. Further, the gripe that systems wouldn't scale turned out to be a challenge rather than a death sentence.
At university, we used an IBM 370 for most of our programming classes. The 370 assembler was interesting to program on after learning 6502 assembler in high school and using 6809 for an EE class. I still have my banana book.
Flandreau, S.D. is also the home of an Indian boarding school. It's not exactly a big town even by local standards.
[edit] love the quote “He’s always been right up there with Seymour Cray or Steve Wozniak,”
We lost Cray too early (f'n drunk driver) much like Jay Miner. I guess I should be grateful that unlike a lot of other professions, we walk the earth at the same time as our legends.
The purpose of Amdahl's 1967 paper was to dissuade customers of the idea that multiprocessors were cost-effective (read: more expensive for him to produce at Amdahl Corp). In a certain sense, he was correct. Ironically, however, that same paper is now the most quoted in the parallel processing literature. Moreover, he never wrote down the "law" that is now attributed to him---a marketing coup.
Stigler's law - that is, the fact that eponyms often deny credit to the many people who actually contributed to an invention - is one of the reasons why I try to avoid eponymy where possible.
The other reason is that eponymy is often a symptom of laziness. Rather than finding a good, descriptive name for something, you end up with an eponym that is not at all evocative of the meaning behind it. For example, and without putting too much thought into it, Amdahl's law could have been called something like law of scaling or law of bottlenecks.
> The other reason is that eponymy is often a symptom of laziness. Rather than finding a good, descriptive name for something, you end up with an eponym that is not at all evocative of the meaning behind it.
The tongue-in-cheek flip side of this coin is that, when discovering something, one should give it a descriptive but complex name. That way, people are more likely to refer to it by the names of the discoverers than by the clumsy descriptive title the discoverers chose. :)
It also ruined the company. They spent a lot of money on a potential merger with StorageTek, only for it to go down the drain when Fujitsu claimed that its legal right to a percentage on products sold would be applicable to the complete combined company as well (source: I work with a bunch of ex-StorageTek guys).
We really need to rid the world of Alzheimer's as soon as possible. The bittersweet thing is that today's golden-era titans of innovation may not live long enough for these anti-aging treatments to be deployed.
I highly suspect those titans of innovation - in many areas - just happen to be among the first to get to the relatively low-hanging fruit in a new area... This pattern repeats in many fields.
Sure. I mean, I expect to read similar obituaries one day for the computer science giants of today, like Jeff Dean (hopefully not for a very long time!).
Amdahl's bio makes it sound like he was an effective engineer and good at business too, but that his career was somewhat empty of success after he left Amdahl. That should not detract from his earlier achievements. Very few people have an unbroken track record of sustained success across their whole career.
He really invented a lot of good mainframe technology. He worked at IBM and then left to start his own company competing with IBM, making mainframe clones that ran faster and cost less than an IBM mainframe.
I grew up in the era that used IBM 360/370 mainframes and I learned them in college for FORTRAN and COBOL and JCL with DOS/VSE. I don't think colleges teach mainframe technology anymore now that PCs have taken over.
Things moved from PCs to mobile devices so quick as well.
I wonder if there will ever be some kind of statue or memorial for one of those founding parents.
Ada Lovelace, Alonzo Church, Grace Hopper, John McCarthy, Haskell Curry, Seymour Cray, Gene Amdahl...
I'm sure that list is too short.
> Amdahl also benefited from antitrust settlements between IBM and the Justice Department, which required IBM to make its mainframe software available to competitors.
Anybody here who knows more about this? Did IBM have to sell the software for the same price, did they have to give it for free?
"After the unbundling, IBM software was divided into two main categories: System Control Programming (SCP), which remained free to customers, and Program Products (PP), which were charged for."
So what were Amdahl's customers able to get? Didn't they get System Control Programming from IBM (1)? Was it free for them?
1) I've found the Computerworld 1 Nov 1976 article mentioning IBM's MVT 21.8 running on Amdahl's 470.
SCP remained bundled with the hardware, so you only got it if you bought the hardware, but if you bought the software it belonged to you so you could run it on whatever hardware you liked. I suppose if you bought both an IBM and a non-IBM mainframe you could take the SCP from IBM and run it on a non-IBM box, but I'm not sure if that was addressed in the ruling. The point was to decouple software sales from hardware sales when it came to commercial (sold) software.
I wonder if it's worth it for someone else to try to make a mainframe clone. I remember some company selling the Hercules emulator on an HP IA64 machine - PSI, I think.
Fond memories. I was working for a financial services company when we went from an IBM mainframe complex to an Amdahl one. Besides being responsible for data center automation I was facilities manager, so I was in charge of the project. It only took 3 weeks to complete the entire migration. Thank you Gene.
The black bar looks strange on my tablet, not like I remember it from last time. The black is now about 1/7th of the bar and the remainder is orange. I seem to recall that the entire bar was black last time they did it, which I think worked much better.
Please consider making the black bar a link. When I first saw it, it wasn't clear why there's a black bar. Especially later if the post goes to the second page.
http://dealwhisperers.blogspot.co.il/2015/07/a-million-dolla...
Here is a picture of it: http://www.deadprogrammer.com/fud-you