While I appreciate you sharing this article, my free reads on the New York Times have long been used up, and I don't plan to buy a subscription. But I've been to a book "factory" at Anadolu University in Eskisehir, Turkey. The printers ARE huge!
Die-hard Hershey's fan here. Maybe I'm a pleb, but I am not a fan of "artisanal" anything (these guys put the "anal" in "artisanal"). I only eat maybe one bar a week. I probably get ten times the amount of palm oil in my wife's cooking (she's a Filipina), and at 51, I can still run two miles better than most 30-year-olds. But hey, if you want to blow $5 or $10 on a chocolate bar, be my guest. I save my $s for books.
It never ceases to amaze me how computer programmers can take the simplest things and make them mind-numbingly complex. Oy vey (and yes, I'm a programmer)! But if it makes him happy, more power to him.
I apologize for the long quote, but I promise that if you read to the end, you will see that the discussion is germane.
"I want to point out again the difference between writing a logical and a psychological language.
Unfortunately, programmers, being logically oriented, and rarely humanly oriented, tend to write and extol
logical languages. Perhaps the supreme example of this is APL. Logically APL is a great language and to
this day it has its ardent devotees, but it is also not fit for normal humans to use. In this language there is a game
of “one liners”; one line of code is given and you are asked what it means. Even experts in the language have
been known to stumble badly on some of them.
A change of a single letter in APL can completely alter the meaning, hence the language has almost no
redundancy. But humans are unreliable and require redundancy; our spoken language tends to be around
60% redundant, while the written language is around 40%. You probably think the written and spoken
languages are the same, but you are wrong. To see this difference, try writing dialog and then read how it
sounds. Almost no one can write dialog so that it sounds right, and when it sounds right it is still not the
spoken language.
The human animal is not reliable, as I keep insisting, so low redundancy means lots of undetected errors,
while high redundancy tends to catch the errors. The spoken language goes over an acoustic channel with
all its noise and must be caught on the fly as it is spoken; the written language is printed, and you can pause,
back scan, and do other things to uncover the author’s meaning. Notice in English more often different
words have the same sounds (“there” and “their” for example) than words have the same spelling but
different sounds (“record” as a noun or a verb, and “tear” as in tear in the eye, vs. tear in a dress). Thus you
should judge a language by how well it fits the human animal as it is—and remember I include how they are
trained in school, or else you must be prepared to do a lot of training to handle the new type of language you
are going to use. That a language is easy for the computer expert does not mean it is necessarily easy for the
non-expert, and it is likely non-experts will do the bulk of the programming (coding if you wish) in the near
future.
What is wanted in the long run, of course, is the man with the problem does the actual writing of the code
with no human interface, as we all too often have these days, between the person who knows the problem
and the person who knows the programming language. This date is unfortunately too far off to do much
good immediately, but I would think by the year 2020 it would be fairly universal practice for the expert in
the field of application to do the actual program preparation rather than have experts in computers (and
ignorant of the field of application) do the program preparation.
Unfortunately, at least in my opinion, the ADA language was designed by experts, and it shows all the
non-humane features you can expect from them. It is, in my opinion, a typical Computer Science hacking
job—do not try to understand what you are doing, just get it running. As a result of this poor psychological
design, a private survey by me of knowledgeable people suggests that although a Government contract may
specify the programming be in ADA, probably over 90% will be done in FORTRAN, debugged, tested, and
then painfully, by hand, be converted to a poor ADA program, with a high probability of errors!"
- Dr. Richard Hamming, "The Art of Doing Science and Engineering..." Written in the late 1990s
It's an interesting comment, and I just read that chapter. But I find it amusing that he's calling Ada (correct spelling, BTW, not ADA) non-humane in comparison to Fortran. Unless it's numerical code, Fortran is pretty non-humane. Computed goto anyone?
Well, being a mathematician, and having written important texts on numerical analysis, numerical programming was probably foremost in Hamming's mind. Interestingly, he doesn't accuse C of the same issues. I don't really have an opinion one way or the other. I just remembered the quote, and thought I'd share it. Hamming was a pretty awesome dude, so I reference him from time to time.
Sure, FORTRAN then. The language that Hamming was referencing.
Sadly, though, obsolete doesn't mean absent. I saw plenty of ostensibly professional code in the early 2010s that was developed using computed go to. It was "delightful" and totally humane code.
An excellent book that needs to be read carefully by many college students, including engineers, scientists, political scientists, and business students.