Ada is not just another programming language (1986) (sci-hub.se)
81 points by kickitover on Oct 30, 2021 | 66 comments


I've worked in a lot of different languages, and picked up Ada 2012 this year. I found it very productive and wrote a tool I use daily for professional development in it. It's verbose, but it's amazing how much all the work it forces you to do up front saves you down the line. Built-in range checks for types, and pre/post conditions and invariants are interesting to work with. There's even a tool similar to Cargo, called Alire (https://alire.ada.dev/). It has its quirks, but overall I've been very impressed.
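
For example, a minimal sketch of a range-checked type plus an Ada 2012 contract (the names are mine, not from any real project):

    subtype Index is Positive range 1 .. 100;

    function Next (I : Index) return Index
      with Pre  => I < Index'Last,
           Post => Next'Result = I + 1;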


I like how in Ada you can specify custom range integers
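
For example (a minimal sketch):

    type Day_Of_Month is range 1 .. 31;
    D : Day_Of_Month := 15;  -- assigning 32 would raise Constraint_Error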


Pascal had subrange types before Ada, not only for integers but characters too.

    var day : 1 .. 31;
        letter : 'A' .. 'Z';
And you could specify ranges on array subscripts as well:

    var weight : array[-5 .. 5] of integer;


Wirth removed this again in Oberon, because experience with Pascal and Modula showed that it was not worth it.


Not sure that is the actual reason. (Also, define "worth it".)

Wirth had developed the original, but (IMO) misguided, notion that the faster a compiler can compile itself, the better it is.

Consequently he removed a lot from the later iterations of his languages/compilers.

Pascal had an enum-like feature: we could define an enumerated type.

Wirth removed it because... `CONST` should be enough for everyone.

Obviously it has simplified the compiler and increased the self-compilation speed. But I would not call that an improvement.


If you read the paper From Modula to Oberon by Wirth, which I already linked in a previous comment when answering the question about subranges, you can also find the reasoning behind the removal of enumerations.

"Enumeration types appear to be a simple enough feature to be uncontroversial. However, they defy extensibility over module boundaries. Either a facility to extend given enumeration types has to be introduced, or they have to be dropped. A reason in favour of the latter, radical solution was the observation that in a growing number of programs the indiscriminate use of enumerations(and subranges) had led to a type explosion that contributed not to program clarity but rather to verbosity. In connection with import and export, enumerations give rise to the exceptional rule that the import of a type identifier also causes the (automatic) import of all associated constant identifiers. This exceptional rule defies conceptual simplicity and causes unpleasant problems for the implementor."

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.578...


I’m not sure this was the reasoning. None of the languages in the direct Algol heritage have any sort of parametric polymorphism, including for range bounds, so working with arrays (which were, at least in Pascal, the primary application of range types) was either constant and repetitive fighting with the compiler or unsafe escape hatches all the way. (Usually both.) I believe this was the primary motivation for the removal of ranges.

(Of course, it’s literally impossible to make polymorphic range bounds work completely, due to the undecidability of Peano arithmetic. The most convincing attempt I’ve seen that doesn’t just go full dependent types on you is ATS, and even that is not exactly the friendliest of environments.)

Another system whose evolution was explicitly guided by self-compilation speed is Chez Scheme, and it’s a fine one. Generally it seems to me that it’s a valid optimization principle, but not a global one: you can fall into an unfortunate local minimum if you start in the wrong place. ... Well, so what, it’s not like there are any infallible design principles in programming.


That's very interesting. Subranges are a feature that I have always liked in Pascal and other languages.

Do you have any more detail about the reasoning, i.e. why Wirth thought this feature was "not worth it" for Oberon? Were programmers not using this feature in Pascal and Modula-2? Did Wirth consider it more noise or complexity in those languages?


Found this paper "From Modula to Oberon" by Wirth himself

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.578...

"Subrange types were introduced in Pascal (and adopted in Modula) for two reasons: (1) to indicate that a variable accepts a limited range of values of the base type and to allow a compiler to generate appropriate guards for assignments, and (2) to allow a compiler to allocate the minimal storage space needed to store values of the indicated subrange. This appeared desirable in connection with packed records. Very few implementations have taken advantage of this space saving facility, because the additional compiler complexity is very considerable. Reason 1 alone, however, did not appear to provide sufficient justification to retain the subrange facility in Oberon."

"... was the observation that in a growing number of programs the indiscriminate use of enumerations (and subranges) had led to a type explosion that contributed not to program clarity but rather to verbosity."

Instead, the proposed solution is to use SET.

"With the absence of enumeration and subrange types, the general possibility of defining set types based on given element types appeared as redundant. Instead, a single, basic type SET is introduced, whose values are sets of integers from 0 to an implementation-defined maximum."


Thank you :-)

It seems subranges could be something of a double-edged sword. Too many enumerations and subranges in a program add complexity. However, without them I presume the guards need to be coded manually elsewhere in the program.

It's really fascinating to see the reasoning a programming language designer makes when choosing what feature to include - or exclude.


> without them I presume the guards need to be coded manually elsewhere in the program.

That is my conclusion as well. Reading the Oberon manual, a SET seems to be just a normal set, but for integers only; thus you have to manually check if your integer value is part of the set within the procedure itself. However, that check will be a runtime check, not the compile-time check you would get with a proper enumeration type (I assume). And to my understanding Oberon doesn't have any error handling, so how exactly you would promote an out-of-range check to an error is unknown to me.

I guess that you would probably want to define constants to fill the set with, thus adding some more noise.

  CONST
    RED = 1; GREEN = 2; BLUE = 3;
  VAR
    colors: SET;

  colors := {RED, GREEN, BLUE};
So you can use it in a procedure call:

  draw(RED);
Compare to the Pascal enumeration:

  type COLORS = (RED, GREEN, BLUE);
It feels like Wirth was somewhat obsessed with the idea of type extension in Oberon: it should be possible to import any module (program) and extend it without restrictions, and enumerations created the problem of how you would extend them.

If you have a color module defining the enumeration COLORS together with a bunch of procedures accepting COLORS, then in another custom module you want to extend COLORS with the color YELLOW plus some YELLOW-specific procedures.

But if you pass an extended enumeration E' to a procedure that accepts the base enumeration E, which should be legal similar to how you can pass T' as T, the procedure could receive illegal values, in this case YELLOW. What you need to do instead is create your own CUSTOM_COLORS enumeration, manually include every color from COLORS, add YELLOW, and, when calling procedures within the colors module, type cast from the CUSTOM_COLORS to the COLORS enumeration. Perhaps this is the type explosion Wirth was referring to.

Out-of-range values would still be a problem if you extend a SET, but you avoid the type casts.

This is how I interpret Wirth's reluctance toward enumerations (and subranges).

https://www.inf.ethz.ch/personal/wirth/Oberon/Oberon07.Repor...


Nim borrows from both. It's nice using it for embedded work:

    proc set_mode(mode: range[0..3]) ## SPI mode (0-3)


Now let's say your name includes one of these weird 'Ä' characters...


Yes, but Ada extended the number of usable types for ranges, e.g. including fixed-point numbers.

All those that existed in Pascal also exist in Ada.
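
For instance, a fixed-point type with a range (a minimal sketch, names are mine):

    type Voltage is delta 0.001 range -12.0 .. 12.0;
    V : Voltage := 3.3;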


Ah yes, it's been a while since I've touched Pascal. Good times.


Other than the "older" languages in the Pascal family, there is an extended "subset" feature in Raku[0], which is pretty cool.

For number ranges you would say

    subset HandCountable of Int where 1 <= * <= 5
[0] https://ohmycloudy.github.io/24.html


I've always wondered - does Ada track or enforce these ranges?

Like, if I have a variable `i` in range 0...10 and write the Ada equivalent of `i++`, does Ada know that from there on out `i` is in range 1...11? Something like TypeScript's type narrowing?


Yes, it's enforced by any _runtime_ operation. See: https://www.adaic.org/resources/add_content/standards/05rm/h...


The range wouldn’t change in Ada. The type for a variable is fixed at compile time. It will ensure (at runtime and partially at compile time) that the value doesn’t go outside the range, though.
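
A minimal sketch of what that looks like (my example, not from the standard):

    procedure Demo is
       subtype Small is Integer range 0 .. 10;
       I : Small := 10;
    begin
       I := I + 1;  -- compiles, but raises Constraint_Error at run time
    end Demo;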


Usually with runtime checks, but it can also in some situations be done statically, such as with SPARK.


Ada was the programming language used when I took computing science in Glasgow, Scotland in 2002.


Western Washington University used it for intro CS classes until about 8 years ago. They switched to the professor's choice of Java or Python, which most people seem to prefer now. Some of the professors do still use it for classes on concurrency and proofs.

I really appreciated the sort of errors the compiler gives you but remember struggling to find good beginner level documentation. I kept landing on more specification-style docs which were more confusing than helpful. Also diving straight into Java after Ada was a shock.


It was Pascal, Haskell* and C back in the 1990s.

* and variants of Haskell, e.g. Gofer


Fyt like the day in Glasgow?


Ada is a language that has been on my "todo list" of languages for a while (just after FORTH). I generally hate this term, but Ada does kind of look like a language that was "ahead of its time", specifically with regard to concurrency.


While studying how concurrency is done in Ada is certainly instructive, and even today many languages and multi-threading libraries have poorer support for concurrent programming than Ada had from the beginning, a language that was really ahead of its time regarding concurrency was IBM PL/I.

15 years before Ada, in 1964, IBM PL/I had facilities for multitasking that e.g. included a wait function that could do everything that can be done with WaitForMultipleObjects / WaitForSingleObject. POSIX threads suck because they lack such a powerful wait function, and many other multi-threading libraries, e.g. the C++ standard library, are forced to offer only what is available everywhere, so they do not have anything that is missing from POSIX threads.
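
For comparison, Ada's selective accept does let a single task wait on several events at once; a minimal sketch (my example, not from either language's documentation):

    task type Server is
       entry Request_A;
       entry Request_B;
    end Server;

    task body Server is
    begin
       loop
          select
             accept Request_A;  -- serves whichever entry is called first
          or
             accept Request_B;
          or
             terminate;  -- exit once no caller can ever arrive
          end select;
       end loop;
    end Server;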


Any kernel-based WMO is not going to work well for fast-pathed primitives. For example, on Windows you can't WMO on critical sections or keyed events.

But you can implement a pure userspace or hybrid user/kernel WMO even on top of something like Unix poll just fine, so saying that the standard lacks it because of POSIX is wrong.


The UNIX wait system call, introduced in 1970 or 1971, after UNIX was ported to the PDP-11, provided a subset of the features of the PL/I wait, e.g. it allowed waiting until any of the children of a process terminates.

So UNIX had very early some kinds of waiting for multiple events. A decade later, various other waiting functions, e.g. select and poll were introduced, for networking.

I did not say anything about UNIX processes, but about the standard POSIX threads, the normal way to use multiple threads in a single process under UNIX-compatible operating systems. For example, with pthreads, you can join only a single thread. You can join all threads in an inefficient way, joining them one-by-one, but there is no way to join the first thread that happens to terminate, which is actually the most frequent thing that you want to do.

You can achieve this in a convoluted and inefficient way, using various forms of communication between threads, but this is a very basic feature that should have existed in the standard.

Taking into account that various methods of waiting for multiple events had existed for decades in UNIX, it is even more baffling that this was omitted in the POSIX threads standard. Like I have said, this has also constrained the later standards, e.g. the C++ standard.


Wait doesn’t wait on multiple objects. It just checks the process table to see if any children have exited. If they haven’t, it waits for a signal... any signal! It also sets a handler for SIGCHLD. Unix in general can’t do the same for an arbitrary object because a) it requires that the waitable only signal a single thing, and b) there are only a small number of signals! So wait is a little special. Generally, waiting on multiple objects requires a general system to handle asynchronous IPC, which Unix doesn’t have (unlike, say, Windows).

Also if you want to wait on multiple events, use a semaphore!


Ada has every feature you could ever want to express programmatic intent.


A technical and non-technical article on the development of the Ada programming language.


Well, it's essentially a PR piece; it gives lots of reasons why Ada is better than everything without actually showing any concrete examples.


The Wikipedia article has some good stuff here: https://en.m.wikipedia.org/wiki/Ada_(programming_language)


I'm very aware of what Ada is like (I was doing my thesis on compiler design back when it was still mostly a creature of committees) - I was more criticising the article, not the language


Ahh ok


Given that everybody has limited time to learn new programming languages, does HN recommend learning Rust or Ada?


The consensus is probably Rust, and even as an Ada fan/admirer I’d recommend it if you only have time for one. But you probably have time for at least a basic intro to Ada, and AdaCore has some good and quick(ish) introductions at https://learn.adacore.com.


Not related to Ada but I just find it so fantastic that we still have sci-hub and we can conveniently share paywalled documents. I love sci-hub.


I apologize for the long quote, but I promise that if you read to the end, you will see that the discussion is germane.

"I want to point out again the difference between writing a logical and a psychological language. Unfortunately, programmers, being logically oriented, and rarely humanly oriented, tend to write and extol logical languages. Perhaps the supreme example of this is APL. Logically APL is a great language and to this day it has its ardent devotees, but it is also not fit for normal humans to use. In this language there is a game of “one liners”; one line of code is given and you are asked what it means. Even experts in the language have been known to stumble badly on some of them.

A change of a single letter in APL can completely alter the meaning, hence the language has almost no redundancy. But humans are unreliable and require redundancy; our spoken language tends to be around 60% redundant, while the written language is around 40%. You probably think the written and spoken languages are the same, but you are wrong. To see this difference, try writing dialog and then read how it sounds. Almost no one can write dialog so that it sounds right, and when it sounds right it is still not the spoken language.

The human animal is not reliable, as I keep insisting, so low redundancy means lots of undetected errors, while high redundancy tends to catch the errors. The spoken language goes over an acoustic channel with all its noise and must be caught on the fly as it is spoken; the written language is printed, and you can pause, back scan, and do other things to uncover the author’s meaning. Notice in English more often different words have the same sounds (“there” and “their” for example) than words have the same spelling but different sounds (“record” as a noun or a verb, and “tear” as in tear in the eye, vs. tear in a dress). Thus you should judge a language by how well it fits the human animal as it is—and remember I include how they are trained in school, or else you must be prepared to do a lot of training to handle the new type of language you are going to use. That a language is easy for the computer expert does not mean it is necessarily easy for the non-expert, and it is likely non-experts will do the bulk of the programming (coding if you wish) in the near future.

What is wanted in the long run, of course, is the man with the problem does the actual writing of the code with no human interface, as we all too often have these days, between the person who knows the problem and the person who knows the programming language. This date is unfortunately too far off to do much good immediately, but I would think by the year 2020 it would be fairly universal practice for the expert in the field of application to do the actual program preparation rather than have experts in computers (and ignorant of the field of application) do the program preparation.

Unfortunately, at least in my opinion, the ADA language was designed by experts, and it shows all the non-humane features you can expect from them. It is, in my opinion, a typical Computer Science hacking job—do not try to understand what you are doing, just get it running. As a result of this poor psychological design, a private survey by me of knowledgeable people suggests that although a Government contract may specify the programming be in ADA, probably over 90% will be done in FORTRAN, debugged, tested, and then painfully, by hand, be converted to a poor ADA program, with a high probability of errors!"

     - Dr. Richard Hamming, "The Art of Doing Science and Engineering..." Written in the late 1990s


It's an interesting comment, and I just read that chapter. But I find it amusing that he's calling Ada (correct spelling, BTW, not ADA) non-humane in comparison to Fortran. Unless it's numerical code, Fortran is pretty non-humane. Computed goto anyone?


Well, being a mathematician, and having written important texts on numerical analysis, numerical programming was probably foremost in Hamming's mind. Interestingly, he doesn't accuse C of the same issues. I don't really have an opinion one way or the other. I just remembered the quote, and thought I'd share it. Hamming was a pretty awesome dude, so I reference him from time to time.


Fortran (name since 1990) has SELECT CASE to make Computed GO TO obsolete. Maybe you are thinking of FORTRAN?


Sure, FORTRAN then. The language that Hamming was referencing.

Sadly, though, obsolete doesn't mean absent. I saw plenty of ostensibly professional code in the early 2010s that was developed using computed go to. It was "delightful" and totally humane code.


Ada is just another programming language.


True, but nevertheless Ada is an important language in the history of programming languages.

After 1970, there were fewer and fewer innovations in programming languages, in the sense of features that had not existed in any earlier language.

Many new languages have been introduced since then and some of them might be better than most previous languages, but usually the new languages offer only new combinations of features that have previously existed in different languages and not anything really new.

Ada is important, because in 1979 it included a few features never provided before. Some of those features have been introduced only recently in more popular languages, while others are still missing from most languages.

For anyone who wants to create or improve a programming language, Ada is on a long list of mandatory programming languages that must be understood well before attempting to do anything that is intended to be better. Sadly, there are many examples of ugly misfeatures in recent programming languages, which demonstrate that their authors were not aware of the state of the art 40 or 50 years ago, and so they solved again, but badly, problems that had already been solved well in the distant past.

Unfortunately Ada also had defects, some of which were mandated by the Department of Defense requirements.

The defect that is most universally accepted is that it is too verbose.


>> The defect that is most universally accepted is that it is too verbose.

Another huge barrier (especially to early adoption) was the cost of Ada toolchains.

Even today, there are proprietary Ada implementations that cost thousands of dollars per seat.


There are also C and C++ toolchains that cost similar amounts (if you want to use them for safety critical systems). But they do have more free or open source options than Ada does. Fortunately FSF GNAT is free and unencumbered (unlike AdaCore's release of GNAT GPL).


> But they do have more free or open source options than Ada does. Fortunately FSF GNAT is free and unencumbered (unlike AdaCore's release of GNAT GPL).

Sorry, I don't quite get what you're trying to say here. You mean unencumbered by being free or open source?


GNAT GPL removes the runtime exception, so if you build something with it linked to its standard library, it's also supposed to be open sourced. This means you can't (in a legal/technical sense, not a true prohibitive sense) make closed source software with it. FSF GNAT doesn't remove that exception, so it can be used in releasing closed source software.

That's the encumbrance that GNAT GPL imposes and FSF GNAT does not.


Got it, thanks!


>Some of those features have been introduced only recently in more popular languages, while others are still missing from most languages.

Can you give an example of an Ada feature missing from most languages? I know Ada is supposed to be good for writing reliable software; are there any important features related to that which other languages could adopt?


A feature that for a long time was missing from most languages, but which has been adopted by many during the last decade, is to accept separator characters in numbers, to improve the readability of long numerical constants.

Ada introduced this in 1979, by allowing "_" in numbers. Cobol, in 1960, allowed hyphens in identifiers, for better readability. Because hyphens can be confused with minus, IBM PL/I, in 1964, replaced the hyphen with the low line, which remains in use to this day in most programming languages. Ada extended the usage from identifiers to numbers. Most languages have followed Ada and also use "_" for this purpose, except C++14, which, based on a rationale that I consider extremely wrong, substituted a single quote for the low line.
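
For example, in Ada (the constant name is mine):

    Speed_Of_Light : constant := 299_792_458;  -- "_" used purely for readability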

While this is an example of an Ada feature that could be easily adopted in any other language, other features are more difficult to adopt without important changes in the language, so they did not spread much.

An example is the specification of procedure/function parameters as being of three kinds: in, out, and inout.
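
In Ada syntax (a minimal sketch, names are mine):

    procedure Step (Input  : in     Integer;   -- read only
                    Result :    out Integer;   -- written; no initial value passed in
                    Accum  : in out Integer)   -- read and written
    is
    begin
       Result := Input * 2;
       Accum  := Accum + Result;
    end Step;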

This feature was not invented by the Ada team, but by one of the authors of the DoD IRONMAN language specifications, maybe by David Fisher, but the DoD documents do not credit any authors.

In the predecessor of Algol, IAL 1958, the procedure parameters had to be specified as in or out. However this feature was dropped in ALGOL 60. Nevertheless, there was a programming language, JOVIAL, which, unlike most programming languages, was derived directly from IAL 1958, and not from the later version, ALGOL 60.

So JOVIAL inherited the specification of parameters as in or out. JOVIAL happened to be used in many DoD projects, and because of this it influenced the initial versions of the DoD requirements, which eventually resulted in Ada.

The first DoD requirements included the specification of in and out parameters, but in the beginning the authors did not have a good understanding of how the parameter specification should be used, so the DoD requirements had bad wording, implying that this specification was meant to determine whether the parameters shall be passed by value or by reference.

After several revisions of the DoD requirements, in 1977, the IRONMAN requirements were issued, which were very much improved. By the time when IRONMAN was written, the authors had realized that whether the parameters are passed by value or by reference is an implementation detail that must be decided by the compiler and which must be transparent for the programmer.

Moreover, they realized that 3 categories must be specified, i.e. out and inout must be distinct, because the semantics are very different and the compiler must perform different actions to implement them correctly.

Many current programming languages are much more complicated than necessary because they lack this 3-way distinction of parameters.

The language most affected by this is C++, which struggled for 30 years, from 1980 until 2011, before it succeeded in including the so-called "move semantics" in the language, to avoid redundant constructions and destructions of temporaries. Even though the extra temporaries may now be avoided, this requires a contorted syntax.

All such problems could have been trivially avoided from the beginning, if C++ had taken the "out" and "inout" specifications from Ada. During the transition from C with Classes to C++, in 1982-1984, C++ was nonetheless strongly influenced by Ada in introducing overloaded functions, overloaded operators and generic functions (templates).

While C++ introduced reference parameters to avoid writing large quantities of "&" and "*", as in C, it would have been much better to apply the Ada method, where it is completely transparent whether the parameters are passed by value or by reference, and the programmers never have to deal with "&" unless they use explicit pointers and pointer arithmetic, wherever pointers are really needed for their extra features, not for telling the compiler how to do its job.

This purpose is as obsolete as using the keyword "register" to tell the compiler where to allocate variables. Even the C/C++ compilers ignore the programmer's request that an input parameter be passed by value, and pass it by reference anyway if the parameter is too large. This should have been the rule for all kinds of parameters.


Thanks, interesting! However I can't help but notice that the in/out/inout stuff wouldn't do much for a modern dynamic language like Python or Ruby... are there features of Ada that those languages could do well to adopt?


> many examples of ugly misfeatures in recent programming languages

I've seen more misfeatures that seem to come from people not knowing the history of JCL. The lesson would be to plan for expansion of the semantic space.

> The defect that is most universally accepted is that it is too verbose.

It feels less verbose than Java or C#.


How verbose is it compared to others like Go or Rust?


Somewhat more verbose.

A great part of the verbosity is due to the fact that, unlike CPL/BCPL/B/C and the languages inspired by them, which replaced the Algol statement parentheses begin and end with "{" and "}" or similar symbols, Ada uses relatively long words as statement parentheses, e.g. loop and end loop.

On the other hand, a good feature of Ada is that it followed Algol 68 in having different kinds of statement parentheses for different program structures, so you do not need to spend any time wondering whether a closing "}" matches the one opened by a "for" or by an "if" many lines above.
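
For example (a minimal sketch, the procedure is mine):

    procedure Sum_Evens (Total : out Integer) is
    begin
       Total := 0;
       for I in 1 .. 10 loop
          if I mod 2 = 0 then
             Total := Total + I;
          end if;     -- unambiguously closes the if
       end loop;      -- unambiguously closes the for
    end Sum_Evens;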


Moreover, besides the long statement parentheses, the other main source of verbosity in Ada is that Ada does not use abbreviations.

Most programming languages use a set of abbreviations that have appeared in either PL/I or Algol 68, but Ada does not use them, for example Ada uses constant, procedure, character, integer instead of const, proc, char, int.


Even setting aside keywords, the real verbosity with Ada is that nearly everything is explicit, not implicit. In C you have many implicit conversions between (similar) types, in Ada these are always explicit. In C++ you have implicit instantiation of templates when they get used, in Ada you must explicitly instantiate a generic before it's used.
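
A minimal sketch of both points (names are mine):

    with Ada.Text_IO;
    procedure Demo is
       X : Integer := 3;
       Y : Float   := Float (X);  -- the conversion must be written out
       -- the generic must be instantiated explicitly before use:
       package Int_IO is new Ada.Text_IO.Integer_IO (Integer);
    begin
       Int_IO.Put (X);
    end Demo;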

On the other hand, arrays carry their range information with them and you don't need to pass that explicitly like in C. And having types with explicit ranges means you can use them and trust that they'll work correctly (which may include erroring out when used incorrectly, like adding 1 to the largest value), but in most other languages you'd have to include explicit range checks at (potentially) numerous locations throughout the code (did we start with a correct value, did we end with a correct value).
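
For example (a sketch, names are mine):

    type Int_Array is array (Positive range <>) of Integer;

    function Sum (A : Int_Array) return Integer is
       Total : Integer := 0;
    begin
       for I in A'Range loop  -- the bounds travel with the array
          Total := Total + A (I);
       end loop;
       return Total;
    end Sum;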

Tradeoffs.


You are right.

However the Department of Defense requirements prohibited any kind of implicit conversions, without making any distinction between safe conversions, which preserve the value and which are reversible, and unsafe conversions, like truncations or roundings or signed to unsigned conversions.

The complete lack in Ada of some very frequently needed implicit conversions is annoying, and it does not decrease the likelihood of bugs; rather, it increases it, because the resulting code bloat can obscure the erroneous absence of some meaningful operation.

However, this defect is on the DoD, not on the Ada authors.


Honestly, I don't get why people care so much about typing loop and end loop etc. in their code. But after using Nim, where I can just ignore the default style and write everything in snake_case (which brings me joy, as I find camelCase harder to read and annoyingly ugly), I think more languages should be style-agnostic (with tools for converting between styles for others' reading benefit). Ada could support curlies AND verbal curlies, allowing people who don't see any benefit in loop/end loop to just use {}.


With good tooling, it might be possible to have autocomplete once you write the b of begin or the l of loop, which would reduce the typing verbosity.


Like autocompletion, also syntax coloring becomes more important with verbose programming languages, to help distinguish the relevant text between the large areas occupied by keywords.


That's a good point too.


Yes, but SPARK is almost unique.


Maybe in the same way that every language is "just another programming language". Ada clearly has unique features.


No, that would be JAPL.



