That hasn't really been my experience; I've found an abundance of documentation (sometimes getting lost down the rabbit hole). But I have kept clear of the corporate buzzword solutions, and that might be why.
Consider predicate types, or first-class null safety through unions.
If statements introduce propositions that type systems can take advantage of.
For example (pseudocode):

let idx = randomInt();
if (0 <= idx && idx < arr.length) {  // proposition introduced
    // idx now has the refined type: int where 0 <= idx && idx < arr.length
    // the proposition is an irrefutable proof that indexing is safe
    let value = arr[idx];
}
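Mainstream checkers already do a limited form of this for unions. Here's a minimal Python sketch (the find_index helper is mine, just for illustration); mypy or pyright will narrow `int | None` across the `if`:

def find_index(xs: list[int], target: int) -> int | None:
    # absence is encoded in the union type, not in a -1 sentinel
    for i, x in enumerate(xs):
        if x == target:
            return i
    return None

idx = find_index([3, 1, 4], 1)
if idx is not None:  # proposition introduced: idx is not None
    print(idx + 1)   # idx has been narrowed to int here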
I encourage you to explore why generics exist in the first place by reading up on topics such as Parametric Polymorphism, Higher-Kinded Types, and Higher-Kinded Polymorphism.
Better to start with the problem than with the most abstract formulation of its solution, which only makes sense after successive encounters with ever more complex problems.
Put simply, we can begin with why it should be that data structures have operations in common, rather than, say, each having its own specific version.
Well: the polymorphic `+` allows us to express a common idea, that of "appending". For each of the specific types int, string, and array, we can speak in the application domain of "appending" while, in the programming domain, "+"-ing, provided we introduce an interface for `+`, call it "appendable":
interface Appendable[A] {
    (+) : A + A -> A            // closed: appending two As yields an A
    a + 0 == 0 + a == a         // 0 is the identity element
    (a + b) + c == a + (b + c)  // appending is associative
}
This interface allows us to use `+` generically in a principled way. The technical name for Appendable is `Monoid`, but I prefer Appendable (incidentally, Monad would be `Sequenceable` or `Nestable`).
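As a rough sketch of what this buys us, here's the same idea in Python using a Protocol (the names Appendable and append_all are mine, purely illustrative):

from typing import Protocol, TypeVar

A = TypeVar("A", bound="Appendable")

class Appendable(Protocol):
    def __add__(self: A, other: A) -> A: ...

def append_all(empty: A, items: list[A]) -> A:
    # relies only on the Appendable laws: closure and the identity value
    total = empty
    for item in items:
        total = total + item
    return total

append_all("", ["foo", "bar"])  # 'foobar'
append_all([], [[1], [2, 3]])   # [1, 2, 3]
append_all(0, [1, 2, 3])        # 6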
Polymorphism is just the ability to speak as generically in the programming domain as we speak in the application domain, i.e., to use the double meanings of ordinary thinking in programming.
Yours is a beautiful example of how to shoot yourself in the foot with reckless generalization. Addition of integers is commutative; concatenation of strings is not. That's not "a common idea". Compilers can transform a+b into b+a for one, if it suits them, but not for the other.
While integers with addition are an additive monoid, and thus commutative, they're still also a valid multiplicative monoid (or semigroup, as described in the parent), and it's handy to be able to pass them to code that just needs an associative closed operation, maybe with an identity value.
Integer addition is a magma, semigroup, monoid, and group. It's _also_ a commutative version of each of those structures, but we can still definitely use it with functions only expecting multiplicative structures just fine.
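Concretely: a fold that assumes only a closed, associative operation plus an identity never needs commutativity, so the same integers work with either structure. A quick Python sketch (fold is my own name for it):

from functools import reduce
import operator

def fold(op, identity, xs):
    # assumes only closure, associativity, and a neutral identity;
    # commutativity is never required
    return reduce(op, xs, identity)

fold(operator.add, 0, [1, 2, 3])    # 6, ints as an additive monoid
fold(operator.mul, 1, [1, 2, 3])    # 6, ints as a multiplicative monoid
fold(operator.add, "", ["a", "b"])  # 'ab', strings under concatenation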
The issue I take with using + for a multiplicative monoid like string concatenation is that readers of code should expect + to be commutative. Using x or * is probably better; ⓧ is mathematically nice but a pain to type, so something like ++, <>, .., or some other new operator just for concatenation is imo best.
I'm not sure I see the reckless generalisation. I said `+` meant append, not add. We can use addition in the case of integers to serve as `append`, but it's a special case of appending where A+B == B+A.
In Python, from which I take the example, `+` performs that role:

[a() + a() for a in (str, bool, int, list, tuple)]  # ['', 0, 0, [], ()]
Just a nit: for floats it is not associative either, so reordering a long sum can change the result; I'm not sure how much freedom compilers have here. E.g. try summing many floats in decreasing order: chances are that adding a very small number at the end will not even change the value.
Addition of ints is associative, addition of floats is not, yet generalizing over these two numeric types is a core ability people expect from generics.
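A quick Python demonstration with IEEE-754 doubles (the values are arbitrary, chosen so the rounding is visible):

(1e16 + 1.0) + 1.0   # 1e16: each 1.0 is absorbed by rounding
1e16 + (1.0 + 1.0)   # 1.0000000000000002e16: grouping the small terms first keeps them

xs = [1e16] + [1.0] * 10
sum(xs)              # 1e16: left to right, every 1.0 is lost
sum(sorted(xs))      # 1.000000000000001e16: ascending order preserves them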
I don't believe it's true that algebraic data types "came from" FP. The name did, but the concept has existed since before high-level languages existed. Actually, before computers existed, since it's in Type Theory. And before that it had been in ordinary human thinking for thousands of years.
I mean, lambdas came from the lambda calculus, which predates FP. Type inference was also formulated as part of the typed lambda calculus. Pattern matching definitely came about prior to FP; SNOBOL (and its predecessor COMIT) is probably the first language that really made it 'a thing'.
Very few things FP uses came from FP; they came from the underlying math and CS, and are themselves specific formulations and implementations that fit within the context of broader, well-understood concepts.
The initial adopter/FOMO/hype phase is over.