
It's the boom/bust cycle we've experienced with Ruby on Rails, Node.js, Scala, and MongoDB.

The initial adopter/FOMO/hype phase is over.


"Gaming was over" / early sell signals:

* Valve going full cosmetics

* Half Life 3 never being released

* Biannual Call of Duty titles

* The end of community-hosted servers & server browsers

* The rise of Early Access & DLCs

* John Carmack leaving id Software


Welcome to Kubernetes!

I don't know what went wrong, but the ecosystem prefers software presented as a black box and shrouded in mystery.

If you contrast this with Envoy, it has stellar documentation -- Envoy wants you to know how it works!

Envoy is used and abused by Istio (a k8s wrapper), which has terrible documentation, a history of horrific upgrades/updates, and poor software quality.


That hasn't really been my experience; I have found an abundance of documentation (sometimes getting lost down the rabbit hole). But I have kept clear of the corporate buzzword solutions; that might be why.


The majority of my life has been out of my hands or control.

So it doesn't feel like "my life"; it feels more like a ride I can't get off or interface with.

For me to find value in something, it should grant, at a minimum, autonomy.


The only real rules are those the universe provides.

We may not get to choose our place in the cosmos, but on our own planet?

You have autonomy here.


Most people have a choice to die instead of working, but most people would also not consider that a choice.


I wonder how much that is "in your head"?


In this example, it's why Java added sealed interfaces; otherwise you're left with ∞ - 1 possibilities.

  if (!(node instanceof DomNode.Element element)) {
    return new RenderNode.Noop(/* ... */);
  }
  // The pattern variable gives us the narrowed type after the early return.
  Layout layout = element.layout();
  return new RenderNode.Styled(layout, /* ... */);
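
For context, a hedged sketch of the sealed hierarchy this assumes (only `Element` appears above; the `Text` variant and `Layout` usage are my invention):

  sealed interface DomNode {
    // Records are the only implementations; since they sit in the same
    // compilation unit, no explicit permits clause is needed.
    record Element(Layout layout /* ... */) implements DomNode {}
    record Text(String content /* ... */) implements DomNode {}
  }

With the interface sealed, a match over `DomNode` can be checked for exhaustiveness instead of defaulting to the ∞ - 1 case.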


It's true that we return a `RenderNode.Noop` in ∞ - 1 cases, but that's okay because we're not making any demands of `node` there.


Yes and no.

Consider predicate types, or first-class null safety through unions.

If-statements introduce propositions, which type systems could take advantage of.

example:

  let idx = randomInt();
  if (0 <= idx && idx < arr.length) { // proposition introduced
    // idx now has type: int where 0 <= idx && idx < arr.length
    // irrefutable proof of safe indexing by type propositions
    let value = arr[idx];
  }


We don't hold language authors accountable, and accept mediocrity.


I encourage you to explore why generics exist in the first place, via topics such as Parametric Polymorphism, Higher-Kinded Types, & Higher-Kinded Polymorphism.

The truth will set you free.


Better to start with the problem than with the most abstract formulation of its solution -- which only makes sense after successive encounters with ever more complex problems.

Simply, of course, we can begin with why it should be that data structures have operations in common -- rather than, say, each having their own specific versions.


Why not elaborate some more? I'm bored, so...

    2 + 2 == 4
    "Hello" + "World" == "HelloWorld"
    [2, 2] + [3, 4] == [2, 2, 3, 4]

Should we bother reusing `+` for this? Why not,

    2 intPlus 2 == 4
    "Hello" strPlus "World" == "HelloWorld"
    [2, 2] arrayPlus [3, 4] == [2, 2, 3, 4]
    
Well: the polymorphism of `+` allows us to express a common idea, that of "appending". If we introduce an interface for `+` -- call it "appendable" -- then for each of these specific types (int, string, array) we can speak in the application domain of "appending" whilst working in the programming domain of `+`:

    interface Appendable[A] {
      A + A -> A
      A + 0 -> A
    }  
    
This interface allows us to use `+` generically in a principled way. The technical name for Appendable is `Monoid`, but I prefer Appendable (incidentally, Monad is roughly `Sequenceable` or `Nestable`).

Polymorphism is just the ability to speak as generically in the programming domain as we do in the application domain, i.e., to use the double meanings of ordinary thinking in programming.
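
To make that concrete, here is a hedged Java rendering of the interface above (method and class names are my own):

    // An associative append plus an identity ("empty") element.
    interface Appendable<A> {
      A append(A left, A right); // A + A -> A
      A empty();                 // the 0 in A + 0 -> A
    }

    // One instance: strings under concatenation.
    class ConcatString implements Appendable<String> {
      public String append(String l, String r) { return l + r; }
      public String empty() { return ""; }
    }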


Yours is a beautiful example of how to shoot yourself in the foot with reckless generalization. Addition of integers is commutative; concatenation of strings is not. That's not "a common idea". Compilers can transform a+b into b+a for one, if it suits them, but not for the other.


While integers with addition form an additive (and thus commutative) monoid, they are still a valid multiplicative monoid (or semigroup, as described in the parent), and it's handy to be able to pass them to code that just needs an associative closed operation, maybe with an identity value.

Integer addition is a magma, semigroup, monoid, and group. It's _also_ a commutative version of each of those structures, but we can still use it with functions expecting only multiplicative structures just fine.

The issue I take with using + for a multiplicative monoid like string concatenation is that readers of code should expect it to be commutative. Using x or * is probably better; ⊗ is mathematically nice but a pain to type; so something like ++, <>, .., or some other new operator just for concatenation is imo best.
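
To illustrate, a minimal Java sketch (names are mine) of code that assumes only an associative closed operation plus an identity value, and so accepts both kinds of instance:

    import java.util.List;
    import java.util.function.BinaryOperator;

    class FoldDemo {
      // fold assumes only: op is closed and associative, identity is neutral.
      static <A> A fold(A identity, BinaryOperator<A> op, List<A> xs) {
        A acc = identity;
        for (A x : xs) acc = op.apply(acc, x);
        return acc;
      }

      public static void main(String[] args) {
        // Integer addition is commutative, but fold never relies on that...
        System.out.println(fold(0, Integer::sum, List.of(1, 2, 3)));          // 6
        // ...so non-commutative string concatenation works just as well.
        System.out.println(fold("", String::concat, List.of("a", "b", "c"))); // abc
      }
    }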


As above, I was thinking of Python in my choice of the `+` example.


I'm not sure I see the reckless generalisation. I said `+` meant append, not add. We can use addition in the case of integers to serve as `append`, but it's a special case of appending where A+B == B+A.

In Python, from which I take the example, `+` performs that role:

    [a() + a() for a in (str, bool, int, list, tuple)]
etc.


Just a nit: for floats it is not commutative either; not sure how much freedom compilers have here. E.g. try summing many floats in decreasing order -- chances are that adding a very small number at the end will not even change the value.


A nit's nit, float addition is commutative. It is not associative.


Addition of ints is associative, addition of floats is not, yet generalizing over these two numeric types is a core ability people expect from generics.
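
A quick way to see the float case, in Java (any IEEE 754 doubles behave the same):

  double a = (0.1 + 0.2) + 0.3; // 0.6000000000000001
  double b = 0.1 + (0.2 + 0.3); // 0.6
  System.out.println(a == b);   // false -- addition of doubles is not associative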


There are genuinely good things that came from FP and crept into popular languages (a small Java sketch below touches all four):

* Algebraic data types

* Pattern matching (with exhaustiveness)

* Lambdas / Closures

* Type inference
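
A hedged sketch of how the four combine in modern Java (the `Shape` type is my own example):

  class FpDemo {
    // Algebraic data type: a sealed interface with record variants.
    sealed interface Shape {
      record Circle(double radius) implements Shape {}
      record Rect(double w, double h) implements Shape {}
    }

    // Exhaustive pattern matching: no default branch, and the compiler
    // errors if a variant is added but not handled here.
    static double area(Shape s) {
      return switch (s) {
        case Shape.Circle c -> Math.PI * c.radius() * c.radius();
        case Shape.Rect r   -> r.w() * r.h();
      };
    }

    public static void main(String[] args) {
      // Lambdas/closures and (local) type inference via var:
      var shapes = java.util.List.<Shape>of(new Shape.Circle(1), new Shape.Rect(2, 3));
      shapes.forEach(s -> System.out.println(area(s)));
    }
  }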


I don't believe it's true that algebraic data types "came from" FP. The name did, but the concept has existed since before high-level languages existed -- actually before computers existed, since it's in type theory. And before that it was in ordinary human thinking for thousands of years.


As a usable programming construct though it did.

I mean, lambdas came from the lambda calculus, which predates FP. Type inference was also formulated as part of the typed lambda calculus. Pattern matching definitely came about prior to FP; SNOBOL (and its predecessor COMIT) is probably the first language that really made it 'a thing'.

Very few things FP uses came from FP; they came from underlying Math and CS, and are themselves specific formulations and implementations that fit within the context of broader well understood concepts.


Other options:

* Use linear / affine / session types / move semantics

* Describe an interface, then weave side effects through it (e.g., the State monad).

* Use lenses (minimal sketch below)
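
On lenses, a minimal Java sketch (all names are mine): a lens is a first-class getter/setter pair, so nested immutable updates compose instead of needing hand-written copy chains.

  import java.util.function.BiFunction;
  import java.util.function.Function;
  import java.util.function.UnaryOperator;

  // A lens from a whole S to a part A: read the part, or rebuild the
  // whole with the part replaced -- no mutation anywhere.
  record Lens<S, A>(Function<S, A> get, BiFunction<S, A, S> set) {
    S modify(S s, UnaryOperator<A> f) {
      return set.apply(s, f.apply(get.apply(s)));
    }
  }

  record Address(String city) {}
  record User(String name, Address address) {}

  class LensDemo {
    public static void main(String[] args) {
      Lens<User, Address> addr =
          new Lens<>(User::address, (u, a) -> new User(u.name(), a));
      User before = new User("Ada", new Address("London"));
      User after = addr.set().apply(before, new Address("Berlin"));
      System.out.println(after); // User[name=Ada, address=Address[city=Berlin]]
    }
  }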

