I take the "premium" experience of Apple hardware for granted, and if I wanted to buy a Windows laptop, I'd have no idea which brands make similar quality laptops for Windows. I'm shocked at the Windows laptops I see in the wild.
Is that what Microsoft Surface laptops are? Did Microsoft get in the game themselves to make sure a premium-quality laptop is available for Windows?
Yes about the Surface part. Microsoft thought all the other 2-in-1 touch devices were terrible and decided it needed to do something about it.
But Microsoft stopped caring about hardware a long time ago. I would never recommend the overpriced Surface Pro to anyone unless you have a special need for that form factor. As for the Surface Laptop, there are so many better choices.
I think the "wheat domesticated humans" argument is about changes to our behavior, our culture and social structures, rather than genetic change. It isn't domestication in the evolutionary sense. It would be like keeping zebras on a farm with horses and doing your best to tame and train them. You might be able to change their behavior so that they behaved differently from wild zebras, but it wouldn't be domestication unless you bred them over generations to produce a population that was genetically different from wild zebras.
> I don't like the term "enums" because of the overloading between simple integers that indicate something (the older, more traditional meaning)
I disagree with this. I'm old as hell, and I learned programming in a context where enums were always ints, but I remember being introduced to int enums as "we're going to use ints to represent the values of our enum," not "enums are when you use ints to represent a set of values." From the very beginning of my acquaintance with enums, long before I encountered a language that offered any other implementation of them, it was clear that enums were a concept independent of ints, and ints just happened to be an efficient way of representing them.
"Enum" is literally defined as a numbering mechanism. While integers are the most natural type used to store numbers, you could represent those numbers as strings if you really wanted. The key takeaway is that a enum is a value, not a type.
The type the link was struggling to speak of seems to be a tagged union. Often tagged union implementations use enums to generate the tag value, which seems to be the source of confusion. But even in tagged unions, the enum portion is not a type. It remains just an integer value (probably; using a string would be strange, but not impossible I guess).
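Sketching that in Go (hypothetical names, hand-rolled since Go has no native tagged unions): the union is the type; the tag that selects between payloads is just an integer value.

    // A hand-rolled tagged union. The tag is nothing but an iota-generated
    // integer; the "union" part is the payload it selects between.
    type shapeTag int

    const (
        tagCircle shapeTag = iota // 0
        tagRect                   // 1
    )

    type Shape struct {
        tag    shapeTag // the enum: a plain value used for runtime comparison
        radius float64  // meaningful when tag == tagCircle
        w, h   float64  // meaningful when tag == tagRect
    }

    func area(s Shape) float64 {
        switch s.tag {
        case tagCircle:
            return 3.14159265 * s.radius * s.radius
        default: // tagRect
            return s.w * s.h
        }
    }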
Disagree. Enums are named for being enumerable which is not the same thing as simply having an equivalent number.
It's incredibly useful to be able to easily iterate over all possible values of a type at runtime, or to otherwise treat enum types as actual enums rather than leaky wrappers around an int.
If you let an enum be any old number, or make the user implement that themselves, they also have to implement the enumeration of those numbers, along with any optimizations you can unlock by knowing ahead of time exactly what all the possible values of a type are and how to quickly enumerate them.
What's a better representation: letting an enum with two values be “1245927” or “0”, or maybe even a float or a string, whatever the programmer wants? Or should the values be 0 and 1, compiled directly into the program in a way that lets the programmer only ever think about the enum values and never the implementation?
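To sketch that second option in Go (made-up names), dense compiler-assigned values are exactly what make enumeration and lookup tables fall out for free:

    type Color int

    const (
        Red Color = iota // 0
        Green            // 1
        Blue             // 2
        numColors        // 3: the count falls out for free
    )

    // Dense values make "every possible value" trivial to enumerate...
    func allColors() []Color {
        out := make([]Color, 0, numColors)
        for c := Red; c < numColors; c++ {
            out = append(out, c)
        }
        return out
    }

    // ...and unlock optimizations like direct array indexing instead of a map.
    var colorNames = [numColors]string{"red", "green", "blue"}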
IMO the arbitrary-values approach completely defeats the purpose of an enum. It’s supposed to be a union type, not a static set of values of any type. If I want the enum to be tagged, or serializable to a string, that should be implemented on top of the actual enumerable type.
They're not mutually exclusive at all; it's just that making enums "just tags" forces you to think about their internals even when you don't need to serialize them, and it doesn't give you enumerability. So why would I even use those enums at all, when a string does the same thing with less jank?
> Enums are named for being enumerable which is not the same thing as simply having an equivalent number.
Exactly. Like before, in the context of compilers, it refers to certain 'built-in' values that are generated by the compiler, which is done using an enumerable. Hence the name. It is an implementation detail around value creation and has nothing to do with types. Types exist in a very different dimension.
> It’s supposed to be a union type
It is not supposed to be anything; the name only refers to what it is: a feature implemented with an enumerable. Which, again, produces a value. Nothing to do with types.
I know, language evolves and whatnot. We can start to use it to mean the same thing as tagged unions if we really want, but if we're going to rebrand "enums", what do we call what was formerly known as enums? Are we going to call that "tagged unions", since that term would no longer serve a purpose, confusing everyone?
That's the problem here. If we already had a generally accepted term for what was historically known as enums, then at least we could use that in place of "enums" and move on with life. But with "enums" trying to take on two completely different meanings (albeit somewhat adjacent ones, due to how things are sometimes implemented), nobody has any clue what anyone is talking about, and there is no clear path forward on how to rectify that.
Perhaps Go even chose the "iota" keyword in place of "enum" in order to try and introduce that new term into the lexicon. But I think we can agree that it never caught on. If, speaking to people who have never used Go before, I started talking about iotas, would they know what I was talking about? I expect the answer is a hard "no".
Granted, more likely it was done because naming the keyword that activates a feature after how the feature is implemented under the hood is pretty strange when you think about it. I'm not sure "an extremely small amount" improves anyone's understanding of what it is, but it at least tries to separate what the feature is from how it works inside the black box.
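For anyone who hasn't used Go: iota really is just an incrementing counter inside a const block, which is the whole implementation detail the keyword is named after (names below are mine).

    // iota starts at 0 at the top of each const block and increments per line.
    type State int

    const (
        StateIdle    State = iota // 0
        StateRunning              // 1
        StateDone                 // 2
    )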
It feels obvious that that's where the term originated, but I've never seen it used as a definition. In a mathematical context, something is enumerable if it can be put into 1:1 correspondence with the integers, but it doesn't need to be defined by a canonical correspondence. This suggests that being a finite (in a programming context where the set of ints is finite) set of discrete values is the defining feature, not the representation.
> In most applications of int enums, the particular integers can be chosen at random
I'm not sure the definition of "enum" enforces anything about how things are identified. Random choice would be as good as any other, theoretically. In practice, as it relates to programming, random choice is harder to implement due to the possibility of collisions. Much simpler is to just increment an integer, which is how every language I've ever used does it; even Rust's implementation is very similar to Go's.
But it remains that the key takeaway is that the enum is a value. The whole reason for using an enum is runtime comparison. It wouldn't even make sense for it to be a type, given how it is used. It is bizarre that it keeps getting called one.
Sum types can be put into 1:1 correspondence with the integers, barring the inclusion of a non-enumerable type in a language's specification that can be used in sum types. However, I would observe that this is generally a parlor trick, and it's fairly uncommon to simply iterate through a sum type. As is so often the case, the exceptions will leap to mind, and some people are already rushing to contradict me in the replies... but they are coming to mind precisely because they are the exceptions.
Yes, you can sensibly iterate on "type Color = Red | Green | Blue". I've written code to do the equivalent in various languages many times, and most complicated enumerations (in the old sense) that I build come equipped with some array holding all the legal values so people can iterate over them (if they are not contiguous for some reason), so I know it can be done and can be useful. But the instant you have a general number type, or goodness help you, a generalized String type, as part of your sum type, you aren't going to be iterating over all possible values. The way you put such sum types into a 1:1 correspondence with the integers won't match your intuition either, since you'll need to diagonalize on the type; otherwise any unbounded array/string will get you "stuck" on the mapping and you'll never get past it.
So while you can theoretically argue that it makes sense to call them an "enum", I don't like it, precisely because "enumerating" the "enum" types (being sum types here) is not, in general, a sensible operation. It is sensible in specific cases, but that's not really all that special. We don't generally name types by what a small percentage of the instances can do or are; we name them by what all instances can do or are.
A degenerate sum type "type Value = Value" is still a sum, albeit a degenerate one of "1", but nobody ever enumerates all values of "type Email = Email { username :: String, domain :: String }". (Or whatever more precise type you'd like to use there. Just the first example that came to mind.) There are also cases where you actively don't want users enumerating your sum type, e.g., some sort of token that indicates secure access to a resource, a token you shouldn't be able to obtain, even in principle, by simply enumerating across your enum.
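To put the contrast in Go terms (a sketch with made-up names; Go has no native sum types, so this uses the usual encodings):

    // Enumerable: three values, iterate away.
    type Color int

    const (
        Red Color = iota
        Green
        Blue
        numColors
    )

    func everyColor(visit func(Color)) {
        for c := Red; c < numColors; c++ {
            visit(c)
        }
    }

    // A one-constructor "sum" carrying two strings. Nobody enumerates all
    // possible Emails, and any 1:1 mapping to the integers has to
    // diagonalize over the unbounded string fields.
    type Email struct {
        Username string
        Domain   string
    }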
If it's called an "enum" I want to be able to "enum"erate it.
I was very happy with Pocket. After Mozilla discontinued it, I switched to Instapaper, which I barely use, for reasons I don't fully understand. All I know is that the Instapaper home screen feels unhelpful and off-putting to me.
The public health discourse about protein is in a weird place right now. The recommendations are higher than ever, yet people are constantly told not to think about protein, or to worry about excess protein intake instead.
Case in point: the Mayo Clinic article titled "Are you getting enough protein?"[0]
It claims that protein is only a concern for people who are undereating or on weight loss drugs, yet it cites protein recommendations that many people find challenging to meet (1.1 g/kg for active people, more if you're over 40 or doing strength or endurance workouts). To top it off, it's illustrated with a handful of nuts, which are a pretty marginal source of protein. It's bizarrely mixed messaging.
Different libraries composing well together is the default assumption in most of software development. Only in JavaScript have people given up on that and accepted that libraries don't work together unless they've been specifically designed, or at least given a compatibility layer, for the framework they're being used in.
Qt widgets don't work together with GTK widgets, and nobody considers this a crisis. I'm pretty sure you can't use Unreal engine stuff in Unity. GUIs require a lot of stuff to compose together seamlessly, and it's hard to do that in a universal way.
HTMX achieves its composability by declining to have opinions about the hard parts. React's ecosystem exists because it abstracts client-side state synchronization, and that inherent complexity doesn't just disappear. When you still have to handle the impedance mismatch between "replace this HTML fragment" and "keep track of what the user is doing", you haven't escaped the complexity. You've just moved it to your server, and you've traded a proven, opinionated framework's solution for a bespoke one that you have to maintain yourself.
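To make "moved it to your server" concrete, here's a minimal Go sketch of the kind of fragment endpoint HTMX steers you toward (the endpoint, names, and crude session handling are all made up for illustration):

    package main

    import (
        "fmt"
        "net/http"
        "sync"
    )

    // The client-side state a framework would have tracked now lives here.
    var (
        mu         sync.Mutex
        cartCounts = map[string]int{} // session ID -> item count
    )

    // Serves the fragment for something like hx-get="/cart/count".
    func cartCount(w http.ResponseWriter, r *http.Request) {
        mu.Lock()
        count := cartCounts[sessionID(r)]
        mu.Unlock()
        // HTMX swaps this into the page; keeping it consistent with every
        // other fragment the user can see is now your job, not a framework's.
        fmt.Fprintf(w, `<span id="cart-count">%d</span>`, count)
    }

    func sessionID(r *http.Request) string {
        c, err := r.Cookie("session")
        if err != nil {
            return ""
        }
        return c.Value
    }

    func main() {
        http.HandleFunc("/cart/count", cartCount)
        http.ListenAndServe(":8080", nil)
    }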
If anything, the DOM being a shared substrate means JS frameworks are closer to interoperable than native GUI toolkits ever were. At least you can mount a React component and a Vue component in the same document. They're incompatible with each other because they're each managing local state, event handling, and rendering in an integrated way. However, you can still communicate between them using DOM events. An HTMX date picker may compose better, but that's just because it punts the integration to you.
> I lost my enthusiasm when the community decided they wanted to use it as Haskell on the JVM
It's not the whole community, not by a long shot. Don't judge Scala by the Scala subreddit.
Most new things you'll see written about Scala are about solving difficult problems with types, because those problems are inexhaustible and some people enjoy them, for one reason or another. Honestly I think this shows how easy and ergonomic everything else is with Scala, that the difficulties people write about are all about deep type magic and how to handle errors in monadic code. You can always avoid that stuff when it isn't worth it to you.
The type poindexters will tell you that you're giving up all the benefit of Scala the moment you accept any impurity or step back from the challenge of solving everything with types, and you might as well write Java instead, but they're just being jerks and gatekeepers. Scala is a wonderful language, and Scala-as-a-better-Java is a huge step up from Java for writing simple and readable code. It lets you enjoy the lowest hanging fruit of functional programming, the cases where simple functional code outshines OO-heavy imperative code, in a way that Java doesn't and probably never will.
I'd love to replace Python with something simple, expressive, and strongly typed that compiles to native code. I have a habit of building little CLI tools as conveniences for working with internal APIs, and you wouldn't think you could tell a performance difference between Go and Python for something like that, but you can. After a year or so of writing these tools in Go, I went back to Python because the LOC difference is stark, but every time I run one of them I wish it was written in Go.
(OCaml is probably what I'm looking for, but I'm having a hard time getting motivated to tackle it, because I dread dealing with the tooling and dependency management of a 20th century language from academia.)
Have you tried Nim? Strongly and statically typed, versatile, compiles down to native code via C, interops with C trivially, has macros and stuff to twist your brain if you're into that, and is trivially easy to get into.
That looks very interesting. The code samples look like very simple OO/imperative style code like Python. At first glance it's weird to me how much common functionality relies on macros, but it seems like that's an intentional part of the language design that users don't mind? I might give it a try.
Yes, Go can hardly be called statically typed, when they use the empty interface everywhere.
Yes, OCaml would be a decent language to look into. Or perhaps even OxCaml. The folks over at Jane Street have put a lot of effort into tooling recently.
> Yes, Go can hardly be called statically typed, when they use the empty interface everywhere.
How often are you using any/interface {}? Yes, sometimes it's the correct solution to a problem, but it's really not that common in my experience. Certainly not common in ways that actually make life hard.
Also, since generics, I've been able to cut down my use of the empty interface even further.
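For example, a trivial helper like this (sketch, names mine) used to be a classic source of interface{} in my code; with type parameters it stays fully statically typed:

    // Before Go 1.18 this would have taken []interface{} and needed type
    // assertions at every call site; now the compiler checks everything.
    func contains[T comparable](xs []T, want T) bool {
        for _, x := range xs {
            if x == want {
                return true
            }
        }
        return false
    }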
You can replace Python with Nim. It ticks literally all your boxes (expressive, fast, compiled, strongly typed). It's as concise as Python, and IMO Nim's syntax is even more flexible.
I bounced off OCaml a few years ago because of the state of the tooling, despite it being almost exactly the language I was looking for.
I'm really happy with Gleam now, and recommend it over OCaml for most use cases.
I always assumed a runtime specialized for highly concurrent, fault-tolerant, long-running processes would have a noticeable startup penalty, which is one of the things that bothers me about Python. Is that something you notice with Gleam?
Did you consider using F#? The language is very similar to OCaml, but it has the added benefit of good tooling and a large package ecosystem (can use any .NET package).
I've heard a lot of good things about F#, but I've also heard that C# has taken all the best features from F# and that development has slowed down since. I don't know how true that is. There's also some irrational anti-Microsoft bias on my part: even though I know .NET runs fine on Linux now, the idea still feels weird to me. I suspect that if I'd actually tried F#, I would have stuck with it.
I have looked at the Fable compiler for F#, which can even compile F# to Rust. Very cool!
Rust might be worth a look. It gets much closer to the line count and convenience of dynamic languages like Python than Go does, plus it has a somewhat better type system. It also has fully modern tooling and dependency management. And it compiles to native code, of course.
I suppose you could try TypeScript, which can be compiled to a single binary using Node or Bun. Both Bun and Node can strip TypeScript types and compile a CLI into a single-file executable. This is what Anthropic does for Claude Code.
I think the downside, at least near-term, or maybe "challenge" would be the better word, is that anything richer than text requires a lot more engineering to make it useful. B♭ is text. Most of the applications on your computer, including but not limited to your browser, know how to render B♭ and C♯, and your brain does the rest.
Bret Victor's work involves a ton of really challenging heavy lifting. You walk away from a Bret Victor presentation inspired, but also intimidated by the work put in, and the work required to do anything similar. When you separate his ideas from the work he puts in to perfect the implementation and presentation, the ideas by themselves don't seem to do much.
Which doesn't mean they're bad ideas, but it might mean that anybody hoping to get the most out of them should understand the investment that is required to bring them to fruition, and people with less to invest should stick with other approaches.
> You walk away from a Bret Victor presentation inspired, but also intimidated by the work put in, and the work required to do anything similar. When you separate his ideas from the work he puts in to perfect the implementation and presentation, the ideas by themselves don't seem to do much.
Amen to that. Even Dynamicland has some major problems with GC pauses and performance.
I do try to put my money where my mouth is, so I've been contributing a lot to Folk Computer[1], but yeah, there are still a ton of open questions, and it's not as easy as he sometimes makes it look.
That's fair. It's still pre-alpha and under heavy development, but it takes the best of Dynamicland[1] and tries to push it a lot further.
In terms of technical details, we just landed support for multithreaded task scheduling in the reactive database, so you can do something like
When /someone/ wishes $::thisNode uses display /display/ with /...displayOpts/ {
and have your rendering loop block the thread. Folk will automatically spin up a new thread when it detects that a thread is blocking, in order to keep processing the queue. Making everything multithreaded has made synchronizing rendering frames a lot trickier, but recently Omar (one of the head devs) made statements atomic, so there is now atomic querying for the statements that need it.
In terms of philosophy, Folk is much more focused on integration, and comes from the Unix philosophy of everything as text (which I still find amusingly ironic when the focus is also a new medium). The main scripting language is Tcl, which is sort of a child of Lisp and Bash. We intermix HTML, regex, JS, C, and even some Haskell to get stuff done. Whatever happens to be the most effective ends up being what we use.
I'm glad that you mention that the main page is unhelpful, because I hadn't considered that. Do you have any suggestions on what would explain the project better?
Although one could make the argument that staff notation is itself a form of text, albeit one with a different notation than a single stream of Unicode symbols. Certainly, without musical notation, a lot of music is lost (although one can argue that musical notation is not able to adequately preserve some aspects of musical performance, which is part of why, when European composers working from sheet music tried to adopt jazz idioms into their compositions in the early twentieth century, they missed the whole concept of swing, which is essential to jazz).
> one could make the argument that staff notation is itself a form of text, albeit one with a different notation than a single stream of Unicode symbols
Mostly this is straightforwardly correct. Notes on a staff are a textual representation of music.
There are some features of musical notation that aren't usually part of linguistic writing:
- Musical notation is always done in tabular form - things that happen at the same time are vertically aligned. This is not unknown in writing, though it requires an unusual context.
- Relatedly, sometimes musical notation does the equivalent of modifying the value of a global variable - a new key signature or a dynamic notation ("pianissimo") takes effect everywhere and remains in effect until something else displaces it. In writing, I guess quotation marks have similar behavior.
- Musical notation sometimes relates two things that may be arbitrarily far apart from each other. (Consider a slur.) This is difficult to do in a 1-D stream of symbols.
> although, one can argue that musical notation is not able to adequately preserve some aspects of musical performance
Nothing new there; that's equally true of writing in relation to speech.
The comment being replied to seemed skeptical of treating musical notation as text. But any reasonable definition of "text" should include musical notation.
Otherwise it would be hard to include other types of obvious text, including completely mainstream ones such as Arabic. They are all strings of symbols intended for humans to read.
Feel free to disagree but I don't understand the argument here, if there is any. Lots of people read both Arabic and musical notation, it's a completely normal thing to do.
any reasonable definition of "text" should include musical notation
Then many a dictionary must be unreasonable [0]:
text
1. A discourse or composition on which a note or commentary is written;
the original words of an author, in distinction from a paraphrase, annotation, or commentary.
6. That part of a document (printed or electronic) comprising the words [..]
7. Any communication composed of words
n 1. the words of something written
Musical notes do not form words, and therefore are not text. (And no, definition 1 does not refer to musical notes.) The written-down form of music is called a score, not a text.
For complex music, sure, but if I'm looking up a folk tune on, say, thesession.org, I personally think a plain-text format like ABC notation is easier to sight-read (since for some instruments, namely the fiddle and mandolin, I mainly learn songs by ear and am rather slow and unpracticed at reading standard notation).
Think about the article from a different perspective: several of the most successful and widely used package managers of all time started out using Git, and they successfully transitioned to a more efficient solution when they needed to.