Check out PythonCall.jl and juliacall (on the Python side). Not to mention that now you can literally write Python wrappers of Julia compiled libraries like you would C++ ones.
> you can literally write python wrappers of Julia compiled libraries like you would c++ ones
Yes, please. What do I google? Why can't julia compile down to a module easily?
No offense, but once you learn to mentally translate between whiteboard math and numpy... it's really not that hard. And if you were used to Matlab before Mathworks added a JIT, you were doing the same translation to vectorized operations, because loops are dog slow in Matlab (coincidentally, Octave is so much better than Matlab syntax-wise).
And again, Python has numba and maybe Mojo, etc., because Julia refused to fill the gap. I don't understand why there's so much friction between Julia and Python. You should be able to trivially throw a numpy array at Julia and get a result back. I don't think the Python side of this is holding things back. At least back in the day there was a very anti-Python vibe from Julia and an insistence that all the things should be re-implemented in Julia (webservers etc.) because Julia was out to prove it was more than a numerical language. I don't know if that's changed, but I doubt it. Holy wars don't build communities well.
>> you can literally write python wrappers of Julia compiled libraries like you would c++ ones.
> Yes, please. What do I google? Why can't julia compile down to a module easily?
That said, Julia's original design focused on just-in-time compilation rather than ahead-of-time compilation, so the AOT process is still rough.
> I don't understand why there's so much friction between julia and python. You should be able to trivially throw a numpy array at julia and get a result back.
Yeah, it didn’t have the explosive success that Rust had. Most probably due to a mixture of factors, like the niche/academic background and not really being a language to look at if you didn’t do numeric computing (at the beginning), and therefore not something many developers on the internet talked about.
And also some aspects of the language being a bit undercooked. But, and there’s a but, it is nonetheless growing. As you probably know having read the releases, the new big thing is the introduction of AOT compilation. And there’s even more stuff cooking now: a strict mode for harder static guarantees at compile time, language syntax evolution mechanisms (think Rust editions), cancellation, and more I can’t recall at the moment.
Julia is an incredibly ambitious project (one might say too ambitious), and it shows: the polish is still not there after all this time, but the language is also starting to flex its muscles. The speed is real, and the development cycle is something that has really spoiled me at this point.
You should give Julia a go. It can be written in a completely static style if desired (not as static as Rust of course, but compared to Python it’s miles ahead), and it can be made fast, very fast. AOT compilation with trimmed executables is coming in 1.12.
Doesn't Julia suffer from very long startup times? One of the things I use Python for is CLI programs. The startup time isn't great either, but Julia was even worse last I tested.
Julia v1.12, the unreleased version which is currently in release candidate stage (and has had a longer RC stage than expected, but should be done at least by the end of the year), has the ability to fully ahead-of-time compile and generate binaries, like you would expect from a language like C. It also trims these binaries so that they are a reasonable size given the elements of the runtime that are used. Thus, for good type-stable code, you get callable binaries without JIT overhead, and this leads to a much better CLI experience. It will take a bit of time for more package tooling and support to be built around this feature, but this is the path a lot of the ecosystem CLI tooling is now taking, and that will be a pretty dramatic shift in usability for these kinds of use cases, which Julia had traditionally ignored.
AOT will help a lot. For simple programs you can also start Julia with no optimizations, which trades runtime speed for lower startup latency.
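A sketch of that invocation (`script.jl` is a placeholder; exact behaviour varies by Julia version):

```shell
# Trade runtime speed for faster startup on a small script:
# -O0 disables optimizations, --compile=min minimizes JIT compilation.
julia -O0 --compile=min script.jl
```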
It is often the case that one’s perspective is a personal synthesis of external ideas. The act of quoting great past authors is also a way of recognizing where your influences come from, a way to describe by association how you think, or aspire to.
These are very difficult topics to talk about properly and to express with all their nuance, and many authors are quoted because they nailed a particular description, one evocative of the feeling the writer is trying to convey and that he feels he can’t do a better job of explaining himself.
Similarly to how you can narrate a story through a sequence of pictures, you can narrate an idea through a sequence of raw concepts, encapsulated in quotes.
This reminds me of a little project [1] I had fun with, trying to compute the sharpness of a position and/or evaluate whole lines. Indeed, the notion of sharpness (which I believe is different from complexity) is easy to intuit, but I’ve never found a satisfactory implementation of it.
As a player I like to think of sharpness as a measure of the potential consequences of a miscalculation. In a main line dragon, the consequence is often getting checkmated in the near future, so maximally sharp. In a quiet positional struggle, the consequence might be something as minor as the opponent getting a strong knight, or ending up with a weak pawn.
Whereas complexity is a measure of how far ahead I can reasonably expect to calculate. This is something non-players often misunderstand, which is why they like to ask me how many moves ahead I can see. It depends on the position.
And I agree, these concepts are orthogonal. Positions can be sharp, complex, both, or neither. A pawn endgame is typically very sharp; the slightest mistake can lead to the opponent queening and checkmating. But it's relatively low in complexity, because you can calculate far ahead using ideas like counting and geometric patterns (square of the pawn, zone of the pawn, distant opposition, etc.) to abstract over long lines of play. On the opposite side, something like a main line closed Ruy Lopez is very complex (every piece still on the board), but not especially sharp (closed position, both kings are safe, it's more of a struggle for slight positional edges).
Something like a King's Indian or Benoni will be both sharp and complex. Whereas an equal rook endgame is neither (it's quite hard to lose a rook endgame; there always seems to be a way to save a draw).
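That "consequences of a miscalculation" idea suggests one toy formalization: sharpness as the average evaluation drop from the best candidate move to the alternatives. This is purely illustrative; the function name and all centipawn numbers below are invented:

```python
# Toy sharpness metric: how much does the evaluation collapse if you
# pick anything other than the best move? Inputs are hypothetical
# centipawn evals of the candidate moves, from the side to move.
def sharpness(evals):
    """Mean eval drop from the best candidate move to the others."""
    best = max(evals)
    others = [e for e in evals if e != best] or [best]
    return best - sum(others) / len(others)

# Sharp position: one move holds +0.3, everything else loses outright.
sharp = sharpness([30, -500, -450, -600])
# Quiet position: several moves are roughly equal.
quiet = sharpness([30, 20, 10, 0])
print(sharp, quiet)
```

A real implementation would need depth-aware engine evals and some handling of forced lines, which is exactly where the satisfactory version gets hard.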
I did something very similar some years ago while learning Metal [1]; I recall them being called "boids". I spent days just playing with the various parameters. Luckily my implementation was not as pretty as the one offered in the OP, otherwise I would have lost weeks instead.
The original boids, or "bird-oid objects", was an algorithm and program to simulate emergent flocking behaviour from simple rules in birds. It has spawned a kind of genre, or at least a multitude of copies/derivatives, often collectively referred to as "boids".
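The three classic rules (separation, alignment, cohesion) fit in a few lines. A minimal one-step sketch in plain Python, with invented weights and neighbourhood radius:

```python
import math
import random

# Minimal boids update: each boid steers by three rules —
# separation (avoid crowding), alignment (match neighbours' velocity),
# cohesion (drift toward neighbours' centre). Weights are made up.
def step(boids, radius=50.0, w_sep=0.05, w_ali=0.05, w_coh=0.01):
    new = []
    for x, y, vx, vy in boids:
        nbrs = [b for b in boids
                if b[:2] != (x, y) and math.hypot(b[0] - x, b[1] - y) < radius]
        if nbrs:
            n = len(nbrs)
            cx = sum(b[0] for b in nbrs) / n   # centre of neighbours
            cy = sum(b[1] for b in nbrs) / n
            avx = sum(b[2] for b in nbrs) / n  # average neighbour velocity
            avy = sum(b[3] for b in nbrs) / n
            vx += w_coh * (cx - x) + w_ali * (avx - vx)
            vy += w_coh * (cy - y) + w_ali * (avy - vy)
            for bx, by, _, _ in nbrs:          # separation: push away
                vx += w_sep * (x - bx)
                vy += w_sep * (y - by)
        new.append((x + vx, y + vy, vx, vy))
    return new

flock = [(random.uniform(0, 200), random.uniform(0, 200), 1.0, 0.0)
         for _ in range(20)]
flock = step(flock)
```

Run `step` in a loop and render the positions each frame; the mesmerizing part is sweeping the three weights.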
I've been vibe coding boids implementations because my toddler likes to look at them. Here's one where the attributes of the boids (alignment, etc) are hooked up to frequency generators like a sequencer: https://neuroky.me/boidsine.html?boidCount=1700&hue=sine3&se...
I can move them to GitHub or something, but they are currently hosted in my pantry. Please be gentle :)
You should give Julia a shot.
That’s basically it. You can start with super dynamic code in a REPL and gradually hammer it into stricter and hyper-efficient code. It doesn’t have a borrow checker, but it’s expressive enough that you can write something similar as a package (see BorrowChecker.jl).
The “proportionality constant” is doing a lot of work in that claim. A lot of “constant” parameters are swept under the rug. If you fix enough of them the claim is indeed correct, although I agree it’s a bit simplistic.
This feels a bit disingenuous.
All the languages brought up as examples need some sort of handling of the `not found` case. In C++ and Go you need to check against null pointers (or don't, and then encounter segfaults); in Haskell and Rust you are forced to unwrap the value. C also has to check the error code or incur errors down the line, or worse, subtler logic errors.
Missing these kinds of checks is also a source of errors in dynamic languages: `1+None`, like `1+nothing`, will raise an error if not handled properly.
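The same failure mode on the Python side, for comparison (toy dict and keys invented): the lookup returns an optional value, and the caller must handle the miss explicitly or hit the error at runtime:

```python
# Python's analogue of the `not found` case: dict.get returns None
# (Python's `nothing`) when the key is missing.
prices = {"apple": 3, "pear": 5}

p = prices.get("banana")
if p is None:
    total = 0        # handle the miss explicitly...
else:
    total = 1 + p    # ...because 1 + None raises a TypeError
```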
If you are absolutely sure your element will be in the array, you have to encode that, for example:
x = something(findfirst(==(val), [1,2,3])) # return value is Int now, or an error.
or even
x === nothing && error("oh no")
are enough for the compiler to correctly infer that x will just be an Int, and remove the Union.
Also, the claim that the only way to check for unfavourable Union types is to visually scan each function is just plainly false. There are many tools for this sort of check; to name a few: @code_warntype, Cthulhu, DispatchDoctor.
I do agree, though, that Julia's users come mainly from academia, and it is therefore less polished on the purer software-engineering aspects and tooling.
But the disclaimer at the end feels just like the author is dismissing the whole language on false premises due to frustration with lack of support/ecosystem for his specific use case.