I love Julia-the-language, but the ecosystem is... weird. I'm not at all a fan of how some packages try to install non-Julia stuff for you behind the scenes by default as mentioned in TFA. It's a surprising default and it makes a mess of your system that is hard to clean up.
GNU R and Common Lisp (especially Quicklisp) already made these design mistakes, and let's please not repeat them. The initial friendliness of "do everything interactively right in the REPL" quickly gives way to "arggghhh".
Edit: on second thought, I guess there is a fine line to be drawn here. We are already accustomed to R, Python, Node, Ruby, etc. building C stuff at package installation time. I don't have a great sense of where to draw the line, but somehow I feel the Julia convention crosses it.
> it makes a mess of your system that is hard to clean up
This was true in the past, but the Julia ecosystem has rapidly moved away from installing anything via system package managers or messing with the system state in hard to understand ways.
Instead, everything goes into the .julia directory under the control of the Julia package manager, and the dependencies for any given project can be reproduced on another machine or OS by copying the Project and Manifest files. The same goes for binary dependencies (e.g., the result of compiling C/Fortran/Rust/Go code), for which there's some amazing cross-compilation tooling and build infrastructure: see https://binarybuilder.org/
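To make that concrete, here's a minimal sketch of what gets copied (the package name is real, but the UUID shown is a placeholder, not the actual one): a project's direct dependencies live in a plain Project.toml next to the code, and the package manager generates a Manifest.toml recording the fully resolved dependency graph.

```toml
# Project.toml — direct dependencies; every package is identified by a UUID
# (placeholder UUID below, for illustration only)
[deps]
JSON = "00000000-0000-0000-0000-000000000000"

[compat]
JSON = "0.21"
```

Copy both files to another machine and run `julia --project=. -e 'using Pkg; Pkg.instantiate()'`, and the package manager fetches exactly those versions into ~/.julia, nothing system-wide.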
At this point, I think Julia projects enjoy really first class portability and reproducibility.
Yeah I agree. Comparing Julia to the mess that is e.g. Python environments, this seems loads better. And I think it’s a reasonable state for something where your deps might not be in the language. Maybe specifying things in a more nix-like way would be better, but I think it’s pretty good as is and doesn’t pollute your system with random installations.
Care to elaborate? I am fairly familiar with the “story” in terms of packaging binary dependencies for Julia, but I am struggling to put my finger on the problem you describe.
BinDeps [1], the first stab at this, was very much akin to what you describe, in that it would attempt to build or install binary dependencies in ways that could affect the state of your system beyond Julia itself. While favoured by operating system package managers, it put an enormous burden on package creators, since you needed to be aware of the state (also across versions) and inner workings of each operating system, its package managers, and which dependencies they pull in.
This led to the current approach, BinaryBuilder [2], where binary dependencies are described, cross-compiled, and then distributed and managed in a read-only store by the latest iteration of the Julia package manager. While I admit that this comes with drawbacks, such as security updates to dependencies falling upon the Julia package maintainers, it more than makes up for it in terms of usability and reproducibility for the end user.
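A hedged sketch of how this looks from the user side, assuming the real Zlib_jll wrapper package that BinaryBuilder produces (the `libz_path` name follows the usual jll convention):

```julia
# Pkg.add("Zlib_jll") downloads a prebuilt, cross-compiled zlib artifact
# into ~/.julia/artifacts — nothing is compiled locally, and nothing is
# written to system paths.
using Zlib_jll

# The wrapper exposes the shared library's location; note that the path
# points inside ~/.julia, not /usr/lib or similar.
println(Zlib_jll.libz_path)
```

Because the artifact is content-addressed and read-only, two packages that need different zlib versions simply get two artifacts side by side.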
Indeed, the use of BinaryBuilder was an important step forward. Before, we had the problem that different packages would require different versions of e.g. zlib (https://github.com/JuliaGraphics/Gtk.jl/pull/387). BinaryBuilder made package installation much more robust.
> We are already accustomed to R, Python, Node, Ruby, etc. building C stuff at package installation time. I don't have a great sense of where to draw the line, but somehow I feel the Julia convention crosses it.
I'm definitely not accustomed to this, and absolutely hate this behavior. Installing or compiling a dependent package via one of these package managers is frequently a good way to build it badly: compiler flags not respected, system paths ignored, and so on. It's also hard to customize the build or force the use of a shared system library that has been built for the purpose.
Ask the package maintainers of any Linux distro how annoying it is to handle unbundling in these scenarios.
Last time I checked, installing some of the tidyverse packages from CRAN resulted in other packages being directly cloned from github and installed in the process. That sort of behavior really shouldn’t be allowed.
R packages on CRAN need to be able to install without internet connection so I'm skeptical that that's the case. CRAN packages also don't allow for dependency on R packages that are (only) on github.
I’ll try to look up details, but I’m fairly confident. I discovered it because I was installing packages on a server which was firewalled, and had the CRAN repository proxied.
I had this initial impression with PyCall from Julia, and would point it to my own Conda env. But when I ran my code on another machine, I quickly discovered how annoying it was to think I had installed all the dependencies, only to find the Python ones missing. The auto-install addresses this use case (for distribution/sharing).
I don't understand the problem: when Julia installs binary dependencies, those are installed locally (relative to Julia) in .julia, not on the system itself.