
Is it a common practice to commit dependencies into the project repo? If so, what types of projects do that?


It’s not unusual to do it in C or C++, just because the dependency management story sucks so badly (there are a lot more options than there used to be, but still no one de facto standard like pip or cargo).


There's this concept of a shared library; you should check it out.


How does compiling your dependency differently help with dependency management?


That's the neat part: you don't.

You do not compile your dependencies.


Even if they don't distribute binaries for your target?


So ... no dependency management is the best way to do dependency management? Just download the .so files from somewhere?

Sounds awesome!


It is well known that committing dependencies is a bad thing. See sibling posts. It's worth considering what ignoring that best practice gets you.

For example, you've been handed a bug. The customer is important and is running a version of your code from three years ago. You have source control, so you check out that revision and try to build it. What might you run into?

1/ It uses Docker and the image isn't online any more. I've had this one.

2/ One library dependency you used to use has been deleted from the internet. Also had this.

3/ Another dependency is still available, but it uses a dependency which isn't. Not yet.

4/ You managed to gather all the code, and it refuses to compile with a modern toolchain.

5/ As above, but this time the modern toolchain produces a different program than it did last time.

6/ Another dep has dubious ideas about semver, and the current copy doesn't behave like the old one.

7/ Actually, anything using semver is considered deeply suspicious in itself.

That's off the top of my head. I think there's probably a long list of variants on "the source tree isn't sufficient information to recreate old versions". The reliance on old compiler bugs feels particularly realistic to me, but then we C++ people mostly check in our dependencies. I've definitely checked out npm projects from a few months earlier and discovered they don't run any more.
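Even just recording a cryptographic hash for every dependency — well short of checking the dependencies themselves in — turns modes 2, 3 and 6 into loud errors instead of silent drift. A minimal sketch in Python, with a third_party/ layout and manifest format I've made up for illustration (one "sha256  filename" pair per line, like sha256sum output):

    #!/usr/bin/env python3
    # Check dependency archives against the hashes recorded at release time.
    # The manifest path and format are hypothetical, not any standard tool's.
    import hashlib
    import sys
    from pathlib import Path

    MANIFEST = Path("third_party/manifest.sha256")  # assumed location

    def sha256(path):
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    ok = True
    for line in MANIFEST.read_text().splitlines():
        if not line.strip():
            continue
        expected, name = line.split(None, 1)
        archive = MANIFEST.parent / name.strip()
        if not archive.exists():
            print("MISSING ", archive)   # mode 2/3: dependency gone
            ok = False
        elif sha256(archive) != expected:
            print("CHANGED ", archive)   # mode 6: same version, different bits
            ok = False
    sys.exit(0 if ok else 1)

Of course that only tells you the build is broken; it doesn't give you the bits back.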

Compare that to the silly, paranoid, I've-checked-in-gcc-and-linux alternative. You check out code from N revisions ago and it all builds and runs, exactly like it used to, provided you can find hardware that looks adequately like the old hardware. I've heard rumours of warehouses of new-in-box Sun workstations waiting for their time to replace the current ones too.

On balance, I reckon the industry best practice of grabbing whatever code some server gives back for an associated version number is nonsense, and that obsessively committing the entire dev and run state into source control is the right thing. But I'm clearly in a minority.


The type of projects that you want to be able to run in 20 years.


I think you meant run away from them, and immediately.


It depends. You may want to protect against the dependency disappearing from a public repository, or being changed by a malicious actor, or your internal repo may simply be faster to clone and build from, or... I'm just saying there are very valid reasons to vendor a dependency. There are also drawbacks: some folks vendor and then make small modifications. That's forking; good luck keeping it up to date. You also have more work to do to vendor new versions, but that's easily automated (a sketch follows).
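For the "easily automated" part, here's roughly what a re-vendor script can look like in Python, assuming the dependency publishes release tarballs at a stable URL; the name, URL, and hash below are all placeholders, not a real project:

    #!/usr/bin/env python3
    # Re-vendor a dependency at a pinned version: fetch the release
    # tarball, verify it against a known hash, unpack into third_party/.
    import hashlib
    import io
    import shutil
    import tarfile
    import urllib.request
    from pathlib import Path

    NAME = "somelib"    # hypothetical dependency
    VERSION = "1.4.2"
    URL = "https://example.com/releases/%s-%s.tar.gz" % (NAME, VERSION)
    EXPECTED_SHA256 = "replace-with-the-published-hash"
    DEST = Path("third_party") / NAME

    data = urllib.request.urlopen(URL).read()
    digest = hashlib.sha256(data).hexdigest()
    if digest != EXPECTED_SHA256:
        raise SystemExit("hash mismatch: got " + digest)
    if DEST.exists():
        shutil.rmtree(DEST)   # so files deleted upstream don't linger
    DEST.mkdir(parents=True)
    with tarfile.open(fileobj=io.BytesIO(data), mode="r:gz") as tar:
        tar.extractall(DEST)
    print("vendored %s %s into %s; now git add and commit" % (NAME, VERSION, DEST))

Commit the unpacked tree and the pinned hash together, and git history becomes the record of exactly which bits every release shipped with.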


I think projects in C would be good contenders here.


Common? Perhaps, in less-than-ideal or legacy situations. Best practice? Definitely not.

A lot of website projects I have worked on in the past included Composer or Node dependencies in the repository. It really slows down every git operation on the repo.


This was in a big tech codebase. Everything is imported into the monorepo so it can be consumed by the Almighty Buck build system.

Python, C, C++... it all goes in!
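If anyone's curious what that looks like: a vendored library usually just gets its own build target sitting next to the imported source. A hedged sketch of a BUCK file (Buck rules use Python-style syntax; the library, paths, and target name here are invented):

    # third_party/zlib/BUCK -- hypothetical target for a vendored C library
    cxx_library(
        name = "zlib",
        srcs = glob(["*.c"]),
        exported_headers = glob(["*.h"]),
        visibility = ["PUBLIC"],
    )

First-party code then depends on //third_party/zlib:zlib like any other target, and the build never reaches outside the repo.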



