"Sebastian’s team needs docker for a simple reason: they build Debian packages for anything they send to production. But they don’t want to SSH into a build server, clone the git repo, build the package, then copy & upload it. This just takes too long and is annoying."
I hope I'm misunderstanding this. Instead of improving their central build infrastructure, they gave up on it and have developers produce production builds on their individual computers? That seems like a step backwards to me. The benefits of continuous integration are well known; the CI server is the logical place to produce your artifacts, whether they are Debian packages or anything else.
Here are a couple of projects demonstrating how to build deb and rpm packages and publish them to repositories from Jenkins:
http://jenkins-debian-glue.org/
https://github.com/jhrcz/jenkins-rpm-builder
Nobody said we gave up on continuous integration. Quite the contrary! We use Jenkins and other CI tools on a daily basis.
Once we have tools to build Debian packages easily in a controlled environment (containers), the CI server will be able to use the same tools to build and test the final package. The advantage is that a developer can test the whole workflow and build local test/dev packages with the same tools and environment as the CI server.
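To make that concrete, here is a minimal sketch of what such a containerized build environment could look like. The base image, tag name, and mount path are illustrative, not our actual setup:

    # Dockerfile: a disposable Debian build environment (illustrative sketch)
    FROM debian:stable
    RUN apt-get update \
        && apt-get install -y --no-install-recommends \
            build-essential devscripts debhelper equivs \
        && rm -rf /var/lib/apt/lists/*
    # The package source tree is expected to be mounted here at run time
    WORKDIR /build
    # Install the build dependencies declared in debian/control, build an
    # unsigned package (-us -uc skips signing), then move the artifacts back
    # into the mounted directory so they survive the container's exit
    CMD mk-build-deps -i -r -t "apt-get -y --no-install-recommends" debian/control \
        && dpkg-buildpackage -us -uc \
        && mv ../*.deb ../*.changes .

A developer and the CI server can then build a package from a source tree with the same two commands:

    docker build -t deb-builder .
    docker run --rm -v "$(pwd)":/build deb-builder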