
"Sebastian’s team needs docker for a simple reason: they build Debian packages for anything they send to production. But they don’t want to SSH into a build server, clone the git repo, build the package, then copy & upload it. This just takes too long and is annoying."

I hope I'm misunderstanding this. Instead of improving their central build infrastructure, they gave up on it and have the developers produce builds for production on their individual computers? That seems like a step backwards to me. The benefits of continuous integration are well-known; a CI server is the logical place to produce your artifacts, whether they're Debian packages or anything else.

Here are a couple of projects demonstrating how to build deb and rpm packages and publish them to repositories from Jenkins (a rough sketch of how the first one is used follows the links).

http://jenkins-debian-glue.org/

https://github.com/jhrcz/jenkins-rpm-builder
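With jenkins-debian-glue, a build is typically split into two Jenkins jobs whose shell build steps boil down to something like the following (the target distribution is a job parameter; the value here is hypothetical, see the project's docs for the full setup):

    # "source" job: turn the current git checkout
    # into a Debian source package snapshot
    generate-git-snapshot

    # "binaries" job: build the package in a clean cowbuilder
    # chroot and provide it in a local repository
    export distribution=wheezy   # hypothetical target distribution
    build-and-provide-package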



Nobody said we gave up on continuous integration. Quite the contrary! We use Jenkins and other CI tools on a daily basis.

Once we have tools to build Debian packages easily in a controlled environment (containers), the CI server will be able to use the same tools to build and test the final package. The advantage here is that a developer can test the whole workflow and build local test/dev packages with the same tools and environment as the CI server.
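For illustration, a minimal sketch of such a containerized build, assuming the repository root contains the debian/ packaging directory and a stock Debian image is sufficient (the image name, paths, and dependency set are assumptions, not their actual tooling):

    # build in a fresh container every time; the repo is mounted at /work
    docker run --rm -v "$PWD":/work debian:stable bash -c '
      apt-get update &&
      apt-get install -y build-essential devscripts equivs &&
      # build from a copy so the mounted checkout stays clean
      cp -a /work /tmp/build && cd /tmp/build &&
      # install the build dependencies declared in debian/control
      mk-build-deps -i -r -t "apt-get -y --no-install-recommends" debian/control &&
      # build an unsigned binary package (the .deb lands in /tmp)
      dpkg-buildpackage -us -uc &&
      # copy the resulting packages back to the host
      cp ../*.deb /work/
    '

Since the exact same command runs on a laptop and inside a Jenkins job, the developer and the CI server genuinely share one build environment.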


I'm glad to hear I was reading too much into the summary. So you plan to use Docker as a replacement for tools like pbuilder?


Yes, packages are built in a fresh container every time. It has a lot of functionality in common with pbuilder.

Of course, building packages is just one of many use cases. You also want to test the installation of the new package, run integration tests...
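An installation test can follow the same pattern: spin up another pristine container and install the freshly built .deb into it (the package file name and the smoke-test command below are hypothetical):

    # verify the new package installs cleanly on a pristine system
    docker run --rm -v "$PWD":/pkgs debian:stable bash -c '
      apt-get update &&
      # dpkg -i installs the local .deb; apt-get -f pulls in missing deps
      dpkg -i /pkgs/mytool_1.0-1_amd64.deb || apt-get install -fy &&
      # hypothetical smoke test of the installed tool
      mytool --version
    '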



