Django Best Practices — Updated for 1.4 (lincolnloop.com)
80 points by superchink on Sept 13, 2012 | hide | past | favorite | 19 comments


I was excited for this guide, but unfortunately it was very short and light on details.

I'm a novice Django programmer, and was surprised that I gained very little knowledge by looking at this document. I wish I could report happier news.

I should be keeping a list of Django annoyances and questions, because then I could propose a list of topics for Django best practices that I'd like to see answered. But I can't remember them off the top of my head.


If you think of anything, please file an issue at https://github.com/lincolnloop/django-best-practices/issues. It's written for a novice-to-intermediate audience, which can sometimes be hard to do when you've been working with it daily for years.


I have a few issues with this:

https://github.com/lincolnloop/django-layout

This structure leaves manage.py in the root path, which means other dirs will pollute this area: run/, deploy/, log/, config/, lib/, bin/, cronjobs/, etc. Your dirs would stack up into a long list.

I would suggest something like this:

    root/
      src/
        manage.py
      lib/
      run/
      docs/
      config/


Yeah, a lot of this is personal preference and obviously there are a million ways to do it. Our perspective is that manage.py is a script (you never import it as a Python module), so we bump it out and install it to your path using setup.py.
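For example, a minimal setup.py along those lines might look like this (the project name, version, and use of `scripts` are illustrative, not taken from django-layout):

```python
# Hypothetical setup.py; "myproject" and the version are placeholders.
from setuptools import setup, find_packages

setup(
    name="myproject",
    version="0.1.0",
    packages=find_packages(),
    # install manage.py as an executable on the user's PATH
    scripts=["manage.py"],
)
```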


Yeah, this is a Django thing. Still, you can put your Django project one directory deep in your project tree and do as you said.


Thanks! Good starting point. I'd like a section on using Django on cloud services, e.g. AWS Elastic Beanstalk. Another best practice I'd want is scheduling scripts with https://docs.djangoproject.com/en/dev/howto/custom-managemen... and cron, or using Celery. There is much confusion on those topics, I think.

Edited some spelling...


I'm not giving best practices as I'm myself fairly new to Django; this is just a couple of packages to take a look at.

I've been deploying two projects to AWS very recently and we've been using django-pipeline and django-storages (with s3 boto storage) for asset management. ./manage.py collectstatic and all your static files are up on S3. With a bit of finagling around[1] you can even have user uploads hit there seamlessly as well.

[1]: http://stackoverflow.com/questions/10390244/how-to-set-up-a-...

EDIT: Pipeline isn't necessary for storing static files on S3, but if you want to compile SASS/LESS/cs files, or really any transforms, it works really well.
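For reference, the django-storages side of that setup boils down to a few settings; this is a sketch with placeholder names (check the django-storages docs for the exact setting names your version expects):

```python
# Hypothetical settings.py fragment for django-storages with the
# s3boto backend; bucket name and credentials are placeholders.
STATICFILES_STORAGE = "storages.backends.s3boto.S3BotoStorage"
DEFAULT_FILE_STORAGE = "storages.backends.s3boto.S3BotoStorage"
AWS_STORAGE_BUCKET_NAME = "my-asset-bucket"
AWS_ACCESS_KEY_ID = "placeholder"      # read from the environment in practice
AWS_SECRET_ACCESS_KEY = "placeholder"
```

With both storages pointed at S3, `./manage.py collectstatic` pushes static files to the bucket and user uploads land there too.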


Thank you!


What's confusing you? (I'm trying to help, not being snarky)

If you want to invoke some Django-based logic at regular intervals without having to install Celery (and monitoring and a decent queue) you'll opt for a management command. The link you posted should help you out here. Invoking the script yourself or telling cron to invoke it for you shouldn't be hard if you know about cron.
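A skeleton for such a command might look like this (assumes Django is installed; the app path, command name, and logic are made up for illustration):

```python
# myapp/management/commands/do_maintenance.py -- hypothetical names.
# Invoke with ./manage.py do_maintenance, or from cron, e.g.:
#   0 * * * * /path/to/venv/bin/python /path/to/manage.py do_maintenance
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Example periodic task run via cron instead of Celery."

    def handle(self, *args, **options):
        # your Django-based logic goes here; models, settings, etc.
        # are available exactly as in any other Django code
        self.stdout.write("maintenance done")
```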

With regards to Celery: I think the tutorial and docs are pretty clear on how to use it and how to set it up.


Thanks for your comment =) Nothing is confusing me there, but try searching for "django script cron" and you'll see people suggesting setting up URLs to start the script, having an external script that imports settings, and many other complex things. The craziest thing I've seen (but probably useful for some cases) is having the regular requests from Google bots invoke scripts... That's why I think it's a good thing to suggest management commands when they're so easy and integrate nicely with your apps. Celery, however, is good for more complex use cases. I was merely making a suggestion for best practices.


I imagine most of those icky suggestions are coming from people who used to run php on hosts with a lot of restrictions. You're right, this should definitely be a part of a best practices guide.


Thanks for the feedback! I filed a couple of issues to address this:

* https://github.com/lincolnloop/django-best-practices/issues/...

* https://github.com/lincolnloop/django-best-practices/issues/...

If you think of any more, feel free to file your own.


Thanks! Filed my scheduling-scripts comment as an issue. Looking forward to seeing this develop. Hope I can contribute.


This recommends checking in your (instance-specific) settings to version control. I know some people really push for the opposite (http://www.12factor.net/config, canonical example is Heroku).

Myself, I'm swinging between the two still, haven't found a convincing argument either way.
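For contrast, the 12factor approach amounts to reading instance-specific values from the environment in settings.py; the variable names and defaults below are made up for this sketch:

```python
import os

# 12-factor style: instance-specific config comes from the environment,
# not from version control. Names and defaults here are illustrative.
SECRET_KEY = os.environ.get("DJANGO_SECRET_KEY", "dev-only-insecure-key")
DEBUG = os.environ.get("DJANGO_DEBUG", "false").lower() == "true"
DATABASE_URL = os.environ.get("DATABASE_URL", "sqlite:///db.sqlite3")
```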


We have canonical settings in our settings.py, while every environment imports local settings from another conf. If it is necessary to have subsidiary settings also source controlled, we could do that and have the environment symlink its preference. Of course, additionally, the secondary settings could have the same dance as the primary settings and allow for another level of local configuration.

  # settings.py fragment: allow overriding settings with $HOME/.sodahead
  import imp
  import os

  rc = os.path.join(os.environ["HOME"], ".sodahead")
  if os.path.isfile(rc):
    with open(rc, "r") as fi:
      dotsodahead = imp.load_source("dotsodahead", rc, fi)
    # copy any non-dunder names from the override module into settings
    overrides = dict((k, v) for k, v in vars(dotsodahead).items()
                     if not k.startswith("__"))
    globals().update(overrides)


I agree that apps/modules/libraries should have code separate from configuration. I see no problem with letting configuration be part of a _project_, however. Environment variables are too implicit in a lot of cases, and generally you end up having code checked into some other repository that makes sure the environment variables are set properly. I prefer to skip the middle man and have clearly demarcated sections of configuration directly in my projects.


I'm in the same boat. I whole-heartedly agree that production secret keys, passwords, etc. shouldn't be checked into version control as plain-text.

Even if you use environment variables, you need them stored somewhere (a secrets.py file, a .env file, an upstart script) so they get onto the machine at startup. At the moment we're kicking around GPG-encrypting the data and only decrypting it on the machines that need it.


<shameless-plug> I've prepared something similar and posted it to HN some time ago: http://news.ycombinator.com/item?id=4488787

Feel free to cross-pollinate! </shameless-plug>


This is close to being a direct channel to my brain. Perfect!



