
The way we handle that for our Python deploys is to have a separate "deploy" git repo which includes complete .tar.gz files of all of our dependencies, then have our pip requirements.txt file point to those file paths rather than using external HTTP URLs.
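Roughly, the requirements file ends up looking something like this (package names and paths are just illustrative, not our actual setup):

    # requirements.txt -- every dependency resolves to a tarball vendored
    # in the deploy repo, so nothing is fetched from PyPI at install time
    ./vendor/Django-1.4.1.tar.gz
    ./vendor/psycopg2-2.4.5.tar.gz
    ./vendor/requests-0.13.1.tar.gz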

To avoid packages sneakily trying to download their own dependencies from the internet, we run pip install with a "--proxy http://localhost:9999" argument (where nothing is actually running on that port) so that we'll see an instant failure if something tries to pull a dependency over the network.
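Something along these lines (the port is arbitrary, just pick one with nothing listening on it):

    # Point pip at a proxy that isn't running; any attempt to reach the
    # network fails immediately instead of silently downloading packages.
    pip install --proxy http://localhost:9999 -r requirements.txt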



We do something very similar, but like you said, there's the occasional sneaky devil trying to download its own dependencies. Nine times out of ten it seems like it's some version of distribute that they insist on fetching.

The non-existent proxy trick seems useful; I'll have to try that out.


We run our own internal PyPI server. Makes it super easy to install our own packages via pip, as well as ensure that dependencies are nice and clean.
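For anyone who hasn't tried it, pointing pip at an internal index is just a matter of swapping the index URL (the hostname here is made up):

    # Install from the internal index instead of the public PyPI
    pip install --index-url http://pypi.internal.example.com/simple/ -r requirements.txt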



