So, I wanted to hack a bit on an old, barely finished project today. Since I don't think I ever ran it under Linux, the first thing I did after grabbing the source from GitHub was to run the test suite, which exploded because I hadn't installed the dependencies. Well, duh.
I've gotten into the habit of using virtual environments for most projects, so I created one and installed the missing libs (and learned, after cursing at myself for a short while, why keeping an up-to-date requirements.txt file is a good idea). I ran the tests again, and... still failures. The dependencies still couldn't be found. Importing them in the shell worked just fine, but nosetests just couldn't find them.
I figured the nosetests command was running at the system level and ignoring the virtualenv. I verified this by running the test suite without it, using
python -m unittest discover, which passed without a single failure.
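A quick way to check this kind of thing is to ask the interpreter itself whether it belongs to a virtualenv, by comparing its prefixes. A small sketch (the function name is mine, not from any library):

```python
import sys

def in_virtualenv():
    """Return True if the running interpreter belongs to a virtualenv.

    Classic virtualenv sets sys.real_prefix; the stdlib venv module
    (Python 3.3+) makes sys.prefix differ from sys.base_prefix.
    """
    return hasattr(sys, "real_prefix") or (
        getattr(sys, "base_prefix", sys.prefix) != sys.prefix
    )

print(sys.executable)   # which interpreter is actually running
print(in_virtualenv())  # False when it's the system-wide one
```

Dropping something like this into a test file and running it under nosetests versus python -m unittest makes it obvious when the runner is a system-wide script pulling in the system interpreter.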
The first solution I came up with was a simple
pip install nose. Since pip is included by default in a new virtualenv, I figured it would install another instance of nose in the right environment, which would then run within it and therefore pick up that env's installed packages.
Well, no sugar. I'm guessing that even when run from within an active virtualenv, pip still checks for packages installed system-wide; in any case, it detected an up-to-date version of nose and stopped there without installing a new instance.
It turns out the solution is indeed to install nose inside the current virtualenv, but you have to use the -I flag to force the reinstallation (which goes into the virtualenv, since it's run with the env's own pip instance). So
pip install -I nose (with the target environment activated, of course) does the trick.
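To double-check where an import actually resolves from after the reinstall, you can ask importlib for the module's location. A small sketch, using nose as the example package (the helper name is mine):

```python
import importlib.util

def module_path(name):
    """Return the file a module would be loaded from, or None if absent."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# After `pip install -I nose` inside an activated env, this should point
# at the virtualenv's site-packages rather than the system-wide install.
print(module_path("nose"))
```

If the printed path still lives outside the env's site-packages, the reinstall didn't land where you thought it did.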
I've added the command to my
postmkvirtualenv hook script, because I'm pretty sure I'll forget all about it before I run into the problem again, and I'm posting this here as a reminder should it happen again on a new system.
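For reference, virtualenvwrapper runs that hook right after mkvirtualenv finishes, with the fresh environment already activated, so the addition is a one-liner. A minimal sketch of the hook file (the path assumes the default WORKON_HOME layout):

```shell
#!/bin/bash
# $WORKON_HOME/postmkvirtualenv -- executed after `mkvirtualenv`,
# while the newly created environment is active.

# Force-reinstall nose into the new virtualenv so the `nosetests`
# command resolves to this env instead of the system-wide install.
pip install -I nose
```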