Neuronvisio is a piece of software I wrote quite a long time ago to visualize computational models of neurons in 3D. Back when I was actively developing it, several services and tools did not yet exist:
- conda was not available (a kick-ass package manager that can also handle binary dependencies)
- Read the Docs was not available (docs auto-rebuilt on each commit)
- GitHub Pages didn’t have nice themes (it existed, but you had to do all the heavy lifting; I was hosting the docs there, updating them manually)
To have a smooth way to release and use the software, I was using Paver as a management library for the package. It served very well until it broke, leaving Neuronvisio no longer installable via pip. So I promised myself that, as soon as I had a little time, I would restructure things and make Neuronvisio installable via conda, automatically pulling in all the dependencies needed for a working environment out of the box. Because that would be nice.
Read the Docs and GitHub Pages
This one was relatively easy. The Neuronvisio docs have always been built with Sphinx, so hosting them on Read the Docs was going to be trivial. The idea was simply to point neuronvisio.org to neuronvisio.rtfd.org and call it done.
Not so fast!
So began a classic bout of yak shaving, which you can read about here, or watch in the gif below:
Yak shaving: recursively solving problems, with the classic case where your last problem is miles away from where you started.
It turns out that an apex domain cannot point to a subdomain (foo.com cannot resolve to zap.blah.com), because the DNS protocol does not allow a CNAME at the apex and the internet would burn (or email would get lost, which is pretty much the same problem). You can only point a subdomain (zip.foo.com) to another subdomain (zap.blah.com).
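As a sketch, the constraint looks like this in a zone file (host names and the IP address are illustrative, not my actual DNS setup):

```
; --- foo.com zone (illustrative) ---

; NOT allowed: a CNAME at the apex would conflict with the SOA/NS
; (and MX) records that must also live at foo.com.
; foo.com.       IN CNAME  zap.blah.com.

; Allowed: a CNAME on a subdomain.
www.foo.com.     IN CNAME  zap.blah.com.

; The apex itself must use A/AAAA records
; (or a provider-specific ALIAS/ANAME workaround).
foo.com.         IN A      192.0.2.10
```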
Therefore my original idea, to use the Sphinx-generated website as the entry point, was not a possibility. I could still point neuronvisio.org to whatever I was hosting on the gh-pages branch of the neuronvisio repo. It couldn’t be the docs, because I wanted those updated automatically, so I had to design some kind of presentation website for the software. As I said, GitHub Pages now sports some cool themes, so I picked one and reused some bits from the intro page.
At the end of this, I had a Read the Docs hook rebuilding the docs on the fly at each commit, with no manual intervention required, and a presentation website written in Markdown on the GitHub Pages infrastructure, everything hosted and responsive with the proper domain in place. Note that I hadn’t even started on the package yet. Yeah \o/.
Creating the package
To create the conda package for Neuronvisio I had to write a meta.yaml and a build.sh. This was pretty easy given that Neuronvisio is a Python package and already uses setup.py. The docs are good, and with some googling and a fair amount of trial and error, I got the package done in no (well, not too much) time.
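As a rough sketch of what such a recipe looks like (the source URL, version, and dependency list here are illustrative assumptions, not the actual Neuronvisio recipe):

```yaml
# meta.yaml -- minimal conda recipe sketch for a Python package
package:
  name: neuronvisio
  version: "0.9.1"

source:
  # hypothetical source location; the real recipe may pull a tarball instead
  git_url: https://github.com/mattions/neuronvisio.git
  git_tag: "0.9.1"

requirements:
  build:
    - python
    - setuptools
  run:
    - python
    # illustrative runtime dependencies
    - matplotlib
    - mayavi

about:
  home: http://neuronvisio.org
  license: GPL
```

For a plain setup.py-based package, the accompanying build.sh can be as simple as `$PYTHON setup.py install`.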
Solving the dependencies
Neuronvisio has a lot of dependencies, but most of them were already packaged for conda. The only big ones missing were the NEURON package and the InterViews library. So I created a PR on the community-maintained conda-recipes repository. As you can see from the PR and the commit, this was not easy at all; it was super complicated.
It turned out to be impossible to make a proper NEURON package that works entirely out of the box. What we’ve got so far is Python and InterViews support out of the box, but not hoc support. This is because NEURON bakes the prefix of its hoc files in at compilation time; with the relocation conda performs when the package is installed, that prefix tends to differ, and it’s not easy to patch.
Anyway, there is a workaround: export the $NEURONHOME environment variable and you are good to go.
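Something along these lines (the path below is an assumption; it depends on where conda put the environment and where the NEURON files land inside it):

```shell
# Point NEURON at the directory holding its hoc library files.
# The exact path is illustrative -- adjust it to your conda environment.
export NEURONHOME="$HOME/miniconda/envs/neuron/share/nrn"
```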
After all this, a shiny new release of Neuronvisio (0.9.1) is available. Its goal is to make installation a bit easier and get all the dependencies ready to go with one command.
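That one command looks something like this (the channel name is a hypothetical placeholder; substitute whichever channel actually hosts the package):

```shell
# Install Neuronvisio and all its dependencies into the current conda env.
# "-c some-channel" is an assumption -- use the real channel name.
conda install -c some-channel neuronvisio
```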