The Sunday graduate student Python workshop was a great success! Huge thanks to Nathan Goldbaum and Andrew Pontzen for graciously providing the Jupyter notebooks they used in their talks. You can download the slides and demo material from here.
The 2016 Great Lakes Cosmology Conference will begin on Sunday, June 19, with a graduate student workshop on scientific Python and two popular analysis toolkits: pynbody and yt. Pynbody is a simulation analysis tool for working (primarily) with particle-based datasets, such as the outputs of SPH or N-body simulations. yt is a community-developed analysis and visualization toolkit for volumetric data. yt has been applied mostly to astrophysical simulation data, but it can handle many other kinds of data, including seismology, radio telescope data, weather simulations, and nuclear engineering simulations.
The workshop will begin with an introduction to Python and the Jupyter notebook, followed by two in-depth presentations on yt and pynbody. To follow along with the tutorials, it would be helpful if you brought a laptop with a few tools pre-installed. This post is a brief set of instructions on how I'd recommend you set things up. If you are a super class-A hacker and can do all of this in your sleep, feel free to skip these instructions.
When building a cosmological zoom-in simulation, the strategy is to run a low-resolution, dark-matter-only simulation, select halos of interest, trace the particles that form each halo back to the initial conditions, and then build a new set of ICs in which the regions that form the halo are given higher resolution. In order to minimize the number of high-resolution elements needed in the zoom-in IC, the refined volume should hug the particles of interest as tightly as possible. Unfortunately, because the cosmic web is composed of sheets and filaments, the regions in an IC that need refinement can have complex shapes (they often look like prawn crackers). This means that simple shapes enclosing the region (cubes, ellipsoids, and convex hulls are often used) will frequently contain many times the volume actually traced by the particles, producing much larger (and more computationally expensive) simulations than necessary. THERE HAS TO BE ANOTHER WAY!
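The tracing step above can be sketched in a few lines of NumPy. This is only an illustration with synthetic placeholder arrays (the positions, IDs, halo center, and radius are all made up; a real pipeline would read them from the snapshot and IC files), but it shows the basic ID-matching logic and how a simple bounding cube can over-select:

```python
import numpy as np

# Hypothetical stand-ins for two simulation outputs: particle IDs and
# positions in the final snapshot, and matching positions in the ICs.
rng = np.random.default_rng(0)
n_total = 10000
ids_final = np.arange(n_total)
pos_final = rng.uniform(0.0, 100.0, size=(n_total, 3))  # e.g. Mpc/h
pos_ic = rng.uniform(0.0, 100.0, size=(n_total, 3))

# 1. Select the particles that form the halo of interest
#    (here, simply everything inside a sphere).
halo_center = np.array([50.0, 50.0, 50.0])
halo_radius = 10.0
in_halo = np.linalg.norm(pos_final - halo_center, axis=1) < halo_radius
halo_ids = ids_final[in_halo]

# 2. Trace those IDs back to their Lagrangian positions in the ICs.
ic_positions = pos_ic[np.isin(ids_final, halo_ids)]

# 3. A simple axis-aligned bounding cube around the traced region:
#    its volume can vastly exceed the volume the particles occupy.
lo, hi = ic_positions.min(axis=0), ic_positions.max(axis=0)
cube_volume = np.prod(hi - lo)
print(f"{halo_ids.size} particles traced, bounding box volume {cube_volume:.1f}")
```

In this toy setup the traced particles land at scattered (random) IC positions, so the bounding box ends up spanning most of the box even though the particles themselves fill only a tiny fraction of it, which is the same mismatch that drives up the cost of real zoom-in ICs.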
In a previous post, I described how
Setting up the docker instance
Data volumes for file sharing
We will need somewhere to place our simulation outputs, configuration files, and ultimately, the output of sunrise, and we will need to let our docker container read and write to this directory. All you need to do is use the -v host_directory:container_directory switch when starting your container. This will mount host_directory in the docker container at the container_directory mountpoint. In other words, we start our docker image with something like this command:
docker run -t -i -v ~/sunrise_data:/sunrise_data bwkeller/sunrise:latest /bin/bash
Sunrise is a popular "Monte-Carlo Radiation Transfer code for calculating absorption and scattering of light in astrophysical situations" (it's hosted on bitbucket here). Unfortunately, it is notoriously finicky to install, as it relies on specific point-release versions of nearly a dozen different libraries. This is exactly the sort of problem that Docker is designed to solve, at scale: distributing packages along with all of the library infrastructure they require in one self-contained image. If you are skeptical about docker's performance, check out this paper IBM Research has published: in nearly every metric, docker performance is within 5% of native bare-metal performance.
I'm going to build a docker image with a working install of sunrise that should save me and my fellow grad students days of wrestling with angry, old C++ libraries. Details below the fold.
What a pain in the butt this machine has been. For a machine that comes with Linux pre-installed, this thing sure doesn't play nicely with it. Out of the box, it comes with Ubuntu 14.04 installed, and mine came with the A03 BIOS. When I first booted, the installation tool crashed halfway through, leaving me with a semi-working system. I had to re-flash the machine back to the factory settings, and then try again (luckily it didn't crash the second round through).
Even after getting Ubuntu set up, there is a horrendous bug in the touchpad driver that makes the cursor jerk wildly around every few minutes. I tried to solve this issue by upgrading the OS to 14.10, but that had the unfortunate effect of nuking the networking and leaving the machine in an unbootable state! YAY! Finally, doing a fresh install of 15.04 actually got everything working. With this, the thing is actually pretty sweet. The HW is amazing, from the screen to the battery to the keyboard. After a day of frustration, I'm pretty satisfied. I'll be posting updates to my experiences here.