Here we see the beginnings of volume rendering via yt in Blender through the use of AstroBlend. This video starts with the data preloaded and shows how the data can be volume rendered in real time, including changing the color scale and the location and orientation of the user's view.

The first thing to note is that all of this is done in memory: the data is read with yt and deposited into a uniformly spaced 3D cube inside Blender, so no external voxel file is necessary. However, this required doing some tricky-icky-probably-sinful things, including overwriting Blender's VoxelData structure with a Python-generated ctypes structure, and messing with the texture panel so that texture previews don't show, since generating a preview would crash Blender now that the VoxelData structure has been overwritten. Oops. I'll hopefully be chatting with the Blender developers about the possibility of passing a pointer (or something similar) to the VoxelData structure in the future, so stay tuned...
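For the curious, here is a minimal sketch of the yt side of this, assuming the publicly available IsolatedGalaxy sample dataset (the filename and the choice of level are just placeholders); the Blender/ctypes overwriting itself is too hacky and version-specific to show here:

```python
import numpy as np
import yt

# Hypothetical dataset; any yt-supported dataset would work.
ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")

# Deposit the data onto a uniformly spaced 3D grid entirely in memory,
# so no external voxel file needs to be written to disk.
level = 2
dims = ds.domain_dimensions * ds.refine_by**level
cube = ds.covering_grid(level=level,
                        left_edge=ds.domain_left_edge,
                        dims=dims)

# Pull out a single field as a plain NumPy array (float32 keeps memory down);
# this is the array that gets handed off to Blender's voxel texture.
density = np.ascontiguousarray(cube["gas", "density"].d, dtype=np.float32)
```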

Another thing to note: because this is a uniformly spaced 3D cube, we couldn't reach the highest resolution of our data without running out of memory. Support for AMR cubes is still in the works as well, so that's another reason to check back for updates!
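To give a rough sense of why the memory wall shows up so quickly, here is a back-of-the-envelope estimate for a single float32 field stored as a uniform cube (the resolutions below are illustrative, not the actual dataset's grid sizes):

```python
import numpy as np

def cube_memory_gb(n_per_side, dtype=np.float32):
    """Memory (in GB) for one field stored as an n^3 uniform voxel cube."""
    return (n_per_side ** 3) * np.dtype(dtype).itemsize / 1024.0 ** 3

# Doubling the resolution costs 8x the memory, so high-resolution
# uniform cubes get out of hand fast.
for n in (256, 512, 1024, 2048):
    print(f"{n}^3 float32 cube: {cube_memory_gb(n):.2f} GB")
```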