First of all, my apologies for the lack of posts. Part of it was the demands of my paid work: For some time, I’ve been working on the PS2 version of Rainbow Six: Lockdown, and those of you in the know are aware that the ship date was pushed out from March to September. While most of that was just polish, doing that final 1% takes a fair amount of effort. So only recently have I felt like digging into new engineering challenges and blogging again. The other part was that I didn’t have much to write about, but I have some ideas which I’ll put up in the next few weeks.
As for the lighting improvements I discussed before, I did end up implementing an indirect lighting solution using photon maps. Since I was concerned with indoor environments, straight occlusion maps didn't provide quite the right effect: corners that you'd expect to be filled in by bounced indirect light looked a little too dark. And since the system I had was already set up to do raycasting, a GPU solution was looking a little too ambitious.
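For anyone curious what a photon map actually buys you, here's a rough sketch of the density-based radiance estimate at its core, following Jensen: gather the k nearest photons around a point and treat their combined power over the disc that encloses them as the reflected light. The types and the brute-force LocateNearest query below are placeholders I've made up for illustration, not code from my system; a real photon map would use a balanced kd-tree for that lookup, as Jensen describes.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Minimal vector type and helpers used by the sketches in this post.
struct Vec3 { float x, y, z; };
inline Vec3  operator*(const Vec3& v, float s)  { return {v.x * s, v.y * s, v.z * s}; }
inline Vec3& operator+=(Vec3& a, const Vec3& b) { a.x += b.x; a.y += b.y; a.z += b.z; return a; }
inline float Dot(const Vec3& a, const Vec3& b)  { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Photon {
    Vec3 position;
    Vec3 power;       // flux carried by the photon (RGB)
    Vec3 direction;   // direction the photon was travelling when it was stored
};

struct PhotonMap {
    std::vector<Photon> photons;

    // Brute-force k-nearest query, standing in for the balanced kd-tree lookup
    // a real photon map would use. Returns the k nearest photons and the radius
    // of the sphere that encloses them.
    std::vector<Photon> LocateNearest(const Vec3& p, int k, float* outRadius) const {
        auto dist2 = [&](const Photon& ph) {
            float dx = ph.position.x - p.x, dy = ph.position.y - p.y, dz = ph.position.z - p.z;
            return dx * dx + dy * dy + dz * dz;
        };
        std::vector<Photon> sorted = photons;
        k = std::min<int>(k, static_cast<int>(sorted.size()));
        std::partial_sort(sorted.begin(), sorted.begin() + k, sorted.end(),
                          [&](const Photon& a, const Photon& b) { return dist2(a) < dist2(b); });
        sorted.resize(k);
        *outRadius = k > 0 ? std::sqrt(dist2(sorted.back())) : 0.0f;
        return sorted;
    }
};

// Density-based radiance estimate at a diffuse surface point:
//   L_o ~= (albedo / pi) * sum(photon power) / (pi * r^2)
Vec3 EstimateRadiance(const PhotonMap& map, const Vec3& p, const Vec3& n,
                      const Vec3& albedo, int k = 50)
{
    float radius = 0.0f;
    std::vector<Photon> nearest = map.LocateNearest(p, k, &radius);
    if (nearest.empty() || radius <= 0.0f)
        return {0, 0, 0};

    Vec3 flux = {0, 0, 0};
    for (const Photon& ph : nearest) {
        // Only count photons that arrived at the front side of the surface.
        if (Dot(ph.direction, n) < 0.0f)
            flux += ph.power;
    }

    const float kPi = 3.14159265f;
    float scale = 1.0f / (kPi * kPi * radius * radius);  // (1/pi BRDF) * (1/(pi r^2) area)
    return { albedo.x * flux.x * scale,
             albedo.y * flux.y * scale,
             albedo.z * flux.z * scale };
}
```

That enclosing radius is exactly where the blotchiness comes from: wherever the photon density is low, the estimate gets smeared over a large disc, which is the artifact final gathering is meant to smooth over.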
To try to overcome the blotchiness of the photon map, I used a technique called final gathering. Rather than using the photon map samples directly, we cast multiple rays from each lightmap sample point, and wherever they hit the environment we grab a radiance sample from the photon map. The end result was better, but still somewhat blotchy. Due to the work demands mentioned above, I haven't had a lot of time to dedicate to improving that, but I think there may be some bugs in the sampling code somewhere.
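Here's roughly what that gather loop looks like, reusing Vec3, PhotonMap, and EstimateRadiance from the sketch above. Scene::Trace is an assumed interface and the rest is made-up scaffolding rather than my actual code, but it shows the idea: shoot a batch of cosine-weighted rays from the lightmap sample point and evaluate the photon map wherever they land.

```cpp
#include <cmath>
#include <cstdlib>

// A couple of extra vector helpers on top of the earlier sketch.
inline Vec3 Cross(const Vec3& a, const Vec3& b) {
    return { a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x };
}
inline Vec3 Normalize(const Vec3& v) { return v * (1.0f / std::sqrt(Dot(v, v))); }

struct Hit { Vec3 position; Vec3 normal; Vec3 albedo; bool valid; };

struct Scene {
    // Assumed interface: returns the closest intersection along the ray, if any.
    Hit Trace(const Vec3& origin, const Vec3& dir) const;
};

inline float RandomFloat() { return (float)(std::rand() / (RAND_MAX + 1.0)); }  // uniform [0, 1)

// Cosine-weighted direction on the hemisphere around n (pdf = cos(theta) / pi).
Vec3 CosineSampleHemisphere(const Vec3& n)
{
    const float kPi = 3.14159265f;
    Vec3 up = std::fabs(n.z) < 0.999f ? Vec3{0, 0, 1} : Vec3{1, 0, 0};
    Vec3 t  = Normalize(Cross(up, n));
    Vec3 b  = Cross(n, t);
    float u1 = RandomFloat(), u2 = RandomFloat();
    float r = std::sqrt(u1), phi = 2.0f * kPi * u2;
    float x = r * std::cos(phi), y = r * std::sin(phi), z = std::sqrt(1.0f - u1);
    return { t.x * x + b.x * y + n.x * z,
             t.y * x + b.y * y + n.y * z,
             t.z * x + b.z * y + n.z * z };
}

// Indirect light at one lightmap sample point via final gathering. With
// cosine-weighted sampling, the cos(theta)/pi terms cancel against the pdf,
// so the estimator is just the albedo times the average gathered radiance.
Vec3 FinalGather(const Scene& scene, const PhotonMap& map,
                 const Vec3& p, const Vec3& n, const Vec3& albedo, int numRays)
{
    Vec3 sum = {0, 0, 0};
    for (int i = 0; i < numRays; ++i) {
        Vec3 dir = CosineSampleHemisphere(n);
        Hit hit = scene.Trace(p, dir);
        if (!hit.valid)
            continue;  // the ray escaped the environment; nothing to gather
        sum += EstimateRadiance(map, hit.position, hit.normal, hit.albedo);
    }
    float inv = 1.0f / float(numRays);
    return { albedo.x * sum.x * inv, albedo.y * sum.y * inv, albedo.z * sum.z * inv };
}
```

Typical gather counts run from a few dozen to a few hundred rays per lightmap texel, which is why the total raycast count blows up so quickly.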
Final gathering does increase the number of raycasts dramatically, so to speed that up I went with the suggestions of the two papers mentioned here and changed the original voxel-based system to a kd-tree. That provided about a 10-fold speed-up and helped the overall lighting calculations tremendously. I also found that a good deal of the sloth was coming from virtual memory thrashing, so I reduced the memory used by the data until the application fit in physical memory. This required compressing and decompressing some data, which would normally cost extra time, but because page swaps were minimized it actually ran significantly faster.
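For the curious, the traversal itself is nothing exotic. A recursive kd-tree walk along these lines (again just a sketch reusing Vec3 from above, with made-up node and triangle types rather than my production code) visits the child containing the ray origin first and only descends into the far child if the ray crosses the split plane without finding a hit.

```cpp
#include <vector>

struct Triangle { /* vertex data omitted */ };

// Assumed helper: ray/triangle test, returns the hit distance through *t on success.
bool IntersectTriangle(const Triangle& tri, const Vec3& o, const Vec3& d, float* t);

inline float Axis(const Vec3& v, int a) { return a == 0 ? v.x : (a == 1 ? v.y : v.z); }

struct KdNode {
    int   axis;                          // 0/1/2 for x/y/z, -1 marks a leaf
    float split;                         // split-plane position along 'axis'
    const KdNode* child[2];              // [0] = below the plane, [1] = above
    std::vector<const Triangle*> tris;   // only populated in leaves
};

// Find the nearest hit along the ray segment [tMin, tMax], front to back.
bool TraverseKdTree(const KdNode* node, const Vec3& o, const Vec3& d,
                    float tMin, float tMax, float* tHit)
{
    if (node == nullptr || tMin > tMax)
        return false;

    if (node->axis < 0) {                // leaf: test the triangles it holds
        bool  found = false;
        float best  = tMax;
        for (const Triangle* tri : node->tris) {
            float t;
            if (IntersectTriangle(*tri, o, d, &t) && t >= tMin && t < best) {
                best  = t;
                found = true;
            }
        }
        if (found) *tHit = best;
        return found;
    }

    // Interior node: visit the child containing the ray origin first.
    float origin = Axis(o, node->axis);
    float dir    = Axis(d, node->axis);
    bool  belowFirst = origin < node->split ||
                       (origin == node->split && dir <= 0.0f);
    int   nearIdx = belowFirst ? 0 : 1;
    int   farIdx  = 1 - nearIdx;

    if (dir == 0.0f)                     // parallel to the plane: one side only
        return TraverseKdTree(node->child[nearIdx], o, d, tMin, tMax, tHit);

    float tSplit = (node->split - origin) / dir;
    if (tSplit > tMax || tSplit <= 0.0f) // the far side is out of reach or behind us
        return TraverseKdTree(node->child[nearIdx], o, d, tMin, tMax, tHit);
    if (tSplit < tMin)                   // the segment lies entirely on the far side
        return TraverseKdTree(node->child[farIdx], o, d, tMin, tMax, tHit);

    // The segment crosses the plane: near child first, far child only if needed.
    if (TraverseKdTree(node->child[nearIdx], o, d, tMin, tSplit, tHit))
        return true;
    return TraverseKdTree(node->child[farIdx], o, d, tSplit, tMax, tHit);
}
```

A big part of the win over a uniform grid is that front-to-back ordering combined with adaptive subdivision: rays can skip large empty regions instead of stepping through them cell by cell.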
I need to plug two books that really helped me throughout this process. The first is Henrik Wann Jensen's book, Realistic Image Synthesis Using Photon Mapping, which got me up and running with a basic photon mapping system. The other is the best practical guide I've found for building a physically-based lighting system: Physically Based Rendering: From Theory to Implementation by Matt Pharr and Greg Humphreys. It's the first reference I've found that truly makes Monte Carlo sampling theory clear to me, and it does a great job of covering the mathematics and physics of lighting along with how to implement them in code. Say, reminds me of another book that I'm fond of…
That’s about it for now. In the next few weeks I’ll be covering some more in-depth book errata, a brief discussion of some math and science shows on TV, and a possible podcast idea that’s rattling around in my head. Until later…