Assignment 5 Project Idea Proposal
Author: Sohil Shah (sohils)
For this project, I would like to implement a bidirectional path tracing algorithm by extending the assignment 3 code. The main goal is to render correct, unbiased images with lower variance for the same wall-clock time. This should also help greatly with effects such as caustics, since the estimator converges faster.
Plan
To accomplish this project, these are the things that need to be done (a rough sketch of how the steps fit together follows the list):
1. The pathtracer will be modified to shoot rays from both the light sources and the camera.
2. The rays from light sources and the camera will then be independently traced.
3. Endpoints from each of the two types of ray paths will be connected to compute the contribution of radiance along each combined path.
4. Finally, each path will be weighted according to the probability of choosing that path connection.
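To make the plan concrete, here is a minimal sketch of how one per-pixel sample could tie these four steps together. Every name in it (Vertex, Spectrum, trace_camera_subpath, trace_light_subpath, connect, mis_weight, add_to_pixel) is a hypothetical placeholder for illustration, not the actual assignment 3 API.

#include <vector>

struct Spectrum { float r = 0, g = 0, b = 0; };       // placeholder radiance value

struct Vertex {
    Spectrum beta;          // path throughput accumulated up to this vertex
    double pdf_fwd = 0;     // probability (area measure) of having sampled it
    bool is_delta = false;  // true for mirror/glass vertices
    // position, normal, BSDF, etc. omitted
};

// Steps 1-2: trace the two subpaths independently (stubbed out here).
std::vector<Vertex> trace_camera_subpath(int px, int py, int max_depth) { return {}; }
std::vector<Vertex> trace_light_subpath(int max_depth) { return {}; }

// Step 3: contribution of joining a camera vertex to a light vertex
// (BSDF terms, geometry term, and a shadow ray for visibility go here).
Spectrum connect(const Vertex& cam_v, const Vertex& light_v) { return {}; }

// Step 4: weight for this connection strategy against all other strategies
// that could have produced the same full path.
double mis_weight(const std::vector<Vertex>& cam, int s,
                  const std::vector<Vertex>& light, int t) { return 1.0; }

void add_to_pixel(int px, int py, const Spectrum& c, double w) { /* splat into film */ }

// One bidirectional sample for pixel (px, py), following steps 1-4 above.
void bdpt_sample(int px, int py, int max_depth) {
    std::vector<Vertex> cam   = trace_camera_subpath(px, py, max_depth);
    std::vector<Vertex> light = trace_light_subpath(max_depth);

    // Connect every camera vertex to every light vertex, weighting each
    // connection by the probability of choosing that strategy.
    for (int s = 0; s < (int)cam.size(); ++s) {
        for (int t = 0; t < (int)light.size(); ++t) {
            Spectrum c = connect(cam[s], light[t]);
            double w   = mis_weight(cam, s, light, t);
            add_to_pixel(px, py, c, w);
        }
    }
}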
Ideal Result
A correct implementation of this project will have the following features:
1. The project must converge to the same correct and unbiased images produced by the assignment 3 pathtracer.
2. Bidirectional pathtracing must converge faster than the original pathtracer.
3. Bidirectional pathtracing must converge in fewer samples per pixel than the original pathtracer.
Images such as those shown below illustrate how bidirectional path tracing should look compared to regular pathtracing. The goal of this project is to match these results.
Source: https://adventuresinrendering.wordpress.com/
Potential Problems
The problems anticipated here mostly lie in calculating the probability of choosing a path. However, there is enough research already available on the subject that this should not pose an issue. If you have any comments on parts of this project that might pose a problem or could be particularly difficult, please contact me.
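For reference, the kind of weighting described in step 4 of the plan is usually done with multiple importance sampling, as in Veach's thesis (linked in the references). A full path can be produced by any strategy that takes s vertices from the light subpath and t from the camera subpath; with the balance heuristic, the weight for the (s, t) strategy is

\[
  w_{s,t}(\bar{x}) \;=\; \frac{p_{s,t}(\bar{x})}{\sum_{i+j \,=\, s+t} p_{i,j}(\bar{x})},
\]

where p_{s,t}(\bar{x}) is the probability density of sampling the path \bar{x} with that strategy. Because the weights of all strategies that can produce the same path sum to one, combining them this way keeps the estimator unbiased.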
Actual Results and Implementation
The result of this project is a fairly successful bidirectional pathtracer. This pathtracer seems to do well rendering more interesting caustic effects and results in lower variance for many scenes. Its shortcomings are higher render time for identical sample count (which is expected) and high variance when a scene has a lot of delta materials. This is likely due to some mistakes in calculating pdfs when the path contains deltas. More work would have to go into calculating these distributions to reduce variance when a scene has lots of mirrors or glass. However, the implementation does give a lot of nice results.
Build instructions
Building is the same as for the assignment 3 code. Just go to the directory, and:
mkdir build ; cd build ; cmake .. ; make
The command line arguments are also the same for the bidirectional pathtracer as for the assignment 3 code. For quick and decent results, I recommend something like:
./pathtracer -t4 -s16 -l1 -m16 ../dae/sky/CBgems.dae
Result images
Bidirectional: 2048 samples
Standard Pathtracing: 4096 samples
Here we can see less noise with fewer samples. The bidirectional render also finished more quickly than the standard pathtracing version (about 1.3x faster). Even with fewer samples, the bidirectional render can be seen to have lower variance in areas that are hard to sample, such as this shadow.
Bidirectional: Much better caustic sampling
One thing we notice with the bidirectional renderer is that caustics are much more prominent. We can see effects that were not visible with standard pathtracing.
Bidirectional Caustics
Standard Pathtracing Caustics
This scene looks much better lit with bidirectional rendering.
Bidirectional Caustics
Standard Pathtracing Caustics
Notice the caustic from the mirror dragon on the floor of the box. This was not visible before.
Known Implementation Problems
Though I would say this project was a success, there are a few known issues with it.
First, the integrator may be biased, as the images produced often differ noticeably in lighting from those of the standard pathtracer: certain areas are lit more brightly and others are darker. It is hard to prove whether this raytracer is biased or not.
Second, although it converges in fewer samples than the standard pathtracer, each sample takes longer to render. In testing, reaching a given variance took the standard pathtracer about 2-3x more samples, but the bidirectional pathtracer renders a given number of samples about 1.5x-2x slower, so the net wall-clock speedup is only around 1x-2x and usually not very significant.
Third, the bidirectional pathtracer produces high-variance output images when a lot of delta materials are used. This is the biggest issue I see so far.
Based on these issues, I believe the best use for my implementation of the bidirectional pathtracer would be rendering caustics, perhaps combined with some other method that handles delta materials better. I think the issue is mainly with how the probabilities were calculated when delta materials were in the path.
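As a pointer for future work: I believe the usual convention (following Veach) is that a delta vertex can never be an endpoint of an explicit connection, since a perfect mirror or glass BSDF evaluates to zero for every direction except its single reflected/refracted one, and its sampling density is a delta distribution rather than a finite pdf. Those vertices therefore need to be skipped during connections and excluded from the pdf sums in the weights. A hypothetical guard in the connection step might look like this (names are placeholders, not my actual code):

// Hypothetical connection guard: a strategy whose endpoint is a delta
// (mirror/glass) vertex is skipped, because the BSDF is zero in any explicit
// connection direction and its sampling density is not a finite pdf.
struct Spectrum { float r = 0, g = 0, b = 0; };
struct Vertex { bool is_delta = false; /* position, BSDF, throughput, ... */ };

Spectrum connect(const Vertex& cam_v, const Vertex& light_v) {
    if (cam_v.is_delta || light_v.is_delta)
        return Spectrum{};   // zero contribution; other strategies cover this path
    // ... evaluate the two BSDFs, the geometry term, and trace a shadow ray ...
    return Spectrum{};
}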
References
A large portion of my research came from this chapter of Eric Veach's thesis: https://graphics.stanford.edu/courses/cs348b-03/papers/veach-chapter10.pdf
Also used:
- http://www.cescg.org/CESCG98/PDornbach/
- https://adventuresinrendering.wordpress.com/2014/01/21/bidirectional-path-tracing-and-multiple-importance-sampling-part-1/
- https://adventuresinrendering.wordpress.com/2014/01/22/bidirectional-path-tracing-and-multiple-importance-sampling-part-2/
- http://celarek.at/2014/06/global-illumination-rendering-using-path-tracing-and-bidirectional-path-tracing/