(L) [2014/10/03] [Post by shiqiu1105] [An error in pbrt? About MLT start-up bias]
Hi,
I have been reading the MLT implementation in pbrt and trying to understand the math as well as I can.
One thing that confuses me is that, in the start-up phase, the initial sample X0 is sampled in the following way:
[attached image: 1.jpg]
where it says we need to weight all contributions with w.
However, when the contribution is actually added, w is ignored. Is this a mistake?
[attached image: 2.jpg]
I derived it myself, and it seems that w should equal b. Should all the contributions be multiplied by an extra factor of b?
Also, Kelemen's paper seems to use a different weighting scheme for large-step and rejected samples.
Compared with the approach in pbrt, which one is better?
Thanks,