Re: Stochastic Progressive Photon Mapping


(L) [2009/12/17] [SirSmackalot] [Re: Stochastic Progressive Photon Mapping] Wayback!

>> patlefort wrote:I am getting 35 MP/s on my ASUS Radeon 4870 non overclocked. Unrolling the loop makes no difference for me.
Which version of driver are you using? I really want to see that running here.
(L) [2009/12/17] [patlefort] [Re: Stochastic Progressive Photon Mapping] Wayback!

I'm using ATI drivers 9.9 which I downloaded from [LINK http://game.amd.com/us-en/drivers_catalyst.aspx].
This is on Windows XP 32 bits.
(L) [2009/12/17] [thachisu] [Re: Stochastic Progressive Photon Mapping] Wayback!

>> patlefort wrote:I am getting 35 MP/s on my ASUS Radeon 4870 non overclocked. Unrolling the loop makes no difference for me.
Interesting. The unrolling difference is then perhaps due to a missing code optimization in NVIDIA's compiler.
(L) [2009/12/17] [thachisu] [Re: Stochastic Progressive Photon Mapping] Wayback!

>> SirSmackalot wrote:System: Win 7, x64, E6600, 4 GB.
Maybe it has something to do with Win 7?
(L) [2009/12/23] [SirSmackalot] [Re: Stochastic Progressive Photon Mapping] Wayback!

Well, just installed the brand new driver (9.12 with hotfix) from ATI and it works [SMILEY :)].
I don't know what was wrong with the old one, but it's gone.
Win 7, 64bit, Core2Duo E6600@3,3Ghz, 4890. Program running at 41 Million Photons/s.
(L) [2010/02/15] [Hassoon] [Re: Stochastic Progressive Photon Mapping] Wayback!

Hello,
First of all, great work!
For those of you who are still running into glut32.dll errors on 64-bit Windows: the program looks for the DLL under C:\Windows\System32.
Simply copy the DLL from there into C:\Windows\SysWOW64, and you should be fine.
(L) [2010/03/03] [David Olsson] [Re: Stochastic Progressive Photon Mapping] Wayback!

I have a question about how you should handle big scenes, like a landscape. Consider the following situation: half of the rays for a pixel/region hit a close object, say a tree, and the rest hit a distant mountain. How do you choose a good starting radius?
Also assume the density of photons is proportional to some inverse exponential of the distance to the camera. Then the close hits will shrink the radius and basically starve the distant hits of any photons.
Is there a solution to this, perhaps making the radius proportional to the distance the ray has traveled?
Btw, has anyone a good solution for the photon hash map with big scenes, perhaps using a non-linear transform before reading and storing photons in the hashmap?
(L) [2010/03/03] [graphicsMan69] [Re: Stochastic Progressive Photon Mapping] Wayback!

I haven't implemented this, but maybe a factor based on the world-space footprint of a pixel?  I'm not sure if you could directly use that, since if you hit a leaf and the mountain in the same pixel, you'd have a huge footprint.  You could probably use image space projected derivatives to estimate how big the "footprint" would be at the leaf and at the mountain, and you could average those?
I'd be interested to hear how this is handled in practice as well.
(L) [2010/03/04] [thachisu] [Re: Stochastic Progressive Photon Mapping] Wayback!

To be honest, I think SPPM is not a good option for rendering outdoor scenes. It would be nice if we could render various scenes with a single algorithm, though :->
For the initial radius estimation, you might be able to use ray differentials ([LINK http://graphics.stanford.edu/papers/trd/]) to get an approximate size of the pixel footprint in the scene. It is not a perfect solution, as graphicsMan69 pointed out, and I do not have a good solution for that right now. I guess you might just want to use a tree data structure for big scenes anyway, not the hashed grid.
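A crude sketch of that pixel-footprint idea (all names hypothetical; this ignores surface orientation and is only a stand-in for proper ray differentials):

```python
import math

def initial_radius(hit_distance, fov_y_rad, image_height, scale=2.0):
    """Approximate the world-space footprint of one pixel at the first
    hit point and use a multiple of it as the SPPM starting radius.
    Assumes a pinhole camera and ignores the surface orientation."""
    # Height of the image plane at unit distance from the eye.
    plane_height = 2.0 * math.tan(fov_y_rad / 2.0)
    # Side length of one pixel's footprint at the hit distance.
    pixel_size = hit_distance * plane_height / image_height
    return scale * pixel_size

# A close hit (the leaf) gets a small radius, a distant hit (the
# mountain) a proportionally larger one.
r_near = initial_radius(hit_distance=1.0,    fov_y_rad=math.radians(45), image_height=512)
r_far  = initial_radius(hit_distance=1000.0, fov_y_rad=math.radians(45), image_height=512)
```

This makes the radius proportional to hit distance, as David suggested, but inherits the footprint problem discussed above when one pixel covers both a leaf and a mountain.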
(L) [2010/03/04] [jbarcz1] [Re: Stochastic Progressive Photon Mapping] Wayback!

>> graphicsMan69 wrote:I haven't implemented this, but maybe a factor based on the world-space footprint of a pixel?  I'm not sure if you could directly use that, since if you hit a leaf and the mountain in the same pixel, you'd have a huge footprint.  You could probably use image space projected derivatives to estimate how big the "footprint" would be at the leaf and at the mountain, and you could average those?
I'd be interested to hear how this is handled in practice as well.
I've been thinking about this problem too.  
Using ray differentials would still yield a big footprint for the mountain and a small one on the leaf.  You might get artifacts as a result, since pixels that hit both will gather photons over a much wider area than the ones that hit the leaf only.  For those sub-samples that hit the leaf, you could gather photons from all across the leaf's surface, and perhaps over adjacent leaves.  The pixel next door, on the other hand, hits the leaf exclusively, and so gathers from a much narrower area.  The discontinuity might be sharp enough to be visible.  If part of the leaf is in sunlight and part of it isn't, you might see a green halo.
What might solve it is to create multiple PPM sites per pixel, using some sort of clustering based on footprint radius.  So, for the super-sampled pixels, you'd gather one set of photons for the mountain and one for the leaf, and track two different sets of statistics (weighted by relative pixel coverage).  You'd have to fire a bunch of rays in advance in order to set it up.   
There's still the inherent sampling problem.  Unless you're very clever about the photon distribution, you'll have a hard time getting adequate photon density for a closeup of a leaf in an outdoor scene.  In theory, you can always jack up the photon count until you get enough of them, but since you're dividing the flux by the number of photons you emit, I'd be worried about losing too much precision.  The more samples you fire, the fewer of them you'll actually be able to use, because your radii have shrunk.  So unless you get pretty close to the right answer early on, you end up dying from precision loss before you ever get there.
(L) [2010/03/04] [ingenious] [Re: Stochastic Progressive Photon Mapping] Wayback!

Why do you care so much about the initial radii? Pick whatever you want, initial size is not that crucial. The big problem is getting photons inside [SMILEY :)]
..although getting the photons there is of course related to the initial radius, it's more a question like do you want something totally wrong or nothing at all [SMILEY :)] Plus, I don't think progressive photon mapping is the right way to go if you have large open scenes. In fact, my experience tells me it's very good only for caustics.
(L) [2010/03/05] [David Olsson] [Re: Stochastic Progressive Photon Mapping] Wayback!

I will try a distance-based radius and a non-linear transform of the hashmap and see if it works. I'll let you know.
 >> ingenious wrote:Why do you care so much about the initial radii? Pick whatever you want, initial size is not that crucial. The big problem is getting photons inside
..although getting the photons there is of course related to the initial radius, it's more a question like do you want something totally wrong or nothing at all  Plus, I don't think progressive photon mapping is the right way to go if you have large open scenes. In fact, my experience tells me it's very good only for caustics.
What would you recommend for large open scenes?
Btw has anyone tried final gather with sppm?
(L) [2010/03/05] [ingenious] [Re: Stochastic Progressive Photon Mapping] Wayback!

>> David Olsson wrote:What would you recommend for large open scenes?
Btw has anyone tried final gather with sppm?
Again, it depends on the type of scene and illumination, but if you have a city or some landscape with an environment map, the effect of indirect illumination and its complexity are rather small. The direct illumination from the environment map is more important to solve efficiently. Thus, simple path tracing with next event estimation will suffice, giving you the most important effects: direct illumination, ambient occlusion, color bleeding. Plus, it's simple to implement and fast. One efficiency tweak for such scenarios would be, for example, to use more light source samples for the primary rays, and fewer and fewer for the next bounces.
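A minimal sketch of the next-event-estimation step mentioned above, for the simplest possible case of a single point light on a Lambertian surface (function and parameter names are made up; visibility is assumed):

```python
import math

def direct_light_diffuse(albedo_rgb, n_dot_l, dist2, light_intensity_rgb):
    """Next event estimation for ONE point light on a Lambertian surface.
    Outgoing radiance = (albedo / pi) * I * cos(theta) / r^2.
    Visibility (shadow ray) is assumed; this is illustrative only."""
    if n_dot_l <= 0.0:
        return (0.0, 0.0, 0.0)          # light is behind the surface
    g = n_dot_l / dist2                 # geometry term for a point light
    return tuple(a / math.pi * i * g
                 for a, i in zip(albedo_rgb, light_intensity_rgb))
```

In a full path tracer this estimate is added at every bounce; the per-bounce light sample count schedule ingenious suggests just means calling this more times at the primary hit than at deeper vertices.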
(L) [2010/03/05] [thachisu] [Re: Stochastic Progressive Photon Mapping] Wayback!

>> ingenious wrote:Plus, I don't think progressive photon mapping is the right way to go if you have large open scenes. In fact, my experience tells me it's very good only for caustics.
Let me clarify one thing  [SMILEY :)]  I totally agree that progressive photon mapping is most likely not the fastest way to render large open scenes. I am not against you. At the same time, I personally think that it is probably not so difficult to make progressive photon mapping perform ok-ish on such scenes as well, say, by using a Metropolis (not MLT) method for photon tracing. In the end, what I am aiming for is a single method that works reasonably fast and produces correct solutions of the rendering equation in any non-pathological setting (i.e., one that is robust), and I shamelessly believe progressive photon mapping is one step toward this goal.
In fact, to achieve this goal we do not need to use *a single method*, so it is probably interesting to investigate how we can optimally combine multiple rendering techniques, similar to MIS. If PPM is good for caustics, why not just separate out the caustics, render them with PPM, and render the other components with whatever? :->
(L) [2010/03/05] [patlefort] [Re: Stochastic Progressive Photon Mapping] Wayback!

>> thachisu wrote:In fact, to achieve this goal we do not need to use *a single method*, so it is probably interesting to investigate how we can optimally combine multiple rendering techniques, similar to MIS. If PPM is good for caustics, why not just separate out the caustics, render them with PPM, and render the other components with whatever? :->

This is what I think too: one should be able to choose how to sample certain areas, materials, or lights with different methods, like sampling the lighting on terrain and buildings with path tracing and sampling caustic-capable objects with photon mapping.
(L) [2010/03/06] [reit] [Re: Stochastic Progressive Photon Mapping] Wayback!

Hi there, and thanks for your great work Toshiya.
I am trying to implement your method using CUDA, but I am struggling a bit with the hashmap... So maybe I will just go for a
uniform grid. But anyway, I am officially a fanboy [SMILEY :-)]
(L) [2010/03/06] [ingenious] [Re: Stochastic Progressive Photon Mapping] Wayback!

>> thachisu wrote:Let me clarify one thing   ...
...what I am aiming for is a single method that works reasonably fast and produces correct solutions of the rendering equation in any non-pathological settings (i.e., robust), and I shamelessly believe progressive photon mapping is one step toward this goal.
In fact, to achieve this goal we do not need to use *a single method*, so it is probably interesting to investigate how we can optimally combine multiple rendering techniques, similar to MIS. If PPM is good for caustics, why not just separate out the caustics, render them with PPM, and render the other components with whatever? :->
Don't get me wrong - I love (S)PPM [SMILEY :)] I meant that in my experience it's very good only for caustics as it is. That's because photon mapping is well suited for high frequency illumination, where there's very little coherence between neighboring samples, and because PPM converges to the right solution using a limited amount of memory. It's IMO the best and easiest to implement method for correct caustic simulation. With the side note that you really have to use double precision for the radius  [SMILEY :D]
Running a few tests with your cute smallppm implementation with an increasing number of photons, it's very interesting to observe how caustics converge much faster than the diffuse illumination, quite the opposite of the result you'd get with path tracing, for instance [SMILEY :)] And of course it makes perfect sense to combine/extend PPM with other techniques.
Lastly, for me PPM has probably been the most inspiring GI paper recently. And it's simply because what people say is so darn true: the best ideas are simple. And that's what a researcher seeks in papers - inspiration, not ready solutions. Remember the real-time GPU GI paper at SIGGRAPH 2009? Respect for all the effort those guys put into that whole system, but I did not see any strong novel contributions in that paper.
Edit: Actually the reason PPM is very good for caustics and not good for diffuse illumination is because for caustics you cannot do much better than "wait" for specularly bounced particles to hit the areas of interest and cache them, while for lower frequency illumination a gathering approach is more efficient in general. For the same reason actually in standard photon mapping you perform final gathering.
(L) [2010/03/18] [Guest] [Re: Stochastic Progressive Photon Mapping] Wayback!

4M Photons/s on an 8600M GT! Great (but it would be greater if the scene were interactively navigable [SMILEY :)] )
(L) [2010/03/18] [stew] [Re: Stochastic Progressive Photon Mapping] Wayback!

>> ingenious wrote:Edit: Actually the reason PPM is very good for caustics and not good for diffuse illumination is because for caustics you cannot do much better than "wait" for specularly bounced particles to hit the areas of interest and cache them, while for lower frequency illumination a gathering approach is more efficient in general. For the same reason actually in standard photon mapping you perform final gathering.
A good next step would be progressive irradiance caching for final gathering. One could start with a large maxerror and a low number of gather rays for each sample location, then progressively add gather rays to the samples and reduce the maxerror. Combined with SPPM, I think one could end up with a neat interactive solution for both caustics and diffuse IDL.
(L) [2010/04/15] [Zelcious] [Re: Stochastic Progressive Photon Mapping] Wayback!

70MP/s on a NVIDIA Fermi GTX 480. Only had 15MP/s until I turned off vertical sync. A bit worried there [SMILEY :)]
(L) [2010/05/10] [Guest] [Re: Stochastic Progressive Photon Mapping] Wayback!

I was reading on another forum ([LINK http://forum.cgpersia.com/f27/japanese-gpu-renderer-11560/]) about a very fast GPU renderer developed by Toshiya Hachisuka, which is in fact a few years old, but it delivers superb quality in just a few seconds on ancient 8-year-old hardware (ATI 9700 Pro)! A demo can be downloaded at [LINK http://www.bee-www.com/parthenon/].
Even more interesting is a translated quote from Mr Hachisuka himself (who is now a research fellow at Nvidia: [LINK http://www.nvidia.com/object/fellowship_Toshiya.html]) about a new GPU renderer that he is currently developing:
(edited out quote)
It's a pity that the Parthenon GPU renderer is no longer being developed; I would have loved to see it make full use of today's hardware. Secondly, I'm not sure why the GPU rendering algorithms he refers to are inferior to CPU-based algos. I presume it means that GPU renderers currently use brute-force path tracing, which is very inefficient compared to the bidir + MLT used by CPU renderers. Judging from the description on Nvidia's page, I think this new GPU renderer will implement these efficient algorithms in combination with SPPM to make it robust and generally applicable.
(L) [2010/05/10] [toxie] [Re: Stochastic Progressive Photon Mapping] Wayback!

parthenon had to use weird tricks and workarounds to get any performance at all on GPUs (i.e. abusing the rasterizer).. so i doubt that any of this is still useful for today's GPU generations..
(L) [2010/05/10] [thachisu] [Re: Stochastic Progressive Photon Mapping] Wayback!

Guest:
First of all, it is generally not good manners to quote a private e-mail conversation without any confirmation in advance :->
I therefore do not vouch for the correctness, accuracy or validity of any of your quoted and translated comments. If possible,
please delete it from your post. I am not mad, but please be careful next time (I know who you are)!
Second of all, I agree with toxie. Parthenon does use tricks to get it working reasonably well on a wide
range of hardware, including very old hardware. Recent GPU renderers are a better fit if you have a high-end GPU from a recent generation.
(L) [2010/05/10] [Guest] [Re: Stochastic Progressive Photon Mapping] Wayback!

Sorry Toshiya, maybe a mod can delete that part from my post
(L) [2010/05/27] [Diyer2002] [Re: Stochastic Progressive Photon Mapping] Wayback!

Hi guys.
I've got a problem when implementing SPPM. As the paper mentions, SPPM assigns a shared radius and flux to each pixel and updates them using the hit points generated by each eye-trace pass. But in a scene containing materials with both diffuse and specular BRDFs, tracing a ray from a pixel can generate multiple hit points, so how should I do the update then? The demo seems to use pure materials and doesn't handle this situation.
Sorry for my poor English.
(L) [2010/06/04] [kotletas] [Re: Stochastic Progressive Photon Mapping] Wayback!

[SMILEY [-o<]
(L) [2010/06/05] [ingenious] [Re: Stochastic Progressive Photon Mapping] Wayback!

>> Diyer2002 wrote:I've got a problem when implementing SPPM. As the paper mentions, SPPM assigns a shared radius and flux to each pixel and updates them using the hit points generated by each eye-trace pass. But in a scene containing materials with both diffuse and specular BRDFs, tracing a ray from a pixel can generate multiple hit points, so how should I do the update then? The demo seems to use pure materials and doesn't handle this situation.
You should trace the camera path as in path tracing and generate only one hit point. That is, when your primary ray hits a surface with a multi-BRDF material, you just randomly decide (ideally importance-based) which BRDF to sample.
More formally, you handle length-2 camera paths (with 3 vertices) using standard stochastic sampling, photon tracing handles light paths of arbitrary length, and PPM makes the connection in between. Thus, the BRDF sampling at the middle vertex of the camera path actually has nothing to do with PPM.
(L) [2010/06/05] [thachisu] [Re: Stochastic Progressive Photon Mapping] Wayback!

Just in case someone else wonders the same thing, here is my answer (I already sent it to Diyer2002 through PM, since I got the same question by PM).
---------
The easiest solution is to just pick one of the materials based on Russian Roulette. The demo is using this technique for handling specular reflection and specular refraction of a dielectric material. It is also possible to handle multiple hit points per pixel by having a list of hit points per pixel. We then just perform range queries at those points and collect all the photons before updating photon statistics per pixel.
EDIT:  coincidence! [SMILEY :o]
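The Russian Roulette material selection described above might be sketched like this (names are illustrative; dividing by the selection probability keeps the estimator unbiased):

```python
import random

def sample_brdf_layer(diffuse_albedo, specular_albedo, rng=random):
    """Pick ONE BRDF layer of a mixed material by Russian Roulette,
    weighting by (scalar) reflectance, and divide by the selection
    probability so the estimator stays unbiased. Illustrative names."""
    pd = diffuse_albedo / (diffuse_albedo + specular_albedo)
    if rng.random() < pd:
        # Chosen with probability pd -> divide the layer's albedo by pd.
        return "diffuse", diffuse_albedo / pd
    else:
        return "specular", specular_albedo / (1.0 - pd)
```

Note that with this particular weighting the returned weight is the same (the total reflectance) whichever layer is picked, which is exactly why the estimator stays unbiased.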
(L) [2010/06/05] [ingenious] [Re: Stochastic Progressive Photon Mapping] Wayback!

>> thachisu wrote:EDIT:  coincidence!
Are you also subscribed to the RSS feed?  [SMILEY :D]
I am actually interested in what happens if your sampled (gather) ray hits a light source [SMILEY :)] I guess you should account for such contributions separately... An even more interesting situation is when you have a distant non-delta light source, such as a directional light source that models the sun (i.e. one that emits not only in a single direction, but in a cone of directions). Since you don't perform density estimation on the primary hit point and you cannot hit the light source, if the ray doesn't hit anything you have to check whether its direction falls inside the emission cone and account for the contribution [SMILEY :)]
Maybe the cleanest and actually more efficient solution after all will be to handle all direct illumination separately.
Or perform density estimation on the primary hit point but only accounting for direct photons. But you'd need to maintain separate statistics for this, I guess.
Wait, this sounds a bit complicated. Maybe I'm missing something. It's too late in the night here [SMILEY :)]
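The emission-cone test for the directional-sun case could look something like this (a sketch with hypothetical names; vectors are assumed normalized, and sun_dir points from the sun toward the scene):

```python
import math

def in_sun_cone(ray_dir, sun_dir, half_angle_rad):
    """When a gather ray escapes the scene, check whether it points back
    toward a cone-shaped 'sun' light. ray_dir and sun_dir are unit
    3-vectors; sun_dir is the sun's emission direction."""
    # The ray must travel against the emission direction, within the cone.
    cos_theta = -sum(r * s for r, s in zip(ray_dir, sun_dir))
    return cos_theta >= math.cos(half_angle_rad)
```

If the test passes, the ray's contribution is the sun's emitted radiance for that direction; otherwise the escaped ray contributes nothing.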
(L) [2010/06/05] [Guest] [Re: Stochastic Progressive Photon Mapping] Wayback!

Hello.
I'm also interested in implementing (S)PPM and have a question.
How do you handle multiple light sources with different power and spectrum in the photon tracing step?
I mean, how do you sample lights when tracing a new photon? According to light power, or maybe take samples equally from all lights?
Sorry for my bad English.
(L) [2010/06/05] [ingenious] [Re: Stochastic Progressive Photon Mapping] Wayback!

>> Guest wrote:Hello.
I'm also interested in implementing (S)PPM and have a question.
How do you handle multiple light sources with different power and spectrum in the photon tracing step?
I mean, how do you sample lights when tracing a new photon? According to light power, or maybe take samples equally from all lights?
Sorry for my bad English.
This has again nothing to do with (S)PPM. Ideally, you want to emit photons proportionally to the total power of each light source. That is, if you have two lights and one emits 2x more power than the other, you will shoot 2x more photons from it.
(L) [2010/06/05] [Guest] [Re: Stochastic Progressive Photon Mapping] Wayback!

>> ingenious wrote:This has again nothing to do with (S)PPM. Ideally, you want to emit photons proportionally to the total power of each light source. That is, if you have two lights and one emits 2x more power than the other, you will shoot 2x more photons from it.
First, I thought that in usual PM the total power of every light source is divided by the number of photons emitted from that light. But in PPM the unnormalized flux from all photons is accumulated at a hit point and later divided by the number of all emitted photons from all lights. If light source B has more power and more samples taken, then light A will be undersampled. This seems wrong to me.
Second, I'm not sure how to define this "total power" for lights in the RGB model. Say light A has RGB = 10, 10, 0 and light B has 0, 10, 20. Which weights should be assigned to these lights?
(L) [2010/06/05] [ingenious] [Re: Stochastic Progressive Photon Mapping] Wayback!

>> Guest wrote:First, I thought that in usual PM the total power of every light source is divided by the number of photons emitted from that light. But in PPM the unnormalized flux from all photons is accumulated at a hit point and later divided by the number of all emitted photons from all lights. If light source B has more power and more samples taken, then light A will be undersampled. This seems wrong to me.
Second, I'm not sure how to define this "total power" for lights in the RGB model. Say light A has RGB = 10, 10, 0 and light B has 0, 10, 20. Which weights should be assigned to these lights?
It can still be done, also in combination with importance sampling of the lights. The idea basically is to treat all lights as a single one. Here's an example:
* Light 1 has total power of 2
* Light 2 has total power of 4
So, the total emitted power from all lights is 6. You shoot 12 photons carrying a total power of 6, and the number of photons from each light is proportional to its total power, i.e. 4 photons from Light 1 and 8 photons from Light 2. Of course, you will not do this directly; rather, every time you shoot a photon you randomly choose the light source to shoot from, using a 1D CDF over the total power of each light. Then you normalize the energy of each photon by the total number of photons, so each photon will have power 0.5, which correctly adds up for each light source.
I've used scalars for the total power here, which can be computed by taking the luminance of the total RGB power. This is only relevant for the 1D CDF. The power of the photons is always RGB.
Regarding the estimation of the total power emitted from each light source: if your lights are specified with radiance (i.e. flux per unit area per unit solid angle), then you need to integrate over the total emission solid angle and area of the light source. It sounds more complicated than it actually is to implement, if you use standard analytical light source models. PBRT has examples of how to do that (look for a function with a name like totalPower() in the Light interface).
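A small sketch of the 1D-CDF light selection in the example above (hypothetical names; scalar powers 2 and 4, 12 photons, so each photon carries power 0.5):

```python
from bisect import bisect_left

def build_light_cdf(scalar_powers):
    """Normalized cumulative distribution over per-light scalar power
    (e.g. the luminance of each light's total RGB power)."""
    total = sum(scalar_powers)
    acc, cdf = 0.0, []
    for p in scalar_powers:
        acc += p
        cdf.append(acc / total)
    return cdf, total

def pick_light(cdf, u):
    """Map a uniform random number u in [0,1) to a light index."""
    return bisect_left(cdf, u)

# The example above: two lights with scalar powers 2 and 4.
cdf, total = build_light_cdf([2.0, 4.0])  # cdf = [1/3, 1.0], total = 6
num_photons = 12
photon_power = total / num_photons        # 0.5 for every photon
```

In a spectral/RGB renderer the photon would carry `light_rgb / (num_photons * p_light)` instead of the scalar, which reduces to the same 0.5 per channel in this grayscale example.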
(L) [2010/06/05] [graphicsMan69] [Re: Stochastic Progressive Photon Mapping] Wayback!

Shooting based on the importance of the light (importance-sampled Monte Carlo) generally works.  You still have to keep track of the number to divide by, but you don't have power balance issues.
(L) [2010/06/10] [Ryen] [Re: Stochastic Progressive Photon Mapping] Wayback!

>> ingenious wrote:Diyer2002 wrote:I've got a problem when implementing SPPM. As the paper mentions, SPPM assigns a shared radius and flux to each pixel and updates them using the hit points generated by each eye-trace pass. But in a scene containing materials with both diffuse and specular BRDFs, tracing a ray from a pixel can generate multiple hit points, so how should I do the update then? The demo seems to use pure materials and doesn't handle this situation.
You should trace the camera path as in path tracing and generate only one hit point. That is, when your primary ray hits a surface with a multi-BRDF material, you just randomly decide (ideally importance-based) which BRDF to sample.
More formally, you handle length-2 camera paths (with 3 vertices) using standard stochastic sampling, photon tracing handles light paths of arbitrary length, and PPM makes the connection in between. Thus, the BRDF sampling at the middle vertex of the camera path actually has nothing to do with PPM.
Thanks for your reply. I've implemented a framework for SPPM, but the result doesn't seem right. Here are some images:
I shoot 20,000 photons every pass, and only use SPPM to compute illumination. Below is the result after the first pass:
yafray - 5 times - 1 pass.png
after 35 passes: the ceiling is smoother, but other parts still contain noise.
yafray - 5 times - 35 pass.png
after 75 passes: nothing seems to have changed; the grain-like noise is still there.
yafray - 5 times - 75 pass.png

It seems I am doing something wrong, since the 75-pass image does not change. It should make some difference.  [SMILEY :(]
What is the most likely cause of this noise?  Is it a traditional PM problem I've run into?
Any suggestion will be appreciated.  [SMILEY :)]
(L) [2010/06/10] [Ryen] [Re: Stochastic Progressive Photon Mapping] Wayback!

If I set the initial radius larger, a lot of the noise is removed, but the color bleeding artifact is also more obvious, so it is not the best solution:
yafray (4).png
Theoretically speaking, no matter how large an initial radius I set, the results should all converge to the same image, but the post above shows that after 75 passes nothing changed. Maybe there is something wrong when I emit photons (causing a correlation problem), I guess.
(L) [2010/06/10] [ingenious] [Re: Stochastic Progressive Photon Mapping] Wayback!

Your implementation is not necessarily wrong. PPM in general converges slowly for diffuse indirect illumination, while, as you can see, you already have a pretty decent caustic after the first iteration. And yes, (S)PPM is also prone to noise. Setting a larger initial radius blurs out some noise, but also introduces larger start-up bias, which then takes longer to diminish.
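For reference, the per-pixel statistics update that drives this radius shrinkage can be sketched as follows (following the published (S)PPM update rules; the names are mine, and alpha is the usual value of 0.7):

```python
def update_pixel_stats(N, tau, R, M, phi, alpha=0.7):
    """One SPPM photon pass for a single pixel.
    N   - accumulated photon count     tau - accumulated (unnormalized) flux
    R   - current search radius        M   - photons found this pass
    phi - their summed flux            alpha in (0,1) controls shrinkage."""
    if M == 0:
        return N, tau, R                       # nothing gathered this pass
    N_new = N + alpha * M                      # keep only a fraction of new photons
    R_new = R * (N_new / (N + M)) ** 0.5       # shrink radius accordingly
    tau_new = (tau + phi) * (R_new / R) ** 2   # rescale flux to the new disc area
    return N_new, tau_new, R_new
```

The final pixel radiance is then tau divided by (pi * R^2 * total emitted photons); the shrinking R is what trades the start-up blur (bias) for the noise ingenious describes.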
(L) [2010/06/10] [Dade] [Re: Stochastic Progressive Photon Mapping] Wayback!

>> Ryen wrote:If I set the initial radius larger, a lot of the noise is removed, but the color bleeding artifact is also more obvious, so it is not the best solution:
Ryen, how are you testing whether a hit point is affected by a photon bounce or not? I ask because your problem could be related to how you test the normals at the hit point; it looks like you have color bleeding between different surfaces (which should be prevented by the normal test).
Yesterday, I did my very first SPPM rendering and it is working quite well:
sppm.jpg
The caustic reflected on the mirror looks beautiful (i.e. LSDSE path)  [SMILEY :D]
P.S. I trace 2,000,000 photons every pass, which is a huge difference from your 20,000  [SMILEY :?:]
(L) [2010/06/10] [Ryen] [Re: Stochastic Progressive Photon Mapping] Wayback!

>> Dade wrote:Ryen wrote:If I set the initial radius larger, a lot of the noise is removed, but the color bleeding artifact is also more obvious, so it is not the best solution:
Ryen, how are you testing whether a hit point is affected by a photon bounce or not? I ask because your problem could be related to how you test the normals at the hit point; it looks like you have color bleeding between different surfaces (which should be prevented by the normal test).
Yesterday, I did my very first SPPM rendering and it is working quite well:
sppm.jpg
The caustic reflected on the mirror looks beautiful (i.e. LSDSE path)  
P.S. I trace 2,000,000 photons every pass, which is a huge difference from your 20,000
@ingenious
Unfortunately, I have to say there must be something wrong in my implementation, because after I do more passes the resulting image introduces new noise  [SMILEY :(]   (it becomes bumpy at the blue wall.)
Same initial radius as the last image, after 88 passes. The image above is after about 16 passes:
yafray - 8 times - 88 pass.png
What could the problem be?  Shooting too few photons every pass?
@Dade
Hi Dade!  Very beautiful result.
I'd like to know how many passes you used to get this result, and what data structure you use to hold the photons. Did you use only SPPM to compute the illumination?
As a newbie, I badly need your advice.  Thanks in advance.  [SMILEY :)]
(L) [2010/06/10] [ingenious] [Re: Stochastic Progressive Photon Mapping] Wayback!

@Ryen: it is normal for the noise to appear, and its frequency gets higher over time. If you have a huge initial radius, everything will look relatively smooth in the beginning, but as the radii shrink over time, different neighboring points collect different random photons, which results in noise. Show an image after 200 iterations (and please increase the number of photons per iteration).
