Re: Ray Tracing is the Future and ever will be
(L) [2013/08/07] [tby friedlinguini] [Re: Ray Tracing is the Future and ever will be] Wayback!>> McAce wrote:Scroll down to Technical Reports; it refers to Understanding the Efficiency of Ray Traversal on GPUs -- Kepler and Fermi Addendum: [LINK https://mediatech.aalto.fi/~timo/publications/aila2012hpg_techrep.pdf]
Just ruin my smug sense of self-satisfaction, why don't you?
 >> dbz wrote:Slide 5 from that presentation, called 'Real Time Path Tracing', is interesting. It mentions that a 35x speedup over a GTX 680 is needed before path tracing can be done in real time. With GPU speed increasing by, say, 50% each generation of 1.5 years, it would take about 13 years before path tracing can be done in real time on a GPU. However, that ignores increases in screen resolution (4K screens in the near future, maybe 16K screens by then) and in scene complexity.
It also ignores future algorithmic advances, not to mention noise reduction techniques that become more practical at higher resolutions.
(L) [2013/08/07] [tby dbz] [Re: Ray Tracing is the Future and ever will be] Wayback!Which noise reduction techniques?
Actually, 13 years is not really that bad. It is a foreseeable time frame, and a few years could be shaved off by using multiple GPUs. I like the realistic attitude NVIDIA has adopted (a 35x speedup over today's hardware is required); it is much better than the marketing of the past.
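For what it's worth, the 13-year figure is a straightforward compound-growth calculation from the numbers quoted above (a quick back-of-the-envelope sketch in Python, assuming the 35x target and the 50% speedup per 1.5-year generation):

import math

# Generations needed at 1.5x per generation to reach a cumulative 35x speedup.
generations = math.log(35.0) / math.log(1.5)   # ~8.8 generations
years = generations * 1.5                      # ~13.2 years
print(round(generations, 1), round(years, 1))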
(L) [2013/08/07] [tby graphicsMan] [Re: Ray Tracing is the Future and ever will be] Wayback!I think the real problem at this stage is that path tracing is not enough; you really need more advanced algorithms in the face of "hard" scenes. Which advanced algorithm depends on which kind of "hard" your scene is [SMILEY :)] Some of these algorithms are poorly suited, or entirely unsuited, to today's GPU architectures. I do hope that will change in the next few years, though. My guess is that in less than 13 years, you will be able to interactively path trace with good quality for easy scenes.
(L) [2013/08/07] [tby friedlinguini] [Re: Ray Tracing is the Future and ever will be] Wayback!>> dbz wrote:Which noise reduction techniques?
E.g., adaptive wavelet rendering, various multi-scale filtering methods, or maybe even some offshoot of that wacky gradient-domain MLT stuff.
(L) [2013/08/08] [tby Dade] [Re: Ray Tracing is the Future and ever will be] Wayback!>> graphicsMan wrote:I think the real problem at this stage is that path tracing is not enough; you really have to have more advanced algorithms in the face of "hard" scenes.
In my opinion, path tracing is enough for the game market, and NVIDIA and the other major vendors are likely to be mostly interested in that market.
After all, we have access to cheap GPU computing devices thanks to the game market.
 >> graphicsMan wrote:My guess is that in less than 13 years, you will be able to interactively path trace with good quality for easy scenes.
I agree; a 10-year window in which to achieve hard real-time path tracing seems realistic to me too.
(L) [2013/08/09] [tby dbz] [Re: Ray Tracing is the Future and ever will be] Wayback!>> friedlinguini wrote:dbz wrote:Which noise reduction techniques?
E.g., adaptive wavelet rendering, various multi-scale filtering methods, or maybe even some offshoot of that wacky gradient-domain MLT stuff.
That adaptive wavelet rendering paper looks really interesting (if you can live with the artifacts, which games most probably can). The paper reports roughly a 10x speedup: it reaches approximately the same image quality as path tracing with a much higher number of samples per pixel. It does not look like it could easily be implemented on the GPU, though (the priority queue would probably have to be stored in CPU RAM), and I wonder why nobody is using it, considering it has been around since 2009.
(L) [2013/08/09] [tby friedlinguini] [Re: Ray Tracing is the Future and ever will be] Wayback!>> dbz wrote:That adaptive wavelet rendering paper looks really interesting (if you can live with the artifacts, which games most probably can). The paper reports roughly a 10x speedup: it reaches approximately the same image quality as path tracing with a much higher number of samples per pixel. It does not look like it could easily be implemented on the GPU, though (the priority queue would probably have to be stored in CPU RAM), and I wonder why nobody is using it, considering it has been around since 2009.
I haven't seen AWR used in a production renderer, but it has spawned a number of academic papers based on the same idea (blur the image with a few fixed radii, pick the [un]blurred version with the best estimated bias/variance tradeoff for each pixel, and use the result to figure out where to add more samples). See [LINK http://www.ece.ucsb.edu/~psen/Papers/EG13_RemovingMCNoiseWithGeneralDenoising.pdf] for an example of a recent one. Strictly speaking, you don't need to use a priority queue to get the job done. The goal is to drive more samples to parts of the image with a higher error estimate. On a GPU you could periodically calculate an error estimate per pixel, derive a desired number of additional samples per pixel, and then normalize that to some fixed total number of additional samples per iteration. That last gather operation is less GPU-friendly than the other steps, but it's still cheap and executed relatively infrequently.
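A minimal sketch of that per-iteration allocation step, in Python with NumPy; it assumes running per-pixel sample statistics are kept, and all names (allocate_samples, sum_x, sum_x2, n, budget) are illustrative rather than taken from any particular renderer:

import numpy as np

def allocate_samples(sum_x, sum_x2, n, budget):
    """Distribute `budget` extra samples across pixels by estimated error.

    sum_x, sum_x2 : per-pixel sums of sample luminance and its square
    n             : per-pixel sample counts accumulated so far
    budget        : total number of additional samples for this iteration
    """
    mean = sum_x / n
    var = np.maximum(sum_x2 / n - mean * mean, 0.0)
    # Standard error of the per-pixel mean serves as the error estimate.
    error = np.sqrt(var / n)
    # Normalize so the whole image receives roughly `budget` extra samples
    # in total (flooring drops a few).
    weights = error / max(error.sum(), 1e-12)
    extra = np.floor(weights * budget).astype(np.int64)
    return extra

# Each iteration: trace `extra[y, x]` more paths for pixel (x, y), update
# sum_x / sum_x2 / n, recompute the error estimate, and repeat.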
(L) [2013/08/10] [tby hobold] [Re: Ray Tracing is the Future and ever will be] Wayback!>> dbz wrote:With GPU speed increasing by, say, 50% each generation of 1.5 years, it would take about 13 years before path tracing can be done in real time on a GPU.
This is assuming that Moore's Law was still in effect. But it has slowed down considerably since the days of 90 nanometer structure sizes (that is the generation of Pentium 4, or PowerPC G5 CPUs). Worse yet, the three main consequences of Moore's Law, namely
1. faster clock speed
2. lower cost
3. lower energy consumption
with each new generation of fabrication technology no longer all follow simultaneously from the shrinking of structure sizes. Power consumption was the first to break (in the Pentium 4, as mentioned). Then clock frequencies stopped improving exponentially (compare, for instance, the Sandy Bridge, Ivy Bridge, and Haswell speed bins). Cost is going to break next, foreshadowed by the difficulties the silicon foundries are having in transitioning from 300mm wafers to 450mm wafers.
Focusing on algorithm improvements is a safer bet today than waiting for Moore's Law to deliver more brute force.
(L) [2013/08/10] [tby friedlinguini] [Re: Ray Tracing is the Future and ever will be] Wayback!>> hobold wrote:This is assuming that Moore's Law was still in effect. But it has slowed down considerably since the days of 90 nanometer structure sizes (that is the generation of Pentium 4, or PowerPC G5 CPUs). Worse yet, the three main consequences of Moore's Law, namely
1. faster clock speed
2. lower cost
3. lower energy consumption
with each new generation of fabrication technology no longer all follow simultaneously from the shrinking of structure sizes. Power consumption was the first to break (in the Pentium 4, as mentioned). Then clock frequencies stopped improving exponentially (compare, for instance, the Sandy Bridge, Ivy Bridge, and Haswell speed bins). Cost is going to break next, foreshadowed by the difficulties the silicon foundries are having in transitioning from 300mm wafers to 450mm wafers.
Focusing on algorithm improvements is a safer bet today than waiting for Moore's Law to deliver more brute force.
Moore's Law hasn't really fallen off yet, unless you're only looking at serial performance. True, the 10 GHz processors that were promised with the Pentium 4 never materialized, but then came dual-core and quad-core CPUs, not to mention the explosion of GPU performance. Path tracing maps well to parallel processing; the main trick these days is keeping all the silicon fully utilized.
(L) [2013/08/10] [tby hobold] [Re: Ray Tracing is the Future and ever will be] Wayback!>> friedlinguini wrote:Moore's Law hasn't really fallen off yet, unless you're only looking at serial performance. True, the 10 GHz processors that were promised with the Pentium 4 never materialized, but then came dual-core and quad-core CPUs, not to mention the explosion of GPU performance. Path tracing maps well to parallel processing; the main trick these days is keeping all the silicon fully utilized.
The parallel processors, most notably GPUs, have become more power hungry and more expensive at the top end with every generation. In the case of multicore CPUs, the core growth for consumer models has stopped at four cores. The typical workloads in the consumer market do not seem to scale much beyond that.
Server processors continue to gain more cores per die. But some server workloads scale so well that there is now a trend toward an even higher number of smaller, slower, more energy-efficient cores with the same or slightly lower aggregate peak throughput.
Moore's Law taken by the letter, namely that the number of logic gates will continue to grow (with no mention of any performance metric), will hold for a while longer. But we're already at the point where you get only one of the three improvements: performance, energy efficiency, or cost. We used to get all three, or at least two out of three, but that hasn't been the case for a while now.
Additionally, the improvements we did get recently no longer lie on an exponential curve. Instead we are in a phase of slow, linear improvement, which I take to mean that we are near the inflection point of a saturation curve. That would indicate an asymptotic limit to growth in the relatively near future (a lot nearer, I am guessing, than the time it took to get from the first electronic computers to the inflection point).
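(To illustrate the shape: a logistic curve f(t) = L / (1 + exp(-k*(t - t0))) looks exponential well before the inflection point t0, roughly linear near t0, and flattens toward its limit L afterwards, so exponential-then-linear growth is exactly what you would expect to observe close to t0.)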
I am not even sure that real-time 3D graphics is embarrassingly parallel enough, given the development of price and power consumption over the last few generations of high-end GPUs.
(Clarification: I am not trying to win an argument here. If what I say is not convincing, then I could simply be wrong. I am trying to give an advance warning to those who are still betting on the Moore's Law of old.)
(L) [2013/08/12] [tby dbz] [Re: Ray Tracing is the Future and ever will be] Wayback!>> friedlinguini wrote: See [LINK http://www.ece.ucsb.edu/~psen/Papers/EG13_RemovingMCNoiseWithGeneralDenoising.pdf] for an example of a recent one.
This looks pretty amazing; even the denoising as a simple post-process (MLD in the paper) looks very good. This could be a very important contribution to real-time GI if the method works as well in general as it does on the scenes in the paper. Thanks for bringing this to my attention; I will most certainly try this method.
(L) [2013/08/12] [tby graphicsMan] [Re: Ray Tracing is the Future and ever will be] Wayback!I could be wrong, but I thought I might mention that I *believe* this work is IP encumbered...
(L) [2013/08/12] [tby dbz] [Re: Ray Tracing is the Future and ever will be] Wayback!Really? They patented it or something? I didn't see anything mentioned about that in the paper or the website where I downloaded it from.
(L) [2013/08/12] [tby graphicsMan] [Re: Ray Tracing is the Future and ever will be] Wayback!I can't find any reference to it now.  I wonder if I'm thinking of another filtering technique... perhaps I'm just smoking crack [SMILEY :)]