(L) [2014/10/02] [post by Emphy] [Re: Experimental Development Framework (C++ and ray tracing)]
>> cgribble wrote: Not surprisingly, I like rtVTK:  [LINK http://www.rtvtk.org/]
I've been overhauling the entire system, which includes an Embree plugin, but that's been delayed due to some other projects.  However, it should be straightforward to integrate your own ray tracer---based on Embree or otherwise---in the existing release (v1.3.7).
Seems very interesting. Admittedly I only skimmed the paper, so I couldn't really tell whether it will work with my own implementation of a CPU path tracer (whatever that might end up being) plus Embree. Let's say I build my own CPU path tracer that shoots rays and so on (obviously), using Embree to return intersections etc. Can I just make calls to rtVTK to trace rays and visualize them, completely independent of whatever code I write? Within the confines of what a path tracer should do, of course. Say I color all the pixels my own way (OpenGL) and want to render the ray visualization on top: is that possible?
Also, I imagine Embree's BVH (or whatever acceleration structure it uses) is kind of hard to visualize?
Thanks for your time!
(L) [2014/10/03] [post by joulsoun] [Re: Experimental Development Framework (C++ and ray tracing)]
>> Emphy wrote: It looks quite nice indeed, but it seems to me it already does it all... so there's not much learning left for me to do.
If you intend to research new ways of algorithmically optimizing a renderer, it makes sense to start from an already mature renderer. In my experience, you'll otherwise spend your time developing your framework instead of actually reaching the point where you can try cool ideas.
The neat thing about Mitsuba is that it already has a vast number of different integrators implemented, so the architecture is pretty damn well thought out to be able to let these integrators coexist in the same framework.
I guess it all depends on if you're aiming for boilerplate optimization techniques (code/hardware optimization), or higher level algorithmic optimization.
(L) [2014/10/05] [post by cgribble] [Re: Experimental Development Framework (C++ and ray tracing)]
>> Emphy wrote: cgribble wrote: Not surprisingly, I like rtVTK:  [LINK http://www.rtvtk.org/]
I've been overhauling the entire system, which includes an Embree plugin, but that's been delayed due to some other projects.  However, it should be straightforward to integrate your own ray tracer---based on Embree or otherwise---in the existing release (v1.3.7).
Seems very interesting. Admittedly I only skimmed the paper, so I couldn't really tell whether it will work with my own implementation of a CPU path tracer (whatever that might end up being) plus Embree. Let's say I build my own CPU path tracer that shoots rays and so on (obviously), using Embree to return intersections etc. Can I just make calls to rtVTK to trace rays and visualize them, completely independent of whatever code I write? Within the confines of what a path tracer should do, of course. Say I color all the pixels my own way (OpenGL) and want to render the ray visualization on top: is that possible?
You can visualize rays from any source, whether by instrumenting your own ray tracer with calls to the rl API to capture rays as they are generated, or by iterating over an already computed buffer of ray data and adding the rays as a pre-visualization step, or whatever.  And yes, you can render the ray visualization over a framebuffer computed in any manner you choose---whether actually ray traced or not.  (For example, I often debug new features in my batch path tracer by visualizing rays generated offline over a simple OpenGL rendering of the scene to explore the rays+scene interactively.)
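To make the instrumentation idea concrete, here is a minimal sketch of what "capturing rays as they are generated" looks like in a toy path tracer. All names here (`RayRecord`, `RayLog`, `tracePath`) are illustrative inventions, not the actual rtVTK rl API or Embree calls; the point is only that the logging call sits right next to the intersection call, so every ray segment ends up in a buffer you could later hand to a visualization pass:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// A minimal per-ray record. These names are hypothetical -- the real
// rtVTK rl API will have its own types and calls.
struct RayRecord {
    float ox, oy, oz;   // ray origin
    float dx, dy, dz;   // ray direction
    float t;            // hit distance (or a clamp value on a miss)
    int   depth;        // bounce depth within the path
};

// Buffer of captured rays; a real integration would feed these to the
// visualization layer, either live or as a pre-visualization pass.
using RayLog = std::vector<RayRecord>;

// Stand-in for an Embree intersection query: a single plane at z = 0.
// Returns the hit distance, or -1 on a miss.
inline float intersectScene(float oz, float dz) {
    if (dz >= 0.0f) return -1.0f;   // pointing away from the plane
    return -oz / dz;                // parametric distance to z = 0
}

// Trace one path, logging every segment as it is generated. The log
// call is placed immediately after the intersection query, which is
// the "instrument your own ray tracer" approach described above.
inline void tracePath(float ox, float oy, float oz,
                      float dx, float dy, float dz,
                      int depth, int maxDepth, RayLog& log) {
    if (depth >= maxDepth) return;
    float t = intersectScene(oz, dz);
    float tStore = (t < 0.0f) ? 1000.0f : t;  // clamp misses for display
    log.push_back({ox, oy, oz, dx, dy, dz, tStore, depth});
    if (t < 0.0f) return;
    // Continue from the hit point with a mirror-reflected direction.
    float hx = ox + t * dx, hy = oy + t * dy, hz = oz + t * dz;
    tracePath(hx, hy, hz, dx, dy, -dz, depth + 1, maxDepth, log);
}
```

The same buffer shape also covers the second option mentioned above: if your renderer already writes ray data offline, a pre-visualization step can simply iterate over such records and submit each segment for display.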
 >> Emphy wrote: Also, I imagine Embree's BVH (or whatever acceleration structure it uses) is kind of hard to visualize?
Thanks for your time!
That may be true---I haven't dug deeply enough into the Embree API to know whether they expose any details (e.g., node bounding boxes) of their acceleration structures directly via the API or not.  If they do, adding a plugin for the rtVTK pipeline to visualize the structure would be straightforward; otherwise, you'd have to hack Embree to extract the necessary BVH data first.
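Whatever the source of the BVH data (hacked out of Embree or dumped from your own builder), the visualization side only needs a flat list of node bounding boxes, optionally tagged with tree depth for per-level coloring. A minimal sketch of that data shape, with entirely hypothetical type names:

```cpp
#include <cassert>
#include <utility>
#include <vector>

// Illustrative only: the kind of data a BVH-visualization plugin would
// consume, regardless of which ray tracer produced it.
struct AABB { float lo[3] = {0, 0, 0}, hi[3] = {0, 0, 0}; };

struct BVHNode {
    AABB bounds;
    int  left  = -1;   // child indices into a flat node array; -1 = leaf
    int  right = -1;
};

// Flatten the tree into (level, box) pairs -- the list a visualization
// pass would turn into wireframe boxes, colored by tree depth.
inline void collectBoxes(const std::vector<BVHNode>& nodes, int idx, int level,
                         std::vector<std::pair<int, AABB>>& out) {
    if (idx < 0 || idx >= static_cast<int>(nodes.size())) return;
    out.push_back({level, nodes[idx].bounds});
    collectBoxes(nodes, nodes[idx].left,  level + 1, out);
    collectBoxes(nodes, nodes[idx].right, level + 1, out);
}
```

Extracting the `(bounds, children)` pairs is the part that would require digging into (or patching) Embree's internal node layout; once they are in this form, rendering them is straightforward.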
I'd be happy to discuss the details of using rtVTK components and concepts in your context.  My contact information is available at the rtVTK web site, so drop me a direct line if you'd like.  (I'm happy to discuss here, too, if others would find the details interesting.)