Stunning ATI tech demo using ray tracing


(L) [2008/06/18] [Ray Tracey] [Stunning ATI tech demo using ray tracing] Wayback!

[IMG #1 Image]
Short clip here: [LINK http://www.amd.com/us-en/assets/content_type/DigitalMedia/AMD_Ruby_S04.swf]
Developer explaining the voxel based rendering using ray tracing:
[LINK http://www.youtube.com/watch?v=ROAJMfeRGD4&feature=related]
(Click on "watch in high quality")
[IMG #1]:[IMG:#0]
(L) [2008/06/18] [toxie] [Stunning ATI tech demo using ray tracing] Wayback!

so does anybody know if this is "real" realtime, or just a GPU rendering a video offline?
(L) [2008/06/18] [Stereo] [Stunning ATI tech demo using ray tracing] Wayback!

Hmmm, in the video at YouTube they show that it's possible to look around freely. So I'd say it's really real real-time.  [SMILEY :)]
(L) [2008/06/18] [monkeyman] [Stunning ATI tech demo using ray tracing] Wayback!

which one on youtube?
(L) [2008/06/18] [jogshy] [Stunning ATI tech demo using ray tracing] Wayback!

I think that demo is rendered using the ping-pong/humus fake global illumination technique ( DX10.1 cubemap array renders + SH )... and the reflections are made using render-to-cubemaps per object. The voxel talk is referring to that cubemap grid array, nothing more, nothing less.
And yep... looks nice... but there's a trick... notice only a few things in the scene are dynamic ( 1 car, the mech and Ruby )... all the other objects aren't moving... because that would mean updating a lot of cubemap arrays... and that could be too slow.
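Nobody outside ATI knows the actual implementation, but the "cubemap grid array" idea jogshy describes boils down to interpolating between lighting probes stored on a 3D lattice. A minimal sketch of that lookup (the grid layout and function name are my own invention for illustration, not anything from the demo):

```python
import numpy as np

def sample_probe_grid(grid, pos):
    """Trilinearly interpolate an irradiance probe grid at a position.

    grid: (nx, ny, nz, 3) array of RGB irradiance probes on a unit lattice.
    pos:  (3,) position in grid coordinates (0 .. n-1).
    """
    p = np.asarray(pos, dtype=float)
    i0 = np.floor(p).astype(int)
    i0 = np.clip(i0, 0, np.array(grid.shape[:3]) - 2)  # keep i0+1 in bounds
    f = p - i0                                         # offset inside the cell
    ix, iy, iz = i0
    fx, fy, fz = f
    # Blend the 8 surrounding probes with trilinear weights.
    c = 0.0
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((fx if dx else 1 - fx) *
                     (fy if dy else 1 - fy) *
                     (fz if dz else 1 - fz))
                c = c + w * grid[ix + dx, iy + dy, iz + dz]
    return c
```

In a real renderer each probe would be a cubemap or an SH vector rather than a single RGB value, but the interpolation step is the same.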
(L) [2008/06/18] [monkeyman] [Stunning ATI tech demo using ray tracing] Wayback!

>> monkeyman wrote:which one on youtube?
wow, I guess I am more tired than I thought...
The road, or at least some rocks, are also dynamic.
(L) [2008/06/19] [lycium] [Stunning ATI tech demo using ray tracing] Wayback!

holy shit!!
(L) [2008/06/19] [Ray Tracey] [Stunning ATI tech demo using ray tracing] Wayback!

>> jogshy wrote:The voxel technique is referring to that cubemap grid array, nothing more nothing less.
I think it's all voxels, the guy (Jules Urbach from OTOY) clearly says in the youtube video: "this is a really novel way of rendering graphics, we're not using any polygons [...] every single pixel you see in the scene has depth and is essentially renderable as voxels".

BTW, not related to ray tracing, but another equally stunning ATI demo has surfaced, showing a photorealistic scorpion in a terrarium:
Pic: [LINK http://farm4.static.flickr.com/3148/2585016023_341219cf32_b.jpg]
Vid: [LINK http://www.latimes.com/video/?autoStart=true&topVideoCatNo=default&clipId=2601141]
Proof of real-timeness:  [LINK http://youtube.com/watch?v=NuR1wBCw_FU]
(L) [2008/06/19] [cignox1] [Stunning ATI tech demo using ray tracing] Wayback!

I saw this a couple of days ago and well, I still have to find the words to describe it  [SMILEY :shock:]
I wasn't able to understand what that guy said in the video (my bad English, and the low-quality sound of my notebook speakers), so if anyone feels like giving a short overview here, I would appreciate it. I'm particularly interested in the lighting: what is dynamic and what is not.
And what is all that voxel stuff? Is this card supposed to render such things with traditional techniques as well? I'm concerned about the memory required by a voxel approach, but I don't really know how they did it...
On youtube someone said that he/she saw the high-res version of the clip. Does anyone know if it is available somewhere?
(L) [2008/06/19] [jogshy] [Stunning ATI tech demo using ray tracing] Wayback!

I don't think the whole scene is voxels... if it were, the meshes would look square... and I see plenty of curves there.
Perhaps the scene is ray traced indeed... just using a uniform grid as the spatial structure... but I'm still thinking that he's referring to the humus GI technique ( [LINK http://www.youtube.com/watch?v=MPAExcS80NI] and [LINK http://www.youtube.com/watch?v=omq3O22EUx4&NR=1] )... the marketing guys just went crazy pronouncing the cursed word...
About the high-res video... there is an 11s video on AMD's site:
[IMG #1 Image]
video: [LINK http://www.amd.com/us-en/assets/content_type/DigitalMedia/AMD_Ruby_S04.swf]
realtime evidence: [LINK http://www.youtube.com/watch?v=ROAJMfeRGD4&NR=1] ( make sure you click on "watch in high quality" )
More:
[IMG #2 Image]
[LINK http://youtube.com/watch?v=uyogmLKx4Ks]
[IMG #3 Image]
video: [LINK http://www.latimes.com/video/?autoStart=true&topVideoCatNo=default&clipId=2601141]
realtime evidence: [LINK http://www.youtube.com/watch?v=NuR1wBCw_FU] ( make sure you click on "watch in high quality" )
[IMG #1]:Not scraped: https://web.archive.org/web/20100619133828im_/http://www.pcghx.com/screenshots/medium/2008/06/Ruby_new_techdemo_01.png
[IMG #2]:[IMG:#1]
[IMG #3]:[IMG:#2]
(L) [2008/06/20] [Ray Tracey] [Stunning ATI tech demo using ray tracing] Wayback!

>> cignox1 wrote:I saw this a couple of days ago and well, I still have to find the words to describe it   
I wasn't able to understand what that guy said in the video (my bad English, and the low-quality sound of my notebook speakers), so if anyone feels like giving a short overview here, I would appreciate it. I'm particularly interested in the lighting: what is dynamic and what is not.
And what is all that voxel stuff? Is this card supposed to render such things with traditional techniques as well? I'm concerned about the memory required by a voxel approach, but I don't really know how they did it...
Ok, I tried to transcribe his talk as well as possible; there are a lot of parts I couldn't understand because of the bad sound, but it should give a rough idea. Maybe someone with a better understanding of English could fill in the gaps.
So here it is:
 >> "What you are seeing here, is a frame from the animation you just saw, that's similar to (...)
So the first thing I'm gonna notice is that this isn't really, that's a video, we can look around, we can see the set that we've built,
in fact it is, it's a set, you can see it's really ...
When we first showed the clips of what we were doing, some people thought
... was film, and it's not, it's completely completely ... scene, created by our team.
And you could see here, this is the relighting portion of the rendering pipeline,
this is really, this is a very early (seizure), a preview of what we're doing with this Ruby demo.
So you're seeing only the second (path) of the Cinema 2.0 rendering pipeline, the relighting portion of it.
I can drag ...different layers ... global illumination, photon maps, diffuse (lighting) or in this case,
complete control over the scene reflections.
And this is a really novel way of rendering graphics, we're not using any polygons.
and the thing that (makes) it very different from this, the (simple) relighting demo, is that
every single pixel you see in the scene has depth and it's essentially renderable as voxels.
We also have the capability of controlling every aspect of the exposure in the lighting pipeline
and requires........
.... to be an impact in the rendering of any scene
So one of the things that is key in voxel based rendering is ray tracing, ...
and the other element is compression, because these game assets are enormous.
One of the things that's very exciting about the latest generation hardware..........
So we're able to compress these data sets, which are pretty massive, down to very reasonable components
and we think that we can stream
...
essentially a fully manageable, relightable, completely interactive scene
and that's the ... of 2.0
we've heard about the technology ...
(L) [2008/06/20] [davepermen] [Stunning ATI tech demo using ray tracing] Wayback!

>> jogshy wrote:I don't think the whole scene is voxels... if it were, the meshes would look square... and I see plenty of curves there.
Perhaps the scene is ray traced indeed... just using a uniform grid as the spatial structure... but I'm still thinking that he's referring to the humus GI technique... the marketing guys just went crazy pronouncing the cursed word...
check this forum for raytraced voxels and you'll learn the meshes would not look squarish.. (there are some pics of a voxel-rendered car on here)
(L) [2008/06/20] [cignox1] [Stunning ATI tech demo using ray tracing] Wayback!

Thank you very much, Ray Tracey! Now, perhaps I'm completely off, but the description of the algorithm sounds very similar to what Carmack described when asked about ray tracing in games. Or is it just me?
(L) [2008/06/20] [syoyo] [Stunning ATI tech demo using ray tracing] Wayback!

I can hardly believe the scene is rendered with voxels or ray tracing. If so, that's great.
I suppose something like a G-buffer based relighting technique, extended to 3D (or surfaces), was used
(since the camera is moving, and the demo seems seamlessly integrated with a relighting tool for production)
(L) [2008/06/20] [jogshy] [Stunning ATI tech demo using ray tracing] Wayback!

>> davepermen wrote:check this forum for raytraced voxels and you'll learn the meshes would not look squarish.. (there are some pics of a voxel-rendered car on here)
Let's zoom in...
[IMG #1 Image]
Ooopz!  [SMILEY :oops:] S.Q.U.A.R.I.S.H  [SMILEY :lol:]
That can't be pure voxels... if it were, we would see the squares when the scorpion's sting is zoomed... it must be some other technique.
About the video's words... there are some strange things...
Photon mapping. Okay, but... for an outdoor scene you'll need a lot of photons and bounces... Most of the energy will escape to the sky... I think perhaps he wanted to say "sky light technique"... so photons are emitted from the sky... which is, technically, not photon mapping... it's just diffuse cubemap lighting ( +SH/PRT perhaps )...
Relighting. There is a chapter about it in GPU Gems 3. From a picture ( color + zbuffer ) you can extract the direct lighting coefficients and convert depth into a world-space position... then you can relight it using global illumination... the problem is that the coefficients take up a lot of memory, and it's not a realtime technique...
Voxels. I think they aren't using voxels to render... perhaps they are using some kind of uniform grid to perform ray tracing... that could also explain why the asset is so big and they need to compress it... mixed with a PVS they could discard non-hit voxels faster.
Big asset. Just a question... if a 50m^3 scene occupies all the VRAM ( let's say 512MB )... that technique will be impractical for a game... He mentions something about loading the models using streaming... but if 50m^3 occupies 512MB with heavy compression, I doubt the streaming could be done fast enough...
Cinema 2.0. Perhaps we're assuming it is realtime but it's not... Perhaps it's just a technique to previsualize the render process of CGI movies faster... like Pixar's Lpics ( [LINK http://graphics.pixar.com/Lpics/paper.pdf] ). That would explain the "relighting technique" and why it occupies so much memory ( because the G-buffer's frames are statically stored )... so the technique just relights them and plays them back at interactive frame rates, given a pre-stored camera path...
[IMG #1]:Not scraped: https://web.archive.org/web/20100619133828im_/http://img444.imageshack.us/img444/6020/slk0113ih2.jpg
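For what it's worth, the "uniform grid" jogshy keeps coming back to is the classic Amanatides-Woo 3D-DDA traversal: step the ray from cell to cell and only test the contents of the cells it actually crosses. A toy sketch of that traversal (purely illustrative; nothing to do with whatever OTOY actually runs):

```python
import math

def grid_traverse(origin, direction, n, max_steps=64):
    """Amanatides-Woo style DDA: return the cells of an n*n*n unit grid
    that a ray visits, in order. Assumes the origin is inside the grid."""
    cell = [int(math.floor(c)) for c in origin]
    visited = [tuple(cell)]
    step, t_max, t_delta = [], [], []
    for i in range(3):
        d = direction[i]
        if d > 0:
            step.append(1)
            t_max.append((cell[i] + 1 - origin[i]) / d)  # t to next +wall
            t_delta.append(1.0 / d)                      # t per whole cell
        elif d < 0:
            step.append(-1)
            t_max.append((cell[i] - origin[i]) / d)
            t_delta.append(-1.0 / d)
        else:
            step.append(0)
            t_max.append(math.inf)                       # never crosses
            t_delta.append(math.inf)
    for _ in range(max_steps):
        axis = t_max.index(min(t_max))  # cross the nearest cell wall
        cell[axis] += step[axis]
        if not (0 <= cell[axis] < n):   # ray left the grid
            break
        t_max[axis] += t_delta[axis]
        visited.append(tuple(cell))
    return visited
```

A renderer would intersect the ray with whatever each visited cell contains (triangles, voxels, a PVS flag) and stop at the first hit, which is what makes the grid a cheap ray-tracing structure.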
(L) [2008/06/20] [davepermen] [Stunning ATI tech demo using ray tracing] Wayback!

talking about this one: [LINK http://ompf.org/forum/viewtopic.php?f=6&t=793 viewtopic.php?f=6&t=793]
(L) [2008/06/20] [jogshy] [Stunning ATI tech demo using ray tracing] Wayback!

>> davepermen wrote:talking about this one: [LINK http://ompf.org/forum/viewtopic.php?f=6&t=793 viewtopic.php?f=6&t=793]
That's the one I posted... squarish on zoom, as I said... And even if you don't zoom, you can see the voxelization near the front lamp... and also in the joints of the different metal parts...
On the other hand, notice the scorpion video includes a big zoom onto the sting... and you won't see any voxel box...
I think voxels are used for a uniform grid spatial structure... not to rasterize.
(L) [2008/06/20] [Oatmeal] [Stunning ATI tech demo using ray tracing] Wayback!

Maybe they came up with a new rendering technique???
(L) [2008/06/20] [Ray Tracey] [Stunning ATI tech demo using ray tracing] Wayback!

cignox1: you're welcome
jogshy: the "scorpion" demo and the "street" demo are not related. They were created by different people (the scorpion by David Fincher and the street by Jules Urbach).
The voxels and ray tracing only apply to the street demo. There isn't any information about the rendering techniques in the scorpion demo at the moment.
(L) [2008/06/20] [jogshy] [Stunning ATI tech demo using ray tracing] Wayback!

>> Ray Tracey wrote:jogshy: the "scorpion" demo and the "street" demo or not related. They were created by different people (scorpion by David Fincher and street by Jules Urbach).
If you watch the "realtime evidence videos" I linked, you will see in both scorpion and Ruby demos the OTOY circular symbol:
[LINK http://www.otoy.com/]
Btw, OTOY is a web plugin to visualize 3D worlds... and in both demos the guy stops the demonstration and moves the camera, adjusts lighting, zooms, etc... and in both he does it using the OTOY scene editor... so yep, they are related, there's no doubt... and after all, they are both listed under the Cinema 2.0 technology... also notice the scorpion video says "produced by"... not "programmed by"... the programmers could be the same... An analogy: there are a lot of different games and publishers using the UE3 engine... but they all use Epic's technology...
But there is one thing I cannot understand... Why did ATI launch that demo relying on 3rd-party programming and 3rd-party production ( like JulesWorld or Fincher )... and not just make the demo in-house with people like Vlachos, Tatarchuk, etc.? It's a bit strange, indeed.
(L) [2008/06/20] [toxie] [Stunning ATI tech demo using ray tracing] Wayback!

btw: is it david fincher like in se7en??
(L) [2008/06/20] [Ray Tracey] [Stunning ATI tech demo using ray tracing] Wayback!

>> toxie wrote:btw: is it david fincher like in se7en??
Yep, same guy:
 >> The scorpion demo was made by the film director David Fincher (director of "Fight Club") on a computer. Was it a video game? An interactive movie? AMD wouldn't say. But it looked scarily real.
 [LINK http://latimesblogs.latimes.com/technology/2008/06/video-games-tha.html]
jogshy: Ok, maybe they are related. I was also wondering why they showed the OTOY logo in both videos. But what I meant to say was that the voxel thing was only explicitly mentioned in the street demo. It's possible the scorpion demo uses rendering techniques similar to the street demo, I really don't know. BTW I found an interesting video about rendering enormous scenes at interactive framerates with voxels here [LINK http://youtube.com/watch?v=rHGCGnwBheY] . They describe the method in this paper: [LINK http://www.crs4.it/vic/cgi-bin/bib-page.cgi?id='Gobbetti:2005:FV']
(L) [2008/06/20] [davepermen] [Stunning ATI tech demo using ray tracing] Wayback!

>> jogshy wrote:davepermen wrote:talking about this one: [LINK http://ompf.org/forum/viewtopic.php?f=6&t=793 viewtopic.php?f=6&t=793]
That's the one I posted... squarish on zoom, as I said... And even if you don't zoom, you can see the voxelization near the front lamp... and also in the joints of the different metal parts...
all the stuff from ATI is either far away, or in a very low-res pixelated swf video. you wouldn't notice any pixelation on the car in those two situations. and as they don't move around in the scene, only look around, it wouldn't show any pixelation in the actual demo. That doesn't mean it needs to be in any way related to the way the car is rendered; it's just that one can't judge it from the actually available material, which is very bad in quality.
(L) [2008/06/20] [jogshy] [Stunning ATI tech demo using ray tracing] Wayback!

>> davepermen wrote:all the stuff from ATI is either far away, or in a very low-res pixelated swf video. you wouldn't notice any pixelation on the car in those two situations.... it's just that one can't judge it from the actually available material, which is very bad in quality.
[LINK http://www.youtube.com/watch?v=NuR1wBCw_FU]
Well, I cannot see any voxel when he zooms in on the scorpion's sting... you can even see the small hairs ( click on "watch in high quality" and put it in fullscreen )... Although the video is relatively small, there is a moment where you can see what looks like some kind of normal mapping there... and no voxel cubes... although the quality is not exceptional, as you said... so who knows...
On the other hand... how could they update and render the dynamic objects' voxels so fast? If the whole scene were completely static it could be possible... but with some objects moving around, that's gonna be a nightmare... Perhaps using a dynamic voxel octree like Carmack said, but I think you could get a similar result using rasterization and cubemap arrays pushed to the limit...
Whatever it is... I'm going crazy, so let's kidnap Jules and force him to vomit all the information!  [SMILEY =P~] If he refuses we can show him scorpions... and not voxelized ones... pretty real ones  [SMILEY :twisted:]
 >> Ray Tracey wrote:. BTW I found an interesting video about rendering enormous scenes at interactive framerates with voxels here [LINK http://youtube.com/watch?v=rHGCGnwBheY] . They describe the method in this paper [LINK http://www.crs4.it/vic/cgi-bin/bib-page.cgi?id='Gobbetti:2005:FV']
Nice, but 5-6 FPS with an SCSI-320LB raptor hdd... and it has no animated objects ( which would drop that frame rate severely ).
Seriously... I don't think ATI is using voxels to render those scenes... it's another story if they are using a uniform grid for ray tracing.
(L) [2008/06/20] [Bakura] [Stunning ATI tech demo using ray tracing] Wayback!

Maybe they are using the same method as Nijasure ( the one used by Humus in the Ping-Pong demo ), as all those papers refer to it as voxels ( each point that captures irradiance, they call a voxel ), but it looks much more realistic here, so it might be a "mix".
(L) [2008/06/20] [cignox1] [Stunning ATI tech demo using ray tracing] Wayback!

If they have found a way to efficiently do raytracing on graphics hw, this is good news, but I must say that if this technique requires all this memory, I'm a bit skeptical about it being used in games: by the time HW reaches the several GB required for open scenes ( as in Crysis, for example ), most probably standard raytracing of comparable quality will be possible on either the GPU or the CPU  [SMILEY :D]
(L) [2008/06/20] [beason] [Stunning ATI tech demo using ray tracing] Wayback!

So... a ray traced, highly detailed, animated voxel grid, and no polygons. Hmm, I find this hard to believe. Although regular grids do lend themselves to ray tracing. But hanatos' voxelized Benz takes up 1.7Gb, and it's not moving, and he was using an octree, which has tree overhead. Maybe there are grids for each piece that moves, so you don't have time-dependent volume data? Or maybe there is some efficient, compact way of representing a sparse volume grid ( like a dynamic tubular grid )? Maybe 3D mipmaps ( like brick maps ), and perhaps this video card has 4 GB of memory... need more info.
(L) [2008/06/20] [slartybartfast] [Stunning ATI tech demo using ray tracing] Wayback!

OK - I think I finally figured out what that video is showing. There are two parts:
1) The Ruby video. I can believe this is rendered in real time, but I don't think there's any ray-tracing done at render time. Looks pretty boring to me - apart from the extensive use of dynamic reflection mapping, there's nothing there that isn't already in games like Crysis.
2) The presentation. This is about the Cinema 2.0 product - not the demo. The guy clearly states "what you are seeing is a frame from the animation you just saw - Cinema 2.0 style". He also states that we are just looking at the output of the re-lighting part of the rendering engine. All the stuff he mentions about ray tracing and photon mapping is in reference to how the *lighting* for the objects in the scene was created, not how they are being rendered on the screen. As davepermen has already noted - they don't move the camera. I believe what we are looking at is literally a still frame, albeit one made up of "3D pixels" - hence his reference to depth for each pixel. Imagine that each pixel has a Z value as well as a screen co-ordinate. Or, put another way, imagine QuickTimeVR, but with depth. Add to that all the lighting information and you could play all sorts of fancy tricks - how about if each pixel not only had a color and position, but a normal too? You could make it look like the pixels were "shiny".
Of course, without any more real information, any theory by any of us is just speculation  [SMILEY :cry:]
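slartybartfast's "pixel with a Z value" reading is easy to make concrete: with a pinhole camera you can unproject each screen pixel plus its depth into a 3D point, and if a normal is stored per pixel you can relight it. A hedged sketch (the 60 degree FOV and the function names are my assumptions for illustration, not anything stated about the demo):

```python
import numpy as np

def unproject(px, py, depth, width, height, fov_y_deg=60.0):
    """Turn a screen pixel + depth into a camera-space 3D point
    (a 'pixel with a Z value'), assuming a pinhole camera looking down -Z."""
    aspect = width / height
    tan_half = np.tan(np.radians(fov_y_deg) / 2)
    # Normalized device coordinates in [-1, 1], y pointing up.
    ndc_x = (2 * (px + 0.5) / width - 1) * aspect * tan_half
    ndc_y = (1 - 2 * (py + 0.5) / height) * tan_half
    return np.array([ndc_x * depth, ndc_y * depth, -depth])

def relight(albedo, normal, light_dir):
    """Lambertian relighting of one 'deep pixel' with a stored normal."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    l = np.asarray(light_dir, dtype=float)
    l = l / np.linalg.norm(l)
    return np.asarray(albedo) * max(float(np.dot(n, l)), 0.0)
```

With color, depth, and a normal stored per pixel, moving the light (or even the camera, within limits) becomes a per-pixel recomputation over this buffer, which fits the "relighting portion of the pipeline" wording in the talk.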
(L) [2008/06/21] [lycium] [Stunning ATI tech demo using ray tracing] Wayback!

>> jogshy wrote:davepermen wrote:check this forum for raytraced voxels and you'll learn the meshes would not look squarish.. (there are some pics of a voxel-rendered car on here)
Let's zoom in...
<snip>
Ooopz!   S.Q.U.A.R.I.S.H   
That can't be pure voxels... if not, when the scorpion's sting is zoomed we could see the squares... must be other technique.
on the one hand, yes i agree with you that it's probably not voxelising (and i dunno how that got mixed up with the big bag of techniques this demo does use!), but on the other hand it's not very difficult to do some numerical rootfinding in those small voxels for just about any surface type you can imagine (it mostly burns flops, and of course you do need to fetch the [usually small] data describing the surface).
(L) [2008/06/21] [gfyffe] [Stunning ATI tech demo using ray tracing] Wayback!

Judging by this quote:
 >> And this is a really novel way of rendering graphics, we're not using any polygons.
and the thing that (makes) it very different from this, the (simple) relighting demo, is that
every single pixel you see in the scene has depth and it's essentially renderable as voxels.
I call:
"Relief Mapping of Non-Height-Field Surface Details"
[LINK http://www.inf.ufrgs.br/~oliveira/RTM.html]
Notably,
[LINK http://www.inf.ufrgs.br/~oliveira/pubs_files/Policarpo_Oliveira_RTM_multilayer_I3D2006.pdf]
and
[LINK http://ati.de/developer/i3d2006/I3D2006-Tatarchuk-POM.pdf]
(L) [2008/06/21] [auld] [Stunning ATI tech demo using ray tracing] Wayback!

>> gfyffe wrote:Judging by this quote:
And this is a really novel way of rendering graphics, we're not using any polygons.
and the thing that (makes) it very different from this, the (simple) relighting demo, is that
every single pixel you see in the scene has depth and it's essentially renderable as voxels.
I call:
"Relief Mapping of Non-Height-Field Surface Details"
[LINK http://www.inf.ufrgs.br/~oliveira/RTM.html]
Notably,
[LINK http://www.inf.ufrgs.br/~oliveira/pubs_files/Policarpo_Oliveira_RTM_multilayer_I3D2006.pdf]
and
[LINK http://ati.de/developer/i3d2006/I3D2006-Tatarchuk-POM.pdf]
How do you justify that with "there are no polygons in this demo" ?
Auld
(L) [2008/06/22] [Ysaneya] [Stunning ATI tech demo using ray tracing] Wayback!

I've been talking with Inigo Quilez last week about this video, and here's my own personal theory: what we're looking at is a compositing technique between a 3D movie and dynamically-rendered ( whether via polygons, voxels or raytracing ) objects.
"3D movie" in this context means that the movie has been recorded at 360°. This allows the camera to rotate freely, but not move in the scene. Let's look at the press release:
"With Cinema 2.0 you won't just play movies, you'll play in them. Imagine the ability to look around the environments in a sci-fi movie, put yourself in the driver's seat in a race scene, duck behind things and pop up to see what's going on in an intense firefight -- all of these things are possible with Cinema 2.0,"
Notice that they use the words "look around", not "move". Also, their example mentions being in the driver's seat in a race scene; again, an example where your position doesn't change. The Ruby demo shows the camera rotating and zooming, but not moving. They also mention the word "movie" too many times for it to be a coincidence.
If you look carefully at the background during the first seconds of the video, you'll see people walking and cars ( like taxis ) moving.
Also interesting to note is the reflection of Ruby near the car on the left when she crouches near it.
I believe the 3D movie also has depth information, which gives a way to composite dynamically rendered objects in the environment, relight them, etc..
(L) [2008/06/22] [Phantom] [Stunning ATI tech demo using ray tracing] Wayback!

Thanks for your explanation, that makes a lot of sense. 'No polygons' indeed; it's a 'bubble' with an AVI on it. Still strange that they talk about voxels though. And what about that 'streaming' and 'massive amounts of data'?
Anyway, it makes a lot more sense now to release a ray traced game, now that this appears not to be traced in real-time. [SMILEY ;)] I was rather surprised that AMD would have made such a leap forward; a while back I got the impression that their skill level was nowhere near this quality.
(L) [2008/06/22] [Ray Tracey] [Stunning ATI tech demo using ray tracing] Wayback!

There is a similar discussion going on at Beyond3D. This is what Dave Baumann, who works at ATI, had to say:
Originally Posted by trinibwoy:
Isn't the background static and only Ruby, the taxi and the robot being rendered by the card? Thought I saw something along those lines when the video first appeared.
Dave Baumann:
No. In fact the demo stops and you can fly through this area.
[LINK http://forum.beyond3d.com/showthread.php?p=1177427#post1177427]
Originally Posted by kyetech:
EDIT: Notice that on that given frame the camera stays in a fixed location looking around the city. Much like those stupid 3d pictures on quicktime?
Dave Baumann:
No, you can move around the city area and the lighting can be changed as well.
[LINK http://forum.beyond3d.com/showthread.php?t=41377&page=169]
(L) [2008/06/22] [Ysaneya] [Stunning ATI tech demo using ray tracing] Wayback!

Then that proves my theory wrong. The quality they're achieving is truly fantastic. It's hard to believe it's fully real time.
(L) [2008/06/23] [auld] [Stunning ATI tech demo using ray tracing] Wayback!

>> jogshy wrote:But there is one thing I cannot understand... Why did ATI launch that demo relying on 3rd-party programming and 3rd-party production ( like JulesWorld or Fincher )... and not just make the demo in-house with people like Vlachos, Tatarchuk, etc.? It's a bit strange, indeed.
Jogshy, back in the day I worked for a graphics card manufacturer, and this was common practice. There were even some companies who specialised in creating "next gen demos" which did not run in realtime on any graphics hardware yet, but were targeted at the next generation of hardware, months or even years before it existed. You went to them, saw what they had, and if you wanted one of the demos, you bought it. So I suppose the behaviour isn't unprecedented.
(L) [2008/06/23] [Inigo Quilez] [Stunning ATI tech demo using ray tracing] Wayback!

we need an official statement explaining what it is we are seeing, because even a downloadable realtime demo would not help us know what it is, I think - ATI's latest demos are a few hundred megabytes anyway, where compressed (3d) video fits very well, so...
the only conclusion I can make so far is that nVidia urgently needs to improve its demo-making ( I'm speaking about Medusa, for example ).
(L) [2008/06/23] [toxie] [Stunning ATI tech demo using ray tracing] Wayback!

yup.. to be honest, the Medusa thingie did not impress me at all..
(L) [2008/06/23] [cignox1] [Stunning ATI tech demo using ray tracing] Wayback!

>> toxie wrote:yup.. to be honest, the Medusa thingie did not impress me at all..
Me neither. The last nVidia tech demo that really amazed me was Last Chance Fuel, so many years ago that today it even runs on my old GeForce 6600 Ultra...
But IMHO nothing really compares with the ATI Toy Shop demo, except this last Ruby one, if it's confirmed that everything is real time and dynamic...
(L) [2008/06/23] [Ray Tracey] [Stunning ATI tech demo using ray tracing] Wayback!

I listened again to Urbach's talk and was able to identify a few more words:
 >> One of the things that's very exciting about the latest generation of hardware ...
is that we can now write general purpose code ... that does wavelet compression.
So we're able to compress these data sets, which are pretty massive, down to very reasonable components
and we think that we can stream them down.
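Urbach doesn't say which wavelet he uses, but the general idea of "general purpose code that does wavelet compression" can be illustrated with the simplest case: a one-level Haar transform plus keeping only the largest detail coefficients. This toy is my own sketch, not OTOY's codec:

```python
import numpy as np

def haar_1d(signal):
    """One level of the orthonormal 1D Haar transform:
    averages and details, each half the input length (even-length input)."""
    s = np.asarray(signal, dtype=float)
    avg = (s[0::2] + s[1::2]) / np.sqrt(2)
    det = (s[0::2] - s[1::2]) / np.sqrt(2)
    return avg, det

def inverse_haar_1d(avg, det):
    """Exact inverse of haar_1d."""
    out = np.empty(len(avg) * 2)
    out[0::2] = (avg + det) / np.sqrt(2)
    out[1::2] = (avg - det) / np.sqrt(2)
    return out

def compress(signal, keep=1):
    """Toy lossy step: keep only the `keep` largest-magnitude detail
    coefficients and zero the rest (the bit that shrinks the data set)."""
    avg, det = haar_1d(signal)
    order = np.argsort(np.abs(det))[::-1]
    mask = np.zeros_like(det)
    mask[order[:keep]] = 1.0
    return avg, det * mask
```

Real codecs recurse the transform over many levels and over 2D/3D data, then entropy-code the surviving coefficients; the streaming idea is that the coarse averages arrive first and the details refine them.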
My guess is he's taking Carmack's idea about voxels a step further: Carmack proposed in [LINK http://www.pcper.com/article.php?aid=532 this article] to use ray tracing in a "sparse voxel octree" to retrieve and store polygons in a much more efficient way than rasterisation can, so he could have quasi-unlimited geometric detail. He's also planning to write a "proof of concept" demo in CUDA later this year, hoping that Intel, Nvidia etc. would incorporate special-function hardware in their products to accelerate the voxel octree. In fact, he was already thinking about this in 2000:
 >> Voxels and Curves
FS: You mentioned voxels and you're doing some research on that, do you think 3D hardware companies should be working on voxel acceleration? And what do you think of the voxel games that have come out, like Delta Force?
John: The Voxel stuff in software… I've written a few voxel engines at one point, actually the early version of Shadowcaster, Raven's Origin title, actually had voxel floors in it at one time. But we wound up taking that out when we rewrote the stuff to be more polygon based. There are some real advantages to voxel representations of things, because it gives you complete texturing and detail geometry in many ways. But I did these two voxel engines at the beginning of Quake III and it got to the point where I thought that I could almost make them run in software, but it would be at a fairly low resolution and compared to what you could do, at that speed with hardware polygons, it doesn't pay off in that case.
I did do an analysis of what the memory access patterns would be and everything; you could do a voxel ray-tracer in hardware with drastically less hardware than what we're actually using right now for all the triangle rasterizers and I think it could be a much more compelling visual representation in a lot of cases. But it's gonna be really difficult to see, I almost hesitate to tell people to pursue something like that. I know that I did some walkaround demos and everything, but the reason the PC industry is as good as it is right now in hardware is because we all had the example of SGI to look up to. We had working existence proofs of "this clearly works, look they've done it" and then it's just a matter of matching and then exceeding their performance.
To recommend something completely different, like saying "You should go have your fab make you a voxel chip, you should just go try this, spend millions of dollars on this," I'm really hesitant to do that because we don't have a complete existence proof that says this is a necessary and sufficient rendering primitive to do a complete engine or something with. Now if somebody did do a voxel caster there, you could go ahead and get depth value and intermix it with current triangle stuff and that would be an interesting intermediate step, but I honestly don't think it would take that much hardware, and someone right now, in this time of chaos when everyone's crazily trying to diversify their product, maybe someone will try something like that, just on a lark, because they're just fumbling around for something to do. I think there are some potential good things there, but I can't conclusively say this is the future direction, because while it works really well for environments, and there's some great stuff you can do with that, it's less clear how well it works for characters. You wind up saying, "well maybe you have to build them in a deformation matrix around them, and then when you raycast into it bend the rays as it hits the deformation lattice." But I haven't written a software version of that. I'd be hesitant to tell someone that this is clearly a good idea until I can present a simulation showing that it works, and that it looks more impressive than anything you can do directly. And that's not on my schedule right now to spend the time to do because I've got a couple of things that are immediately pressing in terms of research.
[LINK http://www.firingsquad.com/features/carmack/page13.asp]
Carmack wanted to use polygons as an intermediate step toward full voxel-based rendering. In the case of the Ruby demo, I think Urbach dropped the polygon part and went ahead and wrote a full voxel engine using the vast floating-point performance of the GPGPU ([LINK http://www.amd.com/us-en/Corporate/AboutAMD/0,,51_52_15438_15106,00.html?redir=uve001 the demo was rendered on 2 RV770's], which equals about 2 teraflops).
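For anyone curious what the voxel ray-caster Carmack describes actually involves: the core is just stepping a ray cell by cell through a uniform grid. Here's a minimal CPU sketch in the style of the classic 3D DDA traversal (my own toy code, assuming a sparse set of filled cells; it has nothing to do with whatever OTOY actually implemented):

```python
def raycast_voxels(grid, origin, direction, max_steps=256):
    """Step a ray through a uniform voxel grid (3D DDA traversal).

    grid is a set of filled (x, y, z) integer cells; returns the first
    filled cell the ray hits, or None after max_steps cells."""
    step = [1 if d > 0 else -1 for d in direction]
    t_max, t_delta = [], []  # ray param to next boundary / per full cell
    for o, d in zip(origin, direction):
        if d == 0:
            t_max.append(float('inf'))
            t_delta.append(float('inf'))
        else:
            boundary = int(o) + (1 if d > 0 else 0)
            t_max.append((boundary - o) / d)
            t_delta.append(abs(1.0 / d))
    cell = [int(c) for c in origin]
    for _ in range(max_steps):
        if tuple(cell) in grid:
            return tuple(cell)
        axis = t_max.index(min(t_max))  # advance across nearest boundary
        cell[axis] += step[axis]
        t_max[axis] += t_delta[axis]
    return None
```

The appeal for hardware is obvious from the sketch: per step it's a comparison, an add and a memory fetch, which is why Carmack says it would take "drastically less hardware" than a triangle rasterizer.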
(L) [2008/06/23] [gfyffe] [Stunning ATI tech demo using ray tracing] Wayback!

>> auld wrote:How do you justify that with "there are no polygons in this demo" ?
Auld
I'm thinking, "no polygons" is marketing speak for "the detailed geometry is not achieved using polygons".  I don't think "no polygons" literally means "no vertices or faces were sent to the graphics card".  I mean, with the layered relief map stuff, you render an entire 3D puppy dog with 1 polygon.  You could do a car with 1 polygon, etc.  The polygon only exists to get the card to perform the depth-map-based shader on a particular area of the screen, and does not otherwise represent geometry, so I would not be surprised if marketing decided that means "no polygons"  [SMILEY :wink:]  However, that's only my guess.  Could be something completely different going on here.
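To illustrate the "puppy dog with 1 polygon" trick: the polygon just triggers a shader that marches the view ray through a depth map until it sinks below the stored surface. A CPU sketch of the usual linear-search-plus-binary-refine scheme (my own toy version, assuming tangent-space conventions; not the demo's actual shader):

```python
def relief_march(depth_at, uv, view_dir, steps=32, refine=8):
    """March a tangent-space view ray (view_dir[2] < 0, into the
    surface) against a depth map in [0, 1]; returns hit UV or None."""
    dz = 1.0 / steps  # cross the whole depth range in `steps` steps
    duv = (view_dir[0] / -view_dir[2] * dz,
           view_dir[1] / -view_dir[2] * dz)
    u, v, z = uv[0], uv[1], 0.0
    prev = (u, v, z)
    for _ in range(steps):
        if z >= depth_at(u, v):  # ray has gone below the surface
            break
        prev = (u, v, z)
        u += duv[0]; v += duv[1]; z += dz
    else:
        return None              # missed the heightfield entirely
    lo, hi = prev, (u, v, z)     # binary refine between above/below
    for _ in range(refine):
        mid = tuple((a + b) * 0.5 for a, b in zip(lo, hi))
        if mid[2] >= depth_at(mid[0], mid[1]):
            hi = mid
        else:
            lo = mid
    return hi[:2]                # texture coords of the hit point
```

Note that the geometry only lives in the depth map; the polygon's sole job is to invoke this march for the right screen pixels, which is presumably what lets marketing say "no polygons".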
(L) [2008/06/23] [jogshy] [Stunning ATI tech demo using ray tracing] Wayback!

Just a note... Crysis uses voxels for small parts of the terrain (to allow overhangs and holes):
[LINK http://www.youtube.com/watch?v=UPKHBtZj5To]
I think we're going backwards to Comanche 3 and Outcast days ... haha!
[LINK http://www.youtube.com/watch?v=Ku-ICQvQJGI]
[LINK http://www.youtube.com/watch?v=-nA2WZbsPcc]
The eternal return cycle theory...  [SMILEY :wink:]
(L) [2008/06/23] [beason] [Stunning ATI tech demo using ray tracing] Wayback!

Outcast was awesome.
(L) [2008/06/24] [davepermen] [Stunning ATI tech demo using ray tracing] Wayback!

I'd love to play Outcast on today's hardware.. high res, ultra-detailed landscape.
it was awesome, yes. oh and, bruce willis' voice made it perfect [SMILEY :)]
(L) [2008/06/24] [auld] [Stunning ATI tech demo using ray tracing] Wayback!

>> gfyffe wrote:auld wrote:How do you justify that with "there are no polygons in this demo" ?
Auld
I'm thinking, "no polygons" is marketing speak for "the detailed geometry is not achieved using polygons".  I don't think "no polygons" literally means "no vertices or faces were sent to the graphics card".  I mean, with the layered relief map stuff, you render an entire 3D puppy dog with 1 polygon.  You could do a car with 1 polygon, etc.  The polygon only exists to get the card to perform the depth-map-based shader on a particular area of the screen, and does not otherwise represent geometry, so I would not be surprised if marketing decided that means "no polygons"    However, that's only my guess.  Could be something completely different going on here.
Yes, I must agree that occurred to me too, but as you say, it's entirely speculation. Damn, I hope this really is raytraced voxels. I have so many bets I could win [SMILEY :-)]
(L) [2008/06/27] [Phantom] [Stunning ATI tech demo using ray tracing] Wayback!

Some more info:
[LINK http://www.tgdaily.com/content/view/38145/135/]
(L) [2008/06/27] [davepermen] [Stunning ATI tech demo using ray tracing] Wayback!

Sounds awesome...
(L) [2008/06/27] [Michael77] [Stunning ATI tech demo using ray tracing] Wayback!

I think this is the most interesting image:
[LINK http://www.tgdaily.com/images/slideshows/200806271/ati_rt_06.jpg]
Seems like they actually use a plane with one vertex for each pixel to find the intersection for each pixel. Then they probably use the transform feedback of the GPU to enable recursion. Makes some sense to me; I thought about this when the first geometry shaders came out - damn lack of time ;o) However, I think one problem still remains: how do you handle different materials? In a game environment almost everything may be Phong, but in other applications I have yet to see a good solution for this.
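If that slide really shows one vertex per pixel, the CPU analogue is just generating one primary ray per pixel of a full-screen grid. A toy sketch (pinhole camera at the origin looking down -Z; the FOV and mapping are my own assumptions, not anything confirmed about the demo):

```python
import math

def primary_rays(width, height, fov_deg=60.0):
    """Yield one (origin, direction) ray per pixel, the CPU analogue
    of a full-screen plane with one vertex per pixel."""
    aspect = width / height
    half = math.tan(math.radians(fov_deg) / 2)
    for py in range(height):
        for px in range(width):
            # Map the pixel centre to the image plane at z = -1.
            x = (2 * (px + 0.5) / width - 1) * half * aspect
            y = (1 - 2 * (py + 0.5) / height) * half
            n = math.sqrt(x * x + y * y + 1.0)  # normalize direction
            yield (0.0, 0.0, 0.0), (x / n, y / n, -1.0 / n)
```

On the GPU you'd run the intersection shader over these vertices and, as speculated above, stream the secondary rays back out via transform feedback to get a bounce per pass.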
(L) [2008/06/27] [Ray Tracey] [Stunning ATI tech demo using ray tracing] Wayback!

>> Phantom wrote:Some more info:
[LINK http://www.tgdaily.com/content/view/38145/135/]
Amazing. This guy must be a genius.
I found some other interesting tidbits about OTOY/Cinema 2.0:
 >> Sunday, June 01, 2008
Graphics Processors (GPUs) Revisited
 
 Another Telecosm brought another great talk by Jules Urbach. He was showing some new stuff (I do not even know if I can share it here, as he was requesting the cameraman to stop taping what was on the screens a number of times...). But anyway. You know - they have full ray tracing in the GPU. And he was showing how his models perform on stage. OK, I mean the computerized models of virtual reality. Humans with skin modeled several layers deep... some reflective, some absorbing different parts of the light spectrum, with veins and bones below them... Or a model of Spider-Man, all of them generated in high-definition, theater-like quality in real time. This "real-time" part is the breakthrough. We have seen many computer-generated movies already, but nobody but OTOY can do it in real time. And all it takes is a number of clustered NVIDIA cards. This GPU trend is turning the computing industry upside down. Suddenly we have discovered GPUs are not only for graphics... they are supercomputers themselves.
[LINK http://headworx.slupik.com/2008/06/graphics-processors-gpus-revisited.html]
 >> George Gilder (10/22/07):  Jules Urbach is the avatar of the graphics processor paradigm that I have been touting for the last year or so. With the fully unexpected and disruptive emergence of massively parallel processing from the humble game machine rather than the lordly supercomputer, Jules has taken Hollywood cognoscenti by storm.
His most visible achievement is real-time rendering of photorealistic 3D images from a computer program. This means what it says--he can create a photorealistic scene without using photos. By doing real time rendering at 30 or more frames per second from a computer program, he accelerates the graphics process by roughly 10,000 times the speed attained by the paladins of Pixar and Industrial Light & Magic (Lucas Films).
[…]
As calculated by its Hollywood users, such as James Cameron (“Titanic,” “Transformers,” and Digital Domain), Jules accelerates the rendering process by an unbelievable factor of 13 million. (10K, says Jules, or 13M, says Cameron. I suppose it's just numbers.) In any case it's said to reduce the time to complete a video frame from five days at Pixar and Industrial Light & Magic to a real-time, 30 frames a second at Jules World. This technology cuts the cost of animating a film by between 40 and 70 percent.
[LINK http://www.gilder.com/fridayletter/Samples/11.02.07.htm]
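For what it's worth, Cameron's "13 million" falls straight out of the numbers quoted above (five days per frame offline versus 30 frames per second), while Jules's 10K presumably measures against a much shorter offline baseline. Quick check:

```python
seconds_per_day = 24 * 3600
offline = 5 * seconds_per_day   # five days per frame, as quoted
realtime = 1 / 30               # 30 frames per second
speedup = offline / realtime
print(int(round(speedup)))      # 12960000, i.e. roughly "13 million"
```

So "I suppose it's just numbers" is about right: the two figures only disagree on what the offline render time was.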
Two articles from The Inquirer, horrible journalism (incredibly biased towards ATI and anti-Nvidia), but still an interesting read (particularly the first):
 >> The holy grail is to have the same art assets, possibly the most time-consuming and therefore expensive part of a film, be usable in other things, specifically home computers and gaming. No more multi-million polygon Transformer model for the movie and multi-thousand polygon versions for the home, you make one and use it for both. This would lead to the possibility of interactive movies with potentially the same quality as the real thing, and all sorts of other things like audiences influencing stories of movies in real time.
[LINK http://www.theinquirer.net/gb/inquirer/news/2008/06/17/amd-touts-cinematic-vision-v2]
 >> The message that Phil conveyed over and over again is that with Spider, you can take the art assets from the original Spiderman movie and run them at 30FPS. All this takes is a 790FX board, a Phenom X4 and 4 RS670s, well within shooting range for a mid to high end gaming rig. What took over a day a frame then is now realtime.
[LINK http://www.theinquirer.net/gb/inquirer/news/2007/11/14/day-roof-phil-hester]
(L) [2008/06/27] [toxie] [Stunning ATI tech demo using ray tracing] Wayback!

at the end of the day i'll only believe this once the first executable is released, so users can see the limitations and everything.. [SMILEY :)]
especially the 10k speedup is something i can only laugh about!
(although i hope that they will prove me wrong [SMILEY ;)])
