Section: New Results
GPU Rendering
Participants : Cyril Crassin, Jean-Dominique Gascuel, Nicolas Holzschuch, Fabrice Neyret.
Screen-Space Indirect Illumination for Video Games
Participants : Cyril Soler, Olivier Hoel, Franck Rochet, Nicolas Holzschuch.
In the context of the GENAC2 project, we have designed an algorithm for computing indirect illumination in screen space for video games. In this setting, the most important criteria are the stability of the cost over time, speed, and the absence of noise and artifacts, whereas the mathematical accuracy of the computation matters less.
We thus designed a screen-space hierarchical algorithm for computing indirect lighting in animated scenes. Our algorithm is fully compatible with the deferred-shading rendering engines used in video games, and computes indirect lighting in less than 10 ms, leaving enough computation time for other gaming tasks such as interaction and animation (see Figure 8). Our algorithm works in two steps: first, we compute indirect illumination in screen space at all possible scales; then, we filter and combine the illumination received at the different scales. Particular care was taken to verify the practical integration of this technique into a commercial video-game rendering engine, provided by our partner in the GENAC2 project.
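To illustrate the overall two-step structure, here is a minimal CPU-only sketch in C++, not the production GPU implementation: the direct-lighting buffer is repeatedly downsampled, a placeholder 3x3 gather stands in for the actual screen-space transfer at each scale, and the per-scale results are accumulated back at full resolution. The buffer layout, the gather and the per-scale weights are illustrative assumptions only.

#include <cstdio>
#include <vector>

// Minimal single-channel screen buffer (hypothetical helper for this sketch).
struct Buffer {
    int w, h;
    std::vector<float> px;
    Buffer(int w_, int h_) : w(w_), h(h_), px(w_ * h_, 0.f) {}
    float  at(int x, int y) const { return px[y * w + x]; }
    float& at(int x, int y)       { return px[y * w + x]; }
};

// Step 1 (per scale): gather indirect light from neighbouring pixels of the
// direct-lighting buffer at this resolution. The box gather is only a
// stand-in for the actual screen-space transfer computation.
Buffer gatherIndirect(const Buffer& direct) {
    Buffer out(direct.w, direct.h);
    for (int y = 0; y < direct.h; ++y)
        for (int x = 0; x < direct.w; ++x) {
            float sum = 0.f; int n = 0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    int xs = x + dx, ys = y + dy;
                    if (xs < 0 || ys < 0 || xs >= direct.w || ys >= direct.h) continue;
                    sum += direct.at(xs, ys); ++n;
                }
            out.at(x, y) = sum / n;
        }
    return out;
}

// MIP-style 2x2 downsampling of the direct lighting, producing the coarser scales.
Buffer downsample(const Buffer& b) {
    Buffer out(b.w / 2, b.h / 2);
    for (int y = 0; y < out.h; ++y)
        for (int x = 0; x < out.w; ++x)
            out.at(x, y) = 0.25f * (b.at(2 * x, 2 * y)     + b.at(2 * x + 1, 2 * y) +
                                    b.at(2 * x, 2 * y + 1) + b.at(2 * x + 1, 2 * y + 1));
    return out;
}

// Step 2: accumulate one scale back into the full-resolution result
// (nearest-neighbour upsampling here; the real technique filters the scales).
void combine(Buffer& full, const Buffer& coarse, float weight) {
    int sx = full.w / coarse.w, sy = full.h / coarse.h;
    for (int y = 0; y < full.h; ++y)
        for (int x = 0; x < full.w; ++x)
            full.at(x, y) += weight * coarse.at(x / sx, y / sy);
}

int main() {
    Buffer direct(64, 64);
    direct.at(32, 32) = 1.f;                     // a single bright pixel as the source
    Buffer indirect(64, 64);
    Buffer level = direct;
    for (int l = 0; l < 4; ++l) {                // all scales, from fine to coarse
        combine(indirect, gatherIndirect(level), 1.f / (l + 1));  // arbitrary weights
        level = downsample(level);
    }
    std::printf("indirect light near the source: %f\n", indirect.at(33, 33));
}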
A paper describing this technique has been submitted to the Interactive 3D Graphics 2010 conference (evaluation still ongoing).
Real-time rendering of large detailed volumes
Participants : Cyril Crassin, Fabrice Neyret.
Cyril Crassin is pursuing his PhD thesis with Fabrice Neyret on the real-time rendering of very large and detailed volumes, taking advantage of GPU-adapted data structures and algorithms. The main targets are cases where detail is concentrated at the interface between free space and clusters of density, as found in many natural volume data such as cloudy skies or vegetation, or in data represented as generalized parallax maps, hypertextures or volumetric textures.
The new method is based on a dynamic N³-tree storing MIP-mapped 3D texture bricks in its leaves. We load onto the GPU, on the fly, only the necessary bricks at the necessary resolution, taking visibility into account. This keeps memory consumption low during interactive exploration and minimizes data transfers (see Figure 9). Our ray-marching algorithm benefits from the multiresolution aspect of our data structure and provides real-time performance.
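The following is a minimal, CPU-only C++ sketch of the idea, under strong simplifying assumptions: tree nodes hold either children or a lazily loaded brick, the descent depth is chosen from the viewing distance, and the actual GPU streaming, child selection along the ray and trilinear brick sampling are elided. All sizes and names are hypothetical.

#include <algorithm>
#include <array>
#include <cmath>
#include <cstdio>
#include <memory>
#include <vector>

constexpr int N = 2;   // branching factor of the N^3-tree (2 gives an octree-like tree)
constexpr int B = 8;   // brick resolution (B^3 voxels per brick), hypothetical

struct Brick {
    std::vector<float> density = std::vector<float>(B * B * B, 0.f);
};

struct Node {
    std::array<std::unique_ptr<Node>, N * N * N> child;  // all null for a leaf
    std::unique_ptr<Brick> brick;                         // loaded lazily, on demand
};

// Load a brick only when a ray actually needs this region at this resolution.
// In the real system this triggers an asynchronous CPU-to-GPU texture transfer.
Brick* requestBrick(Node& n) {
    if (!n.brick) {
        n.brick = std::make_unique<Brick>();              // stand-in for streaming
        std::printf("brick streamed in\n");
    }
    return n.brick.get();
}

// Choose a tree depth from the distance to the viewer, so that far-away regions
// are sampled from coarser MIP levels of the volume.
int levelForDistance(float dist, float voxelSize, int maxLevel) {
    int level = maxLevel - int(std::log2(std::max(dist / voxelSize, 1.f)));
    return std::max(level, 0);
}

// One sample along a marched ray: descend only as deep as the level of detail requires.
float sampleDensity(Node& root, float dist) {
    int target = levelForDistance(dist, /*voxelSize=*/0.01f, /*maxLevel=*/6);
    Node* n = &root;
    for (int level = 0; level < target && n->child[0]; ++level)
        n = n->child[0].get();                            // child selection elided
    Brick* b = requestBrick(*n);
    return b->density[0];                                 // trilinear brick fetch elided
}

int main() {
    Node root;
    root.child[0] = std::make_unique<Node>();
    std::printf("near sample: %f\n", sampleDensity(root, 0.02f));  // descends to a fine brick
    std::printf("far sample:  %f\n", sampleDensity(root, 10.f));   // stays at the root level
}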
A paper has been published at the ACM Symposium on Interactive 3D Graphics and Games 2009 [28] (also presented as a poster and a sketch at Siggraph'09), and a version has been accepted as a book chapter in GPU Pro (ShaderX 8). We have also been contacted by several game, visualisation and special-effects companies interested in the technology, and we have been invited to give technical presentations at the Intel “Visual Computing and Research Conference” (Saarbrücken, December 2009) and at the “Crytek Academy Conference” (Frankfurt, November 2009). We are also in close contact with nVIDIA, where Cyril spent an extended stay in the context of the Eurodoc funding programme.
Scalable Real-Time Animation of Rivers
Participants : Fabrice Neyret, Nicolas Holzschuch.
Many recent games and applications target the interactive exploration of realistic large-scale worlds. These worlds consist mostly of static terrain models, as the simulation of animated fluids in such virtual worlds is computationally expensive. Adding flowing fluids, such as rivers, would greatly enhance their realism, but raises specific issues: as the user usually observes the world at close range, small-scale details such as waves and ripples are important.
However, the large scale of the world makes classical simulation methods impractical for these effects. We developed an algorithm for the interactive simulation of realistic flowing fluids in large virtual worlds. Our method relies on two key contributions: the local computation of the velocity field of a steady flow given boundary conditions, and the advection of small-scale details on the fluid, following the velocity field and uniformly sampled in screen space. This is illustrated in Figure 10.
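As an illustration of these two ingredients, here is a minimal C++ sketch under strong assumptions: a straight channel stands in for the locally evaluated steady velocity field satisfying the bank boundary conditions, and advected samples are respawned when they leave the view so that the density of carried details stays roughly uniform on screen. The constants and the channel shape are placeholders, not the published model.

#include <cstdio>
#include <cstdlib>
#include <vector>

struct Vec2 { float x, y; };

// Steady flow in a straight channel 0 < y < 1: the velocity is parallel to the
// banks and vanishes on them. This stands in for the local evaluation of the
// velocity field from arbitrary boundary conditions.
Vec2 velocity(Vec2 p) {
    float d = p.y * (1.f - p.y);          // zero on both banks, maximal mid-channel
    return { 4.f * d, 0.f };
}

struct Particle { Vec2 pos; };

// Advect the detail-carrying samples along the flow and respawn those that leave
// the view, keeping the screen-space density of details roughly constant.
void step(std::vector<Particle>& ps, float dt, float viewMinX, float viewMaxX) {
    for (auto& p : ps) {
        Vec2 v = velocity(p.pos);
        p.pos.x += dt * v.x;
        p.pos.y += dt * v.y;
        if (p.pos.x > viewMaxX) {          // left the view: respawn upstream
            p.pos.x = viewMinX;
            p.pos.y = float(std::rand()) / RAND_MAX;
        }
    }
}

int main() {
    std::vector<Particle> ps(100);
    for (auto& p : ps)
        p.pos = { float(std::rand()) / RAND_MAX, float(std::rand()) / RAND_MAX };
    for (int i = 0; i < 100; ++i)
        step(ps, 0.05f, 0.f, 1.f);
    std::printf("first sample after advection: (%.2f, %.2f)\n", ps[0].pos.x, ps[0].pos.y);
}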
This work, done in collaboration with the Evasion project-team in the scope of the PhD thesis of Qizhi Yu (supervised by Fabrice Neyret and Eric Bruneton), was accepted at the Eurographics 2009 conference and published in the Computer Graphics Forum journal [22] .
Wind projection basis for real-time animation of trees
Participant : Lionel Baboud.
In this work we proposed a real-time method to animate complex scenes of thousands of trees under a user-controllable wind load. First, modal analysis is applied to extract the main modes of deformation from the mechanical model of a 3D tree. The novelty of our contribution is to precompute a new basis of the modal stress of the tree under wind load. At runtime, this basis allows us to replace the modal projection of the external forces by a direct mapping for any directional wind. We showed that this approach can be efficiently implemented on graphics hardware, and that the resulting modal animation can be simulated at low computational cost, even for large scenes containing thousands of trees (Figure 11).
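A minimal C++ sketch of the runtime part is given below, assuming the modal analysis has already produced each mode's frequency, damping and a precomputed vector mapping the wind vector directly to the modal force; each mode is then integrated as an independent damped oscillator. All numerical values and names are made up for illustration.

#include <array>
#include <cstdio>

struct Mode {
    float omega;                     // natural angular frequency (rad/s)
    float damping;                   // modal damping ratio
    std::array<float, 2> windBasis;  // precomputed projection of the wind load
    float q = 0.f, qdot = 0.f;       // modal amplitude and velocity
};

// Damped modal oscillator driven by the wind: qddot + 2*zeta*w*qdot + w^2*q = f,
// where f is obtained by a single dot product instead of re-projecting forces on
// the whole tree geometry every frame.
void stepMode(Mode& m, std::array<float, 2> wind, float dt) {
    float f = m.windBasis[0] * wind[0] + m.windBasis[1] * wind[1];
    float qddot = f - 2.f * m.damping * m.omega * m.qdot - m.omega * m.omega * m.q;
    m.qdot += dt * qddot;            // semi-implicit Euler integration
    m.q    += dt * m.qdot;
}

int main() {
    std::array<Mode, 2> modes = {{
        { 2.0f, 0.05f, {0.8f, 0.1f} },   // first bending mode (hypothetical values)
        { 5.5f, 0.08f, {0.2f, 0.6f} },   // second mode (hypothetical values)
    }};
    std::array<float, 2> wind = {1.f, 0.3f};
    for (int frame = 0; frame < 300; ++frame)
        for (auto& m : modes)
            stepMode(m, wind, 1.f / 60.f);
    // A vertex shader would then displace each vertex by sum_i q_i * modeShape_i(vertex).
    std::printf("modal amplitudes: %.3f %.3f\n", modes[0].q, modes[1].q);
}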
This work, done in collaboration with the Evasion project-team and the LadHyX (joint laboratory of Ecole Polytechnique and CNRS), was accepted at the Eurographics 2009 conference and published in the Computer Graphics Forum journal [14] .
Real-time realistic ocean lighting
Participant : Fabrice Neyret.
In collaboration with Eric Bruneton from the Evasion project-team, we developed a new algorithm for the modelling, animation, illumination and rendering of the ocean, in real time, at all scales and for all viewing distances. Our algorithm is based on a hierarchical representation combining geometry, normals and BRDF. For each viewing distance, we compute a simplified version of the geometry and encode the missing details into the normals and the BRDF, depending on the level of detail required. We then use this hierarchical representation for illumination and rendering. Our algorithm runs in real time and produces highly realistic pictures and animations (see Figure 12). This work has been accepted for publication at the Eurographics 2010 conference.
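The following minimal C++ sketch illustrates the level-of-detail split on an assumed 1D wave spectrum: waves that the current viewing distance can resolve geometrically displace the surface, finer waves are folded into the normal, and the remaining ones contribute only their slope variance to the BRDF roughness. The cutoffs and the spectrum are placeholders, not the published model.

#include <cmath>
#include <cstdio>

const float PI = 3.14159265f;

struct Wave { float wavelength, amplitude; };   // one component of the assumed spectrum

struct ShadingInput {
    float displacement  = 0.f;   // resolved geometrically
    float slope         = 0.f;   // encoded in the normal
    float slopeVariance = 0.f;   // fed to the roughness of a specular BRDF
};

// Distribute each wave into geometry, normal or BRDF according to what the
// current viewing distance can resolve.
ShadingInput buildLOD(const Wave* waves, int n, float x, float viewDist) {
    float geomCutoff   = 0.05f  * viewDist;   // smallest wavelength kept as geometry
    float normalCutoff = 0.005f * viewDist;   // smallest wavelength kept in the normal
    ShadingInput s;
    for (int i = 0; i < n; ++i) {
        float k = 2.f * PI / waves[i].wavelength;
        if (waves[i].wavelength > geomCutoff)
            s.displacement += waves[i].amplitude * std::sin(k * x);
        else if (waves[i].wavelength > normalCutoff)
            s.slope += waves[i].amplitude * k * std::cos(k * x);
        else
            s.slopeVariance += 0.5f * waves[i].amplitude * waves[i].amplitude * k * k;
    }
    return s;
}

int main() {
    Wave spectrum[] = { {100.f, 1.f}, {10.f, 0.2f}, {1.f, 0.05f}, {0.05f, 0.005f} };
    float distances[] = { 10.f, 1000.f };
    for (float dist : distances) {
        ShadingInput s = buildLOD(spectrum, 4, /*x=*/3.f, dist);
        std::printf("distance %6.0f: displacement %.3f, slope %.3f, slope variance %.6f\n",
                    dist, s.displacement, s.slope, s.slopeVariance);
    }
}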