## Section: Scientific Foundations

### Rendering

Participants: Laurent Alonso, Samuel Hornus, Bruno Jobard, Anass Lasram, Sylvain Lefebvre, Bruno Lévy, Vincent Nivoliers, Nicolas Ray, Thomas Viard.

Numerical simulation of light amounts to solving for light intensity in
the “Rendering Equation”, an integral equation modeling energy
transfers (or light *intensity* transfers).
The Rendering Equation was first formalized by
Kajiya [29], and is given by:

$$
I(x, x') = g(x, x') \left[ \epsilon(x, x') + \int_S \rho(x, x', x'')\, I(x', x'')\, dx'' \right] \qquad (1)
$$

where $I(x, x')$ is the intensity of light traveling from point $x'$ to point $x$, $\epsilon$ the emitted intensity, $\rho$ the scattering term (related to the reflectance of the surfaces), $g$ the geometric term (which includes the binary *visibility* between $x$ and $x'$), and where the integral ranges over all surface points $x''$ of the scene.

Computing global illumination (i.e., solving for intensity in Equation
1) in general environments is a challenging
task. Global illumination may be considered in terms of computing the
interactions between the *lighting signal* and the
*geometric signal* (i.e., the scene). These interactions occur at
various *scales*. This issue belongs to the same class of
problems encountered in geometry processing, described
in the previous section. As a consequence, the *signal
processing* family of approaches is again a well-suited formalism, and
the *multi-scale* approach is a natural choice that
dramatically improves performance. Environments composed of a
large number of primitives, such as highly
tessellated models, exhibit a high variability of these scales.
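Equation 1 can be recast as an integral over the hemisphere of incoming directions at a point. As a concrete (and deliberately simplified) illustration, a one-bounce Monte Carlo estimator of that hemisphere integral can be sketched as follows; the diffuse BRDF and the constant incoming radiance are illustrative assumptions, not a method described here:

```python
import math
import random

def sample_cosine_hemisphere():
    # Cosine-weighted direction on the unit hemisphere (z axis = normal);
    # the associated probability density is cos(theta) / pi.
    u1, u2 = random.random(), random.random()
    r, phi = math.sqrt(u1), 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), math.sqrt(1.0 - u1))

def outgoing_radiance(albedo, incoming_radiance, n_samples=1000):
    """One-bounce Monte Carlo estimate of the hemisphere integral at one
    point, assuming a diffuse BRDF f_r = albedo / pi and no emission.
    `incoming_radiance` is a hypothetical callable giving L_i(direction)."""
    f_r = albedo / math.pi
    total = 0.0
    for _ in range(n_samples):
        d = sample_cosine_hemisphere()
        cos_theta = d[2]                 # d is expressed in the normal's frame
        pdf = cos_theta / math.pi        # density of the sampling scheme
        total += f_r * incoming_radiance(d) * cos_theta / pdf
    return total / n_samples

# Under a constant incoming radiance of 1, the estimate converges to the albedo.
print(outgoing_radiance(0.75, lambda d: 1.0))
```

Note that the cosine-weighted sampling is tuned to the diffuse term only: simulating another effect (caustics, say) would call for a different sampling scheme, which is exactly the fragmentation issue discussed below.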

In addition, these methods are challenged by increasingly complex materials (see Figure 3) that need to be taken into account in the simulation. The simple diffuse Lambert law has been replaced with much more elaborate reflection models. The goal is to create synthetic images that no longer look synthetic, in particular when human characters are depicted.
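To make the contrast concrete, the sketch below evaluates the constant Lambert BRDF next to a simple Blinn-Phong glossy lobe; the vector helpers and parameter values are illustrative assumptions, and the Blinn-Phong model is itself only one small step beyond Lambert, still far from measured materials:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def lambert_brdf(albedo):
    # The diffuse Lambert law: a constant, independent of the directions.
    return lambda w_i, w_o, n: albedo / math.pi

def blinn_phong_brdf(k_s, shininess):
    # A glossy lobe centered on the half vector between w_i and w_o.
    def f(w_i, w_o, n):
        h = normalize(tuple(a + b for a, b in zip(w_i, w_o)))
        norm_factor = (shininess + 2.0) / (2.0 * math.pi)
        return k_s * norm_factor * max(dot(n, h), 0.0) ** shininess
    return f

n = (0.0, 0.0, 1.0)
w_i = normalize((1.0, 0.0, 1.0))   # light direction
w_o = normalize((-1.0, 0.0, 1.0))  # view direction (mirror configuration)
print(lambert_brdf(0.5)(w_i, w_o, n))       # constant, whatever the directions
print(blinn_phong_brdf(0.5, 50)(w_i, w_o, n))  # peaks in this configuration
```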

One of the difficulties is finding efficient ways of evaluating the
visibility term. This is typically a Computational Geometry problem,
i.e., a matter of finding the right combinatorial data structure (the
*visibility complex*), studying its complexity, and deriving
algorithms to construct it. To address this issue, several teams
(including VEGAS, ARTIS and REVES) study the visibility complex.
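Before any visibility-complex machinery, the simplest possible baseline is a brute-force test of the binary visibility term against every occluder; the sketch below uses sphere occluders as an illustrative assumption:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def visible(x, y, spheres):
    """Binary visibility term V(x, y): 1 if the open segment from x to y
    misses every sphere occluder, 0 otherwise. Brute force over occluders;
    a visibility complex is precisely what avoids such O(n) queries."""
    d = sub(y, x)
    length = math.sqrt(dot(d, d))
    d = tuple(c / length for c in d)
    for center, radius in spheres:
        oc = sub(x, center)
        b = dot(oc, d)
        disc = b * b - (dot(oc, oc) - radius * radius)
        if disc > 0.0:
            t = -b - math.sqrt(disc)   # nearest intersection along the ray
            if 1e-6 < t < length - 1e-6:
                return 0
    return 1

occluders = [((0.0, 0.0, 5.0), 1.0)]
print(visible((0.0, 0.0, 0.0), (0.0, 0.0, 10.0), occluders))  # blocked: 0
print(visible((0.0, 3.0, 0.0), (0.0, 3.0, 10.0), occluders))  # clear: 1
```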

The other terms of the Rendering Equation cannot, in general, be solved analytically, and many different numerical resolution methods have been used. The main difficulty of the discipline is that each time a new physical effect must be simulated, the numerical resolution methods need to be adapted; in the worst case, it is even necessary to design a new ad-hoc numerical method. For instance, in Monte Carlo-based solvers and in recent Photon Mapping-based methods, several sampling maps are used, one for each effect (one map for the diffuse part of lighting, another for caustics, etc.). As a consequence, the discipline becomes a collection of (sometimes mutually exclusive) techniques, each of which can only simulate a specific lighting effect.

The other difficulty is the classical problem of satisfying two somewhat conflicting objectives. On the one hand, we want to simulate complex physical phenomena (subsurface scattering, polarization, interference, etc.), responsible for subtle lighting effects. On the other hand, we want to visualize the result of the simulation in real time.

We first experimented with finite-element methods in parameter space, and
developed the *Virtual Mesh* approach together with
a parallel solution mechanism for the associated hierarchical finite
element formulation. The initial method was
dedicated to scenes composed of quadrics. We later combined this method
with our geometry processing methods to improve the visualization [2].

One of our goals is now to design new representations of lighting coupled with the geometric representation. These representations need to be general enough to be easily extended when multiple physical phenomena must be simulated. Moreover, we want to be able to use them in the context of real-time visualization. Our original approach to these problems consists in finding efficient function bases to represent the geometry and the physical attributes of the objects. We first experimented with this approach on the problem of image vectorization [3]. We believe that our dynamic function basis formulation is likely to lead to efficient light simulation algorithms. Its originality is that the resulting optimization algorithm solves for approximation and sampling simultaneously. Developing such an algorithm is the main goal of our ERC GoodShape project.
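As a toy illustration of the function-basis idea (far simpler than the dynamic bases pursued in GoodShape), the sketch below projects a 1-D signal onto the first few functions of an orthonormal cosine basis; the basis choice and sample counts are illustrative assumptions:

```python
import math

def cosine_basis(k):
    # Orthonormal cosine basis on [0, 1]; k = 0 is the constant function.
    if k == 0:
        return lambda x: 1.0
    return lambda x: math.sqrt(2.0) * math.cos(math.pi * k * x)

def project(signal, n_terms, n_samples=2000):
    """Least-squares coefficients of `signal` in the first `n_terms` basis
    functions, via midpoint-rule inner products. Here the sample points are
    fixed in advance; the point of a dynamic formulation is to optimize
    approximation and sampling together instead."""
    xs = [(i + 0.5) / n_samples for i in range(n_samples)]
    return [sum(cosine_basis(k)(x) * signal(x) for x in xs) / n_samples
            for k in range(n_terms)]

def reconstruct(coeffs, x):
    return sum(c * cosine_basis(k)(x) for k, c in enumerate(coeffs))

signal = lambda x: 3.0 + math.sqrt(2.0) * math.cos(math.pi * x)
coeffs = project(signal, 3)
print(coeffs)  # close to [3, 1, 0]: the signal lies in the first two functions
print(abs(reconstruct(coeffs, 0.3) - signal(0.3)))  # small residual
```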