Section: New Results
3D user interfaces
ScrutiCam: Camera Manipulation Technique for 3D Objects Inspection
Participants: Fabrice Decle, Martin Hachet, Pascal Guitton.
Inspecting a 3D object is a common task in 3D applications. However, the corresponding camera movements are not trivial, and standard tools do not provide a single, efficient technique for performing them. ScrutiCam is a new 3D camera manipulation technique. It is based on the “click-and-drag” mouse move: the user “drags” the point of interest on the screen to perform different camera movements such as zooming, panning, and rotating around a model (Figure 13). ScrutiCam can keep the camera aligned with the surface of the model in order to keep the area of interest visible. ScrutiCam also builds on the Point-Of-Interest (POI) approach, where the final camera position is specified by clicking on the screen. Contrary to other POI techniques, ScrutiCam allows the user to control the animation of the camera along the trajectory. It is also inspired by the “Trackball” technique, where the virtual camera moves along the bounding sphere of the model. However, ScrutiCam's camera stays close to the surface of the model, whatever its shape. It can be used with mice as well as with touch screens, as it only needs a 2D input and a single button.
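A POI camera move of this kind can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the camera is placed along the surface normal at the clicked point, at a fixed viewing distance, and that the user drives the interpolation along the trajectory.

```python
import math

def poi_camera_pose(point, normal, distance):
    """Hypothetical ScrutiCam-style point-of-interest target pose: the
    camera sits along the surface normal at the clicked point, looking
    back at it, so the area of interest stays visible and facing the view."""
    norm = math.sqrt(sum(c * c for c in normal))
    n = tuple(c / norm for c in normal)
    eye = tuple(p + distance * c for p, c in zip(point, n))
    look_dir = tuple(-c for c in n)  # camera looks back at the point
    return eye, look_dir

def animate(eye_start, eye_end, t):
    """User-controlled interpolation along the camera trajectory,
    with t in [0, 1] (ScrutiCam lets the user drive this animation)."""
    return tuple((1 - t) * a + t * b for a, b in zip(eye_start, eye_end))
```

Because the target pose is recomputed from the local surface normal, the camera stays close to the model's surface whatever its shape, rather than following a fixed bounding sphere.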
Multitouch 3D interaction
Participants: Sebastian Knoedel, Fabrice Decle, Martin Hachet.
Mobile interfaces are evolving towards touch-based approaches, which allow users to interact with their thumbs directly on the screen. While multitouch systems have emerged as powerful interfaces for interacting with 2D content, few works have explored their potential for 3D applications. We introduce new 3D user interfaces that benefit from multitouch input for interaction with 3D data.
In the context of 3D camera control, we proposed a new trackball-based interface where users sketch horizontal or vertical movements to observe an object (Figure 14). A user study revealed no significant difference in error rate between this new approach and a standard trackball control. Although the direct control yielded better completion times, the subjects preferred the planned version of the trackball because it limits disorientation.
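The sketch-based control can be illustrated with a small classifier. This is an assumed simplification, not the published technique: a 2D stroke is classified as horizontal or vertical and mapped to a rotation about the corresponding trackball axis.

```python
def stroke_to_rotation(dx, dy, sensitivity=0.01):
    """Classify a 2D stroke (dx, dy in pixels) as horizontal or vertical
    and map it to a rotation angle (radians) about the matching axis.
    Assumed mapping for illustration: horizontal strokes yaw around the
    vertical axis, vertical strokes pitch around the horizontal axis."""
    if abs(dx) >= abs(dy):
        return "y", dx * sensitivity  # horizontal stroke: rotate around vertical axis
    return "x", dy * sensitivity      # vertical stroke: rotate around horizontal axis
```

Constraining each sketched stroke to a single rotation axis is what makes the resulting camera motion predictable, which is consistent with the reduced disorientation reported in the study.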
Then, in the context of 3D data manipulation, we proposed an interface based on mapping finger movements to corresponding movements on a 3D-oriented virtual plane (Figure 15). This interface can be used both with a direct touch approach, where users directly touch the screen, and with a distant approach, where the visualization and interaction surfaces are separated. To illustrate our 3D multitouch approach, we developed a demo application that allows novice users to arrange 3D objects such as furniture in a very simple way.
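Mapping a finger position onto an oriented virtual plane amounts to a ray–plane intersection. The sketch below is a generic version of that computation, assuming the touch point has already been unprojected into a pick ray; it is not the authors' exact formulation.

```python
def touch_to_plane(ray_origin, ray_dir, plane_point, plane_normal):
    """Intersect the pick ray under a finger with an oriented virtual
    plane; the returned 3D point drives the manipulated object."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(ray_dir, plane_normal)
    if abs(denom) < 1e-9:
        return None  # ray parallel to the plane: no mapping possible
    diff = tuple(p - o for p, o in zip(plane_point, ray_origin))
    t = dot(diff, plane_normal) / denom
    if t < 0:
        return None  # plane lies behind the viewpoint
    return tuple(o + t * d for o, d in zip(ray_origin, ray_dir))
```

The same routine works for both the direct and the distant setup: only the way the 2D touch position is turned into a pick ray differs between the two.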
Remote and Collaborative 3D Interactions
Participants: Mariam Amyra, Martin Hachet, Sebastian Knoedel, Fabrice Decle.
We explored different approaches to enhance collaborative interaction with 3D environments (Figure 16). We proposed solutions allowing mobile users to communicate with distant colleagues, and we also studied co-located multi-user interaction in front of large screens. Among our propositions, we adapted Navidget for remote interaction. These new approaches were developed within the context of the ANR project Part@ge. We demonstrated our technology during the Paris Air Show (Le Bourget) in June 2009.
Interactive 3D environments for music
Participants: Martin Hachet, Arash Kian, Florent Berthaut, Jean-Sébastien Franco, Myriam Desainte-Catherine.
We investigate the use of 3D environments for musical performance. In particular, we have designed 3D audiovisual objects, which we call 3D reactive widgets (Figure 17, left). These elements allow us to manipulate and visualize sound processes, as their graphical parameters are mapped to sound and musical parameters. In order to provide the accuracy and expressivity required by musical performance, we have also built a new input device and defined several interaction techniques to manipulate these reactive widgets; some of these techniques have been described in previous work. We now explore the possibilities of immersive environments for creating and navigating musical structures.
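The core idea of a reactive widget is a bidirectional mapping between graphical and sound parameters. The sketch below illustrates one such mapping; the specific choices (size to amplitude, hue to pitch) are assumptions for illustration, not the mapping used in the project.

```python
class ReactiveWidget:
    """Hypothetical 3D reactive widget: manipulating its graphical
    parameters both edits and visualizes an associated sound process."""

    def __init__(self):
        self.scale = 1.0  # graphical size of the widget
        self.hue = 0.0    # color hue, normalized to [0, 1]

    def sound_parameters(self):
        """Assumed example mapping: size drives amplitude, hue sweeps
        pitch across one octave above A4 (440 Hz)."""
        amplitude = min(self.scale, 1.0)
        pitch_hz = 440.0 * (2.0 ** self.hue)
        return {"amplitude": amplitude, "pitch_hz": pitch_hz}
```

Because the mapping is attached to the widget itself, the performer sees the state of the sound process directly in the widget's appearance, which is what distinguishes this approach from a conventional control panel.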
In parallel, we noticed that while mixed reality has inspired the development of many new musical instruments, few approaches explore the potential of mobile setups. We studied a new musical interaction concept, called opportunistic music (Figure 17, right). It allows musicians to recreate a hardware musical controller using any objects in their immediate environment, benefiting from the physical attributes of real objects for controlling music. Our prototype is based on a stereo-vision tracking system combined with FSR sensors, and it allows musicians to define and interact with opportunistic tangible widgets. Linking these widgets with sound processes allows the interactive creation of musical pieces, where musicians draw inspiration from their surrounding environment.
Interactive Generation and Modification of Cutaway Illustrations for Polygonal Models
Participants: Sebastian Knoedel, Martin Hachet.
We present a system for creating appealing illustrative cutaway renderings. It is based on simple sketch-based interfaces and stylized rendering techniques for the study of elaborate 3D models (Figure 18). As interactive visualization technology has found its way to the general public, there is a demand for novel interaction techniques that allow easy exploration of the displayed illustrations. Hence, our system lets users create individual cutaway views to focus on hidden objects, while important contextual information is emphasized by illustrative rendering techniques.
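At its simplest, a cutaway view culls the parts of the occluding geometry that fall inside a user-defined cut volume. The sketch below is a deliberately simplified, assumed illustration (an axis-aligned box swept from a 2D sketch), not the system's actual rendering pipeline.

```python
def inside_cut_box(p, box_min, box_max):
    """True if point p lies inside the axis-aligned cut volume."""
    return all(lo <= c <= hi for c, lo, hi in zip(p, box_min, box_max))

def cutaway(points, box_min, box_max):
    """Cull surface samples inside the cut volume, revealing the
    hidden objects behind them."""
    return [p for p in points if not inside_cut_box(p, box_min, box_max)]
```

In a real renderer this test would typically run per fragment on the GPU, and the cut boundary would be stylized (outlines, hatching) to keep the contextual information legible.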
Semi-automatic reassembly for fractured archeological objects
Participants: Nicolas Mellado, Patrick Reuter, Christophe Schlick.
We developed a semi-automatic reassembly system that assists an expert user in reassembling broken fragments. This approach combines the long-standing working experience of cultural heritage professionals with the precision of geometric modeling algorithms. The user positions pairs of fragments as well as he or she can, using a tangible user interface, and the reassembly system finds the best geometric fit in the local neighborhood over translations and rotations. This is done in real time thanks to a new speed-up technique for the ICP algorithm: an efficient intersection algorithm for the two bounding-sphere hierarchies associated with the two fragments. The geometry-based matching can be tuned by taking into account not only the positions of the fragments' vertices, but also their normals, weighted by a user-defined value.
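The hierarchy intersection can be sketched as a recursive descent that collects only the overlapping leaf regions of the two fragments; only those regions need to be fed to the ICP correspondence search. This is a generic sketch under assumed data structures, not the system's implementation.

```python
import math

class SphereNode:
    """Node of a bounding-sphere hierarchy over a fragment's surface."""
    def __init__(self, center, radius, children=None, points=None):
        self.center, self.radius = center, radius
        self.children = children or []
        self.points = points or []  # leaf payload: surface samples

def spheres_overlap(a, b):
    """Two spheres intersect iff their centers are closer than the
    sum of their radii."""
    return math.dist(a.center, b.center) <= a.radius + b.radius

def overlapping_pairs(a, b, out):
    """Recursively collect leaf pairs whose bounding spheres intersect,
    pruning the rest of both hierarchies."""
    if not spheres_overlap(a, b):
        return  # whole subtrees pruned: this is where the speed-up comes from
    if not a.children and not b.children:
        out.append((a, b))
    elif a.children and (not b.children or a.radius >= b.radius):
        for child in a.children:
            overlapping_pairs(child, b, out)
    else:
        for child in b.children:
            overlapping_pairs(a, child, out)
```

Since the user has already brought the fragments roughly into place, the overlap region is small, so the pruning discards most of both hierarchies and the local ICP refinement can run in real time.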