Section: Scientific Foundations
3D User Interfaces
The Iparla project aims at improving the development of 3D interactive applications for the mobile user. Consequently, as we have seen above, an essential part of this project consists in adapting the classical 3D graphics pipeline to the mobile context. However, we believe that the development of modeling and rendering techniques cannot be separated from the development of adapted user interfaces. Indeed, the value of mobile applications in which complex data can be visualized in real time is limited if interacting with that data is difficult.
We believe that human factors have to be taken into account in the early stages of development. Indeed, the choice of user interface can influence which modeling and rendering techniques should be used. For example, an object-oriented construction of the scene is preferable when the main user task of an application is to select individual objects. In the Iparla project, we want to control the entire process, from the creation of the 3D environments to the interaction with these environments. All the components of this process have to be strongly linked and should not be considered independently.
When dealing with mobile devices, the classical user interfaces developed for desktop workstations are not the most appropriate. For example, the lack of a keyboard has led to the development of intuitive writing interfaces. Classical side menus cannot be used to control the application without occluding a large part of the screen and, consequently, a large part of the data being visualized. Last but not least, the lack of a pointing device on cell phones makes manipulating the data very difficult. In the Iparla project, we develop interaction techniques that are adapted to the user, to the task, and to the characteristics of mobile devices, for efficient interaction with 3D datasets.
For the success of mobile applications, the efficiency of interaction techniques is essential. Building on previous work in virtual reality (VR) and general human-computer interaction (HCI), we investigate mobile HCI techniques. In particular, our work is based on the following foundations:
Collaboration. In many cases, the user does not interact alone. Consequently, the issues raised by collaborative work are taken into account.
Bi-manual interaction. It has been shown that using both hands can be more efficient than using a single hand.
Multi-degree-of-freedom (dof) interaction. It is necessary to adapt the structure of the interface to the structure of the task. Consequently, interaction with 3D data generally requires interfaces with more than two dof.
Gesture recognition. Non-intrusive and easy-to-learn interaction can be obtained from natural gesture recognition.
Video-based interaction. Modern mobile devices are equipped with embedded cameras, and analysis of the video stream can be used as input for interaction techniques; a minimal sketch of such an approach is given after this list.
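As an illustration of this last point, the sketch below estimates the dominant 2D motion of the embedded camera from consecutive video frames and turns it into a panning input, in the spirit of video-based interaction. This is only a minimal sketch, not the project's implementation: it assumes OpenCV and NumPy are available, and the camera index, the Farneback flow parameters, and the way the resulting offsets would drive the application are illustrative choices.

    import cv2
    import numpy as np

    def camera_motion(prev_gray, gray):
        # Dense optical flow (Farneback) between two consecutive grayscale frames.
        flow = cv2.calcOpticalFlowFarneback(
            prev_gray, gray, None,
            pyr_scale=0.5, levels=3, winsize=15,
            iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
        # The median of the flow field is a robust estimate of the global motion,
        # i.e. of how the user moved the device in front of the scene.
        return float(np.median(flow[..., 0])), float(np.median(flow[..., 1]))

    cap = cv2.VideoCapture(0)            # embedded camera (index is illustrative)
    ok, frame = cap.read()
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pan_x, pan_y = 0.0, 0.0              # virtual 2-dof input derived from the video

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        dx, dy = camera_motion(prev_gray, gray)
        # Moving the device to the right makes the scene appear to move left,
        # so the flow is inverted to obtain an intuitive panning direction.
        pan_x -= dx
        pan_y -= dy
        prev_gray = gray
        # pan_x and pan_y would then drive, e.g., the 3D camera of the application.

In a real mobile application the flow estimation would run on the device's own video pipeline, but the principle of deriving continuous 2-dof input from camera motion remains the same.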
Interaction techniques are designed with both the user and the task in mind. They are evaluated through controlled experiments, so that user performance can be measured both qualitatively and quantitatively, indicating whether a new technique is more or less efficient than another, as sketched below.
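As an illustration of such a quantitative comparison, the following sketch applies a paired t-test to task-completion times measured with a baseline technique and a new technique. The numbers are invented purely for the example and the significance threshold is an assumption; none of this is project data.

    import numpy as np
    from scipy import stats

    # Hypothetical completion times (seconds) for the same 3D selection task:
    # each participant performed the task with both techniques.
    baseline = np.array([12.4, 10.8, 14.1, 11.9, 13.3, 12.7, 15.0, 11.2])
    new_tech = np.array([9.8, 10.1, 11.4, 9.2, 10.7, 11.0, 12.3, 9.5])

    # Paired t-test, since the same participants used both techniques.
    t_stat, p_value = stats.ttest_rel(baseline, new_tech)

    print(f"baseline: {baseline.mean():.1f} s on average, "
          f"new technique: {new_tech.mean():.1f} s on average, p = {p_value:.3f}")
    # A small p-value (e.g. below 0.05) suggests that the observed speed-up
    # is unlikely to be due to chance alone.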