## Section: New Results

### Probabilistic Analysis of Geometric Data Structures and Algorithms

Participants : Olivier Devillers, Charles Duménil, Xavier Goaoc, Fernand Kuiebove Pefireko, Ji Won Park.

#### Expected Complexity of Routing in $\Theta_6$ and Half-$\Theta_6$ Graphs

We study online routing algorithms on the $\Theta_6$-graph and the half-$\Theta_6$-graph (which is equivalent to a variant of the Delaunay triangulation). Given a source vertex $s$ and a target vertex $t$ in the $\Theta_6$-graph (resp. half-$\Theta_6$-graph), there exists a deterministic online routing algorithm that finds a path from $s$ to $t$ whose length is at most $2|st|$ (resp. $2.89|st|$), which is optimal in the worst case [Bose et al., SIAM J. on Computing, 44(6)]. We propose alternative, slightly simpler routing algorithms that are optimal in the worst case and for which we analyze the average routing ratio when the $\Theta_6$-graph and half-$\Theta_6$-graph are defined on a Poisson point process. For the $\Theta_6$-graph, our online routing algorithm has an expected routing ratio of 1.161 (when $s$ and $t$ are both random) and a maximum expected routing ratio of 1.22 (maximum over fixed $s$ and $t$, with all other points random), much better than the worst-case routing ratio of 2. For the half-$\Theta_6$-graph, our memoryless online routing algorithm has an expected routing ratio of 1.43 and a maximum expected routing ratio of 1.58, while our online routing algorithm that uses a constant amount of additional memory has an expected routing ratio of 1.34 and a maximum expected routing ratio of 1.40. The additional memory is used only to remember the coordinates of the starting point of the route. Both algorithms have an expected routing ratio that is much better than their worst-case routing ratio of 2.89 [27].
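As a rough illustration (a sketch of greedy cone routing in the spirit of $\Theta_6$-graphs, not the specific algorithms analyzed above), each vertex keeps, in each of its six $60^\circ$ cones, the neighbor whose projection on the cone bisector is nearest, and a packet is always forwarded to the neighbor in the cone containing the target. All function names below are ours:

```python
import math

def cone_index(u, v):
    # Index in {0,...,5} of the 60-degree cone with apex u that contains v.
    ang = math.atan2(v[1] - u[1], v[0] - u[0]) % (2 * math.pi)
    return int(ang // (math.pi / 3))

def theta6_neighbors(pts, u):
    # Standard Theta-graph rule: in each cone, keep the point whose
    # projection onto the cone bisector is closest to u.
    best = {}
    for v in pts:
        if v == u:
            continue
        c = cone_index(u, v)
        bisector = (c + 0.5) * (math.pi / 3)
        ang = math.atan2(v[1] - u[1], v[0] - u[0])
        proj = math.dist(u, v) * math.cos(ang - bisector)
        if c not in best or proj < best[c][0]:
            best[c] = (proj, v)
    return {c: v for c, (_, v) in best.items()}

def cone_route(pts, s, t, max_steps=10000):
    # Greedy cone routing: forward to the neighbor lying in the cone
    # of the current vertex that contains the target t.
    path, u = [s], s
    while u != t and len(path) <= max_steps:
        u = theta6_neighbors(pts, u)[cone_index(u, t)]
        path.append(u)
    return path
```

The routing ratio of a path found this way is its length divided by $|st|$; averaging that quantity over many Poisson samples is how the expected ratios above can be explored experimentally.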

*In collaboration with Prosenjit Bose (Carleton University) and Jean-Lou De Carufel (University of Ottawa).*

#### A Poisson sample of a smooth surface is a good sample

The complexity of the 3D Delaunay triangulation (tetrahedralization) of $n$ points distributed on a surface ranges from linear to quadratic. When the points form a deterministic good sample of a smooth, compact, generic surface, the size of the Delaunay triangulation is $O(n\log n)$. Using this result, we prove that when the points are drawn from a Poisson process of expected size $\lambda$ on a surface satisfying the same hypotheses, the expected size of the Delaunay triangulation is $O(\lambda\log_2\lambda)$ [22].
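For intuition, a Poisson sample of expected size $\lambda$ on a surface can be generated in two steps: draw $N \sim \mathrm{Poisson}(\lambda)$, then place $N$ independent uniform points on the surface. A minimal sketch for the unit sphere (an example smooth surface; the helper names are ours):

```python
import math
import random

def poisson_variate(lam):
    # Knuth's multiplication method; adequate for moderate lam.
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= random.random()
        if p <= threshold:
            return k - 1

def poisson_sphere_sample(lam):
    # Poisson point process of total intensity lam on the unit sphere:
    # a Poisson number of independent uniform points, each obtained by
    # normalizing a 3D Gaussian vector.
    pts = []
    for _ in range(poisson_variate(lam)):
        x, y, z = random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1)
        r = math.sqrt(x * x + y * y + z * z)
        pts.append((x / r, y / r, z / r))
    return pts
```

Feeding such samples to a 3D Delaunay code and plotting triangulation size against $\lambda$ is one way to observe the stated expected-size behavior empirically.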

#### On Order Types of Random Point Sets

Let $P$ be a set of $n$ random points chosen uniformly in the unit square. We examine the typical resolution of the order type of $P$. First, we show that with high probability, $P$ can be rounded to the grid of step $\frac{1}{n^{3+\epsilon}}$ without changing its order type. Second, we study algorithms for determining the order type of a point set in terms of the number of coordinate bits they need to read. We give an algorithm that requires on average $4n\log_2 n+O(n)$ bits to determine the order type of $P$, and show that any algorithm requires at least $4n\log_2 n-O(n\log\log n)$ bits. Both results extend to more general models of random point sets [29].
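The objects involved can be made concrete in a few lines: the order type records the orientation of every triple of points, and rounding to a grid of step $w$ snaps each coordinate to the nearest multiple of $w$. A sketch with hypothetical names, not the paper's algorithm:

```python
from itertools import combinations

def orient(p, q, r):
    # Sign of det(q - p, r - p): +1 counterclockwise, -1 clockwise, 0 collinear.
    d = (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
    return (d > 0) - (d < 0)

def order_type(pts):
    # Orientations of all triples, taken in a fixed canonical order.
    return tuple(orient(p, q, r) for p, q, r in combinations(pts, 3))

def snap_to_grid(pts, step):
    # Round each coordinate to the nearest multiple of `step`.
    return [(round(x / step) * step, round(y / step) * step) for x, y in pts]
```

In this vocabulary, the first result says that for uniform random $P$ in the unit square, `order_type(snap_to_grid(P, step))` equals `order_type(P)` with high probability once `step` is as fine as $1/n^{3+\epsilon}$.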

*In collaboration with Philippe Duchon (Université de Bordeaux) and Marc Glisse (project team Datashape).*

#### Randomized incremental construction of Delaunay triangulations of nice point sets

Randomized incremental construction (RIC) is one of the most important paradigms for building geometric data structures. Clarkson and Shor developed a general theory that led to numerous algorithms that are both simple and efficient, in theory and in practice. Randomized incremental constructions are usually space and time optimal in the worst case, as exemplified by the construction of convex hulls, Delaunay triangulations and arrangements of line segments. However, the worst-case scenario rarely occurs in practice, and we would like to understand how RIC behaves when the input is nice, in the sense that the associated output is significantly smaller than in the worst case. For example, it is known that the Delaunay triangulation of nicely distributed points on a polyhedral surface in ${\mathbb{E}}^{3}$ has linear complexity, as opposed to a worst-case quadratic complexity. The standard analysis does not provide accurate bounds on the complexity of such cases, and we aim at establishing such bounds. More precisely, we show that, in the case of nicely distributed points on polyhedral surfaces, the complexity of the usual RIC is $O(n\log n)$, which is optimal. In other words, without any modification, RIC nicely adapts to good cases of practical value. Our proofs also work for some other notions of nicely distributed point sets, such as $(\epsilon,\kappa)$-samples. Along the way, we prove a probabilistic lemma for sampling without replacement, which may be of independent interest [16], [26].
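The probabilistic engine behind such bounds is backward analysis: the expected cost of the $i$-th insertion is estimated by deleting a random object from the first $i$. A classic toy instance of that argument (ours, not from the paper) counts how often the running minimum changes along a random permutation; the $i$-th value scanned is a new minimum exactly when it is the smallest of the first $i$, which happens with probability $1/i$, so the expectation is $H_n=\sum_{k=1}^{n}1/k=O(\log n)$:

```python
import random

def expected_record_changes(n, trials=2000):
    # Empirical mean number of times the running minimum changes when
    # n distinct values are scanned in a uniformly random order.
    total = 0
    for _ in range(trials):
        best = float('inf')
        for x in random.sample(range(n), n):
            if x < best:
                best = x
                total += 1
    return total / trials
```

The Clarkson–Shor theory applies this same backward reasoning to structural changes in geometric constructions, which is where the input distribution enters the refined bounds above.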

*In collaboration with Jean-Daniel Boissonnat, Kunal Dutta and Marc Glisse (project team Datashape).*

#### Random polytopes and the wet part for arbitrary probability distributions

We examine how the measure and the number of vertices of the convex hull of a random sample of $n$ points from an arbitrary probability measure in ${\mathbb{R}}^{d}$ relate to the wet part of that measure. This extends classical results for the uniform distribution on a convex set [Bárány and Larman 1988]. The lower bound of Bárány and Larman continues to hold in the general setting, but the upper bound must be relaxed by a factor of $\log n$. We show by an example that this is tight [25].
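In the classical uniform setting, the expected number of hull vertices for a square is $\Theta(\log n)$ (Rényi–Sulanke), which a quick Monte Carlo makes visible. The hull code below is a standard monotone-chain sketch, not tied to the paper:

```python
import random

def convex_hull(pts):
    # Andrew's monotone chain; returns hull vertices in ccw order.
    pts = sorted(set(pts))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    def chain(points):
        h = []
        for p in points:
            while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h
    lower, upper = chain(pts), chain(pts[::-1])
    return lower[:-1] + upper[:-1]

def mean_hull_size(n, trials=30):
    # Average number of hull vertices of n uniform points in the unit square.
    return sum(len(convex_hull([(random.random(), random.random())
                                for _ in range(n)]))
               for _ in range(trials)) / trials
```

For the square the expectation grows like $\frac{8}{3}\ln n$ up to an additive constant, so `mean_hull_size(2000)` should hover around twenty; the result above describes how such counts behave for arbitrary measures via the wet part.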

*In collaboration with Imre Bárány (Rényi Institute of Mathematics), Matthieu Fradelizi (Laboratoire d'Analyse et de Mathématiques Appliquées), Alfredo Hubard (Laboratoire d'Informatique Gaspard-Monge) and Günter Rote (Institut für Informatik, Berlin).*