Team geometrica

Section: New Results

Topological and Geometric Inference

Triangulating Smooth Submanifolds with Light Scaffolding

Participants : Jean-Daniel Boissonnat, Arijit Ghosh.

We propose an algorithm to sample and mesh a k-submanifold $\mathbb{M}$ of positive reach embedded in $\mathbb{R}^d$. The algorithm first constructs a crude sample of $\mathbb{M}$ using a brute-force method. It then refines the sample according to a prescribed parameter $\epsilon$ and builds a mesh that approximates $\mathbb{M}$. Unlike most algorithms that have been developed for meshing surfaces in $\mathbb{R}^3$, the refinement phase does not rely on a subdivision of $\mathbb{R}^d$ (such as a grid or a triangulation of the sample points), since the size of such scaffoldings depends exponentially on the ambient dimension d. Instead, we only compute local stars consisting of k-dimensional simplices around each sample point. By refining the sample, we can ensure that all stars become coherent, leading to a k-dimensional triangulated manifold $\hat{\mathbb{M}}$. The algorithm uses only simple numerical operations. We show that the size of the sample is $O(\epsilon^{-k})$ and that $\hat{\mathbb{M}}$ is a good triangulation of $\mathbb{M}$. More specifically, we show that $\mathbb{M}$ and $\hat{\mathbb{M}}$ are isotopic, that their Hausdorff distance is $O(\epsilon^2)$, and that the maximum angle between their tangent bundles is $O(\epsilon)$. The asymptotic complexity of the algorithm is $T(\epsilon) = O(\epsilon^{-k^2-k})$ (for fixed $\mathbb{M}$, d, and k).
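
The star construction and coherence machinery are specific to the algorithm, but the initial sampling step can be illustrated in isolation. The following Python sketch (our own illustration, not the reported implementation) builds an $\epsilon$-sample of a densely sampled submanifold by greedy farthest-point insertion; the resulting sample exhibits the $O(\epsilon^{-k})$ size behaviour stated above, here for a circle (k = 1) in the plane.

    import numpy as np

    def farthest_point_sample(points, eps):
        """Greedy epsilon-net: repeatedly insert the point farthest from
        the current sample until every input point lies within eps of
        some sample point."""
        sample_idx = [0]
        dist = np.linalg.norm(points - points[0], axis=1)
        while dist.max() > eps:
            i = int(dist.argmax())
            sample_idx.append(i)
            dist = np.minimum(dist, np.linalg.norm(points - points[i], axis=1))
        return points[sample_idx]

    # A dense stand-in for the submanifold: a circle (k = 1) in R^2.
    t = np.linspace(0, 2 * np.pi, 10000, endpoint=False)
    dense = np.column_stack([np.cos(t), np.sin(t)])
    sample = farthest_point_sample(dense, eps=0.2)
    print(len(sample))  # roughly 2*pi/eps points, i.e. O(eps^{-1}) for k = 1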

Topological Inference via Meshing

Participant : Steve Oudot.

In collaboration with Benoît Hudson (TTI), Gary Miller and Donald Sheehy (CMU).

We apply ideas from mesh generation to improve the time and space complexities of computing the full persistent homological information associated with a point cloud P in Euclidean space $\mathbb{R}^d$. Classical approaches rely on the Čech, Rips, $\alpha$-complex, or witness complex filtrations of P, whose complexities scale up very badly with d. For instance, the $\alpha$-complex filtration incurs the $n^{\Omega(d)}$ size of the Delaunay triangulation, where n is the size of P. The common alternative is to truncate the filtrations when the sizes of the complexes become prohibitive, possibly before discovering the most relevant topological features. In this work we propose a new collection of filtrations, based on the Delaunay triangulation of a carefully chosen superset of P, whose sizes are reduced to $2^{O(d^2)}n$. A nice property of these filtrations is that they are multiplicatively interleaved with the family of offsets of P, so that the persistence diagram of P can be approximated in $2^{O(d^2)}n^3$ time in theory, with a near-linear observed running time in practice. Thus, our approach remains tractable in medium dimensions, say 4 to 10 [31].
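
For context, the classical baseline that the new filtrations improve on can be reproduced with an off-the-shelf persistence package. The sketch below is an illustration under the assumption that the GUDHI Python library is available; it is not the implementation of [31]. It computes the persistence diagram of a planar point cloud through the $\alpha$-complex filtration, whose size is governed by the Delaunay triangulation as discussed above.

    import numpy as np
    import gudhi

    # A noisy sample of a circle: one prominent 1-dimensional feature.
    rng = np.random.default_rng(0)
    theta = rng.uniform(0, 2 * np.pi, 200)
    P = (np.column_stack([np.cos(theta), np.sin(theta)])
         + rng.normal(0, 0.05, (200, 2)))

    # Alpha-complex filtration: its size is that of the Delaunay
    # triangulation of P (n^{Omega(d)} in general, tame in the plane).
    st = gudhi.AlphaComplex(points=P).create_simplex_tree()
    diagram = st.persistence()  # (dimension, (birth, death)) pairs;
                                # GUDHI reports squared radii here.
    # One interval in dimension 1 is far longer than the rest: the circle.
    print([p for p in diagram if p[0] == 1][:3])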

Persistence-based Segmentation of Deformable Shapes

Participants : Frédéric Chazal, Primoz Skraba.

In collaboration with Maks Ovsjanikov and Leo Guibas (Stanford).

We combine two ideas, persistence-based clustering and the Heat Kernel Signature (HKS) function, to obtain a multi-scale, isometry-invariant mesh segmentation algorithm. The key advantages of this approach are that it is tunable through a few intuitive parameters and that it is stable under near-isometric deformations. Indeed, the method comes with feedback on the stability of the number of segments in the form of a persistence diagram. There are also spatial guarantees on parts of the segments. Finally, we present an extension of the method which first detects regions that are inherently unstable and segments them separately. Both approaches are reasonably scalable and come with strong guarantees. We show numerous examples and a comparison with the segmentation benchmark and the curvature function [33].
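
The first of the two combined ingredients, persistence-based clustering, admits a compact illustration. The sketch below is our own simplified rendering, not the reported implementation; in the actual method the function values would be HKS values on a mesh. It sweeps the vertices of a neighborhood graph by decreasing function value, grows clusters from local maxima, and merges any cluster whose persistence falls below a threshold tau into its older neighbor.

    import numpy as np

    def persistence_clustering(values, neighbors, tau):
        """0-dimensional persistence-based clustering: process vertices
        by decreasing function value; a vertex with no processed neighbor
        births a cluster; at a merge point, a cluster whose persistence
        (peak value minus merge level) is below tau is absorbed."""
        order = np.argsort(-values)
        parent = {}  # union-find forest
        peak = {}    # root -> function value at the cluster's peak

        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]
                v = parent[v]
            return v

        for v in order:
            nbr_roots = {find(u) for u in neighbors[v] if u in parent}
            if not nbr_roots:              # local maximum: a cluster is born
                parent[v] = v
                peak[v] = values[v]
                continue
            roots = sorted(nbr_roots, key=lambda r: -peak[r])
            parent[v] = roots[0]
            for r in roots[1:]:            # candidate merges at level values[v]
                if peak[r] - values[v] < tau:
                    parent[r] = roots[0]   # low persistence: absorb into elder
        return {v: find(v) for v in parent}

    # Two bumps on a line graph: both persist well above tau = 0.2.
    x = np.linspace(0, 1, 200)
    f = np.exp(-((x - 0.3) / 0.08) ** 2) + 0.8 * np.exp(-((x - 0.7) / 0.08) ** 2)
    nbrs = {i: [j for j in (i - 1, i + 1) if 0 <= j < 200] for i in range(200)}
    labels = persistence_clustering(f, nbrs, tau=0.2)
    print(len(set(labels.values())))  # -> 2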

Geometric Inference for Measures based on Distance Functions

Participants : Frédéric Chazal, David Cohen-Steiner.

In collaboration with Quentin Mérigot (Stanford).

Data often comes in the form of a point cloud sampled from an unknown compact subset of Euclidean space. The general goal of geometric inference is then to recover geometric and topological features (Betti numbers, curvatures, ...) of this subset from the approximating point cloud data. In recent years, it has become apparent that the study of distance functions makes it possible to address many of these questions successfully. However, one of the main limitations of this framework is that it does not cope well with outliers or with background noise. In this paper [44], we show how to extend the framework of distance functions to overcome this problem. Replacing compact subsets by measures, we introduce a notion of distance function to a probability distribution in $\mathbb{R}^n$. These functions share many properties with classical distance functions, which makes them suitable for inference purposes. In particular, by considering appropriate level sets of these distance functions, it is possible to associate topological and geometric features to a probability measure in a robust way (see Figure 6). We also discuss connections between our approach and nonparametric density estimation, as well as mean-shift clustering.

Figure 6. On the left, a point cloud sampled on a mechanical part to which 10% of outliers (uniformly sampled in a box enclosing the model) have been added. On the right, the reconstruction of an isosurface of the distance function to the uniform probability measure on this point cloud.
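
For the uniform empirical measure on a point cloud of size n with mass parameter m = k/n, the squared distance to the measure at a query point reduces to the average of the squared distances to the k nearest sample points; k = 1 recovers the usual, outlier-sensitive distance to the point cloud. The sketch below (function names are our own) implements this formula and reproduces, in miniature, the robustness to outliers visible in Figure 6.

    import numpy as np
    from scipy.spatial import cKDTree

    def distance_to_measure(points, queries, m=0.05):
        """Empirical distance to the uniform measure on `points`:
        d(x)^2 averages the squared distances from x to its k nearest
        sample points, with k = ceil(m * n)."""
        n = len(points)
        k = max(1, int(np.ceil(m * n)))
        d, _ = cKDTree(points).query(queries, k=k)
        d = np.asarray(d).reshape(len(queries), k)
        return np.sqrt((d ** 2).mean(axis=1))

    # A circle with 10% outliers, echoing the setting of Figure 6.
    rng = np.random.default_rng(1)
    t = rng.uniform(0, 2 * np.pi, 300)
    cloud = np.column_stack([np.cos(t), np.sin(t)])
    data = np.vstack([cloud, rng.uniform(-1.5, 1.5, (30, 2))])
    on, off = distance_to_measure(data, [[1.0, 0.0], [0.0, 0.0]])
    print(on < off)  # True: the function stays small on the shape itself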

Zigzag Persistent Homology in Matrix Multiplication Time

Participant : Primoz Skraba.

This work was done in collaboration with Nikola Milosavljevic (MPI Saarbrücken) and Dmitriy Morozov (Stanford Univ.).

We present a new algorithm for computing zigzag persistent homology, an algebraic structure which encodes changes to the homology groups of a simplicial complex over a sequence of simplex additions and deletions [47]. Provided that there is an algorithm that multiplies two $n \times n$ matrices in $M(n)$ time, our algorithm runs in $O(M(n)\log n)$ time if $M(n) = O(n^2)$, and in $O(M(n))$ time otherwise, for a sequence of n additions and deletions. In particular, the running time is $O(n^{2.376})$, by the result of Coppersmith and Winograd. The fastest previously known algorithm for this problem takes $O(n^3)$ time in the worst case.
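
To make concrete what the algorithm tracks, the naive baseline below recomputes Betti numbers over Z/2 from scratch after each addition or deletion, using Gaussian elimination; this per-step cubic linear algebra is the kind of cost the $O(M(n)\log n)$ bound improves upon. The code is our own illustration, not the algorithm of [47], which maintains the zigzag pairing incrementally rather than recomputing.

    import numpy as np

    def rank_mod2(B):
        # Rank of a 0/1 matrix over Z/2 by Gaussian elimination.
        B, r = B.copy(), 0
        for j in range(B.shape[1]):
            pivots = np.nonzero(B[r:, j])[0]
            if len(pivots) == 0:
                continue
            i = r + pivots[0]
            B[[r, i]] = B[[i, r]]
            for ii in np.nonzero(B[:, j])[0]:
                if ii != r:
                    B[ii] ^= B[r]      # clear the rest of the pivot column
            r += 1
            if r == B.shape[0]:
                break
        return r

    def betti_mod2(simplices):
        # Betti numbers over Z/2 of a complex given as frozensets (all
        # faces listed): betti_d = #d-simplices - rank(d_d) - rank(d_{d+1}).
        by_dim = {}
        for s in simplices:
            by_dim.setdefault(len(s) - 1, []).append(s)
        max_dim = max(by_dim)
        rank = {}
        for d in range(1, max_dim + 1):
            rows = {s: i for i, s in enumerate(by_dim[d - 1])}
            cols = by_dim.get(d, [])
            B = np.zeros((len(rows), len(cols)), dtype=np.uint8)
            for j, s in enumerate(cols):
                for v in s:
                    B[rows[s - {v}], j] = 1
            rank[d] = rank_mod2(B)
        return [len(by_dim.get(d, [])) - rank.get(d, 0) - rank.get(d + 1, 0)
                for d in range(max_dim + 1)]

    # A tiny zigzag: add the boundary of a triangle (a 1-cycle is born),
    # fill it in (the cycle dies), then delete the triangle and an edge.
    V = [frozenset([i]) for i in range(3)]
    E = [frozenset(e) for e in [(0, 1), (1, 2), (0, 2)]]
    T = [frozenset([0, 1, 2])]
    print(betti_mod2(V + E))        # [1, 1]: one component, one loop
    print(betti_mod2(V + E + T))    # [1, 0, 0]: the loop has been filled
    print(betti_mod2(V + E[:2]))    # [1, 0]: after the deletions, a path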

