
Section: New Results

Near-Neighbor Preserving Dimension Reduction for Doubling Subsets of L1

Participants : Ioannis Emiris, Ioannis Psarros.

In [21], we study randomized dimensionality reduction, which is recognized as one of the fundamental techniques for handling high-dimensional data. Starting with the celebrated Johnson-Lindenstrauss Lemma, such reductions have been studied in depth for the Euclidean (L2) metric, but much less for the Manhattan (L1) metric. Our primary motivation is the approximate nearest neighbor problem in L1. We exploit its reduction to the decision-with-witness version, called approximate near neighbor, which incurs only a roughly logarithmic overhead.

In 2007, Indyk and Naor, in the context of approximate nearest neighbors, introduced the notion of nearest neighbor-preserving embeddings. These are randomized embeddings between two metric spaces whose distortion is guaranteed to be bounded only for the distances between a query point and the points of a fixed point set. Such embeddings are known to exist for both the L2 and L1 metrics, as well as for doubling subsets of L2; the case that remained open was that of doubling subsets of L1.

In this paper, we propose a dimension reduction by means of a near neighbor-preserving embedding for doubling subsets of L1. Our approach is to represent the point set by a carefully chosen covering set and then randomly project the latter. We study two types of covering sets, c-approximate r-nets and randomly shifted grids, and discuss the tradeoff between them in terms of preprocessing time and target dimension. The random projections employ Cauchy variables; the concentration bounds we derive for them should be of independent interest.
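Cauchy variables are the natural tool here because the Cauchy distribution is 1-stable: a Cauchy-weighted sum of coordinates of x - y is distributed as the L1 distance times a standard Cauchy variable. As a minimal illustrative sketch (not the paper's construction, which first replaces the point set by a covering set before projecting), the following hypothetical NumPy snippet projects points with an i.i.d. Cauchy matrix and recovers L1 distances with a median estimator, exploiting the fact that the median of the absolute value of a standard Cauchy variable is 1:

```python
import numpy as np

rng = np.random.default_rng(0)

def cauchy_project(points, k):
    """Project d-dimensional rows of `points` to k dimensions with an
    i.i.d. standard Cauchy matrix (a 1-stable random projection)."""
    d = points.shape[1]
    A = rng.standard_cauchy(size=(d, k))
    return points @ A

def l1_estimate(px, py):
    """Estimate the original L1 distance from two projected points.
    The Cauchy law has no mean, so instead of averaging we take the
    median of absolute projected coordinates; median(|Cauchy|) = 1."""
    return np.median(np.abs(px - py))

# Two random points in dimension 1000, projected to dimension 400.
x, y = rng.random(1000), rng.random(1000)
true_dist = np.abs(x - y).sum()
proj = cauchy_project(np.stack([x, y]), k=400)
est_dist = l1_estimate(proj[0], proj[1])
ratio = est_dist / true_dist  # concentrates around 1 as k grows
```

Note that the median estimator is non-linear, so this sketch is a distance estimator rather than an embedding into a lower-dimensional L1 space; obtaining genuine low-distortion embeddings is precisely what makes the L1 case harder than L2 and motivates the near neighbor-preserving relaxation studied in the paper.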

This is joint work with Vassilis Margonis (NKUA), and is based on his MSc thesis.