Section: New Results
Participants : Sara Alouf, Eitan Altman, Konstantin Avrachenkov, Anne-Elisabeth Baert, Abdulhalim Dandoush, Alain Jean-Marie, Philippe Nain, Giovanni Neglia, Danil Nemirovsky, Marina Sokol, Sulan Wong.
Interplay between legislation, economics and information technology
Participants : Eitan Altman, Sulan Wong.
The Internet allows access to a huge amount of multimedia content (music, films, etc.) as well as to copyrighted books and journals. This access brings large gains to Internet users at the cost of depriving creators and copyright owners of their rights. Two types of legislation have been debated: the first attempts to fight the unauthorized supply of, and demand for, copyrighted works available through the Internet, while the other aims at benefiting from this demand through taxation. Researchers who specialize in performance analysis and in the development of protocols for file sharing systems are well placed to analyze each of the above options, along with the efficiency of the measures proposed to enforce compliance with the law. Furthermore, an economic analysis may be used to propose yet other directions for the evolution of file sharing systems and of Internet access.
In 2009, E. Altman worked on the above issues in collaboration with S. Wong (University of A Coruna, Spain), a jurist specializing in copyright and intellectual property. In collaboration with M. Ibrahim (Inria project-team Reso), they analyze in  the role and interests of each economic actor, as well as the interactions between the actors. The authors further evaluate the impact of measures taken by service or content providers to diminish the upload or download rate of a file on its availability in a file sharing system. In  , E. Altman and S. Wong, in collaboration with J. Rojas (University of Avignon), develop an economic model and explore the added value that wide access to the Internet can bring to content providers if pricing mechanisms such as the Shapley value are adopted.
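To give an idea of the kind of pricing mechanism involved, the following sketch computes exact Shapley values for a small cooperative game by averaging marginal contributions over all player orderings. The three actors (a content provider, a service provider, and an advertiser) and the coalition worths are hypothetical illustrations, not the model of the cited paper.

```python
from itertools import permutations
from math import factorial

def shapley_values(players, v):
    """Exact Shapley values: average each player's marginal
    contribution over all n! orderings (tractable for small games)."""
    phi = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            phi[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    n_fact = factorial(len(players))
    return {p: phi[p] / n_fact for p in players}

# Hypothetical 3-actor game: content provider (CP), service provider
# (SP), advertiser (AD); v gives the revenue of each coalition.
worth = {
    frozenset(): 0, frozenset({"CP"}): 0, frozenset({"SP"}): 0,
    frozenset({"AD"}): 0, frozenset({"CP", "SP"}): 6,
    frozenset({"CP", "AD"}): 4, frozenset({"SP", "AD"}): 0,
    frozenset({"CP", "SP", "AD"}): 10,
}
print(shapley_values(["CP", "SP", "AD"], worth.get))
```

The Shapley value splits the grand coalition's worth fairly according to average marginal contributions; here the three shares sum to the total revenue of 10.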
Storage in distributed/peer-to-peer systems
Participants : Sara Alouf, Anne-Elisabeth Baert, Abdulhalim Dandoush, Alain Jean-Marie, Philippe Nain.
Distributed systems using a network of peers have become an alternative solution for storing data. A. Dandoush, S. Alouf, and P. Nain study the performance of peer-to-peer storage systems in terms of data lifetime and availability. Prior efforts assumed the data recovery process to follow an exponential distribution. To understand how the recovery process could be better modeled, they have implemented this process in the ns-2 network simulator (cf.  ) and have performed an intensive simulation analysis of it in  ,  .
Building on the findings in  ,  , A. Dandoush, S. Alouf and P. Nain develop in  Markovian models assuming that the fragment download/upload time is exponentially distributed, so that the recovery time follows a hypo-exponential distribution with many distinct phases. They find in particular that a distributed recovery scheme is a good implementation choice only in large networks where peers have good availability.
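The hypo-exponential recovery time arises as a sum of independent exponential phases with distinct rates (one per fragment transfer). The following minimal Monte Carlo sketch, with hypothetical rates, checks the empirical mean of such a sum against its closed-form value, the sum of the inverse rates.

```python
import random

def sample_recovery_time(rates, rng):
    """One recovery time: a sum of independent exponential phases
    with distinct rates, i.e. a hypo-exponential sample."""
    return sum(rng.expovariate(r) for r in rates)

rng = random.Random(42)
# Hypothetical per-phase rates for four successive fragment transfers.
rates = [10.0, 8.0, 6.0, 4.0]
samples = [sample_recovery_time(rates, rng) for _ in range(100_000)]
empirical_mean = sum(samples) / len(samples)
theoretical_mean = sum(1.0 / r for r in rates)  # mean of hypo-exponential
print(empirical_mean, theoretical_mean)
```

With many distinct phases the distribution is far less variable than a single exponential with the same mean, which is why the exponential assumption of prior work can misestimate data lifetime.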
The study of optimal data replication and placement in distributed storage systems has been completed within the Vooddo project (funded by the “Multimedia” Program of the Anr), jointly with V. Boudet and X. Roche from Lirmm, Cnrs/University of Montpellier II (see Section 8.3.3). The algorithmic problems of determining how much to replicate and where to place the replicas are difficult. The experimental evaluation of heuristics for both problems has been carried out in  ,  ,  . The study was completed by theoretical results, including the proof of optimality of certain configurations (case of two replicas, Steiner systems) and the analysis (mean and variance) of certain randomized algorithms  .
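To illustrate the flavor of such placement heuristics (this is a generic load-balancing sketch, not the heuristics evaluated in the cited work), one can greedily assign the replicas of each document, largest first, to the currently least-loaded nodes:

```python
import heapq

def place_replicas(doc_sizes, num_nodes, r):
    """Greedy sketch: place r replicas of each document, processing
    documents in decreasing size order and always choosing the r
    least-loaded nodes (a min-heap keyed by current node load)."""
    loads = [(0.0, n) for n in range(num_nodes)]
    heapq.heapify(loads)
    placement = {}
    for doc, size in sorted(doc_sizes.items(), key=lambda kv: -kv[1]):
        chosen = [heapq.heappop(loads) for _ in range(r)]  # r distinct nodes
        placement[doc] = [n for _, n in chosen]
        for load, n in chosen:
            heapq.heappush(loads, (load + size, n))
    return placement

# Hypothetical documents (name -> size) on 3 nodes, 2 replicas each.
docs = {"d1": 40, "d2": 30, "d3": 20, "d4": 10}
print(place_replicas(docs, num_nodes=3, r=2))
```

Popping r nodes before reinserting them guarantees the r replicas of a document land on distinct nodes, the basic feasibility constraint of any placement.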
Document ranking and clustering on the Web
Participants : Konstantin Avrachenkov, Danil Nemirovsky.
In  K. Avrachenkov, N. Litvak (University of Twente, The Netherlands) and K. S. Pham (St. Petersburg State University, Russia) study the PageRank mass of principal components in a bow-tie web graph as a function of the damping factor c. It is known that the web graph can be divided into three principal components: the giant Strongly Connected Component (SCC), the in-component (IN) and the out-component (OUT). The giant SCC contains a large group of pages having a hyperlink path connecting them. The pages in the IN (OUT) component have a path to (from) the SCC, but not back. Using a singular perturbation approach, the authors show that the PageRank share of the IN and SCC components remains high even for very large values of the damping factor, in spite of the fact that it drops to zero when c tends to one. However, a detailed study of the OUT component reveals the presence of “dead ends” (small groups of pages linking only to each other) that receive an unfairly high ranking when c is close to 1. The authors argue that this problem can be mitigated by choosing c as small as 1/2.
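The dead-end effect is easy to reproduce on a toy graph. The sketch below (standard power-iteration PageRank, with a hypothetical four-page graph) shows that the PageRank mass captured by a pair of pages linking only to each other grows as the damping factor c approaches 1.

```python
def pagerank(links, c=0.85, iters=100):
    """Power-iteration PageRank with damping factor c.
    links: dict node -> list of out-neighbors."""
    nodes = list(links)
    n = len(nodes)
    pr = {u: 1.0 / n for u in nodes}
    for _ in range(iters):
        new = {u: (1.0 - c) / n for u in nodes}
        for u, outs in links.items():
            if outs:
                share = c * pr[u] / len(outs)
                for v in outs:
                    new[v] += share
            else:  # dangling node: spread its mass uniformly
                for v in nodes:
                    new[v] += c * pr[u] / n
        pr = new
    return pr

# Toy graph: an SCC {a, b} and a "dead end" {x, y} in the OUT
# component whose pages link only to each other.
g = {"a": ["b", "x"], "b": ["a"], "x": ["y"], "y": ["x"]}
low = pagerank(g, c=0.5)
high = pagerank(g, c=0.95)
print(sum(low[v] for v in ("x", "y")), sum(high[v] for v in ("x", "y")))
```

With c close to 1, almost all the random walk's mass is absorbed by the dead-end pair before teleportation rescues it, which is the phenomenon motivating the authors' recommendation of a smaller damping factor.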