Section: New Results
Keywords: Google, PageRank, P2P, BitTorrent, fluid model.
Content distribution networks and WWW
Document ranking in WWW
Surfers on the Internet frequently use search engines to find pages satisfying their queries. However, there are typically hundreds or thousands of relevant pages available on the Web, so listing them in a proper order is a crucial and non-trivial task. PageRank is one of the main criteria according to which Google ranks Web pages. PageRank can be interpreted as the frequency with which a random surfer visits a Web page, and thus it reflects the popularity of that page. Google computes the PageRank using the power iteration method, which requires about one week of intensive computations. In , K. Avrachenkov, D. Nemirovsky and N. Osipova, in collaboration with N. Litvak (University of Twente, The Netherlands), propose and analyze Monte Carlo methods for the PageRank computation. The probabilistic Monte Carlo methods have several advantages over the deterministic power iteration method: they provide a good estimate of the PageRank of relatively important pages after only one iteration; they have a natural parallel implementation; and they allow the PageRank to be updated continuously as the structure of the Web changes.
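The random-surfer interpretation above can be illustrated with a minimal Monte Carlo sketch. This is not the authors' algorithm, only a toy version of the general idea: simulate many surfers who follow a random out-link with probability alpha and stop (teleport) otherwise, and estimate each page's PageRank from its share of the total visits. The graph, the function name, and all parameter values are illustrative assumptions.

```python
import random
from collections import Counter

def monte_carlo_pagerank(graph, alpha=0.85, num_walks=20000, seed=0):
    """Toy Monte Carlo PageRank estimate (illustrative sketch only).

    graph: dict mapping each node to a list of its out-neighbours.
    Each walk starts at a uniformly chosen node; at every step it
    follows a random out-link with probability alpha, and stops with
    probability 1 - alpha (or when it reaches a dangling node).
    A page's PageRank is estimated by its fraction of all visits.
    """
    rng = random.Random(seed)
    nodes = list(graph)
    visits = Counter()
    total = 0
    for _ in range(num_walks):
        node = rng.choice(nodes)
        while True:
            visits[node] += 1
            total += 1
            # Stop the walk (teleportation) with probability 1 - alpha,
            # or if the current page has no out-links.
            if rng.random() > alpha or not graph[node]:
                break
            node = rng.choice(graph[node])
    return {n: visits[n] / total for n in nodes}

# Tiny example graph: pages "a" and "b" both link to "c"; "c" links back to "a".
graph = {"a": ["c"], "b": ["c"], "c": ["a"]}
pr = monte_carlo_pagerank(graph)
# "c" collects links from both other pages, so it gets the largest estimate.
```

Even this crude version shows the properties mentioned above: walks are independent (trivially parallel), and the estimate for heavily linked pages stabilizes quickly.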
F. Clévenot-Perronnin and P. Nain, in collaboration with K. W. Ross (Polytechnic University, New York, USA), have investigated service differentiation and bandwidth diversity issues for BitTorrent-like P2P networks. In , they introduce resource allocation strategies that are based on a single parameter. With the help of multiclass fluid models, they show how this parameter can be tuned so as to achieve a target quality-of-service ratio.
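To make the single-parameter idea concrete, here is a hypothetical two-class allocation sketch, not the model from the cited work: total upload capacity is split so that each class-2 peer is weighted by a parameter w relative to a class-1 peer. The per-peer rate ratio then equals w independently of the class populations, so setting w to the target quality-of-service ratio achieves it directly. All names and the allocation rule itself are assumptions for illustration.

```python
def per_class_rates(capacity, n1, n2, w):
    """Hypothetical weighted split of upload capacity between two classes.

    capacity: total upload capacity shared by all downloaders.
    n1, n2:   number of peers in class 1 and class 2.
    w:        differentiation parameter; each class-2 peer counts as
              w class-1 peers, so per-peer rates satisfy r2 / r1 = w.
    Returns (r1, r2), the per-peer download rates of the two classes.
    """
    denom = n1 + w * n2
    r1 = capacity / denom
    r2 = w * capacity / denom
    return r1, r2

# Example: 100 units of capacity, 10 class-1 peers, 5 class-2 peers,
# target per-peer rate ratio of 2 between class 2 and class 1.
r1, r2 = per_class_rates(100.0, 10, 5, w=2.0)
```

The sketch only shows why one scalar suffices here: the populations cancel in the ratio r2 / r1, leaving the parameter w as the sole knob.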
In , F. Clévenot-Perronnin and P. Nain propose a new model of Squirrel, a P2P cooperative Web cache. This work extends the analysis in  to an arbitrary number of nodes and to documents with different popularities.