TREC is a joint INRIA-ENS project-team. It is focused on the modeling, the control and the design of communication networks and protocols. Its methodological activities are combined with projects defined with industrial partners, notably Thomson, Alcatel-Lucent and Sprint. The main research directions are:

communication network control: admission control, flow regulation, congestion control, traffic analysis in controlled networks;

modeling and performance analysis of wireless networks (cellular, mesh, ad-hoc, sensor, etc.): coverage and load analysis, power control, evaluation and optimization of the transport capacity, self organization;

stochastic network dynamics, with focus on classical topics for TREC like rare events or stability, and new ones like simulation (in particular perfect simulation) and statistics (in particular inverse problems);

economics of networks; a new domain opened in 2008: epidemic risk model, incentives, security, insurance, diffusion of innovations;

the development of mathematical tools based on stochastic geometry, random geometric graphs and spatial point processes: Voronoi tessellations, coverage processes, random spatial trees, random fields;

combinatorial optimization and analysis of algorithms: random graphs, belief propagation.

Here is the scientific content of each of our main research directions.

**Modeling and control of communication networks.** Here we mean admission control, flow regulation and feedback control
*à la TCP*. Our aim is a mathematical representation of the dynamics of the most commonly used control protocols, from which one could predict and optimize the resulting end user
bandwidth sharing and quality of service. We currently use our understanding of the dynamics of these protocols on Split TCP, as used in wireless access networks and in peer-to-peer
overlays, and on variants of TCP meant to reach higher throughputs such as scalable TCP.

**Modeling and performance analysis of wireless networks.** The main focus is on the following three classes of wireless networks: cellular networks, mobile ad hoc networks (MANETs) and
WiFi mesh networks.

Concerning cellular networks, our mathematical representation of interference based on shot noise has led to a variety of results on the coverage and capacity of large CDMA networks when taking into account intercell interference and power control. Our general goals are twofold: 1) to propose a strategy for the densification and parameterization of UMTS and future OFDM networks that is optimized for both voice and data traffic; 2) to design new self-organization and self-optimization protocols for cellular networks, e.g. for power control, sub-carrier selection, load balancing, etc.
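The shot-noise representation of interference lends itself to direct numerical experimentation. The following sketch (a toy Monte Carlo model, not the team's actual methodology; the station density, path-loss exponent, noise and SINR threshold values are all illustrative assumptions) estimates the coverage probability of a user served by the nearest station of a Poisson cellular network with Rayleigh fading:

```python
import math
import random

def _poisson(rng, mean):
    # Knuth's inversion-by-multiplication method; fine for moderate means.
    l, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= l:
            return k
        k += 1

def coverage_probability(lam=0.5, beta=4.0, threshold=1.0, noise=0.01,
                         radius=5.0, trials=2000, seed=1):
    """Estimate P(SINR > threshold) for a user at the origin served by the
    nearest base station of a homogeneous Poisson network of density `lam`.

    Interference is the shot noise created by all the other stations; each
    link has independent Rayleigh fading (exponential power) and power-law
    path loss r^(-beta).  All parameter values are illustrative."""
    rng = random.Random(seed)
    covered = 0
    for _ in range(trials):
        n = _poisson(rng, lam * math.pi * radius ** 2)
        # Distances of uniformly placed stations in a disk around the user.
        dists = sorted(radius * math.sqrt(rng.random()) for _ in range(n))
        if not dists:
            continue
        powers = [rng.expovariate(1.0) * r ** (-beta) for r in dists]
        signal = powers[0]              # nearest station serves the user
        interference = sum(powers[1:])  # shot noise from all the others
        if signal > threshold * (noise + interference):
            covered += 1
    return covered / trials
```

With a common seed, lowering the SINR threshold can only increase the estimated coverage, which gives a simple sanity check on the simulation.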

Using a similar approach, we currently investigate MAC layer scheduling algorithms and power control protocols for MANETs and their vehicular variants called VANETs. We concentrate on cross-layer optimizations allowing one to maximize the transport capacity of multihop MANETs. A recent example within this class of problems is that of opportunistic routing for MANETs. Our main quantitative results concern one-hop analysis as well as scaling laws for end-to-end delays on long routes. This last question is treated by studying an appropriate first passage percolation problem on a new class of random graphs called *space-time SINR graphs*.

**Theory of network dynamics.** TREC is also pursuing the elaboration of a stochastic network calculus that would allow the analysis of network dynamics by algebraic methods. The
mathematical tools are those of discrete event dynamical systems: semi-rings, ergodic theory, perfect simulation, stochastic comparison, inverse problems, large deviations, etc.

**Economics of networks.** The premise of this relatively new direction of research, developed jointly with Jean Bolot (Sprint), is that economic incentives drive the development and deployment of technology. Such incentives exist if there is a market where suppliers and buyers can meet. In today's Internet, such a market is missing. We started by looking at the general problem of security on the Internet from an economic perspective and derived a model showing that network externalities and misaligned incentives are responsible for a low investment in security measures. We then analyzed the possible impact of insurance.

**The development of mathematical tools based on stochastic geometry and random geometric graphs.** Stochastic geometry is a rich branch of applied
probability which allows one to quantify random phenomena on the plane or in higher dimension. It is intrinsically related to the theory of point processes and also to random geometric
graphs. Our research is centered on the development of a methodology for the analysis, the synthesis, the optimization and the comparison of architectures and protocols to be used in
wireless communication networks. The main strength of this method is its capacity for taking into account the specific properties of wireless links, as well as the fundamental question of
scalability.

**Combinatorial optimization and analysis of algorithms.** In this research direction started in 2007, we build upon our expertise on random trees/graphs and our collaboration with
D. Aldous in Berkeley. Sparse graph structures have proved useful in a number of applications from information processing tasks to the modeling of social networks. We obtained new results
for stochastic processes taking place on such graphs. Thereby, we were able to analyze an iterative message passing algorithm for the random assignment problem and to characterize its performance. Likewise, we carried out a sensitivity analysis of such processes and computed the corresponding scaling exponents (for a dynamic programming optimization problem). We also derived analytic formulas for the spectrum of the adjacency matrix of diluted random graphs.

Depending on the classes of communication networks, we focus on different issues:

Concerning the Internet, we concentrate on probing and on the analysis of flow and congestion control;

Concerning operator networks, we work on the control and the optimization of wireless and wireline access networks;

Concerning self-organized networks, we focus on the design of MAC and routing protocols and on the evaluation of the ultimate capacity.

We interact on these questions with the following industrial partners: Thomson (probing), Alcatel (wireline access with Antwerp and wireless access with Villarceaux) and Sprint (Internet probing and wireless access). We also have some one-to-one interactions with researchers of France Télécom on wireless cellular networks.

UTRANDIM is a continuation of SERT (Spatial Erlang for Real Time services), a software tool designed by M. Karray for the evaluation of various properties of large CDMA networks, in particular the probability that calls are blocked due to the unfeasibility of the power control inherent to CDMA. This tool is based on the research conducted with FT R&D described in Section . The approach is constantly developed and enriched. It is now included in UTRANDIM, a dimensioning tool currently used by Orange Corporate for UMTS and LTE networks.

TREC participated in the design of a software tool developed by N2NSoft for the optimal control of the diffusion of video on demand in a large DSL access network. The setting is that of layered coding where a controlled degradation of the quality of the video streams may be preferred to the rejection of customers. Various schemes are implemented in the software tool including a scheme based on Markov decision theory. This work was part of a research contract of Alcatel-Lucent involving TREC and N2NSoft.

This axis concerns the analysis and the design of wireless access communication networks. Our contributions are organized in terms of network classes: cellular networks, wireless LANs and MANETs, VANETs. We also have a section on generic results that concern more general wireless networks. We are interested both in macroscopic models, which are particularly important for economic planning, and in models allowing the definition and the optimization of protocols. Our approach combines several tools: queueing theory, point processes, stochastic geometry, random graphs.

The activity on cellular networks has several complementary facets ranging from performance evaluation to protocol design. The work is mainly based on strong collaborations with Alcatel-Lucent and Sprint. We also have personal collaborations with two researchers of Orange Labs.

Building upon the scalable admission and congestion control schemes developed in , , which allow for an exact representation of the geometry of interference in networks, in collaboration with M.K. Karray [Orange Labs], we continue developing a *comprehensive approach to the performance evaluation of cellular networks*. This approach, which resulted in three patents filed by INRIA and FT, is used by Orange. Some of our methods were in particular integrated into Orange's dimensioning tool (initially SERT, now *UTRANDIM*).

This year, the main focus was on the extension of our approach to cellular networks implementing Orthogonal Frequency-Division Multiple Access (OFDMA). The recent interest in OFDMA comes from the fact that it is used in the mobility mode of the IEEE 802.16 WirelessMAN Air Interface standard, commonly referred to as WiMAX, and that OFDMA is currently a working assumption for the 3GPP Long Term Evolution (LTE) downlink. OFDMA is also the candidate access method for IEEE 802.22 Wireless Regional Area Networks. It is the context of LTE cellular networks that we primarily have in mind; however, our approach applies to other OFDMA downlink scenarios as well.

The primary objective is to build a *dimensioning method* for the radio part of the downlink in wireless cellular OFDMA networks, i.e., a method allowing one to evaluate the minimal density of base stations assuring a given quality of service (QoS) for a given traffic demand per surface unit.

The configuration of the neighbour cell list (NCL) has an important impact on the number of dropped calls in cellular networks. In , a method for optimizing NCLs is presented. It consists of an initialization using a self-configuration phase, followed by a self-optimization phase that further refines the NCL based on measurements provided by mobile stations during network operation. Algorithms for both the initial self-configuration and the ongoing self-optimization are presented. The performance of the proposed methods is evaluated for different user speeds and different NCL sizes. In addition, the convergence speed of the proposed self-optimization method is evaluated.

In the papers , , we investigate two critical issues pertaining to small cell networks: best signal quality and user mobility management. We show that, in dense small cell networks, the extremal signal strength distribution tends, after renormalization, to a Gumbel distribution and that it is asymptotically independent of the total interference. Besides, we propose a simple random cell scanning scheme. We establish an analytical model to find the optimal number of cells to be scanned.
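The Gumbel limit is easy to probe numerically in the simplest setting, where the n strongest cells contribute i.i.d. exponentially distributed signal powers (Rayleigh fading with a common mean): the renormalized maximum max F_i - log n should then be approximately Gumbel, whose mean is the Euler-Mascheroni constant. A small sketch (the cell count and sample size are arbitrary choices, not values from the papers):

```python
import math
import random

def renormalized_max_mean(n_cells=500, samples=4000, seed=7):
    """Sample mean of max(F_1, ..., F_n) - log n for i.i.d. Exp(1) signal
    powers.  Under the Gumbel limit, this approaches the Euler-Mascheroni
    constant (about 0.5772) as the number of cells grows."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        m = max(rng.expovariate(1.0) for _ in range(n_cells))
        total += m - math.log(n_cells)
    return total / samples
```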

The distribution of the maximum Y_{S} of the SINRs received from a cell set S is useful for many problems in cellular networks. By modelling the interference field as a shot-noise process, in an ongoing work we analyze the joint distribution of the interference and the maximum of the signal strengths, and deduce from it the distribution of Y_{S}. This can in particular be used to determine NCL sizes which minimize the real-time call dropping rate and maximize the user data throughput in macro cellular networks.

A MANET is made of mobile nodes which are at the same time terminals and routers, connected by wireless links, the union of which forms an arbitrary topology. The nodes are free to move randomly and organize themselves arbitrarily. Important issues in such a scenario are connectivity, medium access (MAC), routing and stability. This year, in collaboration with Paul Mühlethaler [INRIA HIPERCOM], we mainly worked on the analysis of MAC and routing protocols in multi-hop MANETs.

Spatial Aloha is probably the simplest medium access protocol for a large mobile ad hoc network: each station tosses a coin independently of everything else and accesses the channel if it gets heads. In a network where stations are randomly and homogeneously located in the plane, there is a way to tune the bias of the coin so as to obtain the best possible compromise between spatial reuse and per-transmitter throughput. In the paper that complements , we showed how to address this question using stochastic geometry and, more precisely, Poisson shot-noise field theory. The theory that is developed is fully computational and leads to new closed-form expressions for various kinds of spatial averages (such as outage, throughput or transport). It also allows one to derive general scaling laws that hold under general fading assumptions. We exemplified its flexibility by analyzing a natural variant of Spatial Aloha which we call Opportunistic Aloha, which consists in replacing the coin tossing by an evaluation of the quality of the channel from each station to its receiver, and a selection of the stations with good channel (e.g. fading) conditions. We showed how to adapt the general machinery to this variant and how to optimize and implement it. We also showed that, when properly tuned, Opportunistic Aloha very significantly outperforms Spatial Aloha, with e.g. a mean throughput per unit area twice as high for Rayleigh fading scenarios with typical parameters.
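The flavor of such closed forms can be illustrated with the well-known success probability of the Poisson bipolar Aloha model with Rayleigh fading in the interference-limited regime, exp(-p lam r^2 T^(2/beta) K(beta)) with K(beta)= 2 pi^2/(beta sin(2 pi/beta)); tuning the medium access probability p then amounts to maximizing p times this probability. A sketch (the density, transmitter-receiver distance, SINR threshold and path-loss exponent below are arbitrary illustrative values):

```python
import math

def success_probability(p, lam, r, T, beta):
    """P(SINR > T) for the Poisson bipolar Aloha model with Rayleigh fading
    (interference-limited case), via the standard closed form
    exp(-p * lam * r^2 * T^(2/beta) * K(beta)),
    K(beta) = 2*pi^2 / (beta * sin(2*pi/beta))."""
    k = 2 * math.pi ** 2 / (beta * math.sin(2 * math.pi / beta))
    return math.exp(-p * lam * r ** 2 * T ** (2.0 / beta) * k)

def density_of_success(p, lam=0.1, r=1.0, T=10.0, beta=4.0):
    # Spatial density of successful transmissions: lam * p * P(success).
    return lam * p * success_probability(p, lam, r, T, beta)

def best_map(lam=0.1, r=1.0, T=10.0, beta=4.0, grid=1000):
    # Grid search for the medium access probability maximizing the density.
    return max((i / grid for i in range(1, grid + 1)),
               key=lambda p: density_of_success(p, lam, r, T, beta))
```

Since the density is of the form lam * p * exp(-p*C), the optimum is min(1, 1/C), which the grid search recovers.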

Consider again a slotted version of Aloha for MANETs. As above, our model features transmitters randomly located in the Euclidean plane, according to a Poisson point process, and a set of receivers representing the next hop from every transmitter. We concentrate on the so-called outage scenario, where a successful transmission requires a SINR larger than some threshold. In , we analyze the local delays in such a network, namely the number of time slots required for nodes to transmit a packet to their prescribed next-hop receivers. The analysis depends very much on the receiver scenario and on the variability of the fading. In most cases, each node has a finite-mean geometric random delay and thus a positive next-hop throughput. However, the spatial (or large population) averaging of these individual finite mean delays leads to infinite values in several practical cases, including the Rayleigh fading and positive thermal noise case. In some cases it exhibits an interesting phase transition phenomenon, where the spatial average is finite when certain model parameters (receiver distance, thermal noise, Aloha medium access probability) are below a threshold and infinite above. To the best of our knowledge, this phenomenon has not been discussed in the literature. We comment on the relationships between the above facts and the heavy tails found in the so-called "RESTART" algorithm. We argue that the spatial average of the mean local delays is infinite primarily because of the outage logic, where one transmits full packets at time slots when the receiver is covered at the required SINR and where one wastes all the other time slots. This results in the "RESTART" mechanism, which in turn explains why we have an infinite spatial average. Adaptive coding offers another nice way of breaking the outage/RESTART logic. We show examples where the average delays are finite in the adaptive coding case, whereas they are infinite in the outage case.
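The mechanism behind the infinite spatial average can be caricatured in a few lines. In the toy model below (a deliberately simplified stand-in for the Poisson/SINR setting, with an assumed exponential environment variable rather than the paper's exact model), each node succeeds in a slot with probability q = exp(-W), so its mean local delay is exp(W); whether the population average of exp(W) is finite depends on the tail of W:

```python
import math
import random

def spatial_average_delay(rate, nodes=200000, seed=5):
    """Toy illustration of the RESTART effect on local delays.

    Each node succeeds in a given time slot with probability q = exp(-W),
    where W >= 0 summarizes its random environment (distance, noise, ...),
    so its mean local delay is 1/q = exp(W).  For W ~ Exp(rate), the
    population average of exp(W) is finite when rate > 1 and infinite
    when rate <= 1; in the latter case the empirical average grows
    without bound with the population size."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(nodes):
        total += math.exp(rng.expovariate(rate))  # mean geometric delay 1/q
    return total / nodes
```

Running it with a light tail (rate 3, population average near 1.5) against a heavy tail (rate 0.5) shows the empirical spatial average blowing up in the second case.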

In classical routing strategies for wireless ad hoc (mobile or mesh) networks, packets are transmitted on a pre-defined route that is usually obtained by a shortest-path routing protocol. In , we review some recent ideas from , concerning a new routing technique which is *opportunistic* in the sense that each packet, at each hop on its (specific) route from an origin to a destination, takes advantage of the actual pattern of nodes that captured its recent (re)transmission in order to choose the next relay. The paper focuses both on the distributed algorithms allowing such a routing technique to work and on the evaluation of the gain in performance it brings compared to classical mechanisms. On the algorithmic side, we show that it is possible to implement this opportunistic technique in such a way that the current transmitter of a given packet does not need to know its next relay a priori; instead, the nodes that capture this transmission (if any) perform a *self-selection* procedure to choose the packet relay node and acknowledge the transmitter. We also show that this routing technique works well with various medium access protocols (such as Aloha, CSMA, TDMA). Finally, we show that the above relay self-selection procedure can be optimized in the sense that the node chosen as the relay is the one that optimizes some given utility criterion (e.g. minimizing the remaining distance to the final destination). The performance evaluation part is based on stochastic geometry and combines simulation and analytical models. The main result is that such opportunistic schemes, when properly optimized, very significantly outperform classical routing schemes, provided at least a small number of nodes in the network know their geographical positions exactly.

Mathematical analysis of the asymptotic properties of opportunistic routing over large distances (when the Euclidean distance between the source and destination nodes tends to infinity) reveals the following surprising negative result: under a Poisson assumption for the distribution of nodes and some natural assumptions on the wireless channels, the mean delay per unit of distance is infinite. The main positive result states that when adding a periodic node infrastructure of arbitrarily small intensity to the Poisson point process, this “delay rate” becomes positive and finite (see Section for more details).

Vehicular Ad Hoc NETworks (VANETs) are special cases of MANETs where the network is formed between vehicles. VANETs are today the most promising civilian application for MANETs and they are likely to revolutionize our traveling habits by increasing safety on the road while providing value added services.

In , we adapted the stochastic geometry framework previously worked out for planar MANETs to propose two models of point-to-point traffic for Aloha-based linear VANETs. The first one uses a SINR capture condition to qualify a successful transmission, while the second one expresses the transmission throughput as a function of the SINR using Shannon's law. Assuming a Poisson distribution of vehicles, a power-law mean path loss and Rayleigh fading, we derive explicit formulas for the probability of a successful transmission over a given distance and for the mean throughput, respectively. Furthermore, we optimize two quantities directly linked to the achievable network throughput: the mean density of packet progress and the mean density of information transport. This is realized by tuning the communication range and the probability of channel access. We also present numerical examples and study the impact of the thermal noise on the optimal tuning of network parameters. The mathematical tools for this analysis are borrowed from and .

Conflict-avoiding codes are used in the multiple access collision channel without feedback. The number of codewords in a conflict-avoiding code is the number of potential users that can be supported in the system. In , a new upper bound on the size of conflict-avoiding codes is proved. This upper bound is general in the sense that it is applicable to all code lengths and all Hamming weights. Several existing constructions for conflict-avoiding codes, which are known to be optimal for Hamming weights equal to four and five, are shown to be optimal for all Hamming weights in general.
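In one common formalization, a conflict-avoiding code of length n and weight w identifies each codeword with a w-subset of Z_n and requires the difference sets of distinct codewords to be disjoint, so that two active users collide in at most one position whatever their relative cyclic shifts. A small checker along these lines (the example codes in the usage note are hypothetical toy instances, not constructions from the paper):

```python
def difference_set(codeword, n):
    """All nonzero differences (mod n) between positions of a codeword,
    the codeword being represented as a set of positions in Z_n."""
    return {(a - b) % n for a in codeword for b in codeword if a != b}

def is_conflict_avoiding(codewords, n):
    """Check the defining property of a conflict-avoiding code: the
    difference sets of distinct codewords are pairwise disjoint."""
    diffs = [difference_set(c, n) for c in codewords]
    for i in range(len(diffs)):
        for j in range(i + 1, len(diffs)):
            if diffs[i] & diffs[j]:
                return False
    return True
```

For instance, the weight-2 code {0,1}, {0,2}, {0,3} of length 7 has difference sets {1,6}, {2,5}, {3,4} and passes, while {0,1} together with {3,4} fails since both have difference set {1,6}.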

This traditional research topic of TREC has several new threads, such as perfect simulation, active probing and Markov decision processes.

Active probing began by measuring end-to-end path metrics, such as delay and loss, in a direct measurement process which did not require inference of internal network parameters. The field has since progressed to measuring network metrics, from link capacities to available bandwidth and cross traffic itself, which reach deeper and deeper into the network and require increasingly complex inversion methodologies. In , we formulate this line of thought as a set of inverse problems in queueing theory. Queueing theory is typically concerned with the solution of direct problems, where the trajectory of the queueing system, and laws thereof, are derived based on a complete specification of the system, its inputs and initial conditions. Inverse problems aim to deduce unknown parameters of the system based on partially observed trajectories. We provide a general definition of the inverse problems in this class and map out the key variants: the analytical methods, the statistical methods and the design of experiments. We also show how this inverse problem viewpoint translates to the design of concrete Internet probing applications.
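A minimal instance of this inverse viewpoint can be sketched on the M/M/1 queue: the direct problem simulates delays from a full specification, while in the inverse problem an observer who knows only the probing rate and the observed sojourn times recovers the unknown service rate, using the fact that the stationary M/M/1 sojourn time is Exp(mu - lambda). This is a toy sketch with illustrative parameter values, not one of the paper's probing applications:

```python
import random

def simulate_sojourn_times(lam, mu, customers=200000, seed=11):
    """Direct problem: sojourn (end-to-end delay) times in an M/M/1 FIFO
    queue, generated with the Lindley recursion from the full
    specification (arrival rate lam, service rate mu)."""
    rng = random.Random(seed)
    wait, sojourns = 0.0, []
    for _ in range(customers):
        service = rng.expovariate(mu)
        sojourns.append(wait + service)          # delay seen by this probe
        interarrival = rng.expovariate(lam)
        wait = max(0.0, wait + service - interarrival)
    return sojourns

def estimate_service_rate(sojourns, lam):
    """Inverse problem: the stationary sojourn time is Exp(mu - lam), so
    knowing the probing rate lam and the mean observed delay recovers
    mu = lam + 1/mean(sojourn)."""
    mean = sum(sojourns) / len(sojourns)
    return lam + 1.0 / mean
```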

Most active probing techniques suffer from the “bottleneck” limitation: all characteristics of the path after the bottleneck link are erased and unreachable. We are currently investigating a new tomography technique, based on the measurement of the fluctuations of end-to-end delays, which allows one to gain insight into the residual available bandwidth along the whole path. For this, we combine classical queueing theory models with statistical analysis to obtain estimators of the residual bandwidth on all links of the path. These estimators are proved to be tractable, consistent and efficient. In , we evaluate their performance with simulation and trace-based experiments.

Lately, this method has been generalized in to probing a multicast tree instead of a single path.

Perfect simulation, introduced by Propp and Wilson in 1996, is a simulation algorithm that uses coupling arguments to give an unbiased sample from the stationary distribution of a Markov chain on a finite state space. In the general case, the algorithm starts trajectories from all states at some time in the past until time t= 0. If the end state is the same for all trajectories, then the chain has coupled and the end state has the stationary distribution of the Markov chain. Otherwise, the simulations are started further in the past. The complexity of the algorithm depends on the cardinality of the state space, which is prohibitive for most applications. This simulation technique becomes efficient if the Markov chain is monotone, as the monotonicity allows one to consider only the extremal trajectories of the system. In the non-monotone case, it is possible to avoid generating all the trajectories by considering bounding processes (Huber, 2004). The construction of these bounding processes is model-dependent and in general not straightforward. In a recent work , we proposed an algorithm to construct bounding processes, called *envelopes*, for the case of a finite Markov chain with a lattice structure. We show that this algorithm is efficient for some classes of non-monotone queueing networks, such as networks of queues with batch arrivals, queues with fork and join nodes and/or with negative customers.
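The basic monotone case can be made concrete on a toy birth-death chain, for which the classical Propp-Wilson scheme only needs to follow the two extremal trajectories. This is the textbook algorithm, not the envelope construction itself, and the chain and its parameters are illustrative:

```python
import random

def cftp_birth_death(p=0.3, n_states=6, rng=None):
    """Propp-Wilson coupling from the past for a monotone birth-death
    chain on {0, ..., n_states-1}: up with probability p, down otherwise,
    reflecting at both ends.  Monotonicity lets us track only the two
    extremal trajectories, started from the bottom and top states.
    Returns an exact sample from the stationary distribution."""
    rng = rng or random.Random()
    top = n_states - 1
    us = {}          # common random numbers, indexed by (negative) time
    horizon = 1
    while True:
        for t in range(-horizon, 0):
            if t not in us:      # crucial: reuse past randomness
                us[t] = rng.random()
        lo, hi = 0, top
        for t in range(-horizon, 0):
            step = 1 if us[t] < p else -1
            lo = min(max(lo + step, 0), top)
            hi = min(max(hi + step, 0), top)
        if lo == hi:             # coalescence: exact stationary sample
            return lo
        horizon *= 2             # otherwise restart further in the past
```

Detailed balance gives the stationary law pi(i) proportional to (p/(1-p))^i, which the CFTP samples reproduce without any burn-in heuristics.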

In an ongoing work with B. Gaujal [INRIA Rhône-Alpes], F. Pin [ENS Paris] and J.-M. Vincent [Université Joseph Fourier], we are extending these results to a more general framework of piecewise space-homogeneous Markov chains (each event divides the state space into a few zones, and the transition of the event is constant within each zone).

Cellular automata were first introduced as deterministic functions F:A^{E}→A^{E}, where A is a finite alphabet and E a discrete space (such as Z^{d} or Z/nZ). Their particularity is to be characterized by a local transition function f:A^{V}→A, for some finite neighbourhood V, through the relation F(x)_{k}= f((x_{k+v})_{v in V}). In an ongoing work with J. Mairesse [LIAFA, CNRS and Université Paris 7] and I. Marcovici [ENS Lyon], we consider probabilistic cellular automata (PCA), which are defined by a local function f:A^{V}→M(A), where M(A) denotes the set of probability measures on A. We study some properties of their invariant measures, and propose an algorithm allowing for perfect sampling of the stationary distribution of an ergodic PCA, which is an extension of the envelope algorithm in .

Numerical methods for solving Markov chains are in general inefficient if the state space of the chain is very large (or infinite) and lacks a simple repeating structure. An alternative to solving such chains is to construct models that are simple to analyze and that provide bounds for a reward function of interest. In a recent work , we presented a new bounding method for Markov chains inspired by Markov reward theory; our method constructs bounds by redirecting selected sets of transitions, facilitating an intuitive interpretation of the modifications of the original system. Redirecting sets of transitions is based on an extension of precedence relations to sets of states (van Houtum et al. 1998), and allows one to design more accurate bounds (e.g. bounds having the same mean behavior). We show that our method is compatible with strong aggregation of Markov chains; thus we can obtain bounds for the initial chain by analyzing a much smaller chain. We apply the precedence relations on sets of states, combined with aggregation, to prove bounds on order fill rates for an inventory system of service tools with joint demands/returns. We are currently extending these results to Markov decision processes.

In an ongoing work with I.M.H. Vliegen [Technische Universiteit Eindhoven, The Netherlands] and A. Scheller-Wolf [Carnegie Mellon University, USA], we apply these results to an optimization problem of base stock levels for service tools inventory. The first results of this work were published as part of the PhD thesis of I. Vliegen (defended in November 2009).

In an ongoing work with V. Gupta [Carnegie Mellon University, USA] and J. Mairesse, Ana Bušić studies the bipartite matching model of customers and servers, a queueing model introduced by Caldentey, Kaplan and Weiss (Adv. in Appl. Probab., 2009). Let C and S be the sets of customer and server classes. At each time step, a pair of customer and server arrives according to a joint probability measure μ. Also, a pair of *matched* customer and server, if there exists any, departs from the system. *Authorized matchings* are given by a fixed bipartite graph G= (C, S, E), where E is a subset of C×S. The evolution of the model can be described by a discrete-time Markov chain, where the state of the chain is given by two equal-length words of unmatched customers and servers. The stability properties are studied under various BF (Buffer First) matching policies, i.e. policies with priority given to customers/servers that are already present in the buffer. This includes the following policies: FIFO, priorities, MLQ (Match the Longest Queue), and MSQ (Match the Shortest Queue). Assume that the model cannot be decomposed into two independent submodels. Necessary stability conditions are then given by:

μ_{C}(U) < μ_{S}(S(U)) for every nonempty proper subset U of C, and μ_{S}(V) < μ_{C}(C(V)) for every nonempty proper subset V of S,

where S(U) denotes the servers that can be matched with customers in U (and C(V) is defined dually), and μ_{C}, μ_{S} denote the marginals of μ.

The notion of an *extremal facet* is introduced. For models with only extremal facets, the stability region is maximal for any BF policy, i.e. the above conditions are also sufficient. For models with non-extremal facets, the situation is more intricate. The MLQ policy has a maximal stability region. In the case of a tree, there is a static priority policy that has a maximal stability region.
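The necessary conditions are simple enough to check mechanically for a small model. The sketch below (the two-class example, its arrival measure and the helper names are all hypothetical) enumerates proper subsets and tests the strict inequalities:

```python
from itertools import combinations

def neighbors(subset, edges, side=0):
    """Classes on the other side matchable with some class in `subset`:
    S(U) for side=0 (customer subsets), C(V) for side=1 (server subsets)."""
    return {e[1 - side] for e in edges if e[side] in subset}

def necessary_conditions_hold(classes_c, classes_s, edges, mu):
    """Check the necessary stability conditions of the bipartite matching
    model: mu_C(U) < mu_S(S(U)) for every nonempty proper subset U of C,
    and symmetrically for subsets of S.  `mu` maps (customer, server)
    class pairs to their joint arrival probability."""
    mu_c = {c: sum(v for (cc, s), v in mu.items() if cc == c)
            for c in classes_c}
    mu_s = {s: sum(v for (c, ss), v in mu.items() if ss == s)
            for s in classes_s}

    def check(classes, marg, other_marg, side):
        for k in range(1, len(classes)):          # proper nonempty subsets
            for u in combinations(classes, k):
                lhs = sum(marg[x] for x in u)
                rhs = sum(other_marg[y]
                          for y in neighbors(set(u), edges, side))
                if lhs >= rhs:
                    return False
        return True

    return (check(classes_c, mu_c, mu_s, 0)
            and check(classes_s, mu_s, mu_c, 1))
```

On a complete bipartite matching graph with a full-support arrival measure the conditions hold, while a graph allowing only a perfect matching can violate them as soon as the marginals are unbalanced.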

The main topics covered in 2009 concern transport equations for Scalable TCP and for Split TCP.

The idea of Split TCP is to replace a multihop, end-to-end TCP connection by a cascade of shorter TCP connections using intermediate nodes as proxies, thus achieving higher throughput. In the model that we developed with G. Carofiglio [Bell Laboratories, Alcatel–Lucent] and S. Foss, we consider two long-lived TCP-Reno flows traversing two links with different medium characteristics in cascade. A buffer at the end of the first link prevents the loss of packets that cannot be immediately forwarded on the second link by storing them temporarily. The target of our study is the characterization of the TCP throughput on both links, as well as the buffer occupancy. In , we establish the partial differential equations for the throughput dynamics jointly with that of the buffer occupancy in the proxy; we determine the stability conditions by exploiting some intrinsic monotonicity and continuity properties of the system, and we derive tail asymptotics for the buffer occupancy in the proxy and for end-to-end delays.

The unsatisfactory performance of TCP in high-speed wide-area networks has led to several TCP variants, such as HighSpeed TCP, FAST TCP, Scalable TCP and CUBIC, all aimed at speeding up the window update algorithm. In a joint work with G. Carofiglio , we focus on Scalable TCP, which belongs to the class of Multiplicative Increase Multiplicative Decrease (MIMD) congestion control protocols. We present a new stochastic model for the evolution of the instantaneous throughput of a single Scalable TCP flow in the congestion avoidance phase, under the assumption of a constant per-packet loss probability. This model allows one to derive several closed-form expressions for the key stationary distributions associated with this protocol: we characterize the throughput obtained by the flow, the time separating multiplicative decrease events, the number of bits transmitted over certain time intervals and the size of the rate decrease. Several applications leveraging these closed-form expressions are considered, with a particular emphasis on QoS guarantees in the context of dimensioning.
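The per-packet dynamics underlying this model are easy to mimic: Scalable TCP adds 0.01 to the window on every ACK (a multiplicative increase per RTT) and multiplies it by 0.875 on a loss. The sketch below simulates this recursion under a constant per-packet loss probability; it is a numerical companion to the model, not the paper's closed forms, and the initial window and floor values are arbitrary choices:

```python
import random

def mean_window(loss_prob, a=0.01, b=0.125, packets=200000,
                w0=10.0, w_min=1.0, seed=17):
    """Per-packet simulation of Scalable TCP congestion avoidance: each
    ACK adds `a` to the window (a multiplicative increase of rate `a`
    per RTT), each loss multiplies it by (1 - b).  Losses occur
    independently per packet with probability `loss_prob`.  Returns the
    time-average window, a proxy for the stationary throughput."""
    rng = random.Random(seed)
    w, total = w0, 0.0
    for _ in range(packets):
        if rng.random() < loss_prob:
            w = max(w_min, w * (1.0 - b))   # multiplicative decrease
        else:
            w += a                          # per-ACK increase
        total += w
    return total / packets
```

Balancing the drift gives a mean window close to a(1-q)/(b q), so dividing the loss probability by ten multiplies the average window by roughly ten, which the simulation reproduces.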

Malicious software, or malware for short, has become a major security threat. While originating in criminal behavior, its impact is also influenced by the decisions of legitimate end users. Getting agents in the Internet, and in networks in general, to invest in and deploy security features and protocols is a challenge, in particular because of the economic reasons arising from the presence of network externalities. Our goal in this work is to model and quantify the impact of such externalities on the investment in security features in a network. In , we study a network of interconnected agents which are subject to epidemic risks such as those caused by propagating viruses and worms. Each agent can decide whether or not to invest some amount in self-protection and to deploy security solutions, which decreases the probability of contagion. Borrowing ideas from random graph theory, we solve this "micro" model explicitly and compute the fulfilled-expectations equilibria. We are able to compute the network externalities as a function of the parameters of the epidemic. We show that the network externalities have a public part and a private one. As a result of this separation, some counter-intuitive phenomena can occur: there are situations where the incentive to invest in self-protection decreases as the fraction of the population investing in self-protection increases. In a situation where the protection is strong and ensures that the protected agent cannot be harmed by the decisions of others, we show that the situation is similar to a free-rider problem. In a situation where the protection is weaker, we show that the network can exhibit a critical mass. We also look at the interaction with the security supplier. In the case where security is provided by a monopolist, we show that the monopolist takes advantage of these positive network externalities by providing low-quality protection.
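The externality itself can be illustrated with a deliberately crude Monte Carlo model (an assumption-laden toy, not the fulfilled-expectations computation of the paper): agents sit on a sparse random graph, a fraction of them buys perfect protection, and infection percolates from a single random seed among the unprotected. The risk faced by an unprotected agent then falls as the protected fraction grows, even though that agent invested nothing:

```python
import random

def unprotected_risk(frac_protected, n=300, mean_degree=4.0, tau=0.4,
                     graphs=60, seed=23):
    """Toy epidemic-risk model on Erdos-Renyi-like graphs.  A fraction of
    agents buys (here: perfect) self-protection; each edge transmits
    infection with probability tau from a single random seed agent.
    Returns the average fraction of *unprotected* agents infected.
    All numbers are illustrative, not the paper's calibration."""
    rng = random.Random(seed)
    p_edge = mean_degree / (n - 1)
    risk = 0.0
    for _ in range(graphs):
        protected = set(rng.sample(range(n), int(frac_protected * n)))
        unprot = [i for i in range(n) if i not in protected]
        # Sample the transmissible subgraph among unprotected agents only
        # (edge present AND transmitting: probability p_edge * tau).
        adj = {i: [] for i in unprot}
        for idx, i in enumerate(unprot):
            for j in unprot[idx + 1:]:
                if rng.random() < p_edge * tau:
                    adj[i].append(j)
                    adj[j].append(i)
        # Infect from one random unprotected seed; spread along open edges.
        seed_node = rng.choice(unprot)
        infected, stack = {seed_node}, [seed_node]
        while stack:
            u = stack.pop()
            for v in adj[u]:
                if v not in infected:
                    infected.add(v)
                    stack.append(v)
        risk += len(infected) / len(unprot)
    return risk / graphs
```

With these illustrative numbers, no protection leaves the epidemic supercritical while 80% coverage pushes it below the percolation threshold, collapsing the risk of the remaining unprotected agents.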

Entities in the Internet, ranging from individuals and enterprises to service providers, face a broad range of epidemic risks such as worms, viruses, and botnet-driven attacks. Those risks are interdependent: the decision by an entity to invest in security and self-protect affects the risk faced by others (for example, the risk faced by an individual decreases when its provider increases its investments in security). As a result, entities tend to invest too little in self-protection, relative to the socially efficient level, by ignoring the benefits their investment confers on others.

TREC is actively working on a book project focused on the use of the stochastic geometry framework for the modeling of wireless communications.

Stochastic geometry is a rich branch of applied probability which allows one to study random phenomena in the plane or in higher dimensions. It is intrinsically related to the theory of point processes. Initially its development was stimulated by applications to biology, astronomy and material sciences. Nowadays, it is also used in image analysis. During the 2003-2008 period, we contributed to proving that it can also be used to analyze wireless communication networks. The reason is that the geometry of the locations of mobiles and/or base stations plays a key role, since it determines the signal-to-interference ratio of each potential channel and hence the possibility of establishing a given set of communications simultaneously at a given bit rate.

Stochastic geometry provides a natural way of defining (and computing) macroscopic properties of wireless networks, by averaging over all potential geometric patterns of, e.g., the mobiles. Its role is hence similar to that played by the theory of point processes on the real line in classical queueing theory. The methodology was initiated in , and it was further developed through several papers including , , , , .
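One such spatial average, for instance the probability that a typical link reaches a target SINR under a Poisson field of interferers with Rayleigh fading, can be estimated by Monte Carlo over random network configurations. The sketch below uses illustrative parameters only:

```python
import math, random

def poisson_sample(mean, rng):
    """Poisson random variable via exponential inter-arrival times."""
    t, n = 0.0, 0
    while True:
        t += rng.expovariate(1.0)
        if t > mean:
            return n
        n += 1

def sinr_coverage(lam=0.05, radius=10.0, alpha=4.0, noise=1e-3,
                  theta=1.0, link=1.0, trials=2000, seed=0):
    """Monte-Carlo estimate of P(SINR > theta) for a receiver at the
    origin, served from distance `link`, interfered by a Poisson field
    of intensity `lam` in a disc of radius `radius`, with Rayleigh
    fading (exponential powers) and path-loss exponent alpha."""
    rng = random.Random(seed)
    covered = 0
    for _ in range(trials):
        interference = 0.0
        for _ in range(poisson_sample(lam * math.pi * radius ** 2, rng)):
            r = radius * math.sqrt(rng.random())   # uniform in the disc
            interference += rng.expovariate(1.0) * max(r, 0.01) ** (-alpha)
        signal = rng.expovariate(1.0) * link ** (-alpha)
        covered += signal > theta * (noise + interference)
    return covered / trials

p_easy = sinr_coverage(theta=0.1)
p_hard = sinr_coverage(theta=10.0)
```

Averaging over point configurations in this way is exactly the "spatial average" role that stochastic geometry plays here, with the Monte Carlo loop standing in for the closed-form Poisson computations.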

The two-volume book , to appear in the series *Foundations and Trends in Networking* (NOW Publishers;
http://

More precisely, Volume I provides a concise introduction to the relevant models of stochastic geometry, such as spatial shot-noise processes, coverage processes and random tessellations, and to variants of these basic models which incorporate information-theoretic notions such as the signal-to-noise ratio.

Volume II shows how these space-time averages can be used to analyze and optimize the medium access control and routing protocols of interest in large wireless communication networks. This is based on both qualitative and quantitative results. The most important qualitative results are in terms of phase transitions for infinite population models. Quantitative results leverage closed form expressions for the key network performance characteristics.

The monograph provides a comprehensive and unified methodology for wireless network design and gives direct access to an emerging and fast-growing branch of stochastic modeling.

Stochastic geometric models of wireless networks have in general been investigated in the Poissonian setting (see , ). The first aim of the PhD thesis of Yogeshwaran D. is to study certain performance measures of wireless networks using stochastic geometric tools in the non-Poissonian setting. Due to the difficulty of obtaining closed-form expressions for various performance measures in non-Poissonian settings (see ), we attempted a qualitative study of the performance measures.

Directionally convex (dcx) ordering is a tool for comparing the dependence structure of random vectors that also takes into account the variability of the marginal distributions. When extended to random fields, it concerns the comparison of all finite-dimensional distributions. In , viewing locally finite measures as non-negative fields of measure values indexed by the bounded Borel subsets of the space, we formulate and study the dcx ordering of random measures on locally compact spaces. We show that the dcx order is preserved under some of the natural operations considered on random measures and point processes, such as deterministic displacement of points, independent superposition and thinning, as well as independent, identically distributed marking. Further operations, such as position-dependent marking and displacement of points, do not preserve the dcx order on all point processes, but are shown to preserve it on Cox point processes. We also examine the impact of the dcx order on second moment properties, in particular on clustering and on Palm distributions. Comparisons of Ripley's functions and pair correlation functions, as well as examples, seem to indicate that point processes higher in the dcx order cluster more. As the main result, we show that non-negative integral shot-noise fields with respect to dcx-ordered random measures inherit this ordering from the measures. Numerous applications of this result are shown, in particular to the comparison of various Cox processes and of some performance measures of wireless networks, in both of which shot-noise fields appear as key ingredients.
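The clustering comparison can be illustrated numerically: a Matérn cluster process (a Cox process) with the same mean intensity as a Poisson process exhibits over-dispersed cell counts, a simple second-moment signature of clustering. The parameters below are arbitrary:

```python
import math, random

def poisson_sample(mean, rng):
    t, n = 0.0, 0
    while True:
        t += rng.expovariate(1.0)
        if t > mean:
            return n
        n += 1

def poisson_points(lam, size, rng):
    """Homogeneous Poisson point process on [0, size]^2."""
    n = poisson_sample(lam * size * size, rng)
    return [(rng.random() * size, rng.random() * size) for _ in range(n)]

def matern_cluster(lam_parent, mean_kids, r, size, rng):
    """Matern cluster process: Poisson parents, each with a Poisson
    number of offspring placed uniformly in a disc of radius r.
    Same mean intensity as Poisson(lam_parent * mean_kids)."""
    pts = []
    for (px, py) in poisson_points(lam_parent, size, rng):
        for _ in range(poisson_sample(mean_kids, rng)):
            a = rng.random() * 2 * math.pi
            d = r * math.sqrt(rng.random())
            pts.append((px + d * math.cos(a), py + d * math.sin(a)))
    return pts

def count_variance(points, size, cell=1.0):
    """Variance and mean of the per-cell counts: for Poisson the
    variance equals the mean; clustered processes are over-dispersed."""
    k = int(size / cell)
    counts = [[0] * k for _ in range(k)]
    for (x, y) in points:
        i, j = int(x // cell), int(y // cell)
        if 0 <= i < k and 0 <= j < k:
            counts[i][j] += 1
    flat = [c for row in counts for c in row]
    m = sum(flat) / len(flat)
    return sum((c - m) ** 2 for c in flat) / len(flat), m

rng = random.Random(1)
# both processes have mean intensity 2 points per unit area
var_poisson, mean_poisson = count_variance(poisson_points(2.0, 20, rng), 20)
var_cluster, mean_cluster = count_variance(matern_cluster(0.2, 10, 0.5, 20, rng), 20)
```

The clustered counts show a variance-to-mean ratio well above 1, consistent with the heuristic that processes that cluster more are "higher" in this kind of comparison.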

Heuristics indicate that clustering of a point process negatively impacts the percolation of the related continuum percolation model, also called the Boolean model. In current work in progress, we move towards a formal statement of this heuristic. Namely, we consider some critical radii for the continuum percolation model and show that these are greater for Cox point processes that are greater in the so-called dcx order. This integral order has previously been shown (see ) to be suitable for comparing the dependence structure and clustering properties of point processes. Extensions to general point processes, as well as the comparison of critical levels for percolation of the level sets of random fields, are also discussed. Further, we give examples of point processes whose Boolean models percolate better than Poisson Boolean models.

Random geometric graphs (RGGs) have played an important role in providing a framework for modeling in wireless communication, starting with the pioneering work on connectivity by Gilbert (1961); . The vertices or points of the graph represent communicating entities such as base stations. These vertices are assumed to be distributed in space randomly according to some point process, typically a Poisson point process. An edge between two points means that the corresponding entities are able to communicate with each other. In the classical model, an edge exists between any pair of nodes whose distance is less than some critical threshold. A variant of this classical model that exhibits the union of the coverage regions of all nodes is also referred to in stochastic geometry as the Boolean model. In the following, more fundamental works, we study some variants and extensions of the classical models, more or less related to wireless communication networks.
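The classical (Gilbert) model and its connectivity behaviour are easy to reproduce: place points at random in the unit square and connect pairs closer than a threshold radius; below the critical radius only small components appear, while above it a giant component emerges. The parameters below are illustrative:

```python
import random

def gilbert_components(points, radius):
    """Component sizes of the Gilbert random geometric graph: an edge
    joins two points at distance < radius. Union-find over an O(n^2)
    pair scan; a grid-based neighbour search would scale better."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    r2 = radius * radius
    for i, (xi, yi) in enumerate(points):
        for j in range(i + 1, len(points)):
            xj, yj = points[j]
            if (xi - xj) ** 2 + (yi - yj) ** 2 < r2:
                parent[find(i)] = find(j)
    sizes = {}
    for i in range(len(points)):
        root = find(i)
        sizes[root] = sizes.get(root, 0) + 1
    return sizes

rng = random.Random(0)
pts = [(rng.random(), rng.random()) for _ in range(500)]
dense = max(gilbert_components(pts, 0.08).values()) / len(pts)   # supercritical
sparse = max(gilbert_components(pts, 0.01).values()) / len(pts)  # subcritical
```

With 500 points, radius 0.08 gives a mean degree of about 10 and a component containing most nodes, whereas radius 0.01 leaves only tiny components.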

We investigate percolation in the AB Poisson-Boolean model in d-dimensional Euclidean space, and asymptotic properties of AB random geometric graphs on Poisson points in [0, 1]^{d}. The AB random geometric graph we study is a generalization to the continuum of the AB percolation model on discrete lattices. We show the existence of AB percolation for all d ≥ 2, and derive bounds for a critical intensity. For AB random geometric graphs, we derive a weak law result for the largest nearest-neighbor distance and almost sure asymptotic bounds for the connectivity threshold. This submitted work can be found in .

Delay tolerant networks, in the simplest terms, are networks that take into account the time delay in the transmission of information along a network. First passage percolation models have been found to be useful for the study of the transmission of information along networks. We consider spatial first passage percolation on stationary graphs constructed on point processes, with delayed propagation of the information at the vertices of the graph. Depending on the nature of the time delay, one can obtain various models. In such models, the time for propagation of information along the network does not possess the sub-additivity property, a key component in the study of first passage percolation models. This is a work in progress.

The following mathematical formalism, proposed in , is useful when studying macroscopic properties of routing in MANETs, and in particular those produced by opportunistic routing in the sense described in Section . One can model the users of a mobile network as the points of a stochastic point process, where each node can be a transmitter or a receiver in each time step. The SINR graph is a geometric graph whose nodes are the points of a point process and where an edge is present between a transmitter and a receiver if the SINR at the receiver is above a certain threshold. Due to fluctuations in propagation and in the MAC, these edges vary over time. In order to account for this fact, in we introduce and analyze SINR graphs which have both a space and a time dimension. The spatial aspect originates from the random locations of the network nodes in the Euclidean plane. The time aspect stems from the random transmission policy followed by each network node and from the time variations of the wireless channel characteristics. The combination of these random space and time aspects leads to fluctuations of the SINR experienced by the wireless channels, which in turn determine the progression of packets in space and time in such a network.
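A single time slot of such a space-time SINR graph can be sketched as follows. This is an illustrative Aloha-type model of our own, not the exact one of the paper: each node independently decides to transmit, and a directed edge exists from a transmitter to a silent node if the SINR there exceeds a threshold.

```python
import math, random

def sinr_edges(nodes, p_tx=0.5, alpha=4.0, noise=1e-4, theta=0.5, seed=0):
    """Directed edges of one slot of a slotted-Aloha SINR graph.
    Each node transmits with probability p_tx; Rayleigh fading gives
    i.i.d. exponential powers; an edge (i -> j) requires SINR_j > theta."""
    rng = random.Random(seed)
    n = len(nodes)
    tx = [rng.random() < p_tx for _ in range(n)]
    # received power from node i at node j: fading * distance^(-alpha)
    power = [[0.0 if i == j else
              rng.expovariate(1.0) *
              max(math.hypot(nodes[i][0] - nodes[j][0],
                             nodes[i][1] - nodes[j][1]), 1e-3) ** (-alpha)
              for j in range(n)] for i in range(n)]
    edges = []
    for i in range(n):
        if not tx[i]:
            continue
        for j in range(n):
            if j == i or tx[j]:
                continue          # a node cannot receive while transmitting
            interference = sum(power[k][j] for k in range(n)
                               if tx[k] and k != i)
            if power[i][j] > theta * (noise + interference):
                edges.append((i, j))
    return edges

rng = random.Random(1)
nodes = [(rng.random() * 10, rng.random() * 10) for _ in range(30)]
edges = sinr_edges(nodes)
```

Re-running with different seeds makes the transmitter set, the fading variables and hence the edge set fluctuate from slot to slot, which is exactly the time dimension of the model.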

The times of occurrence of earthquakes in a given area of seismic activity form a simple point process N on the real line, where N((a, b]) is the number of shocks in the time interval (a, b]. The dynamics governing the process can be expressed through the stochastic intensity λ(t). In the stress release model, for t ≥ 0, the intensity is a function of the strain X_{t} = X_{0} + ct − Σ_{n=1}^{N(t)} Z_{n}, where c > 0, (Z_{n}) is an i.i.d. sequence of non-negative random variables with finite expectation, and X_{0} is some real random variable. The process (X_{t}) is known to be ergodic.

Another model of interest in seismology is the Hawkes branching process, where the stochastic intensity is λ(t) = ν(t) + Σ_{T_{n} < t} h(t − T_{n}), where h is a non-negative function called the fertility rate and ν is a non-negative integrable function. Such point processes appear in the specialized literature under the name ETAS (Epidemic Type Aftershock Sequence) and are used to model aftershocks. It is well known that the corresponding process “dies out” in finite time under the condition ∫_{0}^{∞} h(u) du < 1.

A model mixing stress release and Hawkes aftershocks combines both of the above intensities. The positive constant c is the rate at which the strain builds up. If there is a shock at time t, then the strain is relieved by the quantity Z_{N(t)}. Each shock (primary or secondary) at time t generates aftershocks according to a Poisson process of intensity h(· − t). In , we give necessary and sufficient conditions of ergodicity for this model.

Belief propagation is a non-rigorous, decentralized and iterative algorithmic strategy for solving complex optimization problems on huge graphs by purely local propagation of dynamic messages along their edges. Its remarkable performance in various domains of application, from statistical physics to image processing and error-correcting codes, has motivated a lot of theoretical work on the crucial question of the convergence of beliefs despite the cycles, and in particular on how convergence evolves as the size of the underlying graph grows to infinity. However, a complete and rigorous understanding of these remarkable emergence phenomena (general conditions for convergence, asymptotic speed, influence of the initialization) is still missing. A new idea consists in using the topological notion of local weak convergence of random geometric graphs to define a limiting local structure as the number of vertices grows to infinity, and then to replace the asymptotic study of the phenomenon by its direct analysis on the infinite graph.

This method has already allowed us to establish asymptotic convergence at constant speed in the special case of the famous optimal assignment problem, resulting in a distributed algorithm with asymptotic complexity O(n^{2}), compared to O(n^{3}) for the best-known exact algorithm. This is joint work with Devavrat Shah (MIT). It has been published in Mathematics of Operations Research and appeared in SODA'09 . We hope this method can be extended to other optimization problems on tree-like graphs and will become a powerful tool in the fascinating quest for a general mathematical understanding of belief propagation.
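For concreteness, min-sum message passing for the assignment problem can be sketched as below. This is a simplified scheme in the spirit of the max-product matching literature (the instance and variable names are ours, and it is a sketch rather than a robust solver); on instances with a unique optimum the estimated assignment is known to stabilize to it.

```python
import itertools

def bp_assignment(cost, iters=50):
    """Min-sum belief propagation for min-cost assignment on the
    complete bipartite graph. a[i][j]: message from row i to column j;
    b[j][i]: message from column j to row i."""
    n = len(cost)
    a = [[0.0] * n for _ in range(n)]
    b = [[0.0] * n for _ in range(n)]
    for _ in range(iters):
        new_a = [[cost[i][j] - min(b[k][i] for k in range(n) if k != j)
                  for j in range(n)] for i in range(n)]
        new_b = [[cost[i][j] - min(a[l][j] for l in range(n) if l != i)
                  for i in range(n)] for j in range(n)]
        a, b = new_a, new_b
    # each row selects the column with the smallest incoming belief
    return [min(range(n), key=lambda j: b[j][i]) for i in range(n)]

def brute_force(cost):
    """Exhaustive check over all permutations (for tiny instances)."""
    n = len(cost)
    return min(itertools.permutations(range(n)),
               key=lambda p: sum(cost[i][p[i]] for i in range(n)))

cost = [[0.0, 1.0], [2.0, 10.0]]   # optimum: row 0 -> col 1, row 1 -> col 0
match = bp_assignment(cost)        # -> [1, 0], agreeing with brute_force
```

Note that a greedy choice would pair row 0 with column 0 here; the messages propagate enough global information to avoid that trap.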

A very simple example of an algorithmic problem solvable by dynamic programming is to maximize, over , the objective function , for given positive weights. This problem, with random weights, provides a test example for studying the relationship between optimal and near-optimal solutions of combinatorial optimization problems. In we showed that, amongst solutions differing from the optimal solution in a small proportion δ of places, we can find near-optimal solutions whose objective function value differs from the optimum by a factor of order δ^{2}, but not of smaller order. We conjecture that this relationship holds widely in the context of dynamic programming over random data, and Monte Carlo simulations for the Kauffman-Levin NK model are consistent with the conjecture. This work is a technical contribution to a broad program, initiated in Aldous-Percus (2003), of relating such scaling exponents to the algorithmic difficulty of optimization problems.
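The report does not reproduce the exact objective; as an illustration of the kind of problem meant here (an assumption on our part, chosen for concreteness), consider the classic dynamic program maximizing the total weight of a subset of indices with no two adjacent:

```python
import random

def dp_max_no_two_adjacent(xi):
    """Maximize the sum of xi[i] over subsets of indices containing no
    two adjacent positions, via the standard take/skip recursion."""
    take, skip = 0.0, 0.0   # best value taking / skipping the last index
    for x in xi:
        take, skip = skip + x, max(take, skip)
    return max(take, skip)

rng = random.Random(0)
xi = [rng.random() for _ in range(1000)]   # random i.i.d. weights
opt = dp_max_no_two_adjacent(xi)
```

Perturbing the optimal subset in a proportion δ of the positions and re-optimizing the rest then gives a direct Monte Carlo experiment on the δ² scaling discussed above.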

The bootstrap percolation model has been used in several related applications. In , we consider bootstrap percolation in living neural networks. Recent experimental studies of living neural networks reveal that the global activation of neural networks induced by electrical stimulation can be explained using the concept of bootstrap percolation on a directed random network. The experiment consists in externally activating an initial random fraction of the neurons and observing the firing process until it reaches equilibrium. The final fraction of active neurons depends in a nonlinear way on the initial fraction. Our main result in is a theorem which enables us to find the final proportion of fired neurons in the asymptotic case, using random directed graphs with given node degrees as the model of the interacting network. This gives a rigorous mathematical proof of a phenomenon observed by physicists in neural networks .
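The nonlinear dependence of the final active fraction on the initial one is visible in a small simulation of bootstrap percolation on a random directed graph (illustrative parameters of our own; each node fires once enough of its in-neighbours have fired):

```python
import random

def bootstrap_final_fraction(n, deg, threshold, initial_frac, seed=0):
    """Bootstrap percolation on a random directed graph: each node has
    `deg` uniformly random in-neighbours and fires once at least
    `threshold` of them have fired; a random initial fraction fires
    spontaneously. Returns the final fraction of fired nodes."""
    rng = random.Random(seed)
    in_nb = [rng.sample(range(n), deg) for _ in range(n)]
    active = [rng.random() < initial_frac for _ in range(n)]
    changed = True
    while changed:                    # sweep until a fixed point
        changed = False
        for v in range(n):
            if not active[v] and \
               sum(active[u] for u in in_nb[v]) >= threshold:
                active[v] = True
                changed = True
    return sum(active) / n

weak = bootstrap_final_fraction(2000, 10, 3, 0.01)    # subcritical seed
strong = bootstrap_final_fraction(2000, 10, 3, 0.30)  # cascades widely
```

A small initial stimulation barely spreads, while one past the critical level triggers near-global activation: the sharp nonlinearity observed in the experiments.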

In , we adapt the model given in , which is defined on graphs, to an equivalent model on hypergraphs. For this, we generalize a result obtained by Darling and Norris in , which deals with the k-core of a random hypergraph. The proof of this result was the subject of the master's training course report of E. Coupechoux . We are now trying to deduce from this result new results on the giant component of random hypergraphs.

Motivated by the modeling of the spread of viruses or epidemics with coordination among agents, we introduce in a new model generalizing both the basic contact model and bootstrap percolation. We analyze this percolated threshold model when the underlying network is a random graph with a fixed degree distribution. Our main results unify many results in the random graphs literature. In particular, we provide a necessary and sufficient condition under which a single node can trigger a large cascade. We then quantify the possible impact of an attacker against a degree-based vaccination and an acquaintance vaccination. We define a security metric allowing one to compare the different vaccinations. The acquaintance vaccination requires no knowledge of the node degrees or any other global information, and is shown to be much more efficient than the uniform vaccination in all cases.

TREC is a partner of the 3-year ANR project called CMON, jointly with Thomson, LIP6, the INRIA project-team Planète and the community
http://

The collaboration with the Paris Lab of Thomson materializes in:

joint seminars and reading groups, notably the Paris-Networking series (
http://

joint courses taught with L. Massoulié and A. Chaintreau,

joint invitations of well-known scientists (e.g. V. Anantharam from Berkeley and D. Veitch from Melbourne),

a joint patent,

partnership in a 3-year ANR project on network measurements called
*CMON* (see above).

In 2009, the interaction with the research lab of Sprint (Sprint ATL, in Burlingame, California) focused on two main topics:

Bayesian inference to locate mobiles in cellular networks .

The analysis of risks on the Internet through an interaction with J. Bolot (see Sections and ).

This collaboration resulted in several joint papers this year again.

This 6-year grant started in September 2006 and bears on the modeling of mobile ad hoc networks. It allowed us to hire, in 2007, a PhD student, D. Yogeshwaran, from IISc Bangalore. The work of D. Yogeshwaran bears on the stochastic comparison of random measures, point processes and shot-noise fields.

In 2009 we completed the third phase of a research project with the Network Strategy Group of Alcatel Antwerp (Danny de Vleeschauwer and Koen Laevens) and with N2NSoft (Dohy Hong). This project focused on the modeling of the interaction of a large collection of multimedia sources that join and leave and that share an access network. The main objective was the design of optimal choking policies for the transport of layer-encoded video in such access networks. The third phase of the project focused on optimal caching strategies for video chunks in this context.

Since 2007 the collaboration with France Télécom has not been part of any formal framework. Spontaneous collaborations continue with Mohamed Karray, with whom we work on the coverage and capacity of CDMA, UMTS and OFDM networks. This has resulted in three patents. The pertinence of our approach has already been recognized by Orange Corporate, which uses some of our methods in its dimensioning tool *UTRANDIM*. This year the collaboration led to two publications on the performance evaluation and dimensioning of cellular networks (see Section ) and to the co-advising of a Master's student, François-Xavier Klepper.

TREC is currently a partner of the *European Network of Excellence (NoE)* called Euro-NF (
http://

F. Baccelli is a member of the working group 7.3 of IFIP.

The following scientists gave talks in 2009:

France

Marc Lelarge from
*INRIA-ENS, France*, talking on “Economics of Malware: Epidemic Risks Model, Network Externalities and Incentives”; February 2009,

Fernando Peruani from
*CEA, France*, talking on “Information spreading in dynamical networks of mobile agents”; May 2009,

Nicolas Gast from
*INRIA, France*, talking on, “A Mean Field Approach for Optimization and Applications”; May 2009,

Calvin Chen from
*INRIA, France*, talking on “User Unsuppressible Protocol Sequences for Collision Channels without Feedback”; May 2009,

Anthony P. Metcalfe from
*Paris 6, France*, talking on “Universality properties of Gelfand-Tsetlin patterns”; December 2009,

Furcy Pin from
*ENS*, talking on “Statistical Estimation of Delays in a Multicast Tree”; December 2009,

Bruno Kauffmann from
*Inria and Paris 6*, talking on “Inverse problems in queueing networks”; December 2009,

Europe

Guenter Last from
*University of Karlsruhe, Germany*, talking on “Invariant transports of random measures”; January 2009,

Florence Bénézit from
*EPFL, Switzerland*, talking on “Locating IP congested links with unicast probes”; March 2009,

Patrick Thiran from
*EPFL, Switzerland*, talking on “Locating IP congested links with unicast probes”, March 2009,

Hermann Thorisson from
*University of Iceland*, talking on “Coupling and Convergence in Density and in Distribution”; June 2009,

Moez Draief from
*Imperial College London*, talking on “Convergence Speed of Binary Interval Consensus”; September 2009,

Dario Maggiorini from
*University of Milan, Italy*, talking on “Network Traffic Over a Public Transportation Network: a Probabilistic Approach”; December 2009.

Asia, Australia, Canada, USA

Massimo Franceschetti from
*University of California, San Diego, USA*, talking on “Information-theoretic and physical limits on the capacity scaling of wireless ad-hoc networks”; January 2009,

William A. Massey from
*Princeton, USA*, talking on “Dynamic Pricing to Control Loss Systems with Quality of Service Targets” and “Dynamical Queueing Systems”; January 2009,

Ravi Mazumdar from
*University of Waterloo, Canada*, talking on “Comparison theorems and the validity of heavy traffic limit distributions for stochastic networks”; May 2009,

Vishal Misra from
*Columbia University, USA*, talking on “A Shapley Value approach to Internet Economics”; June 2009,

Andrea Montanari from
*Stanford University, USA*, talking on “Matrix Completion from Fewer Entries”; July 2009,

Siu Wai Ho from
*University of South Australia*, talking on “The Refinement of Two Fundamental Tools in Information Theory”; October 2009.

D. Manjunath from
*IIT Bombay, India*, talking on “Computing Functions Over Random Networks: Two New Formulations”; December 2009.

TREC is a founding member of and participates to Paris-Networking (
http://

M. Lelarge runs the project-team seminar
http://

M. Lelarge runs the reading group on random graphs.

B. Błaszczyszyn is a member of
*Commission détachement, délégation et post-doc "sur subvention", Inria Rocquencourt*.

P. Brémaud is a member of the editorial board of the following journals:
*Journal of Applied Probability, Advances in Applied Probability, Journal of Applied Mathematics and Stochastic Analysis*;

F. Baccelli is a member of the editorial board of the following journals:
*QUESTA, Journal of Discrete Event Dynamical Systems, Mathematical Methods of Operations Research, Advances in Applied Probability*.

Graduate course on point processes, stochastic geometry and random graphs (program “Master de Sciences et Technologies”), by F. Baccelli, B. Błaszczyszyn and L. Massoulié (45h).

Assistant teaching in L1 and L2 courses LI102 (imperative programming and basics of algorithmics), LI105 (from the chipset to the Internet) and LI218 (initiation to task automation) by B. Kauffmann (64h in total).

Course on Information Theory and Coding by M. Lelarge and J. Salez,

Math/Physics projects: Statistical mechanics of mean field disordered systems by M. Lelarge and G. Semerjian,

Undergraduate course (master level, MMFAI) by P. Brémaud and M. Lelarge on Random Structures and Algorithms.

A Short Course on Palm Theory for Point Processes, B. Błaszczyszyn (October 2009, 10H).

A Short Course on Stochastic Geometry and Wireless Networks, F. Baccelli (September–October 2009, 8H).

Participation in the following conferences:

Journées ALÉA, École thématique du CNRS, CIRM (Luminy, March 2009;
http://

Cornell Probability Summer School (Ithaca, NY, July 2009;
http://

Conference on Probabilistic Techniques in Computer Science (Barcelona, Spain, September 2009;
http://

Statistical physics, combinatorics and probability: from discrete to continuous models (Institut Henri Poincaré, Paris, September–December 2009;
http://

Visiting Miller Professor at UC Berkeley from August to December 09;

Scientific adviser of the “Direction Scientifique” of INRIA for communications;

In charge of the “research chapter” of the EIT KIC proposal;

Co-organizer of the SCS (Stochastic Processes in Communication Sciences) Programme of the Newton Institute for Mathematical Sciences to be held in Cambridge during the first semester of 2010.

Guest co-editor of a special issue of JSAC entitled “Stochastic Geometry and Random Graphs for Wireless Networks” .

Member of the program committee of IEEE Infocom'09, ITW'09;

Co-supervision of the thesis of P. Bermolen (ENST).

Keynote lectures:

ValueTools'09, Pisa, Italy, October 09 (Stochastic Geometry for Wireless Networks);

Spaswin'09, Seoul, Korea, June 09 (Opportunistic Routing);

Journées de Probabilités de Poitiers, June 09 (Stochastic Geometry);

BCS Computer Journal Lecture, Imperial College, London, February 09 (Wireless Networks).

Presentations at the following conferences:

Conference Stochastic Networks And Related Topics, Bedlewo, Poland, May 09;

IEEE Infocom'09, Rio de Janeiro, April 09;

IPAM workshop on transport equations, UCLA, April 09 (invited lecture);

Erlang centennial conference, Copenhagen, April 09 (invited lecture).

Presentations at the following seminars:

MIT, LIDS Colloquium, December 09;

UC Berkeley, EECS Colloquium, December 09;

Bell Laboratories, Murray Hill (Maths Center), June 09;

Distinguished Lecture Series, Yonsei University, Seoul, Korea, June 09;

Université de Lille, January 09.

Member of the thesis committee of Pierre Calka (Habilitation, Université Paris 5).

Presentation at the DGA/INRIA Seminar (INRIA Rocquencourt, March 2009;
http://

Presentation at the conference Stochastic Networks And Related Topics (Bedlewo, Poland, May;
http://

Tutorial lecture at the summer school
*ResCom2009*(La Palmyre, 7-12 June 2009;
http://

Presentations at the seminar of Computer Science Department, University of Milan (Milan, Italy, October 2009).

Participation in NET-COOP (EURANDOM, Eindhoven, The Netherlands, November 2009;
http://

Received in 2009 the “Grand Prix France Télécom” of the French Academy of Sciences.

Presentation at the “Cérémonie des prix de l'Académie 2009” co-organized by SMAI, INRIA and l'Académie des sciences de l'Institut de France (IHP, Paris, November 2009;
http://

Presentations at the following seminars:

Seminar of the Department of Networking and Networks Domain, Alcatel-Lucent Bell Labs, Villarceaux, France, June 2009,

INRIA Alcatel-Lucent Common Laboratory Workshop (Internal Seminar), INRIA Paris Rocquencourt, October 2009,

TREC's Seminar, Paris, May 2009.

Poster presentation at ACM SIGMETRICS (Seattle, USA, June 2009;
http://

Talk at EuroNF Traf workshop (Paris, France, December 2009;
http://

Participation in the Erlang Centennial Conference, organized by the queueing systems community and the Technical University of Denmark (Copenhagen, Denmark, April 2009;
http://

Member of the program committee of ACM SIGMETRICS 09 (Seattle, June,
http://

Member of the program committee of Euro-NF conference: NetCoop 2009 (Eindhoven, November
http://

Presentations at the following conferences:

Fifth bi-annual Conference on The Economics of the Software and Internet Industries (Toulouse, January,
http://

IEEE INFOCOM 2009 (Rio de Janeiro, April,
http://

ACM SIGMETRICS 2009 (Seattle, June,
http://

WEIS 2009 (London, June,
http://

Allerton 2009 (Urbana-Champaign, October,
http://

Participation in the following conferences:

IPAM Workshop: Probabilistic Techniques and Applications (Los Angeles, October,
http://

Presentations at the following seminars:

ITA 2009 (San Diego, February,
http://

ALEA 2009 (Luminy, March,
http://

Stochastic Networks And Related Topics (Bedlewo, Poland, May,
http://

INFORMS/APS 2009 (Ithaca, July,
http://

Young European Queueing Theorists, EURANDOM, (Eindhoven, November,
http://

Presentations at the Summer School of Probability, July 2009.

Presentation at the TEMPO seminar at Orange Labs (organized by Thomas Bonald).

Presentations at the following seminars:

Bell Labs France–INRIA workshop (Paris, January 2009),

Bell Labs France–INRIA workshop (Paris, October 2009).

In charge of tutorials for the course “Théorie de l'Information et Codage” at ENS Paris, from January to June 2009.

In charge of tutorials for the course “Graphes et Combinatoire” at Université Paris 6, from January to June 2009.

In charge of the course "Formal Calculus" in
*Classes Préparatoires aux Grandes Écoles (MPSI)*, Lycée Henri IV, Paris, for the year 2009.

Presentation at the ACM-SIAM Symposium on Discrete Algorithms (SODA'09) (New York, January 2009;
http://

Participation in the following conferences:

Journées ALEA 2009 (CIRM, Luminy, March 2009;
http://

Fifth Cornell Probability Summer School (Cornell University, Ithaca, June 2009;
http://

Presentation at the 39th Summer School in Probability (Saint-Flour, July 2009).

Participation in the following conferences:

Trimester program on statistical physics, combinatorics and probability (IHP, Paris, September–December 2009;
http://

Presentations at the following seminars:

Department of IEOR, IIT Bombay (Powai, Mumbai, India; August 2009).

Department of Mathematics, Indian Institute of Science (Bangalore, India; September 2009).