Section: Scientific Foundations
Distributed source coding
Distributed source coding (DSC) has emerged as an enabling technology for sensor networks.
It refers to the compression of correlated signals captured by different sensors that do not communicate with one another. The captured signals are compressed independently and transmitted to a central base station, which decodes them jointly. DSC finds its foundation in the seminal Slepian-Wolf (SW) and Wyner-Ziv (WZ) theorems.
Let us consider two correlated binary sources X and Y.
If the two coders communicate, it is well known from Shannon's theory that the minimum lossless rate for X and Y is given by the joint entropy H(X, Y). In 1973, Slepian and Wolf established that this lossless compression rate bound can be approached with a vanishing error probability for long sequences, even if the two sources are coded separately, provided that they are decoded jointly and that their correlation is known to both the encoder and the decoder. The achievable rate region is thus defined by RX ≥ H(X|Y), RY ≥ H(Y|X), and RX + RY ≥ H(X, Y), where H(X|Y) and H(Y|X) denote the conditional entropies of the two sources.
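To make the rate region concrete, the following minimal sketch evaluates its corner points under a binary symmetric correlation model, Y = X XOR Z with Z ~ Bernoulli(p); this model and the value of p are illustrative assumptions, not taken from the text above. Under this model, H(X|Y) = H(Y|X) = h(p) and H(X, Y) = 1 + h(p), where h is the binary entropy function.

```python
import numpy as np

def binary_entropy(p):
    """Binary entropy h(p) in bits, with h(0) = h(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# Illustrative correlation model (an assumption, not from the text):
# X ~ Bernoulli(1/2), Y = X XOR Z, Z ~ Bernoulli(p) independent of X.
p = 0.1  # crossover probability of the "virtual channel" between X and Y

H_X_given_Y = binary_entropy(p)  # H(X|Y) = h(p) under this model
H_Y_given_X = binary_entropy(p)  # by symmetry, H(Y|X) = h(p)
H_XY = 1.0 + H_Y_given_X         # H(X, Y) = H(X) + H(Y|X) = 1 + h(p)

# Slepian-Wolf region: RX >= H(X|Y), RY >= H(Y|X), RX + RY >= H(X, Y).
# Corner point: Y is coded losslessly at 1 bit, X at its conditional entropy.
print(f"H(X|Y) = {H_X_given_Y:.3f} bits, H(X, Y) = {H_XY:.3f} bits")
print(f"Corner point: (RX, RY) = ({H_X_given_Y:.3f}, 1.000)")
print(f"Separate decoding would need RX + RY = 2 bits, vs {H_XY:.3f} with SW")
```

For p = 0.1, the sum rate drops from 2 bits to about 1.47 bits per source pair, which illustrates the compression gain that joint decoding extracts from the correlation.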
In 1976, Wyner and Ziv considered the lossy coding of two correlated sources X and Y with respect to a fidelity criterion. They established the rate-distortion function R*X|Y(D) for the case where the side information (SI) Y is known to the decoder only. For a given target distortion D, R*X|Y(D) in general satisfies RX|Y(D) ≤ R*X|Y(D) ≤ RX(D), where RX|Y(D) is the rate required to encode X if Y is available to both the encoder and the decoder, and RX(D) is the minimum rate for encoding X without SI. Wyner and Ziv have shown that, for jointly Gaussian sources and a mean square error distortion measure, there is no rate loss with respect to joint coding and joint decoding of the two sources, i.e., R*X|Y(D) = RX|Y(D).
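For this jointly Gaussian case, R*X|Y(D) has the closed form max(0, (1/2) log2(var(X|Y)/D)), with var(X|Y) = var(X)(1 - rho^2) the conditional variance of X given Y. The sketch below evaluates it and compares it to the rate RX(D) without SI; the function name and the numerical values of var(X) and rho are illustrative assumptions.

```python
import numpy as np

def wyner_ziv_rate_gaussian(D, var_x, rho):
    """Wyner-Ziv rate R*X|Y(D) in bits per sample for jointly Gaussian (X, Y)
    with correlation rho, under MSE distortion. With Gaussian statistics it
    coincides with the conditional rate-distortion function RX|Y(D): no rate loss."""
    var_x_given_y = var_x * (1.0 - rho ** 2)  # conditional variance of X given Y
    if D >= var_x_given_y:
        return 0.0  # the side information alone already meets the distortion target
    return 0.5 * np.log2(var_x_given_y / D)

# Illustrative parameters (assumptions, not from the text):
var_x, rho = 1.0, 0.9
for D in (0.01, 0.05, 0.19):
    r_wz = wyner_ziv_rate_gaussian(D, var_x, rho)
    r_no_si = 0.5 * np.log2(var_x / D)  # RX(D) when no side information is used
    print(f"D = {D:.2f}: R*X|Y(D) = {r_wz:.3f} bits, RX(D) = {r_no_si:.3f} bits")
```

The gap between the two printed rates, (1/2) log2(1/(1 - rho^2)), quantifies how much rate the decoder-side correlation saves; it grows without bound as rho approaches 1.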