## Network functional compression

##### Author(s)

Feizi, Soheil

##### Other Contributors

Massachusetts Institute of Technology. Dept. of Electrical Engineering and Computer Science.

##### Advisor

Muriel Médard.

##### Abstract

In this thesis, we consider different aspects of the functional compression problem, in which the computation of a function (or several functions) of the sources is desired at the receiver(s). The rate region of this problem has been considered in the literature under certain restrictive assumptions. In Chapter 2 of this thesis, we consider this problem for an arbitrary tree network and asymptotically lossless computation. In particular, for one-stage tree networks we compute a rate region, and for an arbitrary tree network we derive a rate lower bound based on graph entropy. We introduce a new condition on colorings of source random variables' characteristic graphs, called the coloring connectivity condition (C.C.C.), and show that, unlike the condition mentioned in Doshi et al., this condition is necessary and sufficient for any achievable coding scheme based on colorings. We also show that, unlike entropy, graph entropy does not satisfy the chain rule. For one-stage trees with correlated sources, and for general trees with independent sources, we propose a modularized coding scheme based on graph colorings that performs arbitrarily closely to the derived rate lower bound. We show that, in a general tree network with independent sources, intermediate nodes must perform some computation to achieve the rate lower bound. However, for a family of functions and random variables called chain-rule proper sets, it suffices for intermediate nodes to act as relays to perform arbitrarily closely to the rate lower bound. In Chapter 3 of this thesis, we consider a multi-functional version of this problem with side information, where the receiver wants to compute several functions, with different side information random variables and zero distortion. Our results also apply to the case of several receivers computing different desired functions.
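The coloring-based schemes mentioned above can be illustrated with a toy sketch (the function, alphabets, and distribution here are hypothetical, chosen only for illustration and not taken from the thesis): build the characteristic graph of one source, color it, and check that a color together with the side information determines the function value.

```python
from itertools import product

# Hypothetical toy setup: sources X1, X2 each take values in {0, 1, 2},
# every pair has positive joint probability, X2 is side information at the
# receiver, and the receiver wants f(x1, x2) = (x1 + x2) % 2.
vals = [0, 1, 2]
f = lambda x1, x2: (x1 + x2) % 2

# Characteristic graph of X1: connect x1 and x1' when some x2 in the joint
# support makes f disagree -- those two values must stay distinguishable.
edges = set()
for x1, x1p in product(vals, vals):
    if x1 < x1p and any(f(x1, x2) != f(x1p, x2) for x2 in vals):
        edges.add((x1, x1p))

# Greedy proper coloring: the encoder may transmit colors instead of source
# values, which is the source of the compression gain.
color = {}
for v in vals:
    used = {color[u] for u in color
            if (min(u, v), max(u, v)) in edges}
    color[v] = next(c for c in range(len(vals)) if c not in used)

# Sanity check: (color of x1, x2) determines f(x1, x2).
for x2 in vals:
    seen = {}
    for x1 in vals:
        key = (color[x1], x2)
        assert seen.setdefault(key, f(x1, x2)) == f(x1, x2)

print(edges, color)
```

Here values 0 and 2 never need to be distinguished (f agrees on them for every x2), so they share a color: two colors suffice where three source values would otherwise be sent.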
We define a new concept named multi-functional graph entropy, an extension of the graph entropy defined by Körner. We show that the minimum achievable rate for this problem equals the conditional multi-functional graph entropy of the source random variable given the side information, and we propose a coding scheme based on graph colorings to achieve this rate. These proposed coding schemes require computing the minimum entropy coloring (a coloring random variable that minimizes the entropy) of a characteristic graph. In general, finding this coloring is an NP-hard problem. However, in Chapter 4, we show that, depending on the characteristic graph's structure, there are some interesting cases where finding the minimum entropy coloring is not NP-hard but tractable and practical. In one of these cases, we show that, under a non-zero joint probability condition on the random variables' distributions, the minimum entropy coloring can be found in polynomial time for any desired function. In another case, we show that if the desired function is a quantization function, this problem is also tractable. We also consider this problem in the general case: using Huffman or Lempel-Ziv coding notions, we show that finding the minimum entropy coloring is heuristically equivalent to finding the maximum independent set of a graph. While the minimum entropy coloring problem has been studied only recently, there are heuristic algorithms for approximately solving the maximum independent set problem. Next, in Chapter 5, we consider the effect of feedback on the rate region of the functional compression problem. If the function at the receiver is the identity function, this problem reduces to Slepian-Wolf compression with feedback, for which feedback provides no rate benefit. However, this is not the case for a general function at the receiver.
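The heuristic connection between minimum entropy coloring and maximum independent sets can be sketched as follows (a minimal toy implementation under assumed inputs, not the thesis's algorithm): repeatedly peel off a maximum independent set of the characteristic graph, assign it one color, and remove it. Merging a large independent set under a single color concentrates probability mass on that color, which tends to lower the entropy of the coloring random variable.

```python
from itertools import combinations
from math import log2

def max_independent_set(verts, edges):
    """Brute-force maximum independent set (fine for toy graphs only)."""
    for r in range(len(verts), 0, -1):
        for cand in combinations(sorted(verts), r):
            s = set(cand)
            if all(not (u in s and v in s) for u, v in edges):
                return s
    return set()

def min_entropy_coloring(verts, edges, p):
    """Greedy heuristic: color by repeatedly removing a max independent set.

    p maps each vertex (source value) to its probability; returns the
    coloring and the entropy of the induced color random variable.
    """
    remaining, coloring, c = set(verts), {}, 0
    while remaining:
        live = [(u, v) for u, v in edges if u in remaining and v in remaining]
        for v in max_independent_set(remaining, live):
            coloring[v] = c
        remaining -= set(coloring) & remaining if False else \
            {v for v in remaining if v in coloring}
        c += 1
    mass = {}
    for v, pv in p.items():
        mass[coloring[v]] = mass.get(coloring[v], 0.0) + pv
    H = -sum(q * log2(q) for q in mass.values() if q > 0)
    return coloring, H

# Example: a 3-vertex path graph where the two endpoints never need to be
# distinguished; merging them under one color minimizes the color entropy.
coloring, H = min_entropy_coloring([0, 1, 2], [(0, 1), (1, 2)],
                                   {0: 0.45, 1: 0.1, 2: 0.45})
print(coloring, H)
```

As the abstract notes, this reduction is heuristic: each peeling step solves a maximum independent set instance, itself NP-hard in general but well served by approximation heuristics on practical graphs.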
With feedback, one can outperform the rate bounds of the case without feedback. We finally consider the problem of distributed functional compression with distortion, where the objective is to compress correlated discrete sources so that an arbitrary deterministic function of those sources can be computed up to a distortion level at the receiver. For this case, we compute a rate-distortion region and then propose a simple coding scheme with a non-trivial performance guarantee.

##### Description

Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2010. Includes bibliographical references (p. 97-99).

##### Date issued

2010

##### Department

Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science

##### Publisher

Massachusetts Institute of Technology

##### Keywords

Electrical Engineering and Computer Science.