Show simple item record

dc.contributor.author	Ayaso, Ola
dc.contributor.author	Shah, Devavrat
dc.contributor.author	Dahleh, Munther A.
dc.date.accessioned	2011-05-11T20:37:32Z
dc.date.available	2011-05-11T20:37:32Z
dc.date.issued	2010-12
dc.date.submitted	2009-10
dc.identifier.issn	0018-9448
dc.identifier.other	INSPEC Accession Number: 11655671
dc.identifier.uri	http://hdl.handle.net/1721.1/62819
dc.description.abstract	A network of nodes communicate via point-to-point memoryless independent noisy channels. Each node has some real-valued initial measurement or message. The goal of each node is to acquire an estimate of a given function of all the initial measurements in the network. As the main contribution of this paper, a lower bound on computation time is derived. This bound must be satisfied by any algorithm used by the nodes to communicate and compute, so that the mean-square error in the nodes’ estimate is within a given interval around zero. The derivation utilizes information theoretic inequalities reminiscent of those used in rate distortion theory, along with a novel “perturbation” technique, so as to be broadly applicable. To understand the tightness of the bound, a specific scenario is considered. Nodes are required to learn a linear combination of the initial values in the network while communicating over erasure channels. A distributed quantized algorithm is developed, and it is shown that the computation time essentially scales as implied by the lower bound. In particular, the computation time depends reciprocally on “conductance”, a property of the network that captures the information-flow bottleneck. As a by-product, this leads to a quantized algorithm, with minimal computation time, for computing separable functions in a network.	en_US
dc.description.sponsorship	National Science Foundation (U.S.). Division of Human and Social Dynamics (HSD project 0729361)	en_US
dc.description.sponsorship	United States. Air Force Office of Scientific Research (Grant FA9550-08-0085)	en_US
dc.description.sponsorship	National Science Foundation (U.S.) (EFRI-ARES Grant 0735956)	en_US
dc.language.iso	en_US
dc.publisher	Institute of Electrical and Electronics Engineers / IEEE Information Theory Society	en_US
dc.relation.isversionof	http://dx.doi.org/10.1109/tit.2010.2080850	en_US
dc.rights	Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.	en_US
dc.source	IEEE	en_US
dc.title	Information Theoretic Bounds for Distributed Computation Over Networks of Point-to-Point Channels	en_US
dc.type	Article	en_US
dc.identifier.citation	Ayaso, O., D. Shah, and M. A. Dahleh. “Information Theoretic Bounds for Distributed Computation Over Networks of Point-to-Point Channels.” IEEE Transactions on Information Theory 56.12 (2010): 6020-6039. Copyright © 2010, IEEE	en_US
dc.contributor.department	Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science	en_US
dc.contributor.department	Massachusetts Institute of Technology. Laboratory for Information and Decision Systems	en_US
dc.contributor.approver	Dahleh, Munther A.
dc.contributor.mitauthor	Ayaso, Ola
dc.contributor.mitauthor	Shah, Devavrat
dc.contributor.mitauthor	Dahleh, Munther A.
dc.relation.journal	IEEE Transactions on Information Theory	en_US
dc.eprint.version	Final published version	en_US
dc.type.uri	http://purl.org/eprint/type/JournalArticle	en_US
eprint.status	http://purl.org/eprint/status/PeerReviewed	en_US
dspace.orderedauthors	Ayaso, Ola; Shah, Devavrat; Dahleh, Munther A.	en
dc.identifier.orcid	https://orcid.org/0000-0002-1470-2148
dc.identifier.orcid	https://orcid.org/0000-0003-0737-3259
mit.license	PUBLISHER_POLICY	en_US
mit.metadata.status	Complete
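
The abstract above states that the computation time depends reciprocally on the network "conductance", the quantity capturing the information-flow bottleneck. As an illustrative aside (not part of the paper or of this record), the following minimal Python sketch computes conductance under one common definition for a doubly-stochastic communication matrix; the function name and the 4-node ring example are assumptions made purely for illustration.

```python
from itertools import combinations

def conductance(P):
    """Brute-force conductance of a stochastic matrix P (list of lists).

    Uses the common definition: Phi = min over cuts S with 0 < |S| <= n/2 of
    (sum of P[i][j] for i in S, j not in S) / |S|.  Exponential in n, so it
    is only suitable for small illustrative networks.
    """
    n = len(P)
    nodes = range(n)
    best = float("inf")
    for size in range(1, n // 2 + 1):
        for S in combinations(nodes, size):
            S_set = set(S)
            # Total weight crossing the cut from S to its complement.
            flow = sum(P[i][j] for i in S_set for j in nodes if j not in S_set)
            best = min(best, flow / size)
    return best

# Hypothetical example: a 4-node ring with uniform neighbor weights.
ring = [
    [0.0, 0.5, 0.0, 0.5],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.5, 0.0, 0.5, 0.0],
]
phi = conductance(ring)
print(f"conductance = {phi:.3f}")       # 0.500 for this ring
print(f"1/conductance = {1/phi:.3f}")   # the bottleneck term the bound scales with
```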

