Show simple item record

dc.contributor.author   Medard, Muriel
dc.contributor.author   Christiansen, Mark M.
dc.contributor.author   Duffy, Ken R.
dc.contributor.author   Tessaro, Stefano
dc.contributor.author   Calmon, Flavio du Pin
dc.contributor.author   Varia, Mayank H.
dc.date.accessioned   2014-09-29T16:26:42Z
dc.date.available   2014-09-29T16:26:42Z
dc.date.issued   2013-10
dc.identifier.isbn   978-1-4799-3410-2
dc.identifier.isbn   978-1-4799-3409-6
dc.identifier.uri   http://hdl.handle.net/1721.1/90435
dc.description.abstract   Lower bounds for the average probability of error of estimating a hidden variable X given an observation of a correlated random variable Y, and Fano's inequality in particular, play a central role in information theory. In this paper, we present a lower bound for the average estimation error based on the marginal distribution of X and the principal inertias of the joint distribution matrix of X and Y. Furthermore, we discuss an information measure based on the sum of the largest principal inertias, called k-correlation, which generalizes maximal correlation. We show that k-correlation satisfies the Data Processing Inequality and is convex in the conditional distribution of Y given X. Finally, we investigate how to answer a fundamental question in inference and privacy: given an observation Y, can we estimate a function f(X) of the hidden random variable X with an average error below a certain threshold? We provide a general method for answering this question using an approach based on rate-distortion theory.   en_US
dc.description.sponsorship   United States. Intelligence Advanced Research Projects Activity (Air Force Contract FA8721-05-C-0002)   en_US
dc.language.iso   en_US
dc.publisher   Institute of Electrical and Electronics Engineers (IEEE)   en_US
dc.relation.isversionof   http://dx.doi.org/10.1109/Allerton.2013.6736575   en_US
dc.rights   Creative Commons Attribution-Noncommercial-Share Alike   en_US
dc.rights.uri   http://creativecommons.org/licenses/by-nc-sa/4.0/   en_US
dc.source   arXiv   en_US
dc.title   Bounds on inference   en_US
dc.type   Article   en_US
dc.identifier.citation   Calmon, Flavio P., Mayank Varia, Muriel Medard, Mark M. Christiansen, Ken R. Duffy, and Stefano Tessaro. “Bounds on Inference.” 2013 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton) (October 2013).   en_US
dc.contributor.department   Lincoln Laboratory   en_US
dc.contributor.department   Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory   en_US
dc.contributor.department   Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science   en_US
dc.contributor.department   Massachusetts Institute of Technology. Research Laboratory of Electronics   en_US
dc.contributor.mitauthor   Calmon, Flavio du Pin   en_US
dc.contributor.mitauthor   Medard, Muriel   en_US
dc.contributor.mitauthor   Varia, Mayank H.   en_US
dc.contributor.mitauthor   Tessaro, Stefano   en_US
dc.relation.journal   Proceedings of the 2013 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton)   en_US
dc.eprint.version   Author's final manuscript   en_US
dc.type.uri   http://purl.org/eprint/type/ConferencePaper   en_US
eprint.status   http://purl.org/eprint/status/NonPeerReviewed   en_US
dspace.orderedauthors   Calmon, Flavio P.; Varia, Mayank; Medard, Muriel; Christiansen, Mark M.; Duffy, Ken R.; Tessaro, Stefano   en_US
dc.identifier.orcid   https://orcid.org/0000-0003-2912-7972
dc.identifier.orcid   https://orcid.org/0000-0003-4059-407X
mit.license   OPEN_ACCESS_POLICY   en_US
mit.metadata.status   Complete
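
The abstract above describes k-correlation as the sum of the largest principal inertias of the joint distribution of X and Y, generalizing maximal correlation. The following is a minimal illustrative sketch of that quantity, assuming the standard correspondence-analysis definition of principal inertias (the squared singular values of the centered, marginal-normalized joint pmf matrix); the function names and the example channel are hypothetical and are not taken from the paper.

import numpy as np

def principal_inertias(P_xy):
    # Principal inertias of a joint pmf matrix P_xy of shape (|X|, |Y|).
    # Assumed here to be the squared singular values of
    #   D_X^{-1/2} (P_xy - p_x p_y^T) D_Y^{-1/2},
    # i.e. the correspondence-analysis convention.
    p_x = P_xy.sum(axis=1)
    p_y = P_xy.sum(axis=0)
    Q = np.diag(p_x ** -0.5) @ (P_xy - np.outer(p_x, p_y)) @ np.diag(p_y ** -0.5)
    # Singular values come back in decreasing order, so the squares do too.
    return np.linalg.svd(Q, compute_uv=False) ** 2

def k_correlation(P_xy, k):
    # Sum of the k largest principal inertias; for k = 1 this is the squared
    # maximal (Hirschfeld-Gebelein-Renyi) correlation of X and Y.
    return float(principal_inertias(P_xy)[:k].sum())

# Illustrative example: binary symmetric channel with crossover 0.2 and uniform input.
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])
print(principal_inertias(P))   # approx [0.36, 0.0]
print(k_correlation(P, 1))     # 0.36 = (1 - 2*0.2)**2

Under this convention the principal inertias lie between 0 and 1, so the sketch is only meant to make the definition in the abstract concrete; the paper's precise bounds and the rate-distortion argument are not reproduced here.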

