| dc.contributor.author | Carrara, Nicholas | |
| dc.contributor.author | Vanslette, Kevin | |
| dc.date.accessioned | 2020-06-03T18:51:11Z | |
| dc.date.available | 2020-06-03T18:51:11Z | |
| dc.date.issued | 2020-03 | |
| dc.date.submitted | 2020-02 | |
| dc.identifier.issn | 1099-4300 | |
| dc.identifier.uri | https://hdl.handle.net/1721.1/125652 | |
| dc.description.abstract | Using first principles from inference, we design a set of functionals for the purposes of ranking joint probability distributions with respect to their correlations. Starting with a general functional, we impose its desired behavior through the Principle of Constant Correlations (PCC), which constrains the correlation functional to behave in a consistent way under statistically independent inferential transformations. The PCC guides us in choosing the appropriate design criteria for constructing the desired functionals. Since the derivations depend on a choice of partitioning the variable space into n disjoint subspaces, the general functional we design is the n-partite information (NPI), of which the total correlation and mutual information are special cases. Thus, these functionals are found to be uniquely capable of determining whether a certain class of inferential transformations, ρ →∗ ρ′, preserve, destroy or create correlations. This provides conceptual clarity by ruling out other possible global correlation quantifiers. Finally, the derivation and results allow us to quantify non-binary notions of statistical sufficiency. Our results express what percentage of the correlations are preserved under a given inferential transformation or variable mapping. Keywords: n-partite information; total correlation; mutual information; entropy; probability theory; correlation | en_US |
| dc.publisher | MDPI | en_US |
| dc.relation.isversionof | 10.3390/e22030357 | en_US |
| dc.rights | Creative Commons Attribution 4.0 International license | en_US |
| dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | en_US |
| dc.source | MDPI | en_US |
| dc.title | The design of mutual information as a global correlation quantifier | en_US |
| dc.type | Article | en_US |
| dc.identifier.citation | Carrara, Nicholas, and Kevin Vanslette. "The design of mutual information as a global correlation quantifier." Entropy 22, no. 3 (Mar. 2020): 357. doi: 10.3390/e22030357. © 2020 Author(s) | en_US |
| dc.contributor.department | Massachusetts Institute of Technology. Department of Mechanical Engineering | en_US |
| dc.relation.journal | Entropy | en_US |
| dc.eprint.version | Final published version | en_US |
| dc.type.uri | http://purl.org/eprint/type/JournalArticle | en_US |
| eprint.status | http://purl.org/eprint/status/PeerReviewed | en_US |
| dspace.date.submission | 2020-05-19T12:42:21Z | |
| mit.journal.volume | 22 | en_US |
| mit.journal.issue | 3 | en_US |
| mit.license | PUBLISHER_CC | |
| mit.metadata.status | Complete | |