Show simple item record

dc.contributor.advisor	Lizhong Zheng	en_US
dc.contributor.author	Kozynski Waserman, Fabián Ariel	en_US
dc.contributor.other	Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science.	en_US
dc.date.accessioned	2018-09-17T15:56:32Z
dc.date.available	2018-09-17T15:56:32Z
dc.date.copyright	2018	en_US
dc.date.issued	2018	en_US
dc.identifier.uri	http://hdl.handle.net/1721.1/118078
dc.description	Thesis: Elec. E., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2018.	en_US
dc.description	Cataloged from PDF version of thesis.	en_US
dc.description	Includes bibliographical references (page 37).	en_US
dc.description.abstract	In this thesis, a geometric framework for describing the relevant information in a collection of data is applied to the general problem of selecting informative features (dimension reduction) from high-dimensional data. The framework can be used in an unsupervised manner, extracting universal features that can later be used for general classification of the data. It is derived by applying local approximations on the space of probability distributions together with a small-perturbation approach. With this approach, various information-theoretic results can be interpreted as linear-algebra optimizations based on the norms of vectors in a linear space, which are in general easier to carry out. In particular, using known procedures such as Singular Value Decomposition (SVD) and Principal Component Analysis (PCA), dimension reduction for maximizing power can be achieved in a straightforward manner. Using the geometric framework, we relate the computation of the SVD of a particular matrix associated with a probabilistic channel to the application of the Alternating Conditional Expectation (ACE) algorithm in the problem of optimal regression. The key takeaway of this method is that such problems can be studied in the space of distributions of the data rather than in the space of outcomes. The geometric framework gives an operational meaning to information metrics in the context of data analysis and feature selection. Additionally, it provides a method to obtain universal classification functions without prior knowledge of the important features of the problem. The framework is then applied to the problem of data classification and analysis, with satisfactory results.	en_US
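The abstract's central connection — that the SVD of a matrix derived from a joint distribution can be computed by alternating conditional expectations — can be illustrated with a small sketch. This is not the thesis's own code; it is a minimal illustration of the standard ACE power iteration on a discrete joint distribution, with illustrative names (`pxy`, `f`, `g`), checked against the second singular value of the matrix B[x, y] = P(x, y) / sqrt(P(x) P(y)).

```python
import numpy as np

def ace(pxy, iters=200, seed=0):
    """Hedged sketch of ACE on a discrete joint distribution pxy.

    Alternates f(x) <- E[g(Y) | X = x] and g(y) <- E[f(X) | Y = y],
    centering and normalizing each step, to find the pair of zero-mean,
    unit-variance features with maximal correlation.
    """
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1)   # marginal of X
    py = pxy.sum(axis=0)   # marginal of Y
    rng = np.random.default_rng(seed)
    f = rng.standard_normal(len(px))
    for _ in range(iters):
        # g(y) = E[f(X) | Y = y], then center and normalize under P(Y)
        g = (pxy * f[:, None]).sum(axis=0) / py
        g -= py @ g
        g /= np.sqrt(py @ g**2)
        # f(x) = E[g(Y) | X = x], then center and normalize under P(X)
        f = (pxy * g[None, :]).sum(axis=1) / px
        f -= px @ f
        f /= np.sqrt(px @ f**2)
    corr = (pxy * np.outer(f, g)).sum()  # E[f(X) g(Y)] at the fixed point
    return f, g, corr

# The correlation found by ACE matches the second singular value of
# B[x, y] = P(x, y) / sqrt(P(x) P(y)); the top singular value is always 1,
# achieved by the constant feature, which the centering step removes.
pxy = np.array([[0.3, 0.1], [0.1, 0.5]])
f, g, corr = ace(pxy)
px, py = pxy.sum(axis=1), pxy.sum(axis=0)
B = pxy / np.sqrt(np.outer(px, py))
sigma = np.linalg.svd(B, compute_uv=False)
```

At convergence, `corr` agrees with `sigma[1]`, which is the operational link between ACE and the SVD of the channel-dependent matrix that the abstract describes.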
dc.description.statementofresponsibility	by Fabián Ariel Kozynski Waserman.	en_US
dc.format.extent	37 pages	en_US
dc.language.iso	eng	en_US
dc.publisher	Massachusetts Institute of Technology	en_US
dc.rights	MIT theses are protected by copyright. They may be viewed, downloaded, or printed from this source but further reproduction or distribution in any format is prohibited without written permission.	en_US
dc.rights.uri	http://dspace.mit.edu/handle/1721.1/7582	en_US
dc.subject	Electrical Engineering and Computer Science.	en_US
dc.title	Alternating conditional expectation (ACE) applied to classification and recommendation problems	en_US
dc.type	Thesis	en_US
dc.description.degree	Elec. E.	en_US
dc.contributor.department	Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
dc.identifier.oclc	1051773088	en_US

