DSpace@MIT


Alternating conditional expectation (ACE) applied to classification and recommendation problems

Author(s)
Kozynski Waserman, Fabián Ariel
Full printable version (1.827 MB)
Other Contributors
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science.
Advisor
Lizhong Zheng.
Terms of use
MIT theses are protected by copyright. They may be viewed, downloaded, or printed from this source but further reproduction or distribution in any format is prohibited without written permission. http://dspace.mit.edu/handle/1721.1/7582
Abstract
In this thesis, a geometric framework for describing the relevant information in a collection of data is applied to the general problem of selecting informative features (dimension reduction) from high-dimensional data. The framework can be used in an unsupervised manner, extracting universal features that can later be used for general classification of the data. It is derived by applying local approximations on the space of probability distributions together with a small-perturbation approach. With this approach, different information-theoretic results can be interpreted as linear-algebra optimizations based on the norms of vectors in a linear space, which are, in general, easier to carry out. Fundamentally, using known procedures such as the Singular Value Decomposition (SVD) and Principal Component Analysis (PCA), dimension reduction for maximizing power can be achieved in a straightforward manner. Using the geometric framework, we relate the calculation of the SVD of a particular matrix associated with a probabilistic channel to the application of Alternating Conditional Expectation (ACE) in the problem of optimal regression. The key takeaway of this method is that such problems can be studied in the space of distributions of the data rather than the space of outcomes. This geometric framework gives an operational meaning to information metrics in the context of data analysis and feature selection. Additionally, it provides a method to obtain universal classification functions without knowledge of the important features of the problem. The framework is then applied to the problem of data classification and analysis, with satisfactory results.
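The connection the abstract draws between ACE and the SVD of a matrix derived from a probabilistic channel can be illustrated with a short sketch (illustrative only, not the thesis's own code; the function name and data are assumptions). For discrete X and Y with joint pmf P(x, y), ACE alternates f(x) ← E[g(Y) | X = x] and g(y) ← E[f(X) | Y = y], normalizing each score function to zero mean and unit variance under its marginal:

```python
import numpy as np

def ace(pxy, iters=200):
    """Alternating Conditional Expectation for a discrete joint pmf.

    pxy: 2-D array with pxy[x, y] = P(X = x, Y = y).
    Returns score functions f (over x) and g (over y), each zero-mean
    and unit-variance under the corresponding marginal.
    """
    px = pxy.sum(axis=1)                      # marginal P(X)
    py = pxy.sum(axis=0)                      # marginal P(Y)
    g = np.random.default_rng(0).standard_normal(pxy.shape[1])
    for _ in range(iters):
        f = (pxy @ g) / px                    # f(x) = E[g(Y) | X = x]
        f -= f @ px                           # remove mean under P(X)
        f /= np.sqrt((f ** 2) @ px)           # unit variance under P(X)
        g = (pxy.T @ f) / py                  # g(y) = E[f(X) | Y = y]
        g -= g @ py                           # remove mean under P(Y)
        g /= np.sqrt((g ** 2) @ py)           # unit variance under P(Y)
    return f, g
```

At the fixed point, the correlation E[f(X)g(Y)] equals the second-largest singular value of the matrix B with entries P(x, y) / sqrt(P(x)P(y)) — the "particular matrix related to a probabilistic channel" whose SVD the abstract ties to ACE.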
Description
Thesis: Elec. E., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2018.
 
Cataloged from PDF version of thesis.
 
Includes bibliographical references (page 37).
 
Date issued
2018
URI
http://hdl.handle.net/1721.1/118078
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology
Keywords
Electrical Engineering and Computer Science.

Collections
  • Undergraduate Theses
