Show simple item record

dc.contributor.author         Srebro, Nathan
dc.date.accessioned           2005-12-22T02:16:24Z
dc.date.available             2005-12-22T02:16:24Z
dc.date.issued                2004-11-22
dc.identifier.other           MIT-CSAIL-TR-2004-076
dc.identifier.other           AITR-2004-009
dc.identifier.uri             http://hdl.handle.net/1721.1/30507
dc.description.abstract       Matrices that can be factored into a product of two simpler matrices can serve as a useful and often natural model in the analysis of tabulated or high-dimensional data. Models based on matrix factorization (Factor Analysis, PCA) have been extensively used in statistical analysis and machine learning for over a century, with many new formulations and models suggested in recent years (Latent Semantic Indexing, Aspect Models, Probabilistic PCA, Exponential PCA, Non-Negative Matrix Factorization and others). In this thesis we address several issues related to learning with matrix factorizations: we study the asymptotic behavior and generalization ability of existing methods, suggest new optimization methods, and present a novel maximum-margin high-dimensional matrix factorization formulation.
dc.format.extent              132 p.
dc.format.extent              96239481 bytes
dc.format.extent              5561927 bytes
dc.format.mimetype            application/postscript
dc.format.mimetype            application/pdf
dc.language.iso               en_US
dc.relation.ispartofseries    Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory
dc.subject                    AI
dc.title                      Learning with Matrix Factorizations
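
The abstract's central idea of approximating a data matrix by a product of two simpler matrices can be illustrated with a short sketch. The code below is an illustration only, not code from the thesis: it uses NumPy's truncated SVD to obtain a rank-k factorization X ~= U V^T, with the random data matrix, the rank k = 5, and all variable names chosen here purely for the example.

import numpy as np

# Illustrative sketch (assumed example, not from the thesis): approximate a
# data matrix X by a product of two simpler rank-k factors, X ~= U @ V.T.
# The truncated SVD gives the best rank-k approximation in Frobenius norm.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 40))   # stand-in for tabulated / high-dimensional data
k = 5                                # number of latent factors (chosen arbitrarily)

U_full, s, Vt = np.linalg.svd(X, full_matrices=False)
U = U_full[:, :k] * s[:k]            # 100 x k factor (row representations)
V = Vt[:k, :].T                      # 40 x k factor (column representations)

X_hat = U @ V.T                      # low-rank reconstruction of X
print("rank-%d Frobenius error: %.3f" % (k, np.linalg.norm(X - X_hat, "fro")))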

