Show simple item record

dc.contributor.advisor: Trevor Darrell
dc.contributor.author: Stiefelhagen, Rainer
dc.contributor.author: Darrell, Trevor
dc.contributor.author: Urtasun, Raquel
dc.contributor.author: Geiger, Andreas
dc.contributor.other: Vision
dc.date.accessioned: 2008-09-29T20:15:10Z
dc.date.available: 2008-09-29T20:15:10Z
dc.date.issued: 2008-09-26
dc.identifier.uri: http://hdl.handle.net/1721.1/42840
dc.description.abstract: Non-linear dimensionality reduction methods are powerful techniques to deal with high-dimensional datasets. However, they are often susceptible to local minima and perform poorly when initialized far from the global optimum, even when the intrinsic dimensionality is known a priori. In this work we introduce a prior over the dimensionality of the latent space, and simultaneously optimize both the latent space and its intrinsic dimensionality. Ad-hoc initialization schemes are unnecessary with our approach; we initialize the latent space to the observation space and automatically infer the latent dimensionality using an optimization scheme that drops dimensions in a continuous fashion. We report results applying our prior to various tasks involving probabilistic non-linear dimensionality reduction, and show that our method can outperform graph-based dimensionality reduction techniques as well as previously suggested ad-hoc initialization strategies.
dc.format.extent: 8 p.
dc.relation.ispartofseries: MIT-CSAIL-TR-2008-056
dc.title: Rank Priors for Continuous Non-Linear Dimensionality Reduction
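The abstract describes initializing the latent space at the observations and letting a rank prior drop latent dimensions continuously during optimization. The sketch below is not the paper's method; it illustrates the general idea with an assumed stand-in prior (a nuclear-norm penalty, optimized by proximal gradient steps that soft-threshold singular values), so dimensions shrink away smoothly rather than being fixed a priori:

```python
import numpy as np

def soft_threshold_rank(Y, lam=0.5, n_iter=100, step=0.1):
    """Illustrative sketch only (not the paper's actual optimizer):
    initialize the latent matrix X at the observations Y, then
    alternate a data-fidelity gradient step with singular-value
    soft-thresholding (the proximal operator of the nuclear norm).
    Singular values shrink continuously toward zero, so latent
    dimensions are dropped smoothly during optimization."""
    X = Y.copy()                                 # latent space = observation space
    for _ in range(n_iter):
        X -= step * (X - Y)                      # gradient of 0.5 * ||X - Y||^2
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        s = np.maximum(s - step * lam, 0.0)      # shrink singular values
        X = (U * s) @ Vt
    return X

# Noisy observations of points on a 2-D subspace embedded in 5-D
rng = np.random.default_rng(0)
Y = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 5))
Y += 0.01 * rng.normal(size=Y.shape)
X = soft_threshold_rank(Y, lam=2.0)
# Count surviving latent dimensions after optimization
print(np.sum(np.linalg.svd(X, compute_uv=False) > 1e-6))
```

Because the latent space starts as a copy of the data, no ad-hoc initialization is needed; the penalty alone decides how many dimensions survive, mirroring the continuous dimension-dropping behavior the abstract describes.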

