DSpace@MIT: Research and Teaching Output of the MIT Community

Maximum Entropy Discrimination

Simple item record

dc.contributor.author Jaakkola, Tommi en_US
dc.contributor.author Meila, Marina en_US
dc.contributor.author Jebara, Tony en_US
dc.date.accessioned 2004-10-20T20:29:28Z
dc.date.available 2004-10-20T20:29:28Z
dc.date.issued 1999-12-01 en_US
dc.identifier.other AITR-1668 en_US
dc.identifier.uri http://hdl.handle.net/1721.1/7089
dc.description.abstract We present a general framework for discriminative estimation based on the maximum entropy principle and its extensions. All calculations involve distributions over structures and/or parameters rather than specific settings and reduce to relative entropy projections. This holds even when the data is not separable within the chosen parametric class, in the context of anomaly detection rather than classification, or when the labels in the training set are uncertain or incomplete. Support vector machines are naturally subsumed under this class and we provide several extensions. We are also able to estimate exactly and efficiently discriminative distributions over tree structures of class-conditional models within this framework. Preliminary experimental results are indicative of the potential in these techniques. en_US
dc.format.extent 6420262 bytes
dc.format.extent 1702298 bytes
dc.format.mimetype application/postscript
dc.format.mimetype application/pdf
dc.language.iso en_US
dc.relation.ispartofseries AITR-1668 en_US
dc.title Maximum Entropy Discrimination en_US
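
For context, the abstract above can be read as a relative entropy projection problem. The following is a minimal sketch in assumed notation (training pairs (X_t, y_t), a discriminant function L(X | Θ), margin variables γ_t, and a prior P_0 over parameters and margins); it is not copied from the report itself.

```latex
% Minimal maximum entropy discrimination (MED) sketch.
% Notation here is assumed, not taken from AITR-1668:
%   P_0 : prior over parameters \Theta and margins \gamma
%   y_t \in \{-1,+1\} : labels;  L(X \mid \Theta) : discriminant function
\begin{aligned}
  \min_{P(\Theta,\gamma)} \;\;
    & D\!\left(P \,\Vert\, P_0\right)
      = \int P(\Theta,\gamma)\,
        \log \frac{P(\Theta,\gamma)}{P_0(\Theta,\gamma)}
        \, d\Theta \, d\gamma \\[4pt]
  \text{subject to} \;\;
    & \int P(\Theta,\gamma)
        \left[\, y_t\, L(X_t \mid \Theta) - \gamma_t \,\right]
        d\Theta \, d\gamma \;\ge\; 0,
      \qquad t = 1,\dots,T, \\[4pt]
  \text{solution form} \;\;
    & P(\Theta,\gamma)
      = \frac{1}{Z(\lambda)}\, P_0(\Theta,\gamma)\,
        \exp\!\Bigl( \textstyle\sum_t \lambda_t
          \bigl[\, y_t\, L(X_t \mid \Theta) - \gamma_t \,\bigr] \Bigr),
      \qquad \lambda_t \ge 0 .
\end{aligned}
```

Under these assumptions, the nonnegative multipliers λ_t are set by maximizing the concave dual objective, and predictions average the discriminant under P, i.e. sign of ∫ P(Θ) L(X | Θ) dΘ. With a linear discriminant and a Gaussian prior this reduces to an SVM-like quadratic program, which is one reading of the abstract's remark that support vector machines are subsumed under this class.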


Files in this item

Name Size Format Description
AITR-1668.ps 6.122 MB PostScript
AITR-1668.pdf 1.623 MB PDF
