Show simple item record

dc.contributor.author: Quattoni, Ariadna
dc.contributor.author: Carreras Perez, Xavier
dc.contributor.author: Collins, Michael
dc.date.accessioned: 2010-10-15T15:03:10Z
dc.date.available: 2010-10-15T15:03:10Z
dc.date.issued: 2009-01
dc.identifier.isbn: 978-1-60558-516-1
dc.identifier.uri: http://hdl.handle.net/1721.1/59367
dc.description.abstract: In recent years the l1,∞ norm has been proposed for joint regularization. In essence, this type of regularization aims at extending the l1 framework for learning sparse models to a setting where the goal is to learn a set of jointly sparse models. In this paper we derive a simple and effective projected gradient method for optimization of l1,∞-regularized problems. The main challenge in developing such a method lies in being able to compute efficient projections onto the l1,∞ ball. We present an algorithm that works in O(n log n) time and O(n) memory, where n is the number of parameters. We test our algorithm on a multi-task image annotation problem. Our results show that l1,∞ leads to better performance than both l2 and l1 regularization and that it is effective in discovering jointly sparse solutions.
dc.description.sponsorship: National Science Foundation (U.S.) (grant no. 0347631)
dc.language.iso: en_US
dc.publisher: Association for Computing Machinery
dc.relation.isversionof: http://dx.doi.org/10.1145/1553374.1553484
dc.rights: Attribution-Noncommercial-Share Alike 3.0 Unported
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/3.0/
dc.source: MIT web domain
dc.subject: algorithms
dc.subject: design
dc.subject: management
dc.subject: performance
dc.subject: theory
dc.title: An efficient projection for l1,∞ regularization
dc.title.alternative: An efficient projection for l1,∞ regularization
dc.type: Article
dc.identifier.citation: Quattoni, Ariadna, Xavier Carreras, Michael Collins, and Trevor Darrell (2009). An efficient projection for l1,∞ regularization. Proceedings of the 26th Annual International Conference on Machine Learning (New York, N.Y.: ACM): 857-864. © 2009 ACM
dc.contributor.department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
dc.contributor.approver: Collins, Michael
dc.contributor.mitauthor: Quattoni, Ariadna
dc.contributor.mitauthor: Carreras Perez, Xavier
dc.contributor.mitauthor: Collins, Michael
dc.relation.journal: Proceedings of the 26th Annual International Conference on Machine Learning
dc.eprint.version: Author's final manuscript
dc.type.uri: http://purl.org/eprint/type/ConferencePaper
eprint.status: http://purl.org/eprint/status/PeerReviewed
dspace.orderedauthors: Quattoni, Ariadna; Carreras, Xavier; Collins, Michael; Darrell, Trevor
mit.license: OPEN_ACCESS_POLICY
mit.metadata.status: Complete
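
The abstract above centers on one computational primitive: the Euclidean projection onto the l1,∞ ball, i.e. onto { W : Σ_j max_i |W_ij| ≤ C }, which drives the projected gradient method. As a hedged illustration only — this is not the paper's exact O(n log n) algorithm, but a simple bisection-based sketch that exploits the Moreau decomposition prox_{θ‖·‖∞}(a) = a − P_{l1-ball(θ)}(a), with the standard sort-based l1-ball projection; all function names here are ours:

```python
import numpy as np

def project_l1_ball(v, z):
    """Euclidean projection of vector v onto the l1 ball of radius z > 0
    (standard sort-and-threshold algorithm)."""
    if np.abs(v).sum() <= z:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]            # magnitudes, descending
    css = np.cumsum(u)
    k = np.arange(1, len(u) + 1)
    rho = np.nonzero(u * k > css - z)[0][-1]
    theta = (css[rho] - z) / (rho + 1.0)    # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def project_l1inf_ball(A, C, iters=60):
    """Projection of matrix A (tasks x features) onto
    { W : sum_j max_i |W_ij| <= C }, by bisection on the dual variable.
    A sketch, not the paper's exact O(n log n) method."""
    if np.abs(A).max(axis=0).sum() <= C:
        return A.copy()                     # already feasible
    lo, hi = 0.0, np.abs(A).sum()           # at theta=hi every column shrinks to 0
    for _ in range(iters):
        theta = 0.5 * (lo + hi)
        # per-column prox of theta*||.||_inf via Moreau decomposition
        W = np.column_stack([a - project_l1_ball(a, theta) for a in A.T])
        if np.abs(W).max(axis=0).sum() > C:
            lo = theta                      # still infeasible: shrink more
        else:
            hi = theta                      # feasible: shrink less
    theta = hi                              # take the feasible side
    return np.column_stack([a - project_l1_ball(a, theta) for a in A.T])
```

A projected gradient iteration in this sketch would then be `W = project_l1inf_ball(W - step * grad, C)`; the paper's contribution is doing the projection itself in a single O(n log n) pass rather than by bisection.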

