| dc.contributor.author | Quattoni, Ariadna | |
| dc.contributor.author | Carreras Perez, Xavier | |
| dc.contributor.author | Collins, Michael | |
| dc.date.accessioned | 2010-10-15T15:03:10Z | |
| dc.date.available | 2010-10-15T15:03:10Z | |
| dc.date.issued | 2009-01 | |
| dc.identifier.isbn | 978-1-60558-516-1 | |
| dc.identifier.uri | http://hdl.handle.net/1721.1/59367 | |
| dc.description.abstract | In recent years the l1,∞ norm has been proposed for joint regularization. In essence, this type of regularization aims at extending the l1 framework for learning sparse models to a setting where the goal is to learn a set of jointly sparse models. In this paper we derive a simple and effective projected gradient method for optimization of l1,∞ regularized problems. The main challenge in developing such a method resides in being able to compute efficient projections to the l1,∞ ball. We present an algorithm that works in O(n log n) time and O(n) memory where n is the number of parameters. We test our algorithm in a multi-task image annotation problem. Our results show that l1,∞ leads to better performance than both l2 and l1 regularization and that it is effective in discovering jointly sparse solutions. | en_US |
| dc.description.sponsorship | National Science Foundation (U.S.) (grant no. 0347631) | en_US |
| dc.language.iso | en_US | |
| dc.publisher | Association for Computing Machinery | en_US |
| dc.relation.isversionof | http://dx.doi.org/10.1145/1553374.1553484 | en_US |
| dc.rights | Attribution-Noncommercial-Share Alike 3.0 Unported | en_US |
| dc.rights.uri | http://creativecommons.org/licenses/by-nc-sa/3.0/ | en_US |
| dc.source | MIT web domain | en_US |
| dc.subject | algorithms | en_US |
| dc.subject | design | en_US |
| dc.subject | management | en_US |
| dc.subject | performance | en_US |
| dc.subject | theory | en_US |
| dc.title | An efficient projection for l1,∞ regularization | en_US |
| dc.title.alternative | An efficient projection for l [subscript 1],[subscript infinity] regularization | en_US |
| dc.type | Article | en_US |
| dc.identifier.citation | Quattoni, Ariadna, Xavier Carreras, Michael Collins, and Trevor Darrell (2009). An efficient projection for l1,∞ regularization. Proceedings of the 26th Annual International Conference on Machine Learning (New York, N.Y.: ACM): 857-864. © 2009 ACM | en_US |
| dc.contributor.department | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory | en_US |
| dc.contributor.department | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science | en_US |
| dc.contributor.approver | Collins, Michael | |
| dc.contributor.mitauthor | Quattoni, Ariadna | |
| dc.contributor.mitauthor | Carreras Perez, Xavier | |
| dc.contributor.mitauthor | Collins, Michael | |
| dc.relation.journal | Proceedings of the 26th Annual International Conference on Machine Learning | en_US |
| dc.eprint.version | Author's final manuscript | |
| dc.type.uri | http://purl.org/eprint/type/ConferencePaper | en_US |
| eprint.status | http://purl.org/eprint/status/PeerReviewed | en_US |
| dspace.orderedauthors | Quattoni, Ariadna; Carreras, Xavier; Collins, Michael; Darrell, Trevor | en |
| mit.license | OPEN_ACCESS_POLICY | en_US |
| mit.metadata.status | Complete | |
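The abstract above centers on a Euclidean projection onto the l1,∞ ball (the sum over parameter groups of the group-wise maximum absolute value). The sketch below is an illustrative pure-Python implementation of such a projection using bisection on a dual threshold; it is not the paper's O(n log n) algorithm, and all function names (`project_l1inf`, `_row_cap`) are hypothetical.

```python
import math


def _row_cap(row_sorted, theta):
    """Given one row's absolute values sorted in decreasing order, find
    mu >= 0 such that sum(max(a - mu, 0) for a in row) == theta.
    This is the new cap on the row's max after projection."""
    cum = 0.0
    for k, a in enumerate(row_sorted, start=1):
        cum += a
        mu = (cum - theta) / k
        nxt = row_sorted[k] if k < len(row_sorted) else 0.0
        if mu >= nxt:  # mu lies in [row_sorted[k], row_sorted[k-1]]
            return max(mu, 0.0)
    return 0.0  # theta exceeds the row sum: cap the row at zero


def project_l1inf(A, C):
    """Euclidean projection of A (a list of rows; one row per feature,
    one column per task) onto {B : sum_i max_j |B[i][j]| <= C}.

    Bisects on the dual variable theta: each candidate theta induces a
    per-row cap via _row_cap, and the caps' sum decreases continuously
    in theta, so we search for the theta whose caps sum to C."""
    S = [sorted((abs(x) for x in row), reverse=True) for row in A]
    if sum(r[0] for r in S) <= C:
        return [list(row) for row in A]  # already inside the ball
    lo, hi = 0.0, max(sum(r) for r in S)
    for _ in range(100):  # bisection on theta
        theta = 0.5 * (lo + hi)
        if sum(_row_cap(r, theta) for r in S) > C:
            lo = theta
        else:
            hi = theta
    caps = [_row_cap(r, 0.5 * (lo + hi)) for r in S]
    # Clip each entry's magnitude to its row cap, preserving signs.
    return [[math.copysign(min(abs(x), m), x) for x in row]
            for row, m in zip(A, caps)]
```

Inside a projected-gradient loop one would take a gradient step on the unregularized loss and then call `project_l1inf` on the parameter matrix to restore feasibility; rows driven to a zero cap give the jointly sparse solutions the abstract refers to.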