
dc.contributor.author    Kiros, Ryan
dc.contributor.author    Zhu, Yukun
dc.contributor.author    Salakhutdinov, Ruslan R.
dc.contributor.author    Zemel, Richard
dc.contributor.author    Urtasun, Raquel
dc.contributor.author    Fidler, Sanja
dc.contributor.author    Torralba, Antonio
dc.date.accessioned    2018-02-05T16:14:54Z
dc.date.available    2018-02-05T16:14:54Z
dc.date.issued    2015-12
dc.identifier.uri    http://hdl.handle.net/1721.1/113418
dc.description.abstract    We describe an approach for unsupervised learning of a generic, distributed sentence encoder. Using the continuity of text from books, we train an encoder-decoder model that tries to reconstruct the surrounding sentences of an encoded passage. Sentences that share semantic and syntactic properties are thus mapped to similar vector representations. We next introduce a simple vocabulary expansion method to encode words that were not seen as part of training, allowing us to expand our vocabulary to a million words. After training our model, we extract and evaluate our vectors with linear models on 8 tasks: semantic relatedness, paraphrase detection, image-sentence ranking, question-type classification and 4 benchmark sentiment and subjectivity datasets. The end result is an off-the-shelf encoder that can produce highly generic sentence representations that are robust and perform well in practice. We will make our encoder publicly available.    en_US
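
As a rough illustration of the vocabulary expansion step described in the abstract, the short numpy sketch below learns a linear map from a large pretrained word-embedding space into the encoder's word-embedding space using words the two vocabularies share, then uses that map to project embeddings of words the encoder never saw during training. The paper learns such a mapping from word2vec vectors with linear regression; the dimensions, array names, and random matrices here are toy stand-ins, not the authors' trained embeddings.

import numpy as np

# Toy stand-ins for two embedding tables: a large pretrained word-embedding
# space (e.g. word2vec) and the skip-thought encoder's learned word-embedding
# space. Rows are words; the first block of words is assumed to be shared by
# both vocabularies.
rng = np.random.default_rng(0)
n_shared, n_unseen = 1000, 5
dim_pretrained, dim_encoder = 300, 64

pretrained_shared = rng.normal(size=(n_shared, dim_pretrained))  # pretrained vectors, shared words
encoder_shared = rng.normal(size=(n_shared, dim_encoder))        # encoder vectors, same words
pretrained_unseen = rng.normal(size=(n_unseen, dim_pretrained))  # words unseen during training

# Un-regularized linear regression: solve pretrained_shared @ W ~= encoder_shared for W.
W, *_ = np.linalg.lstsq(pretrained_shared, encoder_shared, rcond=None)

# Project the unseen words into the encoder's embedding space, expanding the
# vocabulary the trained encoder can consume.
encoder_unseen = pretrained_unseen @ W
print(encoder_unseen.shape)  # (5, 64)

In the abstract's terms, this kind of mapping is what lets the trained encoder be applied to a vocabulary of a million words without retraining.
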
dc.description.sponsorship    Natural Sciences and Engineering Research Council of Canada    en_US
dc.description.sponsorship    Samsung (Firm)    en_US
dc.description.sponsorship    Canadian Institute for Advanced Research    en_US
dc.description.sponsorship    United States. Office of Naval Research (Grant N00014-14-1-0232)    en_US
dc.language.iso    en_US
dc.publisher    Neural Information Processing Systems Foundation    en_US
dc.relation.isversionof    https://papers.nips.cc/paper/5950-skip-thought-vectors    en_US
dc.rights    Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.    en_US
dc.source    Neural Information Processing Systems (NIPS)    en_US
dc.title    Skip-thought Vectors    en_US
dc.type    Article    en_US
dc.identifier.citation    Kiros, Ryan et al. "Skip-Thought Vectors." Advances in Neural Information Processing Systems 28 (NIPS 2015), 7-12 December, 2015, Montreal, Canada, Neural Information Processing Systems Foundation, 2015. © 2015 Neural Information Processing Systems Foundation    en_US
dc.contributor.department    Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science    en_US
dc.contributor.mitauthor    Torralba, Antonio
dc.relation.journal    Advances in Neural Information Processing Systems 28 (NIPS 2015)    en_US
dc.eprint.version    Final published version    en_US
dc.type.uri    http://purl.org/eprint/type/ConferencePaper    en_US
eprint.status    http://purl.org/eprint/status/NonPeerReviewed    en_US
dspace.orderedauthors    Kiros, Ryan; Zhu, Yukun; Salakhutdinov, Ruslan; Zemel, Richard S.; Torralba, Antonio; Urtasun, Raquel; Fidler, Sanja    en_US
dspace.embargo.terms    N    en_US
dc.identifier.orcid    https://orcid.org/0000-0003-4915-0256
mit.license    PUBLISHER_POLICY    en_US

