Skip-Thought Vectors
Author(s)
Kiros, Ryan; Zhu, Yukun; Salakhutdinov, Ruslan R.; Zemel, Richard; Urtasun, Raquel; Fidler, Sanja; Torralba, Antonio
Download: Torralba-skip-thought-vectors.pdf (1.265 MB)
Terms of use
Publisher Policy: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
Metadata
Abstract
We describe an approach for unsupervised learning of a generic, distributed sentence encoder. Using the continuity of text from books, we train an encoder-decoder model that tries to reconstruct the surrounding sentences of an encoded passage. Sentences that share semantic and syntactic properties are thus mapped to similar vector representations. We next introduce a simple vocabulary expansion method to encode words that were not seen as part of training, allowing us to expand our vocabulary to a million words. After training our model, we extract and evaluate our vectors with linear models on 8 tasks: semantic relatedness, paraphrase detection, image-sentence ranking, question-type classification and 4 benchmark sentiment and subjectivity datasets. The end result is an off-the-shelf encoder that can produce highly generic sentence representations that are robust and perform well in practice. We will make our encoder publicly available.
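The vocabulary expansion method mentioned in the abstract lends itself to a compact illustration: fit a linear map from a large pretrained word-embedding space (such as word2vec) into the encoder's word-embedding space using the words the two vocabularies share, then apply that map to embed words the encoder never saw during training. The sketch below is a minimal, hypothetical NumPy rendition of that idea, not the authors' implementation; the dimensions, vocabulary size, and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

d_w2v, d_rnn = 300, 620        # illustrative dims: pretrained space vs. encoder's embedding space
shared_vocab = 20_000          # illustrative count of words present in both vocabularies

# Stand-ins for the two embedding tables, restricted to the shared vocabulary.
X = rng.standard_normal((shared_vocab, d_w2v))   # pretrained (e.g. word2vec) vectors
Y = rng.standard_normal((shared_vocab, d_rnn))   # the encoder's learned word embeddings

# Fit the linear map W by ordinary least squares: minimize ||X W - Y||^2.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def expand(w2v_vector: np.ndarray) -> np.ndarray:
    """Project a pretrained vector for an out-of-vocabulary word into the encoder's space."""
    return w2v_vector @ W

oov = rng.standard_normal(d_w2v)   # pretend pretrained vector for a word unseen in training
print(expand(oov).shape)           # (620,)
```

With such a map, any word covered by the pretrained embeddings can be fed to the encoder, which is how the effective vocabulary can grow to the million words the abstract describes.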
Date issued
2015-12
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
Advances in Neural Information Processing Systems 28 (NIPS 2015)
Publisher
Neural Information Processing Systems Foundation
Citation
Kiros, Ryan et al. "Skip-Thought Vectors." Advances in Neural Information Processing Systems 28 (NIPS 2015), 7-12 December, 2015, Montreal, Canada, Neural Information Processing Systems Foundation, 2015. © 2015 Neural Information Processing Systems Foundation
Version: Final published version