dc.contributor.advisor | Solomon, Justin | |
dc.contributor.author | Gabrielsson, Rickard Brüel | |
dc.date.accessioned | 2023-11-02T20:11:33Z | |
dc.date.available | 2023-11-02T20:11:33Z | |
dc.date.issued | 2023-09 | |
dc.date.submitted | 2023-09-21T14:26:34.499Z | |
dc.identifier.uri | https://hdl.handle.net/1721.1/152728 | |
dc.description.abstract | We introduce Deep Augmentation, an approach to data augmentation using dropout to
dynamically transform a targeted layer within a neural network, with the option to use
the stop-gradient operation, offering significant improvements in model performance and
generalization. We demonstrate the efficacy of Deep Augmentation through extensive
experiments on contrastive learning tasks in computer vision and NLP domains, where we
observe substantial performance gains with ResNets and Transformers as the underlying
models. Our experiments reveal that targeting deeper layers with Deep Augmentation
outperforms augmenting the input data, and the simple, network- and data-agnostic nature of
this approach enables its seamless integration into computer vision and NLP pipelines. | |
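The abstract's core idea, applying dropout to a targeted hidden layer rather than to the input, in order to produce stochastic "views" for a contrastive objective, can be illustrated with a minimal numpy sketch. This is an illustrative toy, not the thesis's implementation: the network shape, dropout rate, and the `forward`/`dropout` helpers are all assumptions for demonstration, and the stop-gradient option (a no-op in a pure-numpy forward pass) is omitted.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def dropout(x, p, rng):
    # Inverted dropout: zero each unit with probability p, rescale survivors
    # so the expected activation magnitude is unchanged.
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def forward(x, weights, aug_layer=None, p=0.3, rng=None):
    # Plain MLP forward pass; optionally apply dropout to one targeted
    # hidden layer, mimicking Deep Augmentation's layer-wise transformation.
    h = x
    for i, W in enumerate(weights):
        h = relu(h @ W)
        if i == aug_layer:
            h = dropout(h, p, rng)
    return h

# Toy network: input dim 8, two hidden layers of width 16, output dim 4.
init = np.random.default_rng(0)
weights = [init.normal(size=(8, 16)),
           init.normal(size=(16, 16)),
           init.normal(size=(16, 4))]
x = init.normal(size=(2, 8))

# Two stochastic views of the same batch, differing only in the dropout
# mask applied at hidden layer 1 -- the pair a contrastive loss would pull together.
z1 = forward(x, weights, aug_layer=1, p=0.3, rng=np.random.default_rng(1))
z2 = forward(x, weights, aug_layer=1, p=0.3, rng=np.random.default_rng(2))
```

Targeting `aug_layer=1` here transforms a deeper representation instead of the raw input `x`, which is the contrast the abstract draws against conventional input-space augmentation.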
dc.publisher | Massachusetts Institute of Technology | |
dc.rights | In Copyright - Educational Use Permitted | |
dc.rights | Copyright retained by author(s) | |
dc.rights.uri | https://rightsstatements.org/page/InC-EDU/1.0/ | |
dc.title | Enhancing Self-Supervised Learning through Transformations in Higher Activation Space | |
dc.type | Thesis | |
dc.description.degree | S.M. | |
dc.contributor.department | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science | |
mit.thesis.degree | Master | |
thesis.degree.name | Master of Science in Electrical Engineering and Computer Science | |