
dc.contributor.advisor: Murray, Fiona
dc.contributor.author: Marquez, Sofia M.
dc.date.accessioned: 2024-08-01T19:07:06Z
dc.date.available: 2024-08-01T19:07:06Z
dc.date.issued: 2024-02
dc.date.submitted: 2024-07-11T15:29:29.664Z
dc.identifier.uri: https://hdl.handle.net/1721.1/155913
dc.description.abstract: Transfer learning from large, pre-trained models and data augmentation are arguably the two most widespread solutions to the problem of data scarcity. However, both methods have limitations that restrict their effectiveness on natural language processing tasks. We posit that transfer learning benefits from fine-tuning on a larger target dataset, and that data augmentation benefits from applying transformations selectively rather than randomly. This work therefore evaluates a new augmentation paradigm that uses the attention masks of pre-trained transformers to apply text transformations at high-importance locations, producing augmented examples that can be used for further fine-tuning. Our comprehensive analysis points to limited success for this context-aware augmentation method. By shedding light on its strengths and limitations, we offer insights that can guide the selection of augmentation techniques for a variety of models, and lay the groundwork for further research toward effective natural language processing under data constraints.
dc.publisher: Massachusetts Institute of Technology
dc.rights: In Copyright - Educational Use Permitted
dc.rights: Copyright retained by author(s)
dc.rights.uri: https://rightsstatements.org/page/InC-EDU/1.0/
dc.title: Evaluating Data Augmentation with Attention Masks for Context Aware Transformations
dc.type: Thesis
dc.description.degree: M.Eng.
dc.contributor.department: Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences
mit.thesis.degree: Master
thesis.degree.name: Master of Engineering in Computation and Cognition
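
The abstract above only sketches the augmentation paradigm. Below is a minimal, hypothetical illustration of what attention-guided augmentation could look like, assuming a Hugging Face bert-base-uncased model, token importance scored by the attention each token receives (averaged over layers and heads), and masked-language-model substitution as the transformation. The function name attention_guided_augment, the choice of k, and the scoring rule are illustrative assumptions, not the thesis's actual pipeline.

```python
# Hypothetical sketch of context-aware augmentation (not the thesis code):
# score each token by the attention it receives from a pre-trained transformer,
# then apply a transformation (here, masked-LM substitution) at the
# highest-scoring positions.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

def attention_guided_augment(text: str, k: int = 2) -> str:
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # Attention received by each token, averaged over layers and heads.
    attn = torch.stack(outputs.attentions).mean(dim=(0, 2))  # [batch, query, key]
    importance = attn.sum(dim=1)[0]                          # [seq]

    # Ignore special tokens ([CLS], [SEP]) when ranking importance.
    ids = inputs["input_ids"][0]
    special = torch.tensor(
        tokenizer.get_special_tokens_mask(ids.tolist(), already_has_special_tokens=True),
        dtype=torch.bool,
    )
    importance = importance.masked_fill(special, float("-inf"))

    # Mask the k most attended-to tokens and let the MLM propose replacements.
    top_positions = torch.topk(importance, k).indices
    masked_ids = ids.clone()
    masked_ids[top_positions] = tokenizer.mask_token_id
    with torch.no_grad():
        logits = model(input_ids=masked_ids.unsqueeze(0)).logits[0]
    new_ids = ids.clone()
    new_ids[top_positions] = logits[top_positions].argmax(dim=-1)
    return tokenizer.decode(new_ids, skip_special_tokens=True)

print(attention_guided_augment("The quick brown fox jumps over the lazy dog."))
```

Under this sketch, the augmented sentences would then be added to the target dataset for further fine-tuning; a random-position baseline could be obtained by replacing the attention-based ranking with a uniform choice of positions.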

