Show simple item record

dc.contributor.advisor: Gibson, Edward
dc.contributor.advisor: Berwick, Robert C.
dc.contributor.author: Heuser, Annika
dc.date.accessioned: 2023-01-19T18:39:10Z
dc.date.available: 2023-01-19T18:39:10Z
dc.date.issued: 2022-09
dc.date.submitted: 2022-08-11T01:31:41.471Z
dc.identifier.uri: https://hdl.handle.net/1721.1/147233
dc.description.abstract: Psycholinguists study online language processing to gain insight into both the different mental representations of various sentence types and the computational resources required to build those representations. Psycholinguists have a number of tools available to them, the most prevalent being eye-tracking and self-paced reading (SPR). However, a lesser-known tool called the Maze task, more specifically G(rammatical)-Maze, is arguably a better choice for detecting and localizing differences in processing difficulty from word to word. In G-Maze, a participant must choose between each successive word in a sentence and a distractor word that does not make sense given the preceding context. If a participant chooses the distractor rather than the actual word, the trial ends and they may not complete the sentence. Like SPR, G-Maze can be cheaply run on a crowdsourcing platform, but it does a better job of localizing effects and filtering out noisy data. Still, the effort required to pick contextually inappropriate distractors for hundreds of words might cause an experimenter to hesitate before choosing this method. Boyce et al. (2020) remove this hesitation with A(uto)-Maze, a tool that automatically generates distractors using a computational language model. In this thesis, we introduce the next generation of A-Maze: T(ransformer)-Maze. Transformer models are the current state of the art in natural language processing, and thousands, pretrained in a variety of languages, are freely available on the internet, specifically through Huggingface's Transformers package. In our validation experiment, T-Maze proves itself to be as effective as G-Maze with handmade materials, run in a lab. We are excited to provide psycholinguists with a new tool that allows them to easily gather high-quality online sentence processing data in many different languages.
dc.publisher: Massachusetts Institute of Technology
dc.rights: In Copyright - Educational Use Permitted
dc.rights: Copyright MIT
dc.rights.uri: http://rightsstatements.org/page/InC-EDU/1.0/
dc.title: Transformer-Maze
dc.type: Thesis
dc.description.degree: M.Eng.
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
mit.thesis.degree: Master
thesis.degree.name: Master of Engineering in Computation and Cognition
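The core idea in the abstract — scoring candidate distractors by how poorly they fit the preceding context, as A-Maze/T-Maze do with a language model — can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the toy bigram table and the names `toy_probability` and `pick_distractor` are invented stand-ins for the pretrained Transformer (e.g. one loaded via Huggingface's Transformers package) that T-Maze would actually use to estimate P(word | context).

```python
import math

# Hypothetical stand-in for a language model's conditional probabilities.
# In T-Maze this role would be played by a pretrained Transformer.
TOY_BIGRAMS = {
    ("the", "cat"): 0.20,
    ("the", "dog"): 0.15,
    ("the", "melody"): 0.01,
    ("the", "of"): 0.0001,
}

def toy_probability(context, word):
    """Toy estimate of P(word | context), using only the last context word."""
    if not context:
        return 0.01
    return TOY_BIGRAMS.get((context[-1], word), 0.001)

def surprisal(context, word, prob_fn=toy_probability):
    """Surprisal in bits: -log2 P(word | context). High surprisal means
    the word fits the context badly."""
    return -math.log2(prob_fn(context, word))

def pick_distractor(context, candidates, prob_fn=toy_probability):
    """Pick the candidate that fits the context worst (highest surprisal),
    mirroring the Maze requirement that the distractor make no sense
    given the preceding words."""
    return max(candidates, key=lambda w: surprisal(context, w, prob_fn))

# Given the context "the", a function word like "of" is far more
# surprising than "cat", so it would serve as the distractor.
distractor = pick_distractor(["the"], ["cat", "of"])
```

A real T-Maze pipeline would additionally match distractors to targets on properties such as length and frequency, but the selection criterion — maximize model surprisal in context — is the same.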

