| dc.contributor.advisor | Marin Soljačić. | en_US |
| dc.contributor.author | Dangovski, Rumen Rumenov. | en_US |
| dc.contributor.other | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science. | en_US |
| dc.date.accessioned | 2020-09-15T21:52:57Z | |
| dc.date.available | 2020-09-15T21:52:57Z | |
| dc.date.copyright | 2020 | en_US |
| dc.date.issued | 2020 | en_US |
| dc.identifier.uri | https://hdl.handle.net/1721.1/127338 | |
| dc.description | Thesis: S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, May 2020 | en_US |
| dc.description | Cataloged from the official PDF of thesis. | en_US |
| dc.description | Includes bibliographical references (pages 36-45). | en_US |
| dc.description.abstract | Stacking long short-term memory (LSTM) cells or gated recurrent units (GRUs) as part of a recurrent neural network (RNN) has become a standard approach to solving a number of tasks, ranging from language modeling to text summarization. Although LSTMs and GRUs were designed to model long-range dependencies more accurately than conventional RNNs, they nevertheless struggle to copy or recall information from the distant past. Here, we derive a physics-inspired, phase-coded representation of the memory state, the Rotational Unit of Memory (RUM), which unifies the concepts of unitary learning and associative memory. We show experimentally that RNNs based on RUMs solve basic sequential tasks, such as memory copying and memory recall, much better than LSTMs/GRUs. We further demonstrate that by replacing LSTMs/GRUs with RUMs and by integrating RUM with Transformers, we can apply neural networks to real-world problems such as language modeling and text summarization, yielding results comparable to the state of the art. (A minimal sketch of RUM's core rotation operation follows this record.) | en_US |
| dc.description.statementofresponsibility | by Rumen Rumenov Dangovski. | en_US |
| dc.format.extent | 45 pages | en_US |
| dc.language.iso | eng | en_US |
| dc.publisher | Massachusetts Institute of Technology | en_US |
| dc.rights | MIT theses may be protected by copyright. Please reuse MIT thesis content according to the MIT Libraries Permissions Policy, which is available through the URL provided. | en_US |
| dc.rights.uri | http://dspace.mit.edu/handle/1721.1/7582 | en_US |
| dc.subject | Electrical Engineering and Computer Science. | en_US |
| dc.title | Applied natural language processing inspired by fundamental mathematics and physics | en_US |
| dc.type | Thesis | en_US |
| dc.description.degree | S.M. | en_US |
| dc.contributor.department | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science | en_US |
| dc.identifier.oclc | 1192473139 | en_US |
| dc.description.collection | S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science | en_US |
| dspace.imported | 2020-09-15T21:52:57Z | en_US |
| mit.thesis.degree | Master | en_US |
| mit.thesis.department | EECS | en_US |
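As context for the abstract's Rotational Unit of Memory (RUM), below is a minimal NumPy sketch of the rotation operation at its core: an operator Rotation(a, b) that rotates vectors within the two-dimensional plane spanned by a and b, by the angle between them, and acts as the identity on the orthogonal complement. Because the operator is orthogonal, the memory update is norm-preserving, which is how the unit connects to unitary learning. The function and parameter names (rotation_operator, rum_step, Wx, Wh, b) are illustrative assumptions, and the step shown omits the gating and normalization of the full RUM cell developed in the thesis; treat this as a sketch under those assumptions, not the author's implementation.

```python
import numpy as np

def rotation_operator(a, b, eps=1e-8):
    """Rotation(a, b): d x d orthogonal matrix that rotates unit(a)
    toward unit(b) inside span{a, b} by the angle between them, and
    acts as the identity on the orthogonal complement."""
    u = a / (np.linalg.norm(a) + eps)
    w = b - (u @ b) * u                       # component of b orthogonal to a
    v = w / (np.linalg.norm(w) + eps)
    cos_t = (u @ b) / (np.linalg.norm(b) + eps)
    sin_t = np.sqrt(max(0.0, 1.0 - cos_t**2))
    d = a.shape[0]
    R = np.eye(d) - np.outer(u, u) - np.outer(v, v)   # identity off the span
    R += cos_t * (np.outer(u, u) + np.outer(v, v))    # 2-D rotation on the span
    R += sin_t * (np.outer(v, u) - np.outer(u, v))
    return R

def rum_step(h_prev, x_t, Wx, Wh, b):
    """One hypothetical, simplified recurrent step: rotate the memory
    toward the embedded input, then apply a ReLU readout. The actual
    RUM update in the thesis adds gating and time normalization."""
    eps_t = Wx @ x_t + b                      # embedded input (assumed form)
    R_t = rotation_operator(Wh @ h_prev, eps_t)
    return np.maximum(0.0, R_t @ h_prev)

# Tiny usage check: the rotation is orthogonal, so the memory norm is preserved.
rng = np.random.default_rng(0)
d_h, d_x = 8, 5
h = rng.normal(size=d_h)
x = rng.normal(size=d_x)
Wx = rng.normal(size=(d_h, d_x))
Wh = rng.normal(size=(d_h, d_h))
bias = rng.normal(size=d_h)
R = rotation_operator(Wh @ h, Wx @ x + bias)
assert np.allclose(np.linalg.norm(R @ h), np.linalg.norm(h))
```

The norm-preservation checked by the assert is the property that lets a RUM-style memory carry information over long sequences without the vanishing or exploding updates that hinder conventional RNNs, per the abstract's claims about copying and recall.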