dc.contributor.advisor | Rebentisch, Eric S. | |
dc.contributor.author | Lim, Shao Cong | |
dc.date.accessioned | 2023-01-19T19:48:01Z | |
dc.date.available | 2023-01-19T19:48:01Z | |
dc.date.issued | 2022-09 | |
dc.date.submitted | 2022-10-12T16:05:12.911Z | |
dc.identifier.uri | https://hdl.handle.net/1721.1/147405 | |
dc.description.abstract | Modern engineered systems are immensely complex. Extensive sets of natural language requirements guide the development of such systems. Tools that assist systems engineers in managing and extracting information from these requirements must therefore scale to match this complexity. However, the systems engineering community has lagged in adopting advanced natural language processing techniques. Pre-trained language models, such as BERT, represent the state of the art in the field. This thesis investigates whether these pre-trained language models can achieve higher model performance at a lower computational and labor cost than earlier techniques. The results show that adapting these language models through task-adaptive pretraining leads to consistent improvements in model performance and greater model robustness. These results indicate the potential of applying such language models in the systems engineering domain. However, much work remains to improve model performance and expand possible applications. | |
dc.publisher | Massachusetts Institute of Technology | |
dc.rights | In Copyright - Educational Use Permitted | |
dc.rights | Copyright MIT | |
dc.rights.uri | http://rightsstatements.org/page/InC-EDU/1.0/ | |
dc.title | A Case for Pre-trained Language Models in Systems Engineering | |
dc.type | Thesis | |
dc.description.degree | S.M. | |
dc.contributor.department | System Design and Management Program. | |
mit.thesis.degree | Master | |
thesis.degree.name | Master of Science in Engineering and Management | |