Show simple item record

dc.contributor.author        Buehler, Markus J
dc.date.accessioned          2024-09-18T18:33:59Z
dc.date.available            2024-09-18T18:33:59Z
dc.date.issued               2023-08-28
dc.identifier.uri            https://hdl.handle.net/1721.1/156895
dc.description.abstract      We report a flexible language-model-based deep learning strategy, applied here to solve complex forward and inverse problems in protein modeling, based on an attention neural network that integrates transformer and graph convolutional architectures in a causal multi-headed graph mechanism, to realize a generative pretrained model. The model is applied to predict secondary structure content (at the per-residue level and overall), protein solubility, and sequencing tasks. Further trained on inverse tasks, the model can design proteins with these properties as target features. The model is formulated as a general, fully prompt-based framework that can be adapted for a variety of downstream tasks. We find that adding tasks yields emergent synergies that the model exploits to improve overall performance beyond what would be possible by training on each dataset alone. Case studies validate the method, yielding protein designs focused on structural materials but also exploring applicability to the design of soluble, antimicrobial biomaterials. While our model is trained here, with available datasets, to perform eight distinct tasks, it can be extended to solve additional problems. In a broader sense, this study illustrates a form of multiscale modeling that relates a set of ultimate building blocks (here, byte-level utf8 characters that define the nature of the physical system at hand) to complex output. This materiomic scheme captures complex emergent relationships between universal building blocks and resulting properties, via a synergizing learning capacity, to express a set of potentialities embedded in the knowledge used in training via the interplay of universality and diversity. Significance statement: Predicting the properties of materials from a flexible description of their structure, environment, or process is a long-standing challenge in multiscale modeling. Our MaterioFormer language model, trained to solve forward and inverse tasks, incorporates a deep learning capacity through attention and graph strategies to yield a multimodal approach to modeling and designing materials. Because the model is prompt-based and information is encoded consistently via byte-level utf8 tokenization, it can process diverse modalities of information, such as sequence data, task descriptions, and numbers, and offers a flexible workflow that integrates human and artificial intelligence (illustrative sketches of the attention mechanism and the tokenization follow this record). Autoregressive training, with pre-training against a large unlabeled dataset, allows for straightforward adjustment of specific objectives.  en_US
dc.language.iso              en
dc.publisher                 AIP Publishing  en_US
dc.relation.isversionof      10.1063/5.0157367  en_US
dc.rights                    Creative Commons Attribution  en_US
dc.rights.uri                https://creativecommons.org/licenses/by/4.0/  en_US
dc.source                    AIP Publishing  en_US
dc.title                     Generative pretrained autoregressive transformer graph neural network applied to the analysis and discovery of novel proteins  en_US
dc.type                      Article  en_US
dc.identifier.citation       Markus J. Buehler; Generative pretrained autoregressive transformer graph neural network applied to the analysis and discovery of novel proteins. J. Appl. Phys. 28 August 2023; 134 (8): 084902.  en_US
dc.contributor.department    Massachusetts Institute of Technology. Laboratory for Atomistic and Molecular Mechanics  en_US
dc.contributor.department    Massachusetts Institute of Technology. Center for Computational Science and Engineering  en_US
dc.relation.journal          Journal of Applied Physics  en_US
dc.eprint.version            Final published version  en_US
dc.type.uri                  http://purl.org/eprint/type/JournalArticle  en_US
eprint.status                http://purl.org/eprint/status/PeerReviewed  en_US
dc.date.updated              2024-09-18T18:23:06Z
dspace.orderedauthors        Buehler, MJ  en_US
dspace.date.submission       2024-09-18T18:23:09Z
mit.journal.volume           134  en_US
mit.journal.issue            8  en_US
mit.license                  PUBLISHER_CC
mit.metadata.status          Authority Work and Publication Information Needed  en_US
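
The abstract's "causal multi-headed graph mechanism" combines transformer self-attention with graph convolutional aggregation. The sketch below is a minimal, hypothetical PyTorch illustration of one way such a layer could be wired, reusing the causal attention map as a learned soft adjacency for a graph-convolution-style update; the module and variable names are illustrative and are not taken from the paper's codebase.

```python
import torch
import torch.nn as nn

class CausalGraphAttention(nn.Module):
    """Causal multi-head self-attention whose attention map is reused as a
    soft adjacency matrix for a graph-convolution-style neighbor update.
    A hypothetical sketch, not the paper's actual layer."""

    def __init__(self, dim: int, heads: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.mix = nn.Linear(dim, dim)  # per-node transform for the graph step

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n = x.size(1)
        # Causal mask: True entries are blocked, so token i attends only to j <= i.
        mask = torch.triu(torch.ones(n, n, dtype=torch.bool, device=x.device),
                          diagonal=1)
        h, adj = self.attn(x, x, x, attn_mask=mask,
                           need_weights=True, average_attn_weights=True)
        # Graph-convolution-style step: aggregate neighbors weighted by the
        # head-averaged attention map, then transform and add residually.
        return h + self.mix(adj @ x)

x = torch.randn(2, 16, 64)                     # (batch, tokens, embedding)
out = CausalGraphAttention(dim=64, heads=8)(x)
print(out.shape)                               # torch.Size([2, 16, 64])
```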


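The significance statement notes that all inputs, whether task descriptions, sequence data, or numbers, are encoded uniformly via byte-level utf8 tokenization. Below is a minimal sketch of what that means in plain Python; the function names and the example prompt are illustrative, not drawn from the paper.

```python
def encode(prompt: str) -> list[int]:
    """Byte-level utf8 tokenization: one token per byte, vocabulary size 256."""
    return list(prompt.encode("utf-8"))

def decode(tokens: list[int]) -> str:
    """Inverse mapping from byte tokens back to text."""
    return bytes(tokens).decode("utf-8")

# Task descriptions, protein sequences, and numbers share one token stream;
# the task-style prompt below is a hypothetical example.
prompt = "CalculateSolubility<GEECDCGSPSNPCCDAATCKLRPGAQCADGLCCDQCRFKK>"
tokens = encode(prompt)
assert decode(tokens) == prompt
print(f"{len(prompt)} characters -> {len(tokens)} byte tokens")
```

Because every modality reduces to the same 256-symbol vocabulary, forward and inverse tasks differ only in the prompt text, not in the model's input pipeline.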