Show simple item record

dc.contributor.author	Liang, Qiaohao
dc.contributor.author	Gongora, Aldair E
dc.contributor.author	Ren, Zekun
dc.contributor.author	Tiihonen, Armi
dc.contributor.author	Liu, Zhe
dc.contributor.author	Sun, Shijing
dc.contributor.author	Deneault, James R
dc.contributor.author	Bash, Daniil
dc.contributor.author	Mekki-Berrada, Flore
dc.contributor.author	Khan, Saif A
dc.contributor.author	Hippalgaonkar, Kedar
dc.contributor.author	Maruyama, Benji
dc.contributor.author	Brown, Keith A
dc.contributor.author	Fisher III, John
dc.contributor.author	Buonassisi, Tonio
dc.date.accessioned	2021-12-14T19:19:24Z
dc.date.available	2021-12-14T19:19:24Z
dc.date.issued	2021-12
dc.identifier.uri	https://hdl.handle.net/1721.1/138478
dc.description.abstract	Bayesian optimization (BO) has been leveraged for guiding autonomous and high-throughput experiments in materials science. However, few have evaluated the efficiency of BO across a broad range of experimental materials domains. In this work, we quantify the performance of BO with a collection of surrogate model and acquisition function pairs across five diverse experimental materials systems. By defining acceleration and enhancement metrics for materials optimization objectives, we find that surrogate models such as Gaussian Process (GP) with anisotropic kernels and Random Forest (RF) have comparable performance in BO, and both outperform the commonly used GP with isotropic kernels. GP with anisotropic kernels has demonstrated the most robustness, yet RF is a close alternative and warrants more consideration because it is free from distribution assumptions, has smaller time complexity, and requires less effort in initial hyperparameter selection. We also raise awareness about the benefits of using GP with anisotropic kernels in future materials optimization campaigns.	en_US
dc.language.iso	en
dc.publisher	Springer Science and Business Media LLC	en_US
dc.relation.isversionof	10.1038/s41524-021-00656-9	en_US
dc.rights	Creative Commons Attribution 4.0 International license	en_US
dc.rights.uri	https://creativecommons.org/licenses/by/4.0/	en_US
dc.source	Nature	en_US
dc.title	Benchmarking the performance of Bayesian optimization across multiple experimental materials science domains	en_US
dc.type	Article	en_US
dc.identifier.citation	Liang, Qiaohao, Gongora, Aldair E, Ren, Zekun, Tiihonen, Armi, Liu, Zhe et al. 2021. "Benchmarking the performance of Bayesian optimization across multiple experimental materials science domains." npj Computational Materials, 7 (1).
dc.contributor.department	Massachusetts Institute of Technology. Department of Materials Science and Engineering
dc.contributor.department	Massachusetts Institute of Technology. Research Laboratory of Electronics
dc.contributor.department	Massachusetts Institute of Technology. Department of Mechanical Engineering
dc.contributor.department	Singapore-MIT Alliance in Research and Technology (SMART)
dc.relation.journal	npj Computational Materials	en_US
dc.eprint.version	Final published version	en_US
dc.type.uri	http://purl.org/eprint/type/JournalArticle	en_US
eprint.status	http://purl.org/eprint/status/PeerReviewed	en_US
dc.date.updated	2021-12-14T19:13:49Z
dspace.orderedauthors	Liang, Q; Gongora, AE; Ren, Z; Tiihonen, A; Liu, Z; Sun, S; Deneault, JR; Bash, D; Mekki-Berrada, F; Khan, SA; Hippalgaonkar, K; Maruyama, B; Brown, KA; Fisher III, J; Buonassisi, T	en_US
dspace.date.submission	2021-12-14T19:13:51Z
mit.journal.volume	7	en_US
mit.journal.issue	1	en_US
mit.license	PUBLISHER_CC
mit.metadata.status	Authority Work and Publication Information Needed	en_US

