DSpace@MIT


Analytical benchmark problems and methodological framework for the assessment and comparison of multifidelity optimization methods

Author(s)
Mainini, Laura; Serani, Andrea; Pehlivan-Solak, Hayriye; Di Fiore, Francesco; Rumpfkeil, Markus P.; Minisci, Edmondo; Quagliarella, Domenico; Yildiz, Sihmehmet; Ficini, Simone; Pellegrini, Riccardo; Thelen, Andrew; Bryson, Dean; Nikbay, Melike; Diez, Matteo; Beran, Philip S.; ...
Download: 11831_2025_Article_10392.pdf (6.576 MB)
Terms of use
Creative Commons Attribution https://creativecommons.org/licenses/by/4.0/
Abstract
As engineering systems increase in complexity and performance demands intensify, Multidisciplinary Design Optimization (MDO) methodologies are becoming essential for integrating models from multiple disciplines to optimize complex multi-physics systems. Within this context, major challenges remain in selecting appropriate disciplinary fidelity levels and in coupling them effectively. Multifidelity methods offer a promising path forward by strategically combining information sources of varying fidelity, whether computational or experimental, to enable efficient and scalable design exploration and optimization. Despite the development of numerous multifidelity methods, their comparative performance remains difficult to assess due to the absence of standardized benchmark frameworks capable of evaluating performance across diverse optimization tasks. To address this gap, this paper introduces a comprehensive benchmarking framework that includes: (i) a suite of analytical benchmark optimization problems designed to stress-test and validate multifidelity methods; (ii) a set of assessment metrics for quantifying and comparing performance over measurable objectives; and (iii) the classification, evaluation, and comparison of several families of multifidelity optimization methods and frameworks using the proposed benchmarks to identify their respective strengths and weaknesses in real-world scenarios. The proposed benchmark problems are analytically defined functions carefully selected to capture mathematical challenges commonly encountered in real-world applications, including high dimensionality, multimodality, discontinuities, and noise. Their closed-form nature ensures computational efficiency, high reproducibility, and a clear separation of algorithmic behavior from numerical artifacts. The accompanying performance metrics support the systematic evaluation of multifidelity methods, measuring both optimization effectiveness and global approximation accuracy.
By providing a rigorous, reproducible, and accessible benchmarking framework, this work aims to enable the broader community to understand, compare, and advance multifidelity optimization methods for complex problems in science and engineering.
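The benchmark suite itself is defined in the full article rather than on this record page. As a hedged illustration of the kind of closed-form multifidelity pair the abstract describes, the widely used Forrester function with its standard linearly perturbed low-fidelity variant can be sketched as follows (this is a classic example from the multifidelity literature, not necessarily one of the paper's problems):

```python
import numpy as np

def forrester_high(x):
    """High-fidelity objective: f(x) = (6x - 2)^2 * sin(12x - 4) on [0, 1]."""
    return (6.0 * x - 2.0) ** 2 * np.sin(12.0 * x - 4.0)

def forrester_low(x, a=0.5, b=10.0, c=-5.0):
    """Low-fidelity variant: a scaled and linearly shifted copy of the
    high-fidelity function, a standard analytical model of fidelity bias."""
    return a * forrester_high(x) + b * (x - 0.5) + c

# Dense grid evaluation stands in for an optimizer here; the point is only
# to show that the two fidelity levels disagree about where the minimum lies.
x = np.linspace(0.0, 1.0, 2001)
x_hi = x[np.argmin(forrester_high(x))]
x_lo = x[np.argmin(forrester_low(x))]
print(f"high-fidelity minimizer ~ {x_hi:.3f}")
print(f"low-fidelity minimizer  ~ {x_lo:.3f}")
```

Because the cheap model's minimizer sits far from the expensive model's, an optimizer that trusts the low-fidelity function alone converges to the wrong design; multifidelity methods exist precisely to correct this bias while spending as few high-fidelity evaluations as possible, which is what benchmark metrics of the kind described above are meant to quantify.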
Date issued
2025-11-10
URI
https://hdl.handle.net/1721.1/163736
Department
Massachusetts Institute of Technology. Department of Aeronautics and Astronautics
Journal
Archives of Computational Methods in Engineering
Publisher
Springer Netherlands
Citation
Mainini, L., Serani, A., Pehlivan-Solak, H. et al. Analytical benchmark problems and methodological framework for the assessment and comparison of multifidelity optimization methods. Arch Computat Methods Eng (2025).
Version: Final published version

Collections
  • MIT Open Access Articles
