Details
| Original language | English |
|---|---|
| Article number | 118190 |
| Journal | Computer Methods in Applied Mechanics and Engineering |
| Volume | 445 |
| Early online date | 14 Jul. 2025 |
| Publication status | Published - 1 Oct. 2025 |
Abstract
Probabilistic model updating (PMU), which seeks to identify the probability distributions of model parameters by aligning model predictions with measured responses, is essential for ensuring the credibility of numerical models. However, PMU faces challenges such as the high computational cost of repeated solver calls and the curse of dimensionality in optimization. Motivated by the intrinsic parallel between generative models and PMU, both of which iteratively minimize the disparity between generated and real data distributions, this study presents a generic PMU framework that emulates the core principles of generative models. Specifically, building on generative adversarial networks (GANs), the proposed PMU-GAN integrates an interpretable probabilistic network as the generator and a learnable distance metric as the discriminator. This connection offers a compelling route to high-dimensional PMU, harnessing GANs' strong fitting capacity, optimization prowess, and effectiveness in high-dimensional settings. The generator uses a Gaussian mixture model (GMM) to approximate and sample the input distribution, together with a differentiable metamodel that replaces time-consuming solver calls when generating output samples. Compared to conventional neural networks, the GMM simplifies and constrains the admissible input distributions, which improves convergence. Via the Gumbel-Softmax and reparameterization tricks, the GMM's class probabilities and Gaussian component parameters are embedded in the generator as trainable parameters, enabling gradient-based optimization and making the generator interpretable. The discriminator, combining an invertible network with the maximum mean discrepancy (MMD), learns to gauge disparities between distributions through a training process. Through adversarial training, both the generator's generative power and the discriminator's discernment capability are enhanced. The efficacy of the proposed method in high-dimensional PMU is substantiated through numerical and experimental demonstrations, showcasing its potential to advance the field.
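The abstract describes making GMM sampling differentiable by embedding the class probabilities and component parameters as trainable quantities via the Gumbel-Softmax and reparameterization tricks. The paper's PMU-GAN implementation is not reproduced here; the following is a minimal NumPy sketch of that general idea for a hypothetical 1-D, three-component mixture, with all function names and parameter values invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax(logits, tau=0.5):
    """Relaxed (differentiable) one-hot sample over mixture components."""
    g = rng.gumbel(size=logits.shape)   # Gumbel(0, 1) noise
    y = (logits + g) / tau              # lower tau -> closer to hard one-hot
    e = np.exp(y - y.max())
    return e / e.sum()

def sample_gmm(logits, mus, log_sigmas, tau=0.5):
    """Reparameterized draw from a 1-D Gaussian mixture.

    logits, mus, log_sigmas play the role of trainable generator
    parameters; the returned sample is a smooth function of all of
    them, so gradients could flow through it in an autodiff framework.
    """
    w = gumbel_softmax(logits, tau)           # soft component selection
    eps = rng.standard_normal(mus.shape)      # reparameterization noise
    comps = mus + np.exp(log_sigmas) * eps    # one draw per component
    return float(w @ comps)                   # soft mixture of the draws

# Hypothetical mixture parameters (not from the paper)
logits = np.array([0.0, 1.0, -1.0])
mus = np.array([-2.0, 0.0, 2.0])
log_sigmas = np.log(np.array([0.3, 0.5, 0.3]))
samples = [sample_gmm(logits, mus, log_sigmas) for _ in range(1000)]
```

In an actual differentiable setting these operations would run under an autodiff framework (e.g. PyTorch, whose `torch.nn.functional.gumbel_softmax` implements the relaxation directly); NumPy is used here only to keep the sketch self-contained.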
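The discriminator pairs an invertible network with the maximum mean discrepancy (MMD). The learnable, invertible part is beyond a short sketch, but the MMD itself, a kernel-based distance between two sample sets, can be shown in a few lines. This is a plain biased RBF-kernel estimator with an assumed fixed bandwidth, not the paper's learnable metric:

```python
import numpy as np

def mmd2(X, Y, bandwidth=1.0):
    """Squared maximum mean discrepancy with a Gaussian (RBF) kernel.

    X: (n, d) and Y: (m, d) sample arrays. The estimate is near zero
    when both sample sets come from the same distribution and grows as
    the distributions separate.
    """
    def k(A, B):
        # pairwise squared distances, then RBF kernel values
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * bandwidth ** 2))
    return k(X, X).mean() + k(Y, Y).mean() - 2 * k(X, Y).mean()

rng = np.random.default_rng(1)
same = mmd2(rng.standard_normal((200, 2)), rng.standard_normal((200, 2)))
diff = mmd2(rng.standard_normal((200, 2)), rng.standard_normal((200, 2)) + 2.0)
```

Since every kernel evaluation is differentiable in the samples, an MMD loss of this form can drive gradient-based adversarial training of the generator.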
ASJC Scopus Subject Areas
- Engineering (all): Computational Mechanics
- Engineering (all): Mechanics of Materials
- Engineering (all): Mechanical Engineering
- Physics and Astronomy (all): General Physics and Astronomy
- Computer Science (all): Computer Science Applications
Cite
In: Computer Methods in Applied Mechanics and Engineering, Vol. 445, 118190, 01.10.2025.
Publication: Contribution to journal › Article › Research › Peer review
TY - JOUR
T1 - Enhancing high-dimensional probabilistic model updating
T2 - A generic generative model-inspired framework with GAN-embedded implementation
AU - Mo, Jiang
AU - Yan, Wang Ji
AU - Yuen, Ka Veng
AU - Beer, Michael
N1 - Publisher Copyright: © 2025
PY - 2025/10/1
Y1 - 2025/10/1
KW - Gaussian mixture model
KW - Generative adversarial network
KW - Maximum mean discrepancy
KW - Probabilistic model updating
KW - Probability distribution
KW - Reparameterization trick
UR - http://www.scopus.com/inward/record.url?scp=105010503249&partnerID=8YFLogxK
U2 - 10.1016/j.cma.2025.118190
DO - 10.1016/j.cma.2025.118190
M3 - Article
AN - SCOPUS:105010503249
VL - 445
JO - Computer Methods in Applied Mechanics and Engineering
JF - Computer Methods in Applied Mechanics and Engineering
SN - 0045-7825
M1 - 118190
ER -