Enhancing high-dimensional probabilistic model updating: A generic generative model-inspired framework with GAN-embedded implementation

Research output: Contribution to journal › Article › Research › peer review

Authors

  • Jiang Mo
  • Wang Ji Yan
  • Ka Veng Yuen
  • Michael Beer

Research Organisations

External Research Organisations

  • University of Macau
  • Guangdong-Hong Kong-Macao Joint Laboratory on Smart Cities
  • University of Liverpool
  • Tongji University

Details

Original language: English
Article number: 118190
Journal: Computer Methods in Applied Mechanics and Engineering
Volume: 445
Early online date: 14 Jul 2025
Publication status: Published - 1 Oct 2025

Abstract

Probabilistic model updating (PMU), which seeks to identify the probability distributions of model parameters by aligning model predictions with measured responses, is essential for ensuring the credibility of numerical models. However, PMU faces challenges like high computational costs from repeated solver calls and the curse of dimensionality in optimization. Driven by the intrinsic parallels between generative models and PMU, characterized by iterative processes aimed at minimizing disparities between generated and real data distributions, this study presents an innovative and generic PMU framework that emulates the core principles of generative models. Specifically, based on generative adversarial networks (GANs), the PMU-GAN is designed by integrating a probabilistic interpretable network as a generator and a learnable distance metric as a discriminator. By establishing this connection, the innovative approach offers compelling solutions for high-dimensional PMU, harnessing GANs’ strong fitting capacity, optimization prowess, and excellence in high-dimensional realms. The generator utilizes a Gaussian mixture model (GMM) for input distribution approximation and sampling, alongside a differentiable metamodel to expedite output sample generation in lieu of time-consuming solvers. Compared to conventional neural networks, the GMM simplifies and constrains the input distribution forms, facilitating improved convergence. By employing Gumbel-SoftMax and reparameterization tricks, the class probabilities and Gaussian component parameters unique to GMMs are embedded in the generator as trainable parameters, enabling gradient-based optimization and endowing the generator with interpretability. The discriminator, featuring an invertible network and maximum mean discrepancy, refines its ability to gauge distribution disparities through a learning process. 
Through adversarial training, both the generator's generative power and the discriminator's discernment capability are enhanced. The efficacy of the proposed method in high-dimensional PMU is substantiated through numerical and experimental demonstrations, showcasing its potential in advancing the field.
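The generator described in the abstract combines a Gaussian mixture model with the Gumbel-Softmax and reparameterization tricks, so that class probabilities and Gaussian component parameters become trainable by gradient-based optimization. A minimal NumPy sketch of that sampling step is given below; the parameter names, dimensions, and values are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative trainable parameters of a 3-component, 2-D Gaussian mixture
# (hypothetical values, not taken from the paper):
logits = np.array([1.0, 0.5, -0.5])                    # unnormalized class probabilities
mu = np.array([[-2.0, 0.0], [0.0, 1.0], [3.0, -1.0]])  # component means
log_sigma = np.zeros((3, 2))                           # component log standard deviations

def gumbel_softmax(logits, tau=0.5):
    """Differentiable 'soft' one-hot draw over mixture components."""
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))  # Gumbel(0, 1) noise
    y = (logits + g) / tau
    e = np.exp(y - y.max())
    return e / e.sum()

def sample_gmm():
    """Reparameterized GMM sample: randomness enters only through the noise
    terms, so gradients could flow to logits, mu, and log_sigma."""
    w = gumbel_softmax(logits)               # (3,) soft component selection
    eps = rng.standard_normal(mu.shape)      # (3, 2) standard-normal noise
    comp = mu + np.exp(log_sigma) * eps      # per-component reparameterized draws
    return w @ comp                          # (2,) mixed output sample

x = sample_gmm()
```

In an actual training loop these operations would run in an automatic-differentiation framework; the sketch only shows why the sampling is differentiable in the mixture parameters.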

Keywords

    Gaussian mixture model, Generative adversarial network, Maximum mean discrepancy, Probabilistic model updating, Probability distribution, Reparameterization trick
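Among the listed keywords, maximum mean discrepancy (MMD) is the distribution distance the paper's discriminator builds on. A minimal sketch of a biased squared-MMD estimate with a Gaussian kernel follows; the kernel, bandwidth, and sample sizes are arbitrary illustrative choices, not the paper's learned metric:

```python
import numpy as np

def gaussian_kernel(X, Y, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between two sample sets."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def mmd2(X, Y, bandwidth=1.0):
    """Biased estimate of the squared maximum mean discrepancy."""
    return (gaussian_kernel(X, X, bandwidth).mean()
            - 2.0 * gaussian_kernel(X, Y, bandwidth).mean()
            + gaussian_kernel(Y, Y, bandwidth).mean())

rng = np.random.default_rng(1)
real = rng.standard_normal((200, 2))
close = rng.standard_normal((200, 2))        # same distribution as `real`
far = rng.standard_normal((200, 2)) + 3.0    # shifted distribution

# MMD is near zero for matching distributions and grows with mismatch.
```

The paper additionally passes samples through a learnable invertible network before computing MMD, which this fixed-kernel sketch does not attempt to reproduce.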

Cite this

Enhancing high-dimensional probabilistic model updating: A generic generative model-inspired framework with GAN-embedded implementation. / Mo, Jiang; Yan, Wang Ji; Yuen, Ka Veng et al.
In: Computer Methods in Applied Mechanics and Engineering, Vol. 445, 118190, 01.10.2025.


@article{03d652d3e66e42d08745d5c45365ebe6,
title = "Enhancing high-dimensional probabilistic model updating: A generic generative model-inspired framework with GAN-embedded implementation",
abstract = "Probabilistic model updating (PMU), which seeks to identify the probability distributions of model parameters by aligning model predictions with measured responses, is essential for ensuring the credibility of numerical models. However, PMU faces challenges like high computational costs from repeated solver calls and the curse of dimensionality in optimization. Driven by the intrinsic parallels between generative models and PMU, characterized by iterative processes aimed at minimizing disparities between generated and real data distributions, this study presents an innovative and generic PMU framework that emulates the core principles of generative models. Specifically, based on generative adversarial networks (GANs), the PMU-GAN is designed by integrating a probabilistic interpretable network as a generator and a learnable distance metric as a discriminator. By establishing this connection, the innovative approach offers compelling solutions for high-dimensional PMU, harnessing GANs{\textquoteright} strong fitting capacity, optimization prowess, and excellence in high-dimensional realms. The generator utilizes a Gaussian mixture model (GMM) for input distribution approximation and sampling, alongside a differentiable metamodel to expedite output sample generation in lieu of time-consuming solvers. Compared to conventional neural networks, the GMM simplifies and constrains the input distribution forms, facilitating improved convergence. By employing Gumbel-SoftMax and reparameterization tricks, the class probabilities and Gaussian component parameters unique to GMMs are embedded in the generator as trainable parameters, enabling gradient-based optimization and endowing the generator with interpretability. The discriminator, featuring an invertible network and maximum mean discrepancy, refines its ability to gauge distribution disparities through a learning process. 
Through adversarial training, both the generator's generative power and the discriminator's discernment capability are enhanced. The efficacy of the proposed method in high-dimensional PMU is substantiated through numerical and experimental demonstrations, showcasing its potential in advancing the field.",
keywords = "Gaussian mixture model, Generative adversarial network, Maximum mean discrepancy, Probabilistic model updating, Probability distribution, Reparameterization trick",
author = "Jiang Mo and Yan, {Wang Ji} and Yuen, {Ka Veng} and Michael Beer",
note = "Publisher Copyright: {\textcopyright} 2025",
year = "2025",
month = oct,
day = "1",
doi = "10.1016/j.cma.2025.118190",
language = "English",
volume = "445",
journal = "Computer Methods in Applied Mechanics and Engineering",
issn = "0045-7825",
publisher = "Elsevier BV",

}


TY - JOUR

T1 - Enhancing high-dimensional probabilistic model updating

T2 - A generic generative model-inspired framework with GAN-embedded implementation

AU - Mo, Jiang

AU - Yan, Wang Ji

AU - Yuen, Ka Veng

AU - Beer, Michael

N1 - Publisher Copyright: © 2025

PY - 2025/10/1

Y1 - 2025/10/1

N2 - Probabilistic model updating (PMU), which seeks to identify the probability distributions of model parameters by aligning model predictions with measured responses, is essential for ensuring the credibility of numerical models. However, PMU faces challenges like high computational costs from repeated solver calls and the curse of dimensionality in optimization. Driven by the intrinsic parallels between generative models and PMU, characterized by iterative processes aimed at minimizing disparities between generated and real data distributions, this study presents an innovative and generic PMU framework that emulates the core principles of generative models. Specifically, based on generative adversarial networks (GANs), the PMU-GAN is designed by integrating a probabilistic interpretable network as a generator and a learnable distance metric as a discriminator. By establishing this connection, the innovative approach offers compelling solutions for high-dimensional PMU, harnessing GANs’ strong fitting capacity, optimization prowess, and excellence in high-dimensional realms. The generator utilizes a Gaussian mixture model (GMM) for input distribution approximation and sampling, alongside a differentiable metamodel to expedite output sample generation in lieu of time-consuming solvers. Compared to conventional neural networks, the GMM simplifies and constrains the input distribution forms, facilitating improved convergence. By employing Gumbel-SoftMax and reparameterization tricks, the class probabilities and Gaussian component parameters unique to GMMs are embedded in the generator as trainable parameters, enabling gradient-based optimization and endowing the generator with interpretability. The discriminator, featuring an invertible network and maximum mean discrepancy, refines its ability to gauge distribution disparities through a learning process. 
Through adversarial training, both the generator's generative power and the discriminator's discernment capability are enhanced. The efficacy of the proposed method in high-dimensional PMU is substantiated through numerical and experimental demonstrations, showcasing its potential in advancing the field.

AB - Probabilistic model updating (PMU), which seeks to identify the probability distributions of model parameters by aligning model predictions with measured responses, is essential for ensuring the credibility of numerical models. However, PMU faces challenges like high computational costs from repeated solver calls and the curse of dimensionality in optimization. Driven by the intrinsic parallels between generative models and PMU, characterized by iterative processes aimed at minimizing disparities between generated and real data distributions, this study presents an innovative and generic PMU framework that emulates the core principles of generative models. Specifically, based on generative adversarial networks (GANs), the PMU-GAN is designed by integrating a probabilistic interpretable network as a generator and a learnable distance metric as a discriminator. By establishing this connection, the innovative approach offers compelling solutions for high-dimensional PMU, harnessing GANs’ strong fitting capacity, optimization prowess, and excellence in high-dimensional realms. The generator utilizes a Gaussian mixture model (GMM) for input distribution approximation and sampling, alongside a differentiable metamodel to expedite output sample generation in lieu of time-consuming solvers. Compared to conventional neural networks, the GMM simplifies and constrains the input distribution forms, facilitating improved convergence. By employing Gumbel-SoftMax and reparameterization tricks, the class probabilities and Gaussian component parameters unique to GMMs are embedded in the generator as trainable parameters, enabling gradient-based optimization and endowing the generator with interpretability. The discriminator, featuring an invertible network and maximum mean discrepancy, refines its ability to gauge distribution disparities through a learning process. 
Through adversarial training, both the generator's generative power and the discriminator's discernment capability are enhanced. The efficacy of the proposed method in high-dimensional PMU is substantiated through numerical and experimental demonstrations, showcasing its potential in advancing the field.

KW - Gaussian mixture model

KW - Generative adversarial network

KW - Maximum mean discrepancy

KW - Probabilistic model updating

KW - Probability distribution

KW - Reparameterization trick

UR - http://www.scopus.com/inward/record.url?scp=105010503249&partnerID=8YFLogxK

U2 - 10.1016/j.cma.2025.118190

DO - 10.1016/j.cma.2025.118190

M3 - Article

AN - SCOPUS:105010503249

VL - 445

JO - Computer Methods in Applied Mechanics and Engineering

JF - Computer Methods in Applied Mechanics and Engineering

SN - 0045-7825

M1 - 118190

ER -
