Physics-constrained non-Gaussian probabilistic learning on manifolds
From MaRDI portal
Publication: 6495601
DOI: 10.1002/NME.6202
Wikidata: Q127339623
Scholia: Q127339623
MaRDI QID: Q6495601
Christian Soize, Roger G. Ghanem
Publication date: 30 April 2024
Published in: International Journal for Numerical Methods in Engineering
Keywords: probabilistic learning; machine learning; uncertainty quantification; Kullback-Leibler; statistical constraints; data driven
MSC classification: Artificial intelligence (68Txx); Multivariate analysis (62Hxx); Probabilistic methods, stochastic differential equations (65Cxx)
Related Items (2)
- Updating an uncertain and expensive computational model in structural dynamics based on one single target FRF using a probabilistic learning tool
- Sampling of Bayesian posteriors with a non-Gaussian probabilistic learning on manifolds from a small dataset
Cites Work
- A Mathematical Theory of Communication
- On the statistical dependence for the components of random elasticity tensors exhibiting material symmetry properties
- Kullback-Leibler upper confidence bounds for optimal sequential allocation
- Uncertainty quantification. An accelerated course with advanced applications in computational engineering
- Approximate Bayesian computational methods
- Data-driven probability concentration and sampling on manifold
- Stochastic spectral methods for efficient Bayesian solution of inverse problems
- An algorithm for finding the distribution of maximal entropy
- Polynomial chaos representation of databases on manifolds
- Statistical and computational inverse problems.
- Entropy-based closure for probabilistic learning on manifolds
- Supervised distance metric learning through maximization of the Jeffrey divergence
- Non-Gaussian positive-definite matrix-valued random fields for elliptic stochastic partial differential operators
- Inverse problems: A Bayesian perspective
- Information Theory and Statistical Mechanics
- Polynomial Chaos Expansion of a Multimodal Random Vector
- Construction of probability distributions in high dimension using the maximum entropy principle: Applications to stochastic processes, random fields and random matrices
- Ambiguous Joint Chance Constraints Under Mean and Dispersion Information
- Geometric diffusions as a tool for harmonic analysis and structure definition of data: Diffusion maps
- Learning Variable-Length Markov Models of Behavior
- Goal-Oriented Optimal Approximations of Bayesian Linear Inverse Problems
- Calculation of Lagrange Multipliers in the Construction of Maximum Entropy Distributions in High Stochastic Dimension
- Elements of Information Theory
- On Information and Sufficiency