Density regression and uncertainty quantification with Bayesian deep noise neural networks
Publication: 6548857
DOI: 10.1002/STA4.604
MaRDI QID: Q6548857
Authors: Daiwei Zhang, Jian Kang, Tianci Liu
Publication date: 3 June 2024
Published in: Stat
Cites Work
- (5 cited works with titles not available)
- The No-U-Turn Sampler: Adaptively Setting Path Lengths in Hamiltonian Monte Carlo
- Geometric ergodicity of random scan Gibbs samplers for hierarchical one-way random effects models
- Convergence analysis of the Gibbs sampler for Bayesian general linear mixed models with improper priors
- Deep distribution regression
- Simple conditions for the convergence of the Gibbs sampler and Metropolis-Hastings algorithms
- Physics-constrained deep learning for high-dimensional surrogate modeling and uncertainty quantification without labeled data
- Error bounds for approximations with deep ReLU networks
- Stability of the Gibbs sampler for Bayesian hierarchical models
- Sampling-Based Approaches to Calculating Marginal Densities
- Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
- Prediction Intervals for Artificial Neural Networks
- Gibbs Sampling
- Probable networks and plausible predictions — a review of practical Bayesian methods for supervised neural networks
- Bayesian Density Regression
- Distribution-free Prediction Bands for Non-parametric Regression
- A Kernel-Expanded Stochastic Neural Network