Affiliation:
1. Information Technology Institute, Vietnam National University, Hanoi, Vietnam
Abstract
We establish convergence rates for the collocation approximation by deep ReLU neural networks of solutions to elliptic PDEs with lognormal inputs, parametrized by $\boldsymbol{y}$ in the noncompact set ${\mathbb R}^\infty$. The approximation error is measured in the norm of the Bochner space $L_2({\mathbb R}^\infty, V, \gamma)$, where $\gamma$ is the infinite tensor-product standard Gaussian probability measure on ${\mathbb R}^\infty$ and $V$ is the energy space. We also obtain similar dimension-independent results when the lognormal inputs are parametrized by ${\mathbb R}^M$ of very large dimension $M$ and the approximation error is measured in the $\sqrt{g_M}$-weighted uniform norm of the Bochner space $L_\infty^{\sqrt{g_M}}({\mathbb R}^M, V)$, where $g_M$ is the density function of the standard Gaussian probability measure on ${\mathbb R}^M$.
Bibliography: 62 titles.
Funder
National Foundation for Science and Technology Development
Publisher
Steklov Mathematical Institute
Subject
Algebra and Number Theory
Cited by
2 articles.
1. Conclusions. In: Analyticity and Sparsity in Uncertainty Quantification for PDEs with Gaussian Random Field Inputs (2023).
2. Smolyak Sparse-Grid Interpolation and Quadrature. In: Analyticity and Sparsity in Uncertainty Quantification for PDEs with Gaussian Random Field Inputs (2023).