Author:
Marcati Carlo, Opschoor Joost A. A., Petersen Philipp C., Schwab Christoph
Abstract
In certain polytopal domains $\varOmega$, in space dimension $d=2,3$, we prove exponential expressivity with stable ReLU Neural Networks (ReLU NNs) in $H^1(\varOmega)$ for weighted analytic function classes. These classes comprise in particular solution sets of source and eigenvalue problems for elliptic PDEs with analytic data. Functions in these classes are locally analytic on open subdomains $D\subset \varOmega$, but may exhibit isolated point singularities in the interior of $\varOmega$ or corner and edge singularities at the boundary $\partial \varOmega$. The exponential approximation rates are shown to hold in space dimension $d = 2$ on Lipschitz polygons with straight sides, and in space dimension $d = 3$ on Fichera-type polyhedral domains with plane faces. The constructive proofs indicate that NN depth and size increase poly-logarithmically with respect to the target NN approximation accuracy $\varepsilon > 0$ in $H^1(\varOmega)$. The results cover solution sets of linear, second-order elliptic PDEs with analytic data and certain nonlinear elliptic eigenvalue problems with analytic nonlinearities and singular, weighted analytic potentials as arise in electron structure models. Here, the functions correspond to electron densities that exhibit isolated point singularities at the nuclei.
Funder
Swiss Federal Institute of Technology Zurich
Publisher
Springer Science and Business Media LLC
Subject
Applied Mathematics, Computational Theory and Mathematics, Computational Mathematics, Analysis
Cited by
7 articles.