
Commun. Comput. Phys., 28 (2020), pp. 1768-1811.

Published online: 2020-11


This paper quantitatively characterizes the approximation power of deep feed-forward neural networks (FNNs) in terms of the number of neurons. It is shown by construction that ReLU FNNs with width $\mathcal{O}(\max\{d\lfloor N^{1/d}\rfloor,\,N+1\})$ and depth $\mathcal{O}(L)$ can approximate an arbitrary Hölder continuous function of order $\alpha\in(0,1]$ on $[0,1]^d$ with a nearly tight approximation rate $\mathcal{O}(\sqrt{d}\,N^{-2\alpha/d}L^{-2\alpha/d})$ measured in the $L^p$-norm for any $N,L\in\mathbb{N}^+$ and $p\in[1,\infty]$. More generally, for an arbitrary continuous function $f$ on $[0,1]^d$ with a modulus of continuity $\omega_f(\cdot)$, the constructive approximation rate is $\mathcal{O}(\sqrt{d}\,\omega_f(N^{-2/d}L^{-2/d}))$. We also extend our analysis to $f$ on irregular domains, or on domains localized in an $\varepsilon$-neighborhood of a $d_\mathcal{M}$-dimensional smooth manifold $\mathcal{M}\subseteq[0,1]^d$ with $d_\mathcal{M}\ll d$. In particular, in the case of an essentially low-dimensional domain, we show an approximation rate $\mathcal{O}\big(\omega_f\big(\tfrac{\varepsilon}{1-\delta}\sqrt{\tfrac{d}{d_\delta}}+\varepsilon\big)+\sqrt{d}\,\omega_f\big(\tfrac{\sqrt{d}}{1-\delta}\sqrt{d_\delta}\,N^{-2/d_\delta}L^{-2/d_\delta}\big)\big)$ for ReLU FNNs to approximate $f$ in the $\varepsilon$-neighborhood, where $d_\delta=\mathcal{O}\big(d_\mathcal{M}\tfrac{\ln(d/\delta)}{\delta^2}\big)$ for any $\delta\in(0,1)$, with $\delta$ the relative error of a projection that approximates an isometry when projecting $\mathcal{M}$ to a $d_\delta$-dimensional domain.
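To make the Hölder-case rate concrete, the sketch below simply evaluates the stated bound $\mathcal{O}(\sqrt{d}\,N^{-2\alpha/d}L^{-2\alpha/d})$ numerically, with the hidden constant assumed to be 1 (an assumption for illustration only; the paper's constant depends on the construction). It shows that the width parameter $N$ and depth parameter $L$ enter the bound symmetrically: doubling either one shrinks the bound by the same factor $2^{-2\alpha/d}$.

```python
import math

def holder_rate_bound(d: int, N: int, L: int, alpha: float = 1.0) -> float:
    """Evaluate sqrt(d) * N^(-2*alpha/d) * L^(-2*alpha/d), i.e. the stated
    approximation rate with its hidden O(.) constant taken as 1 (assumption)."""
    return math.sqrt(d) * N ** (-2 * alpha / d) * L ** (-2 * alpha / d)

# d = 4, alpha = 1: doubling N or doubling L has the same effect on the bound.
b_base   = holder_rate_bound(d=4, N=16, L=16)
b_wide   = holder_rate_bound(d=4, N=32, L=16)  # double the width parameter
b_deep   = holder_rate_bound(d=4, N=16, L=32)  # double the depth parameter

assert abs(b_wide - b_deep) < 1e-12                      # symmetric in N and L
assert abs(b_wide / b_base - 2 ** (-2 * 1.0 / 4)) < 1e-12  # factor 2^{-2α/d}
```

The $N^{-2\alpha/d}L^{-2\alpha/d}$ form also makes the curse of dimensionality visible: for fixed $N$ and $L$, the exponent $-2\alpha/d$ flattens toward 0 as $d$ grows, which is precisely what motivates the paper's sharper rate in terms of the manifold dimension $d_\delta$ when the domain is essentially low-dimensional.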


*Communications in Computational Physics*, 28(5), 1768-1811. doi:10.4208/cicp.OA-2020-0149