On the approximation of rough functions with deep neural networks
Abstract
The essentially non-oscillatory (ENO) procedure and its variant, the ENO-SR procedure, are very efficient algorithms for interpolating (reconstructing) rough functions. We prove that the ENO (and ENO-SR) procedures are equivalent to deep ReLU neural networks. This demonstrates the ability of deep ReLU neural networks to approximate rough functions to high order of accuracy. Numerical tests with the resulting trained neural networks show excellent performance at interpolating functions, approximating solutions of nonlinear conservation laws, and data compression.
Springer