This paper analyzes expression rates for deep NN emulation of splines and high-order polynomials, as in [24].
Expression rates and stability in Sobolev norms of deep feedforward ReLU neural networks (NNs) are shown in terms of the number of parameters defining the NN, for continuous, piecewise polynomial functions.
Approximation rate bounds for emulations of real-valued functions on intervals by deep neural networks (DNNs) are established.
Chebyšev coefficients can be computed easily from the values of the function at the Clenshaw–Curtis points using the inverse fast Fourier transform.
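This interpolation-plus-FFT pipeline is classical; the sketch below (our NumPy illustration, not code from the paper) computes the Chebyšev coefficients of the degree-$n$ interpolant from samples at the Clenshaw–Curtis points $x_j = \cos(\pi j/n)$. It uses a real FFT of the mirrored samples, which agrees with the inverse-FFT formulation up to normalization; the name `cheb_coeffs` is ours.

```python
import numpy as np

def cheb_coeffs(f, n):
    """Chebyshev coefficients c_0, ..., c_n of the degree-n interpolant
    of f at the Clenshaw-Curtis points x_j = cos(pi*j/n), j = 0..n,
    so that f(x) ~ sum_k c_k * T_k(x) on [-1, 1]."""
    j = np.arange(n + 1)
    v = f(np.cos(np.pi * j / n))            # samples at Clenshaw-Curtis points
    # Mirroring the samples turns the cosine transform into a plain real FFT.
    w = np.fft.rfft(np.concatenate([v, v[-2:0:-1]]))
    c = w.real / n
    c[0] /= 2.0                             # endpoint coefficients are halved
    c[-1] /= 2.0
    return c

# Sanity check: T_3(x) = 4x^3 - 3x should give coefficients (0, 0, 0, 1).
print(np.round(cheb_coeffs(lambda x: 4 * x**3 - 3 * x, 3), 12))
```

The check recovers the exact coefficient vector of $T_3$, since polynomials of degree at most $n$ are interpolated exactly.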
The construction yields a ReLU NN with $2^{k-1}+2$ outputs: the first output equals the NN input $x$, and the remaining outputs emulate high-order Chebyšev polynomials of degrees $2^{k-1}, \dots, 2^k$.
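The power-of-two degree ladder reflects the composition identity $T_m(T_n(x)) = T_{mn}(x)$, so $T_{2^k}$ is reachable by composing $k$ copies of the fixed degree-2 polynomial $T_2$, one per level of the network. A minimal numerical check of this identity (our sketch, not the paper's NN construction; the helper names are hypothetical):

```python
import numpy as np

def T2(x):
    """Degree-2 Chebyshev polynomial, the elementary building block."""
    return 2.0 * x * x - 1.0

def cheb_pow2(k, x):
    """Evaluate T_{2^k}(x) via the identity T_m(T_n(x)) = T_{mn}(x),
    i.e. by composing T_2 with itself k times."""
    y = np.asarray(x, dtype=float)
    for _ in range(k):
        y = T2(y)
    return y

# Sanity check against the trigonometric definition T_n(cos t) = cos(n*t),
# here for n = 2^3 = 8.
t = np.linspace(0.0, np.pi, 9)
assert np.allclose(cheb_pow2(3, np.cos(t)), np.cos(8.0 * t))
```

In the NN setting, each exact evaluation of $T_2(y) = 2y^2 - 1$ is replaced by a ReLU approximation of the square, which is where network depth and parameter counts enter.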
We present an emulation of univariate Chebyšev polynomials $T_n(x)$ of arbitrary degree by ReLU NNs, which will be developed in [24].
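ReLU networks are piecewise linear, so any such emulation of a nonlinear polynomial is necessarily approximate. A standard ingredient in results of this type is Yarotsky's approximation of $x^2$ on $[0,1]$, which subtracts scaled sawtooth functions from the identity, each sawtooth built by composing a hat function made of three ReLU units. We sketch it below as background; it is not claimed to be the specific construction of [24].

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    """Piecewise-linear hat on [0,1] from three ReLU units:
    g(x) = 2x on [0, 1/2] and 2(1 - x) on [1/2, 1]."""
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

def relu_square(x, m):
    """Depth-m ReLU approximation of x**2 on [0,1] via
    x**2 = x - sum_{s>=1} g_s(x) / 4**s, with g_s the s-fold
    composition of the hat; truncating at m terms gives uniform
    error at most 4**(-(m+1))."""
    g = np.asarray(x, dtype=float)
    out = g.copy()
    for s in range(1, m + 1):
        g = hat(g)
        out -= g / 4.0**s
    return out

x = np.linspace(0.0, 1.0, 1001)
print(np.max(np.abs(relu_square(x, 8) - x**2)))  # bounded by 4**(-9)
```

The depth grows only linearly in $m$, i.e. logarithmically in the target accuracy, which is the mechanism behind the small network sizes in emulation results of this kind.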
J. A. A. Opschoor and C. Schwab, "Deep ReLU networks and high-order finite element methods II: Chebyšev emulation," Computers & Mathematics with Applications 169 (2024), 142–162.