On Approximation by Neural Networks with Optimized Activation Functions and Fixed Weights

Authors

  • Dansheng Yu
  • Yunyou Qian
  • Fengjun Li

DOI:

https://doi.org/10.4208/ata.OA-2021-0006

Keywords:

Approximation rate, modulus of continuity, modulus of smoothness, neural network operators.

Abstract

Recently, Li [16] introduced three kinds of single-hidden-layer feed-forward neural networks (FNNs) with optimized piecewise linear activation functions and fixed weights, and obtained upper and lower bound estimates for the approximation accuracy of these FNNs for continuous functions defined on bounded intervals. In the present paper, we point out that there are errors both in the definitions of the FNNs and in the proofs of the upper estimates in [16]. Using new methods, we establish correct estimates of the rates of approximation by Li’s neural networks.
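To make the type of operator concrete, the following is a minimal sketch of one standard construction of this kind: a single-hidden-layer FNN with a piecewise linear (ramp) activation whose inner weights and thresholds are fixed in advance, so that only the outer coefficients depend on samples of the target function. This is an illustrative assumption, not Li’s exact operators (whose definitions the paper corrects); the function names, the equally spaced thresholds, and the choice of activation are all ours. For this sketch the sup-norm error is controlled by the modulus of continuity ω(f, (b−a)/n), the kind of estimate discussed in the abstract.

```python
import numpy as np

def ramp(t):
    """Piecewise linear 'ramp' activation: 0 for t <= 0, t on [0, 1], 1 for t >= 1."""
    return np.clip(t, 0.0, 1.0)

def fnn_approximant(f, a, b, n):
    """
    Single-hidden-layer FNN on [a, b] with fixed inner weights and thresholds.

    Hidden unit j computes ramp(n * (x - x_j) / (b - a)) with equally spaced
    thresholds x_j = a + j * (b - a) / n.  Only the outer coefficients depend
    on f (through its samples), so the operator is linear in f and reproduces
    the piecewise linear interpolant of f at the nodes.
    """
    nodes = a + (b - a) * np.arange(n + 1) / n
    fvals = f(nodes)
    coeffs = np.diff(fvals)  # outer weights: f(x_{j+1}) - f(x_j)

    def N(x):
        x = np.asarray(x, dtype=float)
        # Hidden layer: one ramp unit per subinterval, fixed inner weight n/(b-a).
        hidden = ramp(n * (x[..., None] - nodes[:-1]) / (b - a))
        return fvals[0] + hidden @ coeffs

    return N

if __name__ == "__main__":
    f = np.cos
    N = fnn_approximant(f, 0.0, np.pi, n=32)
    xs = np.linspace(0.0, np.pi, 1001)
    print(f"sup-norm error with n=32 hidden units: {np.max(np.abs(f(xs) - N(xs))):.2e}")
```

For x in [x_j, x_{j+1}], the units with thresholds below x are saturated at 1 and those above contribute 0, so N(x) collapses to the linear interpolant on that subinterval; this is why the fixed-weight network attains the interpolation-type rate ω(f, (b−a)/n).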

Published

2023-03-03


Section

Articles

How to Cite

Yu, D., Qian, Y., & Li, F. (2023). On Approximation by Neural Networks with Optimized Activation Functions and Fixed Weights. Analysis in Theory and Applications, 39(1), 93-104. https://doi.org/10.4208/ata.OA-2021-0006