On Approximation by Neural Networks with Optimized Activation Functions and Fixed Weights
Authors: Dansheng Yu, Yunyou Qian & Fengjun Li
Abstract: Recently, Li [16] introduced three kinds of single-hidden-layer feed-forward neural networks (FNNs) with optimized piecewise linear activation functions and fixed weights, and obtained upper and lower bound estimates on the approximation accuracy of these FNNs for continuous functions defined on bounded intervals. In the present paper, we point out errors both in the definitions of the FNNs and in the proofs of the upper estimates in [16]. Using new methods, we also give correct approximation rate estimates for approximation by Li's neural networks.
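To illustrate the type of approximation the abstract describes, the sketch below builds a single-hidden-layer network with ReLU (piecewise linear) activations whose knots and inner weights are fixed in advance rather than trained; with suitably chosen output coefficients it reproduces the piecewise linear interpolant of a continuous function on a bounded interval. This is a classical construction for illustration only, not Li's specific operators from [16]; the function names and the choice of equally spaced knots are assumptions.

```python
import numpy as np

def fnn_interpolant(f, a, b, n):
    """Single-hidden-layer ReLU network with fixed weights (knots are
    n + 1 equally spaced points on [a, b]; nothing is trained). The
    output coefficients are chosen so the network equals the piecewise
    linear interpolant of f. Illustrative sketch, not Li's operator."""
    knots = np.linspace(a, b, n + 1)
    fv = f(knots)
    slopes = np.diff(fv) / np.diff(knots)               # slope on each subinterval
    coefs = np.concatenate(([slopes[0]], np.diff(slopes)))  # ReLU coefficients

    def net(x):
        x = np.asarray(x, dtype=float)
        hidden = np.maximum(x[..., None] - knots[:-1], 0.0)  # hidden layer: ReLU units
        return fv[0] + hidden @ coefs                        # linear output layer
    return net

# The uniform error is controlled by the modulus of continuity omega(f, (b-a)/n).
f = np.sin
net = fnn_interpolant(f, 0.0, np.pi, 64)
xs = np.linspace(0.0, np.pi, 1000)
err = float(np.max(np.abs(net(xs) - f(xs))))
```

For a smooth function such as sin on [0, pi], the error of this interpolant decays on the order of n**-2, consistent with the role the modulus of smoothness plays in the rate estimates discussed in the abstract.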
Keywords: Approximation rate; modulus of continuity; modulus of smoothness; neural network operators
The original abstract appears in Analysis in Theory and Applications (《分析论及其应用》).