Convergence of BP Algorithm for Training MLP with Linear Output
Authors: Hongmei Shao, Wei Wu, Wenbin Liu
Abstract: The capability of multilayer perceptrons (MLPs) to approximate continuous functions with arbitrary accuracy has been demonstrated over the past decades. The back-propagation (BP) algorithm is the most popular learning algorithm for training MLPs. In this paper, a simple iteration formula is used to select the learning rate for each cycle of the training procedure, and a convergence result is presented for the BP algorithm applied to an MLP with one hidden layer and a linear output unit. The monotonicity of the error function during the training iteration is also guaranteed.
Keywords: Multilayer perceptron; BP algorithm; Convergence; Monotonicity
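
The following is a minimal sketch of BP training for an MLP with one hidden layer and a single linear output unit, of the kind studied in the paper. The abstract only states that a "simple iteration formula" selects the learning rate per cycle so that the error decreases monotonically; the backtracking rule used below is a hypothetical stand-in for that formula, not the authors' actual scheme, and all function names and parameters are illustrative.

import numpy as np

def train_mlp_bp(X, y, n_hidden=8, epochs=200, eta0=0.5, seed=0):
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    W = rng.normal(scale=0.1, size=(n_hidden, n_in))  # input -> hidden weights
    v = rng.normal(scale=0.1, size=n_hidden)          # hidden -> linear output weights

    def forward(W, v):
        H = np.tanh(X @ W.T)   # sigmoidal hidden activations
        return H, H @ v        # linear output unit: weighted sum, no squashing

    def error(W, v):
        _, out = forward(W, v)
        return 0.5 * np.mean((out - y) ** 2)

    E = error(W, v)
    for _ in range(epochs):
        H, out = forward(W, v)
        r = (out - y) / len(y)                            # residual of the squared error
        grad_v = H.T @ r                                  # gradient w.r.t. output weights
        grad_W = ((r[:, None] * v) * (1 - H**2)).T @ X    # gradient w.r.t. hidden weights

        # Hypothetical per-cycle step-size rule: halve the rate until the error
        # is non-increasing, standing in for the paper's iteration formula and
        # enforcing the monotonicity property mentioned in the abstract.
        eta = eta0
        while True:
            W_new, v_new = W - eta * grad_W, v - eta * grad_v
            E_new = error(W_new, v_new)
            if E_new <= E or eta < 1e-8:
                break
            eta *= 0.5
        W, v, E = W_new, v_new, E_new
    return W, v, E

# Example usage on a toy regression problem.
if __name__ == "__main__":
    X = np.random.default_rng(1).uniform(-1, 1, size=(100, 2))
    y = np.sin(np.pi * X[:, 0]) + 0.5 * X[:, 1]
    W, v, final_error = train_mlp_bp(X, y)
    print("final training error:", final_error)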