Convergence of BP Algorithm for Training MLP with Linear Output

Authors: Hongmei Shao, Wei Wu, Wenbin Liu

Abstract: The capability of multilayer perceptrons (MLPs) to approximate continuous functions with arbitrary accuracy has been demonstrated over the past decades. The back-propagation (BP) algorithm is the most popular learning algorithm for training MLPs. In this paper, a simple iteration formula is used to select the learning rate for each cycle of the training procedure, and a convergence result is presented for the BP algorithm for training an MLP with one hidden layer and a linear output unit. Monotonicity of the error function during the training iteration is also guaranteed.

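The abstract's specific learning-rate iteration formula is not reproduced in this record. As an illustration only, the following sketch trains an MLP with one sigmoid hidden layer and a single linear output unit by batch BP, and substitutes a simple step-halving rule (an assumption, not the paper's formula) to pick the learning rate each cycle so that the error never increases:

```python
import numpy as np

# Sketch of BP training for an MLP with one sigmoid hidden layer and a
# linear output unit. The per-cycle learning-rate rule below (halve the
# step until the error does not increase) is a stand-in assumption; the
# paper's own iteration formula is not given in this record.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data: approximate a continuous 1-D function.
X = np.linspace(-1.0, 1.0, 40).reshape(-1, 1)
y = np.sin(np.pi * X).ravel()

H = 8                                    # hidden units
W1 = rng.normal(scale=0.5, size=(1, H))  # input -> hidden weights
b1 = np.zeros(H)
w2 = rng.normal(scale=0.5, size=H)       # hidden -> linear output weights
b2 = 0.0

def forward(W1, b1, w2, b2):
    h = sigmoid(X @ W1 + b1)             # hidden activations
    return h, h @ w2 + b2                # linear output unit

def error(out):
    return 0.5 * np.mean((out - y) ** 2)

eta = 0.5
errors = []
for cycle in range(200):
    h, out = forward(W1, b1, w2, b2)
    E = error(out)
    errors.append(E)

    # Batch BP gradients of the mean squared error.
    d_out = (out - y) / len(X)               # dE/d(out)
    g_w2 = h.T @ d_out
    g_b2 = d_out.sum()
    d_h = np.outer(d_out, w2) * h * (1 - h)  # back through the sigmoid
    g_W1 = X.T @ d_h
    g_b1 = d_h.sum(axis=0)

    # Assumed per-cycle step-size selection: shrink eta until the update
    # does not increase the error, which enforces monotonicity.
    while True:
        cand = (W1 - eta * g_W1, b1 - eta * g_b1,
                w2 - eta * g_w2, b2 - eta * g_b2)
        if error(forward(*cand)[1]) <= E or eta < 1e-12:
            break
        eta *= 0.5
    W1, b1, w2, b2 = cand

# The recorded error sequence is non-increasing by construction.
assert all(b <= a + 1e-12 for a, b in zip(errors, errors[1:]))
print(f"initial error {errors[0]:.4f} -> final error {errors[-1]:.4f}")
```

The step-halving loop plays the role of the paper's learning-rate selection only in spirit: any rule that guarantees the error does not increase per cycle yields the monotonicity property the abstract claims.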
Keywords: Multilayer perceptron; BP algorithm; Convergence; Monotonicity.
This document is indexed in CNKI, VIP (维普), Wanfang Data (万方数据), and other databases.