Backfitting neural networks
Authors: Anil Kumar Ghosh, Smarajit Bose
Institution: Stat-Math Unit, Indian Statistical Institute, 203 B. T. Road, Calcutta 700108, India
Abstract: Regression and classification problems can be viewed as special cases of the problem of function estimation. It is well known that a two-layer perceptron with sigmoidal transformation functions can approximate any continuous function on compact subsets of R^p, provided there is a sufficient number of hidden nodes. In this paper, we present an algorithm for fitting perceptron models that is quite different from the usual backpropagation or Levenberg-Marquardt algorithms. This new algorithm, based on backfitting, ensures better convergence than backpropagation. We also use resampling techniques to select a suitable number of hidden nodes automatically from the training data itself. This resampling technique helps to avoid the overfitting that the usual perceptron learning algorithms suffer from when no model selection scheme is used. Case studies and simulation results are presented to illustrate the performance of the proposed algorithm.
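The abstract does not spell out the fitting procedure; as a rough illustration of the backfitting idea only (not the authors' actual algorithm), the sketch below fits a one-hidden-layer perceptron of the form f(x) = beta_0 + sum_k beta_k * sigma(w_k' x + b_k) by cycling over hidden nodes and refitting each one against the partial residuals left by the other nodes. All names and settings (backfit_mlp, n_sweeps, inner_steps, lr) are hypothetical choices for this sketch.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backfit_mlp(X, y, n_hidden=5, n_sweeps=50, inner_steps=20, lr=0.1, seed=0):
    """Illustrative backfitting-style fit of a one-hidden-layer perceptron:
    hidden nodes are updated one at a time against the partial residuals
    of the remaining nodes (a sketch, not the paper's exact procedure)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    W = rng.normal(scale=0.5, size=(n_hidden, p))   # input-to-hidden weights
    b = rng.normal(scale=0.5, size=n_hidden)        # hidden biases
    beta = np.zeros(n_hidden)                       # hidden-to-output weights
    beta0 = y.mean()                                # output bias

    for _ in range(n_sweeps):
        H = sigmoid(X @ W.T + b)                    # current hidden activations
        fitted = H @ beta + beta0
        for k in range(n_hidden):
            # partial residual: the part of y not explained by the other nodes
            r = y - (fitted - beta[k] * H[:, k])
            for _ in range(inner_steps):            # a few gradient steps on node k only
                h_k = sigmoid(X @ W[k] + b[k])
                err = beta[k] * h_k - r
                g_act = (err * beta[k]) * h_k * (1.0 - h_k)
                W[k] -= lr * (g_act @ X) / n
                b[k] -= lr * g_act.mean()
                beta[k] -= lr * (err @ h_k) / n
            H[:, k] = sigmoid(X @ W[k] + b[k])      # refresh node k's activations
            fitted = H @ beta + beta0
        beta0 = (y - H @ beta).mean()               # re-center the output bias
    return W, b, beta, beta0
```

In this sketch each inner loop solves a small one-node regression problem on the current residuals, which is what makes the update a backfitting step rather than a joint backpropagation update over all weights at once.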
Keywords: Backfitting; backpropagation; backward deletion; cross validation; multi-layer perceptron; simulation
Indexed in SpringerLink and other databases.