Neural networks, linear functions and neglected non-linearity
Authors: B. Curry, P. H. Morgan
Institution: Cardiff Business School, Cardiff University, Aberconway Building, Colum Drive, CF10 3EU Cardiff, United Kingdom
Abstract: The multiplicity of approximation theorems for Neural Networks does not relate to the approximation of linear functions per se. The problem for the network is to construct a linear function by superpositions of non-linear activation functions such as the sigmoid. This issue is important for applications of NNs in statistical tests for neglected non-linearity, where it is common practice to include a linear function through skip-layer connections. Our theoretical analysis and evidence both suggest that the network can in fact provide linear approximations without additional assistance. The paper therefore argues that skip-layer connections are unnecessary and, if employed, could lead to misleading results.

Received: August 2002. Revised: March 2003. AMS Classification: 82C32.

The authors are grateful to Prof. Mick Silver and to GFK Marketing for help with the provision of data.
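As an illustration of the point at issue, the following sketch (our own, not drawn from the paper) shows how a single logistic-sigmoid hidden unit can reproduce a linear function without skip-layer connections: since sigmoid(z) = 1/2 + z/4 + O(z^3) near zero, the combination (4c/w)[sigmoid(wx) - 1/2] tends to cx as the input weight w shrinks. The target f(x) = 2x, the weight w, and the evaluation grid are purely illustrative choices.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative target: the linear function f(x) = c*x with c = 2.
# A single hidden sigmoid unit with a small input weight w,
# output weight 4c/w and output bias -2c/w approximates c*x,
# because sigmoid(z) = 1/2 + z/4 + O(z^3) around z = 0.
c, w = 2.0, 1e-3
x = np.linspace(-1.0, 1.0, 1001)
net = (4.0 * c / w) * sigmoid(w * x) - 2.0 * c / w

# Leading error term is c*w**2*x**3/12, so the max deviation
# on [-1, 1] is roughly c*w**2/12 (about 1.7e-07 here).
print(np.max(np.abs(net - c * x)))
```

Shrinking w drives the approximation error to zero at rate w^2, which is consistent with the paper's claim that the hidden layer alone can supply the linear component.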
Keywords: universal approximation, non-linear regression, network weights, hidden layers, skip-layer connections