A Learning Framework for Neural Networks Using Constrained Optimization Methods
Authors: Stavros J. Perantonis, Nikolaos Ampazis, Vassilis Virvilis
Affiliation: Institute of Informatics and Telecommunications, National Center for Scientific Research Demokritos, GR-153 10 Agia Paraskevi, Greece
Abstract: Conventional supervised learning in neural networks is carried out by performing unconstrained minimization of a suitably defined cost function. This approach has certain drawbacks, which can be overcome by incorporating additional knowledge into the training formalism. In this paper, two types of such additional knowledge are examined: network specific knowledge (associated with the neural network irrespective of the problem whose solution is sought) and problem specific knowledge (which helps to solve a specific learning task). A constrained optimization framework is introduced for incorporating these types of knowledge into the learning formalism. We present three examples of improvement in the learning behaviour of neural networks using additional knowledge in the context of our constrained optimization framework. The two network specific examples are designed to improve convergence and learning speed in the broad class of feedforward networks, while the third, problem specific example is related to the efficient factorization of 2-D polynomials using suitably constructed sigma-pi networks.
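To make the central idea concrete, the following is a minimal toy sketch, not the paper's actual algorithm: instead of unconstrained minimization of the cost, each gradient step is projected back onto an equality-constraint surface that encodes additional knowledge. The constraint sum(w) = 1, the quadratic cost, and all function names here are illustrative assumptions, not from the paper.

```python
# Hypothetical illustration of constrained (vs. unconstrained) training:
# gradient descent on a cost, with every step orthogonally projected
# back onto the constraint hyperplane g(w) = sum(w) - 1 = 0.

def project(w):
    """Orthogonal projection of w onto the hyperplane sum(w) = 1."""
    shift = (sum(w) - 1.0) / len(w)
    return [wi - shift for wi in w]

def constrained_descent(grad, w, lr=0.1, steps=200):
    """Projected gradient descent: each update stays on the constraint surface."""
    w = project(w)
    for _ in range(steps):
        g = grad(w)
        w = project([wi - lr * gi for wi, gi in zip(w, g)])
    return w

# Toy quadratic cost ||w - t||^2, whose gradient is 2(w - t).
t = [0.8, 0.4, 0.2]
w = constrained_descent(
    lambda w: [2.0 * (wi - ti) for wi, ti in zip(w, t)],
    [0.0, 0.0, 0.0],
)
# w satisfies the constraint exactly and is the closest point to t on it.
```

The projection step is the simplest way to fold an equality constraint into gradient-based learning; the paper's framework handles such constraints within the optimization itself rather than by ad hoc projection.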
Keywords: neural networks, supervised learning, constrained optimization