A weak condition of globally asymptotic stability for neural networks
Affiliation: | Yangtze Center of Mathematics and Department of Mathematics, Sichuan University, Chengdu, Sichuan 610064, PR China |
Abstract: | In this work we consider a general class of continuous activation functions that may be neither bounded nor differentiable, yet includes many sigmoidal functions as special cases. For this class of activation functions, we give a result on asymptotic stability of neural networks under a weak nonnegative-definiteness condition. We then show that differentiability is a condition for exponential stability.
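As an illustrative sketch (not taken from the paper itself), the abstract's class of activation functions can be pictured with two standard examples: the ReLU, which is continuous but unbounded and not differentiable at zero, and the logistic sigmoid, a bounded smooth sigmoidal function of the kind the abstract says is included as a special case.

```python
import math

def relu(x):
    # Continuous, unbounded, and not differentiable at x = 0:
    # an example of an activation allowed by the weaker assumptions.
    return max(0.0, x)

def sigmoid(x):
    # Bounded, smooth sigmoidal function: the classical special case.
    return 1.0 / (1.0 + math.exp(-x))

print(relu(-2.0), relu(3.0))   # 0.0 3.0
print(sigmoid(0.0))            # 0.5
```

Both functions are continuous, so both fall inside the general class considered; only the sigmoid satisfies the stronger boundedness and differentiability assumptions of classical stability results.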
This article is indexed in ScienceDirect and other databases.
|