
A Higher-Order Neural Network Model Using a Global Optimal Learning Algorithm
Cite this article: Wang Aming, Liang Mingli. A Higher-Order Neural Network Model Using a Global Optimal Learning Algorithm [J]. Journal of Wuhan University: Natural Science Edition, 1994(3).
Authors: Wang Aming  Liang Mingli
Affiliations: Department of Physics, Wuhan University; Xuzhou Medical College
Abstract: A higher-order neural network model is studied. The model uses a global optimal learning algorithm that makes all learning patterns stable attractors of the system; its storage capacity is far higher than that of a higher-order neural network model trained with a Hebb-rule-like learning algorithm, and it can also store and recognize correlated patterns. Computer simulations of a second-order network of 30 neurons confirm these conclusions. In addition, the influence of the initial synaptic strengths on learning performance is analyzed, and the average attraction radius is computed for different numbers of stored patterns.

Keywords: higher-order neural network model  learning algorithm  storage capacity  average attraction radius

A HIGHER-ORDER NEURAL NETWORK MODEL USING GLOBAL OPTIMAL LEARNING ALGORITHM
Wang Aming, Liang Mingli. A HIGHER-ORDER NEURAL NETWORK MODEL USING GLOBAL OPTIMAL LEARNING ALGORITHM [J]. Journal of Wuhan University: Natural Science Edition, 1994(3).
Authors:Wang Aming  Liang Mingli
Abstract: This paper presents a higher-order neural network model. It uses a global optimal learning algorithm to make all learning patterns stable attractors of the system. Its storage capacity is therefore much higher than that of a higher-order model using a Hebb-rule-like learning algorithm, and correlated patterns can be stored in the new model. Computer simulations of a second-order system with 30 neurons are carried out, and the results confirm the above conclusions. The relationship between learning performance and initial synaptic weights, and the relationship between average attraction radius and the number of stored patterns, are also simulated and analysed.
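As a rough illustration of the kind of system simulated in the paper, the sketch below implements a second-order network of 30 binary neurons with the standard Hebb-rule-like storage prescription and asynchronous recall. The paper's global optimal learning algorithm is not specified in this record, so the sketch only reproduces the Hebb-rule-like baseline it is compared against; the number of stored patterns, random seed, and recall routine are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a second-order associative memory with N = 30 binary
# (+1/-1) neurons, the system size simulated in the paper.
# Storage follows the standard Hebb-rule-like prescription
#     T[i, j, k] = (1/N^2) * sum_mu xi[mu, i] * xi[mu, j] * xi[mu, k]
# and recall uses asynchronous updates s_i <- sign(sum_{j,k} T[i,j,k] s_j s_k).
# NOTE: this is NOT the paper's global optimal learning algorithm, only the
# Hebb-rule-like baseline; P and the noise level are illustrative choices.

N = 30          # number of neurons
P = 5           # number of stored patterns (illustrative)
rng = np.random.default_rng(0)

# Random +/-1 learning patterns, shape (P, N).
patterns = rng.choice([-1, 1], size=(P, N))

# Second-order synaptic tensor from the Hebb-rule-like prescription.
T = np.einsum("mi,mj,mk->ijk", patterns, patterns, patterns) / N**2

def recall(state, steps=10):
    """Asynchronous recall dynamics of the second-order network."""
    s = state.copy()
    for _ in range(steps):
        for i in rng.permutation(N):
            h = np.einsum("jk,j,k->", T[i], s, s)   # local field of neuron i
            s[i] = 1 if h >= 0 else -1
    return s

# Start from a noisy version of pattern 0 and check whether it is retrieved.
probe = patterns[0].copy()
flip = rng.choice(N, size=3, replace=False)
probe[flip] *= -1
overlap = np.dot(recall(probe), patterns[0]) / N
print(f"overlap with stored pattern after recall: {overlap:.2f}")
```

An overlap of 1.0 means the noisy probe converged back to the stored pattern, i.e. the pattern is a stable attractor with a nonzero attraction radius; averaging this over many probes and patterns gives the average attraction radius discussed in the abstract.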
Keywords: higher-order neural network model  learning algorithm  storage capacity  average attraction radius
This article has been indexed by CNKI and other databases.