On learning and energy-entropy dependence in recurrent and nonrecurrent signed networks
Authors: Stephen Grossberg
Institution: (1) Massachusetts Institute of Technology, Cambridge, Massachusetts
Abstract: Learning of patterns by neural networks obeying general rules of sensory transduction and of converting membrane potentials to spiking frequencies is considered. Any finite number of cells 𝒜 can sample a pattern playing on any finite number of cells ℬ without causing irrevocable sampling bias if 𝒜 = ℬ or 𝒜 ∩ ℬ = ∅. Total energy transfer from inputs of 𝒜 to outputs of ℬ depends on the entropy of the input distribution. Pattern completion on recall trials can occur without destroying perfect memory even if 𝒜 = ℬ by choosing the signal thresholds sufficiently large. The mathematical results are global limit and oscillation theorems for a class of nonlinear functional-differential systems.

The preparation of this work was supported in part by the National Science Foundation (GP 9003), the Office of Naval Research (N00014-67-A-024-OQ16), and the A.P. Sloan Foundation.
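The unbiased-sampling result above can be illustrated with a minimal numerical sketch of an outstar-style learning rule of the kind Grossberg studies: one sampling cell's synaptic weights track the activities of a field of sampled cells, which in turn track the relative proportions of a spatial input pattern. The specific equations, parameter values, and variable names below are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

# Illustrative outstar-style sketch (Euler integration; parameters are
# assumptions, not from the paper). A single sampling cell with weights z
# learns the relative proportions theta of a pattern delivered to cells x.

theta = np.array([0.5, 0.3, 0.2])   # spatial pattern proportions (sum to 1)
I = 2.0                              # total input intensity
a, c = 1.0, 0.5                      # passive decay rates
dt, steps = 0.01, 20000

x = np.zeros(3)                      # sampled-cell activities
z = np.array([0.9, 0.05, 0.05])      # initial weights, deliberately biased
s = 1.0                              # sampling signal (sampling cell active)

for _ in range(steps):
    x = x + dt * (-a * x + theta * I)   # activities relax toward the pattern
    z = z + dt * s * (-c * z + x)       # weights sample the activities

print(z / z.sum())   # relative weights approach theta: the bias is erased
```

The point of the sketch is that the normalized weight vector forgets its biased initial condition and converges to the input pattern's proportions, a toy analogue of sampling without irrevocable bias.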
Keywords: learning; stimulus sampling; nonlinear difference-differential equations; global limits and oscillations; flows on signed networks; functional-differential systems; energy-entropy dependence; pattern completion; recurrent and nonrecurrent anatomy; sensory transduction rules; ratio limit theorems
This article is indexed in SpringerLink and other databases.