Efficient Hopfield pattern recognition on a scale-free neural network
Authors: D. Stauffer, A. Aharony, L. da Fontoura Costa, J. Adler
Affiliations: School of Physics and Astronomy, Raymond and Beverly Sackler Faculty of Exact Sciences, Tel Aviv University, Ramat Aviv, Tel Aviv 69978, Israel
Cybernetic Vision Research Group, IFSC-USP, Caixa Postal 369, 13560-970 São Carlos, SP, Brazil
Department of Physics, Technion-IIT, Haifa 32000, Israel
Abstract: Neural networks are supposed to recognise blurred images (or patterns) of N pixels (bits) each. Application of the network to an initial blurred version of one of P pre-assigned patterns should converge to the correct pattern. In the “standard” Hopfield model, the N “neurons” are connected to each other via N² bonds which contain the information on the stored patterns. Thus computer time and memory in general grow with N². The Hebb rule assigns synaptic coupling strengths proportional to the overlap of the stored patterns at the two coupled neurons. Here we simulate the Hopfield model on the Barabási-Albert scale-free network, in which each newly added neuron is connected to only m other neurons, and at the end the number of neurons with q neighbours decays as 1/q³. Although the quality of retrieval decreases for small m, we find good associative memory for 1 ≪ m ≪ N. Hence, these networks gain a factor N/m ≫ 1 in the computer memory and time.
Received: 12 January 2003 / Published online: 11 April 2003
e-mail: stauffer@thp.uni-koeln.de
Keywords: PACS 05.40.-a Fluctuation phenomena, random processes, noise, and Brownian motion – 05.50.+q Lattice theory and statistics (Ising, Potts, etc.) – 87.18.Sn Neural networks
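
The abstract describes the construction concretely enough for an illustrative sketch. Below is a minimal Python sketch (not the authors' original program) of zero-temperature Hopfield retrieval with Hebbian couplings restricted to the edges of a Barabási-Albert graph, built here with networkx; the values of N, P, m, the fraction of flipped bits, and all variable names are illustrative assumptions.

# Minimal sketch (assumed details, not the authors' code): Hopfield retrieval
# with Hebbian couplings only on the edges of a Barabasi-Albert graph.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
N, P, m = 1000, 5, 20                        # neurons, stored patterns, BA parameter (illustrative)

patterns = rng.choice([-1, 1], size=(P, N))  # P random +/-1 patterns of N bits each

# Scale-free network: each newly added node attaches to m existing nodes.
G = nx.barabasi_albert_graph(N, m, seed=0)
neighbors = [list(G.neighbors(i)) for i in range(N)]

# Hebb rule, restricted to existing bonds: J_ij proportional to the overlap
# of the stored patterns at the two coupled neurons i and j.
J = {}
for i, j in G.edges():
    J[(i, j)] = J[(j, i)] = np.dot(patterns[:, i], patterns[:, j]) / N

def retrieve(state, sweeps=20):
    # Zero-temperature asynchronous dynamics: each neuron aligns with its local field.
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = sum(J[(i, j)] * s[j] for j in neighbors[i])
            if h != 0:
                s[i] = 1 if h > 0 else -1
    return s

# Blur pattern 0 by flipping 10% of its bits, then let the network relax.
blurred = patterns[0].copy()
flipped = rng.choice(N, size=N // 10, replace=False)
blurred[flipped] *= -1
final = retrieve(blurred)
print("overlap with stored pattern:", np.dot(final, patterns[0]) / N)

Because couplings are stored only on the roughly mN edges of the scale-free graph rather than on all N² pairs, memory and time per sweep scale with mN, which is the factor N/m ≫ 1 saving discussed in the abstract.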
This article is indexed in SpringerLink and other databases.