A Tsallis’ statistics based neural network model for novel word learning
Authors: Tarik Hadzibeganovic, Sergio A. Cannas
Institutions:
a) Cognitive Science Section, Department of Psychology, University of Graz, A-8010 Graz, Austria
b) Cognitive Neuroscience Research Unit, Department of Psychiatry & Forensic Medicine, Faculty of Medicine, Hospital del Mar, Universitat Autònoma de Barcelona, 08003 Barcelona, Spain
c) Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba, Argentina
Abstract: We invoke the Tsallis entropy formalism, a nonextensive entropy measure, to include some degree of non-locality in a neural network used to simulate novel word learning in adults. A generalization of gradient-descent dynamics, realized via nonextensive cost functions, serves as the learning rule in a simple perceptron. The model is first investigated for its general properties and then tested against empirical data gathered from simple memorization experiments involving two linguistically distinct populations of subjects. Numerical solutions of the model equations corresponded to the measured performance states of human learners. In particular, we found that the memorization tasks were executed with rather small but population-specific amounts of nonextensivity, quantified by the entropic index q. Our findings raise the possibility of using entropic nonextensivity as a means of characterizing the degree of complexity of learning in both natural and artificial systems.
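To make the idea of a q-generalized learning rule concrete, the following is a minimal sketch, not the authors' actual model: it trains a simple perceptron by gradient descent on a cross-entropy-like cost in which the natural logarithm is replaced by the Tsallis q-logarithm, ln_q(x) = (x^(1-q) - 1)/(1-q), which recovers the standard (extensive, q = 1) cost in the limit q → 1. The function names, the specific cost, and the training hyperparameters here are illustrative assumptions.

```python
import numpy as np

def q_log(x, q):
    """Tsallis q-logarithm: ln_q(x) = (x^(1-q) - 1)/(1-q); ln(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_perceptron_q(X, y, q=1.05, lr=1.0, epochs=5000, seed=0):
    """Gradient descent on a q-generalized cross-entropy cost (illustrative).

    Per-pattern cost: C = -[y * ln_q(p) + (1 - y) * ln_q(1 - p)],
    with p = sigmoid(w . x + b).  Since d ln_q(p)/dp = p**(-q),
    the gradient reduces to the usual logistic-regression update at q = 1.
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)
        # dC/dp, using d ln_q(p)/dp = p**(-q)
        grad_p = -(y * p ** (-q) - (1.0 - y) * (1.0 - p) ** (-q))
        delta = grad_p * p * (1.0 - p)  # chain rule through the sigmoid
        w -= lr * (X.T @ delta) / len(y)
        b -= lr * delta.mean()
    return w, b
```

For example, with q slightly above 1 the perceptron still learns a linearly separable task such as logical AND, while the q-dependent factors p**(-q) reweight the error signal relative to the standard rule; in the paper's setting, it is the fitted value of q itself that carries the population-specific information.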
Keywords (PACS): 87.19.lj; 05.10.-a; 87.19.lv; 43.71.Hw
This article is indexed in ScienceDirect and other databases.