Sparse coding for layered neural networks
Authors: Katsuki Katayama, Yasuo Sakata, and Tsuyoshi Horiguchi
Institution: Department of Computer and Mathematical Sciences, GSIS, Tohoku University, Sendai 980-8579, Japan
Abstract: We investigate the storage capacity of two types of fully connected layered neural networks with sparse coding when binary patterns are embedded into the networks by a Hebbian learning rule. One of them is a layered network in which the transfer function of the even layers differs from that of the odd layers. The other is a layered network with intra-layer connections, in which the transfer function for inter-layer connections differs from that for intra-layer connections, and inter-layer neurons and intra-layer neurons are updated alternately. We derive recursion relations for order parameters by means of the signal-to-noise ratio method, and then apply the self-control threshold method proposed by Dominguez and Bollé to both layered networks with monotonic transfer functions. We find that the critical value α_C of the storage capacity is about 0.11 |a ln a|⁻¹ for a ≪ 1 in both layered networks, where a is the neuronal activity. It turns out that the basin of attraction is larger for both layered networks when the self-control threshold method is applied.
Keywords: Layered neural network; Sparse coding; Hebb rule; Storage capacity; Basin of attraction; Self-control threshold method; Signal-to-noise ratio method
This article is indexed in ScienceDirect and other databases.
|