(1) Massachusetts Institute of Technology, Cambridge, Massachusetts
Abstract:
Learning of patterns by neural networks obeying general rules of sensory transduction and of converting membrane potentials to spiking frequencies is considered. Any finite number of cells A can sample a pattern playing on any finite number of cells B without causing irrevocable sampling bias if A = B or A ∩ B = ∅. Total energy transfer from inputs of A to outputs of B depends on the entropy of the input distribution. Pattern completion on recall trials can occur without destroying perfect memory, even if A ≠ B, by choosing the signal thresholds sufficiently large. The mathematical results are global limit and oscillation theorems for a class of nonlinear functional-differential systems.

The preparation of this work was supported in part by the National Science Foundation (GP 9003), the Office of Naval Research (N00014-67-A-0204-0016), and the A. P. Sloan Foundation.
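As a minimal illustrative sketch (not the paper's exact system, and with all parameter values assumed), the kind of sampling described above can be caricatured by an "outstar"-style network: a single sampling cell A sends adaptive weights z_i to cells B_i on which a fixed spatial pattern theta plays, and the relative weights learn the pattern's proportions.

```python
# Hedged sketch: outstar-style pattern sampling, simulated by Euler steps.
# All names (theta, a, b, c) and values are illustrative assumptions.

theta = [0.5, 0.3, 0.2]   # spatial pattern proportions (sum to 1)
a, b, c = 1.0, 0.1, 0.5   # decay and learning rates (assumed values)
dt = 0.01

x = [0.0] * 3             # membrane potentials of the sampled cells B_i
z = [0.1] * 3             # adaptive weights from the sampling cell A to B_i

for step in range(20000):
    x0 = 1.0              # sampling cell A held active during training
    I = 1.0               # total input intensity delivered to the pattern
    for i in range(3):
        # potentials track the pattern weights theta_i of the input
        x[i] += dt * (-a * x[i] + theta[i] * I)
        # Hebbian-style learning: weight grows with joint activity x0 * x_i
        z[i] += dt * (-b * z[i] + c * x0 * x[i])

total = sum(z)
ratios = [zi / total for zi in z]
print(ratios)  # -> approximately theta = [0.5, 0.3, 0.2]
```

In this caricature each x_i relaxes toward theta_i and each z_i toward (c/b)·theta_i, so the weight *ratios* converge to the pattern proportions regardless of overall scale, which is the sense in which the sampling is unbiased here.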