Learning finite binary sequences from half‐space data
Authors: Shao C. Fang, Santosh S. Venkatesh
Abstract: The problem of inferring a finite binary sequence w* ∈ {−1, 1}^n is considered. It is supposed that at epochs t = 1, 2, …, the learner is provided with random half-space data in the form of finite binary sequences u(t) ∈ {−1, 1}^n which have positive inner product with w*. The goal of the learner is to determine the underlying sequence w* in an efficient, on-line fashion from the data {u(t), t ≥ 1}. In this context, it is shown that the randomized, on-line directed drift algorithm produces a sequence of hypotheses {w(t) ∈ {−1, 1}^n, t ≥ 1} which converges to w* in finite time with probability 1. It is shown that while the algorithm has a minimal space complexity of 2n bits of scratch memory, it has exponential time complexity with an expected mistake bound of order Ω(e^{0.139n}). Batch incarnations of the algorithm are introduced which allow for massive improvements in running time with a relatively small cost in space (batch size). In particular, using a batch of O(n log n) examples at each update epoch reduces the expected mistake bound of the (batch) algorithm to O(n) (in an asynchronous bit update mode) and O(1) (in a synchronous bit update mode). The problem considered here is related to binary integer programming and to learning in a mathematical model of a neuron. © 1999 John Wiley & Sons, Inc. Random Struct. Alg., 14, 345–381 (1999)
Keywords: on-line learning; batch learning; binary perceptron; neuron; directed drift
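
The on-line setting described in the abstract lends itself to a compact illustration. The sketch below is a minimal, hypothetical rendering of a mistake-driven, single-bit-flip update of the kind a directed-drift learner is built on; it is not the paper's algorithm verbatim, and the oracle interface, parameter names, and stopping rule are assumptions introduced here purely for illustration.

import numpy as np

def directed_drift(oracle, n, seed=None, max_epochs=100_000):
    # Minimal sketch of a randomized, on-line, mistake-driven learner over {-1, +1}^n.
    # `oracle(w)` is an assumed interface: it returns a random example u in {-1, +1}^n
    # with positive inner product with the hidden target w*, or None once w equals w*.
    rng = np.random.default_rng(seed)
    w = rng.choice([-1, 1], size=n)              # initial hypothesis in {-1, +1}^n
    for _ in range(max_epochs):
        u = oracle(w)
        if u is None:                            # target recovered
            return w
        if w @ u <= 0:                           # mistake: a positive example is rejected
            # flip one randomly chosen disagreeing bit, raising <w, u> by 2
            disagree = np.flatnonzero(w * u == -1)
            w[disagree[rng.integers(len(disagree))]] *= -1
    return w

# Toy usage with a hypothetical rejection-sampling oracle for a random target w*.
n = 15
rng = np.random.default_rng(0)
w_star = rng.choice([-1, 1], size=n)

def oracle(w):
    if np.array_equal(w, w_star):
        return None
    while True:
        u = rng.choice([-1, 1], size=n)
        if u @ w_star > 0:                       # keep only half-space (positive) examples
            return u

w_hat = directed_drift(oracle, n, seed=1)
print(np.array_equal(w_hat, w_star))

The batch incarnations mentioned in the abstract would replace the single example u by a pool of examples at each update epoch; their asynchronous and synchronous bit-update rules are not spelled out in the abstract and are therefore not reproduced in this sketch.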