Unregularized online learning algorithms with general loss functions
Authors: Yiming Ying, Ding-Xuan Zhou
Affiliations: 1. Department of Mathematics and Statistics, State University of New York at Albany, Albany, NY 12222, USA; 2. Department of Mathematics, City University of Hong Kong, Kowloon, Hong Kong, China
Abstract: In this paper, we consider unregularized online learning algorithms in a reproducing kernel Hilbert space (RKHS). Firstly, we derive explicit convergence rates of the unregularized online learning algorithms for classification associated with a general α-activating loss (see Definition 1 below). Our results extend and refine the results in [30] for the least squares loss and the recent result in [3] for loss functions with a Lipschitz-continuous gradient. Moreover, we establish a very general condition on the step sizes which guarantees the convergence of the last iterate of such algorithms. Secondly, we establish, for the first time, the convergence of the unregularized pairwise learning algorithm with a general loss function and derive explicit rates under the assumption of polynomially decaying step sizes. Concrete examples are used to illustrate our main results. The main techniques are tools from convex analysis, refined inequalities of Gaussian averages [5], and an induction approach.
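The first setting of the abstract, unregularized online (stochastic) gradient descent in an RKHS, can be sketched as follows. This is a minimal illustration, not the paper's exact scheme: the kernel, the hinge loss (one example of a subdifferentiable loss), and the constants `eta1`, `theta`, `sigma` are all illustrative choices; the step sizes decay polynomially as eta_t = eta1 * t^(-theta), matching the form of the step-size conditions discussed in the abstract.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    # Gaussian (RBF) kernel inducing the RKHS
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

def online_kernel_learning(stream, eta1=0.5, theta=0.5, sigma=1.0):
    """Unregularized online gradient descent in an RKHS (sketch).

    Update: f_{t+1} = f_t - eta_t * l'(y_t f_t(x_t)) * y_t * K(x_t, .)
    with polynomially decaying step sizes eta_t = eta1 * t^{-theta}.
    Uses the hinge loss as one example of a general loss; all constants
    are illustrative, not taken from the paper.
    """
    centers, coeffs = [], []

    def f(x):
        # f is stored by its kernel expansion: f = sum_j coeffs[j] * K(centers[j], .)
        return sum(a * gaussian_kernel(c, x, sigma) for a, c in zip(coeffs, centers))

    for t, (x, y) in enumerate(stream, start=1):
        eta = eta1 * t ** (-theta)
        # subgradient of the hinge loss l(u) = max(0, 1 - u) at u = y * f(x)
        g = -1.0 if y * f(x) < 1 else 0.0
        if g != 0.0:
            # gradient step adds one new kernel expansion term
            centers.append(x)
            coeffs.append(-eta * g * y)
    return f
```

For instance, feeding a stream of two well-separated 1-D examples produces an iterate whose sign agrees with the labels on those points.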
Keywords: Learning theory; Online learning; Reproducing kernel Hilbert space; Pairwise learning; Bipartite ranking
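The second setting, unregularized online pairwise learning (as in bipartite ranking), can likewise be sketched. Here the loss compares pairs of examples; the update at round t takes a gradient step on pairwise losses between the new example and earlier ones. The pairwise hinge loss, the kernel, and the constants `eta1`, `theta`, `sigma`, `buffer_size` are assumptions for illustration only.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    # Gaussian (RBF) kernel inducing the RKHS
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

def online_pairwise_learning(stream, eta1=0.5, theta=0.75, sigma=1.0, buffer_size=50):
    """Unregularized online pairwise learning sketch (e.g. bipartite ranking).

    Round t pairs the new example (x_t, y_t) with earlier examples of a
    different label and takes a step on the pairwise hinge loss
    l(f; z, z') = max(0, 1 - s * (f(x) - f(x'))), s = sign(y - y'),
    with polynomially decaying step sizes eta_t = eta1 * t^{-theta}.
    Constants and the bounded buffer are illustrative choices.
    """
    centers, coeffs, seen = [], [], []

    def f(x):
        return sum(a * gaussian_kernel(c, x, sigma) for a, c in zip(coeffs, centers))

    for t, (x, y) in enumerate(stream, start=1):
        eta = eta1 * t ** (-theta)
        pairs = [(xj, yj) for (xj, yj) in seen if yj != y]
        if pairs:
            fx = f(x)
            for xj, yj in pairs:
                s = 1.0 if y > yj else -1.0
                if s * (fx - f(xj)) < 1:  # pairwise hinge is active
                    # subgradient step pushes f(x) and f(xj) apart
                    centers.extend([x, xj])
                    coeffs.extend([eta * s / len(pairs), -eta * s / len(pairs)])
        seen.append((x, y))
        if len(seen) > buffer_size:  # keep only recent examples for pairing
            seen.pop(0)
    return f
```

On a stream with positives near +2 and negatives near -2, the learned scoring function ranks a positive point above a negative one.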
This article is indexed in databases such as ScienceDirect.
|