Linearly constrained reconstruction of functions by kernels with applications to machine learning
Authors: R. Schaback, J. Werner
Institution: Göttingen, Germany
Abstract: This paper investigates the approximation of multivariate functions from data via linear combinations of translates of a positive definite kernel from a reproducing kernel Hilbert space. If standard interpolation conditions are relaxed to Chebyshev-type constraints, one can minimize the norm of the approximant in the Hilbert space under these constraints. By standard arguments of optimization theory, the solutions take a simple form, determined by the data associated with the active constraints, called support vectors in the context of machine learning. The corresponding quadratic programming problems are investigated in some detail. Using monotonicity results for the Hilbert space norm, iterative techniques based on small quadratic subproblems over active sets are shown to terminate in finitely many steps, even if they discard part of their previous information and even if they are applied to infinite data, e.g., in the context of online learning. Numerical experiments confirm the theoretical results.

Dedicated to C. A. Micchelli on the occasion of his 60th birthday.

Mathematics Subject Classification (2000): 65D05, 65D10, 41A15, 41A17, 41A27, 41A30, 41A40, 41A63.
Keywords: positive definite radial basis functions, quadratic programming, kernel machines, machine learning, support vector machines, regression
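To make the abstract's formulation concrete, the following is a minimal sketch (not the authors' implementation): writing the approximant as f = sum_j a_j K(., x_j), the native-space norm is ||f||^2 = a^T K a, and the Chebyshev-type constraints |f(x_i) - y_i| <= eps become the linear constraints -eps <= K a - y <= eps, so one small quadratic program recovers the approximant. The Gaussian kernel, the width gamma, the tolerance eps, and SciPy's general-purpose SLSQP solver (standing in for the paper's active-set iterations) are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def gauss_kernel(s, t, gamma=20.0):
    # Gaussian (positive definite) kernel matrix K[i, j] = exp(-gamma * (s_i - t_j)^2)
    return np.exp(-gamma * (s[:, None] - t[None, :]) ** 2)

# toy 1-D data (illustrative, not from the paper)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 15)
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(x.size)

eps = 0.1                                            # Chebyshev-type tolerance
K = gauss_kernel(x, x) + 1e-10 * np.eye(x.size)      # small jitter for stability

# minimize the native-space norm a^T K a subject to -eps <= K a - y <= eps
obj = lambda a: a @ K @ a
grad = lambda a: 2.0 * K @ a
cons = [
    {"type": "ineq", "fun": lambda a: eps - (K @ a - y)},  # K a - y <= eps
    {"type": "ineq", "fun": lambda a: eps + (K @ a - y)},  # K a - y >= -eps
]
res = minimize(obj, np.zeros(x.size), jac=grad, constraints=cons, method="SLSQP")
a = res.x

# by the KKT conditions, nonzero coefficients sit only at data sites with an
# active constraint: the support vectors of the abstract
sv = np.flatnonzero(np.abs(a) > 1e-6)
print(f"{sv.size} of {x.size} data sites are support vectors")
print("max data residual:", np.max(np.abs(K @ a - y)))

# evaluate the approximant f(t) = sum_j a_j K(t, x_j) on a fine grid
t = np.linspace(0.0, 1.0, 200)
f = gauss_kernel(t, x) @ a
```

Typically only a fraction of the data sites carry nonzero coefficients, which is exactly the sparsity the abstract attributes to the active constraints; the paper's active-set subproblem strategy exploits this by solving small QPs instead of one large one.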
This article is indexed in SpringerLink and other databases.