Robustness of reweighted Least Squares Kernel Based Regression
Authors: Michiel Debruyne, Andreas Christmann, Johan A.K. Suykens
Institutions:
a. Department of Mathematics and Computer Science, Universiteit Antwerpen, Middelheimlaan 1G, B-2020 Antwerpen, Belgium
b. Department of Mathematics, University of Bayreuth, D-95440 Bayreuth, Germany
c. Department of Mathematics-LStat, K.U.Leuven, Celestijnenlaan 200B, B-3001 Leuven, Belgium
d. ESAT-SCD/SISTA, K.U.Leuven, Kasteelpark Arenberg 10, B-3001 Leuven, Belgium
Abstract: Kernel Based Regression (KBR) minimizes a convex risk over a possibly infinite-dimensional reproducing kernel Hilbert space. Recently, it was shown that KBR with a least squares loss function may have some undesirable properties from a robustness point of view: even very small amounts of outliers can dramatically affect the estimates. KBR with other loss functions is more robust, but often gives rise to more complicated computations (e.g., for the Huber or logistic loss). In classical statistics, robustness is often improved by reweighting the original estimate. In this paper we provide a theoretical framework for reweighted Least Squares KBR (LS-KBR) and analyze its robustness. Some important differences are found with respect to linear regression, indicating that LS-KBR with a bounded kernel is much better suited for reweighting. In two special cases our results can be translated into practical guidelines for a good choice of weights, providing robustness as well as fast convergence. In particular, a logistic weight function appears to be an appropriate choice, not only to downweight outliers, but also to improve performance under heavy-tailed distributions. For the latter, some heuristic arguments are given comparing concepts from robustness and stability.
Keywords: 62G35; 62G08
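The reweighting idea the abstract describes (start from the ordinary least squares KBR fit, then iteratively downweight observations with large residuals) can be sketched in code. The sketch below is a minimal illustration, not the paper's exact estimator: the function names (gaussian_kernel, logistic_weight, reweighted_ls_kbr), the kernel ridge form of the unweighted step, the MAD-based residual scaling, and the fixed iteration count are all assumptions made for this example; the weight w(r) = tanh(r)/r is one common form of the logistic weight function the abstract refers to.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix; a bounded kernel, as the
    # abstract indicates is better suited for reweighting.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def logistic_weight(r, scale):
    # Logistic weight w(r) = tanh(r)/r on scaled residuals (one common
    # form); large residuals get weights close to 0, small ones close to 1.
    z = r / scale
    out = np.ones_like(z)            # tanh(z)/z -> 1 as z -> 0
    nz = np.abs(z) > 1e-12
    out[nz] = np.tanh(z[nz]) / z[nz]
    return out

def reweighted_ls_kbr(X, y, lam=1e-2, sigma=1.0, n_steps=10):
    # Step 0: ordinary (unweighted) LS-KBR, i.e. kernel ridge regression.
    n = len(y)
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + lam * np.eye(n), y)
    for _ in range(n_steps):
        r = y - K @ alpha
        # Robust residual scale via the MAD (an assumption for this sketch;
        # any robust scale estimate could be used here).
        s = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12
        W = np.diag(logistic_weight(r, s))
        # Weighted step: minimize sum_i w_i (y_i - f(x_i))^2 + lam ||f||^2,
        # whose coefficients solve (W K + lam I) alpha = W y.
        alpha = np.linalg.solve(W @ K + lam * np.eye(n), W @ y)
    return alpha, K

# Hypothetical usage: a smooth signal with a few gross outliers injected.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sinc(X[:, 0]) + 0.1 * rng.standard_normal(80)
y[:5] += 5.0                          # outliers
alpha, K = reweighted_ls_kbr(X, y, lam=1e-2, sigma=0.5)
y_hat = K @ alpha                     # fitted values at the training points
```

Each reweighting step only re-solves an n-by-n linear system, which is the practical appeal of reweighted LS-KBR compared to directly minimizing a non-quadratic loss such as the Huber or logistic loss, the case the abstract notes leads to more complicated computations.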