Automatic Feature Selection via Weighted Kernels and Regularization
Authors: Genevera I. Allen
Affiliation: 1. Department of Pediatrics-Neurology, Baylor College of Medicine, Jan and Dan Duncan Neurological Research Institute, Texas Children's Hospital, Houston, TX 77030; 2. Department of Statistics, Rice University, Houston, TX 77005
Abstract: Selecting important features in nonlinear kernel spaces is a difficult challenge in both classification and regression problems. This article proposes to achieve feature selection by optimizing a simple criterion: a feature-regularized loss function. Features within the kernel are weighted, and a lasso penalty is placed on these weights to encourage sparsity. This feature-regularized loss function is minimized by estimating the weights in conjunction with the coefficients of the original classification or regression problem, thereby automatically procuring a subset of important features. The algorithm, KerNel Iterative Feature Extraction (KNIFE), is applicable to a wide variety of kernels and high-dimensional kernel problems. In addition, a modification of KNIFE gives a computationally attractive method for graphically depicting nonlinear relationships between features by estimating their feature weights over a range of regularization parameters. Simulations and examples for both kernel regression and support vector machines demonstrate the utility of KNIFE in selecting features. Feature path realizations also give graphical representations of important features and of the nonlinear relationships among variables. Supplementary materials with computer code and an appendix on convergence analysis are available online.
Keywords: Reproducing kernel Hilbert space; Kernel ridge regression; Lasso; Nonlinear regression; Regularization paths; Support vector machine
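
Illustrative sketch: the following Python code is a rough illustration of the weighted-kernel idea summarized in the abstract, not the authors' KNIFE implementation. It assumes a Gaussian kernel with one nonnegative weight per feature, a closed-form kernel ridge regression step for the coefficients, and a soft-thresholded (lasso-style) gradient update of the feature weights; all function names, step sizes, and penalty values here are illustrative assumptions.

import numpy as np

def weighted_gaussian_kernel(X, Z, w, gamma=1.0):
    """K(x, z) = exp(-gamma * sum_j w_j * (x_j - z_j)^2), with w_j >= 0."""
    # per-feature squared differences, scaled by the feature weights
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2 * w).sum(axis=-1)
    return np.exp(-gamma * d2)

def knife_sketch(X, y, lam_ridge=1.0, lam_lasso=0.1, gamma=1.0,
                 n_iter=50, step=1e-2):
    """Alternate between (1) kernel ridge regression for the coefficients
    given the current feature weights and (2) a proximal-gradient
    (soft-threshold) update of the nonnegative weights under an L1 penalty."""
    n, p = X.shape
    w = np.ones(p)
    for _ in range(n_iter):
        # (1) closed-form kernel ridge step: alpha = (K + lam * I)^{-1} y
        K = weighted_gaussian_kernel(X, X, w, gamma)
        alpha = np.linalg.solve(K + lam_ridge * np.eye(n), y)

        # (2) finite-difference gradient of 0.5 * ||K(w) alpha - y||^2 in w,
        #     then soft-threshold toward zero to encourage sparse weights
        resid = K @ alpha - y
        grad = np.zeros(p)
        eps = 1e-5
        for j in range(p):
            w_eps = w.copy()
            w_eps[j] += eps
            resid_eps = weighted_gaussian_kernel(X, X, w_eps, gamma) @ alpha - y
            grad[j] = (resid_eps @ resid_eps - resid @ resid) / (2 * eps)
        w = np.maximum(w - step * grad - step * lam_lasso, 0.0)
    return w, alpha

# Toy usage: only the first two features matter, nonlinearly.
rng = np.random.default_rng(0)
X = rng.standard_normal((60, 5))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(60)
w_hat, _ = knife_sketch(X, y)
print(np.round(w_hat, 3))  # weights on the irrelevant features should shrink toward 0

The alternating structure mirrors the abstract's description of estimating the kernel feature weights jointly with the regression coefficients; sweeping lam_lasso over a grid and recording w_hat at each value would give a crude analogue of the feature paths mentioned above.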