Nonlinear approximation using Gaussian kernels
Authors: Thomas Hangelbroek (a), Amos Ron (b)
Affiliations: (a) Department of Mathematics, Texas A&M University, College Station, TX 77843, United States; (b) Computer Science Department, University of Wisconsin, Madison, WI 53706, United States
Abstract: It is well known that nonlinear approximation has an advantage over linear schemes in the sense that it provides approximation rates comparable to those of linear schemes, but for a larger class of approximands. This was established for spline approximation and for wavelet approximation, and more recently by DeVore and Ron (in press) [2] for homogeneous radial basis function (surface spline) approximation. However, no such results are known for the Gaussian function, the preferred kernel in machine learning and in several engineering problems. In this paper we introduce and analyze a new algorithm for approximating functions by translates of Gaussian functions with varying tension parameters. At its heart it employs the DeVore-Ron strategy for nonlinear approximation, but it selects the kernels in a way that is far from straightforward. The crux of the difficulty lies in the need to vary the tension parameter of the Gaussian spatially, according to local information about the approximand: the error analysis of Gaussian approximation schemes with varying tension is, by and large, an elusive target for approximators. We show that our algorithm is suitably optimal, in the sense that it provides approximation rates similar to those of other established nonlinear methodologies, such as spline and wavelet approximation. As expected and desired, the approximation rates can be as high as needed and are saturated essentially only by the smoothness of the approximand.
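For readers less familiar with this area, the following minimal Python sketch illustrates the kind of object the abstract describes: an n-term approximant built from translates of Gaussians whose tension (width) parameter varies spatially with the local behavior of the approximand. The greedy, matching-pursuit-style dictionary selection used here is a simplifying assumption for illustration only, not the Hangelbroek-Ron algorithm, and all names and parameter values are hypothetical.

import numpy as np

def gaussian(x, center, width):
    """One Gaussian atom exp(-|x - center|^2 / width^2); `width` plays the role of the tension parameter."""
    return np.exp(-((x - center) ** 2) / width ** 2)

def greedy_gaussian_fit(f_vals, x, centers, widths, n_terms):
    """Greedily select n_terms (center, width) pairs whose atoms best reduce the residual."""
    residual = f_vals.astype(float)
    atoms, coeffs = [], []
    for _ in range(n_terms):
        best = None
        for c in centers:
            for w in widths:
                g = gaussian(x, c, w)
                coef = (residual @ g) / (g @ g)   # one-term least-squares coefficient
                err = np.linalg.norm(residual - coef * g)
                if best is None or err < best[0]:
                    best = (err, c, w, coef, g)
        _, c, w, coef, g = best
        atoms.append((c, w))
        coeffs.append(coef)
        residual = residual - coef * g            # peel off the selected atom
    return atoms, coeffs, residual

# Toy target: a sharp local bump plus a smooth global trend. Spatial adaptivity
# means narrow Gaussians should be chosen near the bump, wide ones elsewhere.
x = np.linspace(0.0, 1.0, 400)
f_vals = np.exp(-200 * (x - 0.3) ** 2) + 0.5 * np.sin(2 * np.pi * x)
atoms, coeffs, res = greedy_gaussian_fit(
    f_vals, x,
    centers=np.linspace(0.0, 1.0, 40),
    widths=[0.02, 0.05, 0.15, 0.4],
    n_terms=12,
)
print("selected (center, width) pairs:", [(round(c, 2), w) for c, w in atoms])
print("relative residual:", np.linalg.norm(res) / np.linalg.norm(f_vals))

In runs of this sketch, the selection typically pairs narrow widths with centers near the bump at x = 0.3 and wide widths with the smooth sine component, a crude analogue of the spatially varying tension parameter that the abstract emphasizes.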
Keywords: Machine learning; Gaussians; Kernels; Radial basis functions; Nonlinear approximation; Besov space; Triebel-Lizorkin space
This article is indexed by ScienceDirect and other databases.