Information-Corrected Estimation: A Generalization Error Reducing Parameter Estimation Method
Authors: Matthew Dixon, Tyler Ward
Affiliation: 1. Department of Applied Mathematics, Illinois Institute of Technology, Chicago, IL 60616, USA; 2. Department of Financial Engineering, NYU Tandon School of Engineering, New York, NY 11201, USA
Abstract:
Modern computational models in supervised machine learning are often highly parameterized universal approximators. As such, the values of the parameters themselves are unimportant, and only out-of-sample performance is considered. On the other hand, much of the literature on model estimation assumes that the parameters have intrinsic value and is thus concerned with the bias and variance of parameter estimates, which may bear no simple relationship to out-of-sample model performance. Therefore, supervised machine learning makes heavy use of ridge regression (i.e., L2 regularization), which requires the estimation of hyperparameters and can be rendered ineffective by certain model parameterizations. We introduce an objective function, which we refer to as Information-Corrected Estimation (ICE), that reduces KL-divergence-based generalization error for supervised machine learning. ICE attempts to directly maximize a corrected likelihood function as an estimator of the KL divergence. Such an approach is proven, theoretically, to be effective for a wide class of models, with only mild regularity restrictions. Under finite sample sizes, this corrected estimation procedure is shown experimentally to lead to significant reductions in generalization error compared to maximum likelihood estimation and L2 regularization.
Keywords: generalization error; overfitting; information criteria; entropy
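The idea of maximizing a corrected likelihood can be illustrated with a minimal sketch. The version below adds a Takeuchi-style (TIC) bias correction, tr(Î Ĵ⁻¹)/n, to the average negative log-likelihood of a Gaussian linear model and optimizes the sum directly over the parameters; the function name `ice_objective`, the choice of model, and the pseudo-inverse safeguard are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np
from scipy.optimize import minimize

def ice_objective(theta, X, y, sigma2=1.0):
    """Corrected negative log-likelihood for a Gaussian linear model
    (a sketch; the paper's exact correction term may differ).

    I is the outer-product estimate built from per-sample score
    vectors; J is the observed Hessian of the average NLL.  The
    penalty tr(I J^{-1}) / n estimates the optimism of the in-sample
    likelihood, so minimizing nll + correction targets out-of-sample
    (KL-divergence-based) error rather than in-sample fit.
    """
    n, p = X.shape
    resid = y - X @ theta

    # Average negative log-likelihood, constants dropped.
    nll = 0.5 * np.mean(resid**2) / sigma2

    # Per-sample score vectors: g_i = x_i * resid_i / sigma2.
    scores = X * (resid / sigma2)[:, None]      # shape (n, p)
    I = scores.T @ scores / n                   # outer-product Fisher estimate
    J = X.T @ X / (n * sigma2)                  # observed Hessian of the NLL

    # Takeuchi-style correction; pinv guards against near-singular J.
    correction = np.trace(I @ np.linalg.pinv(J)) / n
    return nll + correction

# Usage on synthetic data: the corrected objective is minimized
# directly, with no hyperparameter to tune (unlike ridge regression).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ rng.normal(size=5) + rng.normal(size=200)
fit = minimize(ice_objective, x0=np.zeros(5), args=(X, y))
print(fit.x)
```

Unlike L2 regularization, this objective has no free hyperparameter: the strength of the penalty is determined by the estimated I and J themselves and shrinks at rate 1/n, consistent with the abstract's claim that the correction vanishes as the sample grows.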