An information-theoretic framework for robustness
Authors: Stephan Morgenthaler, Clifford Hurvich
Institution: (1) EPFL-DMA, Swiss Federal Institute of Technology, 1015 Lausanne, Switzerland; (2) New York University, 735 Tisch Hall, Washington Sq., New York, NY 10003, U.S.A.
Abstract:This is a paper about the foundation of robust inference. As a specific example, we consider semiparametric location models that involve a shape parameter. We argue that robust methods result via the selection of a representative shape from a set of allowable shapes. To perform this selection, we need a measure of disparity between the true shape and the shape to be used in the inference. Given such a disparity, we propose to solve a certain minimax problem. The paper discusses in detail the use of the Kullback-Leibler divergence for the selection of shapes. The resulting estimators are shown to have redescending influence functions when the set of allowable shapes contains heavy-tailed members. The paper closes with a brief discussion of the next logical step, namely the representation of a set of shapes by a pair of selected shapes.
Keywords: Robustness; distributional shapes; Kullback-Leibler divergence
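
To make the selection mechanism described in the abstract concrete, the following is a minimal numerical sketch in Python (NumPy/SciPy) of choosing a representative shape by a minimax Kullback-Leibler criterion. The candidate set of shapes (normal, Student t, Cauchy), the truncated numerical integration, and the specific formulation min over g of max over f of KL(f || g) are illustrative assumptions for this sketch, not the formulation worked out in the paper.

import numpy as np
from scipy import stats
from scipy.integrate import quad

# Illustrative set of allowable shapes (standardized densities); the actual
# set considered in the paper is not reproduced here.
shapes = {
    "normal": stats.norm(),
    "t5": stats.t(df=5),
    "t3": stats.t(df=3),
    "cauchy": stats.cauchy(),
}

def kl_divergence(f, g, lo=-30.0, hi=30.0):
    """Numerical approximation of KL(f || g) = E_f[log f(X) - log g(X)].

    The integration range is truncated; when f has much heavier tails than g
    the true divergence can be infinite, and the truncated value is only a
    large finite surrogate."""
    def integrand(x):
        fx = f.pdf(x)
        if fx <= 0.0:
            return 0.0
        gx = max(g.pdf(x), 1e-300)  # guard against log(0) from underflow
        return fx * (np.log(fx) - np.log(gx))
    val, _ = quad(integrand, lo, hi, limit=200)
    return val

# Minimax selection: the representative shape minimizes, over candidates g,
# the worst-case divergence max_f KL(f || g) across the allowable set.
worst_case = {
    name_g: max(kl_divergence(f, g) for f in shapes.values())
    for name_g, g in shapes.items()
}
representative = min(worst_case, key=worst_case.get)
print({k: round(v, 3) for k, v in worst_case.items()})
print("minimax representative shape:", representative)

In a run with this candidate set, heavy-tailed candidates tend to have a much smaller worst-case divergence than the normal, in line with the abstract's emphasis on allowable sets containing heavy-tailed members.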
This article is indexed in SpringerLink and other databases.