

Approximating conditional density functions using dimension reduction
Authors: Jian-qing Fan, Liang Peng, Qi-wei Yao, Wen-yang Zhang
Institution: [1] Department of Operations Research and Financial Engineering, Princeton University, Princeton, NJ 08540, USA; [2] School of Mathematics, Georgia Institute of Technology, Atlanta, GA 30332-0160, USA; [3] Department of Statistics, London School of Economics, London WC2A 2AE, UK; [4] Department of Mathematical Sciences, University of Bath, Bath BA2 7AY, UK
Abstract: We propose to approximate the conditional density function of a random variable Y given a dependent random d-vector X by that of Y given θᵀX, where the unit vector θ is selected so that the average Kullback–Leibler discrepancy between the two conditional density functions is minimized. Our approach is nonparametric as far as the estimation of the conditional density functions is concerned. We show that this nonparametric estimator is asymptotically adaptive to the unknown index θ, in the sense that the first-order asymptotic mean squared error of the estimator is the same as if θ were known. The proposed method is illustrated using both simulated and real-data examples. Yao was partially supported by EPSRC research grant EP/C549058/1.
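To make the idea concrete, here is a minimal sketch (not the authors' implementation) of the index-selection step for d = 2: the conditional density of Y given the index u = θᵀX is estimated with a simple leave-one-out kernel estimator, and θ is chosen to maximize the out-of-sample log-likelihood, which is equivalent up to a constant to minimizing the empirical Kullback–Leibler discrepancy. The bandwidths, grid search over angles, and toy data are all illustrative assumptions; the paper itself uses local linear smoothing.

```python
import numpy as np

def gauss(t):
    """Standard Gaussian kernel."""
    return np.exp(-0.5 * t ** 2) / np.sqrt(2 * np.pi)

def loo_loglik(theta, X, Y, h=0.3, b=0.3):
    """Leave-one-out log-likelihood of a kernel estimate of the
    conditional density of Y given the index u = theta' X.
    Maximizing this over unit vectors theta is equivalent (up to a
    constant not depending on theta) to minimizing the empirical
    Kullback-Leibler discrepancy described in the abstract.
    Bandwidths h (index) and b (response) are illustrative choices."""
    u = X @ theta
    Ku = gauss((u[:, None] - u[None, :]) / h)   # n x n kernel weights in the index
    Ky = gauss((Y[:, None] - Y[None, :]) / b)   # n x n kernel weights in y
    np.fill_diagonal(Ku, 0.0)                   # leave each observation out
    dens = (Ku * Ky).sum(axis=1) / (b * Ku.sum(axis=1))
    return np.mean(np.log(dens + 1e-300))       # guard against log(0)

# Toy data: Y depends on X only through the direction (1, 1)/sqrt(2)
rng = np.random.default_rng(0)
n = 400
X = rng.normal(size=(n, 2))
true_theta = np.array([1.0, 1.0]) / np.sqrt(2)
Y = np.sin(2 * X @ true_theta) + 0.3 * rng.normal(size=n)

# For d = 2, unit vectors are theta = (cos a, sin a); grid-search the angle
angles = np.linspace(0.0, np.pi, 181)
scores = [loo_loglik(np.array([np.cos(a), np.sin(a)]), X, Y) for a in angles]
theta_hat = angles[int(np.argmax(scores))]
print(theta_hat)  # should be close to pi/4, the true index direction
```

In higher dimensions the grid search would be replaced by numerical optimization over the unit sphere; the point of the sketch is only that the single-index density of Y given θᵀX is fully nonparametric once θ is fixed.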
Keywords: Conditional density function; dimension reduction; Kullback–Leibler discrepancy; local linear regression; nonparametric regression; Shannon's entropy
This article is indexed in databases including 维普 (VIP) and SpringerLink.