Information Bottleneck Analysis by a Conditional Mutual Information Bound
Authors: Taro Tezuka, Shizuma Namekawa
Affiliation: 1. Faculty of Library, Information and Media Science, University of Tsukuba, Tsukuba, Ibaraki 305-8577, Japan; 2. Graduate School of Library, Information and Media Studies, University of Tsukuba, Tsukuba, Ibaraki 305-8577, Japan
Abstract: Task-nuisance decomposition explains why the information bottleneck loss I(z;x) - βI(z;y) is a suitable objective for supervised learning, in which the true category y is predicted for the input x using latent variables z. When n is a nuisance variable independent of y, I(z;n) can be decreased by reducing I(z;x), since the latter upper-bounds the former. We extend this framework by showing that the conditional mutual information I(z;x|y) provides an alternative upper bound on I(z;n). This bound applies even when z is not a sufficient representation of x, that is, when I(z;y) ≠ I(x;y). We used mutual information neural estimation (MINE) to estimate I(z;x|y). Experiments showed that I(z;x|y) is smaller than I(z;x) for layers closer to the input, consistent with the claim that the former is a tighter bound than the latter. Because of this difference, the information plane changes when I(z;x|y) is used in place of I(z;x).
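As a rough illustration (not the authors' implementation), the sketch below shows how a MINE-style Donsker-Varadhan estimator could be adapted to the conditional quantity I(z;x|y): the "marginal" samples are produced by permuting z only among samples that share the same label, which preserves p(z|y) while breaking the dependence between x and z. All names here (Critic, shuffle_z_within_classes, cmi_lower_bound), the network sizes, and the training settings are hypothetical choices for the sketch.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class Critic(nn.Module):
    """Statistics network T(x, z, y) for the Donsker-Varadhan bound."""
    def __init__(self, dim_x, dim_z, num_classes, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim_x + dim_z + num_classes, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, z, y_onehot):
        return self.net(torch.cat([x, z, y_onehot], dim=1)).squeeze(1)

def shuffle_z_within_classes(z, y):
    """Permute z among samples sharing the same label y.

    This breaks the dependence between x and z while preserving
    p(z|y), yielding samples from p(x|y) p(z|y) p(y)."""
    z_shuffled = z.clone()
    for c in y.unique():
        idx = (y == c).nonzero(as_tuple=True)[0]
        z_shuffled[idx] = z[idx[torch.randperm(len(idx))]]
    return z_shuffled

def cmi_lower_bound(critic, x, z, y, num_classes):
    """Donsker-Varadhan lower bound on I(z; x | y) for one minibatch."""
    y_onehot = F.one_hot(y, num_classes).float()
    t_joint = critic(x, z, y_onehot)  # samples from the joint p(x, z, y)
    t_marg = critic(x, shuffle_z_within_classes(z, y), y_onehot)
    # log E[exp T] computed via logsumexp for numerical stability
    log_mean_exp = torch.logsumexp(t_marg, dim=0) - math.log(t_marg.numel())
    return t_joint.mean() - log_mean_exp

# Hypothetical usage: x is an input batch, z a hidden-layer activation,
# y integer class labels; the critic is trained to maximize the bound.
critic = Critic(dim_x=784, dim_z=64, num_classes=10)
opt = torch.optim.Adam(critic.parameters(), lr=1e-4)
x, z, y = torch.randn(256, 784), torch.randn(256, 64), torch.randint(0, 10, (256,))
loss = -cmi_lower_bound(critic, x, z, y, num_classes=10)
opt.zero_grad(); loss.backward(); opt.step()
```

The within-class shuffle is the only change relative to an unconditional MINE estimator of I(z;x), where z would be permuted across the whole batch instead.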
Keywords: conditional mutual information; information bottleneck; deep learning