Computation of Kullback–Leibler Divergence in Bayesian Networks
Authors: Serafín Moral, Andrés Cano, Manuel Gómez-Olmedo
Institution: Department of Computer Science and Artificial Intelligence, University of Granada, 18071 Granada, Spain
Abstract: Kullback–Leibler divergence KL(p,q) is the standard measure of error when a true probability distribution p is approximated by a probability distribution q. Its efficient computation is essential in many tasks, for example in approximate computation or as a measure of error when learning a probability distribution. For high-dimensional distributions, such as those associated with Bayesian networks, a direct computation can be infeasible. This paper considers the problem of efficiently computing the Kullback–Leibler divergence between two probability distributions, each coming from a different Bayesian network, where the two networks may have different structures. The approach is based on an auxiliary deletion algorithm that computes the necessary marginal distributions, using a cache of operations with potentials so that past computations are reused whenever possible. The algorithms are tested with Bayesian networks from the bnlearn repository. Computer code in Python is provided, built on pgmpy, a library for working with probabilistic graphical models.
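As an illustration of the quantity being computed, KL(p,q) = Σ_x p(x) log(p(x)/q(x)), the sketch below evaluates it by brute force between two small discrete Bayesian networks built with pgmpy, by materialising the full joint of each network. This is not the paper's cached deletion algorithm; the network structures, CPD values, and the joint_factor/kl_divergence helpers are hypothetical, class names may differ across pgmpy versions, and the sketch assumes both networks share variable names and state orderings.

```python
import numpy as np
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD


def joint_factor(model):
    # Multiply all CPDs of the network to obtain the full joint distribution
    # as a single factor (exponential in the number of variables).
    factors = [cpd.to_factor() for cpd in model.get_cpds()]
    joint = factors[0]
    for f in factors[1:]:
        joint = joint * f
    return joint


def kl_divergence(model_p, model_q):
    # Brute-force KL(p, q); assumes both networks use the same variable names
    # and state orderings (an assumption of this sketch, not checked here).
    fp = joint_factor(model_p)
    fq = joint_factor(model_q)
    # Re-order q's axes so both value arrays follow p's variable order.
    perm = [fq.variables.index(v) for v in fp.variables]
    p = fp.values.ravel()
    q = np.transpose(fq.values, perm).ravel()
    mask = p > 0  # terms with p(x) = 0 contribute nothing to the sum
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))


# Two tiny networks over the same variables but with different structures.
p_net = BayesianNetwork([("A", "B")])
p_net.add_cpds(
    TabularCPD("A", 2, [[0.6], [0.4]]),
    TabularCPD("B", 2, [[0.7, 0.2], [0.3, 0.8]],
               evidence=["A"], evidence_card=[2]),
)
q_net = BayesianNetwork([("B", "A")])
q_net.add_cpds(
    TabularCPD("B", 2, [[0.5], [0.5]]),
    TabularCPD("A", 2, [[0.6, 0.5], [0.4, 0.5]],
               evidence=["B"], evidence_card=[2]),
)

print(kl_divergence(p_net, q_net))
```

For networks of the size found in the bnlearn repository, the explicit joint built here has far too many entries to store, which is why the paper instead relies on the auxiliary deletion algorithm with a cache of operations over potentials to obtain the needed marginals.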
Keywords: probabilistic graphical models; learning algorithms; Kullback–Leibler divergence