The Comparison between Arbitrary Information Sources and Nonhomogeneous Markov Information Sources and the Small Deviations Theorems
Cite this article: Liu Wen, Yang Weiguo. The Comparison between Arbitrary Information Sources and Nonhomogeneous Markov Information Sources and the Small Deviations Theorems [J]. Acta Mathematica Sinica, 1997, 40(1): 22-36.
Authors: Liu Wen, Yang Weiguo
Affiliations: [1] Department of Mathematics and Physics, Hebei University of Technology; [2] Hebei Mining and Civil Engineering Institute
Funding: Natural Science Foundation of Hebei Province
Abstract: Let {X_n, n ≥ 0} be a sequence of measurable functions taking values in S = {1, 2, …, N}, and let P and Q be two probability measures on the measurable space such that {X_n, n ≥ 0} is Markovian under Q. This paper introduces the notion of the sample divergence-rate distance of P relative to Q and uses it to obtain a class of small deviations theorems for the averages of functions of two variables of arbitrary information sources; as a corollary, a small deviations theorem for the entropy densities of arbitrary information sources follows. Finally, the Shannon-McMillan theorem is extended to the case of nonhomogeneous Markov information sources.

Keywords: Small deviations theorem; Entropy density; Sample divergence-rate distance; Shannon-McMillan theorem
Received: 1995-01-29
Revised: 1996-05-06

The Comparison between Arbitrary Information Sources and Nonhomogeneous Markov Information Sources and the Small Deviations Theorems
Liu Wen, Yang Weiguo. The Comparison between Arbitrary Information Sources and Nonhomogeneous Markov Information Sources and the Small Deviations Theorems [J]. Acta Mathematica Sinica, 1997, 40(1): 22-36.
Authors: Liu Wen, Yang Weiguo
Institution: Liu Wen (Department of Mathematics, Hebei University of Technology, Tianjin 300130, China); Yang Weiguo (Hebei Mining and Civil Engineering Institute, Handan 056038, China)
Abstract: Let {X_n, n ≥ 0} be a sequence of measurable functions taking their values in the alphabet S = {1, 2, …, N}, and let P, Q be two probability measures on the measurable space such that {X_n, n ≥ 0} is Markovian under Q. Let $h(P \mid Q) = \limsup_{n \to \infty} \frac{1}{n} \log \frac{P(X_0, \dots, X_n)}{Q(X_0, \dots, X_n)}$ be the sample divergence-rate distance of P relative to Q. In this paper, a class of small deviations theorems for the averages of functions of two variables of arbitrary information sources is obtained by using the concept $h(P \mid Q)$, and, as a corollary, a small deviations theorem for the entropy densities of arbitrary information sources is derived. Finally, an extension of the Shannon-McMillan theorem to the case of nonhomogeneous Markov information sources is given.
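The corollary above concerns the entropy density of an arbitrary information source, which the abstract names but does not display; in its standard form (stated here as an assumption, since the record does not spell it out) it reads

% Entropy density of {X_n, n >= 0} under the measure P (standard form; assumed, not displayed in the abstract)
\[
  f_n(\omega) \;=\; -\,\frac{1}{n} \log P(X_0, X_1, \dots, X_n),
\]

and, per the abstract, the small deviations theorems are formulated using the sample divergence-rate distance $h(P \mid Q)$ defined above.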
Keywords: Small deviations theorem; Entropy density; Sample divergence-rate distance; Shannon-McMillan theorem