Similar Literature
20 similar references found.
1.
In recent years we extended Shannon's static statistical information theory to dynamic processes and established a Shannon dynamic statistical information theory, whose core is the evolution law of dynamic entropy and dynamic information. We also proposed a corresponding Boltzmann dynamic statistical information theory. Based on the fact that the state-variable evolution equations of the respective dynamic systems, i.e. the Fokker-Planck equation and the Liouville diffusion equation, can be regarded as their information symbol evolution equations, we derived the nonlinear evolution equations of the Shannon dynamic entropy density and dynamic information density, and of the Boltzmann dynamic entropy density and dynamic information density, which describe the evolution laws of dynamic entropy and dynamic information. The evolution equations of these two kinds of dynamic entropy and dynamic information show in unison that the time rate of change of the dynamic entropy densities is caused by their drift, diffusion and production in the state-variable space inside the systems and in the coordinate space of the transmission processes, and that the time rate of change of the dynamic information densities originates from their drift, diffusion and dissipation in those same spaces. Entropy and information are thus combined with the state of the systems and its law of motion. Furthermore, we presented formulas for the two kinds of entropy production rates and information dissipation rates, and expressions for the two kinds of drift information flows and diffusion information flows. We proved that the two kinds of information dissipation rates (the decrease rates of the total information) equal their corresponding entropy production rates (the increase rates of the total entropy) in the same dynamic system. We obtained formulas for the two kinds of dynamic mutual information and dynamic channel capacity that reflect the dynamic dissipation characteristics of the transmission processes; in the limiting case where the ratio of channel length to information transmission rate approaches zero, they reduce to their maxima, the existing static mutual information and static channel capacity. All these unified and rigorous theoretical formulas and results are derived from the evolution equations of dynamic information and dynamic entropy without any extra assumption. In this review, we give an overview of the above main ideas, methods and results, and discuss the similarities and differences between the two kinds of dynamic statistical information theories.
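The drift/diffusion/production (or dissipation) decomposition asserted above can be rendered schematically. The following LaTeX sketch shows only the stated structure of the two density evolution equations; the drift velocity u, diffusion coefficient D, production term σ_S and dissipation term δ_I are our notational assumptions, not the review's exact symbols:

```latex
% Schematic structure only; \mathbf{u}, D, \sigma_S, \delta_I are assumed notation.
\frac{\partial \rho_S}{\partial t}
  = -\nabla\cdot(\mathbf{u}\,\rho_S)   % drift
  + \nabla\cdot(D\,\nabla\rho_S)       % diffusion
  + \sigma_S,                          % entropy production
\qquad
\frac{\partial \rho_I}{\partial t}
  = -\nabla\cdot(\mathbf{u}\,\rho_I)
  + \nabla\cdot(D\,\nabla\rho_I)
  - \delta_I.                          % information dissipation
```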

2.
We study how the Shannon entropy of sequences produced by an information source converges to the source's entropy rate. We synthesize several phenomenological approaches to applying information theoretic measures of randomness and memory to stochastic and deterministic processes by using successive derivatives of the Shannon entropy growth curve. This leads, in turn, to natural measures of apparent memory stored in a source and the amounts of information that must be extracted from observations of a source in order for it to be optimally predicted and for an observer to synchronize to it. To measure the difficulty of synchronization, we define the transient information and prove that, for Markov processes, it is related to the total uncertainty experienced while synchronizing to a process. One consequence of ignoring a process's structural properties is that the missed regularities are converted to apparent randomness. We demonstrate that this problem arises particularly for settings where one has access only to short measurement sequences. Numerically and analytically, we determine the Shannon entropy growth curve, and related quantities, for a range of stochastic and deterministic processes. We conclude by looking at the relationships between a process's entropy convergence behavior and its underlying computational structure.
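As a concrete illustration of the entropy growth curve and its derivative, here is a minimal Python sketch (our construction, not the authors' code) that estimates the block entropy H(L) and the finite-L entropy-rate estimate h(L) = H(L) - H(L-1) for a simulated golden-mean process, whose true entropy rate is 2/3 bit per symbol:

```python
# Sketch: block-entropy growth curve H(L) and its discrete derivative
# h(L) = H(L) - H(L-1), which converges to the source's entropy rate.
# The golden-mean process and the sequence length are illustrative choices.
import math
import random
from collections import Counter

def block_entropy(seq, L):
    """Shannon entropy (bits) of the empirical distribution of length-L blocks."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Golden-mean process: binary Markov source forbidding consecutive 1s.
random.seed(0)
seq, state = [], 0
for _ in range(200_000):
    bit = random.random() < 0.5 if state == 0 else False
    state = 1 if bit else 0
    seq.append(int(bit))

H = [0.0] + [block_entropy(seq, L) for L in range(1, 11)]
for L in range(1, 11):
    print(f"L={L:2d}  H(L)={H[L]:.4f}  h(L)={H[L] - H[L - 1]:.4f}")
# h(L) decreases from H(1) ~ 0.918 toward the true rate 2/3 bit/symbol.
```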

3.
Transfer entropy is a measure of the magnitude and the direction of information flow between jointly distributed stochastic processes. In recent years, permutation analogues of it have been considered in the literature, which estimate the transfer entropy by counting the number of occurrences of orderings of values, not the values themselves. It has been suggested that the permutation method is easy to implement, computationally cheap and robust to noise when applied to real-world time series data. In this paper, we initiate a theoretical treatment of the corresponding rates. In particular, we consider the transfer entropy rate and its permutation analogue, the symbolic transfer entropy rate, and show that they are equal for any bivariate finite-alphabet stationary ergodic Markov process. This result is an illustration of the duality method introduced in [T. Haruna, K. Nakajima, Physica D 240, 1370 (2011)]. We also discuss the relationship among the transfer entropy rate, the time-delayed mutual information rate and their permutation analogues.
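To make the counting-of-orderings idea concrete, here is a Python sketch of a symbolic transfer entropy estimator built on ordinal patterns. It is a generic illustration under assumed settings (embedding order m=3, a linearly coupled pair of series), not the estimator analyzed in the paper:

```python
# Sketch: symbolic transfer entropy Y -> X from ordinal (permutation) patterns.
# Embedding order m and the test series are illustrative assumptions.
import math
import random
from collections import Counter

def patterns(x, m):
    """Map each length-m window to its ordinal pattern (rank tuple)."""
    return [tuple(sorted(range(m), key=lambda k: x[i + k]))
            for i in range(len(x) - m + 1)]

def symbolic_te(x, y, m=3):
    """Estimate H(X+ | X) - H(X+ | X, Y) over ordinal symbols."""
    sx, sy = patterns(x, m), patterns(y, m)
    triples = Counter(zip(sx[1:], sx[:-1], sy[:-1]))   # (X+, X, Y)
    pairs_xy = Counter(zip(sx[:-1], sy[:-1]))          # (X, Y)
    pairs_xx = Counter(zip(sx[1:], sx[:-1]))           # (X+, X)
    singles = Counter(sx[:-1])                         # X
    n = len(sx) - 1
    te = 0.0
    for (xp, xc, yc), c in triples.items():
        # p(x+,x,y) * log[ p(x+|x,y) / p(x+|x) ]; all counts share denominator n.
        te += (c / n) * math.log2((c * singles[xc])
                                  / (pairs_xy[(xc, yc)] * pairs_xx[(xp, xc)]))
    return te

random.seed(1)
y = [random.gauss(0, 1) for _ in range(20_000)]
x = [0.0]
for t in range(1, 20_000):
    x.append(0.5 * y[t - 1] + 0.1 * random.gauss(0, 1))  # X driven by Y's past
print("TE(Y->X) ~", round(symbolic_te(x, y), 3),
      "  TE(X->Y) ~", round(symbolic_te(y, x), 3))
```

Since X is driven by Y's past and not vice versa, the first estimate should come out clearly larger than the second.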

4.
In this paper, we present a review of Shannon and differential entropy rate estimation techniques. The entropy rate, which measures the average information gain of a stochastic process, is a measure of the uncertainty and complexity of the process. We discuss the estimation of the entropy rate from empirical data and review both parametric and non-parametric techniques. For parametric processes we examine a range of assumptions on the properties of the process, focusing in particular on Markov and Gaussian assumptions. Non-parametric estimation relies on limit theorems that connect observations to the entropy rate; to discuss these, we introduce the underlying theory and the practical implementation of estimators of this type.
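A minimal example of the parametric route under a first-order Markov assumption: fit transition probabilities from counts and weight the row entropies by the empirical state occupancy. This plug-in estimator is a generic sketch, not one of the review's specific estimators; the alphabet and data are assumptions:

```python
# Sketch: plug-in entropy-rate estimator under a first-order Markov assumption,
# h = -sum_i pi_i sum_j P_ij log2 P_ij, with pi and P from empirical counts.
import math
from collections import Counter

def markov_entropy_rate(seq):
    trans = Counter(zip(seq[:-1], seq[1:]))   # counts of (i -> j) transitions
    occ = Counter(seq[:-1])                   # occupancy of source states
    n = len(seq) - 1
    h = 0.0
    for (i, j), c in trans.items():
        p_ij = c / occ[i]                     # estimated transition probability
        h -= (occ[i] / n) * p_ij * math.log2(p_ij)
    return h

seq = list("ababbabaabababbbabab" * 500)      # illustrative two-symbol data
print(f"estimated entropy rate: {markov_entropy_rate(seq):.3f} bits/symbol")
```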

5.
Nonequilibrium statistical information theory
邢修三 (Xing Xiu-San), Acta Physica Sinica (物理学报), 2004, 53(9): 2852-2863
This paper expounds a nonequilibrium statistical information theory whose core is the information (entropy) evolution equation expressing the law of information evolution. A nonlinear evolution equation for the Shannon information (entropy) is derived; statistical physical information is introduced and its nonlinear evolution equation is derived as well. These two information (entropy) evolution equations show in unison that the time rate of change of the statistical information (entropy) density is caused by its drift, diffusion and dissipation (production) in coordinate space (and state-variable space). Starting from these equations, concise formulas for the statistical information dissipation rate and the statistical entropy production rate are given, together with expressions for the drift information flow and the diffusion information flow, and it is proved that the statistical information dissipation (or increase) rate within a nonequilibrium system equals its statistical entropy production (or decrease) rate, and that information diffusion and information dissipation occur simultaneously. Keywords: statistical information (entropy) evolution equation; statistical information dissipation rate; statistical entropy production rate; information (entropy) flow; information (entropy) diffusion; dynamic mutual information

6.
Asymptotic fluctuation theorems are statements of a Gallavotti-Cohen symmetry in the rate function of either the time-averaged entropy production or heat dissipation of a process. Such theorems have been proved for various general classes of continuous-time deterministic and stochastic processes, but always under the assumption that the forces driving the system are time independent, and often relying on the existence of a limiting ergodic distribution. In this paper we extend the asymptotic fluctuation theorem for the first time to inhomogeneous continuous-time processes without a stationary distribution, considering specifically a finite-state Markov chain driven by periodic transition rates. We find that for both entropy production and heat dissipation, the usual Gallavotti-Cohen symmetry of the rate function is generalized to an analogous relation between the rate functions of the original process and its corresponding backward process, in which the trajectory and the driving protocol have been time-reversed. The effect is that spontaneous positive fluctuations in the long-time average of each quantity in the forward process are exponentially more likely than spontaneous negative fluctuations in the backward process, and vice versa, revealing that the distributions of fluctuations in universes in which time moves forward and backward are related. As an additional result, the asymptotic time-averaged entropy production is obtained as the integral of a periodic entropy production rate that generalizes the constant rate pertaining to homogeneous dynamics.
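The symmetry in question is conventionally stated for the rate function I of the time-averaged entropy production σ. A schematic LaTeX rendering of the standard relation and of the generalized forward/backward relation described above, in our notation rather than the paper's exact statement:

```latex
% Homogeneous driving: standard Gallavotti-Cohen symmetry of the rate function
I(\sigma) - I(-\sigma) = -\sigma .
% Periodic driving (schematic): forward rate function I_F related to the rate
% function I_B of the process with time-reversed trajectory and protocol
I_F(\sigma) - I_B(-\sigma) = -\sigma .
```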

7.
We introduce the minimal maximally predictive models (ε-machines) of processes generated by certain hidden semi-Markov models. Their causal states are either discrete, mixed, or continuous random variables, and causal-state transitions are described by partial differential equations. As an application, we present a complete analysis of the ε-machines of continuous-time renewal processes. This leads to closed-form expressions for their entropy rate, statistical complexity, excess entropy, and differential information anatomy rates.

8.
The duality between values and orderings is a powerful tool to discuss relationships between various information-theoretic measures and their permutation analogues for discrete-time finite-alphabet stationary stochastic processes (SSPs). Applying it to output processes of hidden Markov models with ergodic internal processes, we have shown in our previous work that the excess entropy and the transfer entropy rate coincide with their permutation analogues. In this paper, we discuss two permutation characterizations of the two measures for general ergodic SSPs not necessarily having the Markov property assumed in our previous work. In the first approach, we show that the excess entropy and the transfer entropy rate of an ergodic SSP can be obtained as the limits of permutation analogues of them for the N-th order approximation by hidden Markov models, respectively. In the second approach, we employ the modified permutation partition of the set of words which considers equalities of symbols in addition to permutations of words. We show that the excess entropy and the transfer entropy rate of an ergodic SSP are equal to their modified permutation analogues, respectively.

9.
The time-dependent information-theory entropy and the generalized Gibbs entropy are evaluated for a continuous-time Markov process (single-mode radiation in a cavity).

10.
We analyze the functioning of Gibbs-type entropy functionals in the time domain, with emphasis on the Shannon and Kullback-Leibler entropies of time-dependent continuous probability distributions. The validity of the Shannon entropy is extended to probability distributions inferred from L²(Rⁿ) quantum wave packets. In contrast to the von Neumann entropy, which simply vanishes on pure states, the differential entropy quantifies the degree of probability (de)localization and its time development. The associated dynamics of the Fisher information functional quantifies nontrivial power transfer processes in the mean, both in the dissipative and quantum mechanical cases. PACS numbers: 05.45.+b, 02.50.-r, 03.65.Ta, 03.67.-a
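For a Gaussian density both functionals have closed forms, H = ½ ln(2πe s²) and F = 1/s², which makes a quick numerical check possible. The Python sketch below (our illustration; the spreading law s(t) mimicking a widening free packet is an assumption) computes the differential entropy and the Fisher information functional on a grid and compares them with the exact values:

```python
# Sketch: differential entropy H = -int rho ln rho dx and Fisher information
# F = int (rho')^2 / rho dx for a Gaussian density, checked against the
# closed forms H = 0.5*ln(2*pi*e*s^2) and F = 1/s^2.
import numpy as np

def trap(f, dx):
    return (f[:-1] + f[1:]).sum() * dx / 2.0          # trapezoid rule

def entropy_and_fisher(s, n=100_001):
    x = np.linspace(-12 * s, 12 * s, n)               # wide enough grid
    dx = x[1] - x[0]
    rho = np.exp(-x**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))
    H = -trap(rho * np.log(rho), dx)                  # differential entropy (nats)
    F = trap(np.gradient(rho, dx)**2 / rho, dx)       # Fisher information
    return H, F

for t in (0.0, 1.0, 2.0):
    s = np.sqrt(1.0 + t**2)                           # assumed spreading width s(t)
    H, F = entropy_and_fisher(s)
    print(f"t={t}: H={H:.4f} (exact {0.5 * np.log(2 * np.pi * np.e * s**2):.4f}), "
          f"F={F:.4f} (exact {1 / s**2:.4f})")
```

As the packet spreads, H grows (delocalization) while F decreases, which is the trade-off the abstract alludes to.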

11.
There is a relation between the irreversibility of thermodynamic processes, as expressed by the breaking of time-reversal symmetry, and the entropy production in such processes. We explain on an elementary mathematical level the relations between entropy production, phase-space contraction and time-reversal, starting from a deterministic dynamics. Both closed and open systems, in the transient and in the steady regime, are considered. The main result identifies, under general conditions, the statistical mechanical entropy production as the source term of time-reversal breaking in the path-space measure for the evolution of reduced variables. This provides a general algorithm for computing the entropy production and a unified way of understanding a number of useful (in)equalities. We also discuss the Markov approximation. A number of older theoretical ideas for connecting the microscopic dynamics with thermodynamic behavior play an important role.
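The central identification can be written compactly. With P the path-space measure of the reduced dynamics and Θ the time-reversal operation on trajectories ω, the entropy production is the logarithmic ratio of forward to time-reversed path weights; a schematic LaTeX rendering in our notation, not the paper's exact formulas:

```latex
% \mathbb{P}: path-space measure of the reduced dynamics;
% \Theta: time-reversal operation on trajectories \omega.
\sigma(\omega) = \log \frac{d\mathbb{P}}{d(\mathbb{P}\circ\Theta)}(\omega),
\qquad
\langle \sigma \rangle = S\big(\mathbb{P} \,\big\|\, \mathbb{P}\circ\Theta\big) \;\ge\; 0 ,
```

so the mean entropy production is a relative entropy between the forward and time-reversed path measures, whose nonnegativity yields a second-law-type inequality.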

12.
上官丹骅 (Shangguan Dan-Hua), 邓力 (Deng Li), 张宝印 (Zhang Bao-Yin), 姬志成 (Ji Zhi-Cheng), 李刚 (Li Gang), Acta Physica Sinica (物理学报), 2016, 65(14): 142801
In Monte Carlo simulation of multi-step time-dependent transport problems, various criteria can be used to automatically adjust the number of samples at each step in order to obtain high computational efficiency. One option is, at each step, to monitor after every batch of samples the convergence of the Shannon entropy of the attribute distribution of the surviving particles in the system, and to stop adding samples once it has converged; this method requires frequent evaluation of the Shannon entropy at each step. Because the classical method of computing the Shannon entropy in an MPI message-passing parallel programming environment must broadcast a large amount of data, the computing time per step grows rapidly with the frequency of Shannon entropy evaluation, which clearly cannot meet practical needs. This paper proposes a new method of computing the Shannon entropy suited to the message-passing parallel environment. The entropy value it produces is not equivalent to that of the classical method, but the difference between the two tends to zero as the number of samples increases. The greatest advantage of the new method is that the time cost of high-frequency Shannon entropy evaluation is greatly reduced, laying the necessary foundation for the automatic adjustment of the number of samples per step based on a Shannon entropy convergence criterion.
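For context, here is a Python/mpi4py sketch of the classical parallel computation the abstract contrasts against: each rank bins its local particles, a collective reduction assembles the global histogram, and the entropy is evaluated from it. The bin count and the stand-in particle attributes are illustrative assumptions, and the paper's faster, approximately equivalent method is not reproduced here:

```python
# Sketch of the *classical* parallel Shannon-entropy computation: every MPI
# rank bins its local particle attributes, Allreduce assembles the global
# histogram, and the entropy of the global distribution is evaluated.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
NBINS = 64                                            # assumed bin count

# Stand-in for the attributes of surviving particles on this rank.
rng = np.random.default_rng(comm.Get_rank())
local_attrs = rng.random(10_000)

local_hist, _ = np.histogram(local_attrs, bins=NBINS, range=(0.0, 1.0))
global_hist = np.zeros(NBINS, dtype=np.int64)
comm.Allreduce(local_hist.astype(np.int64), global_hist, op=MPI.SUM)

p = global_hist / global_hist.sum()
shannon = -np.sum(p[p > 0] * np.log2(p[p > 0]))       # Shannon entropy (bits)
if comm.Get_rank() == 0:
    print(f"global Shannon entropy: {shannon:.4f} bits over {NBINS} bins")
```

The cost the paper targets is exactly this collective exchange repeated at high frequency within every time step.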

13.
Considerable advances in automatic speech recognition have been made in the last decades, thanks especially to the use of hidden Markov models. In the field of speech signal analysis, different techniques have been developed. However, deterioration in the performance of speech recognizers has been observed when they are trained with clean signals and tested with noisy signals. This is still an open problem in the field. Continuous multiresolution entropy has been shown to be robust to additive noise in applications to different physiological signals. In previous works we have included Shannon and Tsallis entropies, and their corresponding divergences, in different speech analysis and recognition systems. In this paper we present an extension of the continuous multiresolution entropy to different divergences, and we propose these as new dimensions for the pre-processing stage of a speech recognition system. This approach takes into account information about changes in the dynamics of the speech signal at different scales. The methods proposed here are tested with speech signals corrupted by babble and white noise, and their performance is compared with classical mel-cepstral parametrization. The results suggest that these continuous multiresolution entropy related measures provide valuable information to the speech recognition system and could be included as an extra component in the pre-processing stage.
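To convey the flavor of a multiresolution entropy feature, here is a Python sketch that computes a sliding-window Shannon entropy of wavelet detail coefficients at several scales, using the PyWavelets package. The wavelet family, window length and bin count are assumptions for illustration, and the paper's continuous multiresolution entropy and its divergence-based extensions differ in detail:

```python
# Sketch: multiresolution-entropy-style feature -- sliding-window Shannon
# entropy of wavelet detail coefficients at several scales.
import numpy as np
import pywt

def multiresolution_entropy(signal, wavelet="db4", levels=4, win=256, bins=16):
    """Return one entropy-vs-time curve per decomposition level."""
    details = pywt.wavedec(signal, wavelet, level=levels)[1:]   # detail coeffs
    curves = []
    for d in details:
        ent = []
        for start in range(0, len(d) - win + 1, win // 2):      # 50% overlap
            hist, _ = np.histogram(d[start:start + win], bins=bins)
            p = hist / hist.sum()
            ent.append(-np.sum(p[p > 0] * np.log2(p[p > 0])))
        curves.append(np.array(ent))
    return curves

fs = 8000                                                       # assumed sample rate
t = np.arange(4 * fs) / fs
noisy = (np.sin(2 * np.pi * 200 * t)
         + 0.3 * np.random.default_rng(0).standard_normal(t.size))
for lvl, c in enumerate(multiresolution_entropy(noisy), start=1):
    print(f"level {lvl}: mean entropy {c.mean():.3f} bits ({c.size} windows)")
```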

14.
We provide a stochastic extension of the Baez–Fritz–Leinster characterization of the Shannon information loss associated with a measure-preserving function. This recovers the conditional entropy and a closely related information-theoretic measure that we call conditional information loss. Although not functorial, these information measures are semi-functorial, a concept we introduce that is definable in any Markov category. We also introduce the notion of an entropic Bayes' rule for information measures, and we provide a characterization of conditional entropy in terms of this rule.
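The deterministic starting point being extended here can be stated in one line. For a measure-preserving map between finite probability spaces, the Baez–Fritz–Leinster information loss is the entropy drop, which coincides with a conditional entropy; in LaTeX (our rendering, with our symbol K for the loss):

```latex
% Deterministic case: f : (X,p) -> (Y,q) measure-preserving, q = f_{*}p.
K(f) \;=\; H(p) - H(f_{*}p) \;=\; H\!\left(X \mid f(X)\right).
```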

15.
We propose and analyze a new candidate Lyapunov function for relaxation towards general nonequilibrium steady states. The proposed functional is obtained from the large-time asymptotics of time-symmetric fluctuations. For driven Markov jump or diffusion processes it measures an excess in dynamical activity rates. We present numerical evidence, and report on a rigorous argument, for its monotonic time dependence close to the nonequilibrium steady state or, in general, after a long enough time. This contrasts with the behavior of approximate Lyapunov functions based on entropy production, which, when the system is driven far from equilibrium, often keep exhibiting temporal oscillations even close to stationarity.

16.
In classical information theory, among the most important theorems are the coding theorems, which were discussed by calculating the mean entropy and the mean mutual entropy defined through the classical dynamical (Kolmogorov-Sinai) entropy. The quantum dynamical entropy was first studied by Emch [13] and Connes-Stormer [11]. Since then, several approaches for introducing a quantum dynamical entropy have been developed [10, 3, 8, 39, 15, 44, 9, 27, 28, 2, 19, 45]. The efficiency of information transmission for quantum processes has been investigated by using the von Neumann entropy [22] and the Ohya mutual entropy [24]. These entropies were extended to the S-mixing entropy by Ohya [26, 27] in general quantum systems. The mean entropy and the mean mutual entropy for quantum dynamical systems were introduced based on the S-mixing entropy. In this paper, we discuss the efficiency of information transmission by calculating the mean mutual entropy with respect to modulated initial states and the connected channel for quantum dynamical systems.

17.
邢修三 (Xing Xiu-San), Acta Physica Sinica (物理学报), 2014, 63(23): 230201
This paper reviews the author's research results. Over the past decade, the author has extended the existing static statistical information theory to dynamic processes and established a dynamic statistical information theory whose core is the dynamic information evolution equation expressing the law of evolution of dynamic information. Based on the fact that the state-variable probability density evolution equations of dynamical systems obeying stochastic laws (such as stochastic dynamical systems and nonequilibrium statistical physics systems) and of dynamical systems obeying deterministic laws (such as electrodynamic systems) can both be regarded as their information symbol evolution equations, the dynamic information (entropy) evolution equations are derived. They show that, for dynamical systems obeying stochastic laws, the time rate of change of the dynamic information density is caused by its drift, diffusion and dissipation in the state-variable space inside the system and in the coordinate space of the transmission process, whereas the time rate of change of the dynamic information entropy density is caused by its drift, diffusion and production in those same spaces. For dynamical systems obeying deterministic laws, the dynamic information (entropy) evolution equations are the same as the former, except that in the state-variable space inside the system the dynamic information (entropy) density only drifts. Information and entropy are thus combined with the state of the system and its law of change, and information diffusion and information dissipation exist simultaneously. When spatial noise is negligible, an information wave appears. If only the information change inside the system is studied, the dynamic information evolution equation reduces to the information equation corresponding to the dynamical equation expressing the law of change of the dynamical systems above; it can be regarded either as expressing the evolution law of the dynamic information of the dynamical system, or as showing that the laws of change of dynamical systems can all be expressed by information equations. Furthermore, formulas for the drift and diffusion information flows, for the information dissipation rate and the information entropy production rate, and a unified informational expression for the degradation and evolution of dynamical systems are given. Formulas for the dynamic mutual information and the dynamic channel capacity reflecting the dissipation characteristics of information in the transmission process are obtained; in the limiting case where the ratio of channel length to signal transmission speed tends to zero, they reduce to the existing static mutual information and static channel capacity formulas. All these new theoretical formulas and results are derived in a unified way from the dynamic information evolution equation.

18.
We formulate an elementary statistical game which captures the essence of some fundamental quantum experiments such as photon polarization and spin measurement. We explore and compare the significance of the principle of maximum Shannon entropy and the principle of minimum Fisher information in solving such a game. The solution based on the principle of minimum Fisher information coincides with the solution based on an invariance principle, and provides an informational explanation of Malus' law for photon polarization. There is no solution based on the principle of maximum Shannon entropy. The result demonstrates the merits of Fisher information, and the demerits of Shannon entropy, in treating some fundamental quantum problems. It also provides a quantitative example in support of a general philosophy: Nature intends to hide Fisher information, while obeying some simple rules.

19.
20.
We measure the content of random uncorrelated noise in heart rate variability by means of a general noise-level estimation method based on a coarse-grained entropy. We show that usually, except for atrial fibrillation, the level of such noise is within 5-15% of the variance of the data, and that the variability due to linearly correlated processes is dominant in all cases analyzed except atrial fibrillation. The nonlinear deterministic content of heart rate variability remains significant and may not be ignored.
