Similar Articles (20 results)
1.
In classical information theory, among the most important results are the coding theorems, which were discussed by calculating the mean entropy and the mean mutual entropy defined through the classical (Kolmogorov-Sinai) dynamical entropy. The quantum dynamical entropy was first studied by Emch [13] and Connes-Stormer [11]. Since then, several approaches for introducing a quantum dynamical entropy have been proposed [10, 3, 8, 39, 15, 44, 9, 27, 28, 2, 19, 45]. The efficiency of information transmission for quantum processes is investigated by using the von Neumann entropy [22] and the Ohya mutual entropy [24]. These entropies were extended by Ohya [26, 27] to the S-mixing entropy in general quantum systems, and the mean entropy and mean mutual entropy for quantum dynamical systems were introduced based on the S-mixing entropy. In this paper, we discuss the efficiency of information transmission by calculating the mean mutual entropy with respect to the modulated initial states and the connected channel for quantum dynamical systems.

2.
The critical behaviors of entropy correlation effects in the one-dimensional J1-J2 Heisenberg model are studied. It is shown that the mutual information, or correlation entropy, captures the key features of critical fluctuations and can be used to quantify quantum and finite-temperature phase transitions. At the critical point, the mutual information decays as a power law and the entanglement correlation length is infinite, while far away from the critical point the mutual information decays exponentially and the entanglement correlation length is finite. A universal property of the mutual information is also found. Based on the critical behaviors of the mutual information, a new method to quantify the infinite-order phase transition in the system is proposed.

3.
Feedback is proposed for distinguishing between two weak coherent states with phases differing by ∼π. The mutual nonorthogonality of such states gives rise to a discrimination error, which can be reduced by using feedback. An optical quantum channel is discussed in which the input is classical information encoded in two weak coherent states. For a channel with feedback, the discrimination error probability is calculated, and the mutual entropy that quantifies the fidelity between input and output is evaluated. We find that the use of a feedback loop in a quantum communication channel can increase the mutual entropy when the canonical position or the photon number is measured.
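For a sense of the scale of this discrimination error, the textbook bounds for equal-prior discrimination of |α⟩ and |−α⟩ can be sketched as follows. These are standard expressions, not the paper's feedback scheme, and the homodyne convention ⟨x⟩ = √2·α with vacuum variance 1/2 is an assumption made here:

```python
from math import erf, exp, sqrt

def overlap_sq(alpha):
    """Squared overlap |<alpha|-alpha>|^2 = exp(-4|alpha|^2) of the two states."""
    return exp(-4.0 * alpha ** 2)

def helstrom_error(alpha):
    """Helstrom bound: minimum possible error probability, equal priors."""
    return 0.5 * (1.0 - sqrt(1.0 - overlap_sq(alpha)))

def homodyne_error(alpha):
    """Error of a simple sign-threshold homodyne (canonical position)
    measurement, assuming <x> = sqrt(2)*alpha and vacuum variance 1/2."""
    return 0.5 * (1.0 - erf(sqrt(2.0) * alpha))

for a in (0.25, 0.5, 1.0):
    print(a, helstrom_error(a), homodyne_error(a))
```

The gap between the two curves is the room a feedback receiver has to improve on a fixed quadrature measurement.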

4.
This article investigates the behavior of a Moshinsky atom in a 1D harmonic trap. Focus is placed on the theoretical foundations of confinement and its impact on the correlation between particles in the Moshinsky atom. The investigation begins by illustrating the (de)localization of the probability density function using Shannon entropy. The basics of correlation, its interpretation using tools such as mutual information and statistical correlation coefficients, and how these quantities can be computed are discussed. The concept of confinement is then explored, and the impact of interaction strength and confinement on Shannon entropy, statistical correlation coefficients, and mutual information is investigated. It is shown how interaction strength and confinement can be used to induce correlations between previously uncorrelated particles, as well as to suppress correlations between previously correlated particles, and the implications for quantum information processing and quantum simulation are discussed. In conclusion, confinement is a powerful tool for controlling correlations in quantum systems, and its impact on correlation can be understood through theoretical models. The importance of experimental studies in this field, which provide insights into the behavior of quantum systems under confinement and pave the way for future applications in quantum technology, is also emphasized.

5.
Quantum information theory is used to study the dynamics of the two-photon Jaynes-Cummings model with atomic motion. The quantum-mechanical channel describing the evolution of the atomic state in this model is given, the quantum mutual entropy and the atomic reduced entropy are derived, and the influence of the atomic motion and the field-mode structure on the quantum mutual entropy is examined, together with the relation between the "open" and "closed" states of the atomic quantum-mechanical channel and the degree of atom-field entanglement. The results show that the properties of the quantum-mechanical channel depend strongly on the atomic motion, the field-mode structure, and the atom-field entanglement. Keywords: two-photon J-C model; atomic motion; quantum mutual entropy; quantum reduced entropy; quantum-mechanical channel

6.
In recent years we extended Shannon's static statistical information theory to dynamic processes and established a Shannon dynamic statistical information theory, whose core is the evolution law of dynamic entropy and dynamic information; we also proposed a corresponding Boltzmann dynamic statistical information theory. Based on the fact that the state-variable evolution equations of the respective dynamic systems, i.e., the Fokker-Planck equation and the Liouville diffusion equation, can be regarded as their information-symbol evolution equations, we derived the nonlinear evolution equations of the Shannon dynamic entropy density and dynamic information density, and of the Boltzmann dynamic entropy density and dynamic information density, which describe the evolution law of dynamic entropy and dynamic information. The evolution equations of these two kinds of dynamic entropy and dynamic information show in unison that the time rate of change of the dynamic entropy densities is caused by their drift, diffusion, and production in state-variable space inside the systems and in coordinate space in the transmission processes, and that the time rate of change of the dynamic information densities originates from their drift, diffusion, and dissipation in the same spaces. Entropy and information are thereby combined with the state of the systems and its law of motion. Furthermore, we presented formulas for the two kinds of entropy production rates and information dissipation rates, and expressions for the two kinds of drift information flows and diffusion information flows. We proved that the two kinds of information dissipation rates (the decrease rates of the total information) are equal to the corresponding entropy production rates (the increase rates of the total entropy) in the same dynamic system.
We obtained formulas for the two kinds of dynamic mutual information and dynamic channel capacities reflecting the dynamic dissipation characteristics of the transmission processes; these reduce to their maxima, the usual static mutual information and static channel capacity, in the limit where the ratio of channel length to information transmission rate approaches zero. All these unified and rigorous theoretical formulas and results are derived from the evolution equations of dynamic information and dynamic entropy without any extra assumption. In this review, we give an overview of the above main ideas, methods, and results, and discuss the similarity and difference between the two kinds of dynamic statistical information theories.
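Schematically, the drift-diffusion-production/dissipation balance described above can be written as follows (a sketch with assumed notation: s and i are the dynamic entropy and information densities, u a drift velocity, D a diffusion coefficient, σ_s the entropy production density, and σ_i the information dissipation density):

```latex
\frac{\partial s}{\partial t}
  \;=\; -\,\nabla\!\cdot(\mathbf{u}\,s)              % drift
        \;+\; \nabla\!\cdot\!\left(D\,\nabla s\right) % diffusion
        \;+\; \sigma_s ,                              % production
\qquad
\frac{\partial i}{\partial t}
  \;=\; -\,\nabla\!\cdot(\mathbf{u}\,i)
        \;+\; \nabla\!\cdot\!\left(D\,\nabla i\right)
        \;-\; \sigma_i ,
\qquad \sigma_i = \sigma_s .
```

The final equality expresses the review's central claim that the information dissipation rate equals the entropy production rate in the same system.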

7.
The problem of calculating the rate of mutual information between two coarse-grained variables that together specify a continuous-time Markov process is addressed. The main obstacle is that the coarse-grained variables are in general non-Markovian; therefore, an expression for their Shannon entropy rates in terms of the stationary probability distribution is not known. A numerical method to estimate the Shannon entropy rate of continuous-time hidden Markov processes from a single time series is developed. With this method the rate of mutual information can be determined numerically. Moreover, an analytical upper bound on the rate of mutual information is calculated for a class of Markov processes whose transition rates have a bipartite character. Our general results are illustrated with explicit calculations for four-state networks.
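As a baseline for the fully observed case (before any coarse-graining), the Shannon entropy rate of a Markov chain is h = −Σᵢ πᵢ Σⱼ Pᵢⱼ ln Pᵢⱼ. A minimal sketch, using a hypothetical four-state ring in discrete time as a stand-in for the four-state networks mentioned above:

```python
import numpy as np

def markov_entropy_rate(P):
    """Shannon entropy rate h = -sum_i pi_i sum_j P_ij ln P_ij (nats) of an
    irreducible discrete-time Markov chain with transition matrix P."""
    # stationary distribution: left eigenvector of P for eigenvalue 1
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    pi /= pi.sum()
    # 0 * log 0 := 0, handled by masking zero entries
    logP = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    return float(-np.sum(pi[:, None] * P * logP))

# Hypothetical four-state ring (illustrative, not the paper's model).
P = np.array([[0.0, 0.7, 0.0, 0.3],
              [0.3, 0.0, 0.7, 0.0],
              [0.0, 0.3, 0.0, 0.7],
              [0.7, 0.0, 0.3, 0.0]])
h = markov_entropy_rate(P)
print(h)
```

For this doubly stochastic ring the stationary distribution is uniform, so h reduces to the per-row entropy −(0.7 ln 0.7 + 0.3 ln 0.3). The non-Markovian coarse-grained case discussed in the abstract is precisely where this closed form stops being available.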

8.
S.M. Apenko, Physica A 391(1-2), 62-77 (2012)
We present a possible approach to the study of the renormalization group (RG) flow based entirely on information theory. The average information loss under a single step of a Wilsonian RG transformation is evaluated as the conditional entropy of the fast variables, which are integrated out, when the slow ones are held fixed. Its positivity results in a monotonic decrease of the informational entropy under renormalization. This, however, does not necessarily imply irreversibility of the RG flow, because entropy is an extensive quantity and explicitly depends on the total number of degrees of freedom, which is reduced. Only some size-independent additive part of the entropy could possibly provide the required Lyapunov function. We also introduce the mutual information of fast and slow variables as probably a more adequate quantity to represent the changes in the system under renormalization, and evaluate it for some simple systems. It is shown that for certain real-space decimation transformations the positivity of the mutual information directly leads to a monotonic growth of the entropy per lattice site along the RG flow, and hence to its irreversibility.

9.
Information entropy and the related quantity mutual information are used extensively as measures of complexity and to identify nonlinearity in dynamical systems. Expressions for the probability distribution of entropies and mutual informations calculated from finite amounts of data exist in the literature, but these expressions have seldom been used in the field of nonlinear dynamics. In this paper, formulae for estimating the errors on observed information entropies and mutual informations are derived using the standard error analysis familiar to physicists, and their validity is demonstrated by numerical experiment. For illustration, the formulae are then used to evaluate the errors on the time-lagged mutual information of the logistic map.
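A minimal sketch of the illustrative quantity used there, the time-lagged mutual information of the fully chaotic logistic map, with a plain histogram (plug-in) estimator. The bin count and map parameter are choices made here, and no error bars are computed:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in (histogram) estimate of I(X;Y) in bits."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz])))

# Iterate the fully chaotic logistic map x_{n+1} = 4 x_n (1 - x_n).
n = 10_000
x = np.empty(n)
x[0] = 0.4
for i in range(1, n):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

# Time-lagged mutual information I(x_t ; x_{t+tau}): high at short lags and
# decaying as the chaotic stretching erases memory of the initial bin.
lagged_mi = {tau: mutual_information(x[:-tau], x[tau:]) for tau in (1, 2, 5)}
print(lagged_mi)
```

The error formulae the paper derives would attach uncertainties to exactly these plug-in values, which are biased for finite data.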

10.
The order book is a list of all current buy or sell orders for a given financial security. The rise of electronic stock exchanges introduced a debate about the relevance of the information it encapsulates about the activity of traders. Here, we approach this topic from a theoretical perspective, estimating the amount of mutual information between order book layers, i.e., the different buy/sell levels into which orders are aggregated. We show that (i) layers are not independent (in the sense that the mutual information is statistically larger than zero), (ii) the mutual information between layers is small compared to the joint entropy, and (iii) the mutual information between layers increases when comparing the uppermost layers to the deepest layers analyzed (i.e., those further away from the market price). Our findings, and our method for estimating mutual information, are relevant to developing trading strategies that attempt to utilize the information content of the limit order book.
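The statement "statistically larger than zero" can be made operational with a shuffle test; a sketch on synthetic, weakly coupled discrete "layers" (the data, coupling, and state count are placeholders, not order book data):

```python
import numpy as np

def discrete_mi(x, y):
    """Plug-in mutual information (nats) between two discrete sequences."""
    xs, xi = np.unique(x, return_inverse=True)
    ys, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    np.add.at(joint, (xi, yi), 1.0)
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log(joint[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
n = 5000
layer1 = rng.integers(0, 4, size=n)                    # discretized volume states
noise = rng.integers(0, 4, size=n)
layer2 = np.where(rng.random(n) < 0.3, layer1, noise)  # 30% coupling (assumed)

mi_obs = discrete_mi(layer1, layer2)
# Shuffle test: permuting one series destroys dependence, so the null
# distribution reflects only finite-sample bias.
null = [discrete_mi(rng.permutation(layer1), layer2) for _ in range(200)]
p_value = float(np.mean([m >= mi_obs for m in null]))
print(mi_obs, p_value)
```

A small but significant mi_obs relative to the joint entropy mirrors finding (ii) of the abstract.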

11.
Information, relative entropy of entanglement, and irreversibility
Previously proposed measures of entanglement, such as entanglement of formation and entanglement of assistance, are shown to be special cases of the relative entropy of entanglement. The difference between these measures for an ensemble of mixed states is shown to depend on the availability of classical information about particular members of the ensemble. Based on this, relations between the relative entropy of entanglement and mutual information are derived.

12.
Deep learning methods have shown outstanding performance in various fields, and a fundamental question is why they are so effective. Information theory provides a potential answer by interpreting the learning process as the information transmission and compression of data. The information flows can be visualized on the information plane of the mutual information among the input, hidden, and output layers. In this study, we examine how the information flows are shaped by network parameters such as depth, sparsity, weight constraints, and hidden representations. We adopt autoencoders as models of deep learning, because (i) they have clear guidelines for their information flows, and (ii) they come in various species, such as vanilla, sparse, tied, variational, and label autoencoders. We measured their information flows using Rényi's matrix-based α-order entropy functional. As learning progresses, the autoencoders show a typical fitting phase in which the input-to-hidden and hidden-to-output mutual information both increase. In the last stage of learning, however, some autoencoders show a simplifying phase, previously called the "compression phase", in which the input-to-hidden mutual information diminishes. In particular, sparsity regularization of the hidden activities amplifies the simplifying phase, whereas tied, variational, and label autoencoders do not have one. Nevertheless, all autoencoders have similar reconstruction errors for training and test data, so the simplifying phase does not seem to be necessary for the generalization of learning.
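The matrix-based α-order entropy functional mentioned above can be sketched in a few lines: build a Gaussian Gram matrix over the samples, normalize it to unit trace, and treat its eigenvalues like probabilities. This is a minimal version of the Giraldo-Rao-Príncipe style estimator, with kernel width and data as placeholders:

```python
import numpy as np

def gram(Z, sigma=1.0):
    """Gaussian Gram matrix of the sample Z (rows = data points)."""
    sq = np.sum((Z[:, None, :] - Z[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def matrix_alpha_entropy(K, alpha=2.0):
    """Matrix-based Renyi alpha-entropy (bits): normalize K to unit trace,
    then S_a = log2(sum_i lambda_i^alpha) / (1 - alpha) on its eigenvalues."""
    A = K / np.trace(K)
    lam = np.clip(np.linalg.eigvalsh(A), 0.0, None)
    return float(np.log2(np.sum(lam ** alpha)) / (1.0 - alpha))

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 2))
Y = X + 0.1 * rng.normal(size=(80, 2))      # strongly dependent partner

Kx, Ky = gram(X), gram(Y)
# Matrix-based mutual information: I = S(X) + S(Y) - S(X,Y), where the joint
# entropy uses the (normalized) Hadamard product of the two Gram matrices.
mi = (matrix_alpha_entropy(Kx) + matrix_alpha_entropy(Ky)
      - matrix_alpha_entropy(Kx * Ky))
print(mi)
```

In the study, X and Y would be layer activations, and this mi is the quantity plotted on the information plane.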

13.
Using methods from information theory and statistics, combined with biological features, we study the mutual information, "n-word" entropy, and conditional entropy of palindromic sequences in the human Y chromosome, and quantitatively analyze their long-range and short-range correlations. We find that both long-range and short-range correlations exist, and that they are mainly caused by the repeats contained in the sequences. The study shows that the higher the content of repeats, the stronger the correlations between bases.
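The "n-word" entropies referred to here can be computed directly from symbol counts; a toy sketch contrasting a perfectly repetitive sequence with a random one (the sequences are illustrative, not Y-chromosome data):

```python
import random
from collections import Counter
from math import log2

def block_entropy(seq, n):
    """'n-word' (block) entropy H_n of a symbol sequence, in bits."""
    counts = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values())

def conditional_entropy(seq, n):
    """h_n = H_{n+1} - H_n: entropy of the next base given the previous n."""
    return block_entropy(seq, n + 1) - block_entropy(seq, n)

repetitive = "ACGT" * 500                        # pure tandem repeat
random.seed(0)
shuffled = "".join(random.choice("ACGT") for _ in range(2000))

h_rep = conditional_entropy(repetitive, 2)       # ~0: next base is determined
h_rand = conditional_entropy(shuffled, 2)        # ~2 bits: next base is random
print(h_rep, h_rand)
```

The gap between the two values illustrates the abstract's point: a high repeat content pushes the conditional entropy down, i.e., strengthens the correlations between bases.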

14.
Transfer entropy is a measure of the magnitude and direction of information flow between jointly distributed stochastic processes. In recent years, permutation analogues have been considered in the literature, which estimate the transfer entropy by counting occurrences of orderings of values rather than the values themselves. The permutation method has been suggested to be easy to implement, computationally cheap, and robust to noise when applied to real-world time series data. In this paper, we initiate a theoretical treatment of the corresponding rates. In particular, we consider the transfer entropy rate and its permutation analogue, the symbolic transfer entropy rate, and show that they are equal for any bivariate finite-alphabet stationary ergodic Markov process. This result is an illustration of the duality method introduced in [T. Haruna, K. Nakajima, Physica D 240, 1370 (2011)]. We also discuss the relationship among the transfer entropy rate, the time-delayed mutual information rate, and their permutation analogues.
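The symbolic (permutation) estimator discussed above can be sketched as follows: each series is mapped to ordinal patterns, and the transfer entropy is computed over those symbols. A plug-in sketch on synthetic coupled series; the embedding dimension, coupling, and lack of bias correction are simplifications made here:

```python
import numpy as np
from collections import Counter
from math import log2

def ordinal_patterns(x, m):
    """Symbolize a series: the rank order (permutation) of each m-window."""
    return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

def plug_in_entropy(counts, n):
    return -sum(c / n * log2(c / n) for c in counts.values())

def symbolic_transfer_entropy(src, dst, m=3):
    """T_{src->dst} = H(d_{t+1}|d_t) - H(d_{t+1}|d_t, s_t) over ordinal
    symbols d (destination) and s (source); plug-in, no bias correction."""
    s = ordinal_patterns(src, m)
    d = ordinal_patterns(dst, m)
    n = min(len(s), len(d)) - 1
    h_dd = plug_in_entropy(Counter((d[i + 1], d[i]) for i in range(n)), n)
    h_d = plug_in_entropy(Counter(d[i] for i in range(n)), n)
    h_dds = plug_in_entropy(Counter((d[i + 1], d[i], s[i]) for i in range(n)), n)
    h_ds = plug_in_entropy(Counter((d[i], s[i]) for i in range(n)), n)
    return (h_dd - h_d) - (h_dds - h_ds)

rng = np.random.default_rng(2)
x = rng.normal(size=3000)
y = np.roll(x, 1) + 0.1 * rng.normal(size=3000)   # y is driven by x
te_xy = symbolic_transfer_entropy(x, y)
te_yx = symbolic_transfer_entropy(y, x)
print(te_xy, te_yx)
```

The asymmetry te_xy ≫ te_yx recovers the direction of the coupling, which is the feature that distinguishes transfer entropy from (symmetric) mutual information.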

15.
Using standard statistical methods, we discover the existence of correlations among the Hawking radiation quanta (tunneled particles) emitted from a black hole. The information carried by such correlations is quantified by the mutual information between sequential emissions. Through a careful counting of the entropy taken out by the emitted particles, we show that black hole radiation as tunneling is an entropy-conserving process: while information leaks out through the radiation, the total entropy is conserved. Thus, we conclude that the black hole evaporation process is unitary.
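Concretely, in the Parikh-Wilczek tunneling picture underlying this kind of analysis, the correlation between sequential emissions can be sketched as follows (units G = c = ħ = k_B = 1, with −ln Γ read as the entropy carried by an emission; a sketch, not a derivation):

```latex
% tunneling rate of a quantum of energy E from a black hole of mass M
\Gamma(E) \sim e^{\Delta S_{BH}} = \exp\!\big[-8\pi E\,(M - E/2)\big]
% entropy carried by the first quantum
S(E_1) = 8\pi E_1\,(M - E_1/2)
% conditional entropy of a second quantum, given the first was emitted
S(E_2 \mid E_1) = 8\pi E_2\,(M - E_1 - E_2/2)
% mutual information between successive emissions is nonzero
S(E_2 : E_1) = S(E_2) - S(E_2 \mid E_1) = 8\pi E_1 E_2 \neq 0
```

The nonvanishing mutual information 8πE₁E₂ is the quantitative statement that the emissions are correlated, which underlies the entropy-conservation argument of the abstract.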

16.
In this paper, we introduce and investigate the mutual information and relative entropy on a sequential effect algebra, and compare them with their classical counterparts by means of Venn diagrams. Finally, an example shows that the entropies of a sequential effect algebra depend strongly on the order of its sequential product.

17.
Nonequilibrium statistical information theory
Xing Xiu-San, Acta Physica Sinica 53(9), 2852-2863 (2004)
This paper presents a nonequilibrium statistical information theory whose core is the information (entropy) evolution equation expressing the evolution law of information. The nonlinear evolution equation of the Shannon information (entropy) is derived, statistical-physical information is introduced, and its nonlinear evolution equation is derived as well. These two information (entropy) evolution equations consistently show that the time rate of change of the statistical information (entropy) density is caused by its drift, diffusion, and dissipation (production) in coordinate space (and state-variable space). Starting from these equations, concise formulas for the statistical information dissipation rate and the statistical entropy production rate are given, together with expressions for the drift information flow and the diffusion information flow. It is proved that the statistical information dissipation (or increase) rate within a nonequilibrium system equals its statistical entropy production (or decrease) rate, and that information diffusion and information dissipation occur simultaneously. Keywords: statistical information (entropy) evolution equation; statistical information dissipation rate; statistical entropy production rate; information (entropy) flow; information (entropy) diffusion; dynamic mutual information

18.
Deep learning has proven to be an important element of modern data processing technology and has found application in many areas, such as multimodal sensor data processing and understanding, data generation, and anomaly detection. While the use of deep learning is booming in many real-world tasks, the internal process by which it draws its results remains uncertain. Understanding the data processing pathways within a deep neural network is important for transparency and better resource utilisation. In this paper, a method utilising information-theoretic measures is used to reveal the typical learning patterns of convolutional neural networks, which are commonly used for image processing tasks. For this purpose, training samples, true labels, and estimated labels are considered to be random variables, and the mutual information and conditional entropy between these variables are studied. This paper shows that adding convolutional layers to the network improves its learning, but that unnecessarily large numbers of convolutional layers do not improve it any further. The number of convolutional layers that must be added to a neural network to reach the desired learning level can be determined with the help of information-theoretic quantities, including the entropy, inequality, and mutual information among the inputs to the network. The kernel size of the convolutional layers only affects the learning speed of the network. This study also shows that the placement of the dropout layer has no significant effect on learning for networks with a lower dropout rate, while with higher dropout rates it is better placed immediately after the last convolutional layer.

19.
Synchronization, a basic nonlinear phenomenon, is widely observed in diverse complex systems studied in the physical, biological and other natural sciences, as well as in the social sciences, economy and finance. While studying such complex systems, it is important not only to detect synchronized states, but also to identify causal relationships (i.e. who drives whom) between the concerned (sub)systems. Information-theoretic measures (i.e. mutual information, conditional entropy) are essential for the analysis of information flow between two systems or between the constituent subsystems of a complex system. However, the estimation of these measures from a set of finite samples is not trivial. The current extensive literature on entropy and mutual information estimation provides a wide variety of approaches, ranging from statistical approximation methods, which study the rate of convergence or consistency of an estimator for a general distribution, through learning algorithms operating on a partitioned data space, to heuristic approaches. The aim of this paper is to provide a detailed overview of information-theoretic approaches for measuring causal influence in multivariate time series and to survey the diverse approaches to entropy and mutual information estimation.
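Among the estimator families surveyed in such overviews, k-nearest-neighbour estimators are a popular alternative to data-space partitioning. A compact sketch of the Kraskov-Stögbauer-Grassberger (KSG, algorithm 1) estimator for two scalar series, with a small digamma helper to stay dependency-free; the sample data are synthetic:

```python
import numpy as np
from math import log

def digamma(x):
    """Digamma via recurrence plus asymptotic series (accurate enough here)."""
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    f = 1.0 / (x * x)
    return r + log(x) - 0.5 / x - f * (1.0 / 12 - f * (1.0 / 120 - f / 252))

def ksg_mi(x, y, k=4):
    """KSG (algorithm 1) mutual-information estimate in nats for two scalars."""
    n = len(x)
    dx = np.abs(x[:, None] - x[None, :])
    dy = np.abs(y[:, None] - y[None, :])
    dz = np.maximum(dx, dy)               # Chebyshev metric in the joint space
    np.fill_diagonal(dz, np.inf)
    eps = np.sort(dz, axis=1)[:, k - 1]   # distance to the k-th neighbour
    nx = np.sum(dx < eps[:, None], axis=1) - 1   # marginal counts, minus self
    ny = np.sum(dy < eps[:, None], axis=1) - 1
    psi_avg = np.mean([digamma(a + 1) + digamma(b + 1) for a, b in zip(nx, ny)])
    return digamma(k) + digamma(n) - psi_avg

rng = np.random.default_rng(3)
x = rng.normal(size=1000)
y = x + rng.normal(size=1000)   # jointly Gaussian, correlation 1/sqrt(2)
mi = ksg_mi(x, y)               # Gaussian theory: -0.5*ln(1 - 0.5) ≈ 0.347 nats
print(mi)
```

Unlike histogram estimators, this one adapts its local scale to the data density, which is one reason it is favoured for the short, non-Gaussian time series discussed in the abstract.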

20.
Lithosphere-ionosphere non-linear interactions create a complex system in which links between different phenomena can remain hidden. The statistical correlation between strong West Pacific earthquakes and high-energy electron bursts escaping trapped conditions was demonstrated in past works; here, it is investigated from the point of view of information. Starting from the conditional-probability statistical model deduced from the correlation, the Shannon entropy, the joint entropy, and the conditional entropy are calculated. Time-delayed mutual information and transfer entropy are also calculated analytically for binary events, including correlations between consecutive earthquake events and between consecutive earthquakes and electron bursts. These quantities are evaluated for the complex lithosphere-ionosphere dynamical system, although the expressions obtained from the probabilities are valid for any pair of binary events. Peaks occur at the same time delay as in the correlations, Δt = 1.5-3.5 h, and, for the transfer entropy, also at a new time delay, Δt = −58.5 to −56.5 h; the latter is linked to earthquake self-correlations. Even though the low number of self-correlated earthquakes makes this second peak insignificant in this case, it is of interest to separate the non-linear contribution of the transfer entropy of binary events in the study of a complex system.
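For binary events, the mutual information follows analytically from the occurrence probabilities and one conditional probability; a sketch with hypothetical per-time-bin numbers (the probabilities below are illustrative, not the paper's fitted values):

```python
from math import log2

def binary_mutual_information(p_a, p_b, p_b_given_a):
    """I(A;B) in bits for two binary events, from P(A=1), P(B=1), P(B=1|A=1).
    Assumes the three inputs define a valid joint distribution."""
    p11 = p_a * p_b_given_a
    p10 = p_a - p11
    p01 = p_b - p11
    p00 = 1.0 - p11 - p10 - p01
    mi = 0.0
    for pab, pa, pb in ((p11, p_a, p_b), (p10, p_a, 1 - p_b),
                        (p01, 1 - p_a, p_b), (p00, 1 - p_a, 1 - p_b)):
        if pab > 0:
            mi += pab * log2(pab / (pa * pb))
    return mi

# A = earthquake in a time bin, B = electron burst a fixed delay later.
mi_dep = binary_mutual_information(0.02, 0.05, 0.30)   # bursts enhanced after EQs
mi_ind = binary_mutual_information(0.02, 0.05, 0.05)   # independent case
print(mi_dep, mi_ind)
```

Scanning the delay between the two binary series and evaluating this expression at each lag reproduces the kind of time-delayed mutual information curve whose peaks are reported in the abstract.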


Copyright©北京勤云科技发展有限公司  京ICP备09084417号