Similar Literature
20 similar documents found.
1.
In this paper, we develop the dynamic statistical information theory established by the author. Starting from the idea that the state variable evolution equations of stochastic dynamic systems, classical and quantum nonequilibrium statistical physical systems, and special electromagnetic field systems can be regarded as their information symbol evolution equations, and from the definitions of dynamic information and dynamic entropy, we derive the evolution equations of dynamic information and dynamic entropy that des...

2.
Where do entangled multiple-qubit systems store information? For information injected into a qubit, this question is nontrivial and interesting, since entanglement delocalizes the information. So far, a common picture is that of a qubit and its purification partner sharing the information quantum mechanically. Here, we introduce a new picture of a single qubit in the correlation space, referred to as a quantum information capsule (QIC), confining the information perfectly. This picture is applicable to an entangled multiple-qubit system in an arbitrary state. Unlike the partner picture, in the QIC picture the whole information can be retrieved out of the system by swapping the single-body state while leaving the other subsystems untouched. After the swapping process, no information remains in the system.

3.
Nonequilibrium statistical information theory
邢修三 《物理学报》2004,53(9):2852-2863
This paper expounds a nonequilibrium statistical information theory whose core is the information (entropy) evolution equation expressing the law of information evolution. The nonlinear evolution equation of Shannon information (entropy) is derived, statistical physical information is introduced, and its nonlinear evolution equation is derived as well. These two information (entropy) evolution equations consistently show that the time rate of change of the statistical information (entropy) density is caused by its drift, diffusion and loss (production) in coordinate space (and state-variable space). Starting from these equations, concise formulas for the statistical information loss rate and the statistical entropy production rate, and expressions for the drift and diffusion information flows, are given; it is proved that the statistical information loss (or increase) rate within a nonequilibrium system equals its statistical entropy production (or decrease) rate, and that information diffusion and information loss coexist. Keywords: statistical information (entropy) evolution equation; statistical information loss rate; statistical entropy production rate; information (entropy) flow; information (entropy) diffusion; dynamic mutual information

4.
邢修三 《物理学报》2014,63(23):230201-230201
This paper reviews the author's research results. Over the past decade, the author has extended the existing static statistical information theory to dynamic processes and established a dynamic statistical information theory whose core is the dynamic information evolution equation describing the law of dynamic information evolution. Based on the fact that the state-variable probability density evolution equations of dynamical systems governed by stochastic laws (such as stochastic dynamical systems and nonequilibrium statistical physical systems) and of dynamical systems governed by deterministic laws (such as electrodynamic systems) can both be regarded as their information symbol evolution equations, the dynamic information (entropy) evolution equations are derived. They show that, for dynamical systems governed by stochastic laws, the time rate of change of the dynamic information density is caused by its drift, diffusion and dissipation in the state-variable space inside the system and in the coordinate space of the transmission process, while the time rate of change of the dynamic information entropy density is caused by its drift, diffusion and production in those same spaces. For dynamical systems governed by deterministic laws, the dynamic information (entropy) evolution equations are the same as the former except that, in the state-variable space inside the system, the dynamic information (entropy) density undergoes drift only. Information and entropy are thus combined with the state of the system and its law of change, and information diffusion and information dissipation coexist. When spatial noise is negligible, information waves appear. If only the information change inside the system is studied, the dynamic information evolution equation reduces to an information equation corresponding to the dynamical equation describing the law of change of the above dynamical systems; it can be regarded either as describing the evolution law of the system's dynamic information, or as showing that the law of change of a dynamical system can itself be expressed by an information equation. Furthermore, formulas for the drift and diffusion information flows, the information dissipation rate and the information entropy production rate are given, together with a unified information expression for the degradation and evolution of dynamical systems. Dynamic mutual information and dynamic channel capacity formulas reflecting the dissipative character of information transmission are obtained; in the limit where the ratio of channel length to signal transmission speed tends to zero, they reduce to the existing static mutual information and static channel capacity formulas. All these new theoretical formulas and results are derived in a unified way from the dynamic information evolution equations.

5.
杨振寰 《应用光学》2016,37(6):789-795
Light is not only the main source of energy sustaining life, but also an effective carrier of information. This paper describes the close relationship between light as a carrier and information, and discusses the interpretation of information given by the Heisenberg uncertainty principle: every bit of information corresponds to a light quantum, is confined to its information cell, and is associated with a consumption of entropy. Examples are given that satisfy the uncertainty-principle condition and that exceed this limit, showing that time and frequency resolution cannot be observed simultaneously, while imaging can still be obtained within the constraint of the certainty condition.
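A brief sketch of the standard relations this abstract alludes to; the symbols (Δν, Δt, h, k, T) and the per-bit entropy bound follow the usual textbook conventions and are not quantities quoted from the paper itself:

```latex
% Time-bandwidth uncertainty, quantum energy per information cell, and
% the minimum entropy (and energy) cost associated with one bit of
% information at temperature T (standard relations, a sketch only).
\Delta\nu \,\Delta t \;\gtrsim\; 1, \qquad
\Delta E \,\Delta t \;\gtrsim\; h \quad (\Delta E = h\,\Delta\nu), \qquad
\Delta S_{\text{per bit}} \;\ge\; k\ln 2
\;\;\Rightarrow\;\;
\Delta E_{\text{per bit}} \;\ge\; k\,T\ln 2 .
```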

6.
Measuring information transfer
An information theoretic measure is derived that quantifies the statistical coherence between systems evolving in time. The standard time delayed mutual information fails to distinguish information that is actually exchanged from shared information due to common history and input signals. In our new approach, these influences are excluded by appropriate conditioning of transition probabilities. The resulting transfer entropy is able to distinguish effectively driving and responding elements and to detect asymmetry in the interaction of subsystems.
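As a rough illustration of the quantity described above, the sketch below estimates a plug-in transfer entropy T(Y→X) for two short symbolic sequences; the function name, the toy data and the histogram estimator are assumptions chosen for illustration, not the authors' implementation.

```python
# A minimal sketch (assumed, not the paper's code): plug-in estimate of
# transfer entropy T_{Y->X} = sum p(x1,x0,y0) * log2[ p(x1|x0,y0) / p(x1|x0) ]
# for two already-discretized time series of equal length.
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Estimate T_{Y->X}: how much knowing y_n improves prediction of x_{n+1} beyond x_n."""
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # counts of (x_{n+1}, x_n, y_n)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))         # counts of (x_n, y_n)
    pairs_xx = Counter(zip(x[1:], x[:-1]))          # counts of (x_{n+1}, x_n)
    singles = Counter(x[:-1])                       # counts of x_n
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_x1_given_x0y0 = c / pairs_xy[(x0, y0)]
        p_x1_given_x0 = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * log2(p_x1_given_x0y0 / p_x1_given_x0)
    return te

# Toy usage: x copies y with a one-step delay, so information flows Y -> X.
y = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0]
x = [0] + y[:-1]
print(transfer_entropy(x, y))   # T_{Y->X}: large
print(transfer_entropy(y, x))   # T_{X->Y}: small (finite-sample bias aside)
```

With finite data the plug-in estimate is biased upward, so in practice longer series, bias corrections or surrogate tests are used.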

7.
According to local realism, nonlocal information is necessary for a violation of Bell's inequality. From a theoretical point of view, this nonlocal information is essentially the mutual information about the distant outcome and the distant measurement setting. In this work we prove that, if the measurement choice is free and unbiased, the mutual information about the distant outcome and that about the distant setting are both necessary for a violation of Bell's inequality in the case of unbiased marginal probabilities. In the case of biased marginal probabilities, we point out that the mutual information about the distant outcome ceases to be necessary for a violation of Bell's inequality, while the mutual information about the distant measurement setting is still required. We also prove that the mutual information about the distant measurement setting must be contained in the transmitted messages, owing to the freedom of measurement choices. Finally, we point out that the mutual information about both the distant outcome and the distant measurement setting is necessary for a violation of information causality.
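For reference, the CHSH form of Bell's inequality that the abstract refers to, written in the standard notation (settings a, a′ and b, b′; correlators E); this is textbook material rather than an equation quoted from the paper:

```latex
% CHSH combination of correlators; local realism bounds |S| by 2,
% quantum mechanics by 2*sqrt(2) (Tsirelson's bound).
S \;=\; E(a,b) + E(a,b') + E(a',b) - E(a',b'),
\qquad
|S| \;\le\; 2 \;\;\text{(local realism)},
\qquad
|S| \;\le\; 2\sqrt{2} \;\;\text{(quantum)}.
```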

8.
In recent years we extended Shannon static statistical information theory to dynamic processes and established a Shannon dynamic statistical information theory, whose core is the evolution law of dynamic entropy and dynamic information. We also proposed a corresponding Boltzmann dynamic statistical information theory. Based on the fact that the state variable evolution equations of the respective dynamic systems, i.e. the Fokker-Planck equation and the Liouville diffusion equation, can be regarded as their information symbol evolution equations, we derived the nonlinear evolution equations of Shannon dynamic entropy density and dynamic information density, and the nonlinear evolution equations of Boltzmann dynamic entropy density and dynamic information density, which describe respectively the evolution laws of dynamic entropy and dynamic information. The evolution equations of these two kinds of dynamic entropy and dynamic information show in unison that the time rate of change of the dynamic entropy densities is caused by their drift, diffusion and production in the state variable space inside the systems and in the coordinate space of the transmission processes, and that the time rate of change of the dynamic information densities originates from their drift, diffusion and dissipation in the state variable space inside the systems and in the coordinate space of the transmission processes. Entropy and information are thus combined with the state of the systems and its law of motion. Furthermore, we presented formulas for the two kinds of entropy production rates and information dissipation rates, and expressions for the two kinds of drift information flows and diffusion information flows. We proved that the two kinds of information dissipation rates (or decrease rates of the total information) are equal to their corresponding entropy production rates (or increase rates of the total entropy) in the same dynamic system. We obtained formulas for the two kinds of dynamic mutual information and dynamic channel capacity reflecting the dynamic dissipation characteristics of the transmission processes, which reduce to their maxima, the present static mutual information and static channel capacity, in the limit where the ratio of channel length to information transmission rate approaches zero. All these unified and rigorous theoretical formulas and results are derived from the evolution equations of dynamic information and dynamic entropy without any extra assumption. In this review, we give an overview of the above main ideas, methods and results, and discuss the similarities and differences between the two kinds of dynamic statistical information theory.
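A schematic continuity-equation form consistent with the verbal description above; the drift velocity v, diffusion coefficient D and source terms σ are placeholders chosen for illustration, not the paper's actual equations, and the static channel capacity recovered in the zero-delay limit is written as the usual Shannon maximum:

```latex
% Sketch only: i = dynamic information density, s = dynamic entropy
% density; the gradients run over the state-variable space inside the
% system and the coordinate space of the transmission process.
\frac{\partial i}{\partial t}
 \;=\; -\,\nabla\!\cdot(\mathbf{v}\, i) \;+\; \nabla\!\cdot(D\,\nabla i) \;-\; \sigma_{\mathrm{dis}},
\qquad
\frac{\partial s}{\partial t}
 \;=\; -\,\nabla\!\cdot(\mathbf{v}\, s) \;+\; \nabla\!\cdot(D\,\nabla s) \;+\; \sigma_{\mathrm{prod}},
\qquad
C_{\mathrm{static}} \;=\; \max_{p(x)} I(X;Y).
```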

9.
Measurement can drive quantum dynamics, for example in ancilla driven quantum computation where unitary evolution is generated by measurements that extract no information. Where a measurement does reveal some information about the system, it may sometimes be possible to “unlearn” this information and restore unitary evolution through subsequent measurements. Here we analyse two methods of quantum “unlearning” and present a simplified proof of the bound on the probability of successfully applying the required correction operators. The probability of successful recovery is inversely related to the ability of the initial measurement to exclude the possibility of a state. As a consequence there exist unrecoverable measurements that provide little information gain.  相似文献   

10.

Background

To understand the functioning of distributed networks such as the brain, it is important to characterize their ability to integrate information. The paper considers a measure based on effective information, a quantity capturing all causal interactions that can occur between two parts of a system.

Results

The capacity to integrate information, or Φ, is given by the minimum amount of effective information that can be exchanged between two complementary parts of a subset. It is shown that this measure can be used to identify the subsets of a system that can integrate information, or complexes. The analysis is applied to idealized neural systems that differ in the organization of their connections. The results indicate that Φ is maximized by having each element develop a different connection pattern with the rest of the complex (functional specialization) while ensuring that a large amount of information can be exchanged across any bipartition of the network (functional integration).

Conclusion

Based on this analysis, the connectional organization of certain neural architectures, such as the thalamocortical system, is well suited to information integration, while that of others, such as the cerebellum, is not, with significant functional consequences. The proposed analysis of information integration should be applicable to other systems and networks.
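As a toy numerical illustration of taking a minimum over bipartitions, the sketch below computes plain mutual information across every bipartition of a small joint distribution and returns the minimum. This is only a rough stand-in: the paper's effective information additionally involves perturbing one part with maximum-entropy inputs, which is not modelled here, and the distribution and function names are invented for the example.

```python
# Toy proxy for the bipartition minimum behind Phi (not the paper's exact
# measure): minimum mutual information over all bipartitions of a small
# set of binary units with a given joint distribution.
from itertools import product, combinations
from math import log2

def marginal(p_joint, keep):
    """Marginalize a dict {(b0,...,bn): prob} onto the index tuple `keep`."""
    out = {}
    for state, p in p_joint.items():
        key = tuple(state[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

def mutual_information(p_joint, part_a, part_b):
    pa, pb = marginal(p_joint, part_a), marginal(p_joint, part_b)
    pab = marginal(p_joint, part_a + part_b)
    return sum(p * log2(p / (pa[k[:len(part_a)]] * pb[k[len(part_a):]]))
               for k, p in pab.items() if p > 0)

def phi_proxy(p_joint, n_units):
    units = range(n_units)
    best = None
    for size in range(1, n_units // 2 + 1):
        for part_a in combinations(units, size):
            part_b = tuple(i for i in units if i not in part_a)
            mi = mutual_information(p_joint, part_a, part_b)
            best = mi if best is None else min(best, mi)
    return best

# Assumed toy joint distribution over 3 correlated binary units.
states = list(product([0, 1], repeat=3))
weights = [4 if s[0] == s[1] == s[2] else 1 for s in states]
z = sum(weights)
p = {s: w / z for s, w in zip(states, weights)}
print(phi_proxy(p, 3))
```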

11.
After a brief introduction to information theory, we review the close relationship between the theory of spin glasses and information processing, error-correcting codes in particular. An interesting equivalence between the solvability condition of the spin glass problem and the optimal inference condition in information theory is pointed out.

12.
Tolman's paradox forbidding time-reversed information transmission is nonexistent and rests only on our ingrained thought processes involving hidden, unnecessary assumptions. When the assumption of a passive channel is removed, the paradox cannot be derived and information can flow intermittently or nondeterministically from the future over a simple computer with at least one independent decision-making component.

13.
Following recent studies concerning the use of information theory in electronic structure theory of atomic and molecular systems, an analytical relationship between Onicescu information energy and densities of Shannon entropy and the two forms of the Fisher information has been presented. The established proof must be viewed in the light of the exponentially decaying nature of the asymptotic density of atoms and molecules.
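For orientation, the usual density-functional definitions of the three quantities named in the abstract (Onicescu information energy, Shannon entropy and Fisher information of a normalized density ρ(r)); the notation follows common conventions and may differ from the paper's:

```latex
% Standard definitions (a sketch of the usual conventions).
E[\rho] \;=\; \int \rho^{2}(\mathbf{r})\, d\mathbf{r}, \qquad
S[\rho] \;=\; -\int \rho(\mathbf{r})\,\ln\rho(\mathbf{r})\, d\mathbf{r}, \qquad
I[\rho] \;=\; \int \frac{|\nabla\rho(\mathbf{r})|^{2}}{\rho(\mathbf{r})}\, d\mathbf{r}.
```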

14.
Haken recently applied the slaving principle to decompose expressions for information, information gain and efficiency of self-organizing systems into two parts. The first part in each case depends on the order parameter, while the second depends on the slaved modes. Close to the instability points the latter contribution can be ignored. In this paper we elaborate on the formulas given by Haken and find exact expressions that are analytic over the entire domain of control-parameter values. We also find their asymptotic limits in the immediate proximity of the instability point and far away from it. Our results provide a direct way of relating information, information gain and efficiency of nonequilibrium critical systems to their control parameters.

15.
阮秋琦 《物理》2005,34(03):205-213
Images are an important medium for human information transmission, and image information processing is an important research branch of information science. Image processing techniques, which originated in the 1920s, have become an object of study and research across disciplines in engineering, computer science, information science, statistics, physics, chemistry, biology, medicine and even the social sciences. This article comprehensively reviews the origin of image processing theory and technology, the content it covers, its significance to humanity, and its development trends, so that readers may gain a comprehensive understanding of image information processing.

16.
F. Pennini  A. Plastino 《Physica A》2007,383(2):782-796
Escort distributions are a well-established but (for physicists) relatively new concept that is rapidly gaining wide acceptance. In this work we revisit the concept within the strictures of the celebrated semiclassical Husimi distributions (HDs) and thereby investigate the possibility of extracting new semiclassical information contained not in the HDs themselves but in their associated escort Husimi distributions. We also establish relations, for various information measures, between their deformed versions [J. Naudts, Physica A 316 (2002) 323] and those built up with escort HDs. Bounds on the concomitant power exponents are determined.
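A minimal sketch of the escort construction itself for a discrete distribution; the function name and example probabilities are assumptions for illustration, and the paper works instead with continuous Husimi distributions, for which the sum becomes a phase-space integral.

```python
# Order-q escort of a discrete distribution p: P_q(i) = p_i**q / sum_j p_j**q.
def escort(p, q):
    w = [pi ** q for pi in p]
    z = sum(w)
    return [wi / z for wi in w]

p = [0.7, 0.2, 0.1]
print(escort(p, 2.0))   # q > 1 accentuates the dominant probabilities
print(escort(p, 0.5))   # q < 1 flattens the distribution
```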

17.
阮善明  安宇森  李理 《物理》2020,49(12):797-805
The black hole information paradox, one of the most famous problems in theoretical physics, has long been regarded as an important route to studying quantum gravity. A central issue of the paradox is to obtain the Page-curve behavior during black hole evaporation. In the past year, research on this problem has seen a breakthrough: for the first time, the Page curve has been computed within the framework of semiclassical gravity, showing that information can be released during black hole evaporation and that there is no information loss problem. Following the order of historical development, this article introduces this important problem and the latest progress, covering Hawking radiation, the Page curve, the holographic principle, generalized entropy, quantum extremal surfaces, and quantum extremal islands.
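For reference, the commonly used formulas behind the key terms in this abstract, written in the standard conventions of the recent literature rather than the article's own notation; the symbols (G_N, the island I, the radiation region R) are the usual ones:

```latex
% Generalized entropy of a quantum extremal surface, and the island
% prescription whose minimization over extremal islands I reproduces
% the Page curve for the radiation R (a sketch of standard conventions).
S_{\mathrm{gen}} \;=\; \frac{\operatorname{Area}(\partial I)}{4G_{N}} \;+\; S_{\mathrm{matter}},
\qquad
S(R) \;=\; \min_{I}\,\operatorname{ext}_{I}
\left[\frac{\operatorname{Area}(\partial I)}{4G_{N}} \;+\; S_{\mathrm{semicl}}(R\cup I)\right].
```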

18.
Information processing with light is ubiquitous, from communication, metrology and imaging to computing. When we consider light as a quantum mechanical object, new ways of information processing become possible. In this review I give an overview of how quantum information processing can be implemented with single photons, and what hurdles still need to be overcome to implement the various applications in practice. I will place special emphasis on the quantum mechanical properties of light that make it different from classical light, and how these properties relate to quantum information processing tasks.

19.
We obtain the mutual information of Ising systems, which shows singular behavior near the critical point. We connect the mutual information with the magnetization and the correlation function. The mutual information is a suitable measure for the critical behavior of Ising systems.
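A toy two-spin illustration of how mutual information connects to magnetization and correlation, as the abstract describes; the parametrization of the joint distribution and the example values are assumptions for illustration, not the paper's lattice calculation.

```python
# Two spins s1, s2 = +/-1 with equal magnetization m and correlation
# C = <s1 s2>; joint distribution p(s1,s2) = (1 + m*s1 + m*s2 + C*s1*s2)/4
# (valid for parameter choices that keep all four probabilities >= 0).
from math import log2

def two_spin_mutual_information(m, C):
    spins = (+1, -1)
    joint = {(s1, s2): (1 + m*s1 + m*s2 + C*s1*s2) / 4
             for s1 in spins for s2 in spins}
    p1 = {s1: sum(joint[(s1, s2)] for s2 in spins) for s1 in spins}
    p2 = {s2: sum(joint[(s1, s2)] for s1 in spins) for s2 in spins}
    return sum(p * log2(p / (p1[s1] * p2[s2]))
               for (s1, s2), p in joint.items() if p > 0)

print(two_spin_mutual_information(m=0.0, C=0.5))  # correlation without magnetization
print(two_spin_mutual_information(m=0.3, C=0.3))
```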

20.
C G Chakrabarti 《Pramana》1983,20(1):65-72
The interrelation between some basic concepts of information theory and thermodynamics has been studied on the basis of the statistical measures of entropy and information. The problems studied are the negentropy principle of information, a minimax information principle of thermal equilibrium, and the role of information correlation in the derivation of a functional equation characterising the statistical equilibrium of a system.
