Macroscopic Time Evolution and MaxEnt Inference for Closed Systems with Hamiltonian Dynamics
Authors: Domagoj Kuić, Paško Županović, Davor Juretić
Institution: (1) Faculty of Science, University of Split, N. Tesle 12, 21000 Split, Croatia
Abstract: The MaxEnt inference algorithm and information theory are relevant for the time evolution of macroscopic systems treated as a problem of incomplete information. Two different MaxEnt approaches are introduced in this work, both applied to the prediction of time evolution for closed Hamiltonian systems. The first is based on the Liouville equation for the conditional probability distribution, introduced as a strict microscopic constraint on time evolution in phase space. The conditional probability distribution is defined on the set of microstates associated with the set of phase space paths determined by solutions of Hamilton's equations. The MaxEnt inference algorithm, with Shannon's concept of conditional information entropy, is then applied to prediction consistently with this strict microscopic constraint on time evolution in phase space. The second approach is based on the same concepts, with the difference that the Liouville equation for the conditional probability distribution is introduced as a macroscopic constraint given by a phase space average. We consider the incomplete nature of our information about microscopic dynamics in a rational way that is consistent with Jaynes' formulation of predictive statistical mechanics and with the concept of macroscopic reproducibility for time-dependent processes. Maximization of the conditional information entropy subject to this macroscopic constraint leads to a loss of correlation between the initial phase space paths and the final microstates. The information entropy is the theoretical upper bound on the conditional information entropy, and the bound is attained only in the case of complete loss of correlation. In this alternative approach to the prediction of macroscopic time evolution, maximization of the conditional information entropy is equivalent to the loss of statistical correlation and leads to a corresponding loss of information. In accordance with Jaynes' original idea, irreversibility appears as a consequence of the gradual loss of information about possible microstates of the system.
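The abstract invokes the standard MaxEnt formalism (maximization of Shannon's conditional information entropy under constraints) together with the Liouville equation. The following is a minimal LaTeX sketch of these well-known ingredients only; the discrete labelling i of final microstates and j of initial phase space paths, and the constraint function F, are illustrative notation and are not taken from the paper itself.

\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Liouville equation for the phase space density rho(q,p,t) under a
% Hamiltonian H (standard form, not specific to this paper):
\begin{equation}
  \frac{\partial \rho}{\partial t} = \{H, \rho\}
  = \sum_k \left( \frac{\partial H}{\partial q_k}\frac{\partial \rho}{\partial p_k}
                - \frac{\partial H}{\partial p_k}\frac{\partial \rho}{\partial q_k} \right).
\end{equation}

% Shannon conditional information entropy of final microstates i given
% initial paths j, with joint probabilities p_{ij} = p_j p_{i|j}:
\begin{equation}
  S(X \mid Y) = -\sum_j p_j \sum_i p_{i\mid j} \log p_{i\mid j}.
\end{equation}

% Generic MaxEnt step: maximizing S(X|Y) subject to normalization and a
% macroscopic constraint <F> = f (via a Lagrange multiplier lambda) gives
% the usual exponential form with partition function Z_j:
\begin{equation}
  p_{i\mid j} = \frac{1}{Z_j(\lambda)} \exp\!\bigl(-\lambda F_{ij}\bigr),
  \qquad
  Z_j(\lambda) = \sum_i \exp\!\bigl(-\lambda F_{ij}\bigr).
\end{equation}

% Upper bound referred to in the abstract: the conditional entropy never
% exceeds the information entropy, with equality iff X and Y are uncorrelated:
\begin{equation}
  S(X \mid Y) \le S(X), \qquad S(X \mid Y) = S(X) \iff p_{ij} = p_i\, p_j .
\end{equation}

\end{document}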
This article is indexed in SpringerLink and other databases.