Similar Literature
20 similar documents found.
1.
The maximum entropy principle consists of two steps: The first step is to find the distribution which maximizes entropy under given constraints. The second step is to calculate the corresponding thermodynamic quantities. The second step is determined by the relation of the Lagrange multipliers to measurable physical quantities such as temperature or the Helmholtz free energy/free entropy. We show that for a given MaxEnt distribution, a whole class of entropies and constraints leads to the same distribution but generally to different thermodynamics. Two simple classes of transformations that preserve the MaxEnt distributions are studied: The first case is a transform of the entropy to an arbitrary increasing function of that entropy. The second case is the transform of the energetic constraint to a combination of the normalization and energetic constraints. We derive group transformations of the Lagrange multipliers corresponding to these transformations and determine their connections to thermodynamic quantities. For each case, we provide a simple example of the transformation.
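As a minimal illustration of the first step described above (finding the MaxEnt distribution under a mean-energy constraint) and of how a Lagrange multiplier is pinned down by a measurable quantity, the following Python sketch solves for β numerically; the energy levels and target mean energy are hypothetical and not taken from the paper.

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative discrete energy levels and a target mean energy (hypothetical values).
E = np.array([0.0, 1.0, 2.0, 3.0])
U_target = 1.2

def mean_energy(beta):
    """Mean energy of the MaxEnt (Boltzmann) distribution p_i ∝ exp(-beta * E_i)."""
    w = np.exp(-beta * E)
    p = w / w.sum()
    return np.dot(p, E)

# Fix the Lagrange multiplier beta by solving <E>(beta) = U_target.
beta = brentq(lambda b: mean_energy(b) - U_target, -50.0, 50.0)
p = np.exp(-beta * E); p /= p.sum()
entropy = -np.dot(p, np.log(p))
print(f"beta = {beta:.4f}, entropy S = {entropy:.4f}")
```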

2.
We revisit the concavity property of the thermodynamic entropy in order to formulate a general proof of the minimum energy principle, as well as of other equivalent extremum principles that are valid for thermodynamic potentials and the corresponding Massieu functions under different constraints. The current derivation aims at providing a coherent formal framework for such principles, which may also be pedagogically useful as it fully exploits and highlights the equivalence between different schemes. We also elucidate the consequences of the extremum principles for the general shape of thermodynamic potentials in relation to first-order phase transitions.
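For orientation, the equivalent extremum principles discussed above can be summarized in their textbook form (standard statements, not the paper's derivation):

```latex
% Concavity of S(U,V,N) implies, at equilibrium:
\begin{aligned}
  S &\to \max &&\text{at fixed } U, V, N, \\
  U &\to \min &&\text{at fixed } S, V, N, \\
  F &= U - TS \to \min &&\text{at fixed } T, V, N, \\
  \Phi &= S - U/T \to \max &&\text{at fixed } 1/T, V, N \quad (\text{Massieu function}).
\end{aligned}
```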

3.
A new method to obtain a series of reduced dynamics at various stages of coarse-graining is proposed. These range from the most coarse-grained description, which agrees with the deterministic time-evolution equation for averages of the relevant variables, to the least coarse-grained one, which is the generalized Fokker-Planck equation for the probability distribution function of the relevant variables. The method is based on an extension of the Kawasaki-Gunton operator with the help of the principle of maximum entropy.

4.
Quantum physics, despite its intrinsically probabilistic nature, lacks a definition of entropy that fully accounts for the randomness of a quantum state. For example, von Neumann entropy quantifies only the incomplete specification of a quantum state and does not quantify the probabilistic distribution of its observables; it trivially vanishes for pure quantum states. We propose a quantum entropy that quantifies the randomness of a pure quantum state via a conjugate pair of observables/operators forming the quantum phase space. The entropy is dimensionless, it is a relativistic scalar, it is invariant under canonical transformations and under CPT transformations, and its minimum is established by the entropic uncertainty principle. We extend the entropy to also include mixed states. We show that the entropy is monotonically increasing during the time evolution of coherent states under a Dirac Hamiltonian. However, in a mathematical scenario, when two fermions come closer to each other, each evolving as a coherent state, the total system's entropy oscillates due to the increasing spatial entanglement. We hypothesize an entropy law governing physical systems whereby the entropy of a closed system never decreases, implying a time arrow for particle physics. We then explore the possibility that, since such entropy oscillations would be barred by this law, potential entropy oscillations trigger the annihilation and creation of particles.
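As a generic illustration of the entropic uncertainty bound that sets the entropy's minimum (not the paper's specific phase-space construction), the following Python sketch checks the position/momentum entropy sum for a minimum-uncertainty Gaussian state; the units (ħ = 1) and spread value are arbitrary choices.

```python
import numpy as np

hbar = 1.0
sigma_x = 0.7                      # position spread of a Gaussian state (arbitrary)
sigma_p = hbar / (2 * sigma_x)     # minimum-uncertainty momentum spread

# Differential entropies of the Gaussian position and momentum densities.
S_x = 0.5 * np.log(2 * np.pi * np.e * sigma_x**2)
S_p = 0.5 * np.log(2 * np.pi * np.e * sigma_p**2)

bound = np.log(np.pi * np.e * hbar)    # Bialynicki-Birula--Mycielski bound
print(S_x + S_p, ">=", bound)          # equality holds for minimum-uncertainty Gaussians
```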

5.
Against the "dual carbon" background, demand for nickel resources is rising on the back of new energy vehicles and the shift toward high-nickel batteries. In recent years, geopolitical conflicts and emergencies, together with speculation and manipulation by international capital exploiting its advantages and the rules, have exposed the world to price and supply-security risks to a certain extent. Therefore, to obtain the most objective trade redistribution strategy, this paper first constructs the nickel material trade network and identifies the core trading countries and the main trade relations, finding that the flow of nickel material occurs mainly between a few countries. On this basis, a trade redistribution model is constructed based on the maximum entropy principle. Taking Indonesia, the largest exporter, and the largest trade relationship (Indonesia's exports to China) as examples, the redistribution of nickel material between countries under different supply risks is simulated. The results can provide an important reference for national resource recovery after a nickel trade risk materializes.
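The following Python sketch shows a textbook special case of maximum entropy redistribution that such a model generalizes: with only total-export and total-import constraints, the MaxEnt trade matrix factorizes into the product of the marginals. The country groupings and volumes are hypothetical, and the paper's model imposes richer constraints.

```python
import numpy as np

# Hypothetical export and import totals (arbitrary units); under marginal
# constraints alone, the entropy-maximizing flow matrix is the product of marginals.
exports = np.array([60.0, 25.0, 15.0])   # e.g. Indonesia, Philippines, others
imports = np.array([70.0, 20.0, 10.0])   # e.g. China, EU, others
total = exports.sum()
assert np.isclose(total, imports.sum())

flows = np.outer(exports, imports) / total    # MaxEnt (least-informative) allocation
print(flows)
print(flows.sum(axis=1), flows.sum(axis=0))   # rows/columns reproduce the marginals
```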

6.
Wealth Rheology     
We study wealth rank correlations in a simple model of a macroeconomy. To quantify rank correlations between wealth rankings at different times, we use Kendall's τ, Spearman's ρ, Goodman–Kruskal's γ, and the lists' overlap ratio. We show that the dynamics of wealth flow and the speed of reshuffling in the ranking list depend on the parameters of the model controlling the wealth exchange rate and the wealth growth volatility. As an example of the rheology of wealth in real data, we analyze the lists of the richest people in Poland, Germany, the USA, and the world.
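A minimal Python sketch of the rank-correlation measurements described above, applied to synthetic wealth data (the Pareto/log-normal reshuffling here is only illustrative, not the paper's model):

```python
import numpy as np
from scipy.stats import kendalltau, spearmanr

rng = np.random.default_rng(0)
# Hypothetical wealth of the same 100 agents at two times; the second snapshot
# is a noisy multiplicative reshuffle of the first.
w_t1 = rng.pareto(a=1.5, size=100)
w_t2 = w_t1 * rng.lognormal(mean=0.0, sigma=0.5, size=100)

rank1 = np.argsort(np.argsort(-w_t1))   # rank 0 = richest
rank2 = np.argsort(np.argsort(-w_t2))

tau, _ = kendalltau(rank1, rank2)
rho, _ = spearmanr(rank1, rank2)
top10_overlap = len(set(np.argsort(-w_t1)[:10]) & set(np.argsort(-w_t2)[:10])) / 10
print(f"Kendall tau = {tau:.3f}, Spearman rho = {rho:.3f}, top-10 overlap = {top10_overlap:.2f}")
```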

7.
8.
S N Karbelkar, Pramana, 1986, 26(4): 301–310
Recent axiomatic derivations of the maximum entropy principle from consistency conditions are critically examined. We show that proper application of the consistency conditions alone allows a wider class of functionals, essentially of the form ∫ dx p(x)[p(x)/g(x)]^s for some real number s, to be used for inductive inference; the commonly used form −∫ dx p(x) ln[p(x)/g(x)] is only a particular case. The role of the prior density g(x) is clarified. It is possible to regard it as a geometric factor describing the coordinate system used; it does not represent information of the same kind as that obtained by measurements on the system in the form of expectation values.
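One quick way to see how the logarithmic form sits inside this wider family (a side calculation for orientation, not from the paper):

```latex
% Expanding the functional in s around s = 0:
F_s[p] = \int \mathrm{d}x\, p(x)\left[\frac{p(x)}{g(x)}\right]^{s}
       = 1 + s\int \mathrm{d}x\, p(x)\ln\frac{p(x)}{g(x)} + O(s^{2}),
% so for small s, extremizing F_s under the same constraints reduces to
% extremizing the logarithmic relative entropy  -\int dx\, p(x)\ln[p(x)/g(x)].
```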

9.
A Discussion of the Course Teaching System for Many-Particle Systems
Wang Bin, Physics and Engineering, 2001, 11(1): 17-21
This paper discusses the concept of the distribution function, the connection between distribution functions and entropy, and the maximum entropy principle. It explains the necessity and importance of introducing information entropy and the maximum entropy principle for reforming the course teaching system for systems of many particles.

10.
Degradation and recovery processes are multi-scale phenomena in many physical, engineering, biological, and social systems, and they determine the aging of the entire system. Therefore, understanding the interplay between the two processes at the component level is key to evaluating the reliability of the system. Based on the principle of maximum entropy, an approach is proposed to model and infer the processes at the component level, and it is applied to repairable and non-repairable systems. By incorporating the reliability block diagram, this approach allows the information of network connectivity and statistical moments to be integrated in order to infer the hazard or recovery rates of the degradation or recovery processes. The overall approach is demonstrated with numerical examples.
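A minimal illustration of the component-level maximum entropy step (not the paper's full network approach): if the only information about a component's time to failure is its mean, the MaxEnt model is exponential, i.e., a constant hazard rate. The mean time to failure below is hypothetical.

```python
import numpy as np

# If the only constraint on a component's time to failure T >= 0 is its mean m,
# the maximum entropy distribution is exponential: constant hazard lambda = 1/m.
mttf = 500.0                       # hypothetical mean time to failure (hours)
hazard = 1.0 / mttf                # MaxEnt hazard rate

t = np.linspace(0.0, 2000.0, 5)
reliability = np.exp(-hazard * t)  # survival function R(t) of the MaxEnt model
print(dict(zip(t.round(), reliability.round(3))))
```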

11.
A recent application of the Maximum Entropy Formalism to liquid atomization problems led to the development of a mathematical volume-based drop-size distribution. This function, which depends on three parameters, is a reduction of the four-parameter generalized Gamma function. The aim of the present work is to investigate the relevance of the three parameters in the characterization of liquid atomization processes. To achieve this, a variety of experimental drop-size distributions of ultrasonic sprays were analyzed with the mathematical function. Firstly, it is found that the mathematical drop-size distribution is very suitable for representing the volume-based drop-size distribution of ultrasonic sprays. Furthermore, it is seen that, of the three parameters introduced by the function, one is constant for all the situations investigated and the other two are linked to a non-dimensional group that includes the main parameters controlling drop production. These results are very important, since they suggest a possible development of physical models of primary atomization based on the M.E.F., which would allow for the prediction of the spray drop-size distribution. Thus far, such a model does not exist.
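A hedged Python sketch of the kind of fit described above, using SciPy's generalized Gamma model on synthetic drop diameters; note that the paper works with the volume-based distribution and its own three-parameter reduction, so this is only a simplified stand-in.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical drop diameters (micrometres); real ultrasonic-spray data would go here.
drops = rng.gamma(shape=4.0, scale=10.0, size=2000)

# Fit a generalized Gamma model with the location pinned at 0, leaving two shape
# parameters and a scale, mirroring a three-parameter form.
a, c, loc, scale = stats.gengamma.fit(drops, floc=0.0)
print(f"shape a = {a:.3f}, exponent c = {c:.3f}, scale = {scale:.3f}")
```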

12.
The financial market is a complex system in which assets influence each other, causing, among other factors, price interactions and co-movement of returns. Using the Maximum Entropy Principle approach, we analyze the interactions between a selected set of stock assets and equity indices under different high and low return-volatility episodes during the 2008 Subprime Crisis and the 2020 COVID-19 outbreak. We carry out an inference process to identify the interactions, in which we implement a pairwise Ising model describing the first and second moments of the distribution of the discretized returns of each asset. Our results indicate that second-order interactions explain more than 80% of the entropy in the system during the Subprime Crisis and slightly more than 50% during the COVID-19 outbreak, independently of whether the period analyzed had high or low volatility. The evidence shows that during these periods, slight changes in the second-order interactions are enough to induce large changes in asset correlations, but the proportion of positive and negative interactions remains virtually unchanged. Although some interactions change sign, the proportion of such changes is the same from period to period, which keeps the system in a ferromagnetic state. These results are similar even when analyzing triadic structures in the signed network of couplings.
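A minimal Python sketch of pairwise Ising inference on discretized returns, using the naive mean-field inversion J ≈ −C⁻¹ (off-diagonal) as a stand-in for the paper's inference procedure; the return data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical daily returns for 5 assets; real data would replace this block.
cov = 0.3 * np.ones((5, 5)) + 0.7 * np.eye(5)
returns = rng.multivariate_normal(np.zeros(5), cov, size=1000)
spins = np.sign(returns)                  # discretize returns to +/-1

m = spins.mean(axis=0)                    # first moments (magnetizations)
C = np.cov(spins, rowvar=False)           # connected second moments

# Naive mean-field inversion for the pairwise couplings:
# J_ij ≈ -(C^{-1})_ij for i != j  (a standard approximation, not the paper's exact scheme).
J = -np.linalg.inv(C)
np.fill_diagonal(J, 0.0)
h = np.arctanh(np.clip(m, -0.999, 0.999)) - J @ m    # mean-field external fields
print(J.round(3))
```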

13.
This paper is a review of a particular approach to the method of maximum entropy as a general framework for inference. The discussion emphasizes pragmatic elements in the derivation. An epistemic notion of information is defined in terms of its relation to the Bayesian beliefs of ideally rational agents. The method of updating from a prior to a posterior probability distribution is designed through an eliminative induction process. The logarithmic relative entropy is singled out as a unique tool for updating (a) that is of universal applicability, (b) that recognizes the value of prior information, and (c) that recognizes the privileged role played by the notion of independence in science. The resulting framework—the ME method—can handle arbitrary priors and arbitrary constraints. It includes the MaxEnt and Bayes' rules as special cases and, therefore, unifies entropic and Bayesian methods into a single general inference scheme. The ME method goes beyond the mere selection of a single posterior and also addresses the question of how much less probable other distributions might be, which provides a direct bridge to the theories of fluctuations and large deviations.
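The core updating rule that such a review builds on can be stated compactly (the standard form of the relative-entropy update, sketched here for orientation rather than quoted from the paper):

```latex
% Maximize the relative entropy
S[p\,|\,q] = -\int \mathrm{d}x\; p(x)\,\ln\frac{p(x)}{q(x)}
% over p, subject to normalization and a constraint \int dx\, p(x) f(x) = F.
% The selected posterior is exponential in the constraint function,
p(x) = \frac{q(x)\, e^{\lambda f(x)}}{Z(\lambda)},
\qquad
Z(\lambda) = \int \mathrm{d}x\; q(x)\, e^{\lambda f(x)},
% with the multiplier \lambda fixed by  \partial \ln Z / \partial \lambda = F.
```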

14.
Zhang Hulong, Applied Acoustics, 2017, 25(5): 236-239
Image segmentation is the basis of image analysis, recognition, and understanding. It mainly refers to techniques that divide an image into regions with distinct characteristics and extract targets of interest, and its study has received considerable attention for many years. Thresholding is an important image segmentation method that is widely used in image processing and recognition. To address the problem that details are often ignored in image segmentation, making subsequent processing difficult, a thresholding method based on fuzzy relations and the maximum fuzzy entropy principle is proposed to perform fuzzy segmentation of the two-dimensional histogram. To capture details in the segmentation, the proposed method automatically determines the fuzzy region and the threshold according to the maximum entropy principle, then obtains the two-dimensional fuzzy entropy and the optimal solution of a genetic algorithm, and finally recovers the image details. Experimental comparisons on images with different gray levels and color types show that the proposed method outperforms the two-dimensional non-fuzzy method and the one-dimensional fuzzy entropy segmentation method, leading to the conclusion that the method captures details in image segmentation.
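As a simplified, hedged stand-in for the proposed 2-D fuzzy-entropy method, the following Python sketch implements the classical 1-D maximum entropy (Kapur) threshold on a toy bimodal image; the method described above additionally uses fuzzy membership, the 2-D histogram, and a genetic algorithm.

```python
import numpy as np

def kapur_threshold(image):
    """1-D maximum-entropy (Kapur) threshold -- a simplified stand-in for the
    2-D fuzzy-entropy + genetic-algorithm scheme described in the abstract."""
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, 255):
        p0, p1 = p[:t].sum(), p[t:].sum()
        if p0 <= 0 or p1 <= 0:
            continue
        # Entropies of the background and foreground class distributions.
        q0 = p[:t][p[:t] > 0] / p0
        q1 = p[t:][p[t:] > 0] / p1
        h = -(q0 * np.log(q0)).sum() - (q1 * np.log(q1)).sum()
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# Toy bimodal "image" (in real use: a uint8 grayscale image array).
img = np.concatenate([np.random.default_rng(3).integers(40, 90, 5000),
                      np.random.default_rng(4).integers(160, 220, 5000)]).astype(np.uint8)
print("maximum-entropy threshold:", kapur_threshold(img))
```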

15.
Considering the differences and disagreements involving previous applications of the Maximum Entropy Formalism to modeling the drop-size distribution of liquid sprays, a new formulation is suggested. The constraints introduced in this formulation are based on characteristic features common to any liquid atomization process, i.e., that the production of both large and small drops is always limited. These limitations are a consequence of the action of both destabilizing and stabilizing forces, such as aerodynamic and surface tension forces, respectively. The solution resulting from this approach, which makes use of statistical mechanics, is a three-parameter Generalized Gamma Distribution, which can treat any type of distribution. It is shown that this solution is identical to a Nukiyama-Tanasawa distribution, which should no longer be regarded as an empirical distribution. Although this new formulation clearly answers the question concerning the amount of information required to describe a spray drop-size distribution, it raises the problem of the mathematical form to be given to this information, which is discussed here.
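For concreteness, the three-parameter Nukiyama-Tanasawa / generalized Gamma form referred to above can be written as follows (one common notation; symbols and normalization conventions differ between authors):

```latex
f(D) \;\propto\; D^{\,p}\, \exp\!\left[-(bD)^{q}\right], \qquad D > 0,
% with p controlling the small-drop behaviour, q the large-drop (exponential-type)
% tail, and b a scale parameter; the normalization constant involves \Gamma((p+1)/q).
```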

16.
The spread of the COVID-19 pandemic has highlighted the close link between economics and health in the context of emergency management. A widespread vaccination campaign is considered the main tool to contain the economic consequences. This paper focuses, at the level of wealth distribution modeling, on the economic improvements induced by the vaccination campaign in terms of its effectiveness rate. The economic trend during the pandemic is evaluated by resorting to a mathematical model that joins a classical compartmental model including vaccinated individuals with a kinetic model of wealth distribution based on binary wealth exchanges. The interplay between wealth exchanges and the progress of the infectious disease is realized by assuming, on the one hand, that individuals in different compartments act differently in the economic process and, on the other hand, that the epidemic affects the risk in economic transactions. Using the mathematical tools of kinetic theory, it is possible to identify the equilibrium states of the system and the formation of inequalities due to the pandemic in the wealth distribution of the population. Numerical experiments highlight the importance of the vaccination campaign and its positive effects in reducing economic inequalities in the multi-agent society.
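A minimal Python sketch of the binary wealth-exchange kernel of such kinetic models, taken in isolation from the epidemic compartments (the parameters are hypothetical and the coupling to the disease dynamics is omitted):

```python
import numpy as np

rng = np.random.default_rng(5)
N, steps, gamma, sigma = 1000, 100_000, 0.1, 0.05   # hypothetical parameters
w = np.ones(N)                                      # equal initial wealth

# Binary-exchange kinetic rule: two random agents trade a fraction gamma of
# their wealth plus an individual random shock (risk term).
for _ in range(steps):
    i, j = rng.integers(0, N, size=2)
    eta_i, eta_j = sigma * rng.standard_normal(2)
    wi = (1 - gamma + eta_i) * w[i] + gamma * w[j]
    wj = (1 - gamma + eta_j) * w[j] + gamma * w[i]
    if wi >= 0 and wj >= 0:                         # keep wealth non-negative
        w[i], w[j] = wi, wj

gini = np.abs(np.subtract.outer(w, w)).mean() / (2 * w.mean())
print(f"Gini coefficient after exchanges: {gini:.3f}")
```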

17.
Bounded rationality is an important consideration stemming from the fact that agents often have limits on their processing abilities, making the assumption of perfect rationality inapplicable to many real tasks. We propose an information-theoretic approach to the inference of agent decisions under Smithian competition. The model explicitly captures the boundedness of agents (limited in their information-processing capacity) as the cost of information acquisition for expanding their prior beliefs. The expansion is measured as the Kullback–Leibler divergence between posterior decisions and prior beliefs. When information acquisition is free, the homo economicus agent is recovered, while when information acquisition becomes costly, agents instead revert to their prior beliefs. The maximum entropy principle is used to infer the least biased decisions based upon the notion of Smithian competition formalised within the Quantal Response Statistical Equilibrium framework. The incorporation of prior beliefs into such a framework allows us to systematically explore the effects of prior beliefs on decision-making in the presence of market feedback and, importantly, adds a temporal interpretation to the framework. We verified the proposed model using Australian housing market data, showing how the incorporation of prior knowledge alters the resulting agent decisions. Specifically, it allowed for the separation of the past beliefs and the utility-maximisation behaviour of the agent, as well as an analysis of the evolution of agent beliefs.
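A generic Python sketch of the bounded-rational decision rule implied by a Kullback–Leibler information cost relative to prior beliefs (the standard free-energy/quantal-response form, not the paper's calibrated QRSE model); the utilities and prior are hypothetical.

```python
import numpy as np

def bounded_rational_choice(utilities, prior, temperature):
    """Maximizer of E[U] - T * KL(p || prior):  p(a) ∝ prior(a) * exp(U(a) / T).
    Large T -> agent reverts to prior beliefs; T -> 0 -> homo economicus
    (pure utility maximization)."""
    logits = np.log(prior) + np.asarray(utilities) / temperature
    p = np.exp(logits - logits.max())
    return p / p.sum()

U = np.array([1.0, 0.8, 0.2])          # hypothetical action utilities
prior = np.array([0.2, 0.5, 0.3])      # hypothetical prior beliefs
for T in (5.0, 1.0, 0.05):
    print(T, bounded_rational_choice(U, prior, T).round(3))
```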

18.
The intrinsic Helmholtz free energy, commonly used as a basis for density functional theories, is here given explicitly as a cluster-diagram expansion with density field points. Also given are explicit variational procedures for determining the chemical potential for a given density, the pair potential for a given pair correlation function, and the pair correlation function for a given pair potential. The physical meaning of the density functional is established within the context of a new derivation of statistical mechanics based on entropy that supplies a variational principle for equilibrium by generalizing the thermodynamic potential to nonequilibrium states. This shows that the conventional density functional determines not only the equilibrium density, but also the probability of fluctuations about that density.

19.
In this paper, the fractional cumulative entropy is considered in order to derive further properties and to extend it to dynamic cases. The measure is used to characterize a family of symmetric distributions and also another, location, family of distributions. The links between the fractional cumulative entropy, the classical differential entropy, and some reliability quantities are also unveiled. In addition, the measure's connection with the standard deviation is established. We provide some examples to establish the variability property of this measure.
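A small numerical check of a commonly used form of the fractional cumulative entropy (the definition and normalization below are assumptions for illustration and may differ from the paper's conventions): for the uniform distribution on [0, 1] it evaluates to 2^-(q+1), recovering the classical cumulative entropy 1/4 at q = 1.

```python
import numpy as np
from scipy.special import gamma
from scipy.integrate import quad

def fractional_cumulative_entropy(cdf, support, q):
    """CE_q(X) = (1/Gamma(q+1)) * int F(x) * [-ln F(x)]^q dx  -- one form used in
    the literature; the normalization convention is an assumption here."""
    integrand = lambda x: cdf(x) * (-np.log(cdf(x)))**q if 0 < cdf(x) < 1 else 0.0
    val, _ = quad(integrand, *support)
    return val / gamma(q + 1)

# Uniform distribution on [0, 1]: closed form is 2**-(q+1), equal to the
# classical cumulative entropy 1/4 at q = 1.
for q in (0.5, 1.0, 2.0):
    print(q, fractional_cumulative_entropy(lambda x: x, (0.0, 1.0), q), 2**-(q + 1))
```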

20.
Considering corrections to all orders in the Planck length to the quantum state density arising from a generalized uncertainty principle (GUP), we calculate the statistical entropy of the Bose field and the Fermi field on the background of four-dimensional spherically symmetric black holes without any cutoff. We find that the statistical entropy is directly proportional to the area of the horizon.
