Similar Documents
A total of 20 similar documents were found.
1.
A brief history is presented, outlining the development of rate theory during the past century. Starting from Arrhenius [Z. Phys. Chem. 4, 226 (1889)], we follow especially the formulation of transition state theory by Wigner [Z. Phys. Chem. Abt. B 19, 203 (1932)] and Eyring [J. Chem. Phys. 3, 107 (1935)]. Transition state theory (TST) made it possible to obtain quick estimates for reaction rates for a broad variety of processes even in the days before sophisticated computers were available. Arrhenius' suggestion that a transition state exists which is intermediate between reactants and products was central to the development of rate theory. Although Wigner gave an abstract definition of the transition state as a surface of minimal unidirectional flux, it took almost half a century until the transition state was precisely defined by Pechukas [Dynamics of Molecular Collisions B, edited by W. H. Miller (Plenum, New York, 1976)], and even then only within the realm of classical mechanics. Eyring, considered by many to be the father of TST, never resolved the question of how to define the activation energy for which Arrhenius became famous. In 1978, Chandler [J. Chem. Phys. 68, 2959 (1978)] finally showed that, especially when considering condensed phases, the activation energy is a free energy: it is the barrier height in the potential of mean force felt by the reacting system. Parallel to the development of rate theory in the chemistry community, Kramers published in 1940 [Physica (Amsterdam) 7, 284 (1940)] a seminal paper on the relation between Einstein's theory of Brownian motion [Einstein, Ann. Phys. 17, 549 (1905)] and rate theory. Kramers' paper provided a solution for the effect of friction on reaction rates but also left us with some challenges. He could not derive a uniform expression for the rate, valid for all values of the friction coefficient; this became known as the Kramers turnover problem. He also did not establish the connection between his approach and the TST developed by the chemistry community. For many years, Kramers' theory was considered to provide a dynamic correction to the thermodynamic TST. Both of these questions were resolved in the 1980s when Pollak [J. Chem. Phys. 85, 865 (1986)] showed that Kramers' expression in the moderate to strong friction regime could be derived from TST, provided that the bath, which is the source of the friction, is handled at the same level as the system which is observed. This then led to the Mel'nikov-Pollak-Grabert-Hanggi [Mel'nikov and Meshkov, J. Chem. Phys. 85, 1018 (1986); Pollak, Grabert, and Hanggi, ibid. 91, 4073 (1989)] solution of the turnover problem posed by Kramers. Although classical rate theory has reached a high level of maturity, its quantum analog leaves the theorist with serious challenges to this very day. As noted by Wigner [Trans. Faraday Soc. 34, 29 (1938)], TST is an inherently classical theory. A definitive quantum TST has not been formulated to date, although some very useful approximate quantum rate theories have been invented. The successes and challenges facing quantum rate theory are outlined. An open problem which is being investigated intensively is rate theory away from equilibrium. TST is no longer valid and cannot even serve as a conceptual guide for understanding the critical factors which determine rates away from equilibrium.
The nonequilibrium quantum theory is even less well developed than the classical one, and it suffers from the fact that, even today, we do not know how to solve the real-time quantum dynamics of systems with "many" degrees of freedom.
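For orientation, two textbook relations summarize the thermodynamic content discussed above (quoted here in their standard form, not taken verbatim from the cited papers): the Arrhenius law and the free-energy formulation of the TST rate,

\[ k_{\mathrm{Arr}} = A\, e^{-E_a/k_B T}, \qquad k_{\mathrm{TST}} = \frac{k_B T}{2\pi\hbar}\, e^{-\Delta F^{\ddagger}/k_B T}, \]

in which the activation energy appears as the free-energy barrier in the potential of mean force. Kramers' result in the moderate to strong friction (spatial diffusion) regime multiplies such a TST estimate by the friction-dependent factor \(\sqrt{1+\gamma^2/4\omega_b^2}-\gamma/2\omega_b\), where \(\gamma\) is the friction coefficient and \(\omega_b\) the barrier frequency.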

2.
3.

In this paper, we substantiate that entanglement is manifestly exhibited at very low temperatures. We also show that this entanglement can be manipulated through certain parameters in 2D crystals such as graphene.


4.
It is well known in quantum optics that fluctuations and dissipation inevitably intervene in the dynamics of open quantum systems. Density matrix elements may all decay exponentially and smoothly, but we show that two-party entanglement, a valuable quantum coherence, may nevertheless abruptly decrease to zero in a finite time. This is Entanglement Sudden Death. In this talk we show how entanglement sudden death occurs under either phase or amplitude noise, whether quantum or classical in origin. Moreover, we show that when two or more noises are active at the same time, the effect of the combined noises is even more unexpected.
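As a concrete illustration of the phenomenon (a standard textbook example, not the specific model treated in this talk): for two qubits prepared in \(|\psi\rangle = \alpha|00\rangle + \beta|11\rangle\) and each coupled to an independent amplitude-damping channel with damping parameter \(p\), the Wootters concurrence works out to

\[ C(p) = 2\,|\beta|\,(1-p)\,\max\bigl\{0,\; |\alpha| - |\beta|\,p \bigr\}, \]

so for \(|\beta| > |\alpha|\) the entanglement vanishes identically once \(p \ge |\alpha|/|\beta|\), i.e. at a finite time if \(p = 1 - e^{-\Gamma t}\), even though every density-matrix element decays smoothly.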

5.
The paper addresses the issues of solving complex problems that require supercomputers or multiprocessor clusters, which are nowadays available to most researchers. Efficient distribution of high-performance computing (HPC) resources according to actual application needs has been a major research topic since HPC technologies became widely adopted. At the same time, comfortable and transparent access to these resources has been a key user requirement. In this paper we discuss approaches to building a virtual private supercomputer available at the user's desktop: a virtual computing environment tailored specifically to a target user with a particular target application. We describe and evaluate possibilities for creating such a virtual supercomputer based on lightweight virtualization technologies, and we analyze the efficiency of our approach compared to traditional methods of HPC resource management.

6.
7.
We discuss the possibility that the recent detection of 511 keV gamma rays from the galactic bulge, as observed by INTEGRAL, is a consequence of low-mass (1-100 MeV) particle dark matter annihilations. We discuss the type of halo profile favored by the observations as well as the size of the annihilation cross section needed to account for the signal. We find that such a scenario is consistent with the observed dark matter relic density and with other constraints from astrophysics and particle physics.
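For context, the standard thermal-relic estimate (a generic textbook relation, not a result quoted from this paper) ties the relic abundance of a thermally produced dark matter particle to its annihilation cross section roughly as

\[ \Omega_{\chi} h^2 \;\approx\; \frac{3\times 10^{-27}\ \mathrm{cm^3\,s^{-1}}}{\langle \sigma v \rangle}, \]

so reproducing \(\Omega_{\chi} h^2 \simeq 0.1\) points to \(\langle \sigma v \rangle\) of order \(10^{-26}\ \mathrm{cm^3\,s^{-1}}\) at freeze-out, which is the kind of consistency check referred to in the abstract.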

8.
9.
J.M.D. Coey, Philosophical Magazine, 2013, 93(31): 3857-3865
Claims that passing hard water through a magnetic field somehow influences the structure and morphology of the calcium carbonate that forms when the water is subsequently heated have been met with robust scepticism, largely because of the absence of any plausible mechanism whereby water could acquire a long-lasting magnetically imprinted memory. Recent work challenging classical nucleation theory, insofar as calcium carbonate is concerned, has advanced the idea of liquid-like prenucleation clusters of indeterminate shape that are thermodynamically stable in calcium carbonate solutions. These nanometer-scale clusters may be the key to the problem; the possible influence on them of a magnetic field, via Maxwell-like stress or singlet–triplet mixing of proton dimers leading to a long-lived change in the number of ionic bonds, is discussed.

10.
11.
We review several numerical approaches which have been developed during the last 10 years in order to perform more realistic radiative modelling of solar and stellar atmospheres. We shall, however, restrict ourselves to a single ‘family’ of methods: those derived from the so-called Accelerated Lambda Iteration (ALI) technique.
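To fix ideas, the scheme can be stated generically as follows (the standard preconditioned form, not any particular implementation from the review): with the mean radiation field \(J = \Lambda[S]\) and a two-level-atom source function \(S = (1-\varepsilon)J + \varepsilon B\), the accelerated iteration replaces the slowly converging ordinary Lambda iteration by

\[ S^{(n+1)} = S^{(n)} + \bigl[\,1 - (1-\varepsilon)\Lambda^{*}\,\bigr]^{-1} \Bigl[ (1-\varepsilon)\,\Lambda\!\bigl[S^{(n)}\bigr] + \varepsilon B - S^{(n)} \Bigr], \]

where \(\Lambda^{*}\) is an easily invertible approximate operator, often simply the diagonal of the full \(\Lambda\) operator.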

12.
Physics Letters A, 1988, 133(9): 461-465
The effect of a quasi-periodic shutter inserted between the source and one of the detectors in a correlated-photon-type EPRB experiment is studied. It is shown that the time and polarization correlations are expected to disappear because an intermediate quantized field system is introduced by the shutter between the source field and the detector.

13.
According to the standard interpretation of quantum mechanics (QM), no meaning can be assigned to the statement that a particle has a precise value of any one of the variables describing its physical properties before having interacted with a suitable measuring instrument. On the other hand, it is well known that QM tends to classical statistical mechanics (CSM) when a suitable classical limit is performed. One may therefore ask how it is that, in this limit, the statement that a given variable always has a precise value independently of having been measured, meaningless in QM, gradually becomes meaningful. In other words, one may ask how it can be that QM, a theory describing the intrinsically probabilistic properties of a quantum object, becomes a statistical theory describing probabilistic knowledge of the intrinsically well determined properties of classical objects. In the present paper we try to answer this question and show that an inconsistency arises between the conventional interpretation of CSM, which presupposes objectively existing Newtonian trajectories, and the standard interpretation of QM. We conclude that the latter needs revisiting unless we wish to adopt a strictly subjective conception of the world around us, implying that even macroscopic objects are not localized anywhere before we look at them.

14.
Comptes Rendus Physique, 2019, 20(4): 244-261
Taking the nonlinear q-voter model as our base, we present a short review of problems and methods arising within the statistical physics of opinion formation (SPOOF). We describe relations between models of opinion formation developed by physicists and theoretical models of social response known in social psychology. We draw attention to issues that are interesting for both social psychologists and physicists. We show examples of studies directly inspired by social psychology, such as "independence vs. anticonformity" or "personality vs. situation". We summarize the results that have already been obtained and point out what else can be done, also with respect to other models in SPOOF. Finally, we demonstrate several analytical methods useful in SPOOF, such as the concept of effective force and potential, Landau's approach to phase transitions, and mean-field and pair approximations.
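As an illustration of the mean-field machinery mentioned above (the standard rate equation for the basic nonlinear q-voter model without independence or anticonformity, quoted for orientation rather than taken from this review): if \(c\) is the fraction of agents holding the "up" opinion and a target agent adopts the opinion of a randomly chosen group of \(q\) neighbours only when that group is unanimous, then

\[ \frac{dc}{dt} = (1-c)\,c^{\,q} - c\,(1-c)^{\,q}, \]

whose stationary points \(c = 0, \tfrac12, 1\) and their stability already encode the ordering behaviour that the Landau-type and pair-approximation analyses refine.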

15.
We consider different classes of scalar field models, including quintessence and tachyon scalar fields, with a variety of generic potentials of the thawing type. We focus on observational quantities such as the Hubble parameter and the luminosity distance, as well as quantities related to Baryon Acoustic Oscillation (BAO) measurements. Our study shows that, with the present state of observations, one cannot distinguish amongst the various models, which in turn cannot be distinguished from a cosmological constant. Our analysis indicates that there is only a small chance of observing dark energy metamorphosis in the near future.
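For reference, the observables named above follow from the expansion history through standard background-cosmology relations (generic formulae, not specific to the models of this paper); in a spatially flat universe,

\[ H^2(z) = H_0^2\bigl[\Omega_{m,0}(1+z)^3 + \Omega_{\mathrm{DE}}(z)\bigr], \qquad d_L(z) = (1+z)\int_0^{z} \frac{c\,dz'}{H(z')}, \]

so thawing models whose equation of state stays close to \(w = -1\) yield \(H(z)\) and \(d_L(z)\) curves nearly degenerate with those of a cosmological constant, which is why present data cannot separate them.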

16.
17.
18.
Physics Letters B, 1986, 172(2): 180-183
It is shown that if our universe is described by the cosmological, classical Einstein equations based on a manifold M4 × G, where G is a product of any number of Ricci-flat manifolds (as is expected to occur in superstring theories), then our universe must, today, be radiation dominated. A matter-dominated universe (or an inflationary universe) would lead to large variations in the size of the extra dimensions. This would lead to changes in coupling constants of sufficiently large magnitude to be ruled out by observation. If a model based on M4 × G with G Ricci flat is to be viable, there are only two possibilities. Either the universe is radiation dominated today (for which no satisfactory model exists), or the compactification is controlled dominantly by some (thus far) unquantifiable "quantum gravity" effects.
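The link between the size of the extra dimensions and the observed coupling constants invoked here is the usual Kaluza-Klein dimensional-reduction relation (stated schematically, and not quoted from the paper): for a gauge field living in 4+n dimensions compactified on G with volume \(V_G\),

\[ \frac{1}{g_4^2} \;\propto\; \frac{V_G}{g_{4+n}^2}, \]

so any appreciable time variation of \(V_G\) translates directly into a time variation of the effective four-dimensional couplings, which is what the observational bounds mentioned in the abstract constrain.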

19.
We explore how the Peccei-Quinn (PQ) axion parameter space can be constrained by the frequency-dependent dimming of radiation from astrophysical objects. To do so we perform accurate calculations of photon-axion conversion in the presence of a variable magnetic field. We propose several tests in which the PQ axion parameter space can be explored with current and future astronomical surveys: the observed spectra of isolated neutron stars, occultations of background objects by white dwarfs and neutron stars, and the light curves of eclipsing binaries containing a white dwarf. We find that the lack of dimming in the light curve of a recently observed detached eclipsing white-dwarf binary leads to relevant constraints on photon-axion conversion. Current surveys designed for Earth-like planet searches are well suited to strengthening and improving these constraints on the PQ axion via the radiation dimming of astrophysical objects.
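For orientation (the standard single-domain conversion probability in the weak-mixing limit, rather than the variable-field calculation performed in the paper): a photon traversing a homogeneous transverse magnetic field \(B_T\) over a length \(L\) converts into an axion with probability

\[ P_{\gamma\to a} \simeq \left(\frac{g_{a\gamma} B_T L}{2}\right)^2 \left[\frac{\sin(qL/2)}{qL/2}\right]^2, \]

where \(g_{a\gamma}\) is the axion-photon coupling and \(q \simeq |m_a^2 - \omega_{\mathrm{pl}}^2|/(2E)\) is the momentum transfer, whose dependence on the photon energy \(E\) is what makes the induced dimming frequency dependent.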

20.
The currently synthesized noble gas (Ng) molecules are mostly xenon or krypton compounds; HArF is the only experimentally prepared compound containing a light Ng atom. In this work, a new argon compound with the formula HArC4CN was predicted to be theoretically stable (5.66 kcal/mol at the ROCCSD(T)/6-311++G(2d,2p) level of theory). Two decomposition transition states were found, corresponding to three-body and two-body decomposition, which produce H, Ar and C4CN, or HC4CN and Ar, respectively. The HArC4CN molecule is metastable: the low energy barrier between the stable molecule and the three-body transition state makes it possible to synthesize it from the three fragments, while the high energy barrier between the minimum and the two-body transition state prevents it from decomposing. Natural bond orbital (NBO) and electron localization function (ELF) analyses show a strongly ionic bond between the Ar and C atoms and a covalent bond between Ar and H. Compared with previous work, it is the conjugation effect of the C4CN group, as well as its electronegativity, that activates the noble gas atom and results in a stable noble gas compound.
