Similar Documents (20 results)
1.
We extend the constructions of previous papers, which established the equivalence of quantum mechanics with a classical probability formalism under constraints that consistently ensure differentiable probability densities, to show that these constructions also yield Maxwell's equations and the Lorentz force. The constructions have already yielded Schrödinger's equation for a charged particle in an electromagnetic field; here it is shown that the statistical construction provides a basis for gauge conditions and singles out a specific gauge for this non-relativistic formalism. The constructions also provide new insight into the relationship between Schrödinger quantum mechanics and a classical diffusion process.
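For reference, the equation referred to here is presumably the standard non-relativistic Schrödinger equation for a particle of charge e in an external electromagnetic field with potentials (φ, A), quoted in textbook form (Gaussian units), not from the paper itself:

\[
  i\hbar\,\frac{\partial\psi}{\partial t}
  = \left[\frac{1}{2m}\left(-i\hbar\nabla - \frac{e}{c}\mathbf{A}\right)^{2} + e\varphi\right]\psi .
\]

Its form is preserved under the gauge transformation A → A + ∇χ, φ → φ − (1/c)∂χ/∂t, ψ → exp(ieχ/ħc)ψ; this is the freedom that a specific choice of gauge, as discussed in the abstract, removes.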

2.
An extension of the Born rule, the quantum typicality rule, has recently been proposed [B. Galvan in Found. Phys. 37:1540–1562 (2007)]. Roughly speaking, this rule states that if the wave function of a particle is split into non-overlapping wave packets, the particle stays approximately inside the support of one of the wave packets, without jumping to the others. In this paper a formal definition of this rule is given in terms of imprecise probability. An imprecise probability space is a measurable space endowed with a set of probability measures ℘. The quantum formalism and the quantum typicality rule allow us to define a set of probabilities on (X^T, ℱ), where X is the configuration space of a quantum system, T is a time interval, and ℱ is the σ-algebra generated by the cylinder sets. It is thus proposed that a quantum system can be represented as an imprecise stochastic process, i.e. a canonical stochastic process in which the single probability measure is replaced by a set of measures. It is argued that this mathematical model, when used to represent macroscopic systems, has sufficient predictive power to explain both the results of statistical experiments and the quasi-classical structure of macroscopic evolution.
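As background, the standard notion of lower and upper probabilities induced by a set of measures ℘ on a measurable space (Ω, ℱ) is the following (a generic sketch; the paper's construction specializes Ω to the path space X^T):

\[
  \underline{P}(A) = \inf_{P\in\wp} P(A),
  \qquad
  \overline{P}(A) = \sup_{P\in\wp} P(A),
  \qquad A \in \mathcal{F}.
\]

On the path space the canonical process is simply the coordinate map ξ_t(ω) = ω(t), and replacing a single law by the set ℘ is what makes the process "imprecise".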

3.
The notion of fuzzy event is introduced in the theory of measurement in quantum mechanics by indicating in which sense measurements can be considered to yield fuzzy sets. The concept of probability measure on fuzzy events is defined, and its general properties are deduced from the operational meaning assigned to it. It is pointed out that such probabilities can be derived from the formalism of quantum mechanics. Any such probability on a given fuzzy set is related to the frequency of occurrence within that set of points in a random sample, where the sample points are themselves fuzzy sets obtained as outcomes of measurements of, in general, incompatible observables on replicas of the system in the same prepared state.
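For comparison, the classical (Zadeh) definition of the probability of a fuzzy event Ã with membership function μ_Ã, which the operational construction sketched above generalizes, is given here for orientation only:

\[
  P(\tilde{A}) = \int_{X} \mu_{\tilde{A}}(x)\, \mathrm{d}P(x),
\]

where P is an ordinary probability measure on the underlying space X.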

4.
5.
We analyze the origin of quantum randomness within the framework of a completely deterministic theory of particle motion—Bohmian mechanics. We show that a universe governed by this mechanics evolves in such a way as to give rise to the appearance of randomness, with empirical distributions in agreement with the predictions of the quantum formalism. Crucial ingredients in our analysis are the concept of the effective wave function of a subsystem and that of a random system. The latter is a notion of interest in its own right and is relevant to any discussion of the role of probability in a deterministic universe. Research supported in part by NSF Grant DMS-9105661.
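The notion of effective wave function used here can be sketched as follows (standard formulation from the Bohmian literature; details may differ slightly from the paper). Write the configuration of the universe as (x, y), with x the coordinates of the subsystem and y those of its environment. If the universal wave function has the form

\[
  \Psi(x,y) = \psi(x)\,\Phi(y) + \Psi^{\perp}(x,y),
\]

where Φ and Ψ⊥ have macroscopically disjoint y-supports and the actual configuration Y of the environment lies in the support of Φ, then ψ is the effective wave function of the subsystem.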

6.
In the foundations of quantum mechanics Gleason’s theorem dictates the uniqueness of the state transition probability via the inner product of the corresponding state vectors in Hilbert space, independent of which measurement context induces this transition. We argue that the state transition probability should not be regarded as a secondary concept which can be derived from the structure on the set of states and properties, but instead should be regarded as a primitive concept for which measurement context is crucial. Accordingly, we adopt an operational approach to quantum mechanics in which a physical entity is defined by the structure of its set of states, set of properties and the possible (measurement) contexts which can be applied to this entity. We put forward some elementary definitions to derive an operational theory from this State–COntext–Property (SCOP) formalism. We show that if the SCOP satisfies a Gleason-like condition, namely that the state transition probability is independent of which measurement context induces the change of state, then the lattice of properties is orthocomplemented, which is one of the ‘quantum axioms’ used in the Piron–Solèr representation theorem for quantum systems. In this sense we obtain a possible physical meaning for the orthocomplementation widely used in quantum structures.
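For orientation, the Gleason-type context-independence at issue is the familiar statement that in Hilbert-space quantum mechanics the transition probability between pure states depends only on the state vectors themselves,

\[
  P(\psi \rightarrow \phi) = \lvert \langle \phi \mid \psi \rangle \rvert^{2},
\]

regardless of which measurement context induces the transition; it is this independence that the SCOP condition abstracts and that is shown to force orthocomplementation of the property lattice.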

7.
We construct a stochastic mechanics by replacing Bohm's first-order ordinary differential equation of motion with a stochastic differential equation where the stochastic process is defined by the set of Bohmian momentum time histories from an ensemble of particles. We show that, if the stochastic process is a purely random process with n-th order joint probability density in the form of products of delta functions, then the stochastic mechanics is equivalent to quantum mechanics in the sense that the former yields the same position probability density as the latter. However, for a particular non-purely random process, we show that the stochastic mechanics is not equivalent to quantum mechanics. Whether the equivalence between the stochastic mechanics and quantum mechanics holds for all purely random processes but breaks down for all non-purely random processes remains an open question.
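Schematically (a hedged sketch, not the paper's exact construction), the replacement described is of the deterministic Bohmian guidance law by an equation driven by a momentum process P_t built from the ensemble of Bohmian momentum histories:

\[
  \frac{dX}{dt} = \frac{\hbar}{m}\,\operatorname{Im}\frac{\nabla\psi}{\psi}(X,t)
  \quad\longrightarrow\quad
  dX_t = \frac{P_t}{m}\,dt .
\]

The equivalence question then concerns whether the position density generated by X_t reproduces |ψ|², which, according to the abstract, it does when P_t is purely random in the sense described above.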

8.
9.
Operational statistics is an operational theory of probability and statistics which generalizes classical probability and statistics and provides a formalism particularly suited to the needs of quantum mechanics. Within this formalism, statistical inference can be accomplished using the Bayesian inference strategy. In a hierarchical Bayesian approach, a second-order probability measure, or credibility, represents degrees of belief in statistical hypotheses. A credibility determines an assignment of simple and conditioned betting rates to events in a natural way. In the setting of operational statistics, we show that a credibility is completely determined by the assignment of the betting rates it induces. This result suggests a certain unity between the Bayesian philosophy which deems that betting rates are central and the one which advocates the hierarchical approach.
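As a sketch of the hierarchical structure (standard second-order Bayesian notation, not the paper's own formalism): a credibility is a probability measure μ over a space Θ of statistical hypotheses, and the betting rate it assigns to an event E is the corresponding mixture

\[
  p(E) = \int_{\Theta} p_{\theta}(E)\, \mathrm{d}\mu(\theta),
\]

with conditioned betting rates formed analogously. The result reported above is that, in the operational-statistics setting, μ is uniquely determined by the betting rates it induces.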

10.
To complete our ontological interpretation of quantum theory we have to include a treatment of quantum statistical mechanics. The basic concepts in the ontological approach are the particle and the wave function. The density matrix cannot play a fundamental role here. Therefore quantum statistical mechanics will require a further statistical distribution over wave functions in addition to the distribution of particles that have a specified wave function. Ultimately the wave function of the universe will be required, but we show that if the universe is not in thermodynamic equilibrium then it can be treated in terms of weakly interacting large scale constituents that are very nearly independent of each other. In this way we obtain the same results as those of the usual approach within the framework of the ontological interpretation. Professor D. Bohm died on 28 October 1992, shortly after this paper was completed.

11.
It is shown that the hallmark quantum phenomenon of contextuality is present in classical statistical mechanics (CSM). It is first shown that the occurrence of contextuality is equivalent to there being observables that can differentiate between pure and mixed states. CSM is formulated in the formalism of quantum mechanics (FQM), a formulation commonly known as the Koopman–von Neumann formulation (KvN). In KvN, one can then show that such a differentiation between mixed and pure states is possible. As contextuality is a probabilistic phenomenon and as it is exhibited in both classical physics and ordinary quantum mechanics (OQM), it is concluded that the foundational issues regarding quantum mechanics are really issues regarding the foundations of probability.
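For orientation, in the Koopman–von Neumann formulation classical states are represented by wave functions ψ(q, p) on phase space, with |ψ|² playing the role of the Liouville density and the dynamics generated by the self-adjoint Liouvillian (standard form, stated here as background):

\[
  i\,\frac{\partial\psi}{\partial t} = \hat{L}\,\psi,
  \qquad
  \hat{L} = -i\left(\frac{\partial H}{\partial p}\frac{\partial}{\partial q}
                  - \frac{\partial H}{\partial q}\frac{\partial}{\partial p}\right).
\]

Because the evolution is unitary on a Hilbert space, the usual quantum notions of pure states, mixed states and (as argued above) contextuality carry over to classical statistical mechanics.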

12.
Classical relativistic cosmology is known to have the space-time singularity as an inevitable feature. The standard big bang models have very small particle horizons in the early stages, which makes it difficult to understand the observed homogeneity of the universe. The relatively narrow range of the observed matter density in the neighbourhood of the closure density requires a high degree of fine tuning of the early universe. In this paper it is argued that these three problems can be satisfactorily resolved in quantum cosmology. It is shown that it is extremely unlikely that the universe evolved to the present state from quantum states with a singularity and a particle horizon. Similarly, it is shown that of all possible states the Robertson-Walker model with flat spatial sections is the most likely state for the universe to evolve out of a quantum fluctuation. To demonstrate these results a suitable formalism for quantum cosmology is first developed.

13.
A scheme for constructing quantum mechanics is given that does not have Hilbert space and linear operators as its basic elements. Instead, a version of the algebraic approach is considered. Elements of a noncommutative algebra (observables) and functionals on this algebra (elementary states) associated with results of single measurements are used as the primary components of the scheme. On the one hand, it is possible to use within the scheme the formalism of the standard (Kolmogorov) probability theory; on the other hand, it is possible to reproduce the mathematical formalism of standard quantum mechanics and to study the limits of its applicability. A short outline is given of the necessary material from the theory of algebras and probability theory. It is described how the mathematical scheme of the paper agrees with the theory of quantum measurements and avoids quantum paradoxes.
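As background on the algebraic language used (standard definitions, not specific to the paper): a state on a unital *-algebra 𝔄 of observables is a linear functional ω satisfying

\[
  \omega(a^{*}a) \ge 0 \quad \text{for all } a \in \mathfrak{A},
  \qquad
  \omega(\mathbb{1}) = 1,
\]

and the GNS construction recovers a Hilbert-space representation from any such state, which indicates how the standard formalism can be reproduced within, and its limits assessed from, a purely algebraic scheme.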

14.
The quantum formalism is a measurement formalism: a phenomenological formalism describing certain macroscopic regularities. We argue that it can be regarded, and best be understood, as arising from Bohmian mechanics, which is what emerges from Schrödinger's equation for a system of particles when we merely insist that "particles" means particles. While distinctly non-Newtonian, Bohmian mechanics is a fully deterministic theory of particles in motion, a motion choreographed by the wave function. We find that a Bohmian universe, though deterministic, evolves in such a manner that an appearance of randomness emerges, precisely as described by the quantum formalism and given, for example, by ρ = |ψ|². A crucial ingredient in our analysis of the origin of this randomness is the notion of the effective wave function of a subsystem, a notion of interest in its own right and of relevance to any discussion of quantum theory. When the quantum formalism is regarded as arising in this way, the paradoxes and perplexities so often associated with (nonrelativistic) quantum theory simply evaporate. This paper is dedicated to the memory of J. S. Bell.
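The key property behind this claim is equivariance (standard statement, added here as background): if the configuration is |ψ|²-distributed at one time, the Bohmian flow keeps it so,

\[
  \rho(x,t_0) = |\psi(x,t_0)|^{2}
  \;\Longrightarrow\;
  \rho(x,t) = |\psi(x,t)|^{2} \ \text{for all } t,
\]

since ρ and |ψ|² obey the same continuity equation ∂ρ/∂t + ∇·(ρ v^ψ) = 0, with v^ψ the Bohmian velocity field.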

15.
An SR model is presented that shows how an objective (noncontextual and local) interpretation of quantum mechanics can be constructed, which contradicts some well-established beliefs following from the standard interpretation of the theory and from known no-go theorems. The SR model is not a hidden variables theory in the standard sense, but it can be considered a hidden parameters theory which satisfies constraints that are weaker than those usually imposed on standard hidden variables theories. The SR model is also extended in a natural way that shows how a broader theory embodying quantum mechanics can be envisaged which is realistic in a semantic sense, hence compatible with various realistic perspectives.

16.
17.
In [5] M.J. Maczyński showed that the Hilbert space formalism of quantum mechanics (see [3]) can be derived from a set of seven axioms involving only the probability function p(A, α, E) and the complex field postulate. The purpose of this note is to give the next axiom, which is equivalent to the analogous real field postulate.

18.
The Ghirardi–Rimini–Weber (GRW) theory of spontaneous wave function collapse is known to provide a quantum theory without observers, in fact two different ones by using either the matter density ontology (GRWm) or the flash ontology (GRWf). Both theories are known to make predictions different from those of quantum mechanics, but the difference is so small that no decisive experiment can as yet be performed. While some testable deviations from quantum mechanics have long been known, we provide here something that has until now been missing: a formalism that succinctly summarizes the empirical predictions of GRWm and GRWf. We call it the GRW formalism. Its structure is similar to that of the quantum formalism but involves different operators. In other words, we establish the validity of a general algorithm for directly computing the testable predictions of GRWm and GRWf. We further show that some well-defined quantities cannot be measured in a GRWm or GRWf world.
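For orientation, both GRWm and GRWf are built on the same collapse law (standard GRW model; the formalism described above is a layer on top of it): each particle i suffers, at Poisson-distributed random times with rate λ ≈ 10⁻¹⁶ s⁻¹, a spontaneous localization

\[
  \psi \;\longrightarrow\; \frac{L_{i}(x)\,\psi}{\lVert L_{i}(x)\,\psi \rVert},
  \qquad
  L_{i}(x) = \left(\pi r_c^{2}\right)^{-3/4}
             \exp\!\left(-\frac{(\hat{q}_{i}-x)^{2}}{2 r_c^{2}}\right),
\]

with the collapse centre x distributed with density ‖L_i(x)ψ‖² and localization width r_c ≈ 10⁻⁷ m. The GRW formalism summarizes the statistics of macroscopic records that this dynamics, read through either ontology, predicts.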

19.
The fundamental equations of equilibrium quantum statistical mechanics are derived in the context of a measure-theoretic approach to the quantum mechanical ergodic problem. The method employed is an extension, to quantum mechanical systems, of the techniques developed by R. M. Lewis for establishing the foundations of classical statistical mechanics. The existence of a complete set of commuting observables is assumed, but no reference is made a priori to probability or statistical ensembles. Expressions for infinite-time averages in the microcanonical, canonical, and grand canonical ensembles are developed which reduce to conventional quantum statistical mechanics for systems in equilibrium when the total energy is the only conserved quantity. No attempt is made to extend the formalism at this time to deal with the difficult problem of the approach to equilibrium.
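For reference, the conventional equilibrium expressions that the infinite-time averages reduce to are the standard ensemble averages (quoted here in textbook form, not the paper's notation); for an observable A,

\[
  \langle A \rangle_{\mathrm{can}}
  = \frac{\operatorname{Tr}\!\left(A\,e^{-\beta H}\right)}{\operatorname{Tr}\,e^{-\beta H}},
  \qquad
  \langle A \rangle_{\mathrm{gc}}
  = \frac{\operatorname{Tr}\!\left(A\,e^{-\beta (H-\mu N)}\right)}{\operatorname{Tr}\,e^{-\beta (H-\mu N)}},
\]

with the microcanonical average given by an unweighted trace over the shell of total energy E.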

20.
We reconsider the decoherent histories approach to quantum mechanics and analyze some problems related to its interpretation which we believe have not been adequately clarified by its proponents. We put forward some assumptions which, in our opinion, are necessary for a realistic interpretation of the probabilities that the formalism attaches to decoherent histories. We prove that such assumptions, unless one limits the set of the decoherent families which can be taken into account, lead to a logical contradiction. The line of reasoning we follow is conceptually different from other arguments which have been presented and which have been rejected by the supporters of the decoherent histories approach. The conclusion is that the decoherent histories approach, to be considered as an interesting realistic alternative to the orthodox interpretation of quantum mechanics, requires the identification of a mathematically precise criterion to characterize an appropriate set of decoherent families which does not give rise to any problem.
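For orientation, the probabilities in question are those attached to histories in the standard decoherent-histories formalism: a history α = (α₁, …, α_n) is represented by a class operator C_α, and

\[
  C_{\alpha} = P^{(n)}_{\alpha_{n}}(t_{n})\cdots P^{(1)}_{\alpha_{1}}(t_{1}),
  \qquad
  D(\alpha,\beta) = \operatorname{Tr}\!\left[C_{\alpha}\,\rho\,C_{\beta}^{\dagger}\right],
\]

with p(α) = D(α, α) assigned only to families whose off-diagonal decoherence-functional entries (approximately) vanish. The argument above concerns whether these probabilities can be given a realistic reading simultaneously across all decoherent families.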
