Similar Documents
20 similar documents found.
1.
This article reconsiders the concept of physical reality in quantum theory and the concept of quantum measurement, following Bohr, whose analysis of quantum measurement led him to his concept of a (quantum) “phenomenon,” referring to “the observations obtained under the specified circumstances,” in the interaction between quantum objects and measuring instruments. This situation makes the terms “observation” and “measurement,” as conventionally understood, inapplicable. These terms are remnants of classical physics, or of still earlier history from which classical physics inherited them. As defined here, a quantum measurement does not measure any preexisting property of the ultimate constitution of the reality responsible for quantum phenomena. An act of measurement establishes a quantum phenomenon through an interaction between the instrument and the quantum object or, in the present view, the ultimate constitution of the reality responsible for quantum phenomena and, at the time of measurement, also quantum objects. In the view advanced in this article, in contrast to that of Bohr, quantum objects, such as electrons or photons, are assumed to exist only at the time of measurement and not independently, a view that redefines the concept of quantum object as well. This redefinition becomes especially important in high-energy quantum regimes and quantum field theory, and it allows this article to define a new concept of quantum field. The article also considers, now following Bohr, quantum measurement as the entanglement between quantum objects and measuring instruments. The argument of the article is grounded in the concept of “reality without realism” (RWR), as underlying quantum measurement thus understood, and in the view, the RWR view, of quantum theory defined by this concept. The RWR view places a stratum of physical reality thus designated, here the reality ultimately responsible for quantum phenomena, beyond representation or knowledge, or even conception, and defines the corresponding set of interpretations of quantum mechanics or quantum field theory, such as the one assumed in this article, in which, again, not only quantum phenomena but also quantum objects are (as idealizations) defined by measurement. As such, the article also offers a broadly conceived response to J. Bell’s argument “against ‘measurement’”.

2.
We present a critique of the many-worlds interpretation of quantum mechanics, based on different “pictures” that describe the time evolution of an isolated quantum system. Without an externally imposed frame to restrict these possible pictures, the theory cannot yield non-trivial interpretational statements. This is analogous to Goodman’s famous “grue-bleen” problem of language and induction. Using a general framework applicable to many kinds of dynamical theories, we try to identify the kind of additional structure (if any) required for the meaningful interpretation of a theory. We find that the “grue-bleen” problem is not restricted to quantum mechanics, but also affects other theories including classical Hamiltonian mechanics. For all such theories, absent external frame information, an isolated system has no interpretation.

3.
The consensus regarding quantum measurements rests on two statements: (i) von Neumann’s standard quantum measurement theory leaves undetermined the basis in which observables are measured, and (ii) the environmental decoherence of the measuring device (the “meter”) unambiguously determines the measuring (“pointer”) basis. The latter statement means that the environment monitors (measures) selected observables of the meter and (indirectly) of the system. Equivalently, a measured quantum state must end up in one of the “pointer states” that persist in the presence of the environment. We find that, unless we restrict ourselves to projective measurements, decoherence does not necessarily determine the pointer basis of the meter. Namely, generalized measurements commonly allow the observer to choose from a multitude of alternative pointer bases that provide the same information on the observables, regardless of decoherence. By contrast, the measured observable does not depend on the pointer basis, whether in the presence or in the absence of decoherence. These results grant further support to our notion of Quantum Lamarckism, whereby the observer’s choices play an indispensable role in quantum mechanics.

4.
Quantum information theorists have created axiomatic reconstructions of quantum mechanics (QM) that are very successful at identifying precisely what distinguishes quantum probability theory from classical and more general probability theories in terms of information-theoretic principles. Herein, we show how one such principle, Information Invariance and Continuity, at the foundation of those axiomatic reconstructions, maps to “no preferred reference frame” (NPRF, aka “the relativity principle”) as it pertains to the invariant measurement of Planck’s constant h for Stern-Gerlach (SG) spin measurements. This is in exact analogy to the relativity principle as it pertains to the invariant measurement of the speed of light c at the foundation of special relativity (SR). Essentially, quantum information theorists have extended Einstein’s use of NPRF from the boost invariance of measurements of c to include the SO(3) invariance of measurements of h between different reference frames of mutually complementary spin measurements via the principle of Information Invariance and Continuity. Consequently, the “mystery” of the Bell states is understood to result from conservation per Information Invariance and Continuity between different reference frames of mutually complementary qubit measurements, and this maps to conservation per NPRF in spacetime. If one falsely conflates the relativity principle with the classical theory of SR, then it may seem impossible that the relativity principle resides at the foundation of non-relativistic QM. In fact, there is nothing inherently classical or quantum about NPRF. Thus, the axiomatic reconstructions of QM have succeeded in producing a principle account of QM that reveals as much about Nature as the postulates of SR.
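A minimal numerical sketch (my own illustration, not from the paper) of the invariance discussed above: for the singlet Bell state, each observer's spin outcome is always +/- hbar/2 with equal probability regardless of the measurement direction chosen, while the joint correlation depends only on the relative angle. The angles and the x-z plane parametrization are arbitrary choices for the example.

```python
import numpy as np

# Pauli matrices and the two-qubit singlet state (|01> - |10>)/sqrt(2)
I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def spin_along(theta):
    """Spin observable (in units of hbar/2) along a direction at angle theta in the x-z plane."""
    return np.cos(theta) * sz + np.sin(theta) * sx

for theta_a, theta_b in [(0.0, 0.0), (0.0, np.pi / 3), (np.pi / 4, 3 * np.pi / 4)]:
    A, B = spin_along(theta_a), spin_along(theta_b)
    corr = np.real(singlet.conj() @ np.kron(A, B) @ singlet)      # <A x B> = -cos(theta_a - theta_b)
    mean_A = np.real(singlet.conj() @ np.kron(A, I2) @ singlet)   # always 0: outcomes +/-1 with 50/50 odds
    print(f"a = {theta_a:.2f}, b = {theta_b:.2f}: <AB> = {corr:+.3f} "
          f"(expected {-np.cos(theta_a - theta_b):+.3f}), <A> = {mean_A:+.2f}")
```

Whatever direction either party picks, the single-party statistics are identical; only the joint correlation carries the angle dependence, which is the conservation-per-NPRF reading sketched in the abstract.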

5.
Quantum digital signatures (QDS) can verify the authenticity and integrity of a message in modern communication. However, current QDS protocols are restricted by the fundamental rate-loss bound, so the secure signature distance cannot be further improved. We propose a twin-field quantum digital signature (TF-QDS) protocol with fully discrete phase randomization and investigate its performance in the two-intensity decoy-state setting. For better performance, we optimize the intensities of the signal state and the decoy state for each given distance. Numerical simulation results show that our TF-QDS, with as few as six discrete random phases, gives a higher signature rate and a longer secure transmission distance than existing QDS protocols such as BB84-QDS and measurement-device-independent QDS (MDI-QDS). Moreover, we provide a clear comparison among possible TF-QDS variants constructed from different twin-field key generation protocols (TF-KGPs) and find that the proposed TF-QDS exhibits the best performance. In conclusion, the advantages of the proposed TF-QDS protocol in signature rate and secure transmission distance are mainly due to the single-photon interference applied in the measurement module and the precise matching of discrete phases. In addition, our TF-QDS is feasible to implement with current devices in practical QDS systems.
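For orientation only (not the paper's simulation): the reason twin-field-type protocols can beat the repeaterless rate-loss bound is that their key/signature rate scales roughly with the square root of the end-to-end channel transmittance eta instead of linearly in eta. The sketch below contrasts the two scalings under an assumed fiber loss of 0.2 dB/km; the distances and loss value are illustrative, and no security analysis is implied.

```python
import numpy as np

alpha_db_per_km = 0.2  # assumed standard fiber loss
for L_km in (100, 300, 500):
    eta = 10 ** (-alpha_db_per_km * L_km / 10)   # end-to-end transmittance
    # Repeaterless protocols: rate ~ eta.  Twin-field protocols: rate ~ sqrt(eta),
    # because single-photon interference occurs at an untrusted middle node.
    print(f"L = {L_km:3d} km: eta = {eta:.2e}, sqrt(eta) = {np.sqrt(eta):.2e}")
```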

6.
Wigner’s friend scenarios involve an Observer, or Observers, measuring a Friend, or Friends, who themselves make quantum measurements. In recent discussions, it has been suggested that quantum mechanics may not always be able to provide a consistent account of a situation involving two Observers and two Friends. We investigate this problem by invoking the basic rules of quantum mechanics as outlined by Feynman in the well-known “Feynman Lectures on Physics”. We show here that these “Feynman rules” constrain the a priori assumptions which can be made in generalised Wigner’s friend scenarios, because the existence of the probabilities of interest ultimately depends on the availability of physical evidence (material records) of the system’s past. With these constraints obeyed, an unambiguous and consistent account of all measurement outcomes is obtained for all agents taking part in the various Wigner’s friend scenarios.

7.
Einstein-Podolsky-Rosen steering is a powerful nonlocal quantum resource in quantum information processing tasks such as quantum cryptography and quantum communication. Many criteria have been proposed in the past few years to detect steerability, both analytically and numerically, for bipartite quantum systems. We propose effective criteria for the tripartite steerability and genuine tripartite steerability of three-qubit quantum states by establishing connections between the tripartite steerability (resp. genuine tripartite steerability) and the tripartite entanglement (resp. genuine tripartite entanglement) of certain corresponding quantum states. From these connections, tripartite steerability and genuine tripartite steerability can be detected without using any steering inequalities. The “complex cost” of determining tripartite steering and genuine tripartite steering can be reduced by detecting the entanglement of the newly constructed states in the experiment. Detailed examples are given to illustrate the power of our criteria in detecting the (genuine) tripartite steerability of tripartite states.

8.
The task of reconstructing the system’s state from the measurement results, known as the Pauli problem, usually requires repetition of two successive steps. Preparation in an initial state to be determined is followed by an accurate measurement of one of several chosen operators in order to provide the necessary “Pauli data”. We consider a similar yet more general problem of recovering Feynman’s transition (path) amplitudes from the results of at least three consecutive measurements. The three-step histories of a pre- and post-selected quantum system are subjected to a type of interference not available to their two-step counterparts. We show that this interference can be exploited and that, if the intermediate measurement is “fuzzy”, the path amplitudes can be successfully recovered. The simplest case of a two-level system is analysed in detail. The “weak measurement” limit and the usefulness of the path amplitudes are also discussed.

9.
The article argues that—at least in certain interpretations, such as the one assumed in this article under the heading of “reality without realism”—the quantum-theoretical situation appears as follows: While—in terms of probabilistic predictions—connected to and connecting the information obtained in quantum phenomena, the mathematics of quantum theory (QM or QFT), which is continuous, does not represent, and is discontinuous with, both the emergence of quantum phenomena and the physics of these phenomena, phenomena that are physically discontinuous with each other as well. These phenomena, and thus this information, are described by classical physics. All actually available information (in the mathematical sense of information theory) is classical: it is composed of units, such as bits, that are—or are contained in—entities described by classical physics. On the other hand, classical physics cannot predict this information when it is created, as manifested in measuring instruments, in quantum experiments, while quantum theory can. In this epistemological sense, this information is quantum. The article designates the discontinuity between quantum theory and the emergence of quantum phenomena as the “Heisenberg discontinuity”, because it was introduced by W. Heisenberg along with QM, and the discontinuity between QM or QFT and the classical physics of quantum phenomena as the “Bohr discontinuity”, because it was introduced as part of Bohr’s interpretation of quantum phenomena and QM, under the assumption of Heisenberg discontinuity. Combining both discontinuities precludes QM or QFT from being connected to either physical reality, whether that ultimately responsible for quantum phenomena or that of these phenomena themselves, other than by means of probabilistic predictions concerning the information, classical in character, contained in quantum phenomena. The nature of quantum information is, in this view, defined by this situation. A major implication, discussed in the Conclusion, is the existence and arguably the necessity of two—classical and quantum—or, with relativity, three and possibly more, essentially different theories in fundamental physics.

10.
I take non-locality to be the Michelson–Morley experiment of the early 21st century, assume its universal validity, and try to derive its consequences. Spacetime, with its locality, cannot be fundamental, but must somehow be emergent from entangled coherent quantum variables and their behaviors. There are, then, two immediate consequences: (i) if we start with non-locality, we need not explain non-locality; we must instead explain an emergence of locality and spacetime; (ii) there can be no emergence of spacetime without matter. These propositions flatly contradict General Relativity, which is foundationally local, can be formulated without matter, and in which there is no “emergence” of spacetime. If these are true, then quantum gravity cannot be a minor alteration of General Relativity but must demand its deep reformulation. This will almost inevitably lead to the view that matter not only curves spacetime but “creates” spacetime. We will see independent grounds for the assertion that matter both curves and creates spacetime, grounds that may invite a new union of quantum gravity and General Relativity. This quantum creation of spacetime consists of: (i) fully non-local entangled coherent quantum variables; (ii) the onset of locality via decoherence; (iii) a metric in Hilbert space among entangled quantum variables given by the sub-additive von Neumann entropy between pairs of variables; (iv) a mapping from metric distances in Hilbert space to metric distances in classical spacetime by episodic actualization events; (v) discrete spacetime as the relations among these discrete actualization events; (vi) “Now” as the shared moment of actualization of one among the entangled variables, when the amplitudes of the remaining entangled variables change instantaneously; (vii) the discrete, successive, episodic, irreversible actualization events, which constitute a quantum arrow of time; (viii) the arrow-of-time history of these events, recorded in the very structure of the spacetime constructed; (ix) Actual Time as a succession of two or more actual events. The theory inevitably yields a UV cutoff of a new type: the cutoff is a phase transition between continuous spacetime before the transition and discontinuous spacetime beyond it. This quantum creation of spacetime modifies General Relativity and may account for Dark Energy, Dark Matter, and the possible elimination of the singularities of General Relativity. Relations to Causal Set Theory, faithful Lorentzian manifolds, and past and future light cones joined at the “Actual Now” are discussed. Possible observational and experimental tests based on (i) the existence of sub-Planckian photons, (ii) knee and ankle discontinuities in the high-energy gamma-ray spectrum, and (iii) possible experiments to detect a creation of spacetime in the Casimir system are discussed. A quantum-actualization enhancement of the repulsive Casimir effect would be anti-gravitational and of possible practical use. The ideas and concepts discussed here are not yet a theory, but at most the start of a framework that may be useful.
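As a toy illustration of item (iii) only (my own sketch, not the author's construction): for a pair of entangled qubits one can compute the von Neumann entropies S(A), S(B), S(AB) and check sub-additivity; the mutual information S(A) + S(B) - S(AB) is one candidate entropic quantity from which a distance-like relation between variables could be built. The family of states and the use of mutual information here are assumptions for the example.

```python
import numpy as np

def vn_entropy_bits(rho):
    """Von Neumann entropy in bits."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

# Two qubits in the pure state |psi(t)> = cos(t)|00> + sin(t)|11>.
# Sub-additivity: S(A) + S(B) >= S(AB) (= 0 here, since the joint state is pure).
for t in (0.0, np.pi / 8, np.pi / 4):
    psi = np.zeros(4, dtype=complex)
    psi[0], psi[3] = np.cos(t), np.sin(t)          # amplitudes of |00> and |11>
    M = psi.reshape(2, 2)                          # bipartite amplitude matrix
    rho_A, rho_B = M @ M.conj().T, M.conj().T @ M  # reduced density matrices
    S_A, S_B, S_AB = vn_entropy_bits(rho_A), vn_entropy_bits(rho_B), 0.0
    print(f"t = {t:.3f}: S(A) = {S_A:.3f}, S(B) = {S_B:.3f}, "
          f"I(A:B) = {S_A + S_B - S_AB:.3f} bits")
```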

11.
In a previous article we presented an argument to obtain (or rather infer) Born’s rule, based on a simple set of axioms named “Contexts, Systems and Modalities” (CSM). In this approach, there is no “emergence”, but the structure of quantum mechanics can be attributed to an interplay between the quantized number of modalities accessible to a quantum system and the continuum of contexts required to define these modalities. The strong link of this derivation with Gleason’s theorem was emphasized, with the argument that CSM provides a physical justification for Gleason’s hypotheses. Here, we extend this result by showing that an essential one among these hypotheses—the need for unitary transforms to relate different contexts—can be removed and is better seen as a necessary consequence of Uhlhorn’s theorem.

12.
We propose the first correct special-purpose quantum circuits for the preparation of Bell diagonal states (BDS), and implement them on the IBM Quantum computer, characterizing and testing complex aspects of their quantum correlations in the full parameter space. Among the circuits proposed, one involves only two quantum bits but requires adapted quantum tomography routines handling classical bits in parallel. The entire class of Bell diagonal states is generated, and several characteristic indicators, namely entanglement of formation and concurrence, CHSH non-locality, steering and discord, are experimentally evaluated over the full parameter space and compared with theory. As a by-product of this work, we also find a remarkable general inequality between “quantum discord” and “asymmetric relative entropy of discord”: the former never exceeds the latter. We also prove that for all BDS the two coincide.
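A minimal sketch (not the paper's circuits or tomography): a Bell diagonal state is a classical mixture of the four Bell states, so its density matrix and its Wootters concurrence can be written down directly; for BDS the concurrence reduces to max(0, 2·p_max - 1). The Bell-state weights below are arbitrary example values.

```python
import numpy as np

# The four Bell states in the |00>, |01>, |10>, |11> basis
bell = np.array([
    [1, 0, 0, 1],    # |Phi+>
    [1, 0, 0, -1],   # |Phi->
    [0, 1, 1, 0],    # |Psi+>
    [0, 1, -1, 0],   # |Psi->
], dtype=complex) / np.sqrt(2)

def bell_diagonal_state(p):
    """Bell-diagonal density matrix: a classical mixture of the four Bell states."""
    return sum(pi * np.outer(b, b.conj()) for pi, b in zip(p, bell))

def concurrence(rho):
    """Wootters concurrence of a two-qubit density matrix."""
    sy = np.array([[0, -1j], [1j, 0]])
    flip = np.kron(sy, sy)
    lam = np.sqrt(np.abs(np.linalg.eigvals(rho @ flip @ rho.conj() @ flip)))
    lam = np.sort(lam)[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

p = [0.7, 0.1, 0.1, 0.1]                            # example Bell-state weights (sum to 1)
rho = bell_diagonal_state(p)
print(concurrence(rho), max(0.0, 2 * max(p) - 1))   # both should give 0.4
```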

13.
This article reconsiders the double-slit experiment from the nonrealist or, in terms of this article, “reality-without-realism” (RWR) perspective, grounded in the combination of three forms of quantum discontinuity: (1) “Heisenberg discontinuity”, defined by the impossibility of a representation or even conception of how quantum phenomena come about, even though quantum theory (such as quantum mechanics or quantum field theory) predicts the data in question strictly in accord with what is observed in quantum experiments; (2) “Bohr discontinuity”, defined, under the assumption of Heisenberg discontinuity, by the view that quantum phenomena and the data observed therein are described by classical and not quantum theory, even though classical physics cannot predict them; and (3) “Dirac discontinuity” (not considered by Dirac himself, but suggested by his equation), according to which the concept of a quantum object, such as a photon or electron, is an idealization only applicable at the time of observation and not to something that exists independently in nature. Dirac discontinuity is of particular importance for the article’s foundational argument and its analysis of the double-slit experiment.
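A small numerical illustration (mine, not the article's) of the predictive side of the argument: quantum theory reproduces the observed double-slit statistics by adding amplitudes when no which-path record exists and adding probabilities when one does. Wavelength, slit separation, screen distance, and sample points are arbitrary example values, and the computed intensities are relative (unnormalized).

```python
import numpy as np

wavelength, slit_sep, screen_dist = 500e-9, 50e-6, 1.0   # metres (illustrative values)
x = np.linspace(-0.05, 0.05, 5)                          # positions on the screen (metres)

# Far-field relative phase between the two paths, and equal-magnitude per-slit amplitudes
phase = 2 * np.pi * slit_sep * x / (screen_dist * wavelength)
amp1, amp2 = np.ones_like(x, dtype=complex), np.exp(1j * phase)

p_no_record = np.abs(amp1 + amp2) ** 2 / 4                   # amplitudes added, then squared
p_with_record = (np.abs(amp1) ** 2 + np.abs(amp2) ** 2) / 4  # probabilities added

for xi, p_int, p_rec in zip(x, p_no_record, p_with_record):
    print(f"x = {xi:+.3f} m: no which-path record {p_int:.3f}, which-path recorded {p_rec:.3f}")
```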

14.
It is known that “quantum non-locality”, leading to the violation of Bell’s inequality and more generally of classical local realism, can be attributed to the conjunction of two properties, which we call here elementary locality and predictive completeness. Taking this point of view, we show again that quantum mechanics violates predictive completeness, allowing contextual inferences to be made, which can, in turn, explain why quantum non-locality does not contradict relativistic causality. An important question remains: if the usual quantum state ψ is predictively incomplete, how do we complete it? We give here a set of new arguments to show that ψ should indeed be completed, not by looking for any “hidden variables”, but rather by specifying the measurement context, which is required to define actual probabilities over a set of mutually exclusive physical events.

15.
We consider a semantics based on the peculiar holistic features of the quantum formalism. Any formula of the language gives rise to a quantum circuit that transforms the density operator associated with the formula into the density operator associated with the atomic subformulas in a reversible way. The procedure goes from the whole to the parts, against the compositionality principle, and gives rise to a semantic characterization of a new form of quantum logic that has been called “Łukasiewicz quantum computational logic”. It is interesting to compare the logic based on qubit-semantics with that based on qudit-semantics. Having in mind the relationships between classical logic and Łukasiewicz many-valued logics, one could expect that the former is stronger than the fragment of the latter. However, this is not the case. From an intuitive point of view, this can be explained by recalling that the former is a very weak form of logic. Many important logical arguments, which are valid either in Birkhoff and von Neumann’s quantum logic or in classical logic, are generally violated.

16.
The meaning and evolution of the notion of “temperature” (a key concept in the theories of condensed and gaseous matter) are addressed from different points of view. The concept of temperature has turned out to be much more fundamental than conventionally thought. In particular, a temperature may be introduced for systems built of a “small” number of particles and for particles at rest. The Kelvin temperature scale may be introduced into quantum and relativistic physics because the efficiency of the quantum and relativistic Carnot cycles coincides with that of the classical one. The relation of temperature to the metrics of the configurational space describing the behavior of systems built from non-interacting particles is demonstrated. The role of temperature in constituting inertia and gravity forces treated as entropy forces is addressed. The Landauer principle asserts that the temperature of a system is the only physical quantity defining the energy cost of the isothermal erasure of a single bit of information. The fundamental role of the temperature of the cosmic microwave background in modern cosmology is discussed. The range of problems and controversies related to negative absolute temperature is also treated.
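A short worked example of the Landauer statement above (illustrative; the temperatures are arbitrary choices, apart from the cosmic microwave background value): the minimum energy cost of isothermally erasing one bit at temperature T is k_B·T·ln 2, roughly 2.9 × 10⁻²¹ J at room temperature.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def landauer_bound_joules(T_kelvin):
    """Minimum energy dissipated by isothermally erasing one bit at temperature T."""
    return K_B * T_kelvin * math.log(2)

for T in (2.725, 77.0, 300.0):   # CMB temperature, liquid nitrogen, room temperature
    print(f"T = {T:7.3f} K: E_min = {landauer_bound_joules(T):.3e} J per bit")
```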

17.
18.
Core quantum postulates including the superposition principle and the unitarity of evolutions are natural and strikingly simple. I show that—when supplemented with a limited version of predictability (captured in the textbook accounts by the repeatability postulate)—these core postulates can account for all the symptoms of classicality. In particular, both objective classical reality and elusive information about reality arise, via quantum Darwinism, from the quantum substrate. This approach shares with the Relative State Interpretation of Everett the view that collapse of the wavepacket reflects perception of the state of the rest of the Universe relative to the state of the observer’s records. However, our “let quantum be quantum” approach poses questions absent in Bohr’s Copenhagen Interpretation, which relied on a preexisting classical domain. Thus, one is now forced to seek preferred, predictable, hence effectively classical but ultimately quantum states that allow observers to keep reliable records. Without such a (i) preferred basis, relative states are simply “too relative”, and the ensuing basis ambiguity makes it difficult to identify events (e.g., measurement outcomes). Moreover, the universal validity of quantum theory raises the issue of (ii) the origin of Born’s rule, p_k = |ψ_k|², relating probabilities and amplitudes (which is simply postulated in textbooks). Last but not least, even preferred pointer states (defined by einselection, i.e., environment-induced superselection) are still quantum. Therefore, unlike classical states that exist objectively, quantum states of an individual system cannot be found out by an initially ignorant observer through direct measurement without being disrupted. So, to complete the ‘quantum theory of the classical’ one must identify (iii) the quantum origin of objective existence and explain how the information about objectively existing states can appear to be essentially inconsequential for them (as it does for states in Newtonian physics) and yet matter in other settings (e.g., thermodynamics). I show how the mathematical structure of quantum theory, supplemented by the only uncontroversial measurement postulate (which demands immediate repeatability, hence predictability), leads to preferred states. These (i) pointer states correspond to measurement outcomes. Their stability is a prerequisite for the objective existence of effectively classical states and for events such as quantum jumps. With events at hand, one can now enquire about their probability—the probability of a pointer state (or of a measurement record). I show that the symmetry of entangled states—(ii) entanglement-assisted invariance, or envariance—implies Born’s rule. Envariance also accounts for the loss of phase coherence between pointer states. Thus, decoherence can be traced to symmetries of entanglement and understood without its usual tool—reduced density matrices. A simple and manifestly noncircular derivation of p_k = |ψ_k|² follows. Monitoring of the system by its environment in the course of decoherence typically leaves behind multiple copies of its pointer states in the environment. Only pointer states can survive decoherence and spawn such plentiful information-theoretic progeny. This (iii) quantum Darwinism allows observers to use the environment as a witness—to find out pointer states indirectly, leaving the systems of interest untouched.
Quantum Darwinism shows how the epistemic and the ontic (coexisting in the “epiontic” quantum state) separate into the robust objective existence of pointer states and detached information about them, giving rise to extantons—composite objects with the system of interest in the core and multiple records of its pointer states in a halo comprising environment subsystems (e.g., photons) that disseminate that information throughout the Universe.
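A standard toy model (my sketch, not Zurek's envariance derivation) of the pointer-state picture described above: a qubit a|0⟩ + b|1⟩ is monitored by N environment qubits, each rotated by a small angle only if the system is in |1⟩. The reduced state keeps Born-rule weights |a|², |b|² on the pointer diagonal, while the off-diagonal coherence is suppressed by the environment-state overlap ⟨E₀|E₁⟩ = cos(ε)^N. The amplitudes, angle, and environment sizes are arbitrary example values.

```python
import numpy as np

a, b = np.sqrt(0.3), np.sqrt(0.7)    # system amplitudes; Born-rule weights are 0.3 and 0.7
eps = 0.2                            # per-environment-qubit "monitoring" angle

for N in (0, 10, 100, 1000):
    overlap = np.cos(eps) ** N       # <E0|E1> for N product-state environment qubits
    rho_system = np.array([[abs(a) ** 2,               a * np.conj(b) * overlap],
                           [np.conj(a) * b * overlap,  abs(b) ** 2]])
    print(f"N = {N:4d}: diag = ({rho_system[0, 0]:.2f}, {rho_system[1, 1]:.2f}), "
          f"|off-diagonal| = {abs(rho_system[0, 1]):.2e}")
```

The diagonal (the pointer-basis probabilities) is untouched by the monitoring, while the coherence between pointer states decays exponentially with the number of environment records, which is the precondition for the redundant copying that quantum Darwinism exploits.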

19.
Integrated information has been suggested as a possible measure to identify a necessary condition for a system to display conscious features. Recently, we have shown that astrocytes contribute to the generation of integrated information through the complex behavior of neuron–astrocyte networks. Still, it remained unclear which underlying mechanisms governing the complex behavior of a neuron–astrocyte network are essential to generating positive integrated information. This study presents an analytic consideration of this question based on exact and asymptotic expressions for integrated information in terms of exactly known probability distributions for a reduced mathematical model (a discrete-time, discrete-state stochastic model) reflecting the main features of the “spiking–bursting” dynamics of a neuron–astrocyte network. The analysis was performed in terms of the empirical “whole minus sum” version of integrated information in comparison with the “decoder based” version. The “whole minus sum” information may change sign, and an interpretation of this transition in terms of “net synergy” is available in the literature. This motivated our particular interest in the sign of the “whole minus sum” information in our analytical considerations. The behaviors of the “whole minus sum” and “decoder based” information measures are found to be closely similar: they converge asymptotically as time-uncorrelated activity increases, and the sign transition of the “whole minus sum” information is associated with a rapid growth in the “decoder based” information. The study aims to create a theoretical framework for using the spiking–bursting model as an analytically tractable reference point for applying integrated-information concepts to systems exhibiting similar bursting behavior. The model can also be of interest as a new discrete-state test bench for different formulations of integrated information.
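A tiny worked example (mine, and possibly differing in detail from the paper's definitions) of one common "whole minus sum" construction: for a two-unit binary system with a noisy "swap" update, the past-to-present mutual information of the whole exceeds the sum over the parts, giving a positive value. The dynamics and noise level are arbitrary example choices.

```python
import numpy as np
from itertools import product

def mutual_information(joint):
    """Mutual information (bits) of a joint distribution given as a 2-D array p[x, y]."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / (px @ py)[mask])))

noise = 0.05                        # probability that each unit's next state is flipped
p = np.zeros((2, 2, 2, 2))          # indices: x1, x2 (past), y1, y2 (present)
for x1, x2 in product((0, 1), repeat=2):            # uniform, independent past
    for y1, y2 in product((0, 1), repeat=2):
        # noisy "swap" dynamics: y1 copies x2 and y2 copies x1, each flipped with prob. noise
        p1 = (1 - noise) if y1 == x2 else noise
        p2 = (1 - noise) if y2 == x1 else noise
        p[x1, x2, y1, y2] = 0.25 * p1 * p2

whole = mutual_information(p.reshape(4, 4))          # I(X1 X2 ; Y1 Y2)
parts = mutual_information(p.sum(axis=(1, 3))) + mutual_information(p.sum(axis=(0, 2)))
print(f"whole = {whole:.3f} bits, sum of parts = {parts:.3f} bits, "
      f"'whole minus sum' = {whole - parts:.3f} bits")
```

For other dynamics the same quantity can become negative, which is the sign change the abstract interprets via "net synergy".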

20.
If quantum mechanics is taken for granted, the randomness derived from it may be vacuous or even delusional, yet sufficient for many practical purposes. “Random” quantum events are intimately related both to the emergence of space-time and to the identification of the physical properties through which so-called objects are aggregated. We also present a brief review of the metaphysics of indeterminism.

