Similar Documents
20 similar documents found (search time: 31 ms)
1.
In 2016, Steve Gull outlined a proof of Bell's theorem using Fourier theory. Gull's philosophy is that Bell's theorem (or perhaps a key lemma in its proof) can be seen as a no-go theorem for a project in distributed computing with classical, not quantum, computers. We present his argument, correcting misprints and filling gaps. In his argument, there were two completely separate computers in the network. We need three in order to fill all the gaps in his proof: a third computer supplies a stream of random numbers to the two computers representing the two measurement stations in Bell's work. One could also imagine that computer being replaced by a cloned, virtual computer generating the same pseudo-random numbers within each of Alice's and Bob's computers. Either way, we need to assume the presence of shared i.i.d. randomness in the form of a synchronised sequence of realisations of i.i.d. hidden variables underlying the otherwise deterministic physics of the sequence of trials. Gull's proof then needs just a third step: rewriting an expectation as the expectation of a conditional expectation given the hidden variables.
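A minimal sketch of such a three-computer network (the cosine response functions and CHSH settings below are illustrative assumptions, not Gull's construction): a source computer streams shared i.i.d. hidden variables to two measurement computers whose outcomes are deterministic functions of the local setting and the hidden variable. Any model of this form stays within the CHSH bound |S| <= 2.

```python
# Sketch: local-deterministic Bell simulation with shared i.i.d. randomness.
import numpy as np

rng = np.random.default_rng(0)

def correlation(a, b, n=50_000):
    lam = rng.uniform(0, 2 * np.pi, n)           # third computer: shared hidden variables
    A = np.where(np.cos(a - lam) >= 0, 1, -1)    # Alice's deterministic outcome
    B = np.where(np.cos(b - lam) >= 0, 1, -1)    # Bob's deterministic outcome
    return np.mean(A * B)                        # empirical E[AB] = E[E[AB | lambda]]

# Standard CHSH settings: quantum mechanics reaches 2*sqrt(2) here,
# but this classical network can at best saturate the local bound of 2.
a0, a1, b0, b1 = 0, np.pi / 2, np.pi / 4, -np.pi / 4
S = (correlation(a0, b0) + correlation(a0, b1)
     + correlation(a1, b0) - correlation(a1, b1))
print(f"S = {S:.3f}  (|S| <= 2 for any such local model)")
```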

2.
In a previous article we presented an argument to obtain (or rather infer) Born's rule, based on a simple set of axioms named "Contexts, Systems and Modalities" (CSM). In this approach, there is no "emergence", but the structure of quantum mechanics can be attributed to an interplay between the quantized number of modalities accessible to a quantum system and the continuum of contexts required to define these modalities. The strong link of this derivation with Gleason's theorem was emphasized, with the argument that CSM provides a physical justification for Gleason's hypotheses. Here, we extend this result by showing that an essential one among these hypotheses, the need for unitary transforms to relate different contexts, can be removed: it is better seen as a necessary consequence of Uhlhorn's theorem.

3.
Our account provides a local, realist and fully non-causal principle explanation for EPR correlations, contextuality, no-signalling, and the Tsirelson bound. Indeed, the account herein is fully consistent with the causal structure of Minkowski spacetime. We argue that retrocausal accounts of quantum mechanics are problematic precisely because they do not fully transcend the assumption that causal or constructive explanation must always be fundamental. Unlike retrocausal accounts, our principle explanation is a complete rejection of Reichenbach's Principle. Furthermore, we will argue that the basis for our principle account of quantum mechanics is the physical principle sought by quantum information theorists for their reconstructions of quantum mechanics. Finally, we explain why our account is both fully realist and psi-epistemic.

4.
It is well-known that the law of total probability does not generally hold in quantum theory. However, recent arguments on some of the fundamental assumptions in quantum theory based on the extended Wigner's friend scenario show a need to clarify how the law of total probability should be formulated in quantum theory and under what conditions it still holds. In this work, the definition of conditional probability in quantum theory is extended to POVM measurements. A rule to assign two-time conditional probability is proposed for incompatible POVM operators, which leads to a more general and precise formulation of the law of total probability. Sufficient conditions under which the law of total probability holds are identified. Applying the theory developed here to analyze several quantum no-go theorems related to the extended Wigner's friend scenario reveals logical loopholes in these no-go theorems. The loopholes exist as a consequence of taking for granted the validity of the law of total probability without verifying the sufficient conditions. Consequently, the contradictions in these no-go theorems only reconfirm the invalidity of the law of total probability in quantum theory rather than invalidating the physical statements that the no-go theorems attempt to refute.
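A minimal numpy sketch (an illustrative qubit special case, not the paper's POVM formalism) of the failure being discussed: inserting an intermediate measurement in an incompatible basis changes the final statistics, so P(b) differs from sum_a P(a) P(b|a).

```python
# Sketch: law of total probability fails for incompatible qubit measurements.
import numpy as np

psi = np.array([np.cos(0.3), np.sin(0.3)])                 # initial pure state
z_basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]     # Z eigenstates
x_basis = [np.array([1.0, 1.0]) / np.sqrt(2),
           np.array([1.0, -1.0]) / np.sqrt(2)]             # X eigenstates

# Direct probability of each X outcome b.
p_direct = [abs(b @ psi) ** 2 for b in x_basis]

# "Total probability" route: first measure Z (outcome a), then X.
p_total = [sum(abs(a @ psi) ** 2 * abs(b @ a) ** 2 for a in z_basis)
           for b in x_basis]

print("P(b) directly     :", np.round(p_direct, 4))  # [0.7823, 0.2177]
print("sum_a P(a) P(b|a) :", np.round(p_total, 4))   # [0.5, 0.5] -- differs
```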

5.
Wigner's friend scenarios involve an Observer, or Observers, measuring a Friend, or Friends, who themselves make quantum measurements. In recent discussions, it has been suggested that quantum mechanics may not always be able to provide a consistent account of a situation involving two Observers and two Friends. We investigate this problem by invoking the basic rules of quantum mechanics as outlined by Feynman in the well-known "Feynman Lectures on Physics". We show here that these "Feynman rules" constrain the a priori assumptions which can be made in generalised Wigner's friend scenarios, because the existence of the probabilities of interest ultimately depends on the availability of physical evidence (material records) of the system's past. With these constraints obeyed, an unambiguous and consistent account of all measurement outcomes is obtained for all agents taking part in various Wigner's friend scenarios.

6.
Max Born's statistical interpretation made probabilities play a major role in quantum theory. Here we show that these quantum probabilities and the classical probabilities have very different origins. Although the latter always result from an assumed probability measure, the former include transition probabilities with a purely algebraic origin. Moreover, the general definition of transition probability introduced here comprises not only the well-known quantum mechanical transition probabilities between pure states or wave functions, but also further physically meaningful and experimentally verifiable novel cases. A transition probability that differs from 0 and 1 manifests the typical quantum indeterminacy in a similar way as Heisenberg's and others' uncertainty relations and, furthermore, rules out deterministic states in the same way as the Bell-Kochen-Specker theorem. However, the transition probability defined here achieves much more than that: it demonstrates that the algebraic structure of the Hilbert space quantum logic dictates the precise values of certain probabilities, and it provides an unexpected access to these quantum probabilities that does not rely on states or wave functions.
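For reference, a minimal sketch of the textbook special case mentioned above, the transition probability |&lt;phi|psi&gt;|^2 between pure states (the paper's algebraic definition generalizes beyond this state-based formula):

```python
# Sketch: transition probability between two pure qubit states.
import numpy as np

psi = np.array([1, 0], dtype=complex)                # |0>
phi = np.array([1, 1], dtype=complex) / np.sqrt(2)   # |+>

p = abs(np.vdot(phi, psi)) ** 2
print(p)   # 0.5: strictly between 0 and 1, manifesting quantum indeterminacy
```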

7.
In the physical foundation of his radiation formula in his December 1900 talk and subsequent 1901 article, Planck referred to Boltzmann's 1877 combinatorial-probabilistic treatment and obtained his quantum distribution function, which Boltzmann did not. For this reason, Boltzmann's memoirs are usually ascribed to classical statistical mechanics. In agreement with Bach, it is shown that Boltzmann's 1868 and 1877 calculations can lead to a Planckian distribution function, where those of 1868 are even closer to Planck than that of 1877. Boltzmann's and Planck's calculations are compared based on Bach's three-level scheme 'configuration-occupation-occupancy'. Special attention is paid to the concepts of interchangeability and the indistinguishability of particles and states. In contrast to Bach's, the level of exposition here is elementary. I hope to make Boltzmann's work better known in English and to remove misunderstandings in the literature.

8.
We introduce a quantum key distribution protocol using the mean multi-kings' problem. Using this protocol, a sender can share a bit sequence as a secret key with receivers. We consider a relation between the information gained by an eavesdropper and the disturbance contained in the legitimate users' information. In the BB84 protocol, such a relation is known as the information disturbance theorem. We focus on a setting in which the sender and two receivers try to share bit sequences and the eavesdropper tries to extract information by coupling the legitimate users' systems to an ancilla system. We derive trade-off inequalities between the distinguishability, for the eavesdropper, of quantum states corresponding to the bit sequence and the error probability of the bit sequence shared by the legitimate users. Our inequalities show that the eavesdropper's extraction of information about the secret keys inevitably disturbs the states and increases the error probability.

9.
Maxwell's demon is an entity in a 150-year-old thought experiment that paradoxically appears to violate the second law of thermodynamics by reducing entropy without doing work. It has increasingly practical implications as advances in nanomachinery produce devices that push the thermodynamic limits imposed by the second law. A well-known explanation claiming that information erasure restores second law compliance fails to resolve the paradox because it assumes the second law a priori and does not predict irreversibility. Instead, a purely mechanical resolution that does not require information theory is presented. The transport fluxes of mass, momentum, and energy involved in the demon's operation are analyzed and shown to imply "hidden" external work and dissipation. Computing the dissipation leads to a new lower bound on entropy production by the demon. It is strictly positive in all nontrivial cases, providing a more stringent limit than the second law and implying intrinsic thermodynamic irreversibility. The thermodynamic irreversibility is linked with mechanical irreversibility resulting from the spatial asymmetry of the demon's speed selection criteria, indicating one mechanism by which macroscopic irreversibility may emerge from microscopic dynamics.

10.
We present a new post-processing method for Quantum Key Distribution (QKD) that raises the secret key rate cubically in the number of double matching detection events. In Shannon's communication model, information is prepared at Alice's side and then passed over a noisy channel. In our approach, the secret bits do not reside in Alice's transmitted quantum bits but in Bob's basis measurement choices. Therefore, the measured bits are publicly revealed, while the basis selections remain secret. Our method implements sifting, reconciliation, and amplification in a single process, and it requires just one round of iteration; no redundancy bits are sent, and there is no limit on the correctable error percentage. Moreover, this method can be implemented as post-processing software in QKD technologies already in use.

11.
Landauer's principle provides a fundamental lower bound for the energy dissipation that accompanies information erasure in the quantum regime. While most studies have related the entropy reduction incorporated in the erasure to the lower bound (the entropic bound), recent efforts have also provided another lower bound associated with the thermal fluctuation of the dissipated energy (the thermodynamic bound). The coexistence of the two bounds has stimulated comparative studies of their properties; however, these studies were performed for systems where the time evolutions of the diagonal (population) and off-diagonal (coherence) elements of the density matrix are decoupled. In this paper, we broaden the comparative study to include the influence of quantum coherence induced by a tilted system–reservoir interaction direction. By examining their dependence on the initial state of the information-bearing system, we find that the following properties of the bounds hold generically whether or not the influence of coherence is present: the entropic bound serves as the tighter bound for a sufficiently mixed initial state, while the thermodynamic bound is tighter when the purity of the initial state is sufficiently high. The exception is the case where the system dynamics involve only phase relaxation; in this case, the two bounds coincide when the initial coherence is zero; otherwise, the thermodynamic bound serves as the tighter bound. We also find that quantum information erasure inevitably accompanies constant energy dissipation caused by the creation of system–reservoir correlation, which may be an additional source of energetic cost for the erasure.

12.
Bell inequalities were created with the goal of improving the understanding of foundational questions in quantum mechanics. To this end, they are typically applied to measurement results generated from entangled systems of particles. They can, however, also be used as a statistical tool for macroscopic systems, where they can describe the connection strength between two components of a system under a causal model. We show that, in principle, data from macroscopic observations analyzed with Bell's approach can invalidate certain causal models. To illustrate this use, we describe a macroscopic game setting, without a quantum mechanical measurement process, and analyze it using the framework of Bell experiments. In the macroscopic game, violations of the inequalities can be created by cheating with classically defined strategies. In the physical context, the meaning of violations is less clear and is still vigorously debated. We discuss two measures for optimal strategies to generate a given statistic that violates the inequalities. We show their mathematical equivalence and how they can be computed from CHSH quantities alone if non-signaling applies. As a macroscopic example from the financial world, we show how the unfair use of insider knowledge could be picked up using Bell statistics. Finally, in the discussion of realist interpretations of quantum mechanical Bell experiments, cheating strategies are often expressed through the ideas of free choice and locality. In this regard, violations of free choice and locality can be interpreted as two sides of the same coin, which underscores the view that the meaning these terms are given in Bell's approach should not be confused with their everyday use. In general, we conclude that Bell's approach also carries lessons for understanding macroscopic systems whose connectedness conforms to different causal structures.
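A minimal sketch (the record layout is an assumption for illustration) of computing the CHSH quantity from observational data, whether the records come from a physics experiment or from rounds of the macroscopic game:

```python
# Sketch: estimate the CHSH quantity S from (setting, setting, outcome, outcome) records.
import numpy as np

def chsh(records):
    """records: rows (x, y, a, b), settings x, y in {0, 1}, outcomes a, b in {-1, +1}."""
    r = np.asarray(records)
    def E(x, y):
        sel = (r[:, 0] == x) & (r[:, 1] == y)
        return np.mean(r[sel, 2] * r[sel, 3])   # empirical correlation E[ab | x, y]
    return E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)

# A classical strategy that always answers a = b = +1 already saturates the
# local bound S = 2; only cheating (e.g., signalling) or quantum correlations
# can push S beyond it, up to 4 classically-with-cheating or 2*sqrt(2) quantumly.
rng = np.random.default_rng(1)
data = [(rng.integers(2), rng.integers(2), 1, 1) for _ in range(10_000)]
print(chsh(data))   # 2.0
```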

13.
14.
Shannon's entropy is one of the building blocks of information theory and an essential aspect of Machine Learning (ML) methods (e.g., Random Forests). Yet, it is only finitely defined for distributions with fast decaying tails on a countable alphabet. The unboundedness of Shannon's entropy over the general class of all distributions on an alphabet prevents its potential utility from being fully realized. To fill this void in the foundation of information theory, Zhang (2020) proposed generalized Shannon's entropy, which is finitely defined everywhere. The plug-in estimator, adopted in almost all entropy-based ML method packages, is one of the most popular approaches to estimating Shannon's entropy. The asymptotic distribution of the plug-in estimator of Shannon's entropy has been well studied in the existing literature. This paper studies the asymptotic properties of the plug-in estimator of generalized Shannon's entropy on countable alphabets. The developed asymptotic properties require no assumptions on the original distribution. The proposed asymptotic properties allow for interval estimation and statistical tests with generalized Shannon's entropy.
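A minimal sketch of the plug-in estimator discussed above: replace the unknown distribution with empirical frequencies and evaluate Shannon's entropy on them (the generalized entropy of Zhang (2020) and its asymptotics are not reproduced here):

```python
# Sketch: plug-in estimate of Shannon's entropy from an i.i.d. sample.
import numpy as np
from collections import Counter

def plugin_entropy(sample):
    """Plug-in estimate of Shannon's entropy (in nats)."""
    n = len(sample)
    counts = np.array(list(Counter(sample).values()))
    p_hat = counts / n                       # empirical distribution
    return -np.sum(p_hat * np.log(p_hat))    # entropy of the empirical distribution

rng = np.random.default_rng(0)
sample = rng.choice(4, size=10_000, p=[0.4, 0.3, 0.2, 0.1])
print(plugin_entropy(sample))   # close to the true entropy, ~1.2799 nats
```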

15.
We provide a new formulation of the Local Friendliness no-go theorem of Bong et al. [Nat. Phys. 16, 1199 (2020)] from fundamental causal principles, offering another perspective on how it puts strictly stronger bounds on quantum reality than Bell's theorem. In particular, quantum causal models have been proposed as a way to maintain a peaceful coexistence between quantum mechanics and relativistic causality while respecting Leibniz's methodological principle. This works for Bell's theorem but does not work for the Local Friendliness no-go theorem, which considers an extended Wigner's Friend scenario. More radical conceptual renewal is required; we suggest that cleaving to Leibniz's principle requires extending relativity to events themselves.

16.
This study investigates the conformity to Benford's Law of the information disclosed in financial statements. Using the first digit test of Benford's Law, the study analyses the reliability of financial information provided by listed companies on an emerging capital market before and after the implementation of International Financial Reporting Standards (IFRS). The results of the study confirm the increased reliability of the information disclosed in financial statements after IFRS implementation. The study contributes to the existing literature by bringing new insights into the types of financial information that do not comply with Benford's Law, such as amounts determined by estimates or by applying professional judgment.
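A minimal sketch of the first digit test used here: compare observed leading-digit frequencies with Benford's law P(d) = log10(1 + 1/d) via a chi-square goodness-of-fit statistic (the synthetic amounts below are illustrative, not the study's data):

```python
# Sketch: first digit test against Benford's law.
import numpy as np
from scipy.stats import chisquare

def first_digit(x):
    return int(f"{abs(x):e}"[0])          # leading significant digit via sci notation

def benford_test(amounts):
    digits = [first_digit(a) for a in amounts if a != 0]
    observed = np.array([digits.count(d) for d in range(1, 10)])
    expected = np.log10(1 + 1 / np.arange(1, 10)) * len(digits)
    return chisquare(observed, expected)  # small p-value suggests non-conformity

# Log-uniform amounts conform to Benford's law, so the test should not reject.
rng = np.random.default_rng(0)
amounts = 10 ** rng.uniform(0, 5, size=5000)
print(benford_test(amounts))
```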

17.
We consider the negotiation problem, in which an agent negotiates on behalf of a principal. Our considerations are focused on the Inspire negotiation support system, in which the principal's preferences are visualised by circles. In this way, the principal describes the importance of each negotiation issue and the relative utility of each considered option. The paper proposes how this preference information may be used by the agent to determine a scoring function that supports decisions throughout the negotiation process. The starting point of our considerations is a discussion of the visualisation of the principal's preferences. We assume here that the importance of each issue and the utility of each option increase with the size of the circle representing them. The imprecise meaning of the notion of "circle size" implies that, in such a case, the utility of an option should be evaluated by a fuzzy number. The proposed utility fuzzification is justified by a simple analysis of results obtained from an empirical prenegotiation experiment. A novel method is proposed to determine trapezoidal fuzzy numbers that evaluate an option's utility using a series of answers given by the participants of the experiment. The utilities obtained this way are applied to determine the fuzzy scoring function for an agent. By determining such a common generalised fuzzy scoring system, our approach helps agents handle the differences in human cognitive processes associated with understanding the principal's preferences. This work is the first approach to fuzzification of the preferences in the Inspire negotiation support system.
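A minimal sketch of the basic object involved: a trapezoidal fuzzy number (a, b, c, d) and its membership function (the parameter values below are illustrative, not those elicited in the experiment):

```python
# Sketch: membership function of a trapezoidal fuzzy number (a, b, c, d).
def trapezoidal(a, b, c, d):
    """Return mu(x): 0 outside [a, d], 1 on the core [b, c], linear in between."""
    def mu(x):
        if x < a or x > d:
            return 0.0
        if b <= x <= c:
            return 1.0
        if x < b:
            return (x - a) / (b - a)   # rising edge
        return (d - x) / (d - c)       # falling edge
    return mu

mu = trapezoidal(0.2, 0.4, 0.6, 0.8)   # fuzzy evaluation of one option's utility
print([round(mu(x), 2) for x in (0.1, 0.3, 0.5, 0.7, 0.9)])  # [0.0, 0.5, 1.0, 0.5, 0.0]
```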

18.
We discuss novel thermodynamic features of many-fermion systems. They refer to the energy cost associated with order-disorder changes. Our thermal quantum statistical scenario is controlled by suitable fermion-fermion interactions. We deal with two well-known quantum interactions that operate within an exactly solvable model. This model is able to adequately describe some aspects of fermion dynamics, particularly level crossings. We work within the strictures of Gibbs' canonical ensemble. We show that judicious manipulation of the energy cost associated with statistical order (disorder) variations generates useful information quantifiers. The underlying idea is that changes in the degree of order are intimately linked to the energetic costs of level crossings.

19.
Many small biological objects, such as viruses, survive in a water environment and cannot remain active in dry air without condensation of water vapor. From a physical point of view, these objects belong to the mesoscale, where small thermal fluctuations with the characteristic kinetic energy of kBT (where kB is Boltzmann's constant and T is the absolute temperature) play a significant role. The self-assembly of viruses, including protein folding and the formation of a protein capsid and lipid bilayer membrane, is controlled by hydrophobic forces (i.e., the repulsive forces between hydrophobic particles and regions of molecules) in a water environment. Hydrophobic forces are entropic, and they are driven by a system's tendency to attain the maximum disordered state. On the other hand, in information systems, entropic forces are responsible for erasing information if the energy barrier between two states of a switch is on the order of kBT, which is referred to as Landauer's principle. We treated the hydrophobic interactions responsible for the self-assembly of viruses as an information-processing mechanism. We further showed a similarity of these submicron-scale processes with the self-assembly in colloidal crystals, droplet clusters, and liquid marbles.
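To put numbers on the kBT scale and the Landauer bound mentioned above, a minimal sketch at room temperature:

```python
# Sketch: the kB*T energy scale and Landauer's minimum erasure cost at 300 K.
import math

kB = 1.380649e-23   # Boltzmann's constant, J/K (exact in the 2019 SI)
T = 300.0           # room temperature, K

print(f"kB*T       = {kB * T:.3e} J")                 # ~4.14e-21 J, the mesoscale energy
print(f"kB*T*ln(2) = {kB * T * math.log(2):.3e} J")   # ~2.87e-21 J per erased bit
```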

20.
Shinagawa and Iwata considered quantum security for the sum of Even–Mansour (SoEM) construction and provided quantum key recovery attacks using Simon's algorithm and Grover's algorithm. Furthermore, they also presented quantum key recovery attacks for natural generalizations of SoEM. For some variants of SoEM, they found that quantum attacks are not obvious and left the security of such constructions as an open problem. This paper focuses on this open problem and presents a positive response. We provide quantum key recovery attacks against such constructions by quantum algorithms. For natural generalizations of SoEM with linear key schedules, we also present similar quantum key recovery attacks by quantum algorithms (Simon's algorithm, Grover's algorithm, and the Grover-meets-Simon algorithm).
