Similar Literature
20 similar documents found.
1.
In a previous article we presented an argument to obtain (or rather infer) Born’s rule, based on a simple set of axioms named “Contexts, Systems and Modalities” (CSM). In this approach, there is no “emergence”; rather, the structure of quantum mechanics can be attributed to an interplay between the quantized number of modalities accessible to a quantum system and the continuum of contexts required to define these modalities. The strong link of this derivation with Gleason’s theorem was emphasized, with the argument that CSM provides a physical justification for Gleason’s hypotheses. Here, we extend this result by showing that an essential one among these hypotheses, the need for unitary transforms to relate different contexts, can be removed: it is better seen as a necessary consequence of Uhlhorn’s theorem.
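For reference, the standard textbook statement of Uhlhorn’s theorem is as follows (the paper’s exact formulation may differ in detail):

```latex
% Uhlhorn's theorem (standard form): symmetry maps need only preserve
% orthogonality, rather than all transition probabilities as in Wigner's theorem.
\textbf{Theorem (Uhlhorn).}\ Let $\mathcal{H}$ be a Hilbert space with
$\dim\mathcal{H} \ge 3$, and let $\phi$ be a bijection of the set of rays of
$\mathcal{H}$ such that
\[
  p \perp q \;\Longleftrightarrow\; \phi(p) \perp \phi(q)
  \qquad \text{for all rays } p, q .
\]
Then $\phi$ is induced by an operator on $\mathcal{H}$ that is either unitary
or antiunitary.
```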

2.
Our account provides a local, realist, and fully non-causal principle explanation for EPR correlations, contextuality, no-signalling, and the Tsirelson bound. Indeed, the account herein is fully consistent with the causal structure of Minkowski spacetime. We argue that retrocausal accounts of quantum mechanics are problematic precisely because they do not fully transcend the assumption that causal or constructive explanation must always be fundamental. Unlike retrocausal accounts, our principle explanation is a complete rejection of Reichenbach’s Principle. Furthermore, we argue that the basis for our principle account of quantum mechanics is the physical principle sought by quantum information theorists for their reconstructions of quantum mechanics. Finally, we explain why our account is both fully realist and psi-epistemic.

3.
Wigner’s friend scenarios involve an Observer, or Observers, measuring a Friend, or Friends, who themselves make quantum measurements. In recent discussions, it has been suggested that quantum mechanics may not always be able to provide a consistent account of a situation involving two Observers and two Friends. We investigate this problem by invoking the basic rules of quantum mechanics as outlined by Feynman in the well-known “Feynman Lectures on Physics”. We show here that these “Feynman rules” constrain the a priori assumptions that can be made in generalised Wigner’s friend scenarios, because the existence of the probabilities of interest ultimately depends on the availability of physical evidence (material records) of the system’s past. With these constraints obeyed, an unambiguous and consistent account of all measurement outcomes is obtained for all agents taking part in the various Wigner’s friend scenarios.

4.
Max Born’s statistical interpretation made probabilities play a major role in quantum theory. Here we show that these quantum probabilities and the classical probabilities have very different origins. Although the latter always result from an assumed probability measure, the former include transition probabilities with a purely algebraic origin. Moreover, the general definition of transition probability introduced here comprises not only the well-known quantum mechanical transition probabilities between pure states or wave functions, but further physically meaningful and experimentally verifiable novel cases. A transition probability that differs from 0 and 1 manifests the typical quantum indeterminacy in a similar way as Heisenberg’s and others’ uncertainty relations and, furthermore, rules out deterministic states in the same way as the Bell-Kochen-Specker theorem. However, the transition probability defined here achieves much more beyond that: it demonstrates that the algebraic structure of the Hilbert space quantum logic dictates the precise values of certain probabilities, and it provides an unexpected access to these quantum probabilities that does not rely on states or wave functions.
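For concreteness, the familiar special case that the general definition reduces to is the usual Born transition probability between pure states:

```latex
% Well-known pure-state transition probability; the paper's algebraic
% definition extends this beyond states given by wave functions.
P(\varphi \to \psi) \;=\; \lvert \langle \psi \mid \varphi \rangle \rvert^{2},
\qquad \lVert \varphi \rVert = \lVert \psi \rVert = 1 .
```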

5.
In the physical foundation of his radiation formula, given in his December 1900 talk and the subsequent 1901 article, Planck refers to Boltzmann’s 1877 combinatorial-probabilistic treatment and obtains his quantum distribution function, which Boltzmann did not. For this reason, Boltzmann’s memoirs are usually ascribed to classical statistical mechanics. In agreement with Bach, it is shown that Boltzmann’s 1868 and 1877 calculations can lead to a Planckian distribution function, and that those of 1868 are even closer to Planck’s than that of 1877. Boltzmann’s and Planck’s calculations are compared on the basis of Bach’s three-level scheme ‘configuration–occupation–occupancy’. Special attention is paid to the concepts of interchangeability and the indistinguishability of particles and states. In contrast to Bach’s treatment, the level of exposition is kept elementary. I hope to make Boltzmann’s work better known in English and to remove misunderstandings in the literature.
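For orientation, the combinatorial count at the heart of Planck’s 1900/1901 derivation (a standard fact, stated here independently of the paper’s three-level analysis): the number of ways to distribute P indistinguishable energy elements over N resonators is

```latex
% Planck's count of complexions; maximizing S = k \log W at fixed total
% energy P\epsilon yields the Planck distribution function.
W \;=\; \binom{N+P-1}{P} \;=\; \frac{(N+P-1)!}{(N-1)!\,P!} .
```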

6.
We introduce a quantum key distribution protocol based on the mean multi-kings’ problem. Using this protocol, a sender can share a bit sequence as a secret key with receivers. We consider the relation between the information gained by an eavesdropper and the disturbance introduced into the legitimate users’ information. In the BB84 protocol, such a relation is known as the information-disturbance theorem. We focus on a setting in which the sender and two receivers try to share bit sequences and the eavesdropper tries to extract information by coupling the legitimate users’ systems to an ancilla system. We derive trade-off inequalities between the distinguishability, for the eavesdropper, of the quantum states corresponding to the bit sequence and the error probability of the bit sequence shared by the legitimate users. Our inequalities show that the eavesdropper’s extraction of information about the secret key inevitably disturbs the states and increases the error probability.

7.
Maxwell’s demon is an entity in a 150-year-old thought experiment that paradoxically appears to violate the second law of thermodynamics by reducing entropy without doing work. It has increasingly practical implications as advances in nanomachinery produce devices that push the thermodynamic limits imposed by the second law. A well-known explanation, claiming that information erasure restores second-law compliance, fails to resolve the paradox because it assumes the second law a priori and does not predict irreversibility. Instead, a purely mechanical resolution that does not require information theory is presented. The transport fluxes of mass, momentum, and energy involved in the demon’s operation are analyzed and shown to imply “hidden” external work and dissipation. Computing the dissipation leads to a new lower bound on entropy production by the demon. It is strictly positive in all nontrivial cases, providing a more stringent limit than the second law and implying intrinsic thermodynamic irreversibility. This thermodynamic irreversibility is linked with the mechanical irreversibility resulting from the spatial asymmetry of the demon’s speed-selection criteria, indicating one mechanism by which macroscopic irreversibility may emerge from microscopic dynamics.

8.
We present a new post-processing method for Quantum Key Distribution (QKD) that raises the secret key rate cubically in the number of double-matching detection events. In Shannon’s communication model, information is prepared at Alice’s side and is then sent over a noisy channel. In our approach, the secret bits reside not in Alice’s transmitted quantum bits but in Bob’s basis measurement choices. Therefore, the measured bits are publicly revealed, while the basis selections remain secret. Our method implements sifting, reconciliation, and amplification in a single process and requires just one round of iteration; no redundancy bits are sent, and there is no limit on the correctable error percentage. Moreover, the method can be implemented as post-processing software in QKD technologies already in use.

9.
Landauer’s principle provides a fundamental lower bound on the energy dissipation that accompanies information erasure in the quantum regime. While most studies have related the entropy reduction involved in the erasure to this lower bound (the entropic bound), recent efforts have also provided another lower bound associated with the thermal fluctuation of the dissipated energy (the thermodynamic bound). The coexistence of the two bounds has stimulated comparative studies of their properties; however, these studies were performed for systems in which the time evolutions of the diagonal (population) and off-diagonal (coherence) elements of the density matrix are decoupled. In this paper, we broaden the comparative study to include the influence of quantum coherence induced by a tilted system–reservoir interaction direction. By examining the dependence on the initial state of the information-bearing system, we find that the following properties of the bounds hold generically, regardless of whether the influence of coherence is present: the entropic bound is the tighter bound for a sufficiently mixed initial state, while the thermodynamic bound is tighter when the purity of the initial state is sufficiently high. The exception is the case where the system dynamics involve only phase relaxation; in this case, the two bounds coincide when the initial coherence is zero, and otherwise the thermodynamic bound is the tighter one. We also find that quantum information erasure inevitably involves a constant energy dissipation caused by the creation of system–reservoir correlation, which may be an additional source of energetic cost for the erasure.
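For reference, the standard entropic form of Landauer’s bound that the ‘entropic bound’ above generalizes (the thermodynamic, fluctuation-based bound is a distinct inequality):

```latex
% Entropic (Landauer) bound: average heat <Q> dissipated to a reservoir at
% inverse temperature beta = 1/(k_B T) versus the entropy decrease Delta S
% of the information-bearing system.
\beta \,\langle Q \rangle \;\ge\; \Delta S
\qquad\Longrightarrow\qquad
\langle Q \rangle \;\ge\; k_{B} T \ln 2 \ \text{per erased bit.}
```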

10.
Bell inequalities were created with the goal of improving the understanding of foundational questions in quantum mechanics. To this end, they are typically applied to measurement results generated from entangled systems of particles. They can, however, also be used as a statistical tool for macroscopic systems, where they can describe the connection strength between two components of a system under a causal model. We show that, in principle, data from macroscopic observations analyzed with Bell’s approach can invalidate certain causal models. To illustrate this use, we describe a macroscopic game setting, without a quantum mechanical measurement process, and analyze it using the framework of Bell experiments. In the macroscopic game, violations of the inequalities can be created by cheating with classically defined strategies. In the physical context, the meaning of violations is less clear and is still vigorously debated. We discuss two measures for optimal strategies to generate a given statistic that violates the inequalities; we show their mathematical equivalence and how they can be computed from CHSH quantities alone if non-signaling applies. As a macroscopic example from the financial world, we show how the unfair use of insider knowledge could be picked up using Bell statistics. Finally, in discussions of realist interpretations of quantum mechanical Bell experiments, cheating strategies are often expressed through the ideas of free choice and locality. In this regard, violations of free choice and locality can be interpreted as two sides of the same coin, which underscores the view that the meaning these terms are given in Bell’s approach should not be confused with their everyday use. In general, we conclude that Bell’s approach also carries lessons for understanding macroscopic systems whose connectedness conforms to different causal structures.
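As a concrete reference for the CHSH quantities mentioned above, here is a minimal sketch (my own illustration, not the paper’s analysis) computing the CHSH value from the four setting correlators and checking it against the classical bound:

```python
import math

def chsh_value(e_ab, e_ab2, e_a2b, e_a2b2):
    """CHSH combination S = E(a,b) + E(a,b') + E(a',b) - E(a',b')."""
    return e_ab + e_ab2 + e_a2b - e_a2b2

# Quantum correlators E(x, y) = cos(x - y), achievable with a maximally
# entangled pair; settings a = 0, a' = 90, b = 45, b' = -45 (degrees).
a, a2, b, b2 = 0.0, math.pi / 2, math.pi / 4, -math.pi / 4

def E(x, y):
    return math.cos(x - y)

s = chsh_value(E(a, b), E(a, b2), E(a2, b), E(a2, b2))
print(f"S = {s:.4f}")  # 2.8284 = 2*sqrt(2), the Tsirelson bound
print("exceeds the classical bound |S| <= 2:", abs(s) > 2)
```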

11.
12.
We provide a new formulation of the Local Friendliness no-go theorem of Bong et al. [Nat. Phys. 16, 1199 (2020)] from fundamental causal principles, providing another perspective on how it puts strictly stronger bounds on quantum reality than Bell’s theorem. In particular, quantum causal models have been proposed as a way to maintain a peaceful coexistence between quantum mechanics and relativistic causality while respecting Leibniz’s methodological principle. This works for Bell’s theorem but does not work for the Local Friendliness no-go theorem, which considers an extended Wigner’s Friend scenario. More radical conceptual renewal is required; we suggest that cleaving to Leibniz’s principle requires extending relativity to events themselves.

13.
This study investigates the conformity of the information disclosed in financial statements to Benford’s Law. Using the first-digit test of Benford’s Law, the study analyses the reliability of financial information provided by companies listed on an emerging capital market before and after the implementation of International Financial Reporting Standards (IFRS). The results confirm an increase in the reliability of the information disclosed in the financial statements after IFRS implementation. The study contributes to the existing literature by bringing new insights into the types of financial information that do not comply with Benford’s Law, such as amounts determined by estimates or by applying professional judgment.
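For readers unfamiliar with the first-digit test, here is a minimal sketch (the sample data and significance threshold are illustrative, not the study’s):

```python
import math
from collections import Counter

# Benford's expected leading-digit probabilities P(d) = log10(1 + 1/d).
BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(x):
    """Leading significant digit of a nonzero number."""
    return int(10 ** (math.log10(abs(x)) % 1))

def benford_chi2(values):
    """Chi-square statistic of observed leading-digit frequencies
    against Benford's Law."""
    counts = Counter(first_digit(v) for v in values if v != 0)
    n = sum(counts.values())
    return sum((counts.get(d, 0) - n * p) ** 2 / (n * p)
               for d, p in BENFORD.items())

# Illustrative figures only -- not the study's sample. Compare the statistic
# with the chi-square critical value for 8 degrees of freedom (15.51 at 5%).
sample = [3_214, 1_870, 1_102, 945, 4_420, 1_530, 2_077, 8_812, 1_260, 6_031]
print(f"chi2 = {benford_chi2(sample):.2f}")
```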

14.
We consider the negotiation problem in which an agent negotiates on behalf of a principal. Our considerations are focused on the Inspire negotiation support system, in which the principal’s preferences are visualised by circles: the principal thereby describes the importance of each negotiation issue and the relative utility of each considered option. The paper proposes how this preference information may be used by the agent to determine a scoring function that supports decisions throughout the negotiation process. The starting point of our considerations is a discussion of the visualisation of the principal’s preferences. We assume here that the importance of each issue and the utility of each option increase with the size of the circle representing them. The imprecise meaning of the notion of “circle size” implies that, in the case considered, the utility of an option should be evaluated by a fuzzy number. The proposed utility fuzzification is justified by a simple analysis of results obtained from an empirical prenegotiation experiment. A novel method is proposed for determining trapezoidal fuzzy numbers that evaluate an option’s utility from a series of answers given by the participants of the experiment. The utilities obtained this way are applied to determine the fuzzy scoring function for an agent. By determining such a common generalised fuzzy scoring system, our approach helps agents handle the differences in human cognitive processes associated with understanding the principal’s preferences. This work is the first approach to fuzzifying the preferences in the Inspire negotiation support system.
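To make the central object concrete, here is a minimal sketch of a trapezoidal fuzzy number and a weighted fuzzy score built from it (the class and the aggregation rule are illustrative assumptions, not the paper’s method):

```python
from dataclasses import dataclass

@dataclass
class TrapezoidalFuzzyNumber:
    """Trapezoidal fuzzy number (a, b, c, d): membership rises linearly on
    [a, b], equals 1 on [b, c], and falls linearly on [c, d]."""
    a: float
    b: float
    c: float
    d: float

    def membership(self, x):
        if x < self.a or x > self.d:
            return 0.0
        if self.b <= x <= self.c:
            return 1.0
        if x < self.b:
            return (x - self.a) / (self.b - self.a)
        return (self.d - x) / (self.d - self.c)

    def __add__(self, other):
        # Standard fuzzy arithmetic: component-wise addition.
        return TrapezoidalFuzzyNumber(self.a + other.a, self.b + other.b,
                                      self.c + other.c, self.d + other.d)

    def scale(self, w):
        return TrapezoidalFuzzyNumber(w * self.a, w * self.b,
                                      w * self.c, w * self.d)

# Illustrative package score: issue-weighted sum of fuzzy option utilities.
price_opt = TrapezoidalFuzzyNumber(0.5, 0.6, 0.7, 0.8)
terms_opt = TrapezoidalFuzzyNumber(0.2, 0.3, 0.3, 0.5)
score = price_opt.scale(0.7) + terms_opt.scale(0.3)
print(score, score.membership(0.55))
```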

15.
Many small biological objects, such as viruses, survive in a water environment and cannot remain active in dry air without the condensation of water vapor. From a physical point of view, these objects belong to the mesoscale, where small thermal fluctuations with the characteristic kinetic energy kBT (where kB is the Boltzmann constant and T is the absolute temperature) play a significant role. The self-assembly of viruses, including protein folding and the formation of a protein capsid and lipid bilayer membrane, is controlled by hydrophobic forces (i.e., the repulsive interaction between water and hydrophobic particles or hydrophobic regions of molecules) in a water environment. Hydrophobic forces are entropic and are driven by a system’s tendency to attain the maximally disordered state. On the other hand, in information systems, entropic forces are responsible for erasing information if the energy barrier between the two states of a switch is on the order of kBT, which is referred to as Landauer’s principle. We treat the hydrophobic interactions responsible for the self-assembly of viruses as an information-processing mechanism, and we further show the similarity of these submicron-scale processes to self-assembly in colloidal crystals, droplet clusters, and liquid marbles.
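For scale, the thermal energy in question evaluated at physiological temperature (a standard numerical value, added for orientation):

```latex
% Characteristic thermal energy scale at T = 310 K (body temperature):
k_{B} T \;=\; 1.380649\times10^{-23}\,\mathrm{J\,K^{-1}} \times 310\,\mathrm{K}
\;\approx\; 4.3\times10^{-21}\,\mathrm{J} \;\approx\; 27\,\mathrm{meV}.
```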

16.
Shinagawa and Iwata considered the quantum security of the sum of Even–Mansour (SoEM) construction and provided quantum key recovery attacks based on Simon’s algorithm and Grover’s algorithm. They also presented quantum key recovery attacks for natural generalizations of SoEM. For some variants of SoEM, however, they found that quantum attacks are not obvious and left the security of such constructions as an open problem. This paper focuses on that open problem and presents a positive response: we provide quantum key recovery attacks against such constructions. For natural generalizations of SoEM with linear key schedules, we also present similar quantum key recovery attacks (using Simon’s algorithm, Grover’s algorithm, and the Grover-meets-Simon algorithm).
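As background, a toy classical sketch of the Even–Mansour cipher and a sum-of-two-instances variant; the exact SoEM form and the key schedules analyzed in the paper may differ, so the shape of `sum_of_even_mansour` below is an assumption:

```python
import secrets

N_BITS = 8  # toy block size

def random_permutation(bits=N_BITS):
    """A public random permutation of {0, ..., 2^bits - 1}."""
    table = list(range(2 ** bits))
    secrets.SystemRandom().shuffle(table)
    return table.__getitem__

P = random_permutation()   # classic Even-Mansour: E(x) = P(x ^ k1) ^ k2
P1, P2 = random_permutation(), random_permutation()
k1, k2 = secrets.randbelow(2 ** N_BITS), secrets.randbelow(2 ** N_BITS)

def even_mansour(x):
    return P(x ^ k1) ^ k2

def sum_of_even_mansour(x):
    # XOR (sum) of two Even-Mansour instances with independent keys --
    # one plausible reading of the SoEM form; the paper's variants
    # (e.g., shared keys or a linear key schedule) may differ.
    return P1(x ^ k1) ^ P2(x ^ k2)

print(even_mansour(0x2A), sum_of_even_mansour(0x2A))
```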

17.
Current physics commonly qualifies the Earth system as ‘complex’ because it includes numerous different processes operating over a large range of spatial scales, often modelled as exhibiting non-linear chaotic response dynamics and power scaling laws. This characterization rests on the fundamental assumption that the Earth’s complexity could, in principle, be modelled (surrogated) by a numerical algorithm if enough computing power were granted. Yet similar numerical algorithms also surrogate different systems having the same processes and dynamics, such as Mars or Jupiter, even though these are qualitatively different from the Earth system. Here, we argue that understanding the Earth as a complex system requires a consideration of the Gaia hypothesis: the Earth is a complex system because it instantiates life, and therefore an autopoietic, metabolic-repair (M,R) organization, at a planetary scale. This implies that the Earth’s complexity is formally equivalent to a self-referential system, which is inherently non-algorithmic and therefore cannot be surrogated and simulated in a Turing machine. We discuss the consequences of this with reference to in-silico climate models, tipping points, planetary boundaries, and planetary feedback loops as units of adaptive evolution and selection.

18.
19.
In this paper, we generalize the notion of Shannon’s entropy power to the Rényi-entropy setting. With this, we propose generalizations of the de Bruijn identity, the isoperimetric inequality, and the Stam inequality. This framework not only allows for finding new estimation inequalities, but it also provides a convenient technical setting for the derivation of a one-parameter family of Rényi-entropy-power-based quantum-mechanical uncertainty relations. To illustrate the usefulness of the Rényi entropy power obtained, we show how the information probability distribution associated with a quantum state can be reconstructed in a process that is akin to quantum-state tomography. We illustrate the inner workings of this with the so-called “cat states”, which are of fundamental interest and practical use in schemes such as quantum metrology. Salient issues, including the extension of the notion of entropy power to Tsallis entropy and the ensuing implications in estimation theory, are also briefly discussed.
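For reference, the Shannon entropy power being generalized, together with the order-α Rényi entropy that replaces the differential entropy in it (the paper’s exact normalization of the Rényi entropy power may differ):

```latex
% Shannon entropy power of a random vector X in R^n with density f and
% differential entropy h(X), and the Renyi entropy of order alpha, which
% recovers h(X) in the limit alpha -> 1.
N(X) \;=\; \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)},
\qquad
h_{\alpha}(X) \;=\; \frac{1}{1-\alpha}\,
\log \int_{\mathbb{R}^{n}} f(x)^{\alpha}\, dx .
```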

20.
Volatility, which represents the magnitude of fluctuation of asset prices or returns, is used in finance to design optimal asset allocations and to price derivatives. Since volatility is unobservable, it is identified and estimated by latent-variable models known as volatility fluctuation models. Almost all conventional volatility fluctuation models are linear time-series models and thus have difficulty capturing the nonlinear and/or non-Gaussian properties of volatility dynamics. In this study, we propose an entropy-based Student’s t-process dynamical model (ETPDM) as a volatility fluctuation model that combines nonlinear dynamics with non-Gaussian noise. The ETPDM estimates its latent variables and intrinsic parameters by robust particle filtering based on a generalized H-theorem for relative entropy. To test the performance of the ETPDM, we carry out numerical experiments on financial time series and confirm its robustness for a small number of particles by comparison with conventional particle filtering.
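To make the latent-volatility estimation concrete, here is a minimal bootstrap particle filter for a generic stochastic-volatility model with Student-t observation noise (this is the textbook filter with illustrative parameters, not the paper’s entropy-based ETPDM):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative SV model: h_t = mu + phi*(h_{t-1} - mu) + sigma*eta_t,
# y_t = exp(h_t / 2) * eps_t, with Student-t observation noise eps_t.
mu, phi, sigma, nu = -1.0, 0.97, 0.15, 5.0

def simulate(T=500):
    h = np.empty(T)
    h[0] = mu
    for t in range(1, T):
        h[t] = mu + phi * (h[t - 1] - mu) + sigma * rng.normal()
    y = np.exp(h / 2) * rng.standard_t(nu, size=T)
    return y, h

def bootstrap_filter(y, n_particles=2000):
    """Generic bootstrap particle filter for the latent log-volatility h_t."""
    T = len(y)
    # Initialize from the stationary distribution of the AR(1) state.
    particles = rng.normal(mu, sigma / np.sqrt(1 - phi ** 2), n_particles)
    h_est = np.empty(T)
    for t in range(T):
        # Propagate through the state equation.
        particles = mu + phi * (particles - mu) \
            + sigma * rng.normal(size=n_particles)
        # Weight by the Student-t observation likelihood.
        w = stats.t.pdf(y[t], df=nu, scale=np.exp(particles / 2))
        w /= w.sum()
        h_est[t] = np.dot(w, particles)
        # Multinomial resampling.
        particles = rng.choice(particles, size=n_particles, p=w)
    return h_est

y, h_true = simulate()
h_hat = bootstrap_filter(y)
print("RMSE of log-volatility estimate:",
      np.sqrt(np.mean((h_hat - h_true) ** 2)))
```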
