Similar Documents
20 similar documents found (search time: 18 ms)
1.
In a previous article we presented an argument to obtain (or rather infer) Born’s rule, based on a simple set of axioms named “Contexts, Systems and Modalities” (CSM). In this approach, there is no “emergence”, but the structure of quantum mechanics can be attributed to an interplay between the quantized number of modalities that is accessible to a quantum system and the continuum of contexts that are required to define these modalities. The strong link of this derivation with Gleason’s theorem was emphasized, with the argument that CSM provides a physical justification for Gleason’s hypotheses. Here, we extend this result by showing that an essential one among these hypotheses, the need for unitary transforms to relate different contexts, can be removed; it is better seen as a necessary consequence of Uhlhorn’s theorem.
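For readers unfamiliar with the result invoked in the last sentence, a standard statement of Uhlhorn’s theorem (our paraphrase, not a quotation from the article) is the following: for a Hilbert space of dimension at least 3, any bijective map T of the set of rays onto itself that preserves orthogonality in both directions is induced by a unitary or antiunitary operator,

$$
\langle \psi \mid \varphi \rangle = 0 \;\Longleftrightarrow\; \langle T\psi \mid T\varphi \rangle = 0
\qquad (\dim \mathcal{H} \ge 3)
\quad\Longrightarrow\quad T \ \text{is induced by a unitary or antiunitary}\ U.
$$

This hypothesis is strictly weaker than that of Wigner’s theorem, which assumes preservation of all transition probabilities $|\langle \psi \mid \varphi \rangle|^2$ rather than of orthogonality alone.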

2.
This paper assesses two different theories for explaining consciousness, a phenomenon that is widely considered amenable to scientific investigation despite its puzzling subjective aspects. I focus on Integrated Information Theory (IIT), which says that consciousness is integrated information (quantified as Φ^max) and that even simple systems with interacting parts possess some consciousness. First, I evaluate IIT on its own merits. Second, I compare it to a more traditionally derived theory called Neurobiological Naturalism (NN), which says consciousness is an evolved, emergent feature of complex brains. Comparing these theories is informative because it reveals strengths and weaknesses of each, thereby suggesting better ways to study consciousness in the future. IIT’s strengths are the reasonable axioms at its core; its strong logic and mathematical formalism; its creative “experience-first” approach to studying consciousness; the way it avoids the mind-body (“hard”) problem; its consistency with evolutionary theory; and its many scientifically testable predictions. The potential weakness of IIT is that it contains stretches of logic-based reasoning that were not checked against hard evidence when the theory was being constructed, whereas scientific arguments require such supporting evidence to keep the reasoning on course. This is less of a concern for the other theory, NN, because it incorporated evidence much earlier in its construction process. NN is a less mature theory than IIT, less formalized and quantitative, and less well tested. However, it has identified its own neural correlates of consciousness (NCCs) and offers a roadmap through which these NCCs may answer the questions of consciousness using the hypothesize-test-hypothesize-test steps of the scientific method.

3.
Wigner’s friend scenarios involve an Observer, or Observers, measuring a Friend, or Friends, who themselves make quantum measurements. In recent discussions, it has been suggested that quantum mechanics may not always be able to provide a consistent account of a situation involving two Observers and two Friends. We investigate this problem by invoking the basic rules of quantum mechanics as outlined by Feynman in the well-known “Feynman Lectures on Physics”. We show here that these “Feynman rules” constrain the a priori assumptions which can be made in generalised Wigner’s friend scenarios, because the existence of the probabilities of interest ultimately depends on the availability of physical evidence (material records) of the system’s past. With these constraints obeyed, an unambiguous and consistent account of all measurement outcomes is obtained for all agents taking part in various Wigner’s friend scenarios.
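The “Feynman rules” at issue here are, at bottom, the rules for combining amplitudes and probabilities (our summary of the standard textbook statement, not a quotation from this paper): when no physical record distinguishes two alternatives, their amplitudes add; when a record exists, their probabilities add,

$$
P = |\varphi_1 + \varphi_2|^2 \quad \text{(no record distinguishing the alternatives)}, \qquad
P = |\varphi_1|^2 + |\varphi_2|^2 \quad \text{(a record exists)}.
$$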

4.
We consider the negotiation problem, in which an agent negotiates on behalf of a principal. Our considerations are focused on the Inspire negotiation support system, in which the principal’s preferences are visualised by circles. In this way, the principal describes the importance of each negotiation issue and the relative utility of each considered option. The paper proposes how the agent may use this preference information to determine a scoring function that supports decisions throughout the negotiation process. The starting point of our considerations is a discussion regarding the visualisation of the principal’s preferences. We assume here that the importance of each issue and the utility of each option increase with the size of the circle representing them. Because the meaning of the notion of “circle size” is imprecise, the utility of an option should, in the case considered, be evaluated by a fuzzy number. The proposed utility fuzzification is justified by a simple analysis of results obtained from an empirical prenegotiation experiment. A novel method is proposed for determining trapezoidal fuzzy numbers that evaluate an option’s utility using a series of answers given by the participants of the experiment. The utilities obtained this way are applied to determine the fuzzy scoring function for an agent. By determining such a common generalised fuzzy scoring system, our approach helps agents handle the differences in human cognitive processes associated with understanding the principal’s preferences. This work is the first approach to the fuzzification of preferences in the Inspire negotiation support system.
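To make the construction concrete, here is a minimal sketch, in Python, of scoring an offer with trapezoidal fuzzy utilities. It is an illustration under our own assumptions (the class, the issue/option names, and all numbers are hypothetical), not the paper’s algorithm.

```python
# A minimal sketch (not the paper's method) of scoring a negotiation offer
# with trapezoidal fuzzy utilities. All names and numbers are illustrative.
from dataclasses import dataclass

@dataclass
class TrapFN:
    """Trapezoidal fuzzy number (a, b, c, d) with a <= b <= c <= d."""
    a: float
    b: float
    c: float
    d: float

    def __add__(self, other):
        # Standard fuzzy addition: add the four parameters component-wise.
        return TrapFN(self.a + other.a, self.b + other.b,
                      self.c + other.c, self.d + other.d)

    def centroid(self):
        # Simple defuzzification, used here only to rank offers.
        return (self.a + self.b + self.c + self.d) / 4.0

# Hypothetical fuzzy utilities elicited from circle sizes, per (issue, option).
utilities = {
    ("price", "low"):     TrapFN(35, 42, 48, 55),
    ("delivery", "fast"): TrapFN(20, 25, 27, 30),
}

def score(offer):
    """Fuzzy score of an offer = fuzzy sum of its options' utilities."""
    total = TrapFN(0, 0, 0, 0)
    for issue_option in offer:
        total = total + utilities[issue_option]
    return total

offer = [("price", "low"), ("delivery", "fast")]
s = score(offer)
print(f"fuzzy score = ({s.a}, {s.b}, {s.c}, {s.d}), centroid = {s.centroid():.1f}")
```

Ranking offers by the centroid of their fuzzy scores is one common design choice; the paper’s own comparison method may differ.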

5.
The present study investigates the similarity problem associated with the onset of the Mach reflection of Zel’dovich–von Neumann–Döring (ZND) detonations in the near field. The results reveal that self-similarity in the frozen-limit regime is strictly valid only within a small scale, i.e., of the order of the induction length. The Mach reflection becomes non-self-similar during the transition of the Mach stem from “frozen” to “reactive” as it couples with the reaction zone. The triple-point trajectory first rises above the self-similar result, owing to compressive waves generated by the “hot spot”, and then decays after the establishment of the reactive Mach stem. It is also found that, once this restriction is removed, the frozen limit can be extended to a much larger distance than expected. The obtained results elucidate the physical origin of the onset of Mach reflection with chemical reactions, which has previously been observed in both experiments and numerical simulations.

6.
In 2016, Steve Gull outlined a proof of Bell’s theorem using Fourier theory. Gull’s philosophy is that Bell’s theorem (or perhaps a key lemma in its proof) can be seen as a no-go theorem for a project in distributed computing with classical, not quantum, computers. We present his argument, correcting misprints and filling gaps. In his argument, there were two completely separated computers in the network. We need three in order to fill all the gaps in his proof: a third computer supplies a stream of random numbers to the two computers representing the two measurement stations in Bell’s work. One could also imagine that computer replaced by a cloned, virtual computer generating the same pseudo-random numbers within each of Alice’s and Bob’s computers. Either way, we need to assume the presence of shared i.i.d. randomness, in the form of a synchronised sequence of realisations of i.i.d. hidden variables underlying the otherwise deterministic physics of the sequence of trials. Gull’s proof then needs just a third step: rewriting an expectation as the expectation of a conditional expectation given the hidden variables.
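The third step is the tower property of conditional expectation; in the Bell setting it takes the familiar form (our notation, not necessarily Gull’s)

$$
E\big[A(a)\,B(b)\big] \;=\; E\Big[\,E\big[A(a)\,B(b) \mid \Lambda\big]\Big]
\;=\; \int A(a,\lambda)\,B(b,\lambda)\,\rho(\lambda)\,\mathrm{d}\lambda ,
$$

where Λ is the shared i.i.d. hidden variable and the last equality assumes the outcomes are deterministic functions of the local setting and λ.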

7.
Maxwell’s demon is an entity in a 150-year-old thought experiment that paradoxically appears to violate the second law of thermodynamics by reducing entropy without doing work. It has increasingly practical implications as advances in nanomachinery produce devices that push the thermodynamic limits imposed by the second law. A well-known explanation, which claims that information erasure restores compliance with the second law, fails to resolve the paradox because it assumes the second law a priori and does not predict irreversibility. Instead, a purely mechanical resolution that does not require information theory is presented. The transport fluxes of mass, momentum, and energy involved in the demon’s operation are analyzed and shown to imply “hidden” external work and dissipation. Computing the dissipation leads to a new lower bound on entropy production by the demon. It is strictly positive in all nontrivial cases, providing a more stringent limit than the second law and implying intrinsic thermodynamic irreversibility. This thermodynamic irreversibility is linked with the mechanical irreversibility resulting from the spatial asymmetry of the demon’s speed-selection criteria, indicating one mechanism by which macroscopic irreversibility may emerge from microscopic dynamics.

8.
In the physical foundation of his radiation formula, given in his December 1900 talk and the subsequent 1901 article, Planck refers to Boltzmann’s 1877 combinatorial-probabilistic treatment and obtains his quantum distribution function, which Boltzmann did not. For this reason, Boltzmann’s memoirs are usually ascribed to classical statistical mechanics. In agreement with Bach, it is shown that Boltzmann’s 1868 and 1877 calculations can lead to a Planckian distribution function, the 1868 calculations being even closer to Planck’s than that of 1877. Boltzmann’s and Planck’s calculations are compared on the basis of Bach’s three-level scheme ‘configuration–occupation–occupancy’. Special attention is paid to the concepts of interchangeability and the indistinguishability of particles and states. In contrast to Bach’s, the level of exposition here is elementary. I hope to make Boltzmann’s work better known in English and to remove misunderstandings in the literature.
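For orientation, the combinatorial step at issue is Planck’s counting of the number of ways W to distribute P indistinguishable energy elements ε over N resonators (a standard textbook statement, not a quotation from this paper),

$$
W = \frac{(N+P-1)!}{(N-1)!\,P!},
$$

and maximizing $S = k \ln W$ at fixed total energy $P\varepsilon$ yields the Planckian mean occupation $\bar{n} = 1/(e^{\varepsilon/kT}-1)$.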

9.
Max Born’s statistical interpretation made probabilities play a major role in quantum theory. Here we show that these quantum probabilities and the classical probabilities have very different origins. Whereas the latter always result from an assumed probability measure, the former include transition probabilities with a purely algebraic origin. Moreover, the general definition of transition probability introduced here comprises not only the well-known quantum mechanical transition probabilities between pure states or wave functions, but also further physically meaningful and experimentally verifiable novel cases. A transition probability that differs from 0 and 1 manifests the typical quantum indeterminacy in a similar way to Heisenberg’s and others’ uncertainty relations and, furthermore, rules out deterministic states in the same way as the Bell-Kochen-Specker theorem. However, the transition probability defined here achieves much more than that: it demonstrates that the algebraic structure of the Hilbert space quantum logic dictates the precise values of certain probabilities, and it provides an unexpected access to these quantum probabilities that does not rely on states or wave functions.
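The best-known special case of such a transition probability is the Born rule for pure states (standard notation, not specific to this paper),

$$
P(\psi \to \varphi) = |\langle \varphi \mid \psi \rangle|^2 ,
$$

which equals 0 or 1 exactly when ψ is orthogonal or proportional to φ; any value strictly in between exhibits the quantum indeterminacy discussed above.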

10.
This article reconsiders the concept of physical reality in quantum theory and the concept of quantum measurement, following Bohr, whose analysis of quantum measurement led him to his concept of a (quantum) “phenomenon”, referring to “the observations obtained under the specified circumstances” in the interaction between quantum objects and measuring instruments. This situation makes the terms “observation” and “measurement”, as conventionally understood, inapplicable. These terms are remnants of classical physics, or of still earlier history from which classical physics inherited them. As defined here, a quantum measurement does not measure any preexisting property of the ultimate constitution of the reality responsible for quantum phenomena. An act of measurement establishes a quantum phenomenon by an interaction between the instrument and the quantum object, or, in the present view, the ultimate constitution of the reality responsible for quantum phenomena and, at the time of measurement, also quantum objects. In the view advanced in this article, in contrast to that of Bohr, quantum objects, such as electrons or photons, are assumed to exist only at the time of measurement and not independently, a view that redefines the concept of quantum object as well. This redefinition becomes especially important in high-energy quantum regimes and quantum field theory, and allows this article to define a new concept of quantum field. The article also considers, now following Bohr, quantum measurement as the entanglement between quantum objects and measuring instruments. The argument of the article is grounded in the concept of “reality without realism” (RWR), as underlying quantum measurement thus understood, and in the view, the RWR view, of quantum theory defined by this concept. The RWR view places a stratum of physical reality thus designated, here the reality ultimately responsible for quantum phenomena, beyond representation or knowledge, or even conception, and defines the corresponding set of interpretations of quantum mechanics or quantum field theory, such as the one assumed in this article, in which, again, not only quantum phenomena but also quantum objects are, as idealizations, defined by measurement. As such, the article also offers a broadly conceived response to J. Bell’s argument “against ‘measurement’”.

11.
Integrated information theory (IIT) provides a mathematical framework to characterize the cause-effect structure of a physical system and its amount of integrated information (Φ). An accompanying Python software package (“PyPhi”) was recently introduced to implement this framework for the causal analysis of discrete dynamical systems of binary elements. Here, we present an update to PyPhi that extends its applicability to systems composed of discrete but multi-valued elements. This allows us to analyze and compare general causal properties of random networks made up of binary, ternary, quaternary, and mixed nodes. Moreover, we apply the developed tools for causal analysis to a simple non-binary regulatory network model (p53-Mdm2) and discuss commonly used binarization methods in light of their capacity to preserve the causal structure of the original system with multi-valued elements.
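As a pointer to the tool being extended, here is a minimal sketch against the pre-update, binary PyPhi (~1.x) interface; the multi-valued interface introduced by this update may use a different TPM representation, so treat this as an assumption-laden illustration rather than documentation.

```python
# Minimal sketch of a PyPhi (~1.x) analysis of a two-node binary network.
# The multi-valued interface described in the paper may differ from this.
import numpy as np
import pyphi

# State-by-node TPM for two nodes that simply swap their values.
# Rows are current states in little-endian order: (0,0), (1,0), (0,1), (1,1).
tpm = np.array([
    [0, 0],
    [0, 1],
    [1, 0],
    [1, 1],
])
cm = np.array([[0, 1],
               [1, 0]])  # connectivity: A -> B and B -> A

network = pyphi.Network(tpm, cm=cm)
state = (1, 0)  # reachable here, since (0, 1) maps to it
subsystem = pyphi.Subsystem(network, state, range(network.size))
print(pyphi.compute.phi(subsystem))  # integrated information of this state
```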

12.
This study aimed to investigate consumers’ visual-image evaluation of wrist wearables based on Kansei engineering. A total of 8 representative samples were screened from 99 samples using the multidimensional scaling (MDS) method. Five groups of adjectives were identified to allow participants to express their visual impressions of wrist wearable devices through a questionnaire survey and factor analysis. The evaluation of the eight samples against the five groups of adjectives was analyzed using triangular fuzzy theory. The results showed relatively distinct evaluations of the eight samples on the “fashionable and individual” and “rational and decent” dimensions, but little distinction on the “practical and durable”, “modern and smart” and “convenient and multiple” dimensions. Furthermore, wrist wearables with a shape close to a traditional watch dial (round), with a bezel and mechanical buttons (moderate complexity), and with asymmetric forms received higher evaluations. The acceptance of square- and elliptical-shaped wrist wearables was relatively low. Among the square- and rectangular-shaped wrist wearables, the greater the curvature of the chamfer, the higher the acceptance. A clear contrast between the color of the screen and the casing was well accepted. The influence of display size on consumer evaluations was relatively small. Similar results were obtained in the evaluation of preferences and willingness to purchase. The results of this study objectively and effectively reflect consumers’ evaluations of, and potential demand regarding, the visual images of wrist wearables, and provide a reference for designers and industry professionals.
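For readers unfamiliar with the analysis technique named here, the following is a generic sketch of aggregating questionnaire ratings with triangular fuzzy numbers and ranking by centroid; the fuzzy scale and data are our own illustrative assumptions, not the study’s.

```python
# A generic sketch (not the study's exact procedure) of aggregating
# Likert-style ratings with triangular fuzzy numbers.
FUZZY_SCALE = {               # rating -> triangular fuzzy number (l, m, u)
    1: (0.0, 0.0, 0.25),
    2: (0.0, 0.25, 0.5),
    3: (0.25, 0.5, 0.75),
    4: (0.5, 0.75, 1.0),
    5: (0.75, 1.0, 1.0),
}

def aggregate(ratings):
    """Average the participants' triangular fuzzy numbers component-wise."""
    n = len(ratings)
    l = sum(FUZZY_SCALE[r][0] for r in ratings) / n
    m = sum(FUZZY_SCALE[r][1] for r in ratings) / n
    u = sum(FUZZY_SCALE[r][2] for r in ratings) / n
    return (l, m, u)

def centroid(tfn):
    """Defuzzify a triangular fuzzy number by its centroid."""
    return sum(tfn) / 3.0

ratings_sample_A = [4, 5, 3, 4, 5]   # hypothetical "fashionable" ratings
print(f"crisp score = {centroid(aggregate(ratings_sample_A)):.3f}")
```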

13.
Shannon’s entropy is one of the building blocks of information theory and an essential aspect of Machine Learning (ML) methods (e.g., Random Forests). Yet, it is only finitely defined for distributions with fast-decaying tails on a countable alphabet. The unboundedness of Shannon’s entropy over the general class of all distributions on an alphabet prevents its potential utility from being fully realized. To fill this void in the foundation of information theory, Zhang (2020) proposed generalized Shannon’s entropy, which is finitely defined everywhere. The plug-in estimator, adopted in almost all entropy-based ML method packages, is one of the most popular approaches to estimating Shannon’s entropy. The asymptotic distribution of the plug-in estimator of Shannon’s entropy has been well studied in the existing literature. This paper studies the asymptotic properties of the plug-in estimator of generalized Shannon’s entropy on countable alphabets. The developed asymptotic properties require no assumptions on the original distribution. The proposed asymptotic properties allow for interval estimation and statistical tests with generalized Shannon’s entropy.
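The plug-in estimator mentioned here simply substitutes empirical frequencies into the entropy formula. A minimal sketch for ordinary Shannon entropy follows; Zhang’s generalized version is not reproduced here.

```python
# Plug-in (maximum-likelihood) estimate of Shannon's entropy in nats:
# substitute empirical frequencies p_hat into H(p) = -sum p log p.
from collections import Counter
from math import log

def plug_in_entropy(sample):
    n = len(sample)
    counts = Counter(sample)
    return -sum((c / n) * log(c / n) for c in counts.values())

sample = list("aabbbccccddddd")
print(f"H_hat = {plug_in_entropy(sample):.4f} nats")
```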

14.
On the basis of his ‘Zürich Notebook’ I shall describe a particularly fruitful phase in Einstein’s struggle on the way to general relativity. These research notes are an extremely illuminating source for understanding Einstein’s main physical arguments and the conceptual difficulties that delayed his discovery of general relativity by about three years. Together with the ‘Entwurf’ theory, developed in collaboration with Marcel Grossmann, these notes also show that the final theory was missed by a hair’s breadth late in 1912. The Einstein-Grossmann theory, published almost exactly a hundred years ago, nevertheless contains virtually all the essential elements of Einstein’s definitive theory of gravitation.

15.
Our account provides a local, realist and fully non-causal principle explanation for EPR correlations, contextuality, no-signalling, and the Tsirelson bound. Indeed, the account herein is fully consistent with the causal structure of Minkowski spacetime. We argue that retrocausal accounts of quantum mechanics are problematic precisely because they do not fully transcend the assumption that causal or constructive explanation must always be fundamental. Unlike retrocausal accounts, our principle explanation is a complete rejection of Reichenbach’s Principle. Furthermore, we will argue that the basis for our principle account of quantum mechanics is the physical principle sought by quantum information theorists for their reconstructions of quantum mechanics. Finally, we explain why our account is both fully realist and psi-epistemic.
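For reference, the bound named here is the quantum ceiling on the CHSH combination of correlators (standard form, not quoted from the article),

$$
S = E(a,b) + E(a,b') + E(a',b) - E(a',b'), \qquad
|S| \le 2 \ \text{(local realism)}, \quad |S| \le 2\sqrt{2} \ \text{(quantum mechanics)}.
$$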

16.
The article argues that, at least in certain interpretations, such as the one assumed in this article under the heading of “reality without realism”, the quantum-theoretical situation appears as follows. While, in terms of probabilistic predictions, connected to and connecting the information obtained in quantum phenomena, the mathematics of quantum theory (QM or QFT), which is continuous, does not represent and is discontinuous with both the emergence of quantum phenomena and the physics of these phenomena, phenomena that are physically discontinuous with each other as well. These phenomena, and thus this information, are described by classical physics. All actually available information (in the mathematical sense of information theory) is classical: it is composed of units, such as bits, that are, or are contained in, entities described by classical physics. On the other hand, classical physics cannot predict this information when it is created, as manifested in measuring instruments in quantum experiments, while quantum theory can. In this epistemological sense, this information is quantum. The article designates the discontinuity between quantum theory and the emergence of quantum phenomena the “Heisenberg discontinuity”, because it was introduced by W. Heisenberg along with QM, and the discontinuity between QM or QFT and the classical physics of quantum phenomena the “Bohr discontinuity”, because it was introduced as part of Bohr’s interpretation of quantum phenomena and QM, under the assumption of the Heisenberg discontinuity. Combining both discontinuities precludes QM or QFT from being connected to either stratum of physical reality, that ultimately responsible for quantum phenomena or that of these phenomena themselves, other than by means of probabilistic predictions concerning the information, classical in character, contained in quantum phenomena. The nature of quantum information is, in this view, defined by this situation. A major implication, discussed in the Conclusion, is the existence, and arguably the necessity, of two (classical and quantum), or with relativity three, and possibly more, essentially different theories in fundamental physics.

17.
As multilayer networks are widely applied in modern society, numerous studies have shown the impact of multilayer network structure and network properties on the proportion of cooperators in a network. In this paper, we use Barabási–Albert scale-free networks (BA) and Watts–Strogatz networks (WS) to build a multilayer network structure, and we propose a new strategy-updating rule called “cooperation-defection dominance”, which can be likened to dominant and recessive traits in genetics. With the newly constructed multilayer network structure and the strategy-updating rule, the simulation results show that in the BA-BA network, the cooperation-dominance strategy can make networks with different values of the payoff parameter r show a cooperative trend, while the defection-dominance strategy only has a clear effect on cooperation when r is large. When the BA network is connected to the WS network, we find that the effect of the strategy on the proportion of cooperators decreases, the main influencing factor being the structure of the network. In the three-layer network, the cooperation-dominance strategy has a greater impact on the BA network, and the proportion of cooperators is enhanced more than under the natural-evolution strategy, but the promotion effect is still smaller than in the two-layer BA network because of the WS network. Under the defection-dominance strategy, the WS layer behaves differently than under the first two strategies; we conclude through simulation that when the payoff parameter is at a middle level its cooperator proportion is suppressed, and we infer that the proportions of cooperators and defectors, as well as the payoff, play an important role.
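To fix ideas about the kind of simulation described, here is a generic sketch of one evolutionary round on a BA layer, using the standard Fermi imitation rule as a baseline for the “natural evolution” case; the paper’s “cooperation-defection dominance” rule is not reproduced here, and all parameters are illustrative.

```python
# Generic sketch of evolutionary-game dynamics on a network, using a
# standard Fermi imitation rule; NOT the paper's dominance rule.
import math
import random
import networkx as nx

N, r = 200, 1.2                         # network size and payoff parameter
ba = nx.barabasi_albert_graph(N, m=3, seed=1)       # BA layer
ws = nx.watts_strogatz_graph(N, k=4, p=0.1, seed=1) # WS layer (unused below)
strategy = {i: random.choice("CD") for i in range(N)}

def payoff(g, i):
    """Weak prisoner's dilemma payoff of node i accumulated on layer g."""
    p = 0.0
    for j in g.neighbors(i):
        if strategy[i] == "C":
            p += 1.0 if strategy[j] == "C" else 0.0
        else:
            p += r if strategy[j] == "C" else 0.0
    return p

def fermi_update(g, K=0.1):
    """Each node imitates a random neighbor with the Fermi probability."""
    new = dict(strategy)
    for i in g.nodes:
        nbrs = list(g.neighbors(i))
        if not nbrs:
            continue
        j = random.choice(nbrs)
        prob = 1.0 / (1.0 + math.exp((payoff(g, i) - payoff(g, j)) / K))
        if random.random() < prob:
            new[i] = strategy[j]
    strategy.update(new)

for _ in range(50):
    fermi_update(ba)   # a multilayer study would couple the layers here
print("cooperator fraction:", sum(s == "C" for s in strategy.values()) / N)
```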

18.
Quantum mechanics predicts correlations between measurements performed in distant regions of a spatially spread entangled state that are higher than allowed by intuitive concepts of Locality and Realism. These high correlations forbid the use of nonlinear operators of evolution (which would be desirable for several reasons), for they may allow faster-than-light signaling. As a way out of this situation, it has been hypothesized that the high quantum correlations develop only after a time longer than L/c has elapsed (where L is the spread of the entangled state and c is the velocity of light). At shorter times, correlations compatible with Locality and Realism would be observed instead. A simple hidden-variables model following this hypothesis is described. It is based on a modified Wheeler–Feynman theory of radiation. This hypothesis has not been disproved by any of the experiments performed to date. A test achievable with accessible means is proposed and described. It involves a pulsed source of entangled states and stroboscopic recording of particle detections during the pulses. Data recorded in similar but incomplete optical experiments are analyzed and found to be consistent with the proposed model. However, it is not claimed, in any sense, that the hypothesis has been validated. On the contrary, it is stressed that a complete, specific test is absolutely needed.
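To give a sense of the time resolution such a test requires (our numbers, purely illustrative): for an entangled state spread over L = 12 m,

$$
\frac{L}{c} = \frac{12\ \text{m}}{3\times 10^{8}\ \text{m/s}} = 4\times 10^{-8}\ \text{s} = 40\ \text{ns},
$$

so the stroboscopic record must resolve detection times on the tens-of-nanoseconds scale.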

19.
20.
How, if at all, consciousness can be part of the physical universe remains a baffling problem. This article outlines a new, developing philosophical theory of how it could do so, and offers a preliminary mathematical formulation of a physical grounding for key aspects of the theory. Because the philosophical side has radical elements, the physical-theory side does as well. The philosophical side is radical, first, in proposing that the productivity or dynamism in the universe that many believe to be responsible for its systematic regularities is actually itself a physical constituent of the universe, along with more familiar entities. Indeed, it proposes that instances of dynamism can themselves take part in physical interactions with other entities, this interaction then being “meta-dynamism” (a type of meta-causation). Secondly, the theory is radical, and unique, in arguing that consciousness is necessarily partly constituted by meta-dynamic auto-sensitivity; in other words, it must react via meta-dynamism to its own dynamism. It further conjectures that some specific form of this sensitivity is sufficient for, and indeed constitutive of, consciousness. The article proposes a way for physical laws to be modified to accommodate meta-dynamism, via the radical step of including elements that explicitly refer to dynamism itself. Additionally, the laws become explicitly temporally non-local, in referring directly to quantity values holding at times prior to a given instant of application of the law. The approach therefore implicitly brings in considerations about what information determines states. Because of the temporal non-locality, and also because of the deep connections between dynamism and time-flow, the approach also implicitly connects to the topic of entropy insofar as entropy is related to time.
