Similar Literature
20 similar documents found (search time: 0 ms)
1.
    
In a previous article we presented an argument to obtain (or rather infer) Born’s rule, based on a simple set of axioms named “Contexts, Systems and Modalities” (CSM). In this approach there is no “emergence”; rather, the structure of quantum mechanics can be attributed to an interplay between the quantized number of modalities accessible to a quantum system and the continuum of contexts required to define these modalities. The strong link of this derivation with Gleason’s theorem was emphasized, with the argument that CSM provides a physical justification for Gleason’s hypotheses. Here we extend this result by showing that one essential hypothesis among these, the need for unitary transforms to relate different contexts, can be removed; it is better seen as a necessary consequence of Uhlhorn’s theorem.

2.
    
Shannon’s entropy is one of the building blocks of information theory and an essential aspect of Machine Learning (ML) methods (e.g., Random Forests). Yet it is finitely defined only for distributions with fast-decaying tails on a countable alphabet. The unboundedness of Shannon’s entropy over the general class of all distributions on an alphabet prevents its potential utility from being fully realized. To fill this void in the foundations of information theory, Zhang (2020) proposed the generalized Shannon’s entropy, which is finitely defined everywhere. The plug-in estimator, adopted in almost all entropy-based ML packages, is one of the most popular approaches to estimating Shannon’s entropy, and its asymptotic distribution is well studied in the existing literature. This paper studies the asymptotic properties of the plug-in estimator of generalized Shannon’s entropy on countable alphabets. The developed asymptotic properties require no assumptions on the original distribution and allow for interval estimation and statistical tests with generalized Shannon’s entropy.
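As an illustration of the plug-in approach discussed above, here is a minimal sketch (function name and sample data are ours, not the paper’s) that substitutes empirical frequencies for the unknown probabilities:

```python
import numpy as np

def plugin_shannon_entropy(samples):
    """Plug-in estimator: replace the unknown distribution p with
    empirical frequencies p_hat and evaluate -sum(p_hat*log(p_hat))."""
    _, counts = np.unique(samples, return_counts=True)
    p_hat = counts / counts.sum()
    return -np.sum(p_hat * np.log(p_hat))  # entropy in nats

# Hypothetical usage: estimate entropy of draws from a small alphabet.
rng = np.random.default_rng(0)
samples = rng.choice(["a", "b", "c"], size=1000, p=[0.5, 0.3, 0.2])
print(plugin_shannon_entropy(samples))  # close to the true value of ~1.03 nats
```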

3.
    
The micro-canonical, canonical, and grand canonical ensembles of walks defined on finite connected undirected graphs are considered in the thermodynamic limit of infinite walk length. As infinitely long paths are extremely sensitive to structural irregularities and defects, their properties are used to describe the degree of structural imbalance, anisotropy, and navigability of finite graphs. For the first time, we introduce an entropic force and pressure describing the effect of graph defects on the mobility patterns associated with very long walks in finite graphs; navigation in graphs and the navigability to nodes by different types of ergodic walks; and a node’s fugacity in the course of prospective network expansion or shrinking.
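The ensembles themselves are not reproduced here, but a closely related quantity is the entropy rate of infinitely long walks on a connected undirected graph, which equals the logarithm of the adjacency spectral radius; a minimal NumPy sketch (the example graph is ours):

```python
import numpy as np

def walk_entropy_rate(adjacency):
    """Entropy rate (nats per step) of infinitely long walks:
    ln(lambda_max), where lambda_max is the adjacency spectral radius."""
    eigenvalues = np.linalg.eigvalsh(adjacency)  # symmetric => real spectrum
    return np.log(eigenvalues.max())

# Hypothetical example: a 4-cycle, whose spectral radius is 2.
C4 = np.array([[0, 1, 0, 1],
               [1, 0, 1, 0],
               [0, 1, 0, 1],
               [1, 0, 1, 0]], dtype=float)
print(walk_entropy_rate(C4))  # ln(2) ~= 0.693
```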

4.
This study uses fourteen stock indices as its sample and applies eight parametric volatility forecasting models and eight composed volatility forecasting models to explore whether a neural network approach and the settings of leverage effect and non-normal return distributions can improve volatility forecasting performance, and which of the sixteen models forecasts volatility best. The eight parametric models combine the generalized autoregressive conditional heteroskedasticity (GARCH) or GJR-GARCH volatility specification with the normal, Student’s t, skewed Student’s t, and generalized skewed Student’s t distributions. Empirical results show that the composed volatility forecasting approach performs significantly better than the parametric approach, and that the GJR-GARCH volatility specification outperforms the GARCH one. The non-normal distributions, however, do not forecast better than the normal distribution. The GJR-GARCH model combined with the normal distribution and a neural network approach has the best volatility forecasting performance among the sixteen models. Thus, a neural network approach significantly improves volatility forecasting, and the leverage-effect setting also helps, whereas the non-normal distribution setting does not.
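A minimal sketch of the GJR-GARCH(1,1) variance recursion used by the better-performing specification, with the indicator term carrying the leverage effect (parameter values and initialization are illustrative assumptions, not the study’s estimates):

```python
import numpy as np

def gjr_garch_variance(returns, omega, alpha, gamma, beta):
    """One-step conditional variances under GJR-GARCH(1,1):
    sigma2[t] = omega + (alpha + gamma*I[r<0]) * r[t-1]**2 + beta*sigma2[t-1],
    where the indicator term I[r<0] captures the leverage effect."""
    sigma2 = np.empty(len(returns))
    sigma2[0] = returns.var()  # a common initialization choice
    for t in range(1, len(returns)):
        leverage = gamma if returns[t - 1] < 0 else 0.0
        sigma2[t] = omega + (alpha + leverage) * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

# Hypothetical usage with made-up parameters on simulated returns.
rng = np.random.default_rng(1)
r = rng.normal(0.0, 0.01, size=500)
print(gjr_garch_variance(r, omega=1e-6, alpha=0.05, gamma=0.08, beta=0.90)[-5:])
```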

5.
    
Wigner’s friend scenarios involve an Observer, or Observers, measuring a Friend, or Friends, who themselves make quantum measurements. In recent discussions, it has been suggested that quantum mechanics may not always be able to provide a consistent account of a situation involving two Observers and two Friends. We investigate this problem by invoking the basic rules of quantum mechanics as outlined by Feynman in the well-known “Feynman Lectures on Physics”. We show that these “Feynman rules” constrain the a priori assumptions which can be made in generalised Wigner’s friend scenarios, because the existence of the probabilities of interest ultimately depends on the availability of physical evidence (material records) of the system’s past. With these constraints obeyed, an unambiguous and consistent account of all measurement outcomes is obtained for all agents taking part in the various Wigner’s friend scenarios.

6.
    
We address the problem of telegraphic transport in several dimensions. We review the derivation of the two- and three-dimensional telegrapher’s equations, as well as their fractional generalizations, from microscopic random walk models for transport (normal and anomalous). We also present new results on solutions of the higher-dimensional fractional equations.
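For reference, a common form of the telegrapher’s equation in $d$ dimensions, with relaxation time $\tau$ and propagation speed $v$ (our notation, not necessarily the review’s), is

$$ \frac{\partial^2 p}{\partial t^2} + \frac{1}{\tau}\frac{\partial p}{\partial t} = v^{2}\,\nabla^{2} p; $$

the fractional generalizations discussed above replace the time derivatives with fractional-order ones.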

7.
    
Maxwell’s demon is an entity in a 150-year-old thought experiment that paradoxically appears to violate the second law of thermodynamics by reducing entropy without doing work. It has increasingly practical implications as advances in nanomachinery produce devices that push the thermodynamic limits imposed by the second law. A well-known explanation claiming that information erasure restores second-law compliance fails to resolve the paradox, because it assumes the second law a priori and does not predict irreversibility. Instead, a purely mechanical resolution that does not require information theory is presented. The transport fluxes of mass, momentum, and energy involved in the demon’s operation are analyzed, showing that they imply “hidden” external work and dissipation. Computing this dissipation leads to a new lower bound on the entropy production of the demon. It is strictly positive in all nontrivial cases, providing a more stringent limit than the second law and implying intrinsic thermodynamic irreversibility. This thermodynamic irreversibility is linked with the mechanical irreversibility resulting from the spatial asymmetry of the demon’s speed-selection criteria, indicating one mechanism by which macroscopic irreversibility may emerge from microscopic dynamics.

8.
    
We introduce a quantum key distribution protocol based on the mean multi-kings’ problem, with which a sender can share a bit sequence as a secret key with receivers. We consider the relation between the information gained by an eavesdropper and the disturbance introduced into the legitimate users’ information; in the BB84 protocol, such a relation is known as the information-disturbance theorem. We focus on a setting in which the sender and two receivers try to share bit sequences and the eavesdropper tries to extract information by letting an ancilla system interact with the legitimate users’ systems. We derive trade-off inequalities between the distinguishability, for the eavesdropper, of the quantum states corresponding to the bit sequence and the error probability of the bit sequence shared by the legitimate users. Our inequalities show that an eavesdropper’s extraction of information about the secret keys inevitably disturbs the states and increases the error probability.

9.
    
We present a hypothetical argument against finite-state processes in statistical language modeling that is based on semantics rather than syntax. In this theoretical model, we suppose that the semantic properties of texts in a natural language can be approximately captured by the recently introduced concept of a perigraphic process. Perigraphic processes are a class of stochastic processes that satisfy a Zipf-law accumulation of a subset of factual knowledge which is time-independent, compressed, and effectively inferable from the process. We show that the classes of finite-state processes and of perigraphic processes are disjoint, and we present a new simple example of a perigraphic process over a finite alphabet, called an Oracle process. The disjointness result makes use of the Hilberg condition, i.e., the almost sure power-law growth of algorithmic mutual information. Using a strongly consistent estimator of the number of hidden states, we show that finite-state processes do not satisfy the Hilberg condition, whereas Oracle processes satisfy it via the data-processing inequality. We discuss the relevance of these mathematical results for theoretical and computational linguistics.
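In one common formulation (our notation, not necessarily the paper’s), the Hilberg condition requires almost sure power-law growth of the algorithmic mutual information between adjacent blocks,

$$ I\bigl(X_{1:n} ;\, X_{n+1:2n}\bigr) \;\propto\; n^{\beta}, \qquad 0 < \beta < 1, $$

whereas for a finite-state process this mutual information stays bounded by the logarithm of the number of hidden states, since past and future are conditionally independent given the current state.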

10.
11.
    
Information entropy metrics have been applied to a wide range of problems that can be abstracted as complex networks. This growing body of research is scattered across multiple disciplines, which makes it difficult to identify the available metrics and to understand the contexts in which they are applicable. In this work, a narrative literature review of information entropy metrics for complex networks is conducted following the PRISMA guidelines. Existing entropy metrics are classified according to three criteria: whether the metric describes a property of the whole graph or of a graph component (such as the nodes), the chosen probability distribution, and the types of complex networks to which the metric is applicable. On this basis, the work identifies areas in need of further development, aiming to guide future research efforts.
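As a concrete instance of a graph-level metric built on a chosen probability distribution, the following sketch computes the Shannon entropy of a graph’s degree distribution with networkx (the example graphs are arbitrary):

```python
import math
import networkx as nx

def degree_distribution_entropy(G):
    """Shannon entropy (in nats) of the empirical degree distribution,
    a common scalar entropy metric for complex networks."""
    degrees = [d for _, d in G.degree()]
    n = len(degrees)
    counts = {}
    for d in degrees:
        counts[d] = counts.get(d, 0) + 1
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# Hypothetical usage: a homogeneous graph scores 0, a heterogeneous one > 0.
print(degree_distribution_entropy(nx.cycle_graph(100)))               # 0.0
print(degree_distribution_entropy(nx.barabasi_albert_graph(100, 2)))  # > 0
```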

12.
    
Landauer’s principle provides a fundamental lower bound for the energy dissipated when information is erased in the quantum regime. While most studies have related the entropy reduction accompanying the erasure to this lower bound (the entropic bound), recent efforts have provided another lower bound associated with the thermal fluctuation of the dissipated energy (the thermodynamic bound). The coexistence of the two bounds has stimulated comparative studies of their properties; however, these studies were performed for systems where the time evolution of the diagonal (population) and off-diagonal (coherence) elements of the density matrix are decoupled. In this paper, we broaden the comparative study to include the influence of quantum coherence induced by a tilted system–reservoir interaction direction. By examining their dependence on the initial state of the information-bearing system, we find that the following properties of the bounds hold generically, whether or not the influence of coherence is present: the entropic bound is tighter for a sufficiently mixed initial state, while the thermodynamic bound is tighter when the purity of the initial state is sufficiently high. The exception is the case where the system dynamics involve only phase relaxation; there, the two bounds coincide when the initial coherence is zero, and otherwise the thermodynamic bound is tighter. We also find that quantum information erasure inevitably entails a constant energy dissipation caused by the creation of system–reservoir correlation, which may add an energetic cost to the erasure.
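For context, the textbook statement of Landauer’s principle bounds the average heat dissipated when one bit is erased into a reservoir at temperature $T$ (the entropic and thermodynamic bounds above refine this in the quantum regime):

$$ \langle Q \rangle \;\ge\; k_{\mathrm B} T \ln 2. $$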

13.
    
In this paper, a formulation of time-fractional (TF) electrodynamics is derived based on the Riemann-Silberstein (RS) vector. With this vector and fractional-order derivatives, the TF Maxwell’s equations can be written in a compact form that allows for modelling of energy dissipation and of the dynamics of electromagnetic systems with memory. We therefore formulate the TF Maxwell’s equations using the RS vector and analyse their properties from the point of view of classical electrodynamics, i.e., energy and momentum conservation, reciprocity, and causality. Afterwards, we derive classical solutions to wave-propagation problems, assuming helical, spherical, and cylindrical symmetries of the solutions. The results are supported by numerical simulations and their analysis. A discussion of the relations between the TF Schrödinger equation and TF electrodynamics is included as well.
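For reference, the Riemann-Silberstein vector packs both fields into one complex field, and the source-free Maxwell equations then reduce to a single curl equation (integer-order form shown, up to normalization and sign conventions; the TF theory replaces the time derivative with a fractional one):

$$ \mathbf{F} = \mathbf{E} + i c\,\mathbf{B}, \qquad \frac{\partial \mathbf{F}}{\partial t} = -\,i c\,\nabla \times \mathbf{F}, \qquad \nabla \cdot \mathbf{F} = 0. $$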

14.
    
This paper shows if and how the predictability and complexity of stock market data have changed over the last half-century and what influence the M1 money supply has. We use three machine learning algorithms, i.e., a stochastic gradient descent linear regression, a lasso regression, and an XGBoost tree regression, to test the predictability of two stock market indices, the Dow Jones Industrial Average and the NASDAQ (National Association of Securities Dealers Automated Quotations) Composite. In addition, all data under study are discussed in the context of a variety of measures of signal complexity. The results of this complexity analysis are then linked with the machine learning results to discover trends and correlations between predictability and complexity. Our results show a decrease in predictability and an increase in complexity for more recent years. We find a correlation between approximate entropy, sample entropy, and the predictability of the employed machine learning algorithms on the data under study. This link between the predictability of machine learning algorithms and the mentioned entropy measures has not been shown before. It should be considered when analyzing and predicting complex time series data such as stock market data, for example to identify regions of increased predictability.
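A compact, unoptimized sketch of sample entropy, one of the complexity measures the study links to predictability (the defaults m = 2 and r = 0.2 times the standard deviation follow common convention, not necessarily the paper’s settings):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) = -ln(A/B), where B counts pairs of
    length-m templates within Chebyshev distance r, and A does the same
    for length m+1. Lower values indicate more regularity."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()  # conventional tolerance choice

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        count = 0
        for i in range(len(templates)):  # count each unordered pair once
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist < r)
        return count

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B)

# Hypothetical comparison: noise should score higher than a regular signal.
rng = np.random.default_rng(2)
print(sample_entropy(np.sin(np.linspace(0, 20 * np.pi, 1000))))  # low
print(sample_entropy(rng.standard_normal(1000)))                 # high
```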

15.
    
Rao’s score, Wald, and likelihood ratio tests are the most common procedures for testing hypotheses in parametric models. None of the three test statistics is uniformly superior to the other two in terms of power, and moreover they are first-order equivalent and asymptotically optimal. However, these three classical tests have serious robustness problems, as they are based on the maximum likelihood estimator, which is highly non-robust. To overcome this drawback, test statistics based on robust estimators have been introduced in the literature, such as robust generalized Wald-type and Rao-type tests based on minimum divergence estimators. In this paper, restricted minimum Rényi’s pseudodistance estimators are defined, and their asymptotic distribution and influence function are derived. Further, robust Rao-type and divergence-based tests based on minimum Rényi’s pseudodistance and restricted minimum Rényi’s pseudodistance estimators are considered, and the asymptotic properties of the new families of test statistics are obtained. Finally, the robustness of the proposed estimators and test statistics is examined empirically through a simulation study, and illustrative applications to real-life data are analyzed.
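For orientation, the classical Wald statistic that such Wald-type tests generalize is, for testing $H_0:\ \theta=\theta_0$ with an estimator $\hat{\theta}_n$ and asymptotic covariance $\Sigma$,

$$ W_n = n\,(\hat{\theta}_n - \theta_0)^{\top}\, \Sigma^{-1}\, (\hat{\theta}_n - \theta_0) \ \xrightarrow{\ d\ }\ \chi^2_p \quad \text{under } H_0; $$

the robust variants sketched in the abstract plug a minimum-pseudodistance estimator and its asymptotic covariance into the same quadratic form.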

16.
    
Living cells are complex systems whose crowded fluids contain hundreds of different components, including, in particular, a high density of polymers. They are an excellent and challenging laboratory for studying exotic emergent physical phenomena, in which entropic forces arise from the organization of many-body interactions. The competition between microscopic and entropic forces may generate complex behaviors, such as phase transitions, which living cells may exploit to accomplish their functions. In the era of big data, where biological information abounds but general principles and a precise understanding of the microscopic interactions are scarce, entropy methods may offer significant insight. In this work, we develop a model in which a complex thermodynamic equilibrium results from the competition between an effective short-range electrostatic interaction and the entropic forces emerging in a fluid crowded by polymers of different sizes. The target audience for this article is interdisciplinary researchers in complex systems, particularly in thermodynamics and biophysics modeling.

17.
    
Many small biological objects, such as viruses, survive in a water environment and cannot remain active in dry air without condensation of water vapor. From a physical point of view, these objects belong to the mesoscale, where small thermal fluctuations with the characteristic kinetic energy kBT (where kB is Boltzmann’s constant and T is the absolute temperature) play a significant role. The self-assembly of viruses, including protein folding and the formation of a protein capsid and lipid bilayer membrane, is controlled by hydrophobic forces (i.e., the effective repulsive forces between water and hydrophobic particles or regions of molecules) in a water environment. Hydrophobic forces are entropic; they are driven by a system’s tendency to attain the maximally disordered state. In information systems, by contrast, entropic forces are responsible for erasing information if the energy barrier between the two states of a switch is on the order of kBT, as described by Landauer’s principle. We treat the hydrophobic interactions responsible for the self-assembly of viruses as an information-processing mechanism, and we further show the similarity of these submicron-scale processes to self-assembly in colloidal crystals, droplet clusters, and liquid marbles.
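To put the $k_{\mathrm B}T$ scale in numbers: at $T = 300\ \mathrm{K}$, the Landauer minimum erasure energy is

$$ k_{\mathrm B} T \ln 2 = (1.381\times10^{-23}\ \mathrm{J/K})(300\ \mathrm{K})(0.693) \approx 2.9\times10^{-21}\ \mathrm{J}, $$

the same order as the thermal fluctuations that drive the mesoscale self-assembly discussed above.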

18.
    
We suggest a quantitative and objective notion of emergence. Our proposal uses algorithmic information theory as the basis for an objective framework in which a bit string encodes observational data. A plurality of drops in the Kolmogorov structure function of such a string is seen as the hallmark of emergence. Our definition yields several theoretical results, in addition to extending the notions of coarse-graining and boundary conditions. Finally, we confront our proposal with applications to dynamical systems and thermodynamics.
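For reference, the Kolmogorov structure function of a string $x$ at complexity budget $\alpha$ is standardly defined as

$$ h_x(\alpha) \;=\; \min\bigl\{ \log_2 |S| \;:\; x \in S,\ K(S) \le \alpha \bigr\}, $$

where $S$ ranges over finite sets of strings and $K(S)$ is the Kolmogorov complexity of $S$; a “drop” is a sharp decrease of $h_x$ as $\alpha$ grows.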

19.
    
We consider state changes in quantum theory due to “conditional action” and relate these to the discussion of entropy decrease due to interventions by “intelligent beings” and to the principles of Szilard and Landauer/Bennett. The mathematical theory of conditional actions is a special case of the theory of “instruments”, which describes changes of state due to general measurements and is therefore briefly outlined in the present paper. As a detailed example, we consider the imperfect erasure of a qubit, which can also be viewed as a conditional action and is realized by coupling a spin to another small spin system in its ground state.
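In the standard formulation, an instrument assigns to each measurement outcome $k$ a completely positive map $\Phi_k$ whose sum is trace-preserving; the outcome probabilities and post-measurement states are then (our notation)

$$ p_k = \operatorname{tr} \Phi_k(\rho), \qquad \rho_k = \frac{\Phi_k(\rho)}{\operatorname{tr} \Phi_k(\rho)}, \qquad \operatorname{tr} \sum_k \Phi_k(\rho) = 1. $$

A conditional action, as described above, corresponds to applying a chosen operation conditioned on the observed outcome $k$.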

20.
    
We study the prisoner’s dilemma game on signed networks, which contain two types of links: positive and negative. To define the payoff matrix between players connected by a negative link, we multiply the payoff matrix between players connected by a positive link by −1. To investigate the effect of negative links on cooperative behavior, we perform simulations for different negative link densities. When the negative link density is low, the cooperator density falls to zero as the temptation payoff b increases, where b is the payoff a defector receives from playing against a cooperator. Conversely, when the negative link density is high, the cooperator density approaches 1 as b increases, because players connected by a negative link suffer greater payoff damage if they do not cooperate with each other. Negative links thus force players to cooperate, enhancing cooperative behavior.
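A minimal sketch of the signed payoff rule described above; the underlying weak prisoner’s dilemma payoff values (reward 1, temptation b, all others 0) are our assumption, consistent with the role of b in the abstract:

```python
import numpy as np

def payoff_matrix(b):
    """Row player's payoffs in a weak prisoner's dilemma:
    rows/cols are strategies (0 = cooperate, 1 = defect)."""
    return np.array([[1.0, 0.0],   # C vs C -> 1,  C vs D -> 0
                     [b,   0.0]])  # D vs C -> b,  D vs D -> 0

def game_payoff(strategy_i, strategy_j, link_sign, b):
    """Payoff to player i against neighbor j: on a negative link
    (link_sign = -1) the payoff matrix is multiplied by -1."""
    return link_sign * payoff_matrix(b)[strategy_i, strategy_j]

# Hypothetical example with temptation b = 1.5.
print(game_payoff(1, 0, +1, b=1.5))  # defector exploits cooperator: +1.5
print(game_payoff(1, 0, -1, b=1.5))  # same pair on a negative link: -1.5
```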
