Similar Documents
20 similar documents found.
1.
Entropy profiling is a recently introduced approach that reduces parametric dependence in traditional Kolmogorov-Sinai (KS) entropy measurement algorithms. The choice of the threshold parameter r for vector distances in traditional entropy computations is crucial in determining how accurately these methods retrieve signal irregularity information. In addition to making parametric choices completely data-driven, entropy profiling generates a complete profile of entropy information rather than the single entropy estimate produced by traditional algorithms. The benefits of using “profiling” instead of “estimation” are: (a) precursory methods such as approximate and sample entropy, previously limited in their ability to handle short-term signals (fewer than 1000 samples), can now do so; (b) the entropy measure can capture complexity information from short- and long-term signals without multi-scaling; and (c) the new approach facilitates enhanced information retrieval from short-term HRV signals. The novel concept of entropy profiling has thus equipped traditional algorithms to overcome existing limitations and broadens their applicability in short-term signal analysis. In this work, we present a review of KS-entropy methods and their limitations in the context of short-term heart rate variability analysis and elucidate the benefits of using entropy profiling as an alternative.
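For orientation, the following is a minimal Python sketch of the classic, threshold-dependent sample entropy estimator that entropy profiling seeks to improve upon; the function name, the default m = 2, and the conventional r = 0.2·SD choice are illustrative assumptions, not the authors' profiling algorithm.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Classic SampEn with tolerance r = r_factor * SD(x) (illustrative sketch)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    r = r_factor * np.std(x)  # the threshold parameter the abstract refers to

    def count_matches(length):
        # All overlapping templates of the given length.
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance to every other template; exclude self-matches.
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += np.sum(dist <= r) - 1
        return count

    b = count_matches(m)      # template matches of length m
    a = count_matches(m + 1)  # template matches of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

# White noise is more irregular than a pure sine, so its SampEn is higher.
rng = np.random.default_rng(0)
print(sample_entropy(rng.standard_normal(500)))
print(sample_entropy(np.sin(np.linspace(0, 20 * np.pi, 500))))
```

The abstract's point is that the value above hinges on the choice of r; entropy profiling instead produces a data-driven profile across tolerances rather than this single estimate.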

2.
3.
Simple Summary: The second law of thermodynamics has a mystical appeal in disciplines with tenuous connections to its origins. We hypothesize that many of these appeals should instead be to another principle heretofore unrecognized: the law of mixed-up-ness (LOM). Instead of using a number such as entropy to characterize randomness, non-thermodynamic systems can be arranged in simple diagrams according to their degree of mixed-up-ness. Curiously, the evolution of such systems from low to high degrees of mixed-up-ness is both consistent with and richer than the principle of increasing entropy.
Abstract: Mixed-up-ness can be traced to unpublished notes by Josiah Gibbs. Subsequently, the concept was developed independently, and under somewhat different names, by other investigators. The central idea of mixed-up-ness is that system states can be organized in a hierarchy by their degree of mixed-up-ness. In its purest form, the organizing principle is independent of the principles of thermodynamics and statistical mechanics, nor does it imply irreversibility. Yet Gibbs and subsequent investigators kept entropy as the essential concept in determining system evolution, thus retaining the notion that systems evolve from states of perfect “order” to states of total “disorder”. Nevertheless, increasing mixed-up-ness is consistent with increasing entropy, although there is no unique one-to-one connection between the two. We illustrate the notion of mixed-up-ness with an application to the permutation function of integer partitions and then formalize it as a fundamental hierarchical principle, the law of mixed-up-ness (LOM), for non-thermodynamic systems.

4.
Entropy indicates the irregularity or randomness of a dynamic system. Over the decades, entropy calculated at different scales of the system, through subsampling or coarse graining, has been used as a surrogate measure of system complexity. One popular multi-scale entropy analysis is multi-scale sample entropy (MSE), which calculates entropy through the sample entropy (SampEn) formula at each time scale. SampEn is defined as the negative logarithm of the conditional probability that a small section of the data (within a window of length m) that “matches” other sections will still “match” them when the window length increases by one. A “match” is defined by a threshold of r times the standard deviation of the entire time series. A problem with the current MSE algorithm is that SampEn calculations at different scales are all based on the same matching threshold defined by the original time series, while the data standard deviation actually changes with the subsampling scale. Using a fixed threshold therefore introduces a systematic bias into the calculation results. The purpose of this paper is to present this systematic bias mathematically and to provide methods for correcting it. Our work will help the large MSE user community avoid introducing this bias into their multi-scale SampEn calculation results.
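The bias described above can be made concrete with a short sketch (illustrative only; the white-noise example, variable names, and the simple per-scale rescaling shown here are assumptions, not the paper's correction method): coarse graining reduces the standard deviation of the series, so keeping the threshold fixed at 0.2 times the SD of the original series loosens the matching criterion at larger scales.

```python
import numpy as np

def sampen(x, m=2, r=None):
    """Sample entropy with an explicit, absolute tolerance r (illustrative sketch)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)

    def matches(length):
        t = np.array([x[i:i + length] for i in range(n - length)])
        c = 0
        for i in range(len(t)):
            c += np.sum(np.max(np.abs(t - t[i]), axis=1) <= r) - 1
        return c

    a, b = matches(m + 1), matches(m)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

def coarse_grain(x, scale):
    """Average non-overlapping blocks of `scale` samples (the standard MSE step)."""
    n = (len(x) // scale) * scale
    return np.asarray(x[:n], dtype=float).reshape(-1, scale).mean(axis=1)

rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
r_fixed = 0.2 * np.std(x)  # threshold taken from the ORIGINAL series
for scale in (1, 2, 5):
    y = coarse_grain(x, scale)
    print(scale,
          round(sampen(y, r=r_fixed), 3),          # conventional MSE: fixed threshold
          round(sampen(y, r=0.2 * np.std(y)), 3))  # threshold rescaled to the coarse-grained SD
```

For white noise the two columns drift apart as the scale increases, illustrating the kind of systematic offset the paper analyzes.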

5.
The best-known and most widely used abstract model of the financial market is based on the concept of the informational efficiency of that market (EMH). The paper proposes an alternative that could be named the behavioural efficiency of the financial market, based on behavioural entropy instead of informational entropy. More specifically, the paper supports the idea that, in the financial market, the only measure (if any) of entropy is the set of available behaviours indicated by implicit information. The behavioural entropy is therefore linked to the concept of behavioural efficiency. The paper argues that, in fact, financial markets exhibit not a (real) informational efficiency but a behavioural efficiency instead. The proposal is based both on a new typology of information in the financial market (which provides the concept of implicit information, that is, information “translated” by economic agents from observing actual behaviours) and on a non-linear (more exactly, logistic) curve linking behavioural entropy to the behavioural efficiency of financial markets. Finally, the paper proposes a synergistic overcoming of both EMH and AMH based on the new concept of behavioural entropy in the financial market.

6.
Some problems of describing biological systems using entropy as a measure of their complexity are considered. Entropy is studied both for the organism as a whole and for its parts, down to the molecular level. Correlation of the actions of various parts of the whole organism, intercellular interactions and control, as well as cooperativity at the microlevel, lead to a more complex structure and lower statistical entropy. For a multicellular organism, entropy is much lower than the entropy of the same mass of a colony of unicellular organisms. Cooperativity always reduces the entropy of the system; a simple example of ligand binding to a macromolecule carrying two reaction centers shows how entropy is consistent with the ambiguity of the result in the Bernoulli test scheme. Particular attention is paid to the qualitative and quantitative relationship between the entropy of the system and the cooperativity of ligand binding to macromolecules. A kinetic model of metabolism, corresponding to Schrödinger’s concept of the maintenance of biosystems by “negentropy feeding”, is proposed. This model allows calculating the nonequilibrium local entropy and comparing it with the local equilibrium entropy inherent in non-living matter.
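As a hedged illustration of the Bernoulli-scheme remark, the following are standard information-theoretic identities (not the paper's specific two-centre binding model): the entropy of a single binding site and the effect of statistical coupling between two sites.

```latex
% Entropy of one binding site occupied with probability p (one Bernoulli trial):
H(p) = -\,p\ln p - (1-p)\ln(1-p), \qquad 0 \le H(p) \le \ln 2 .
% For two sites, the joint entropy satisfies
H(X_1, X_2) = H(X_1) + H(X_2) - I(X_1; X_2) \;\le\; H(X_1) + H(X_2),
% with equality only for independent sites (I = 0); any cooperativity, i.e. any
% statistical coupling between the sites, makes I > 0 and lowers the joint entropy.
```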

7.
In this cross-sectional study, the relationship between noninvasively measured neurocardiovascular signal entropy and physical frailty was explored in a sample of community-dwelling older adults from The Irish Longitudinal Study on Ageing (TILDA). The hypothesis under investigation was that dysfunction in the neurovascular and cardiovascular systems, as quantified by short-length signal complexity during a lying-to-stand test (active stand), could provide a marker for frailty. Frailty status (i.e., “non-frail”, “pre-frail”, and “frail”) was based on Fried’s criteria (i.e., exhaustion, unexplained weight loss, weakness, slowness, and low physical activity). Approximate entropy (ApEn) and sample entropy (SampEn) were calculated during resting (lying down), active standing, and recovery phases. Continuously measured blood pressure/heart rate data were available from 2645 individuals (53.0% female) and frontal lobe tissue oxygenation data from 2225 participants (52.3% female); both samples had a mean (SD) age of 64.3 (7.7) years. Results revealed statistically significant associations between neurocardiovascular signal entropy and frailty status. Entropy differences between non-frail and pre-frail/frail participants were greater during the resting state than during the standing and recovery phases. Compared with ApEn, SampEn seemed to have better discriminating power between non-frail and pre-frail/frail individuals. The quantification of entropy in short-length neurocardiovascular signals could provide a clinically useful marker of the multiple physiological dysregulations that underlie physical frailty.

8.
The lack of adequate indicators in digital economy research may leave governments without sufficient data support for decision making. To address this problem, we first establish a digital economy indicator evaluation system by dividing the digital economy into four types, “basic”, “technology”, “integration”, and “service”, and selecting 5 indicators for each type. On this basis, the weight of each indicator is calculated with an improved entropy method to find the deficiencies in the development of particular digital economy fields. Drawing on the weighting idea of the Analytic Hierarchy Process, the improved entropy method first compares the difference coefficients of the indicators in pairs and maps the comparison results onto a 1–9 scale. A judgment matrix is then constructed based on the information entropy, which mitigates, as far as possible, the problem that the differences among indicator weights are too large in the traditional entropy method. The results indicate that the development of the digital economy in Guangdong Province was relatively balanced from 2015 to 2018 and is expected to improve, while the development of rural e-commerce in Guangdong Province is relatively backward, with an obvious digital gap between urban and rural areas. Next, we use principal component analysis and factor analysis from multivariate statistical analysis to extract two new variables that replace the 20 selected indicators, retaining the original information to the greatest extent and providing convenience for further research. Finally, we provide constructive comments on the digital economy in Guangdong Province from 2015 to 2018.
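For readers unfamiliar with the baseline that the paper improves, here is a minimal sketch of the traditional entropy weight method; the indicator values, function name, and benefit-type normalization are illustrative assumptions, and the paper's improved variant additionally maps pairwise comparisons of the difference coefficients onto an AHP-style 1–9 scale to build a judgment matrix.

```python
import numpy as np

def entropy_weights(X):
    """Traditional entropy weight method (the baseline the paper improves on).

    X: (m samples x n indicators) matrix of non-negative, benefit-type values.
    Returns one weight per indicator. Illustrative sketch only.
    """
    X = np.asarray(X, dtype=float)
    m, n = X.shape
    P = X / X.sum(axis=0, keepdims=True)             # column-wise proportions
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log(P), 0.0)  # convention: 0 * log 0 = 0
    e = -plogp.sum(axis=0) / np.log(m)               # information entropy per indicator
    d = 1.0 - e                                      # difference (divergence) coefficient
    return d / d.sum()                               # entropy weights

# Toy example: 4 years x 3 hypothetical indicators; the most variable indicator
# (lowest entropy) receives the largest weight.
X = np.array([[12.0, 300.0, 0.55],
              [15.0, 310.0, 0.61],
              [21.0, 305.0, 0.72],
              [30.0, 320.0, 0.80]])
print(entropy_weights(X).round(3))
```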

9.
We consider state changes in quantum theory due to “conditional action” and relate these to the discussion of entropy decrease due to interventions of “intelligent beings” and to the principles of Szilard and Landauer/Bennett. The mathematical theory of conditional actions is a special case of the theory of “instruments”, which describes changes of state due to general measurements and is therefore briefly outlined in the present paper. As a detailed example, we consider the imperfect erasure of a qubit, which can also be viewed as a conditional action and is realized by coupling a spin to another small spin system in its ground state.

10.
Causal Geometry     
Information geometry has offered a way to formally study the efficacy of scientific models by quantifying the impact of model parameters on the predicted effects. However, there has been little formal investigation of causation in this framework, despite causal models being a fundamental part of science and explanation. Here, we introduce causal geometry, which formalizes not only how outcomes are impacted by parameters, but also how the parameters of a model can be intervened upon. We therefore introduce a geometric version of “effective information”, a known measure of the informativeness of a causal relationship. We show that it is given by the matching between the space of effects and the space of interventions, in the form of their geometric congruence. Therefore, given a fixed intervention capability, an effective causal model is one that is well matched to those interventions. This is a consequence of “causal emergence”, wherein macroscopic causal relationships may carry more information than “fundamental” microscopic ones. We thus argue that a coarse-grained model may, paradoxically, be more informative than the microscopic one, especially when it better matches the scale of accessible interventions, as we illustrate with toy examples.
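For orientation, one common (non-geometric) formulation of effective information in the causal-emergence literature is sketched below; it is background only and is not the geometric version introduced in the paper.

```latex
% Effective information as commonly defined: intervene on the causes X with the
% maximum-entropy (uniform) distribution and measure how much information the
% effects Y carry about those interventions.
\mathrm{EI} \;=\; I\!\left(X_{\mathrm{do}(\mathrm{max\,ent})};\, Y\right).
% Causal emergence occurs when a coarse-grained (macro) model has higher EI
% than the underlying micro model: \mathrm{EI}_{\mathrm{macro}} > \mathrm{EI}_{\mathrm{micro}}.
```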

11.
This article reconsiders the concept of physical reality in quantum theory and the concept of quantum measurement, following Bohr, whose analysis of quantum measurement led him to his concept of a (quantum) “phenomenon,” referring to “the observations obtained under the specified circumstances” in the interaction between quantum objects and measuring instruments. This situation makes the terms “observation” and “measurement,” as conventionally understood, inapplicable. These terms are remnants of classical physics, or of the still earlier history from which classical physics inherited them. As defined here, a quantum measurement does not measure any preexisting property of the ultimate constitution of the reality responsible for quantum phenomena. An act of measurement establishes a quantum phenomenon by an interaction between the instrument and the quantum object or, in the present view, the ultimate constitution of the reality responsible for quantum phenomena and, at the time of measurement, also quantum objects. In the view advanced in this article, in contrast to that of Bohr, quantum objects, such as electrons or photons, are assumed to exist only at the time of measurement and not independently, a view that redefines the concept of quantum object as well. This redefinition becomes especially important in high-energy quantum regimes and quantum field theory, and it allows this article to define a new concept of quantum field. The article also considers, now following Bohr, quantum measurement as the entanglement between quantum objects and measuring instruments. The argument of the article is grounded in the concept of “reality without realism” (RWR), as underlying quantum measurement thus understood, and in the view of quantum theory, the RWR view, defined by this concept. The RWR view places the stratum of physical reality thus designated, here the reality ultimately responsible for quantum phenomena, beyond representation or knowledge, or even conception, and defines the corresponding set of interpretations of quantum mechanics or quantum field theory, such as the one assumed in this article, in which not only quantum phenomena but also quantum objects are (as idealizations) defined by measurement. As such, the article also offers a broadly conceived response to J. Bell’s argument “against ‘measurement’”.

12.
In 2015, I wrote a book with the same title as this article. The book’s subtitle is: “What we know and what we do not know.” On the book’s dedication page, I wrote: “This book is dedicated to readers of popular science books who are baffled, perplexed, puzzled, astonished, confused, and discombobulated by reading about Information, Entropy, Life and the Universe.” In the first part of this article, I will present the definitions of two central concepts: the “Shannon measure of information” (SMI) in Information Theory, and “Entropy” in Thermodynamics. Following these definitions, I will discuss the framework of their applicability. In the second part of the article, I will examine the question of whether living systems and the entire universe are, or are not, within the framework of applicability of the concepts of SMI and Entropy. I will show that much of the confusion in the literature arises from ignorance about the framework of applicability of these concepts.
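The two definitions contrasted in the article are standard; their usual forms are stated below for reference, with the caveat that the article's concern is their frameworks of applicability rather than the formulas themselves.

```latex
% Shannon measure of information (SMI) of a probability distribution p_1, ..., p_n:
\mathrm{SMI}(p_1,\dots,p_n) \;=\; -\sum_{i=1}^{n} p_i \log_2 p_i \qquad \text{(bits)} .
% Thermodynamic (Gibbs) entropy over microstates with probabilities p_i:
S \;=\; -\,k_B \sum_i p_i \ln p_i .
% The thermodynamic entropy corresponds to the SMI (times k_B \ln 2) evaluated on the
% equilibrium distribution of a macroscopic system; outside that setting the SMI remains
% defined while the thermodynamic entropy need not be, which is the
% "framework of applicability" question the article addresses.
```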

13.
14.
This paper starts from Schrödinger’s famous question “what is life?” and elucidates answers that invoke, in particular, Friston’s free energy principle and its relation to the method of Bayesian inference and to Synergetics’ 2nd foundation, which utilizes Jaynes’ maximum entropy principle. Our presentation reflects the shift in emphasis from physical principles to principles of information theory and Synergetics. In view of the expected general audience of this issue, we have chosen a somewhat tutorial style that does not require special knowledge of physics but familiarizes the reader with concepts rooted in information theory and Synergetics.
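For readers to whom Jaynes' principle is new, its standard textbook statement is reproduced below purely as background for the abstract's terminology.

```latex
% Jaynes' maximum entropy principle: choose the distribution p that maximizes
\max_{p}\; H[p] = -\sum_i p_i \ln p_i
% subject to normalization and to the known constraints (e.g. measured averages f_k):
\sum_i p_i = 1, \qquad \sum_i p_i f_k(x_i) = \langle f_k \rangle ,
% giving the exponential-family solution
p_i = \frac{1}{Z(\lambda)} \exp\!\Big(-\sum_k \lambda_k f_k(x_i)\Big).
```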

15.
The meaning and evolution of the notion of “temperature” (a key concept in the theories of condensed and gaseous matter) are addressed from different points of view. The concept of temperature turns out to be much more fundamental than conventionally thought. In particular, a temperature may be introduced for systems built of a “small” number of particles and for particles at rest. The Kelvin temperature scale may be introduced into quantum and relativistic physics owing to the fact that the efficiency of the quantum and relativistic Carnot cycles coincides with that of the classical one. The relation of temperature to the metric of the configurational space describing the behavior of systems built from non-interacting particles is demonstrated. The role of temperature in constituting inertia and gravity forces, treated as entropic forces, is addressed. The Landauer principle asserts that the temperature of a system is the only physical quantity defining the energy cost of the isothermal erasure of a single bit of information. The fundamental role of the temperature of the cosmic microwave background in modern cosmology is discussed. The range of problems and controversies related to negative absolute temperature is treated.
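The Landauer bound mentioned in the abstract has a simple standard statement, reproduced here for reference.

```latex
% Isothermally erasing one bit of information in an environment at temperature T
% dissipates at least
E_{\min} = k_B T \ln 2 ,
% equivalently, the entropy of the environment increases by at least
\Delta S \ge k_B \ln 2 ,
% so the temperature alone fixes the minimal energy cost of erasing one bit.
```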

16.
Finding the critical factors and possible “Newton’s laws” of financial markets has long been an important issue. However, with the development of information and communication technologies, financial models are becoming more realistic but also more complex, contradicting the maxim that “the greatest truths are the simplest.” This paper therefore presents an evolutionary model independent of micro features and attempts to discover the most critical factor. In the model, information is the only critical factor, and the stock price is an emergent outcome of collective behavior. The statistical properties of the model are significantly similar to those of the real market. The model also explains the correlations of stocks within an industry, which provides a new idea for studying critical factors and core structures in financial markets.

17.
This article reconsiders the double-slit experiment from the nonrealist or, in the terms of this article, “reality-without-realism” (RWR) perspective, grounded in the combination of three forms of quantum discontinuity: (1) “Heisenberg discontinuity”, defined by the impossibility of a representation, or even conception, of how quantum phenomena come about, even though quantum theory (such as quantum mechanics or quantum field theory) predicts the data in question strictly in accord with what is observed in quantum experiments; (2) “Bohr discontinuity”, defined, under the assumption of Heisenberg discontinuity, by the view that quantum phenomena and the data observed therein are described by classical and not quantum theory, even though classical physics cannot predict them; and (3) “Dirac discontinuity” (not considered by Dirac himself, but suggested by his equation), according to which the concept of a quantum object, such as a photon or electron, is an idealization only applicable at the time of observation and not to something that exists independently in nature. Dirac discontinuity is of particular importance for the article’s foundational argument and its analysis of the double-slit experiment.

18.
This paper assesses two different theories for explaining consciousness, a phenomenon that is widely considered amenable to scientific investigation despite its puzzling subjective aspects. I focus on Integrated Information Theory (IIT), which says that consciousness is integrated information (as ϕMax) and that even simple systems with interacting parts possess some consciousness. First, I evaluate IIT on its own merits. Second, I compare it to a more traditionally derived theory called Neurobiological Naturalism (NN), which says consciousness is an evolved, emergent feature of complex brains. Comparing these theories is informative because it reveals the strengths and weaknesses of each, thereby suggesting better ways to study consciousness in the future. IIT’s strengths are the reasonable axioms at its core; its strong logic and mathematical formalism; its creative “experience-first” approach to studying consciousness; the way it avoids the mind-body (“hard”) problem; its consistency with evolutionary theory; and its many scientifically testable predictions. The potential weakness of IIT is that it contains stretches of logic-based reasoning that were not checked against hard evidence when the theory was being constructed, whereas scientific arguments require such supporting evidence to keep the reasoning on course. This is less of a concern for the other theory, NN, because it incorporated evidence much earlier in its construction process. NN is a less mature theory than IIT, less formalized and quantitative, and less well tested. However, it has identified its own neural correlates of consciousness (NCCs) and offers a roadmap through which these NCCs may answer the questions of consciousness using the hypothesize-test-hypothesize-test steps of the scientific method.

19.
Integrated information has recently been suggested as a possible measure for identifying a necessary condition for a system to display conscious features. We have previously shown that astrocytes contribute to the generation of integrated information through the complex behavior of neuron–astrocyte networks. Still, it remained unclear which underlying mechanisms governing the complex behavior of a neuron–astrocyte network are essential to generating positive integrated information. This study presents an analytic consideration of this question based on exact and asymptotic expressions for integrated information in terms of exactly known probability distributions for a reduced mathematical model (a discrete-time, discrete-state stochastic model) reflecting the main features of the “spiking–bursting” dynamics of a neuron–astrocyte network. The analysis was performed in terms of the empirical “whole minus sum” version of integrated information in comparison to the “decoder based” version. The “whole minus sum” information may change sign, and an interpretation of this transition in terms of “net synergy” is available in the literature. This motivated our particular interest in the sign of the “whole minus sum” information in our analytical considerations. The behaviors of the “whole minus sum” and “decoder based” information measures are found to be quite similar: they converge asymptotically to each other as time-uncorrelated activity increases, and the sign transition of the “whole minus sum” information is associated with a rapid growth in the “decoder based” information. The study aims at creating a theoretical framework for using the spiking–bursting model as an analytically tractable reference point for applying integrated information concepts to systems exhibiting similar bursting behavior. The model may also be of interest as a new discrete-state test bench for different formulations of integrated information.
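As background for the terminology, a common form of the empirical “whole minus sum” measure is sketched below; this is a generic two-part formulation, and the paper's exact partitioning, time lag, and estimators may differ.

```latex
% "Whole minus sum" integrated information for a system split into parts M_1, M_2,
% with past state X_{t-\tau} and present state X_t:
\Phi_{\mathrm{WMS}} \;=\; I\!\left(X_{t-\tau};\, X_t\right) \;-\; \sum_{k=1}^{2} I\!\left(M_{k,\,t-\tau};\, M_{k,\,t}\right).
% \Phi_{\mathrm{WMS}} can change sign: negative values are commonly read as net
% redundancy between the parts, positive values as net synergy.
```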

20.
In this paper, a new parametric compound G family of continuous probability distributions, called the Poisson generalized exponential G (PGEG) family, is derived and studied. Relevant mathematical properties are derived. Some new bivariate G families are presented using the theorems of the “Farlie-Gumbel-Morgenstern copula”, “the modified Farlie-Gumbel-Morgenstern copula”, “the Clayton copula”, and “the Renyi entropy copula”. Many special members are derived, and special attention is devoted to the exponential and the one-parameter Pareto type II models. The maximum likelihood method is used to estimate the model parameters. A graphical simulation is performed to assess the finite-sample behavior of the maximum likelihood estimators. Two real-life data applications are presented to illustrate the importance of the new family.
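For reference, the standard Farlie-Gumbel-Morgenstern copula underlying one of the bivariate extensions has the textbook form below; the modified FGM, Clayton, and Renyi-entropy copulas used in the paper differ.

```latex
% Farlie-Gumbel-Morgenstern (FGM) copula:
C_{\theta}(u,v) = u\,v\,\bigl[1+\theta\,(1-u)(1-v)\bigr], \qquad \theta \in [-1,1],
% which couples two marginals F_X, F_Y of the new family into a bivariate cdf
F(x,y) = C_{\theta}\!\left(F_X(x),\, F_Y(y)\right).
```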

