Similar Articles
20 similar articles found (search time: 31 ms)
1.
Integrated information has recently been suggested as a possible measure for identifying a necessary condition for a system to display conscious features. Recently, we have shown that astrocytes contribute to the generation of integrated information through the complex behavior of neuron–astrocyte networks. Still, it remained unclear which of the underlying mechanisms governing the complex behavior of a neuron–astrocyte network are essential to generating positive integrated information. This study presents an analytic consideration of this question based on exact and asymptotic expressions for integrated information in terms of exactly known probability distributions for a reduced mathematical model (a discrete-time, discrete-state stochastic model) reflecting the main features of the “spiking–bursting” dynamics of a neuron–astrocyte network. The analysis was performed in terms of the empirical “whole minus sum” version of integrated information in comparison with the “decoder based” version. The “whole minus sum” information may change sign, and an interpretation of this transition in terms of “net synergy” is available in the literature; this motivated our particular interest in the sign of the “whole minus sum” information in our analytical considerations. The behaviors of the “whole minus sum” and “decoder based” information measures are found to bear considerable similarity: they converge asymptotically as time-uncorrelated activity increases, and the sign transition of the “whole minus sum” information is associated with a rapid growth in the “decoder based” information. The study aims to create a theoretical framework for using the spiking–bursting model as an analytically tractable reference point for applying integrated information concepts to systems exhibiting similar bursting behavior. The model can also be of interest as a new discrete-state test bench for different formulations of integrated information.
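As a minimal sketch of the sign behavior of the “whole minus sum” measure described above (the two-unit XOR dynamics and all variable names are this illustration's assumptions, not the paper's neuron–astrocyte model), the measure can be computed as the past-to-present mutual information of the whole system minus the sum over its parts; a positive value is the “net synergy” regime:

```python
import math
import random
from collections import Counter

def mutual_information(pairs):
    """Empirical mutual information I(X;Y) in bits from (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# Toy two-unit binary dynamics: A_now = A_past XOR B_past, B_now = B_past.
# XOR makes the whole predictable from the joint past while unit A alone
# carries almost no information about its own future, which pushes the
# "whole minus sum" information positive.
random.seed(0)
past = [(random.randint(0, 1), random.randint(0, 1)) for _ in range(4000)]
present = [(a ^ b, b) for a, b in past]

whole = mutual_information(list(zip(past, present)))
parts = (mutual_information([(p[0], q[0]) for p, q in zip(past, present)])
         + mutual_information([(p[1], q[1]) for p, q in zip(past, present)]))
phi_wms = whole - parts  # > 0: "net synergy"; < 0: redundancy dominates
```

Here the whole carries about 2 bits while the parts together carry about 1 bit, so the sketch lands in the positive, synergy-dominated regime.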

2.
The present study investigates the similarity problem associated with the onset of the Mach reflection of Zel’dovich–von Neumann–Döring (ZND) detonations in the near field. The results reveal that the self-similarity in the frozen-limit regime is strictly valid only within a small scale, i.e., of the order of the induction length. The Mach reflection becomes non-self-similar during the transition of the Mach stem from “frozen” to “reactive” by coupling with the reaction zone. The triple-point trajectory first rises from the self-similar result due to compressive waves generated by the “hot spot”, and then decays after establishment of the reactive Mach stem. It is also found, by removing the restriction, that the frozen limit can be extended to a much larger distance than expected. The obtained results elucidate the physical origin of the onset of Mach reflection with chemical reactions, which has previously been observed in both experiments and numerical simulations.

3.
In 2015, I wrote a book with the same title as this article. The book’s subtitle is: “What we know and what we do not know.” On the book’s dedication page, I wrote: “This book is dedicated to readers of popular science books who are baffled, perplexed, puzzled, astonished, confused, and discombobulated by reading about Information, Entropy, Life and the Universe.” In the first part of this article, I present the definitions of two central concepts: the “Shannon measure of information” (SMI) in information theory, and “entropy” in thermodynamics. Following these definitions, I discuss the framework of their applicability. In the second part of the article, I examine the question of whether living systems and the entire universe are, or are not, within the framework of applicability of the concepts of SMI and entropy. I show that much of the confusion in the literature arises from ignorance about the framework of applicability of these concepts.

4.
This is a review devoted to the complementarity–contextuality interplay in connection with the Bell inequalities. Starting the discussion with complementarity, I point to contextuality as its seed. Bohr contextuality is the dependence of an observable’s outcome on the experimental context, that is, on the system–apparatus interaction. Probabilistically, complementarity means that the joint probability distribution (JPD) does not exist; instead of the JPD, one has to operate with contextual probabilities. The Bell inequalities are interpreted as statistical tests of contextuality and, hence, of incompatibility. For context-dependent probabilities, these inequalities may be violated. I stress that the contextuality tested by the Bell inequalities is the so-called joint measurement contextuality (JMC), a special case of Bohr’s contextuality. I then examine the role of signaling (marginal inconsistency). In QM, signaling can be considered an experimental artifact; however, experimental data often exhibit signaling patterns. I discuss possible sources of signaling, for example, dependence of the state preparation on measurement settings. In principle, one can extract the measure of “pure contextuality” from data shadowed by signaling. This theory is known as contextuality by default (CbD); it leads to inequalities with an additional term quantifying signaling, the Bell–Dzhafarov–Kujala inequalities.

5.
The lack of adequate indicators in digital economy research may leave governments short of data support for decision making. To address this problem, we first establish a digital economy indicator evaluation system by dividing the digital economy into four types: “basic type”, “technology type”, “integration type”, and “service type”, and selecting 5 indicators for each type. On this basis, the weight of each indicator is calculated with an improved entropy method to find deficiencies in the development of particular digital economic fields. Drawing on the weighting idea of the Analytic Hierarchy Process, the improved entropy method first compares the difference coefficients of indicators in pairs and maps the comparison results to the scales 1–9. Then, a judgment matrix is constructed based on the information entropy, which mitigates, as far as possible, the excessively large differences among indicator weights produced by the traditional entropy method. The results indicate that the development of the digital economy in Guangdong Province was relatively balanced from 2015 to 2018 and is expected to improve, while the development of rural e-commerce in Guangdong Province is relatively backward, with an obvious digital gap between urban and rural areas. Next, we extract two new variables to replace the 20 selected indicators, using principal component analysis and factor analysis from multivariate statistical analysis, which retains the original information to the greatest extent and provides convenience for further research. Finally, we provide constructive comments on the development of the digital economy in Guangdong Province from 2015 to 2018.
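The traditional entropy-method baseline that the paper improves upon can be sketched as follows (the indicator matrix is made up for illustration, and this shows only the classical entropy-weight step, not the paper's 1–9 judgment-matrix refinement): each indicator's entropy is computed from its column-wise proportions, and weights are the normalized difference coefficients.

```python
import numpy as np

def entropy_weights(X):
    """Traditional entropy-weight method. Rows are samples (e.g. years),
    columns are indicators; all values assumed positive benefit-type."""
    P = X / X.sum(axis=0)                          # proportions per indicator
    n = X.shape[0]
    e = -(P * np.log(P)).sum(axis=0) / np.log(n)   # entropy of each indicator
    d = 1.0 - e                                    # difference coefficients
    return d / d.sum()                             # normalized weights

# Hypothetical indicator matrix: 4 years x 3 indicators.
X = np.array([[0.80, 0.20, 0.50],
              [0.60, 0.90, 0.50],
              [0.70, 0.40, 0.50],
              [0.90, 0.70, 0.50]])
w = entropy_weights(X)   # the constant third indicator gets ~zero weight
```

The sketch also exposes the weakness the paper targets: an indicator with little variation (the constant third column) gets a weight near zero, so weight differences in the traditional method can become very large.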

6.
This review focuses on the in vitro synthesis of polysaccharides by “enzymatic polymerization”, a method developed mainly by our group. Polysaccharides are formed by repeated glycosylation reactions between a glycosyl donor and a glycosyl acceptor. A hydrolysis enzyme was found to be a very efficient catalyst, with the monomer designed according to the new concept of a “transition-state analogue substrate” (TSAS): sugar fluoride monomers for polycondensation and sugar oxazoline monomers for ring-opening polyaddition. Enzymatic polymerization enabled the first in vitro synthesis of natural polysaccharides such as cellulose, xylan, chitin, hyaluronan, and chondroitin, as well as of unnatural polysaccharides such as a cellulose–chitin hybrid, a hyaluronan–chondroitin hybrid, and others. Supercatalysis of hyaluronidase was disclosed as an unusual enzymatic multi-catalyst function. Mutant enzymes were very useful for synthetic and mechanistic studies. In situ observations of enzymatic polymerization by SEM, TEM, and combined SAS methods revealed the mechanisms of the polymerization and of the self-assembly of the high-order molecular structures formed by elongating polysaccharide molecules.

7.
I explore the processes of equilibration exhibited by the Adapted Caldeira–Leggett (ACL) model, a small unitary “toy model” developed for numerical studies of quantum decoherence between a simple harmonic oscillator (SHO) and an environment. I demonstrate how dephasing allows equilibration to occur in a wide variety of situations. While the finite model size and other “unphysical” aspects prevent the notions of temperature and thermalization from being generally applicable, certain primitive aspects of thermalization can be realized for particular parameter values. I link the observed behaviors to intrinsic properties of the global energy eigenstates, and argue that the phenomena I observe contain elements which might be key ingredients leading to ergodic behavior in larger, more realistic systems. The motivations for this work range from curiosity about phenomena observed in earlier calculations with the ACL model to much larger questions related to the nature of equilibrium, thermalization, and the emergence of physical laws.

8.
To estimate the amount of evapotranspiration in a river basin, the “short period water balance method” was formulated. Then, by introducing the “complementary relationship method,” the amount of evapotranspiration was estimated seasonally, and with reasonable accuracy, for both small and large areas. Moreover, to accurately estimate river discharge in the low water season, the “weighted statistical unit hydrograph method” was proposed and a procedure for the calculation of the unit hydrograph was developed. Also, a new model, based on the “equivalent roughness method,” was successfully developed for the estimation of flood runoff from newly reclaimed farmlands. Based on the results of this research, a “composite reservoir model” was formulated to analyze the repeated use of irrigation water in large spatial areas. The application of this model to a number of watershed areas provided useful information with regard to the realities of water demand-supply systems in watersheds predominantly dedicated to paddy fields in Japan.

9.
The properties of decays that take place during jet formation cannot be easily deduced from the final distribution of particles in a detector. In this work, we first simulate a system of particles with well-defined masses, decay channels, and decay probabilities. This constitutes the “true system” whose decay probability distributions we want to reproduce. Assuming we only have the data that this system produces in the detector, we employ an iterative method that uses a neural network as a classifier between events produced in the detector by the “true system” and by some arbitrary “test system”. Finally, we compare the distributions obtained with the iterative method to the “true” distributions.

10.
This article reconsiders the concept of physical reality in quantum theory and the concept of quantum measurement, following Bohr, whose analysis of quantum measurement led him to his concept of a (quantum) “phenomenon,” referring to “the observations obtained under the specified circumstances” in the interaction between quantum objects and measuring instruments. This situation makes the terms “observation” and “measurement,” as conventionally understood, inapplicable. These terms are remnants of classical physics, or of still earlier history from which classical physics inherited them. As defined here, a quantum measurement does not measure any preexisting property of the ultimate constitution of the reality responsible for quantum phenomena. An act of measurement establishes a quantum phenomenon by an interaction between the instrument and the quantum object, or, in the present view, the ultimate constitution of the reality responsible for quantum phenomena and, at the time of measurement, also quantum objects. In the view advanced in this article, in contrast to that of Bohr, quantum objects, such as electrons or photons, are assumed to exist only at the time of measurement and not independently, a view that redefines the concept of quantum object as well. This redefinition becomes especially important in high-energy quantum regimes and quantum field theory, and allows this article to define a new concept of quantum field. The article also considers, now following Bohr, quantum measurement as the entanglement between quantum objects and measuring instruments. The argument of the article is grounded in the concept of “reality without realism” (RWR), as underlying quantum measurement thus understood, and in the view, the RWR view, of quantum theory defined by this concept.
The RWR view places a stratum of physical reality thus designated, here the reality ultimately responsible for quantum phenomena, beyond representation or knowledge, or even conception, and defines the corresponding set of interpretations of quantum mechanics or quantum field theory, such as the one assumed in this article, in which, again, not only quantum phenomena but also quantum objects are (idealizations) defined by measurement. As such, the article also offers a broadly conceived response to J. Bell’s argument “against ‘measurement’”.

11.
We consider state changes in quantum theory due to “conditional action” and relate these to the discussion of entropy decrease due to interventions of “intelligent beings” and the principles of Szilard and Landauer/Bennett. The mathematical theory of conditional actions is a special case of the theory of “instruments”, which describes changes of state due to general measurements and will therefore be briefly outlined in the present paper. As a detailed example, we consider the imperfect erasure of a qubit that can also be viewed as a conditional action and will be realized by the coupling of a spin to another small spin system in its ground state.

12.
We propose the first correct special-purpose quantum circuits for preparation of Bell diagonal states (BDS), and implement them on the IBM Quantum computer, characterizing and testing complex aspects of their quantum correlations in the full parameter space. Among the circuits proposed, one involves only two quantum bits but requires adapted quantum tomography routines handling classical bits in parallel. The entire class of Bell diagonal states is generated, and several characteristic indicators, namely entanglement of formation and concurrence, CHSH non-locality, steering and discord, are experimentally evaluated over the full parameter space and compared with theory. As a by-product of this work, we also find a remarkable general inequality between “quantum discord” and “asymmetric relative entropy of discord”: the former never exceeds the latter. We also prove that for all BDS the two coincide.
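For Bell diagonal states, two of the indicators named above have closed forms in the Bell-basis eigenvalues, which makes a small sketch possible (the Werner-like eigenvalues below are illustrative, not the experiment's data): the concurrence is max(0, 2·max(p) − 1), and the entanglement of formation follows from the concurrence via Wootters' formula.

```python
import math

def bds_concurrence(p):
    """Concurrence of a Bell diagonal state whose eigenvalues in the Bell
    basis are p; for BDS this reduces to max(0, 2*max(p) - 1)."""
    assert abs(sum(p) - 1.0) < 1e-12 and min(p) >= 0
    return max(0.0, 2 * max(p) - 1)

def entanglement_of_formation(C):
    """Wootters' formula: EoF is the binary entropy of (1 + sqrt(1 - C^2))/2."""
    if C == 0.0:
        return 0.0
    x = (1 + math.sqrt(1 - C * C)) / 2
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

werner_like = [0.85, 0.05, 0.05, 0.05]   # hypothetical Bell-basis eigenvalues
C = bds_concurrence(werner_like)          # 0.7 for this example
E = entanglement_of_formation(C)
```

The maximally mixed point of the parameter space (all eigenvalues 1/4) gives zero concurrence, matching the separable region probed in the experiments.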

13.
In this paper, a new parametric compound G family of continuous probability distributions, called the Poisson generalized exponential G (PGEG) family, is derived and studied. Relevant mathematical properties are derived. Some new bivariate G families are presented using the theorems of the “Farlie–Gumbel–Morgenstern copula”, “the modified Farlie–Gumbel–Morgenstern copula”, “the Clayton copula”, and “Renyi’s entropy copula”. Many special members are derived, and special attention is devoted to the exponential and the one-parameter Pareto type II models. The maximum likelihood method is used to estimate the model parameters. A graphical simulation is performed to assess the finite-sample behavior of the maximum likelihood estimators. Two real-life data applications are proposed to illustrate the importance of the new family.
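The estimation step can be illustrated on the simplest special member mentioned above (this is a generic sketch of maximum likelihood for a plain exponential model, not the paper's PGEG code; the full family would require numerical optimization of its likelihood):

```python
import math
import random

# For an exponential model with rate lambda, the ML estimate has the
# closed form lambda_hat = n / sum(x); we check it against the
# log-likelihood at nearby rate values.
random.seed(1)
true_rate = 2.0
sample = [random.expovariate(true_rate) for _ in range(20000)]

rate_hat = len(sample) / sum(sample)   # MLE of the exponential rate

def log_likelihood(lam, xs):
    """Exponential log-likelihood: n*log(lam) - lam*sum(xs)."""
    return len(xs) * math.log(lam) - lam * sum(xs)
```

With 20,000 draws the estimate lands close to the true rate, which is the kind of finite-sample behavior the paper's graphical simulation assesses for the full family.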

14.
It is known that “quantum non-locality”, leading to the violation of Bell’s inequality and, more generally, of classical local realism, can be attributed to the conjunction of two properties, which we call here elementary locality and predictive completeness. Taking this point of view, we show again that quantum mechanics violates predictive completeness, allowing contextual inferences to be made, which can, in turn, explain why quantum non-locality does not contradict relativistic causality. An important question remains: if the usual quantum state ψ is predictively incomplete, how do we complete it? We give here a set of new arguments to show that ψ should indeed be completed, not by looking for any “hidden variables”, but rather by specifying the measurement context, which is required to define actual probabilities over a set of mutually exclusive physical events.

15.
In this paper, we analyze mathematical models of various nonlinear oscillators arising in different fields of engineering. Approximate solutions for different variants of the oscillators are then studied using feedforward neural networks (NNs) based on the backpropagated Levenberg–Marquardt algorithm (BLMA). A data set for the different problem scenarios, used for the supervised learning of BLMA, was generated by the fourth-order Runge–Kutta method (RK-4) with the “NDSolve” package in Mathematica. The accuracy of the approximate solution obtained by NN-BLMA is assessed through training, testing, and validation against the reference data set. For each model, convergence analysis, error histograms, regression analysis, and curve fitting are considered to study the robustness and accuracy of the design scheme.
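The reference-data generation step can be sketched with a hand-rolled classical RK-4 integrator (the Duffing-type oscillator, step size, and initial condition below are illustrative choices; the paper itself uses Mathematica's NDSolve):

```python
def rk4_step(f, t, y, h):
    """One classical Runge-Kutta 4 step for the system y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

# Nonlinear oscillator example: undamped Duffing-type x'' + x + x^3 = 0,
# written as a first-order system in (x, v).
def duffing(t, y):
    x, v = y
    return [v, -x - x ** 3]

h, y, data = 0.01, [1.0, 0.0], []
for n in range(1000):
    data.append((n * h, y[0]))   # (t, x) reference pairs for supervised learning
    y = rk4_step(duffing, n * h, y, h)
```

Because this oscillator conserves the energy v²/2 + x²/2 + x⁴/4, checking that quantity after integration is a quick sanity test on the generated data set.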

16.
Ordinal patterns, which classify real vectors according to the order relations between their components, are an interesting basic concept for determining the complexity of a measure-preserving dynamical system. In particular, as shown by C. Bandt, G. Keller and B. Pompe, the permutation entropy based on the probability distributions of such patterns is equal to the Kolmogorov–Sinai entropy in simple one-dimensional systems. The general reason for this is that, roughly speaking, the system of ordinal patterns obtained for a real-valued “measuring arrangement” has high potential for separating orbits. Starting from a slightly different approach due to A. Antoniouk, K. Keller and S. Maksymenko, we discuss generalizations of ordinal patterns that provide enough separation to determine the Kolmogorov–Sinai entropy. The idea behind these generalized ordinal patterns is to substitute the basic binary relation ≤ on the real numbers with another binary relation. Generalizing the earlier results of I. Stolz and K. Keller, we establish conditions that the binary relation and the dynamical system must fulfill so that the obtained generalized ordinal patterns can be used for estimating the Kolmogorov–Sinai entropy.
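The classical (non-generalized) construction the abstract starts from can be sketched directly: each length-n window of a time series is mapped to the permutation that sorts it, and the permutation entropy is the Shannon entropy of the resulting pattern distribution (the logistic-map example and parameter choices are this sketch's, not the paper's):

```python
import math

def permutation_entropy(series, order=3):
    """Permutation entropy in bits from ordinal patterns of length `order`,
    following the Bandt-Pompe construction."""
    counts = {}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A fully ordered signal yields a single ordinal pattern, hence zero
# entropy, while a chaotic orbit (logistic map at r = 4) spreads its
# windows over many patterns.
x, orbit = 0.3, []
for _ in range(10000):
    x = 4.0 * x * (1.0 - x)
    orbit.append(x)

pe_ordered = permutation_entropy(list(range(100)))
pe_chaotic = permutation_entropy(orbit)
```

Replacing the comparison inside `sorted` (i.e., the relation ≤) with another binary relation is exactly the generalization direction the abstract describes.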

17.
18.
Objective: To examine the changes in postural alignment and kyphosis-correlated factors after 6 months of back extensor strengthening exercise in a group of community-dwelling older adults aged ≥65 years. Methods: We quasi-randomized 29 subjects into an intervention group treated with a back extensor strengthening program and a control group treated with a full-body exercise program. These groups completed 20–30 minutes of exercise directed by a physical therapist one or more times per week and were instructed to exercise at home as well. The participants were assessed prior to and after the intervention using the following criteria: postural alignment in “usual” and “best” posture, physical function, physical performance, self-efficacy, and quality of life. Differences across the two factors (group and period) were compared for each of the measurement variables. Results: Subjects who adequately completed the exercises were analyzed. A reduced knee flexion angle was noted in the “best” posture of both groups, as were improved physical function and performance, with the exception of one-leg standing time. When effect sizes were verified in the post-hoc analysis, the body parts showing changes in postural alignment after the intervention differed between groups. Conclusions: Back extensor strengthening exercises improved physical function and performance but did not improve spinal alignment. The changes due to these interventions were not significantly different from those observed in the full-body exercise group. However, the post-hoc analysis revealed different effect sizes for posture change, possibly indicating that the two groups experienced different changes in postural alignment.

19.
This article reconsiders the double-slit experiment from the nonrealist or, in terms of this article, “reality-without-realism” (RWR) perspective, grounded in the combination of three forms of quantum discontinuity: (1) “Heisenberg discontinuity”, defined by the impossibility of a representation or even conception of how quantum phenomena come about, even though quantum theory (such as quantum mechanics or quantum field theory) predicts the data in question strictly in accord with what is observed in quantum experiments; (2) “Bohr discontinuity”, defined, under the assumption of Heisenberg discontinuity, by the view that quantum phenomena and the data observed therein are described by classical and not quantum theory, even though classical physics cannot predict them; and (3) “Dirac discontinuity” (not considered by Dirac himself, but suggested by his equation), according to which the concept of a quantum object, such as a photon or electron, is an idealization only applicable at the time of observation and not to something that exists independently in nature. Dirac discontinuity is of particular importance for the article’s foundational argument and its analysis of the double-slit experiment.

20.
Entropy is a concept that emerged in the 19th century, when it was associated with the heat harnessed by a thermal machine to perform work during the Industrial Revolution. The 20th century then brought an unprecedented scientific revolution through one of its most essential innovations, information theory, which also encompasses a concept of entropy. The following question therefore naturally arises: “what is the difference, if any, between the concepts of entropy in each field of knowledge?” Misconceptions abound, as there have been multiple attempts to reconcile the entropy of thermodynamics with that of information theory. Entropy is most commonly defined as “disorder”, although this is not a good analogy, since “order” is a subjective human concept and “disorder” cannot always be obtained from entropy. Therefore, this paper presents a historical background on the evolution of the term “entropy” and provides mathematical evidence and logical arguments regarding its interconnection in various scientific areas, with the objective of providing a theoretical review and reference material for a broad audience.
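The two concepts of entropy the abstract compares can be put side by side in a few lines (the example distributions are made up for illustration): the Shannon measure of a discrete distribution, and Boltzmann's thermodynamic entropy S = k_B·ln(W), which assigns k_B·ln(2) joules per kelvin to each bit.

```python
import math

def smi_bits(p):
    """Shannon measure of information (SMI) of a discrete distribution, in bits."""
    assert abs(sum(p) - 1.0) < 1e-12
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

smi_uniform = smi_bits([0.25] * 4)           # maximal uncertainty: 2 bits
smi_biased = smi_bits([0.7, 0.1, 0.1, 0.1])  # less uncertain, lower SMI

# Boltzmann's S = k_B * ln(W) for W equiprobable microstates links the
# two fields: each bit of SMI corresponds to k_B * ln(2) J/K of
# thermodynamic entropy.
k_B = 1.380649e-23
entropy_per_bit = k_B * math.log(2)
```

Note that the uniform distribution maximizes SMI, illustrating the abstract's point that "maximal uncertainty" is well-defined even though "disorder" is not.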


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号