Similar Literature
Found 20 similar records (search time: 31 ms)
1.
Recent advances in artificial intelligence (AI) have led to its widespread industrial adoption, with machine learning systems demonstrating superhuman performance in a significant number of tasks. However, this surge in performance has often been achieved through increased model complexity, turning such systems into “black box” approaches and causing uncertainty regarding the way they operate and, ultimately, the way they come to decisions. This ambiguity has made it problematic for machine learning systems to be adopted in sensitive yet critical domains where their value could be immense, such as healthcare. As a result, scientific interest in Explainable Artificial Intelligence (XAI), a field concerned with the development of new methods that explain and interpret machine learning models, has been reignited over recent years. This study focuses on machine learning interpretability methods; more specifically, a literature review and taxonomy of these methods are presented, together with links to their programming implementations, in the hope that this survey will serve as a reference point for both theorists and practitioners.

2.
Random Boolean Networks (RBNs for short) are strongly simplified models of gene regulatory networks (GRNs), which have also been widely studied as abstract models of complex systems and used to simulate different phenomena. We define the “common sea” (CS) as the set of nodes that take the same value in all the attractors of a given network realization, and the “specific part” (SP) as the set of all the other nodes, and we study their properties in different ensembles generated with different parameter values. Both the CS and the SP can be composed of one or more weakly connected components, which are emergent intermediate-level structures. We show that the study of these sets provides very important information about the behavior of the model. The distribution of distances between attractors is also examined. Moreover, we show how the notion of a “common sea” of genes can be used to analyze data from single-cell experiments.
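The “common sea” of entry 2 is directly computable for small synchronous RBNs by exhaustive state-space enumeration. The sketch below is illustrative only (the function names `rbn_step`, `attractors`, and `common_sea`, and the tiny hand-built network in the usage note, are this editor's assumptions, not the paper's code); it finds all attractors and then the nodes whose value is identical in every attractor state:

```python
import itertools

def rbn_step(state, inputs, tables):
    # Synchronous update: node i reads its input nodes and looks up its Boolean table.
    return tuple(tables[i][tuple(state[j] for j in inputs[i])]
                 for i in range(len(state)))

def attractors(n, inputs, tables):
    # Exhaustively follow every initial state until a state repeats;
    # the repeated tail is the attractor (cycle) reached from that start.
    found = {}
    for start in itertools.product((0, 1), repeat=n):
        seen, state = {}, start
        while state not in seen:
            seen[state] = len(seen)
            state = rbn_step(state, inputs, tables)
        cycle = [s for s, t in seen.items() if t >= seen[state]]
        found.setdefault(frozenset(cycle), cycle)  # deduplicate attractors
    return list(found.values())

def common_sea(n, atts):
    # Nodes whose value is the same in every state of every attractor.
    states = [s for a in atts for s in a]
    return [i for i in range(n) if len({s[i] for s in states}) == 1]
```

For example, a 3-node network in which node 0 is frozen at 0, node 1 copies node 0, and node 2 negates itself has a single period-2 attractor; `common_sea` then returns nodes 0 and 1, with node 2 forming the “specific part”.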

3.
Entropy is a concept that emerged in the 19th century, when it was associated with the heat harnessed by a thermal machine to perform work during the Industrial Revolution. The 20th century, however, saw an unprecedented scientific revolution driven by one of its most essential innovations, information theory, which also encompasses a concept of entropy. The following question is therefore naturally raised: “what is the difference, if any, between the concepts of entropy in each field of knowledge?” Misconceptions persist, as there have been multiple attempts to reconcile the entropy of thermodynamics with that of information theory. Entropy is most commonly defined as “disorder”, although this is not a good analogy, since “order” is a subjective human concept and “disorder” cannot always be obtained from entropy. This paper therefore presents a historical background on the evolution of the term “entropy” and provides mathematical evidence and logical arguments regarding its interconnection across various scientific areas, with the objective of providing a theoretical review and reference material for a broad audience.

4.
5.
The application of the conceptual analysis (CA) method outlined in Part I is illustrated on the example of quantum mechanics. In Part II, we deduce the complete-lattice structure in quantum mechanics from postulates specifying the idealizations that are accepted in the theory. The idealized abstract concepts are introduced by means of a topological extension of the basic structure (obtained in Part I) in accord with the “approximation principle”; the relevant topologies are not arbitrarily chosen but are fixed by the choice of the idealizations. There is a typical topological asymmetry in the mathematical scheme. Convexity and linear structures do not play any role in the mathematical methods of this approach. The essential concept in Part II is the idealization of “perfect measurement” suggested by our conceptual analysis in Part I. The Hilbert-space representation will be deduced in Part III. In our papers, we keep to the tenet: the mathematical scheme of a physical theory must be rigorously formulated. However, for physics, mathematics is only a nice and useful tool; it is not its purpose.

6.
Information transmission and storage have gained traction as unifying concepts for characterizing biological systems and their chances of survival and evolution at multiple scales. Despite the potential of an information-based mathematical framework to offer new insights into life processes and ways to interact with and control them, the main legacy remains Shannon’s: a purely syntactic characterization of information that scores systems on the basis of their maximum information efficiency. Such metrics seem not entirely suitable for biological systems, where the transmission and storage of different pieces of information (carrying different semantics) can result in different chances of survival. Based on an abstract mathematical model that captures the parameters and behaviors of a population of single-celled organisms whose survival is correlated with information retrieval from the environment, this paper explores this disconnect between classical information theory and biology. We present a model, specified as a computational state machine, which is then used in a simulation framework constructed specifically to reveal the emergence of “subjective information”, i.e., a trade-off between a living system’s capability to maximize the acquisition of information from the environment and the maximization of its growth and survival over time. Simulations clearly show that a strategy that maximizes information efficiency results in a lower growth rate than a strategy that gains less information but whose information carries a higher meaning for survival.

7.
This article reconsiders the concept of physical reality in quantum theory and the concept of quantum measurement, following Bohr, whose analysis of quantum measurement led him to his concept of a (quantum) “phenomenon,” referring to “the observations obtained under the specified circumstances,” in the interaction between quantum objects and measuring instruments. This situation makes the terms “observation” and “measurement,” as conventionally understood, inapplicable. These terms are remnants of classical physics, or of the still earlier history from which classical physics inherited them. As defined here, a quantum measurement does not measure any preexisting property of the ultimate constitution of the reality responsible for quantum phenomena. An act of measurement establishes a quantum phenomenon by an interaction between the instrument and the quantum object, or, in the present view, the ultimate constitution of the reality responsible for quantum phenomena and, at the time of measurement, also quantum objects. In the view advanced in this article, in contrast to that of Bohr, quantum objects, such as electrons or photons, are assumed to exist only at the time of measurement and not independently, a view that redefines the concept of quantum object as well. This redefinition becomes especially important in high-energy quantum regimes and quantum field theory, and allows this article to define a new concept of quantum field. The article also considers, now following Bohr, quantum measurement as the entanglement between quantum objects and measuring instruments. The argument of the article is grounded in the concept of “reality without realism” (RWR), as underlying quantum measurement thus understood, and in the view, the RWR view, of quantum theory defined by this concept. The RWR view places a stratum of physical reality thus designated, here the reality ultimately responsible for quantum phenomena, beyond representation or knowledge, or even conception, and defines the corresponding set of interpretations of quantum mechanics or quantum field theory, such as the one assumed in this article, in which, again, not only quantum phenomena but also quantum objects are (as idealizations) defined by measurement. As such, the article also offers a broadly conceived response to J. Bell’s argument “against ‘measurement’”.

8.
In recent years, law enforcement authorities have increasingly used mathematical tools to support criminal investigations, such as those related to terrorism. In this work, two relevant questions are discussed: “How can the different roles of members of a terrorist organization be recognized?” and “Are there early signs of impending terrorist acts?” These questions are addressed using the tools of entropy and network theory, more specifically centralities (degree, betweenness, clustering) and their entropies. These tools were applied to data (physical contacts) from four real terrorist networks in different countries. The different roles of the members are clearly recognized from the values of the selected centralities, and an early sign of impending terrorist acts is the evolutionary pattern of the values of the entropies of those centralities. These results were confirmed in all four terrorist networks. The conclusions are expected to help law enforcement authorities identify the roles of members of terrorist organizations (as the members with high centrality) and anticipate when a terrorist attack is imminent by observing the evolution of the entropies of the centralities.
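The pipeline entry 8 describes — compute node centralities, then the Shannon entropy of each centrality distribution — can be sketched with the standard library alone. This is a minimal sketch under assumptions: betweenness is omitted for brevity, the graph representation is a plain adjacency dict, and the star network in the test is a toy example, not one of the four real terrorist networks:

```python
import math
from itertools import combinations

def degree(adj):
    # adj: dict mapping node -> set of neighbours (undirected graph)
    return {v: len(nbrs) for v, nbrs in adj.items()}

def clustering(adj):
    # Local clustering coefficient: fraction of a node's neighbour pairs
    # that are themselves connected.
    coeff = {}
    for v, nbrs in adj.items():
        pairs = list(combinations(nbrs, 2))
        if not pairs:
            coeff[v] = 0.0
        else:
            linked = sum(1 for a, b in pairs if b in adj[a])
            coeff[v] = linked / len(pairs)
    return coeff

def shannon_entropy(values):
    # Normalize a list of non-negative scores into a distribution,
    # then return its Shannon entropy in bits.
    total = sum(values)
    if total <= 0:
        return 0.0
    probs = [v / total for v in values if v > 0]
    return -sum(p * math.log2(p) for p in probs)
```

A hub-dominated network yields a low-entropy degree distribution, while a uniform one (e.g. a cycle) yields the maximum entropy log2(N); tracking these entropies over network snapshots is the kind of evolutionary pattern the abstract refers to.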

9.
A new formulation involving fulfillment of all the Kolmogorov axioms is suggested for a complete probability theory. This proves not to be a purely mathematical discipline: probability theory deals with abstract objects—images of various classes of concrete objects—whereas experimental statistics deals with concrete objects alone, and both have to be taken into account. Quantum physics and classical statistical physics prove to be different aspects of one probabilistic physics. The connection of quantum mechanics with classical statistical mechanics is examined, and the origin of the Schrödinger equation is elucidated. Attention is given to the true meaning of the wave–corpuscle duality, and the incompleteness of nonrelativistic quantum mechanics is explained.

10.
With the increasing number of connected devices, complex systems such as smart homes record a multitude of events of various types, magnitudes, and characteristics. Current systems struggle to identify which events can be considered more memorable than others. In contrast, humans are able to quickly categorize some events as more “memorable” than others, and they do so without relying on knowledge of the system’s inner workings or on large previous datasets. Having this ability would allow the system to: (i) identify and summarize a situation to the user by presenting only memorable events; (ii) suggest the most memorable events as possible hypotheses in an abductive inference process. Our proposal is to use Algorithmic Information Theory to define a “memorability” score by retrieving events using predicative filters. We use smart-home examples to illustrate how our theoretical approach can be implemented in practice.

11.
We consider state changes in quantum theory due to “conditional action” and relate these to the discussion of entropy decrease due to interventions of “intelligent beings” and the principles of Szilard and Landauer/Bennett. The mathematical theory of conditional actions is a special case of the theory of “instruments”, which describes changes of state due to general measurements and will therefore be briefly outlined in the present paper. As a detailed example, we consider the imperfect erasure of a qubit that can also be viewed as a conditional action and will be realized by the coupling of a spin to another small spin system in its ground state.

12.
Uncertainty propagation in multi-parameter complex structures poses significant computational challenges. This paper investigates the possibility of using the High Dimensional Model Representation (HDMR) approach when uncertain system parameters are modeled using fuzzy variables. In particular, the application of HDMR is proposed for fuzzy finite element analysis of linear dynamical systems. The HDMR expansion is an efficient formulation for high-dimensional mapping in complex systems when the higher-order variable correlations are weak, permitting the input–output relationship to be captured by low-order terms. The computational effort to determine the expansion functions using the α-cut method scales polynomially with the number of variables rather than exponentially. This rests on the fundamental assumption underlying HDMR that only low-order correlations among the input variables are likely to have significant impacts upon the outputs of most high-dimensional complex systems. The proposed method is first illustrated on multi-parameter nonlinear mathematical test functions with fuzzy variables, and is then integrated with a commercial finite element package (ADINA). Modal analysis of a simplified aircraft wing with fuzzy parameters is used to illustrate the generality of the proposed approach. In the numerical examples, triangular membership functions are used and the results are validated against direct Monte Carlo simulations. It is shown that, using the proposed HDMR approach, the number of finite element function calls can be reduced without significantly compromising accuracy.
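The low-order expansion entry 12 relies on can be illustrated with first-order cut-HDMR, which approximates f(x) by its value at a reference point plus one-variable corrections along each axis. This is a generic sketch of that expansion only (the function name, reference point, and test functions are illustrative); the paper's fuzzy α-cut machinery and ADINA coupling are not reproduced:

```python
def cut_hdmr_first_order(f, c):
    """First-order cut-HDMR of f around reference point c:
    f(x) ~ f(c) + sum_i [ f(c with component i replaced by x_i) - f(c) ].
    Exact when f has no variable interactions; needs only 1 + sum_i |grid_i|
    evaluations of f instead of an exponential full-grid sweep."""
    f0 = f(c)
    def approx(x):
        total = f0
        for i, xi in enumerate(x):
            ci = list(c)
            ci[i] = xi        # vary one input at a time, others held at c
            total += f(tuple(ci)) - f0
        return total
    return approx
```

For a purely additive function the first-order expansion is exact; for a function with interaction terms (e.g. a product x0·x1) it deliberately drops the second-order correlation, which is exactly the trade-off the abstract describes as acceptable when higher-order correlations are weak.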

13.
In this paper, a new parametric compound G family of continuous probability distributions, called the Poisson generalized exponential G (PGEG) family, is derived and studied. Relevant mathematical properties are derived. Some new bivariate G families using the theorems of the “Farlie–Gumbel–Morgenstern copula”, “the modified Farlie–Gumbel–Morgenstern copula”, “the Clayton copula”, and “Rényi’s entropy copula” are presented. Many special members are derived, and special attention is devoted to the exponential and the one-parameter Pareto type II models. The maximum likelihood method is used to estimate the model parameters, and a graphical simulation is performed to assess the finite-sample behavior of the maximum likelihood estimators. Two real-life data applications illustrate the importance of the new family.

14.
This paper is concerned with the feasibility of Arnold scrambling based on the Improved Flexible Representation of Quantum Images (IFRQI). First, the Flexible Representation of Quantum Images is extended to the IFRQI, which can represent a quantum image of arbitrary size L × B. Then, by making use of controlled-NOT gates and modular adder operations, a concrete quantum circuit of Arnold scrambling for the IFRQI is designed. Simulation results show the effectiveness of the proposed circuit.
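The quantum circuit of entry 14 is not reproduced here, but the transform it implements, Arnold (cat-map) scrambling, has a short classical reference form that may help fix intuition. The sketch below assumes the standard map (x, y) → (x + y, x + 2y) mod N on a square N × N image (the paper handles general L × B sizes); the function names are illustrative:

```python
def arnold_step(img):
    # One Arnold cat-map iteration: pixel (x, y) moves to
    # ((x + y) mod n, (x + 2y) mod n). The map is a bijection,
    # so it scrambles the image without losing any pixel.
    n = len(img)
    out = [[0] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            out[(x + y) % n][(x + 2 * y) % n] = img[x][y]
    return out

def arnold_period(n):
    # The map is periodic: after some number of iterations the image
    # returns to itself, which is what makes descrambling possible.
    img = [[x * n + y for y in range(n)] for x in range(n)]
    cur, k = arnold_step(img), 1
    while cur != img:
        cur, k = arnold_step(cur), k + 1
    return k
```

Because the map is periodic, descrambling needs no inverse circuit: iterating the forward map for the remaining steps of the period restores the original image.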

15.
Representation of an abstract quantum logic with an ordering set of states S in the form of a family L(S) of fuzzy subsets of S which fulfils conditions analogous to the Kolmogorovian conditions imposed on a σ-algebra of random events allows us to construct a quantum probability calculus in a way completely parallel to the classical Kolmogorovian probability calculus. It is shown that the quantum probability calculus so constructed is a proper generalization of the classical Kolmogorovian one. Some indications for building a phase-space representation of quantum mechanics free of the problem of negative probabilities are given.

16.
The article argues that—at least in certain interpretations, such as the one assumed in this article under the heading of “reality without realism”—the quantum-theoretical situation appears as follows: while, in terms of probabilistic predictions, it is connected to and connects the information obtained in quantum phenomena, the mathematics of quantum theory (QM or QFT), which is continuous, does not represent and is discontinuous with both the emergence of quantum phenomena and the physics of these phenomena, phenomena that are physically discontinuous with each other as well. These phenomena, and thus this information, are described by classical physics. All actually available information (in the mathematical sense of information theory) is classical: it is composed of units, such as bits, that are—or are contained in—entities described by classical physics. On the other hand, classical physics cannot predict this information when it is created, as manifested in measuring instruments, in quantum experiments, while quantum theory can. In this epistemological sense, this information is quantum. The article designates the discontinuity between quantum theory and the emergence of quantum phenomena the “Heisenberg discontinuity”, because it was introduced by W. Heisenberg along with QM, and the discontinuity between QM or QFT and the classical physics of quantum phenomena the “Bohr discontinuity”, because it was introduced as part of Bohr’s interpretation of quantum phenomena and QM, under the assumption of Heisenberg discontinuity. Combining both discontinuities precludes QM or QFT from being connected to either physical reality (that ultimately responsible for quantum phenomena or that of the phenomena themselves) other than by means of probabilistic predictions concerning the information, classical in character, contained in quantum phenomena. The nature of quantum information is, in this view, defined by this situation. A major implication, discussed in the Conclusion, is the existence, and arguably the necessity, of two—classical and quantum—or, with relativity, three and possibly more essentially different theories in fundamental physics.

17.
The subject of this article is the reconstruction of quantum mechanics on the basis of a formal language of quantum mechanical propositions. During recent years, research in the foundations of the language of science has given rise to a dialogic semantics that is adequate in the case of a formal language for quantum physics. The system of sequential logic comprised by the language is more general than classical logic; it includes the classical system as a special case. Although the system of sequential logic can be founded without reference to the empirical content of quantum physical propositions, it establishes an essential part of the structure of the mathematical formalism used in quantum mechanics. It is the purpose of this paper to demonstrate the connection between the formal language of quantum physics and its representation by mathematical structures in a self-contained way.

18.
The topological properties of the spatial coherence function are investigated rigorously. The phase-singular structures (coherence vortices) of the coherence function can be naturally deduced from the topological current, an abstract mathematical object studied previously. We find that coherence vortices are characterized by the Hopf index and the Brouwer degree in topology. The coherence flux quantization and the linking of closed coherence vortices are also studied from the topological properties of the spatial coherence function.

19.
20.
Integrated information has recently been suggested as a possible measure of a necessary condition for a system to display conscious features. We have previously shown that astrocytes contribute to the generation of integrated information through the complex behavior of neuron–astrocyte networks. Still, it remained unclear which underlying mechanisms governing the complex behavior of a neuron–astrocyte network are essential to generating positive integrated information. This study presents an analytic treatment of this question based on exact and asymptotic expressions for integrated information in terms of exactly known probability distributions for a reduced mathematical model (a discrete-time, discrete-state stochastic model) reflecting the main features of the “spiking–bursting” dynamics of a neuron–astrocyte network. The analysis was performed in terms of the empirical “whole minus sum” version of integrated information, in comparison to the “decoder based” version. The “whole minus sum” information may change sign, and an interpretation of this transition in terms of “net synergy” is available in the literature; this motivated our particular interest in the sign of the “whole minus sum” information. The two measures are found to behave similarly: they converge asymptotically as time-uncorrelated activity increases, and the sign transition of the “whole minus sum” information is associated with rapid growth in the “decoder based” information. The study aims to create a theoretical framework for using the spiking–bursting model as an analytically tractable reference point for applying integrated-information concepts to systems exhibiting similar bursting behavior. The model can also be of interest as a new discrete-state test bench for different formulations of integrated information.
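The sign behavior of the “whole minus sum” measure in entry 20 is easy to reproduce for a fixed bipartition of a small system: it is the past–present mutual information of the whole minus the sum of the parts' mutual informations, and it goes positive under synergy and negative under redundancy. The sketch below is a generic stdlib-only illustration of that measure (the toy joint distributions in the test are this editor's examples, not the paper's neuron–astrocyte model):

```python
import math
from collections import defaultdict

def mutual_information(joint):
    # joint: dict {(x, y): p} — Shannon mutual information I(X;Y) in bits.
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

def whole_minus_sum(joint):
    # joint: dict {((x1, x2), (y1, y2)): p} over past (x) and present (y)
    # states of a two-part system, for the bipartition {1} | {2}:
    # Phi = I(X;Y) - I(X1;Y1) - I(X2;Y2).
    whole = mutual_information(joint)
    part1, part2 = defaultdict(float), defaultdict(float)
    for ((x1, x2), (y1, y2)), p in joint.items():
        part1[(x1, y1)] += p
        part2[(x2, y2)] += p
    return whole - mutual_information(part1) - mutual_information(part2)
```

An XOR-like transition (the present of one unit depends jointly on both past units) gives Phi = +1 bit of net synergy, while perfectly redundant copying of a shared past gives Phi = -1 bit, the negative regime whose interpretation the abstract attributes to the “net synergy” literature.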


Copyright © Beijing Qinyun Technology Development Co., Ltd. | 京ICP备09084417号