Similar Documents
Found 20 similar documents (search time: 31 ms)
1.
Jing Wang, Chinese Physics B, 2021, 30(12): 120518
The statistical model for community detection is a promising research area in network analysis. Most existing statistical models of community detection are designed for networks with a known type of community structure, but in many practical situations the type of community structure is unknown. To cope with unknown community structures, diverse types should be considered in one model. We propose a model that incorporates the latent interaction pattern, which we regard as the basis for constructing diverse community structures. The interaction pattern can parameterize various types of community structures in one model. A collapsed Gibbs sampling inference is proposed to estimate the community assignments and other hyper-parameters. With the Pitman-Yor process as a prior, our model can automatically detect the numbers and sizes of communities without requiring the type of community structure to be known beforehand. Via Bayesian inference, our model can detect hidden interaction patterns that offer extra information for network analysis. Experiments on networks with diverse community structures demonstrate that our model outperforms four state-of-the-art models.
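The Pitman-Yor prior mentioned above can be sketched via its sequential ("Chinese-restaurant"-style) predictive rule, which is what lets the number of communities grow with the data. This is a minimal illustration, not the paper's collapsed Gibbs sampler, and the parameter values are illustrative:

```python
import random
from collections import Counter

def pitman_yor_assignments(n, alpha=1.0, d=0.5, seed=0):
    """Sequentially assign n items to clusters via the Pitman-Yor
    predictive rule: existing cluster j is chosen with weight (n_j - d),
    a new cluster with weight (alpha + d * k), where k = current clusters."""
    rng = random.Random(seed)
    assignments = []
    counts = Counter()              # cluster label -> size
    for _ in range(n):
        k = len(counts)
        existing = sorted(counts)
        weights = [counts[c] - d for c in existing] + [alpha + d * k]
        r = rng.uniform(0, sum(weights))
        acc, choice = 0.0, k        # default: open a new cluster (label k)
        for lbl, w in zip(existing + [k], weights):
            acc += w
            if r <= acc:
                choice = lbl
                break
        assignments.append(choice)
        counts[choice] += 1
    return assignments

labels = pitman_yor_assignments(200)
print("clusters found:", len(set(labels)))
```

The number of clusters is not fixed in advance; it emerges from `alpha` and the discount `d`.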

2.
Complex networks with binary-state dynamics represent many meaningful behaviors in a variety of contexts. Reconstructing networked systems that host delayed binary processes with hidden nodes is an outstanding challenge in this field. To address this issue, we extend the statistical inference method to complex networked systems with distinct binary-state dynamics in the presence of time delay and missing data. By exploiting the expectation-maximization (EM) algorithm, we apply the statistical-inference-based approach to different (i.e., random, small-world, and scale-free) networks hosting delayed binary processes. Our framework is completely data driven and does not require any a priori knowledge about the detailed dynamical process on the network; in particular, our method can independently infer each physical connection and estimate its time delay solely from the data of the pair of nodes forming that link. We provide a physical understanding of the underlying mechanism, and extensive numerical simulations validate the robustness, efficiency, and accuracy of our method.
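As a rough, generic illustration of the EM idea used here (deciding between two hidden regimes from binary data), a two-component Bernoulli mixture can be fitted as follows. This stands in for the link/no-link decision only; it is not the authors' delayed-dynamics likelihood:

```python
import random

def em_bernoulli_mixture(xs, m=20, iters=100):
    """EM for a two-component Bernoulli mixture: each observation is the
    number of 'successes' out of m binary trials; infer the two success
    rates and the mixing weight."""
    pi, p0, p1 = 0.5, 0.3, 0.7          # initial guesses
    for _ in range(iters):
        # E-step: responsibility of component 1 for each observation
        resp = []
        for k in xs:
            l0 = (1 - pi) * (p0 ** k) * ((1 - p0) ** (m - k))
            l1 = pi * (p1 ** k) * ((1 - p1) ** (m - k))
            resp.append(l1 / (l0 + l1))
        # M-step: re-estimate parameters from responsibilities
        pi = sum(resp) / len(xs)
        p1 = sum(r * k for r, k in zip(resp, xs)) / (m * sum(resp) + 1e-12)
        p0 = sum((1 - r) * k for r, k in zip(resp, xs)) / \
             (m * (len(xs) - sum(resp)) + 1e-12)
    return pi, p0, p1

random.seed(1)
m = 20
data = [sum(random.random() < 0.2 for _ in range(m)) for _ in range(50)] + \
       [sum(random.random() < 0.8 for _ in range(m)) for _ in range(50)]
pi, p0, p1 = em_bernoulli_mixture(data)
print(round(pi, 2), round(p0, 2), round(p1, 2))
```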

3.
We study the dynamics of two-level atomic systems (qubits) subject to a double-layer environment that consists of a network of single-mode cavities coupled to a common reservoir. A general exact master equation for the dynamics of a qubit system can be obtained by the quantum-state-diffusion (QSD) approach, which we extend to our spin-cavity-boson model. The quantumness of the atoms, comprising coherence and entanglement, is investigated for various configurations of the double-layer environment. The findings indicate that parametric control is available for the preservation and generation of system quantumness by regulating the cavity network. Moreover, the underlying physics is profoundly revealed by an effective model obtained via a unitary transformation. Our work therefore provides an interesting proposal for protecting the quantumness of open systems in the framework of a double-layer environment containing bosonic modes.

4.
Recent advances in statistical inference have significantly expanded the toolbox of probabilistic modeling. Historically, probabilistic modeling has been constrained to very restricted model classes, where exact or approximate probabilistic inference is feasible. However, developments in variational inference, a general form of approximate probabilistic inference that originated in statistical physics, have enabled probabilistic modeling to overcome these limitations: (i) Approximate probabilistic inference is now possible over a broad class of probabilistic models containing a large number of parameters, and (ii) scalable inference methods based on stochastic gradient descent and distributed computing engines allow probabilistic modeling to be applied to massive data sets. One important practical consequence of these advances is the possibility to include deep neural networks within probabilistic models, thereby capturing complex non-linear stochastic relationships between the random variables. These advances, in conjunction with the release of novel probabilistic modeling toolboxes, have greatly expanded the scope of applications of probabilistic models, and allowed the models to take advantage of the recent strides made by the deep learning community. In this paper, we provide an overview of the main concepts, methods, and tools needed to use deep neural networks within a probabilistic modeling framework.
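A minimal sketch of variational inference with the reparameterization trick, on a conjugate Gaussian model where the exact posterior is known in closed form so the fit can be checked. The model and all values are our own illustrative choices, not from the paper:

```python
import math, random

def fit_vi(xs, steps=2000, lr=0.01, seed=0):
    """Fit q(z) = N(mu, s^2) to the posterior of z in the conjugate model
    z ~ N(0,1), x_i ~ N(z,1), by stochastic gradient ascent on the ELBO
    using the reparameterization trick z = mu + s*eps."""
    rng = random.Random(seed)
    mu, log_s = 0.0, 0.0
    n, sx = len(xs), sum(xs)
    for _ in range(steps):
        s = math.exp(log_s)
        eps = rng.gauss(0, 1)
        z = mu + s * eps
        dlogp = -z + (sx - n * z)              # d/dz log p(x, z)
        mu += lr * dlogp                        # reparameterized grad wrt mu
        log_s += lr * (dlogp * s * eps + 1.0)   # ... wrt log s (+1 from entropy)
    return mu, math.exp(log_s)

random.seed(42)
xs = [random.gauss(1.5, 1.0) for _ in range(20)]
mu, s = fit_vi(xs)
exact_mean = sum(xs) / (len(xs) + 1)            # exact posterior mean
print(round(mu, 2), round(exact_mean, 2))
```

The same gradient estimator is what scales to deep-network likelihoods, where no closed form exists.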

5.
Extreme value theory for chaotic deterministic dynamical systems is a rapidly expanding area of research. Given a system and a real function (observable) defined on its phase space, extreme value theory studies the limit probabilistic laws obeyed by large values attained by the observable along orbits of the system. Based on this theory, the so-called block maximum method is often used in applications for statistical prediction of large value occurrences. In this method, one performs statistical inference for the parameters of the Generalised Extreme Value (GEV) distribution, using maxima over blocks of regularly sampled observable values along an orbit of the system. The observables studied so far in the theory are expressed as functions of the distance with respect to a point, which is assumed to be a density point of the system’s invariant measure. However, at least with respect to the ambient (usually Euclidean) metric, this is not the structure of the observables typically encountered in physical applications, such as windspeed or vorticity in atmospheric models. In this paper we consider extreme value limit laws for observables which are not expressed as functions of the distance (in the ambient metric) from a density point of the dynamical system. In such cases, the limit laws are no longer determined by the functional form of the observable and the dimension of the invariant measure: they also depend on the specific geometry of the underlying attractor and of the observable’s level sets. We present a collection of analytical and numerical results, starting with a toral hyperbolic automorphism as a simple template to illustrate the main ideas. We then formulate our main results for a uniformly hyperbolic system, the solenoid map. We also discuss non-uniformly hyperbolic examples of maps (Hénon and Lozi maps) and of flows (the Lorenz63 and Lorenz84 models). 
Our purpose is to outline the main ideas and to highlight several serious problems found in the numerical estimation of the limit laws.
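The block maximum method described above can be sketched for the Gumbel case (the GEV with zero shape parameter), using a simple method-of-moments fit. This is an illustrative toy, not the paper's estimation procedure:

```python
import math, random

def block_maxima_gumbel(series, block):
    """Block-maximum method: split a series into blocks, take each block's
    maximum, and fit a Gumbel distribution by moments, using
    mean = mu + gamma*beta and sd = pi*beta/sqrt(6)."""
    maxima = [max(series[i:i + block])
              for i in range(0, len(series) - block + 1, block)]
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((m - mean) ** 2 for m in maxima) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi
    mu = mean - 0.5772156649 * beta          # Euler-Mascheroni constant
    return mu, beta, maxima

random.seed(0)
# block maxima of exponential(1) samples are asymptotically Gumbel with
# location ~ log(block size) and scale ~ 1
series = [random.expovariate(1.0) for _ in range(100000)]
mu, beta, _ = block_maxima_gumbel(series, block=1000)
print(round(mu, 2), round(beta, 2))
```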

6.
A purely statistical characterization of measurements of observables (described by spectral measures in conventional formalism of quantum mechanics) is given in the framework of the general statistical (convex) approach. The relation to physical premises underlying the conventional notion of observable is discussed. Structural aspects of general statistical models such as central decomposition and characterization of classical models are considered. It is shown by explicit construction that an arbitrary statistical model admits a formal introduction of “hidden variables” preserving the structural properties of a single statistical model. The relation of this result to other theorems on hidden variables is discussed.

7.
Active inference is an increasingly prominent paradigm in theoretical biology. It frames the dynamics of living systems as if they were solving an inference problem. This rests upon their flow towards some (non-equilibrium) steady state—or equivalently, their maximisation of the Bayesian model evidence for an implicit probabilistic model. For many models, these self-evidencing dynamics manifest as messages passed among elements of a system. Such messages resemble synaptic communication at a neuronal network level but could also apply to other network structures. This paper attempts to apply the same formulation to biochemical networks. The chemical computation that occurs in regulation of metabolism relies upon sparse interactions between coupled reactions, where enzymes induce conditional dependencies between reactants. We will see that these reactions may be viewed as the movement of probability mass between alternative categorical states. When framed in this way, the master equations describing such systems can be reformulated in terms of their steady-state distribution. This distribution plays the role of a generative model, affording an inferential interpretation of the underlying biochemistry. Finally, we see that—in analogy with computational neurology and psychiatry—metabolic disorders may be characterized as false inference under aberrant prior beliefs.
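The reformulation of a master equation in terms of its steady-state distribution can be illustrated on a toy two-state system (an illustrative sketch, not the paper's biochemical model):

```python
def master_equation_steady_state(rates, dt=0.01, steps=20000):
    """Evolve p' = Q^T p for a continuous-time master equation over a few
    categorical states (rates[i][j] = transition rate i -> j) and return
    the steady-state distribution."""
    n = len(rates)
    p = [1.0 / n] * n
    for _ in range(steps):
        flow = [0.0] * n
        for i in range(n):
            for j in range(n):
                if i != j:
                    flow[j] += rates[i][j] * p[i]   # probability mass into j
                    flow[i] -= rates[i][j] * p[i]   # ... and out of i
        p = [pi + dt * fi for pi, fi in zip(p, flow)]
    return p

# toy two-state "reaction": substrate <-> product with asymmetric rates
rates = [[0.0, 2.0],
         [1.0, 0.0]]
p = master_equation_steady_state(rates)
print([round(x, 3) for x in p])   # balance 2*p0 = 1*p1 gives p = [1/3, 2/3]
```

The steady-state vector `p` is the distribution that, in the paper's framing, plays the role of a generative model.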

8.

Finding parameters that minimise a loss function is at the core of many machine learning methods. The Stochastic Gradient Descent (SGD) algorithm is widely used and delivers state-of-the-art results for many problems. Nonetheless, SGD typically cannot find the global minimum, thus its empirical effectiveness is hitherto mysterious. We derive a correspondence between parameter inference and free energy minimisation in statistical physics. The degree of undersampling plays the role of temperature. Analogous to the energy–entropy competition in statistical physics, wide but shallow minima can be optimal if the system is undersampled, as is typical in many applications. Moreover, we show that the stochasticity in the algorithm has a non-trivial correlation structure which systematically biases it towards wide minima. We illustrate our argument with two prototypical models: image classification using deep learning and a linear neural network where we can analytically reveal the relationship between entropy and out-of-sample error.
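A minimal example of the SGD algorithm itself, on a linear least-squares problem: the minibatch gradient equals the full gradient plus sampling noise, and that noise is the stochasticity the paper analyses (model and values are illustrative):

```python
import random

def sgd_linear(xs, ys, lr=0.05, epochs=200, batch=8, seed=0):
    """Plain minibatch SGD on squared loss for y = w*x + b.  Smaller
    batches (more 'undersampling') mean noisier gradients, i.e. a higher
    effective temperature in the paper's analogy."""
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    idx = list(range(len(xs)))
    for _ in range(epochs):
        rng.shuffle(idx)
        for s in range(0, len(idx), batch):
            chunk = idx[s:s + batch]
            gw = sum((w * xs[i] + b - ys[i]) * xs[i] for i in chunk) / len(chunk)
            gb = sum((w * xs[i] + b - ys[i]) for i in chunk) / len(chunk)
            w -= lr * gw
            b -= lr * gb
    return w, b

random.seed(0)
xs = [random.uniform(-1, 1) for _ in range(200)]
ys = [2.0 * x + 0.5 + random.gauss(0, 0.1) for x in xs]
w, b = sgd_linear(xs, ys)
print(round(w, 2), round(b, 2))
```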

9.
Analyzing open-source software systems as complex networks
Xiaolong Zheng, Huiqian Li, Physica A, 2008, 387(24): 6190-6200
Software systems represent one of the most complex man-made artifacts. Understanding the structure of software systems can provide useful insights into software engineering efforts and can potentially help the development of complex system models applicable to other domains. In this paper, we analyze one of the most popular open-source Linux meta packages/distributions, Gentoo Linux. In our analysis, we model software packages as nodes and dependencies among them as edges. Our empirical results show that the resulting Gentoo network cannot be easily explained by existing complex network models. This in turn motivates our research in developing two new network growth models in which a new node is connected to an old node with a probability that depends not only on the degree but also on the “age” of the old node. Through computational and empirical studies, we demonstrate that our models have better explanatory power than the existing ones. In an effort to further explore the properties of these new models, we also present some related analytical results.
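A sketch of a growth model in which the attachment probability depends on both degree and age. The kernel below (degree divided by a power of age) is one plausible illustrative choice, not necessarily the exact form used in the paper:

```python
import random

def grow_network(n, alpha=1.0, seed=0):
    """Grow a network where each new node attaches to an existing node j
    with probability proportional to degree(j) / age(j)^alpha."""
    rng = random.Random(seed)
    edges = [(0, 1)]
    degree = {0: 1, 1: 1}
    birth = {0: 0, 1: 0}
    for t in range(2, n):
        nodes = sorted(degree)
        weights = [degree[j] / (t - birth[j] + 1) ** alpha for j in nodes]
        r = rng.uniform(0, sum(weights))
        acc, target = 0.0, nodes[0]
        for j, w in zip(nodes, weights):
            acc += w
            if r <= acc:
                target = j
                break
        edges.append((t, target))
        degree[target] += 1
        degree[t] = 1
        birth[t] = t
    return edges, degree

edges, degree = grow_network(500)
print(len(edges), max(degree.values()))
```

With `alpha = 0` this reduces to ordinary degree-proportional (preferential) attachment; larger `alpha` penalizes old nodes.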

10.
11.
Reconstructability Analysis (RA) and Bayesian Networks (BN) are both probabilistic graphical modeling methodologies used in machine learning and artificial intelligence. There are RA models that are statistically equivalent to BN models and there are also models unique to RA and models unique to BN. The primary goal of this paper is to unify these two methodologies via a lattice of structures that offers an expanded set of models to represent complex systems more accurately or more simply. The conceptualization of this lattice also offers a framework for additional innovations beyond what is presented here. Specifically, this paper integrates RA and BN by developing and visualizing: (1) a BN neutral system lattice of general and specific graphs, (2) a joint RA-BN neutral system lattice of general and specific graphs, (3) an augmented RA directed system lattice of prediction graphs, and (4) a BN directed system lattice of prediction graphs. Additionally, it (5) extends RA notation to encompass BN graphs and (6) offers an algorithm to search the joint RA-BN neutral system lattice to find the best representation of system structure from underlying system variables. All lattices shown in this paper are for four variables, but the theory and methodology presented in this paper are general and apply to any number of variables. These methodological innovations are contributions to machine learning and artificial intelligence and more generally to complex systems analysis. The paper also reviews some relevant prior work of others so that the innovations offered here can be understood in a self-contained way within the context of this paper.

12.
Front propagation is a ubiquitous phenomenon. It arises in physical, biological and cross-disciplinary systems as diverse as flame propagation, superconductors, virus infections, cancer spread or transitions in human prehistory. Here we derive a single, approximate front speed from three rather different time-delayed reaction–diffusion models, suggesting a general law. According to our approximate speed, fronts are crucially driven by the lag times (periods during which individuals or particles do not move). Rather surprisingly, the approximate speed is able to explain the observed spread rates of completely different biophysical systems such as virus infections, the Neolithic transition in Europe, and postglacial tree recolonizations.
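One approximate speed commonly quoted in the time-delayed (hyperbolic) reaction–diffusion literature has the form v = 2·sqrt(r·D)/(1 + r·T/2), reducing to the classical Fisher speed 2·sqrt(r·D) when the lag time T vanishes. We quote this form from memory rather than from the paper, and the parameter values below are purely illustrative:

```python
import math

def delayed_front_speed(r, D, T):
    """Approximate front speed for a time-delayed reaction-diffusion
    front: growth rate r, diffusivity D, lag time T.  With T = 0 this is
    the classical Fisher speed 2*sqrt(r*D)."""
    return 2.0 * math.sqrt(r * D) / (1.0 + r * T / 2.0)

# illustrative numbers only: r in 1/yr, D in km^2/yr, T in yr
v_fisher = delayed_front_speed(0.03, 15.0, 0.0)
v_delayed = delayed_front_speed(0.03, 15.0, 25.0)
print(round(v_fisher, 2), round(v_delayed, 2))  # the lag time slows the front
```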

13.
Network theory provides various tools for investigating the structural or functional topology of many complex systems found in nature, technology and society. Nevertheless, it has recently been realised that a considerable number of systems of interest should be treated, more appropriately, as interacting networks or networks of networks. Here we introduce a novel graph-theoretical framework for studying the interaction structure between subnetworks embedded within a complex network of networks. This framework allows us to quantify the structural role of single vertices or whole subnetworks with respect to the interaction of a pair of subnetworks on local, mesoscopic and global topological scales. Climate networks have recently been shown to be a powerful tool for the analysis of climatological data. Applying the general framework for studying interacting networks, we introduce coupled climate subnetworks to represent and investigate the topology of statistical relationships between the fields of distinct climatological variables. Using coupled climate subnetworks to investigate the terrestrial atmosphere’s three-dimensional geopotential height field uncovers known as well as interesting novel features of the atmosphere’s vertical stratification and general circulation. Specifically, the new measure “cross-betweenness” identifies regions which are particularly important for mediating vertical wind field interactions. The promising results obtained by following the coupled climate subnetwork approach present a first step towards an improved understanding of the Earth system and its complex interacting components from a network perspective.

14.
The free energy principle (FEP) states that any dynamical system can be interpreted as performing Bayesian inference upon its surrounding environment. Although, in theory, the FEP applies to a wide variety of systems, there has been almost no direct exploration or demonstration of the principle in concrete systems. In this work, we examine in depth the assumptions required to derive the FEP in the simplest possible set of systems – weakly-coupled non-equilibrium linear stochastic systems. Specifically, we explore (i) how general the requirements imposed on the statistical structure of a system are and (ii) how informative the FEP is about the behaviour of such systems. We discover that two requirements of the FEP – the Markov blanket condition (i.e. a statistical boundary precluding direct coupling between internal and external states) and stringent restrictions on its solenoidal flows (i.e. tendencies driving a system out of equilibrium) – are only valid for a very narrow space of parameters. Suitable systems require an absence of perception-action asymmetries that is highly unusual for living systems interacting with an environment. More importantly, we observe that a mathematically central step in the argument, connecting the behaviour of a system to variational inference, relies on an implicit equivalence between the dynamics of the average states of a system with the average of the dynamics of those states. This equivalence does not hold in general even for linear stochastic systems, since it requires an effective decoupling from the system's history of interactions. These observations are critical for evaluating the generality and applicability of the FEP and indicate the existence of significant problems of the theory in its current form. 
These issues make the FEP, as it stands, not straightforwardly applicable to the simple linear systems studied here and suggest that more development is needed before the theory could be applied to the kind of complex systems that describe living and cognitive processes.

15.
A novel local-world evolving network model for power grids
Many real-world systems can be described as complex networks, and the power grid is one of the most complex man-made network systems. Classical network models differ considerably from real power grids. Starting from the evolution mechanism of power networks themselves, we propose and study a novel local-world network evolution model that can reproduce the evolution of power grids. Theoretical analysis shows that the degree distribution of the model has a power-law tail, with an exponent tunable from 3 to infinity. Finally, simulations of the North China power grid and the western United States power grid, together with comparisons against scale-free and random networks, verify that the model captures the evolution of power grids well, and further confirm that power grids are neither scale-free networks nor purely random networks. Keywords: power network, evolving model, local world, power-law distribution

16.
Much of human cooperation remains an evolutionary riddle. We study coevolutionary public goods games in structured populations, where players can switch from an unproductive public goods game to a productive one by evaluating the games' productivity. In our model, each individual participates in the games organized by its neighborhood and by itself. Coevolution here refers to an evolutionary process in which existing links between agents are deleted and new links are added as their strategies evolve. Furthermore, we investigate how the separation of time scales between strategy and structure updates affects the cooperation level. This study presents the following: foremost, we observe that high cooperation levels in public goods interactions are attained through the entangled coevolution of strategy and structure. The results also confirm that the resulting networks show many features of real systems, such as cooperative behavior and hierarchical clustering. The heterogeneity of the interaction network is responsible for the observed promotion of cooperation. We hope our work may offer an explanation for the origin of large-scale cooperative behavior among unrelated individuals.
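The basic public goods game payoff structure underlying such models can be sketched as follows (group composition and synergy factor are illustrative):

```python
def pgg_payoffs(contributions, r):
    """Payoffs in a single public goods game: contributions are multiplied
    by the synergy factor r, the pot is shared equally among all players,
    and each player's own contribution is subtracted."""
    pot = r * sum(contributions)
    share = pot / len(contributions)
    return [share - c for c in contributions]

# five players, synergy factor r = 3: cooperators pay 1, defectors pay 0
pay = pgg_payoffs([1, 1, 1, 0, 0], r=3.0)
print(pay)  # defectors out-earn cooperators in a single game
```

This per-game free-rider advantage is exactly what the coevolving network structure must overcome for cooperation to survive.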

17.
Revealing how a biological network is organized to realize its function is one of the main topics in systems biology. The functional backbone network, defined as the primary structure of the biological network, is of great importance in maintaining the network's main function. We propose a new algorithm, the tinker algorithm, to determine this core structure and apply it to the cell-cycle system. With this algorithm, the backbone of the cell-cycle network can be determined accurately and efficiently in various models, such as the Boolean model, the stochastic model, and the ordinary differential equation model. Results show that our algorithm is more efficient than those used in previous research. We hope this method can be put into practical use in relevant future studies.
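A minimal illustration of the backbone idea on a toy threshold Boolean network: greedily delete each edge and keep the deletion only if every trajectory is unchanged. This is our own sketch of the "primary structure" concept, not the tinker algorithm itself:

```python
from itertools import product

def step(W, theta, s):
    """One synchronous update: s_i <- 1 if sum_j W[j][i]*s_j > theta[i]."""
    n = len(s)
    return tuple(1 if sum(W[j][i] * s[j] for j in range(n)) > theta[i] else 0
                 for i in range(n))

def trajectory(W, theta, s, steps=8):
    traj = [s]
    for _ in range(steps):
        s = step(W, theta, s)
        traj.append(s)
    return traj

def backbone(W, theta):
    """Drop each edge in turn; keep the deletion only if all trajectories
    (from every initial state) are unchanged."""
    n = len(W)
    inits = list(product([0, 1], repeat=n))
    keep = [row[:] for row in W]
    for j in range(n):
        for i in range(n):
            if keep[j][i] == 0:
                continue
            trial = [row[:] for row in keep]
            trial[j][i] = 0
            if all(trajectory(trial, theta, s) == trajectory(keep, theta, s)
                   for s in inits):
                keep = trial             # edge is dynamically redundant
    return keep

# toy network: node 2 fires iff 2*s0 + s1 > 1, so the edge 1 -> 2 is redundant
W = [[0, 1, 2],
     [0, 0, 1],
     [1, 0, 0]]
theta = [0, 0, 1]
core = backbone(W, theta)
edges = sum(1 for row in core for w in row if w)
print("edges kept:", edges)   # 3 of the original 4
```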

18.
Active inference is a physics of life process theory of perception, action and learning that is applicable to natural and artificial agents. In this paper, active inference theory is related to different types of practice in social organization. Here, the term social organization is used to clarify that this paper does not encompass organization in biological systems. Rather, the paper addresses active inference in social organization that utilizes industrial engineering, quality management, and artificial intelligence alongside human intelligence. Social organization referred to in this paper can be in private companies, public institutions, other for-profit or not-for-profit organizations, and any combination of them. The relevance of active inference theory is explained in terms of variational free energy, prediction errors, generative models, and Markov blankets. Active inference theory is most relevant to the social organization of work that is highly repetitive. By contrast, there are more challenges involved in applying active inference theory for social organization of less repetitive endeavors such as one-of-a-kind projects. These challenges need to be addressed in order for active inference to provide a unifying framework for different types of social organization employing human and artificial intelligence.

19.
Clustering is a major unsupervised learning algorithm and is widely applied in data mining and statistical data analyses. Typical examples include k-means, fuzzy c-means, and Gaussian mixture models, which are categorized into hard, soft, and model-based clusterings, respectively. We propose a new clustering, called Pareto clustering, based on the Kolmogorov–Nagumo average, which is defined by a survival function of the Pareto distribution. The proposed algorithm incorporates all the aforementioned clusterings plus maximum-entropy clustering. We introduce a probabilistic framework for the proposed method, in which the underlying distribution giving consistency is discussed. We build a minorize-maximization algorithm to estimate the parameters in Pareto clustering. We compare the performance with existing methods in simulation studies and in benchmark dataset analyses to demonstrate its practical utility.
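The Kolmogorov–Nagumo average underlying Pareto clustering has the general form phi_inv(mean(phi(x))). Below is a sketch of that construction with two familiar choices of phi; the paper's phi, built from a Pareto survival function, is more specific:

```python
import math

def kn_average(xs, phi, phi_inv):
    """Kolmogorov-Nagumo (quasi-arithmetic) average:
    phi_inv( mean( phi(x_i) ) )."""
    return phi_inv(sum(phi(x) for x in xs) / len(xs))

xs = [1.0, 2.0, 4.0, 8.0]
arith = kn_average(xs, lambda x: x, lambda y: y)   # phi = identity: arithmetic mean
geom = kn_average(xs, math.log, math.exp)          # phi = log: geometric mean
print(arith, round(geom, 4))
```

Different choices of phi interpolate between clustering objectives, which is how one framework can subsume hard, soft, model-based, and maximum-entropy clustering.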

20.
Many real-life processes are black-box problems, i.e., their internal workings are inaccessible or a closed-form mathematical expression of the likelihood function cannot be defined. For continuous random variables, likelihood-free inference problems can be solved via Approximate Bayesian Computation (ABC). However, an optimal alternative for discrete random variables is yet to be formulated. Here, we aim to fill this research gap. We propose an adjusted population-based MCMC ABC method by re-defining the standard ABC parameters to discrete ones and by introducing a novel Markov kernel inspired by differential evolution. We first assess the proposed Markov kernel on a likelihood-based inference problem, namely discovering the underlying diseases in a QMR-DT network, and subsequently assess the entire method on three likelihood-free inference problems: (i) the QMR-DT network with an unknown likelihood function, (ii) learning a binary neural network, and (iii) neural architecture search. The obtained results indicate the high potential of the proposed framework and the superiority of the new Markov kernel.
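For contrast with the paper's population-based MCMC ABC, the simplest likelihood-free scheme for a discrete parameter is rejection ABC: propose from the prior, simulate, and accept if a summary statistic is close to the observed one. All numbers below are illustrative:

```python
import random
from collections import Counter

def rejection_abc(obs_successes, n_trials, grid, eps, draws, seed=0):
    """Likelihood-free rejection ABC for a discrete parameter: propose a
    bias uniformly from the grid, simulate n_trials Bernoulli draws, and
    accept if the simulated success count is within eps of the observed one."""
    rng = random.Random(seed)
    accepted = Counter()
    for _ in range(draws):
        p = rng.choice(grid)                        # uniform prior on the grid
        sim = sum(rng.random() < p for _ in range(n_trials))
        if abs(sim - obs_successes) <= eps:
            accepted[p] += 1                        # approximate posterior mass
    return accepted

grid = [i / 10 for i in range(1, 10)]
posterior = rejection_abc(obs_successes=70, n_trials=100, grid=grid,
                          eps=3, draws=20000)
print(posterior.most_common(1)[0][0])   # posterior mode, expected near 0.7
```

Rejection ABC wastes most proposals; the paper's MCMC variant with a differential-evolution-style kernel is designed to explore the discrete space far more efficiently.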


Copyright©北京勤云科技发展有限公司  京ICP备09084417号