Similar Literature
20 similar documents found.
1.
This work concerns the interaction between two classical problems: forecasting the dynamical behaviors of elementary cellular automata (ECA) from their intrinsic mathematical laws, and the conditions that determine the emergence of complex dynamics. To approach these problems, and inspired by the theory of reversible logical gates, we decompose the ECA laws into a “spectrum” of dyadic Boolean gates. Emergent properties due to interactions are captured by generating another spectrum of logical gates. The combined analysis of both spectra shows the existence of characteristic biases in the distribution of Boolean gates for ECA belonging to different dynamical classes. These results suggest the existence of signatures capable of indicating the propensity to develop complex dynamics. The logical gates “exclusive-or” and “equivalence” are among these signatures of complexity. An important conclusion is that, within ECA space, interactions are not capable of generating signatures of complexity when these signatures are absent from the intrinsic law of the automaton. © 2004 Wiley Periodicals, Inc. Complexity 9: 33–42, 2004
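
As a minimal illustration of projecting an ECA rule onto dyadic Boolean gates, the sketch below fixes one neighbourhood cell at a time, classifies the induced two-input functions, and counts occurrences of “exclusive-or” and “equivalence.” The projection procedure shown here is an assumption for illustration, not necessarily the decomposition used by the authors.

```python
# Illustrative sketch (not the authors' exact procedure): project an ECA rule
# onto 2-input Boolean gates by fixing one neighbourhood cell, then check
# whether XOR or EQUIVALENCE (XNOR) appears in the resulting "spectrum".

def eca_rule_table(rule):
    """Map a Wolfram rule number to a dict {(l, c, r): next state}."""
    return {(l, c, r): (rule >> (l * 4 + c * 2 + r)) & 1
            for l in (0, 1) for c in (0, 1) for r in (0, 1)}

def dyadic_projections(rule):
    """Fix each input position to 0/1 and collect the induced 2-input gates."""
    f = eca_rule_table(rule)
    gates = []
    for pos in range(3):              # which neighbourhood cell is held fixed
        for fixed in (0, 1):
            table = []
            for a in (0, 1):
                for b in (0, 1):
                    inputs = [a, b]
                    inputs.insert(pos, fixed)
                    table.append(f[tuple(inputs)])
            gates.append(tuple(table))
    return gates

XOR, EQUIV = (0, 1, 1, 0), (1, 0, 0, 1)
for rule in (90, 110, 184, 0):
    spectrum = dyadic_projections(rule)
    print(rule, "XOR:", spectrum.count(XOR), "EQUIV:", spectrum.count(EQUIV))
```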

2.
Analysis and Modeling is the first “phase” of understanding or developing a system. It is also, perhaps more importantly, the foundation of understanding a natural science or system. It is abstract and conceptually difficult but, being foundational, contributes the most to the quality of understanding of (designed or natural) systems. Complex Systems have a natural hierarchy of levels and multiple subsystems. The character and functionality of each level or subsystem “emerges” across its boundaries. Both sides of these boundaries must be understood within that side's natural thought patterns. Integrated interdisciplinary collaboration is essential for making sense of complex systems, but collaboration among disciplines is difficult because of their different ways of thinking. This creates a dilemma: “understanding complex systems” is one horn; “integrated interdisciplinary collaboration” is the other. This dilemma in complex-system analysis/modeling and interdisciplinary collaboration is currently addressed by “grabbing the bull by the horns,” that is, by taking on the doubly complex problem and painstakingly building up abstract “bull wrestling” skills in and across domains and disciplines. There is another wrinkle: complexity requires interdisciplinary collaboration at deeper, more dissimilar levels. The usual approach, finding a way to “pass between the horns of the dilemma,” will not work here because of this cross-coupling. Rather than trying to pass between the horns by abstracting away the coupling, we overtly organize this coupling. We weave a semantic unification space of conceptual connections linking each side of a boundary to its appropriate way of thinking. This allows us to abstract away the dilemma and iron out the wrinkle. The threads of common image schemas, cognitive metaphors, and conceptual interfaces weave a bridge between the semantic foundations and organizations of each problem. These allow the problems to be addressed synergistically. This paper presents and explores a naturally valid way of addressing complex systems that is both discipline-specific and discipline-integrating. We start with methodological insights from analysis and modeling from the perspective of object orientation, with its ontologies organizing lexical semantics. We advance from there by integrating imagistic, imaginative semantics and an affordance-based interaction methodology as the keys to addressing complex-systems analysis, modeling, and integration. © 2007 Wiley Periodicals, Inc. Complexity, 2007

3.
Distributed computing systems are becoming bigger and more complex. Although the complexity of large-scale distributed systems has been acknowledged to be an important challenge, there has not been much work in defining or measuring system complexity. Thus, today, it is difficult to compare the complexities of different systems, or to state that one system is easier to program, to manage, or to use than another. In this article, we try to understand the factors that cause computing systems to appear very complex to people. We define different aspects of system complexity and propose metrics for measuring these aspects. We also show how these aspects affect different kinds of people, viz. developers, administrators, and end-users. On the basis of the aspects and metrics of complexity that we identify, we propose general guidelines that can help reduce the complexity of systems. © 2007 Wiley Periodicals, Inc. Complexity 12: 37–45, 2007

4.
The elementary cellular automaton following rule 184 can mimic particles flowing in one direction at a constant speed. Therefore, this automaton can model highway traffic qualitatively. In a recent paper, we incorporated intersections regulated by traffic lights into this model using exclusively elementary cellular automata. In that paper, however, we only explored a rectangular grid. We now extend our model to more complex scenarios using a hexagonal grid. This extension shows, first, that our model can readily incorporate multiple-way intersections and hence simulate complex scenarios. In addition, the current extension allows us to study and evaluate the behavior of two different kinds of traffic-light controller for a grid of six-way streets allowing for either two- or three-street intersections: a traffic light that tries to adapt to the amount of traffic (which results in self-organizing traffic lights) and a system of synchronized traffic lights with coordinated rigid periods (sometimes called the “green-wave” method). We observe a tradeoff between system capacity and topological complexity. The green-wave method is unable to cope with the complexity of a higher-capacity scenario, while the self-organizing method is scalable, adapting to the complexity of a scenario and exploiting its maximum capacity. Additionally, in this article, we propose a benchmark, independent of methods and models, to measure the performance of a traffic-light controller by comparing it against a theoretical optimum. © 2011 Wiley Periodicals, Inc. Complexity, 2012
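
For reference, a minimal sketch of the underlying rule-184 dynamics on a ring road (cars as 1s that advance one cell per step when the cell ahead is empty); the intersection and traffic-light machinery of the article is not reproduced here.

```python
# Minimal sketch of rule-184 traffic flow on a ring road: 1 = car, 0 = empty.
# A car advances one cell per step iff the cell ahead is empty, so densities
# at or below 0.5 give free flow and higher densities produce jams.

import random

def rule184_step(road):
    n = len(road)
    return [1 if (road[i] == 0 and road[i - 1] == 1)        # car arrives from the left
            or (road[i] == 1 and road[(i + 1) % n] == 1)    # car blocked ahead
            else 0
            for i in range(n)]

road = [1 if random.random() < 0.3 else 0 for _ in range(40)]
for _ in range(10):
    print("".join(".#"[c] for c in road))
    road = rule184_step(road)
```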

5.
When a dynamical system with multiple point attractors is released from an arbitrary initial condition, it will relax into a configuration that locally resolves the constraints or opposing forces between interdependent state variables. However, when there are many conflicting interdependencies between variables, finding a configuration that globally optimizes these constraints by this method is unlikely or may take many attempts. Here, we show that a simple distributed mechanism can incrementally alter a dynamical system such that it finds lower energy configurations, more reliably and more quickly. Specifically, when Hebbian learning is applied to the connections of a simple dynamical system undergoing repeated relaxation, the system will develop an associative memory that amplifies a subset of its own attractor states. This modifies the dynamics of the system such that its ability to find configurations that minimize total system energy, and globally resolve conflicts between interdependent variables, is enhanced. Moreover, we show that the system is not merely “recalling” low energy states that have been previously visited but “predicting” their location by generalizing over local attractor states that have already been visited. This “self-modeling” framework, i.e., a system that augments its behavior with an associative memory of its own attractors, helps us better understand the conditions under which a simple locally mediated mechanism of self-organization can promote significantly enhanced global resolution of conflicts between the components of a complex adaptive system. We illustrate this process in random and modular network constraint problems equivalent to graph coloring and distributed task allocation problems. © 2010 Wiley Periodicals, Inc. Complexity 16: 17–26, 2011
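
A hedged sketch of the general idea follows: a symmetric spin network is relaxed repeatedly from random states, and a small Hebbian update is applied at each attractor. The network size, learning rate, and update schedule below are illustrative choices, not the paper's.

```python
# Sketch of "self-modeling": repeated relaxation of a symmetric spin network
# plus Hebbian reinforcement of each attractor visited, which tends to amplify
# low-energy configurations over time. Parameters are illustrative only.

import random

N, EPOCHS, ETA = 30, 50, 0.01

def energy(s, w):
    return -0.5 * sum(w[i][j] * s[i] * s[j] for i in range(N) for j in range(N))

def relax(s, w):
    """Asynchronous descent to a local attractor."""
    changed = True
    while changed:
        changed = False
        for i in random.sample(range(N), N):
            h = sum(w[i][j] * s[j] for j in range(N))
            new = 1 if h >= 0 else -1
            if new != s[i]:
                s[i], changed = new, True
    return s

# random symmetric couplings encode the "constraint problem"
w = [[0.0] * N for _ in range(N)]
for i in range(N):
    for j in range(i + 1, N):
        w[i][j] = w[j][i] = random.choice([-1.0, 1.0])

for epoch in range(EPOCHS):
    s = relax([random.choice([-1, 1]) for _ in range(N)], w)
    if epoch % 10 == 0:
        print(f"epoch {epoch:3d}  attractor energy {energy(s, w):.1f}")
    for i in range(N):                 # Hebbian reinforcement of the attractor
        for j in range(N):
            if i != j:
                w[i][j] += ETA * s[i] * s[j]
```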

6.
Proxy data allow the temperature of the Earth to be mapped over long periods of time. In this work, the temperature fluctuations in over 200 proxy data sets were examined, and from this set 50 were analyzed to test for periodic and quasi-periodic fluctuations. Temperature reconstructions over four different time scales were analyzed to see if patterns emerged. Data were put into four time intervals: 4,000 years, 14,000 years, 1,000,000 years, and 3,000,000 years, and analyzed with the goal of understanding periodic and quasi-periodic patterns in global temperature change superimposed on a “background” average temperature change. Quasi-periodic signatures were identified that predate the Industrial Revolution, during much of which direct data on temperature are not available. These data indicate that Earth temperatures have undergone a number of periodic and quasi-periodic intervals that contain both global warming and global cooling cycles. The fluctuations are superimposed on a background of temperature change that has a declining slope during the two periods, pre-ice age and post-ice age, with a transition at about 12,000 BCE. The data are divided into “events” that span the time periods 3,000,000 BCE to “0” CE, 1,000,000 BCE to “0” CE, 12,000 BCE to 2,000 CE, and 2,000 BCE to 2,000 CE. An equation using quasi-periodic (frequency-modulated sine wave) patterns was developed to analyze the data sets for quasi-periodic patterns. “Periodicities” that show reasonable agreement with the predictions of Milankovitch and other investigators were found in the data sets.
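
As an illustration of the kind of model the abstract describes, the sketch below fits a linear background plus a single frequency-modulated sine term to a synthetic series. The functional form, parameter names, and data are assumptions for illustration, not the authors' published equation or proxy records.

```python
# Hedged sketch: linear "background" trend plus one frequency-modulated sine
# component, fitted to a synthetic stand-in for a proxy temperature series.

import numpy as np
from scipy.optimize import curve_fit

def fm_model(t, a0, a1, amp, f, m, fm, phi):
    """Background trend + one FM sine component (illustrative form)."""
    return a0 + a1 * t + amp * np.sin(2 * np.pi * f * t
                                      + m * np.sin(2 * np.pi * fm * t) + phi)

# synthetic series standing in for a real reconstruction
t = np.linspace(0, 14000, 500)                         # years
y = fm_model(t, 0.0, -2e-5, 1.0, 1 / 1500, 0.8, 1 / 6000, 0.3)
y += 0.2 * np.random.randn(t.size)

p0 = [0, 0, 1, 1 / 1500, 0.5, 1 / 6000, 0]             # rough initial guess
params, _ = curve_fit(fm_model, t, y, p0=p0, maxfev=20000)
print("fitted period of main cycle:", 1 / params[3], "years")
```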

7.
Often relegated to the methods section of genetic research articles, the term “degeneracy” is regularly misunderstood and its theoretical significance widely understated. Degeneracy describes the ability of different structures to be conditionally interchangeable in their contribution to system functions. Frequently mislabeled as redundancy, degeneracy refers to structural variation, whereas redundancy refers to structural duplication. Sources of degeneracy include, but are not limited to, (1) duplicate structures that differentiate yet remain isofunctional, (2) unrelated isofunctional structures that are dispersed endogenously or exogenously, (3) variable arrangements of interacting structures that achieve the same output through multiple pathways, and (4) parcellation of a structure into subunits that can still variably perform the same initial function. The ability to perform the same function by drawing upon an array of dissimilar structures contributes advantageously to the integrity of a system. Drawing attention to the heterogeneous construction of living systems by highlighting the concept of degeneracy valuably enhances the ways scientists think about self-organization, robustness, and complexity. Labels in science, however, can sometimes be misleading. In scientific nomenclature, the word “degeneracy” has calamitous proximity to the word “degeneration” used by pathologists and to the shunned theory of degeneration once promoted by eugenicists. This article disentangles the concept of degeneracy from its close etymological siblings and offers a brief overview of the historical and contemporary understandings of degeneracy in science. Distinguishing the importance of degeneracy will hopefully allow systems theorists to conceptualize, more strategically and operationally, the distributed intersecting networks that make up complex living systems. © 2014 Wiley Periodicals, Inc. Complexity 20: 12–21, 2015

8.
We investigate group-theoretic “signatures” of odd cycles of a graph, and their connections to topological obstructions to 3-colourability. In the case of signatures derived from free groups, we prove that the existence of an odd cycle with trivial signature is equivalent to having the coindex of the hom-complex at least 2 (which implies that the chromatic number is at least 4). In the case of signatures derived from elementary abelian 2-groups, we prove that the existence of an odd cycle with trivial signature is a sufficient condition for having the index of the hom-complex at least 2 (which again implies that the chromatic number is at least 4).

9.
In our paper, numerical simulations of chemical patterns in an ionic reaction-diffusion-migration system assuming a “self-consistent” electric field are presented. Chemical waves as well as stationary concentration patterns arise due to an interplay of an autocatalytic chemical reaction with transport processes. Concentration gradients inside the chemical patterns lead to an electric diffusion potential, which in turn affects the patterns. Thus, the model equations take the general form of the Fokker-Planck equation.
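
For orientation, a standard reaction-diffusion-migration (Nernst-Planck type) form consistent with this description is sketched below; the notation and the electroneutrality closure are illustrative assumptions, not necessarily the paper's exact model equations.

```latex
% Generic reaction-diffusion-migration (Nernst-Planck type) transport equations;
% illustrative standard form only, not the paper's exact model.
\frac{\partial c_i}{\partial t}
  = \nabla \cdot \left( D_i \nabla c_i + \frac{z_i F}{R T}\, D_i\, c_i\, \nabla \varphi \right)
  + R_i(c_1,\dots,c_N),
\qquad
\sum_i z_i c_i = 0 \quad \text{(electroneutrality closure for } \varphi\text{)} .
```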

10.
The self-similar tree topology in open dissipative systems is formed as a result of self-organization and is found in various examples, such as river networks, blood vessels, vascular organizations in plants, and even lightning. It is generally assumed that the tree organization is a result of a dynamic process that minimizes the dissipation of energy. Here, we argue that inherent randomness is a sufficient condition for the generation of tree patterns under evolutionary dynamics, and that the decrease of energy expenditure is not the cause but a consequent signature. © 2008 Wiley Periodicals, Inc. Complexity, 13: 30–37, 2008

11.
Bayesian maxent lets one integrate thermal physics and information theory points of view in the quantitative study of complex systems. Since net surprisal (a free energy analog for measuring “departures from expected”) allows one to place second law constraints on mutual information (a multimoment measure of correlations), it makes a quantitative case for the role of reversible thermalization in the natural history of invention, and suggests multiscale strategies to monitor standing crop as well. It prompts one to track evolved complexity starting from live astrophysically observed processes, rather than only from evidence of past events. Various gradients and boundaries that play a role in availability flow, ranging from the edge of a wave-packet to the boundary between idea-pools, allow one to frame wide-ranging correlations (including that between a phenomenon and its explanation) as delocalized physical structures. © 2007 Wiley Periodicals, Inc. Complexity, 2008

12.
An asymptotic result is obtained for a two-point boundary value problem for a vector system of nonlinear ordinary differential equations involving “fast” and “slow” inputs. The asymptotically limiting system is obtained by an averaging procedure. Using this result, an approximate analysis of the original system may be carried out by considering two lower-order systems each involving only one time scale. It is shown that some optimal control problems for systems with multiple time scales may be analyzed by this method.
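
A hedged sketch of the standard fast/slow averaging setup the abstract refers to is given below; this is the generic textbook form, not the paper's specific boundary value problem.

```latex
% Generic two-time-scale system and its averaged ("slow") limit;
% illustrative standard form only, not the paper's specific system.
\dot{x} = f\bigl(x,\, u_s(t),\, u_f(t/\varepsilon)\bigr), \qquad 0 < \varepsilon \ll 1,
\qquad
\dot{\bar{x}} = \bar{f}\bigl(\bar{x},\, u_s(t)\bigr)
  := \lim_{T \to \infty} \frac{1}{T} \int_0^T
     f\bigl(\bar{x},\, u_s(t),\, u_f(\tau)\bigr)\, d\tau .
```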

13.
A very simple model prebiotic system is explored, both to elucidate the full complexity of its dynamic behavior and from the standpoint of what a thermal analysis can tell us about evolution more generally. The system consists of a coacervate containing a four-variable enzymatic oscillator that is driven by a single input concentration. The reaction scheme is “nested”; i.e., by “turning on” one reaction at a time we can go from a two-variable system, to one of three variables, to one of four, each developing more complex behavior. The four-variable system is shown to have at least five distinct genera of complex attractors within the range examined. Two of these coexist for the same parameter values; the other three are substantially separated in the bifurcation space. The fixed points, characteristic roots, Lyapunov exponents, stability (dissipation), Kaplan-Yorke dimensions, and correlation dimensions are all calculated for each attractor. A six-variable amplification of the reaction scheme is considered as a simple model of nucleation in a coacervate and is shown to totally stabilize the corresponding attractor. An Evolutionary Potential is proposed that is wholly beyond the purview of classical thermostatics, yet incorporates entropy effects via Clausius' strong version of the Second Law. It is shown that the latter is a necessary condition for the sort of structuring characteristic of living systems. © 2003 Wiley Periodicals, Inc. Complexity 8: 45–67, 2003

14.
The proofs of universally quantified statements, in mathematics, are given as “schemata” or as “prototypes” that may be applied to each specific instance of the quantified variable. Type Theory allows one to turn this informal intuition, described by many, including Herbrand, into a rigorous notion. In this constructive approach, where propositions are types, proofs are viewed as terms of λ-calculus and act as “proof-schemata,” as for universally quantified types. We examine here the critical case of Impredicative Type Theory, i.e., Girard's system F, where type quantification ranges over all types. Coherence and decidability properties are proved for prototype proofs in this impredicative context.

15.
The “classical” random graph models, in particular G(n,p), are “homogeneous,” in the sense that the degrees (for example) tend to be concentrated around a typical value. Many graphs arising in the real world do not have this property, having, for example, power-law degree distributions. Thus there has been a lot of recent interest in defining and studying “inhomogeneous” random graph models. One of the most studied properties of these new models is their “robustness”, or, equivalently, the “phase transition” as an edge density parameter is varied. For G(n,p), p = c/n, the phase transition at c = 1 has been a central topic in the study of random graphs for well over 40 years. Many of the new inhomogeneous models are rather complicated; although there are exceptions, in most cases precise questions such as determining exactly the critical point of the phase transition are approachable only when there is independence between the edges. Fortunately, some models studied have this property already, and others can be approximated by models with independence. Here we introduce a very general model of an inhomogeneous random graph with (conditional) independence between the edges, which scales so that the number of edges is linear in the number of vertices. This scaling corresponds to the p = c/n scaling for G(n,p) used to study the phase transition; also, it seems to be a property of many large real-world graphs. Our model includes as special cases many models previously studied. We show that, under one very weak assumption (that the expected number of edges is “what it should be”), many properties of the model can be determined, in particular the critical point of the phase transition, and the size of the giant component above the transition. We do this by relating our random graphs to branching processes, which are much easier to analyze. We also consider other properties of the model, showing, for example, that when there is a giant component, it is “stable”: for a typical random graph, no matter how we add or delete o(n) edges, the size of the giant component does not change by more than o(n). © 2007 Wiley Periodicals, Inc. Random Struct. Alg., 31, 3–122, 2007
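
As a minimal illustration (assuming a rank-1 kernel and uniform vertex types, which are simplifications relative to the general kernel model in the paper), the sketch below generates such a graph with independent edges and measures the largest component on either side of the phase transition.

```python
# Illustrative sketch: inhomogeneous random graph with independent edges and a
# rank-1 kernel, expected edge count linear in n. The branching-process
# heuristic predicts a giant component once the mean offspring number exceeds 1.

import random

def inhomogeneous_graph(n, c):
    """Edge i~j present independently with prob min(c * x_i * x_j / n, 1)."""
    x = [random.uniform(0.5, 1.5) for _ in range(n)]      # vertex "types"
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if random.random() < min(c * x[i] * x[j] / n, 1.0):
                adj[i].append(j)
                adj[j].append(i)
    return adj

def largest_component(adj):
    seen, best = set(), 0
    for s in range(len(adj)):
        if s in seen:
            continue
        stack, size = [s], 0
        seen.add(s)
        while stack:
            v = stack.pop()
            size += 1
            for w in adj[v]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        best = max(best, size)
    return best

n = 1000
for c in (0.5, 1.0, 2.0):      # roughly sub-critical, near-critical, super-critical
    adj = inhomogeneous_graph(n, c)
    print(f"c = {c}: largest component ~ {largest_component(adj) / n:.2%} of vertices")
```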

16.
Human movement reveals the hallmark characteristics of complex systems: namely, many interacting subsystems, multiple interactions within and between levels of analysis, emergence of movement coordination modes, and the exhibition of varying levels of complexity in system output that continually evolve with learning and development over the life span. Here we outline how this high- or infinite-dimensional complex dynamical system can be modeled by an epigenetic landscape framework, in the sense of Waddington, that captures the key features of the adaptive qualitative and quantitative properties of coordination modes (“order parameters”), the degeneracy of movement organization, and the time scales of change. The framework provides some new ways to consider old problems in motor learning and development, such as an explicit and quantitative approach to exploring the concept of motor programs and developmental pathways, and yields new results and insights into the organization of learning during practice and rest times. For instance, along one dimension of the landscape, most of the changes occur between practice sessions. © 2006 Wiley Periodicals, Inc. Complexity 12: 40–51, 2006

17.
We reconsider some classical natural semantics of integers (namely iterators of functions, cardinals of sets, and indexes of equivalence relations) from the perspective of Kolmogorov complexity. To each such semantics one can attach a simple representation of integers that we suitably effectivize in order to develop an associated Kolmogorov theory. Such effectivizations are particular instances of a general notion of “self-enumerated system” that we introduce in this paper. Our main result asserts that, with such effectivizations, Kolmogorov theory allows one to quantitatively distinguish the underlying semantics. We characterize the families obtained by such effectivizations and prove that the associated Kolmogorov complexities constitute a hierarchy which coincides with that of Kolmogorov complexities defined via jump oracles and/or infinite computations (cf. [6]). This contrasts with the well-known fact that the usual Kolmogorov complexity does not depend (up to a constant) on the chosen arithmetic representation of integers, be it in any base n ≥ 2 or in unary. Also, from a conceptual point of view, our result can be seen as a means of measuring the degree of abstraction of these diverse semantics. (© 2006 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

18.
ABSTRACT. An individual-based model of stream trout is analyzed by testing its ability to reproduce patterns of population-level behavior observed in real trout: (1) “self-thinning,” a negative power relation between weight and abundance; (2) a “critical period” of density-dependent mortality in young-of-the-year; (3) high and age-specific inter-annual variability in abundance; (4) density dependence in growth; and (5) fewer large trout when pool habitat is eliminated. The trout model successfully reproduced these patterns and was useful for evaluating their theoretical basis. The model analyses produced new explanations for some field observations and indicated that some patterns are less general than field studies indicate. The model did not reproduce field-observed patterns of population variability by age class, a discrepancy potentially explained by site differences, by predation mortality being more stochastic than the model assumes, or by uncertainty in the field study's age estimates.

19.
While scale-free power laws are frequently found in social and technological systems, their authenticity, origin, and the insights gained from them are often questioned, and rightfully so. The article presents a newly found rank-frequency power law that aligns the top-500 supercomputers according to their performance. Pursuing a cautious approach in a systematic way, we check for authenticity, evaluate several potential generative mechanisms, and ask the “so what” question. We evaluate and finally reject the applicability of well-known potential generative mechanisms such as preferential attachment, self-organized criticality, optimization, and random observation. Instead, the microdata suggest that an inverse relationship between exponential technological progress and exponential technology diffusion through social networks results in the identified fat-tail distribution. This newly identified generative mechanism suggests that the supply and demand of technology (“technology push” and “demand pull”) align in exponential synchronicity, providing predictive insights into the evolution of highly uncertain technology markets. © 2013 Wiley Periodicals, Inc. Complexity 19: 56–65, 2014
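
A minimal sketch of the rank-frequency power-law check described above, run on synthetic data standing in for the actual top-500 performance list:

```python
# Minimal sketch: fit log(performance) against log(rank) on toy data and read
# off the power-law exponent. Synthetic data only, not the real TOP500 list.

import numpy as np

rank = np.arange(1, 501)
perf = 1e5 * rank ** -0.9 * np.exp(0.05 * np.random.randn(rank.size))  # toy data

slope, intercept = np.polyfit(np.log(rank), np.log(perf), 1)
print(f"estimated power-law exponent: {slope:.2f}")
```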

20.
A sub-calculus of the calculus (“algebra”) of all complete conormal symbols arising in the edge pseudodifferential calculus is constructed. This calculus of complete conormal symbols is suitable for constructing sub-calculi of the general edge pseudodifferential calculus, for which the edge-degenerate pseudodifferential operators involved map conormal asymptotics of distributions near the edges in a prescribed manner. (© 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
