Similar documents (20 results)
1.
We present an individual-based model of cultural evolution, where interacting agents are coded by binary strings standing for strategies for action, blueprints for products, or attitudes and beliefs. The model is patterned on an established model of biological evolution, the Tangled Nature Model (TNM), where a “tangle” of interactions between agents determines their reproductive success. In addition, our agents can copy part of each other's strategy, a feature inspired by the Axelrod model of cultural diversity. Unlike the latter, but similarly to the TNM, the model dynamics goes through a series of metastable stages of increasing length, each characterized by mutually reinforcing cultural patterns. These patterns are abruptly replaced by other patterns characteristic of the next metastable period. We analyze the time dependence of the population and diversity in the system, show how different cultures form and merge, and show that, in the model, their survival probability decays so slowly that cultures lack a finite average lifetime. Finally, we use historical data on the number of car manufacturers after the introduction of the automobile to the market to argue that our model can qualitatively reproduce the flurry of cultural activity that follows a disruptive innovation. © 2015 Wiley Periodicals, Inc. Complexity 21: 214–223, 2016
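A minimal sketch (not the authors' implementation) of the copying move described above, assuming agents are fixed-length bit strings and that an agent overwrites a randomly chosen contiguous segment of its strategy with the partner's bits; the string length, segment length, mutation rate, and population size are illustrative choices.

```python
import random

L = 16               # illustrative strategy length
MUTATION_RATE = 0.01

def copy_segment(agent, partner, max_len=4):
    """Return a new strategy in which `agent` copies a random contiguous
    segment of `partner`'s bit string (Axelrod-style imitation)."""
    start = random.randrange(L)
    length = random.randint(1, max_len)
    new = list(agent)
    for i in range(start, min(start + length, L)):
        new[i] = partner[i]
    return new

def mutate(agent):
    """Flip each bit independently with a small probability."""
    return [b ^ 1 if random.random() < MUTATION_RATE else b for b in agent]

# One illustrative interaction: two random agents meet, one imitates the other.
population = [[random.randint(0, 1) for _ in range(L)] for _ in range(50)]
a, b = random.sample(range(len(population)), 2)
population[a] = mutate(copy_segment(population[a], population[b]))
```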

2.
At first we model the way an intelligence “I” constructs statements from phrases, and then how “I” interlocks these statements to form a string of statements to attain a concept. These strings of statements are called progressions. That is, starting with an initial stimulating relation between two phrases, we study how “I” forms the first statement of a progression and continues from this first statement to form the remaining statements, thereby constructing a concept. We assume that “I” retains the progressions that it has constructed. We then show how these retained progressions provide “I” with a platform to incrementally construct more and more sophisticated conceptual structures, and that the purpose of constructing these conceptual structures is to achieve additional concepts. Choice plays a very important role in progression and concept formation. We show that as “I” forms new concepts, it enriches its conceptual structure and makes further concepts attainable. This incremental attainment of concepts is a way in which we humans learn, and this paper studies the attainability of concepts from previously attained concepts. We also study the ability of “I” to apply its progressions and to electively manipulate its conceptual structure to achieve new concepts; application and elective manipulation require of “I” ingenuity and insight. We also show that as “I” attains new concepts, the conceptual structures change and circumstances arise where unanticipated conceptual discoveries become attainable. As the conceptual structure of “I” develops, the logical and structural relationships between concepts embedded in this structure also develop. These relationships help “I” understand concepts in the context of other concepts and help one intelligence “I1” communicate information and concept structures to another “I2”. The conceptual structures formed by “I” give rise to a directed web of statement paths, called a convolution web. The convolution web provides “I” with the paths along which it can reason and obtain new concepts, and with alternative ways to attain a given concept. This paper is an extension of the ideas introduced in [1]. It is written to be self-contained, and the required background is supplied as needed.

3.
The Axelrod model is a spatial stochastic model for the dynamics of cultures which, like the voter model, includes social influence, but differs from the latter by also accounting for another social factor called homophily, the tendency to interact more frequently with individuals who are more similar. Each individual is characterized by its opinions about a finite number of cultural features, each of which can assume the same finite number of states. Pairs of adjacent individuals interact at a rate equal to the fraction of features they have in common, thus modeling homophily, and each interaction results in the pair having one more cultural feature in common, thus modeling social influence. It has been conjectured, based on numerical simulations, that the one-dimensional Axelrod model clusters when the number of features exceeds the number of states per feature. In this article, we prove this conjecture for the two-state model with an arbitrary number of features.
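A minimal sketch of one update of the one-dimensional Axelrod dynamics described above (illustrative, not the paper's proof apparatus): a random neighbor pair interacts with probability equal to the fraction of shared features, and on interaction one disagreeing feature of one neighbor is set to the other's value. The lattice size, `F`, and `q` are illustrative parameters.

```python
import random

N, F, q = 100, 3, 2   # illustrative: lattice size, features, states per feature
lattice = [[random.randrange(q) for _ in range(F)] for _ in range(N)]

def axelrod_step(lattice):
    """One attempted interaction between a random pair of nearest neighbors."""
    i = random.randrange(len(lattice) - 1)
    a, b = lattice[i], lattice[i + 1]
    shared = sum(x == y for x, y in zip(a, b))
    if 0 < shared < F and random.random() < shared / F:    # homophily
        k = random.choice([j for j in range(F) if a[j] != b[j]])
        b[k] = a[k]                                         # social influence

for _ in range(10_000):
    axelrod_step(lattice)
```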

4.
5.
6.
We establish an equivalence of two systems of equations of one-dimensional shallow water models describing the propagation of surface waves over even and sloping bottoms. For each of these systems, we obtain formulas for the general form of their nondegenerate solutions, which are expressible in terms of solutions of the Darboux equation. The invariant solutions of the Darboux equation that we find are the simplest representatives of its essentially different exact solutions (those not related by invertible point transformations). They depend on 21 arbitrary real constants; after “proliferation” formulas derived by methods of group analysis are applied, they generate a 27-parameter family of essentially different exact solutions. Subsequently applying the derived infinitesimal “proliferation” formulas to the solutions in this family generates a denumerable set of exact solutions, whose linear span constitutes an infinite-dimensional vector space of solutions of the Darboux equation. This vector space, together with the general formulas for nondegenerate solutions of the shallow water systems with even and sloping bottoms, yields an infinite set of their solutions, and the “proliferation” formulas for these systems determine further nondegenerate solutions. We also find all degenerate solutions of these systems and thus construct a database of an infinite set of exact solutions of the one-dimensional nonlinear shallow water model with even and sloping bottoms.
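For orientation (the precise systems, their equivalence, and the reduction to the Darboux equation are the subject of the paper), the one-dimensional shallow-water equations over a bottom of elevation $b(x)$ are commonly written as

\[
  h_t + (hu)_x = 0, \qquad u_t + u\,u_x + g\,h_x = -\,g\,b_x ,
\]

where $h$ is the water depth, $u$ the depth-averaged velocity, and $g$ the gravitational acceleration; an even bottom corresponds to constant $b$ and a sloping bottom to linear $b(x)$.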

7.
We discuss differences in mathematical representations of the physical and mental worlds. Following Aristotle, we present the mental space as a discrete, hierarchic, and totally disconnected topological space. One of the basic models of such spaces is given by ultrametric spaces, and more specifically by m-adic trees. We use dynamical systems in such spaces to model flows of unconscious information at different levels of the mental representation hierarchy: for “mental points”, categories, and ideas. Our model can be interpreted as an unconventional computational model: non-algorithmic hierarchic “computations” (identified with the process of thinking at the unconscious level).
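A minimal sketch of the m-adic ultrametric on the integers, the metric underlying the tree-like mental space described above; the map iterated at the end is just one simple example of dynamics on such a space (the value of m, the modulus, and the seed are illustrative).

```python
def m_adic_distance(x: int, y: int, m: int = 2) -> float:
    """Ultrametric distance: m**(-v), where m**v is the largest power of m
    dividing x - y (distance 0 if x == y)."""
    if x == y:
        return 0.0
    d, v = abs(x - y), 0
    while d % m == 0:
        d //= m
        v += 1
    return m ** (-v)

# The strong triangle inequality d(x,z) <= max(d(x,y), d(y,z)):
x, y, z = 8, 24, 5
assert m_adic_distance(x, z) <= max(m_adic_distance(x, y), m_adic_distance(y, z))

# A toy monomial dynamical system x -> x**2 on 2-adic integers, truncated mod 2**10.
state, orbit = 3, []
for _ in range(5):
    state = (state * state) % 2**10
    orbit.append(state)
print(orbit)
```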

8.
The paper studies regularity properties of set-valued mappings between metric spaces. In the context of metric regularity, nonlinear models correspond to nonlinear dependencies of error-bound estimates on residuals. Among the questions addressed in the paper are the equivalence of the corresponding concepts of openness and “pseudo-Hölder” behavior, general and local regularity criteria with special emphasis on “regularity of order $k$” for local settings, and variational methods to estimate regularity moduli when the range space is a length space. The majority of the results presented in the paper are new.
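For reference, a set-valued mapping $F\colon X\rightrightarrows Y$ between metric spaces is called metrically regular near $(\bar x,\bar y)\in\operatorname{gph}F$ with modulus $\kappa>0$ if

\[
  d\bigl(x,F^{-1}(y)\bigr)\le \kappa\, d\bigl(y,F(x)\bigr)
\]

for all $(x,y)$ sufficiently close to $(\bar x,\bar y)$. The nonlinear (“pseudo-Hölder”, order-$k$) models studied in the paper replace the right-hand side by a power of the residual, e.g. $\kappa\, d\bigl(y,F(x)\bigr)^{1/k}$; conventions for the exponent vary across the literature.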

9.
Bisimulations have been widely used in many areas of computer science to model equivalence between various systems, and to reduce the number of states of these systems, whereas uniform fuzzy relations have recently been introduced as a means to model fuzzy equivalence between elements of two possibly different sets. Here we use the conjunction of these two concepts as a powerful tool in the study of equivalence between fuzzy automata. We prove that a uniform fuzzy relation between fuzzy automata A and B is a forward bisimulation if and only if its kernel and co-kernel are forward bisimulation fuzzy equivalence relations on A and B and there is a special isomorphism between the factor fuzzy automata with respect to these fuzzy equivalence relations. As a consequence we get that fuzzy automata A and B are UFB-equivalent, i.e., there is a uniform forward bisimulation between them, if and only if there is a special isomorphism between the factor fuzzy automata of A and B with respect to their greatest forward bisimulation fuzzy equivalence relations. This result reduces the problem of testing UFB-equivalence to the problem of testing isomorphism of fuzzy automata, which is closely related to the well-known graph isomorphism problem. We prove some similar results for backward-forward bisimulations, and we point out fundamental differences. Because of the duality with the studied concepts, backward and forward-backward bisimulations are not considered separately. Finally, we give a comprehensive overview of various concepts on deterministic, nondeterministic, fuzzy, and weighted automata which are related to bisimulations.
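A minimal sketch of the max-min composition of fuzzy relations, the basic operation behind the bisimulation conditions discussed above (the exact forward-bisimulation inequalities are given in the paper; the matrices below are illustrative).

```python
import numpy as np

def maxmin(R: np.ndarray, S: np.ndarray) -> np.ndarray:
    """Max-min composition: (R o S)[i, k] = max_j min(R[i, j], S[j, k])."""
    return np.max(np.minimum(R[:, :, None], S[None, :, :]), axis=1)

# Illustrative fuzzy relations between state sets of sizes 2, 3 and 3, 2.
R = np.array([[0.8, 0.3, 1.0],
              [0.2, 0.9, 0.4]])
S = np.array([[0.5, 0.7],
              [1.0, 0.2],
              [0.6, 0.9]])
print(maxmin(R, S))   # 2x2 matrix of membership degrees
```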

10.
We develop a methodology to efficiently implement the reversible jump Markov chain Monte Carlo (RJ-MCMC) algorithms of Green, applicable for example to model selection inference in a Bayesian framework, building on the “dragging fast variables” ideas of Neal. We call such algorithms annealed importance sampling reversible jump (aisRJ). The proposed procedures can be thought of as exact approximations of idealized RJ algorithms which, in a model selection problem, would sample the model labels only, but cannot be implemented. Central to the methodology is the idea of bridging different models with fictitious intermediate models, whose role is to introduce smooth intermodel transitions and, as we shall see, improve performance. Efficiency of the resulting algorithms is demonstrated on two standard model selection problems, and we show that despite the additional computational effort incurred, the approach can be highly competitive computationally. Supplementary materials for the article are available online.
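A minimal sketch of the annealed importance sampling idea that aisRJ builds on (not the paper's algorithm): a geometric path of intermediate densities bridges two unnormalized targets, and the accumulated incremental weights estimate the ratio of their normalizing constants. The Gaussian targets, the random-walk Metropolis kernel, and the annealing schedule are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p0(x):   # unnormalized "current model" density (illustrative)
    return -0.5 * x**2
def log_p1(x):   # unnormalized "proposed model" density (illustrative)
    return -0.5 * (x - 3.0)**2 / 0.25

def ais_log_weight(n_temps=50, n_mcmc=5, step=0.5):
    """One AIS run along the geometric path log p_b = (1-b) log p0 + b log p1."""
    betas = np.linspace(0.0, 1.0, n_temps + 1)
    x = rng.normal()          # exact draw from the normalized p0 (standard normal)
    logw = 0.0
    for b_prev, b in zip(betas[:-1], betas[1:]):
        logw += (b - b_prev) * (log_p1(x) - log_p0(x))   # incremental weight
        for _ in range(n_mcmc):                          # Metropolis moves at temperature b
            prop = x + step * rng.normal()
            log_acc = ((1 - b) * (log_p0(prop) - log_p0(x))
                       + b * (log_p1(prop) - log_p1(x)))
            if np.log(rng.random()) < log_acc:
                x = prop
    return logw

logws = np.array([ais_log_weight() for _ in range(200)])
# Estimate of log(Z1/Z0) via a log-mean-exp of the accumulated weights:
print(np.logaddexp.reduce(logws) - np.log(len(logws)))
```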

11.
The Axelrod model is a spatial stochastic model for the dynamics of cultures that includes two key social mechanisms: homophily and social influence, respectively defined as the tendency of individuals to interact more frequently with individuals who are more similar and the tendency of individuals to become more similar when they interact. The original model assumes that individuals are located on the vertex set of an interaction network and are characterized by their culture, a vector of opinions about F cultural features, each offering the same number q of alternatives. Pairs of neighbors interact at a rate proportional to the number of cultural features for which they agree, and each interaction results in one more agreement between the two neighbors. In this article, we study a more general and more realistic version of the standard Axelrod model that allows for a variable number of opinions across cultural features, say $q_i$ possible alternatives for the ith cultural feature. Our main result shows that the one-dimensional system with two cultural features fixates when $q_1 + q_2 \ge 6$.

12.
In this paper, we report the results of a series of experiments on a version of the centipede game in which the total payoff to the two players is constant. Standard backward induction arguments lead to a unique Nash equilibrium outcome prediction, which is the same as the prediction made by theories of “fair” or “focal” outcomes. We find that subjects frequently fail to select the unique Nash outcome prediction. This behavior was also observed in McKelvey and Palfrey (1992) in the “growing pie” version of the game they studied; there, however, the Nash outcome was not “fair”, and Pareto improvements were possible by deviating from Nash play, so their findings could be explained by small amounts of altruistic behavior. No Pareto improvements are available in the constant-sum games we examine; hence, explanations based on altruism cannot account for these new data. We examine and compare two classes of models to explain these data. The first class consists of non-equilibrium modifications of the standard “Always Take” model. The other class we investigate, the Quantal Response Equilibrium model, describes an equilibrium in which subjects make mistakes in implementing their best replies and assume other players do so as well. One specification of this model fits the experimental data best among the models we test, and is able to account for all the main features we observe in the data.
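A minimal sketch of the logit quantal response equilibrium concept used above, computed by damped fixed-point iteration for an illustrative 2x2 constant-sum game (the payoff matrix and the precision parameter are arbitrary choices, not the paper's centipede-game specification).

```python
import numpy as np

# Illustrative 2x2 constant-sum game: A[i, j] is player 1's payoff, player 2 gets 4 - A[i, j].
A = np.array([[3.0, 1.0],
              [2.0, 4.0]])
lam = 2.0    # logit precision: 0 means uniform noise, large means near best response

def logit(u):
    e = np.exp(lam * (u - u.max()))
    return e / e.sum()

# Damped fixed-point iteration on the mixed strategies (p of player 1, q of player 2).
p = np.array([0.5, 0.5])
q = np.array([0.5, 0.5])
for _ in range(2000):
    u1 = A @ q               # expected payoff of each row against q
    u2 = (4.0 - A).T @ p     # expected payoff of each column against p
    p = 0.9 * p + 0.1 * logit(u1)
    q = 0.9 * q + 0.1 * logit(u2)

print(p, q)   # approximate logit quantal response equilibrium
```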

13.
We study the representability problem for torsion-free arithmetic matroids. After introducing a “strong gcd property” and a new operation called “reduction”, we describe and implement an algorithm to compute all essential representations, up to equivalence. As a consequence, we obtain an upper bound to the number of equivalence classes of representations. In order to rule out equivalent representations, we describe an efficient way to compute a normal form of integer matrices, up to left-multiplication by invertible matrices and change of sign of the columns (we call it the “signed Hermite normal form”). Finally, as an application of our algorithms, we disprove two conjectures about the poset of layers and the independence poset of a toric arrangement.
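A minimal sketch of the ordinary Hermite normal form computation that the paper's “signed” variant refines, using SymPy's hermite_normal_form. The extra column-sign normalization performed at the end is only a naive illustration, not the paper's signed Hermite normal form algorithm, and the matrix is an illustrative example.

```python
from sympy import Matrix
from sympy.matrices.normalforms import hermite_normal_form

A = Matrix([[2, 3, 5],
            [4, -1, 7],
            [0, 2, 6]])           # illustrative integer matrix

H = hermite_normal_form(A)        # Hermite normal form (SymPy's convention)

# Naive illustrative sign normalization: flip each column whose first
# nonzero entry is negative.  This is NOT the paper's signed HNF algorithm.
cols = []
for j in range(H.cols):
    col = H.col(j)
    nonzero = [x for x in col if x != 0]
    cols.append(-col if nonzero and nonzero[0] < 0 else col)
print(Matrix.hstack(*cols))
```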

14.
A clique (or a complete subgraph) is a popular model for an “ideal” cluster in a network. However, in many practical applications this notion turns out to be overly restrictive, as it requires the existence of all pairwise links within the cluster. Thus, researchers and practitioners often rely on various clique relaxation ideas for more flexible models of highly connected clusters. In this paper, we propose a new clique relaxation model referred to as a small-world subgraph, which represents a network cluster with “small-world” properties: low average distance and high clustering coefficient. In particular, we demonstrate that the proposed small-world subgraph model has better “cohesiveness” characteristics than other existing clique relaxation models in some worst-case scenarios. The main focus of the paper is on the problem of finding a small-world subgraph of maximum cardinality in a given graph. We describe a mixed integer programming (MIP) formulation of the problem along with several algorithmic enhancements. For solving large-scale instances of the problem we propose a greedy-type heuristic referred to as the iterative depth-first search (IDF) algorithm. Furthermore, we show that the small-world subgraphs identified by the IDF algorithm have an additional property that may be attractive from the practical perspective, namely 2-connectivity. Finally, we perform extensive computational experiments on real-world and randomly generated networks to demonstrate the performance of the developed computational approaches, which also reveal interesting insights about the proposed clique relaxation model.
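A minimal sketch (using networkx, with illustrative thresholds) of checking the two defining “small-world” properties on a candidate vertex subset, plus a naive greedy growth loop; this only illustrates the definitions and is neither the MIP formulation nor the IDF heuristic developed in the paper.

```python
import networkx as nx

def is_small_world(G, nodes, max_avg_dist=2.5, min_clustering=0.5):
    """Check the induced subgraph for low average distance and high
    clustering coefficient (thresholds are illustrative)."""
    H = G.subgraph(nodes)
    if len(H) < 3 or not nx.is_connected(H):
        return False
    return (nx.average_shortest_path_length(H) <= max_avg_dist
            and nx.average_clustering(H) >= min_clustering)

def find_triangle(G):
    """Return the vertex set of some triangle in G, used as a greedy seed."""
    for u, v in G.edges:
        common = set(G[u]) & set(G[v])
        if common:
            return {u, v, next(iter(common))}
    return set()

# Naive greedy growth from a triangle seed (not the paper's IDF algorithm).
G = nx.erdos_renyi_graph(60, 0.15, seed=1)
cluster = find_triangle(G)
for v in sorted(G.nodes, key=G.degree, reverse=True):
    if cluster and v not in cluster and is_small_world(G, cluster | {v}):
        cluster |= {v}
print(sorted(cluster))
```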

15.
Utility or value functions play an important role as preference models in multiple-criteria decision making. We investigate the relationships between these models and the decision-rule preference model obtained from the Dominance-based Rough Set Approach. The relationships are established by means of special “cancellation properties” used in conjoint measurement as axioms for the representation of aggregation procedures. We consider a general utility function and three of its important special cases: the associative operator, the Sugeno integral, and the ordered weighted maximum. For each of these aggregation functions we give a representation theorem establishing the equivalence between a very weak cancellation property, the specific utility function, and a set of rough-set decision rules. Each result is illustrated by a simple example of multiple-criteria decision making. The results show that the decision rule model we propose has clear advantages over a general utility function and its particular cases.
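A minimal sketch of two of the aggregation functions named above, the Sugeno integral with respect to a fuzzy measure and the ordered weighted maximum; the criteria scores, the cardinality-based measure, and the weights are illustrative.

```python
from itertools import combinations

def sugeno_integral(values, mu):
    """Sugeno integral of `values` (dict criterion -> score in [0, 1]) with
    respect to the fuzzy measure `mu` (dict frozenset -> weight in [0, 1])."""
    crits = sorted(values, key=values.get)          # ascending by score
    best = 0.0
    for i, c in enumerate(crits):
        upper = frozenset(crits[i:])                # criteria scoring >= values[c]
        best = max(best, min(values[c], mu[upper]))
    return best

def ordered_weighted_max(values, weights):
    """Ordered weighted maximum: max_i min(w_i, x_(i)) with x sorted descending."""
    xs = sorted(values, reverse=True)
    return max(min(w, x) for w, x in zip(weights, xs))

# Illustrative example with three criteria and a cardinality-based measure.
vals = {"price": 0.6, "quality": 0.9, "delivery": 0.4}
mu = {frozenset(s): len(s) / 3 for k in range(4) for s in combinations(vals, k)}
print(sugeno_integral(vals, mu))                       # 0.6
print(ordered_weighted_max([0.6, 0.9, 0.4], weights=[1.0, 0.7, 0.3]))
```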

16.
This paper deals with the development and analysis of well-posed models and computational algorithms for control of a class of partial differential equations that describe the motions of thermo-viscoelastic structures. We first present an abstract “state space” framework and a general well-posedness result that can be applied to a large class of thermo-elastic and thermo-viscoelastic models. This state space framework is then used in the development of a computational scheme for solving an LQR control problem. A detailed convergence proof is provided for the viscoelastic model, and several numerical results are presented to illustrate the theory and to analyze problems for which the theory is incomplete.
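A minimal sketch of the finite-dimensional LQR computation that such approximation schemes ultimately reduce to, using SciPy's continuous-time algebraic Riccati solver on an illustrative two-state system; the thermo-viscoelastic discretization itself is the subject of the paper.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Illustrative 2-state finite-dimensional approximation: dx/dt = A x + B u.
A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)          # state weighting
R = np.array([[0.1]])  # control weighting

P = solve_continuous_are(A, B, Q, R)      # algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)           # optimal feedback gain: u = -K x
print(K)
print(np.linalg.eigvals(A - B @ K))       # closed-loop poles (should be stable)
```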

17.
We bring some market segmentation concepts into the statement of the “new product introduction” problem with Nerlove–Arrow's linear goodwill dynamics. In fact, only a few papers on dynamic quantitative advertising models deal with market segmentation, although this is a fundamental topic of marketing theory and practice. In this way we obtain solutions of some new deterministic optimal control problems and show how such marketing concepts as “targeting” and “segmenting” may find a mathematical representation. We consider two kinds of situations. In the first, we assume that the advertising process can reach each target group selectively. In the second, we assume that a single advertising channel is available and that its effectiveness segment-spectrum is distributed over a non-trivial set of segments. We obtain the explicit optimal solutions of the relevant problems.
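For reference, the Nerlove–Arrow goodwill dynamics mentioned above take the standard linear form

\[
  \dot G(t) = a(t) - \delta\, G(t), \qquad G(0) = G_0,
\]

where $G$ is the goodwill stock, $a(t)$ the advertising effort, and $\delta>0$ a decay rate. In the segmented settings of the paper one such state equation is attached to each target segment, with a segment-specific effectiveness coefficient scaling the advertising term; this is our paraphrase of the two situations described in the abstract.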

18.
Pyrolysis models are developing rapidly, driven both by increased computational power and by new tests used to characterize the behaviour of materials under thermal stresses, tests that have long been used in chemical engineering to obtain reaction rates. These models allow us to characterize either the transient heating of the material (thermal inertia) or the complete chemical scheme of the mass-loss processes (kinetic triplet). A pyrolysis model requires a large number of parameters, which makes finding a suitable set of parameters a difficult optimization task. Two kinds of materials have been investigated: the first was a real material whose mass-loss process was characterized as a “one-step” reaction, and the second as a “two-step” process. Furthermore, the influence of several algorithm features (initial population size, parameter range, crossover influence) on the optimization time and on the accuracy of the results has been analyzed.
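For reference, the “kinetic triplet” mentioned above consists of the pre-exponential factor $A$, the activation energy $E$, and the reaction model $f(\alpha)$ in the standard Arrhenius-type mass-loss rate law

\[
  \frac{d\alpha}{dt} = A\, e^{-E/(RT)}\, f(\alpha),
\]

where $\alpha$ is the conversion (normalized mass loss), $T$ the temperature, and $R$ the gas constant; a “two-step” scheme chains two such reactions. The optimization discussed in the abstract fits these parameters, together with the thermal-inertia properties, to the measured mass-loss curves.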

19.
We combine new concepts of noncooperative coalition theory with an integrated assessment model on climate change to analyze the impact of different protocol designs on the success of coalition formation. We analyze the role of “single versus multiple coalitions,” “open versus exclusive membership,” “no, weak and strong consensus about membership,” and “no transfers versus transfers.” First, we want to find out whether and how modifications of the standard assumptions affect results that are associated with the widely applied cartel formation game in the noncooperative game-theoretic analysis of international environmental agreements. Second, we discuss normative policy conclusions that emerge from the various modifications. Third, we confront our results with evidence on past international environmental treaties and derive an agenda for future research.

20.
Journal of Complexity, 1998, 14(1): 102–121
The real-number model of computation is used in computational geometry, in the approach suggested by Blum, Shub, and Smale, and in information-based complexity. In this paper we propose a refinement of this model, the TTE model of computation. In contrast to the real-number model, which is unrealistic or at least too optimistic, the TTE model is very realistic; i.e., for TTE algorithms there are digital computers which behave exactly as predicted by the theoretical model. We start with a detailed discussion of some objections to the real-number model. We introduce the refined model by adding the condition “every partial input or output information of an algorithm is finite” to the assumptions of the IBC model of computation. First, we explain computability and computational complexity in TTE for the simple case of real functions. Then we apply the refined model to two typical IBC problems, integration and zero-finding on the space C[0, 1] of continuous real functions. We study computability and computational complexity and compare TTE results with IBC results. Finally, we briefly discuss the computation of discontinuous problems. This paper does not present new technical results but should be considered as a fresh impetus to reflect on models of computation for numerical analysis and as an invitation to try out the TTE model of computation in information-based complexity.
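A minimal sketch of the “every partial input or output information is finite” idea: the integrand is accessed only through finitely many function evaluations, yet for Lipschitz integrands the composite midpoint rule carries the guaranteed error bound $L/(4n)$. The integrand, Lipschitz constant, and precision are illustrative; this is our illustration of the flavor of TTE/IBC-style guarantees, not the paper's construction.

```python
from fractions import Fraction

def integrate_lipschitz(f, lipschitz, eps):
    """Approximate the integral of f over [0, 1] to within eps, using only
    finitely many function values (composite midpoint rule).
    For an L-Lipschitz f the error is at most L / (4 n)."""
    n = int(lipschitz / (4 * eps)) + 1          # number of finite queries
    h = Fraction(1, n)
    return sum(f(h * i + h / 2) for i in range(n)) * h, n

# Illustrative integrand |x - 1/2| with Lipschitz constant 1; exact integral is 1/4.
approx, queries = integrate_lipschitz(lambda x: abs(x - Fraction(1, 2)), 1, 0.001)
print(float(approx), queries)
```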
