Similar Articles
20 similar articles found
1.
Bonanno (Logics and the foundations of game and decision theory, Amsterdam University Press, Amsterdam, 2008) provides an epistemic characterization for the solution concept of iterated deletion of inferior strategy profiles (IDIP) by embedding strategic-form games with ordinal payoffs in non-probabilistic epistemic models which are built on Kripke frames. In this paper, we will follow the event-based approach to epistemic game theory and supplement strategic games with type space models, where each type is associated with a preference relation on the state space. In such a framework, IDIP can be characterized by the conditions that at least one player has correct beliefs about the state of the world and that there is common belief that every player is rational, has correct beliefs about the state of the world and has strictly monotone preferences. Moreover, we shall compare the epistemic motivations for IDIP and its mixed-strategy variant known as strong rationalizability (SR). Suppose the above conditions hold. Whenever there is also common belief that players' preferences are representable by some expected utility function, IDIP still applies. But if there is common belief that players' preferences are representable by some expected payoff function, then SR results.

2.
There are many conceptualizations and formalizations of decision making. In this paper we compare classical decision theory with qualitative decision theory, knowledge-based systems and belief–desire–intention models developed in artificial intelligence and agent theory. They all contain representations of information and motivation. Examples of informational attitudes are probability distributions, qualitative abstractions of probabilities, knowledge, and beliefs. Examples of motivational attitudes are utility functions, qualitative abstractions of utilities, goals, and desires. Each of them encodes a set of alternatives to be chosen from. This ranges from a small predetermined set, a set of decision variables, through logical formulas, to branches of a tree representing events through time. Moreover, they have a way of formulating how a decision is made. Classical and qualitative decision theory focus on the optimal decisions represented by a decision rule. Knowledge-based systems and belief–desire–intention models focus on an alternative conceptualization to formalize decision making, inspired by cognitive notions like belief, desire, goal and intention. Relations among these concepts express an agent type, which constrains the deliberation process. We also consider the relation between decision processes and intentions, and the relation between game theory and norms and commitments.

3.
Most decision models for handling vague and imprecise information are unnecessarily restrictive since they do not allow for discrimination between different beliefs in different values. This is true for classical utility theory as well as for the various interval methods that have prevailed. To allow for more refined estimates, we suggest a framework designed for evaluating decision situations considering beliefs in sets of epistemically possible utility and probability functions, as well as relations between them. The various beliefs are expressed using different kinds of belief distributions. We show that the use of such distributions allows for representation principles that do not require overly strong data aggregation, while still admitting efficient evaluation of decision situations.

4.
Distances between possible worlds play an important role in logic-based knowledge representation (especially in belief change, reasoning about action, belief merging and similarity-based reasoning). We show here how they can be used for representing in a compact and intuitive way the preference profile of an agent, following the principle that, given a goal G, the closer a world w is to a model of G, the better w is. We give an integrated logical framework for preference representation which handles weighted goals and distances to goals in a uniform way. Then we argue that the widely used Hamming distance (which merely counts the number of propositional symbols assigned a different value by two worlds) is generally too rudimentary and too syntax-sensitive to be suitable in real applications; therefore, we propose a new family of distances based on Choquet integrals, within which the Hamming distance occupies a position very similar to that of the arithmetic mean in the class of Choquet integrals.
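To make the distance-based reading of goals concrete, here is a minimal Python sketch of the Hamming distance between propositional worlds and of the distance from a world to the closest model of a goal. The three symbols, the goal, and all names are illustrative assumptions, not material from the paper.

```python
from itertools import product

def hamming(w1, w2):
    """Number of propositional symbols assigned a different value by two worlds."""
    return sum(1 for p in w1 if w1[p] != w2[p])

def distance_to_goal(w, goal_models):
    """Distance from world w to the closest model of goal G."""
    return min(hamming(w, m) for m in goal_models)

# Illustrative example over three propositional symbols a, b, c.
symbols = ["a", "b", "c"]
worlds = [dict(zip(symbols, vals)) for vals in product([0, 1], repeat=3)]

# Hypothetical goal G = a AND b; its models are the worlds satisfying it.
goal_models = [w for w in worlds if w["a"] == 1 and w["b"] == 1]

# The closer a world is to a model of G, the better it is (smaller distance).
for w in worlds:
    print(w, "->", distance_to_goal(w, goal_models))
```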

5.
One common principle in the study of belief is what has been called the "consensual validation of reality": the idea that persons in highly inbred social networks alter their beliefs regarding the external world by repeated interaction with each other rather than by direct observation. This notion accounts for phenomena such as panics, in which a substantial number of actors in a given population suddenly converge to (typically unsubstantiated) beliefs. In this paper, a Bayesian conditional probability model will be used to explore the conditions necessary for such outcomes, and alternative outcomes will likewise be documented. Finally, suggestions for operationalization of the Bayesian model in experimental research will be given, along with some implications of the theory for common phenomena such as the propagation of ideas by media sources, organizational rumors, and polarization of group opinion.
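As a rough illustration of the kind of Bayesian conditioning involved (not the paper's specific model), the following Python sketch shows how repeated, weakly diagnostic reports from peers can push a belief toward near-certainty; the prior and the likelihoods are illustrative assumptions.

```python
def bayes_update(prior, p_report_given_true, p_report_given_false):
    """Posterior probability of H after observing one peer assert H."""
    num = p_report_given_true * prior
    den = num + p_report_given_false * (1.0 - prior)
    return num / den

# Illustrative assumptions: peers in an inbred network mostly echo each other,
# so a report is only weakly more likely when H is actually true.
prior = 0.2
p_true, p_false = 0.6, 0.4

belief = prior
for round_ in range(1, 11):
    belief = bayes_update(belief, p_true, p_false)
    print(f"after report {round_}: P(H) = {belief:.3f}")
```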

6.
This paper describes a novel experimental method for determining the value of different types of information to military decision makers. The experimental method used a simple scenario and a set of serials constructed from cards, each carrying a single piece of information, presented sequentially. Each of a number of pairs of players was taken through the scenario and asked to judge when they would make each of a pair of escalating responses to the situation. The data proved well suited to analysis using a probit model and are consistent with the hypothesis of a Bayesian decision mechanism with normally distributed 'action points'. The methodology allowed the determination of weights for each of a number of different classes of information, together with estimates of the human and situational elements of variation, including estimates of the 'prior belief' of the different pairs of players.
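A hedged sketch of fitting a probit model with normally distributed 'action points' on synthetic data; the data-generating numbers and variable names are illustrative assumptions, not the experiment's data.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic stand-in for the experiment: cumulative "weight of information"
# presented so far (x) and whether a pair has escalated by that point (y).
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 10, size=n)
action_point = rng.normal(loc=5.0, scale=2.0, size=n)   # normally distributed thresholds
y = (x > action_point).astype(int)                       # escalate once the threshold is passed

X = sm.add_constant(x)
model = sm.Probit(y, X).fit(disp=False)
print(model.params)                      # intercept and slope of the probit
print(-model.params[0] / model.params[1])  # implied mean action point, about 5 here
```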

7.
This paper deals with repeated nonsymmetric congestion games in which the players cannot observe their payoffs at each stage. Examples of applications come from sharing facilities by multiple users. We show that these games present a unique Pareto optimal Nash equilibrium that dominates all other Nash equilibria and is consequently also the social optimum among all equilibria, as it minimizes the sum of all the players' costs. We assume that the players adopt a best response strategy. At each stage, they construct their beliefs concerning the others' probable behavior, and then simultaneously make a decision by optimizing their payoff based on their beliefs. Within this context, we provide a consensus protocol that allows the convergence of the players' strategies to the Pareto optimal Nash equilibrium. The protocol allows each player to construct its belief by exchanging only some aggregate but sufficient information with a restricted number of neighboring players. Such a networked information structure has the advantages of being scalable to systems with a large number of players and of reducing each player's data exposure to the competitors.
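For intuition only, here is a minimal best-response dynamics sketch for a two-resource congestion game in Python. It omits the paper's networked consensus protocol and the unobserved-payoff setting, and the linear cost functions and player count are illustrative assumptions.

```python
# Best-response dynamics in a toy congestion game with two shared resources.
COST_RATE = {"A": 1.0, "B": 2.0}   # per-unit-load cost of each resource (illustrative)

def best_response_dynamics(n_players=6, max_rounds=50):
    choice = {i: "A" for i in range(n_players)}           # arbitrary initial profile
    for _ in range(max_rounds):
        changed = False
        for i in range(n_players):
            loads = {r: sum(1 for j in choice if choice[j] == r) for r in COST_RATE}
            # Cost player i would face on each resource if it were there.
            options = {r: COST_RATE[r] * (loads[r] if choice[i] == r else loads[r] + 1)
                       for r in COST_RATE}
            best = min(options, key=options.get)
            if options[best] < options[choice[i]]:
                choice[i], changed = best, True
        if not changed:                                    # no player can improve: Nash equilibrium
            break
    return choice

print(best_response_dynamics())
```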

8.
A qualitative approach to decision making under uncertainty has been proposed in the setting of possibility theory, which is based on the assumption that levels of certainty and levels of priority (for expressing preferences) are commensurate. In this setting, pessimistic and optimistic decision criteria have been formally justified. This approach has been transposed into possibilistic logic in which the available knowledge is described by formulas which are more or less certainly true and the goals are described in a separate prioritized base. This paper adapts the possibilistic logic handling of qualitative decision making under uncertainty to the Answer Set Programming (ASP) setting. We show how weighted beliefs and prioritized preferences belonging to two separate knowledge bases can be handled in ASP by modeling qualitative decision making in terms of abductive logic programming where (uncertain) knowledge about the world and prioritized preferences are encoded as possibilistic definite logic programs and possibilistic literals respectively. We provide ASP-based and possibilistic ASP-based algorithms for calculating optimal decisions and utility values according to the possibilistic decision criteria. We describe a prototype implementing the proposed algorithms on top of different ASP solvers and we discuss the complexity of the different implementations.
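For reference, the pessimistic and optimistic possibilistic decision criteria mentioned above, in their usual Dubois–Prade form (a hedged restatement; the paper's notation may differ):

```latex
% Here $\pi_d(\omega)$ is the possibility of state $\omega$ when decision $d$ is taken,
% and $\mu(\omega)$ the (commensurate) degree to which $\omega$ satisfies the prioritized goals.
U_{\mathrm{pes}}(d) \;=\; \min_{\omega \in \Omega} \max\bigl(1 - \pi_d(\omega),\; \mu(\omega)\bigr),
\qquad
U_{\mathrm{opt}}(d) \;=\; \max_{\omega \in \Omega} \min\bigl(\pi_d(\omega),\; \mu(\omega)\bigr).
```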

9.
The concepts of substantive beliefs and derived beliefs are defined, along with a set of substantive beliefs S treated as an open set and the neighborhood of an element (a substantive belief). A semantic operation of conjunction is defined with the structure of an Abelian group. Mathematical structures such as poset beliefs and join-semilattice beliefs exist. A metric space of beliefs and a distance between beliefs that depends on the believer are defined, as are the concepts of closed and open balls. S′ is defined as a subgroup of the metric space of beliefs Σ, and S′ is a totally bounded set. The term s (substantive belief) is defined in terms of the closure of S′. It is deduced that Σ is paracompact by Stone's theorem. A pseudometric space of beliefs is defined to show how the metric of the non-believing subject yields a topological space, a non-material abstract ideal space formed in the mind of the believing subject, satisfying the Kuratowski closure axioms. To establish patterns of materialization of beliefs, we consider that beliefs have well-defined mathematical structures. This allows a better understanding of cultural processes such as text, architecture, norms, and education, which are forms of the materialization of an ideology. This materialization is the conversion, by means of certain mathematical correspondences, of an abstract set whose elements are beliefs or ideas into an impure set whose elements are material or energetic. Text is a materialization of ideology. © 2013 Wiley Periodicals, Inc. Complexity 19: 46–62, 2013
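For reference, the Kuratowski closure axioms invoked above, in their standard form (a hedged restatement; the paper's exact formulation may differ):

```latex
% Kuratowski closure axioms for an operator K on subsets A, B of a space X.
K(\varnothing) = \varnothing, \qquad
A \subseteq K(A), \qquad
K(A \cup B) = K(A) \cup K(B), \qquad
K(K(A)) = K(A).
```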

10.
Recent research demonstrates that in many countries gender differences in mathematics achievement have virtually disappeared. Expectancy-value theory and social cognition theory both predict that if gender differences in achievement have declined there should be a similar decline in gender differences in self-beliefs. Extant literature is equivocal: there are studies indicating that the male-over-female advantage in self-efficacy and beliefs about math learning is as strong as ever, and there are studies reporting an absence of gender differences in belief. Using data from 996 Canadian students in grades 7–10, we found that gender differences in beliefs continued, even though gender differences in achievement were near zero. Gender differences, favoring males, were larger for self-beliefs (math self-efficacy and fear of failure) and weaker for functional and dysfunctional beliefs about math learning. There were also gender differences in the structure of a model linking beliefs about math, beliefs about self and achievement.

11.
In this paper we study the model of decision under uncertainty consistent with confidence preferences. In that model, a decision maker's beliefs are represented by a fuzzy set of priors and tastes are captured by a standard affine utility index on consequences. First, we find some interesting properties concerning the well-known maxmin expected utility model, taking into account the point of view of the confidence preferences model. Further, we provide new examples of preferences that capture ambiguity-averse attitudes weaker than the ambiguity attitudes featured by maxmin expected utility theory. Finally, we discuss the axiomatic foundations for the confidence preferences model with optimistic behavior.
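For orientation, the maxmin expected utility representation and a commonly cited confidence-preferences form are sketched below; the notation (set of priors C, fuzzy set of priors φ, confidence level α) is an assumption and may differ from the paper's.

```latex
% Maxmin expected utility over a set of priors C, and a confidence-preferences
% representation with a fuzzy set of priors \varphi and confidence level \alpha.
V_{\mathrm{maxmin}}(f) \;=\; \min_{p \in C} \int_S u\bigl(f(s)\bigr)\,\mathrm{d}p(s),
\qquad
V_{\mathrm{conf}}(f) \;=\; \min_{\{p \,:\, \varphi(p) \ge \alpha\}} \frac{1}{\varphi(p)} \int_S u\bigl(f(s)\bigr)\,\mathrm{d}p(s).
```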

12.
The objective of analysing a company's risk exposures is to gain an understanding of the risks that the company faces. Only then can the likely level of future losses be estimated, and decisions about how best to manage these risks be made. To gain a full understanding, we first need to adjust for a number of external factors to ensure that all data are on a consistent basis. The historic data can then be analysed and the level of variability determined. After identifying appropriate probability distributions for the frequency and severity of the risks, simulations can be run to make forecasts. Once forecasts have been made, the best way to manage and finance the risks can be considered. As such decisions typically depend upon many factors, utility theory can be used to summarize the advantage that the company will obtain from each alternative in a given situation. This will involve defining a utility function for the company. Methods of eliciting these utility functions exist, including influence diagrams. Decision theory can consequently be applied to determine the best course of action using the company's utility function and its beliefs about the future. Uncertainty inherent in the information can therefore be incorporated in the decision process rather than be ignored. The decision will also depend upon the ability of the company to sustain a loss from retained risks and regulatory requirements relating to the risks.
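A minimal Python sketch of the frequency/severity simulation described above; the Poisson frequency, the lognormal severity, and the percentile reported are illustrative assumptions, not figures from the paper.

```python
import numpy as np

# Simulate annual aggregate losses: a Poisson count of losses per year, each loss
# drawn from a lognormal severity distribution (illustrative parameters).
rng = np.random.default_rng(42)

def simulate_annual_losses(n_years=10_000, freq_mean=3.0, sev_mu=10.0, sev_sigma=1.2):
    counts = rng.poisson(freq_mean, size=n_years)                 # losses per year
    return np.array([rng.lognormal(sev_mu, sev_sigma, size=k).sum() for k in counts])

losses = simulate_annual_losses()
print("expected annual loss:", losses.mean())
print("99.5th percentile (a possible retention/capital benchmark):",
      np.percentile(losses, 99.5))
```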

13.
Beliefs constitute a central part of a person’s professional competencies and are crucial to the perception of situations as they influence our choice of actions. This paper focuses on epistemological beliefs about the nature of mathematics of future primary teachers from an international perspective. The data reported are part of a larger sample originating from the TEDS-M study which compares primary mathematics teacher education in 15 countries. In this paper we examine the pattern of beliefs of future teachers aiming to teach mathematics at primary level. We explore whether and to what extent beliefs concerning the nature of mathematics are influenced by cultural factors, in our case the extent to which a country’s culture can be characterized by an individualistic versus collectivistic orientation according to Hofstede’s terminology. In the first part of the paper, the literature on epistemological beliefs is reviewed and the role of culture and individualism/collectivism on the formation of beliefs concerning the nature of mathematics will be discussed. In the empirical part, means and distributions of belief ratings will be reported. Finally, multilevel analyses explore how much of the variation of belief preferences between countries can be explained by the individualistic orientation of a country.

14.
The Dempster–Shafer theory of evidence is applied to a multiattribute decision-making problem where the decision maker must determine which of several products/services have the best opportunity for success in a competitive marketplace. Multiattribute decisions are generally constrained by the uncertainty inherent in assessing the relative importance of each attribute element that is needed for success and the evaluation of the product/service to be introduced. The relative importance of each attribute element deemed necessary for success is assessed by the decision maker as a goal to be met. The evaluation of each product/service is addressed through expert opinion about the degree to which each element is contained in each product/service. Then the belief and plausibility that a product/service will satisfy the decision maker's goal are calculated. The decision to introduce a product or service depends on the evaluation of the anticipated loss from introduction of a product/service into a competitive market.
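A minimal Python sketch of computing belief and plausibility from a basic probability assignment, in the spirit of the Dempster–Shafer evaluation described above; the frame of discernment, the attributes, and the mass values are illustrative assumptions.

```python
# Belief and plausibility of a hypothesis given a mass function over focal elements.

def bel(mass, hypothesis):
    """Belief: total mass of focal elements entirely contained in the hypothesis."""
    return sum(m for focal, m in mass.items() if focal and focal <= hypothesis)

def pl(mass, hypothesis):
    """Plausibility: total mass of focal elements intersecting the hypothesis."""
    return sum(m for focal, m in mass.items() if focal & hypothesis)

# Hypothetical frame: which success attributes does the product satisfy?
frame = frozenset({"quality", "price", "timing"})
mass = {
    frozenset({"quality"}): 0.4,              # evidence pointing only to quality
    frozenset({"quality", "price"}): 0.3,     # evidence for quality-or-price
    frame: 0.3,                               # remaining ignorance
}

H = frozenset({"quality", "price"})
print("Bel(H) =", bel(mass, H))   # 0.7
print("Pl(H)  =", pl(mass, H))    # 1.0
```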

15.
Standard type spaces induce belief structures defined by precise beliefs. This paper proposes and analyzes simple procedures for constructing perturbations of such belief structures in which beliefs have a degree of ambiguity. Specifically, we construct ambiguous type spaces whose induced (ambiguous) belief hierarchies approximate the standard, precise, belief hierarchies corresponding to the initial type space. Based on a metric that captures the resulting approximation, two alternative procedures to construct such perturbations are introduced, and are shown to yield a simple and intuitive characterization of convergence to the initial unperturbed environment. As a special case, one of these procedures is shown to characterize the set of all finite perturbations. The introduced perturbations and their convergence properties provide conceptual foundations for the analysis of robustness to ambiguity of various solution concepts, and for various decision rules under ambiguity.

16.
We investigate the make-buy decision of a manufacturer who does not know its potential suppliers’ capabilities. In order to mitigate the consequences of this limited knowledge, the manufacturer can either perform in-house or audit suppliers. An audit reveals the audited supplier’s capability such that the manufacturer can base the make-buy decision on the audit outcome; the manufacturer might also learn from the audit and update its beliefs about the capabilities of the unaudited suppliers. Interestingly, using a very general model we find that the manufacturer’s decision can be independent of both the number of available suppliers and of the mechanism it uses to update its beliefs after an audit. We illustrate our general model with a possible application in which a manufacturer makes outsource and audit decisions when the suppliers are more cost-effective. However, when outsourcing to a supplier, the manufacturer faces uncertainty about whether the delivered task will integrate well with the other parts of the project.

17.
We give in this paper new results on merging operators. These operators aim to define the goals (or beliefs) of a group of agents from the individual agents' goals (beliefs). Using the logical framework of Konieczny and Pino Pérez [Proceedings of the Fifth European Conference on Symbolic and Quantitative Approaches to Reasoning with Uncertainty (ECSQARU'99), Lecture Notes in Artificial Intelligence 1638, 1999, pp. 233–244], we study the relationships between two important sub-families of merging operators: majority operators and arbitration operators. An open question was whether these two families are disjoint. We show that there are operators that belong simultaneously to the two families. Furthermore, the new family introduced allows the user to choose the “consensual level” wanted for the majority operator. We show at the end of this work some relationships between logical belief merging operators and social choice rules.

18.
This paper extends the theory of belief functions by introducing new concepts and techniques that make it possible to model situations in which the beliefs held by a rational agent may only be expressed (or are only known) with some imprecision. Central to our approach is the concept of interval-valued belief structure (IBS), defined as a set of belief structures verifying certain constraints. Starting from this definition, many other concepts of Evidence Theory (including belief and plausibility functions, pignistic probabilities, combination rules and uncertainty measures) are generalized to cope with imprecision in the belief numbers attached to each hypothesis. An application of this new framework to the classification of patterns with partially known feature values is demonstrated.

19.
A belief is reflectively lucky if it is a matter of luck that the belief is true, given what a subject is aware of on reflection alone. Various epistemologists have argued that any adequate theory of knowledge should eliminate reflective luck, but doing so has proven difficult. This article distinguishes between two kinds of reflective luck arguments in the literature: local arguments and global arguments. It argues that local arguments are best interpreted as demanding, not that one be reflectively aware of the reliability of the sources of one’s beliefs, but that one’s beliefs be attributable to one as one’s own. The article then argues that global arguments make illegitimate demands, because they require that we be ultimately answerable for our beliefs. In the end, the article argues that epistemologists should shift their focus away from reflective luck and toward the conditions under which beliefs are attributable to cognitive agents.

20.
This paper considers varieties of probabilism capable of distilling paradox-free qualitative doxastic notions (e.g., full belief, expectation, and plain belief) from a notion of probability taken as a primitive. We show that core systems, collections of nested propositions expressible in the underlying algebra, can play a crucial role in these derivations. We demonstrate how the notion of a probability core can be naturally generalized to high probability, giving rise to what we call a high probability core, a notion that, when formulated in terms of classical monadic probability, coincides with the notion of stability proposed by Hannes Leitgeb [32]. Our work continues earlier work by one of us in collaboration with Rohit Parikh [7]. In turn, the latter work was inspired by the seminal work of Bas van Fraassen [46]. We argue that the adoption of dyadic probability as a primitive (as articulated by van Fraassen [46]) admits a smoother connection with the standard theory of probability cores as well as a better model in which to situate doxastic notions like full belief. We also illustrate how the basic structure underlying a system of cores naturally leads to alternative probabilistic acceptance rules, like the so-called ratio rule initially proposed by Isaac Levi [34]. Core systems in their various guises are ubiquitous in many areas of formal epistemology (e.g., belief revision, the semantics of conditionals, modal logic). We argue that core systems can also play a natural and important role in Bayesian epistemology and decision theory. In fact, the final part of the article shows that probabilistic core systems are naturally derivable from basic decision-theoretic axioms which incorporate only qualitative aspects of core systems; that the qualitative aspects of core systems alone can be naturally integrated in the articulation of coherence of primitive conditional probability; and that the guiding idea behind the primary qualitative features of a core system gives rise to the formulation of lexicographic decision rules.
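For reference, the stability notion attributed to Leitgeb above is usually stated as follows (a hedged restatement; the article's generalization to high probability cores may differ in detail):

```latex
% A proposition A is P-stable iff it stays more probable than not under conditioning
% on any evidence consistent with it.
A \text{ is } P\text{-stable} \iff P(A \mid B) > \tfrac{1}{2}
\quad \text{for all } B \text{ with } B \cap A \neq \varnothing \text{ and } P(B) > 0.
```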
