Similar Documents
20 similar documents found (search time: 31 ms)
1.
We show that Pearl's causal networks can be described using causal compositional models (CCMs) in the valuation-based systems (VBS) framework. One major advantage of the VBS framework is that, because VBS generalizes several uncertainty theories (e.g., probability theory, a version of possibility theory where combination is the product t-norm, Spohn's epistemic belief theory, and Dempster–Shafer belief function theory), CCMs, initially described in probability theory, carry over to all uncertainty calculi that fit in the VBS framework. We describe conditioning and interventions in CCMs. Another advantage of using CCMs in the VBS framework is that both conditioning and intervention can be described in an elegant and unifying algebraic way for the same CCM, without any graphical manipulation of the causal network. We show how conditioning and intervention can be computed for a simple example with a hidden (unobservable) variable, and we illustrate the algebraic results using numerical examples in some of the specific uncertainty calculi mentioned above.
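A minimal sketch in plain Python of the distinction the abstract draws, restricted to the probabilistic special case of a CCM: a confounder Z influences both X and Y, so conditioning P(Y | X=x) and intervention P(Y | do(X=x)) generally disagree. The model and all numbers below are hypothetical, chosen only for illustration.

    # Conditioning vs. intervention in a three-node causal model Z -> X, Z -> Y, X -> Y.
    P_z = {0: 0.6, 1: 0.4}                       # P(Z)
    P_x_given_z = {0: {0: 0.8, 1: 0.2},          # P(X | Z): P_x_given_z[z][x]
                   1: {0: 0.3, 1: 0.7}}
    P_y_given_xz = {(0, 0): 0.1, (0, 1): 0.5,    # P(Y=1 | X=x, Z=z), keyed by (x, z)
                    (1, 0): 0.6, (1, 1): 0.9}

    def p_y1_given_x(x):
        """Ordinary conditioning: P(Y=1 | X=x), where observing X is evidence about Z."""
        num = sum(P_z[z] * P_x_given_z[z][x] * P_y_given_xz[(x, z)] for z in P_z)
        den = sum(P_z[z] * P_x_given_z[z][x] for z in P_z)
        return num / den

    def p_y1_do_x(x):
        """Intervention: P(Y=1 | do(X=x)); the Z -> X edge is cut, so Z keeps its prior."""
        return sum(P_z[z] * P_y_given_xz[(x, z)] for z in P_z)

    print(p_y1_given_x(1))   # 0.81 with these numbers
    print(p_y1_do_x(1))      # 0.72: a genuinely different quantity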

2.
This paper considers the problem of combining belief functions obtained from not necessarily independent sources of information. It introduces two combination rules for the situation in which no assumption is made about the dependence of the information sources. These two rules are based on cautious combinations of plausibility and commonality functions, respectively. The paper studies the properties of these rules and their connection with Dempster’s rules of conditioning and combination and the minimum rule of possibility theory.
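As a hedged illustration of one ingredient of such cautious rules, the sketch below (toy masses, not the paper's code) computes commonality functions q(A) = Σ_{B ⊇ A} m(B) and forms their pointwise minimum. Note that, in general, the minimum of two commonality functions need not itself be a valid commonality function; handling this is part of what the paper analyzes.

    from itertools import combinations

    def powerset(frame):
        """All subsets of the frame, as frozensets."""
        elems = sorted(frame)
        return [frozenset(c) for r in range(len(elems) + 1)
                for c in combinations(elems, r)]

    def commonality(m, frame):
        """q(A) = sum of m(B) over all focal sets B containing A."""
        return {A: sum(v for B, v in m.items() if A <= B) for A in powerset(frame)}

    frame = {'a', 'b', 'c'}
    m1 = {frozenset({'a'}): 0.5, frozenset(frame): 0.5}
    m2 = {frozenset({'a', 'b'}): 0.7, frozenset(frame): 0.3}
    q1, q2 = commonality(m1, frame), commonality(m2, frame)
    q_min = {A: min(q1[A], q2[A]) for A in q1}   # pointwise minimum of commonalities
    print(q_min[frozenset({'a'})], q_min[frozenset({'a', 'b'})])   # 1.0 0.5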

3.
Belief and plausibility functions have been introduced as generalizations of probability measures that abandon the axiom of additivity. It turns out that elementwise multiplication is a binary operation on the set of belief functions. If the set functions of the type considered here are defined on a locally compact and separable space X, a theorem by Choquet ensures that they can be represented by a probability measure on the space containing the closed subsets of X, the so-called basic probability assignment. This representation is the basis for defining two new types of integrals. One of them may be used to measure the degree of non-additivity of the belief or plausibility function; the other is a generalization of the Lebesgue integral. The latter is compared with Choquet's and Sugeno's integrals for non-additive set functions.
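The finite-frame analogue of the representation mentioned above is the standard recovery of Bel and Pl from a basic probability assignment m; a minimal sketch with made-up masses:

    def bel(m, A):
        """Bel(A) = sum of m(B) over focal sets B contained in A."""
        return sum(v for B, v in m.items() if B <= A)

    def pl(m, A):
        """Pl(A) = sum of m(B) over focal sets B intersecting A."""
        return sum(v for B, v in m.items() if B & A)

    m = {frozenset({'a'}): 0.4,
         frozenset({'a', 'b'}): 0.35,
         frozenset({'a', 'b', 'c'}): 0.25}
    A = frozenset({'a', 'b'})
    print(bel(m, A), pl(m, A))   # 0.75 1.0
    # Non-additivity: Bel(A) + Bel(complement of A) = 0.75 + 0.0 <= 1.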

4.
In this paper, belief functions defined on the lattice of intervals of partitions of a set of objects are investigated as a suitable framework for combining multiple clusterings. We first show how to represent clustering results as masses of evidence allocated to sets of partitions. A consensus belief function is then obtained using a suitable combination rule. Tools for synthesizing the results are also proposed. The approach is illustrated using synthetic and real data sets.

5.
It is appropriate to use Dempster's rule for combining belief functions only if the belief functions combined are based on independent items of evidence. What can be done in the case of dependent evidence? Often the answer is to reframe the problem. Three examples are given: one from everyday experience, one from probabilistic relaxation, and one from expert systems.
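For reference, a minimal implementation of the rule in question, Dempster's rule of combination for mass functions on a finite frame (toy masses; the abstract's point is that this step is licensed only for independent evidence):

    def dempster(m1, m2):
        """Dempster's rule: conjunctive combination followed by normalization."""
        combined, conflict = {}, 0.0
        for B, v in m1.items():
            for C, w in m2.items():
                inter = B & C
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + v * w
                else:
                    conflict += v * w        # mass assigned to the empty set
        if conflict >= 1.0:
            raise ValueError("totally conflicting evidence")
        return {A: v / (1.0 - conflict) for A, v in combined.items()}

    m1 = {frozenset({'a'}): 0.6, frozenset({'a', 'b'}): 0.4}
    m2 = {frozenset({'b'}): 0.5, frozenset({'a', 'b'}): 0.5}
    print(dempster(m1, m2))   # {'a'}: 3/7, {'b'}: 2/7, {'a','b'}: 2/7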

6.
It is suggested that there exist many fuzzy set systems, each with its specific pointwise operations for union and intersection. A general law of compound possibilities is valid for all these systems, as well as a general law for representing marginal possibility distributions as unions of fuzzy sets. Max-min fuzzy sets are a special case of a fuzzy set system which uses the pointwise operations of max and min for union and intersection, respectively. Probabilistic fuzzy sets are another special case, using the operations of addition and multiplication. There probably exist infinitely many fuzzy set operations and systems. It is shown why the law of idempotency for intersection is not required for such systems. An essential difference between the meaning of the operations of union and intersection in traditional measure theory, as compared with their meaning in the theory of possibility, is pointed out. The operation of particularization is used to illustrate that the two distinct classical theories of nonfuzzy relations and of probability are merely two aspects of a more general theory of fuzzy sets. It is shown that we must distinguish between particularization of conditional fuzzy sets and of joint fuzzy sets. The concept of restriction of nonfuzzy relations is a special case of particularization of both conditional and joint fuzzy sets. The computation of joint probabilities from conditional and marginal ones is a special case of particularization of conditional probabilistic fuzzy sets. The difference between linguistic modifiers of type 1 and particularizing modifiers is pointed out, as well as a general difference between nouns and adjectives.
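A small sketch of the two systems named in the abstract, max-min versus probabilistic pointwise operations, on hypothetical membership functions; it also shows why idempotency of intersection holds in the first system but not the second:

    def intersect_maxmin(mu_a, mu_b):   # min t-norm
        return {x: min(mu_a[x], mu_b[x]) for x in mu_a}

    def intersect_prob(mu_a, mu_b):     # product ("probabilistic" system)
        return {x: mu_a[x] * mu_b[x] for x in mu_a}

    def union_maxmin(mu_a, mu_b):
        return {x: max(mu_a[x], mu_b[x]) for x in mu_a}

    def union_prob(mu_a, mu_b):         # probabilistic sum
        return {x: mu_a[x] + mu_b[x] - mu_a[x] * mu_b[x] for x in mu_a}

    mu = {'x1': 0.3, 'x2': 0.8}
    print(intersect_maxmin(mu, mu))   # idempotent: returns mu unchanged
    print(intersect_prob(mu, mu))     # not idempotent: {'x1': 0.09, 'x2': 0.64}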

7.
We continue the investigations initiated in the recent papers (Brown et al., The modal logic of Bayesian belief revision, 2017; Gyenis, Standard Bayes logic is not finitely axiomatizable, 2018), where Bayes logics were introduced to study the general laws of Bayesian belief revision. In Bayesian belief revision, a Bayesian agent revises (updates) his prior belief by conditionalizing the prior on some evidence using Bayes' rule. In this paper we take the more general Jeffrey formula as the conditioning device and study the corresponding modal logics, which we call Jeffrey logics, focusing mainly on the countable case. The containment relations among these modal logics are determined, and it is shown that the logics of Bayes and Jeffrey updating are very close. It is shown that the modal logic of belief revision determined by probabilities on a finite or countably infinite set of elementary propositions is not finitely axiomatizable. The significance of this result is that it clearly indicates that axiomatic approaches to belief revision may be severely limited.
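The Jeffrey formula referred to is the standard generalization of Bayes conditioning: given a partition {E_i} whose cells receive new probabilities q_i, the posterior is P'(A) = Σ_i P(A | E_i) q_i. A minimal sketch on a toy three-point space (all numbers hypothetical):

    def jeffrey_update(p, partition, q):
        """Jeffrey's rule: each cell E_i gets new total mass q[i], while
        probabilities inside each cell stay proportional to the prior p."""
        new_p = {}
        for E, qi in zip(partition, q):
            cell_mass = sum(p[w] for w in E)
            for w in E:
                new_p[w] = p[w] * qi / cell_mass
        return new_p

    p = {'w1': 0.2, 'w2': 0.3, 'w3': 0.5}
    partition = [{'w1', 'w2'}, {'w3'}]
    print(jeffrey_update(p, partition, [0.8, 0.2]))
    # {'w1': 0.32, 'w2': 0.48, 'w3': 0.2}
    # Bayes conditioning on E = {w1, w2} is the special case q = [1, 0].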

8.
In a very recent note, Gao and Ni [B. Gao, M.F. Ni, A note on article “The evidential reasoning approach for multiple attribute decision analysis using interval belief degrees”, European Journal of Operational Research, in press, doi:10.1016/j.ejor.2007.10.038] argued that Yen’s combination rule [J. Yen, Generalizing the Dempster–Shafer theory to fuzzy sets, IEEE Transactions on Systems, Man and Cybernetics 20 (1990) 559–570], which normalizes the combination of multiple pieces of evidence at the end of the combination process, was incorrect. If this were the case, the nonlinear programming models we proposed in [Y.M. Wang, J.B. Yang, D.L. Xu, K.S. Chin, The evidential reasoning approach for multiple attribute decision analysis using interval belief degrees, European Journal of Operational Research 175 (2006) 35–66] would also be incorrect. In this reply to Gao and Ni, we re-examine their numerical illustrations and reconsider their analysis of Yen’s combination rule. We conclude that Yen’s combination rule is correct and our nonlinear programming models are valid.
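The point at issue, normalizing once at the end versus after every combination step, can be checked numerically in the crisp Dempster-Shafer special case underlying Yen's fuzzy generalization; there the two orders provably agree. A hedged sketch (toy masses; this illustrates the crisp case only, not Yen's fuzzy rule itself):

    def conjunctive(m1, m2):
        """Unnormalized conjunctive combination (mass may land on the empty set)."""
        out = {}
        for B, v in m1.items():
            for C, w in m2.items():
                A = B & C
                out[A] = out.get(A, 0.0) + v * w
        return out

    def normalize(m):
        k = m.pop(frozenset(), 0.0)   # remove and redistribute the empty-set mass
        return {A: v / (1.0 - k) for A, v in m.items()}

    m1 = {frozenset({'a'}): 0.6, frozenset({'a', 'b'}): 0.4}
    m2 = {frozenset({'b'}): 0.5, frozenset({'a', 'b'}): 0.5}
    m3 = {frozenset({'a'}): 0.3, frozenset({'a', 'b'}): 0.7}

    stepwise = normalize(conjunctive(normalize(conjunctive(m1, m2)), m3))
    at_end   = normalize(conjunctive(conjunctive(m1, m2), m3))
    print(stepwise)   # both print {'a'}: 0.5625, {'b'}: 0.21875, {'a','b'}: 0.21875
    print(at_end)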

9.
In this paper we discuss the semantics and properties of the relative belief transform, a probability transformation of belief functions closely related to the classical plausibility transform. We discuss its rationale in both the probability-bound and Shafer’s interpretations of belief functions. Even though the resulting probability (as is the case for the plausibility transform) is not consistent with the original belief function, an interesting rationale in terms of optimal strategies in a non-cooperative game can be given, in the probability-bound interpretation, to both the relative belief and the relative plausibility of singletons. On the other hand, we prove that relative belief commutes with Dempster’s orthogonal sum, meets a number of properties which are the duals of those met by the relative plausibility of singletons, and commutes with convex closure in a similar way to Dempster’s rule. This supports the argument that the relative plausibility and belief transforms are indeed naturally associated with the D-S framework, and highlights a classification of probability transformations into two families, according to the operator they relate to. Finally, we point out that relative belief is only one member of a class of “relative mass” mappings, which can be interpreted as low-cost proxies for both the plausibility and pignistic transforms.
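The transform itself is simple to state: normalize the singleton beliefs (for a singleton, Bel({x}) equals the mass m({x})). A minimal sketch with a toy mass function:

    def relative_belief(m, frame):
        """Relative belief of singletons: bel({x}) = m({x}), renormalized.
        Assumes some mass is assigned to singletons."""
        b = {x: m.get(frozenset({x}), 0.0) for x in frame}
        total = sum(b.values())
        return {x: v / total for x, v in b.items()}

    frame = {'a', 'b', 'c'}
    m = {frozenset({'a'}): 0.3, frozenset({'b'}): 0.1, frozenset({'a', 'c'}): 0.6}
    print(relative_belief(m, frame))   # {'a': 0.75, 'b': 0.25, 'c': 0.0}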

10.
Propagating belief functions in qualitative Markov trees
This article is concerned with the computational aspects of combining evidence within the theory of belief functions. It shows that by taking advantage of logical or categorical relations among the questions we consider, we can sometimes avoid the computational complexity associated with brute-force application of Dempster's rule. The mathematical setting for this article is the lattice of partitions of a fixed overall frame of discernment. Different questions are represented by different partitions of this frame, and the categorical relations among these questions are represented by relations of qualitative conditional independence or dependence among the partitions. Qualitative conditional independence is a categorical rather than a probabilistic concept, but it is analogous to conditional independence for random variables. We show that efficient implementation of Dempster's rule is possible if the questions or partitions for which we have evidence are arranged in a qualitative Markov tree, a tree in which separations indicate relations of qualitative conditional independence. In this case, Dempster's rule can be implemented by propagating belief functions through the tree.

11.
We consider the problem of aggregating two or more sources of information containing knowledge about a common domain. We propose an aggregation framework for the case where the available information is modelled by coherent lower previsions, corresponding to convex sets of probability mass functions. The consistency between aggregated beliefs and the sources of information is discussed. A closed formula, which specializes our rule to a particular class of models, is also derived. Two applications are finally reported: a possible explanation of Zadeh’s paradox, and an algorithm for estimation fusion in sensor networks.

12.
In this paper we deal with the set of k-additive belief functions dominating a given capacity. We follow the line introduced by Chateauneuf and Jaffray for dominating probabilities and continued by Grabisch for general k-additive measures. First, we show that the conditions for the general k-additive case lead to a very wide class of functions, so that the properties obtained for probabilities are no longer valid; on the other hand, we show that these conditions cannot be improved. We resolve this by imposing additional constraints on the dominating functions. Then, we consider the more restrictive case of k-additive belief functions, for which a similar result with stronger conditions is proved. Although better, this result is not completely satisfactory and, as before, the conditions cannot be strengthened. However, when the initial capacity is a belief function, we find a subfamily of the set of dominating k-additive belief functions from which any other dominating k-additive belief function can be derived, and for which the conditions are even more restrictive, obtaining the natural extension of the result for probabilities. Finally, we apply these results in the fields of Social Welfare Theory and Decision Under Risk.
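The dominance condition itself is easy to check by brute force on a small frame: a set function f dominates a capacity v when f(A) >= v(A) for every nonempty event A. A sketch with a hypothetical toy capacity and a candidate dominating belief function:

    from itertools import combinations

    def events(frame):
        elems = sorted(frame)
        return [frozenset(c) for r in range(1, len(elems) + 1)
                for c in combinations(elems, r)]

    def dominates(f, v, frame):
        """True if f(A) >= v(A) for every nonempty event A."""
        return all(f(A) >= v(A) for A in events(frame))

    frame = {'a', 'b'}
    v = {frozenset({'a'}): 0.2, frozenset({'b'}): 0.3, frozenset(frame): 1.0}
    m = {frozenset({'a'}): 0.3, frozenset({'b'}): 0.3, frozenset(frame): 0.4}
    bel = lambda A: sum(val for B, val in m.items() if B <= A)
    print(dominates(bel, lambda A: v[A], frame))   # True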

13.
In this paper, we propose the plausibility transformation method for translating Dempster–Shafer (D–S) belief function models to probability models, and describe some of its properties. There are many other transformation methods used in the literature for translating belief function models to probability models. We argue that the plausibility transformation method produces probability models that are consistent with D–S semantics of belief function models, and that, in some examples, the pignistic transformation method produces results that appear to be inconsistent with Dempster’s rule of combination.
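Both transforms under comparison have standard closed forms, sketched below on a toy mass function: the plausibility transform normalizes singleton plausibilities, while the pignistic transform splits each focal mass equally among its elements. The two can and do disagree, which is the crux of the abstract's argument.

    def plausibility_transform(m, frame):
        """Pl_P(x) proportional to Pl({x}) = sum of m(A) over focal sets A containing x."""
        pl = {x: sum(v for A, v in m.items() if x in A) for x in frame}
        total = sum(pl.values())
        return {x: v / total for x, v in pl.items()}

    def pignistic_transform(m, frame):
        """BetP(x) = sum over focal sets A containing x of m(A) / |A|."""
        return {x: sum(v / len(A) for A, v in m.items() if x in A) for x in frame}

    frame = {'a', 'b', 'c'}
    m = {frozenset({'a'}): 0.4, frozenset({'b', 'c'}): 0.6}
    print(plausibility_transform(m, frame))  # a: 0.25,  b: 0.375, c: 0.375
    print(pignistic_transform(m, frame))     # a: 0.4,   b: 0.3,   c: 0.3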

14.
Statistical problems were at the origin of the mathematical theory of evidence, or Dempster–Shafer theory. It was also one of the major concerns of Philippe Smets, starting with his PhD dissertation. This subject is reconsidered here, starting with functional models, describing how data is generated in statistical experiments. Inference is based on these models, using probabilistic assumption-based reasoning. It results in posterior belief functions on the unknown parameters. Formally, the information used in the process of inference can be represented by hints. Basic operations on hints are combination, corresponding to Dempster’s rule, and focussing. This leads to an algebra of hints. Applied to functional models, this introduces an algebraic flavor into statistical inference. It emphasizes the view that in statistical inference different pieces of information have to be combined and then focussed onto the question of interest. This theory covers Bayesian and Fisher type inference as two extreme cases of a more general theory of inference.

15.
16.
We obtain a characterization for L^p approximation by analytic functions on compact plane sets which is analogous to Vitushkin's characterization for uniform approximation. For p = 2 this was done by Havin using Cartan's fine topology; we study the general case using quasi-topologies.

17.
This paper studies the reduction of a fuzzy covering and the fusion of multi-fuzzy covering systems based on evidence theory and rough set theory. A novel pair of belief and plausibility functions is defined by employing a non-classical probability model and the approximation operators of a fuzzy covering. We then study the reduction of a fuzzy covering based on the functions we present. For the case of multiple information sources, we present a method of information fusion for multi-fuzzy covering systems, by which objects can be well classified in a fuzzy covering decision system. Finally, using the method of maximum flow, we discuss under what conditions fuzzy covering approximation operators can be induced by a fuzzy belief structure.

18.
This paper discusses an alternative to conditioning that may be used when the probability distribution is not fully specified. It does not require any assumptions (such as CAR: coarsening at random) about the unknown distribution. The well-known Monty Hall problem is the simplest scenario in which neither naive conditioning nor the CAR assumption suffices to determine an updated probability distribution. This paper thus addresses a generalization of that problem to arbitrary distributions on finite outcome spaces, arbitrary sets of ‘messages’, and (almost) arbitrary loss functions, and provides existence and characterization theorems for robust probability updating strategies. We find that for logarithmic loss, optimality is characterized by an elegant condition, which we call RCAR (reverse coarsening at random). Under certain conditions, the same condition also characterizes optimality for a much larger class of loss functions, and we obtain an objective and general answer to how one should update probabilities in the light of new information.
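A Monte Carlo sketch of the baseline scenario the abstract invokes: in the Monty Hall problem, naive conditioning suggests the remaining doors are 50/50, whereas accounting for the host's message protocol gives 2/3 for switching. (The simulation below is the classic setup, not the paper's generalization.)

    import random

    def monty_hall_trials(n=100_000, seed=0):
        rng = random.Random(seed)
        win_switch = 0
        for _ in range(n):
            car = rng.randrange(3)
            pick = 0                               # contestant always picks door 0
            # Host opens a door that is neither the pick nor the car.
            openable = [d for d in range(3) if d != pick and d != car]
            opened = rng.choice(openable)
            switched = next(d for d in range(3) if d not in (pick, opened))
            win_switch += (switched == car)
        return win_switch / n

    print(monty_hall_trials())   # ~0.667: switching wins, vs the naive answer 0.5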

19.
Given a parametric statistical model, evidential methods of statistical inference aim at constructing a belief function on the parameter space from observations. The two main approaches are Dempster's method, which regards the observed variable as a function of the parameter and an auxiliary variable with known probability distribution, and the likelihood-based approach, which considers the relative likelihood as the contour function of a consonant belief function. In this paper, we revisit the latter approach and prove that it can be derived from three basic principles: the likelihood principle, compatibility with Bayes' rule and the minimal commitment principle. We then show how this method can be extended to handle low-quality data. Two cases are considered: observations that are only partially relevant to the population of interest, and data acquired through an imperfect observation process.
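The likelihood-based approach mentioned here is easy to sketch: the contour function is the relative likelihood pl(theta) = L(theta) / sup L, and by consonance the plausibility of a composite hypothesis is the supremum of the contour over it. A toy binomial example with hypothetical data (7 successes in 10 trials), evaluated on a grid:

    def relative_likelihood_contour(k, n, grid_size=11):
        """Contour pl(theta) = L(theta) / sup L for k successes in n binomial trials."""
        thetas = [i / (grid_size - 1) for i in range(grid_size)]
        lik = [t**k * (1 - t)**(n - k) for t in thetas]
        top = max(lik)
        return {round(t, 2): l / top for t, l in zip(thetas, lik)}

    contour = relative_likelihood_contour(k=7, n=10)
    print(contour)   # peaks at theta = 0.7 with pl = 1.0
    # Plausibility of the composite hypothesis theta <= 0.5 (consonance: take the sup):
    print(max(v for t, v in contour.items() if t <= 0.5))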

20.
Based on the setting of exchangeable bets, this paper proposes a subjectivist view of numerical possibility theory. It relies on the assumption that when an agent constructs a probability measure by assigning prices to lotteries, this probability measure is actually induced by a belief function representing the agent’s actual state of knowledge. We also assume that the probability measure proposed by the agent in the course of the elicitation procedure is constructed via the so-called pignistic transformation (mathematically equivalent to the Shapley value in game theory). We pose and solve the problem of finding the least informative belief function having a given pignistic probability. We prove that it is unique and consonant, thus induced by a possibility distribution. This result exploits a simple informational ordering, in agreement with partial orderings between belief functions, comparing their information content. The obtained possibility distribution is subjective in the same sense as in the subjectivist school in probability theory. However, we claim that it is the least biased representation of the agent’s state of knowledge compatible with the observed betting behaviour.
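A hedged sketch of the resulting possibility distribution, assuming the closed form pi(x) = sum over y of min(p(y), p(x)) for the consonant solution of the inverse-pignistic problem (toy pignistic probabilities; one can check that the induced consonant belief function has exactly p as its pignistic transform):

    def least_committed_possibility(p):
        """Possibility distribution of the least informative consonant belief
        function whose pignistic transform is p: pi(x) = sum_y min(p(y), p(x))."""
        return {x: sum(min(p[y], p[x]) for y in p) for x in p}

    p = {'a': 0.5, 'b': 0.3, 'c': 0.2}
    print(least_committed_possibility(p))
    # {'a': 1.0, 'b': 0.8, 'c': 0.6}: normalized (max = 1) and pi >= p pointwise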
