Similar Literature
20 similar documents found (search time: 171 ms)
1.
The theory of belief functions is a generalization of probability theory; a belief function is a set function more general than a probability measure but whose values can still be interpreted as degrees of belief. Dempster's rule of combination is a rule for combining two or more belief functions; when the belief functions combined are based on distinct or “independent” sources of evidence, the rule corresponds intuitively to the pooling of evidence. As a special case, the rule yields a rule of conditioning which generalizes the usual rule for conditioning probability measures. The rule of combination was studied extensively, but only in the case of finite sets of possibilities, in the author's monograph A Mathematical Theory of Evidence. The present paper describes the rule for general, possibly infinite, sets of possibilities. We show that the rule preserves the regularity conditions of continuity and condensability, and we investigate the two distinct generalizations of probabilistic independence which the rule suggests.
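For finite frames of discernment, Dempster's rule admits a compact sketch; the infinite-set case the paper treats requires measure-theoretic machinery beyond this, and the frame and masses used below are illustrative only:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions over a finite frame by Dempster's rule.

    m1, m2: dicts mapping frozenset focal elements to masses summing to 1.
    """
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass that would fall on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    # normalize by 1 - K, where K is the total conflicting mass
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

# Pooling two "independent" bodies of evidence over the frame {x, y}:
pooled = dempster_combine(
    {frozenset({'x'}): 0.6, frozenset({'x', 'y'}): 0.4},
    {frozenset({'y'}): 0.3, frozenset({'x', 'y'}): 0.7},
)
```

The special case in which one argument assigns all its mass to a single subset recovers the rule of conditioning mentioned above.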

2.
Graphical models are efficient and simple ways to represent dependencies between variables. In this paper we introduce so-called belief causal networks, where dependencies are uncertain causal links and the uncertainty is represented by belief masses. Through these networks, we propose to represent the results of passively observing the spontaneous behavior of the system and also to evaluate the effects of external actions. Since interventions are very useful for representing causal relations, we propose to compute their effects using a generalization of the “do” operator. Even though the belief chain rule differs from the Bayesian chain rule, we show that the joint distributions of the altered structures used to graphically describe interventions are equivalent. This paper also addresses new issues that arise when handling interventions: we argue that in real-world applications external manipulations may be imprecise, and we show that they have a natural encoding in the belief function framework.
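The probabilistic "do" operator that the paper generalizes can be illustrated by back-door adjustment on a three-node network with a confounder (Z → X, Z → Y, X → Y); the belief-function version replaces these point probabilities with masses. All names and numbers below are hypothetical:

```python
def p_y_given_do_x(p_z, p_y_given_xz, x):
    """Back-door adjustment: P(Y=1 | do(X=x)) = sum_z P(z) * P(Y=1 | x, z).

    Graph surgery severs the arc Z -> X, so the confounder Z keeps its
    marginal distribution instead of being conditioned on X.
    """
    return sum(p_z[z] * p_y_given_xz[(x, z)] for z in p_z)

# Hypothetical binary example: intervene to set X = 1.
effect = p_y_given_do_x({0: 0.4, 1: 0.6}, {(1, 0): 0.2, (1, 1): 0.9}, 1)
```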

3.
In Robust Bayesian analysis one attempts to avoid the ‘Dogma of Precision’ in Bayesian analysis by entertaining a set of probability distributions instead of exactly one. The algebraic approach to plausibility calculi is inspired by Cox's and Jaynes' analyses of plausibility assessment as a logic of uncertainty. In the algebraic approach one is not so much interested in different ways to prove that precise Bayesian probability is inevitable, but rather in how different sets of assumptions are reflected in the resulting plausibility calculus. It has repeatedly been pointed out that a partially ordered plausibility domain is more appropriate than a totally ordered one, but it has not yet been completely resolved exactly what such domains can look like. One such domain is the natural robust Bayesian representation, an indexed family of probabilities. We show that every plausibility calculus embeddable in a partially ordered ring is equivalent to a subring of a product of ordered fields, i.e., the robust Bayesian representation is universal under our assumptions if extended rather than standard probability is used. We also show that this representation has at least the same expressiveness as coherent sets of desirable gambles with real-valued payoffs, for a finite universe.
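A minimal sketch of how an indexed family of probabilities is used in practice: lower and upper expectations of a gamble are computed by ranging over the family. This toy finite family is illustrative, not the paper's ring-theoretic construction:

```python
def expectation_bounds(family, gamble):
    """Lower and upper expectations of a gamble over an indexed family
    of probability distributions (the robust Bayesian representation)."""
    values = [sum(p[s] * gamble[s] for s in gamble) for p in family]
    return min(values), max(values)

# A two-member family over states {a, b} and a simple gamble:
lo, hi = expectation_bounds(
    [{'a': 0.2, 'b': 0.8}, {'a': 0.6, 'b': 0.4}],
    {'a': 10.0, 'b': 0.0},
)
```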

4.
An integrated approach to truth-gaps and epistemic uncertainty is described, based on probability distributions defined over a set of three-valued truth models. This combines the explicit representation of borderline cases with both semantic and stochastic uncertainty, in order to define measures of subjective belief in vague propositions. Within this framework we investigate bridges between probability theory and fuzziness in a propositional logic setting. In particular, when the underlying truth model is from Kleene's three-valued logic, we provide a complete characterisation of compositional min–max fuzzy truth degrees. For classical and supervaluationist truth models we find partial bridges, with min and max combination rules only recoverable on a fragment of the language. Across all of these different types of truth valuations, min–max operators result exactly in those cases in which there is only uncertainty about the relative sharpness or vagueness of the interpretation of the language.
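Kleene's strong three-valued connectives, with truth degrees encoded as 0, 0.5 and 1, give exactly the compositional min–max operators discussed above:

```python
# Kleene's strong three-valued logic:
# 0.0 = false, 0.5 = borderline, 1.0 = true.
def k_not(a):
    return 1.0 - a

def k_and(a, b):
    return min(a, b)

def k_or(a, b):
    return max(a, b)
```

Note that a borderline proposition conjoined with a true one stays borderline, and negation fixes the borderline value, as Kleene's truth tables require.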

5.
Whilst supported by compelling arguments, the representation of uncertainty by means of (subjective) probability does not enjoy unanimous consensus. A substantial part of the relevant criticism points to its alleged inadequacy for representing ignorance as opposed to uncertainty. The purpose of this paper is to show how a strong justification for taking belief as probability, namely the Dutch Book argument, can be extended naturally so as to provide a logical characterization of coherence for imprecise probability, a framework widely believed to accommodate some fundamental features of reasoning under ignorance. The appropriate logic for our purposes is an algebraizable logic whose equivalent algebraic semantics is a variety of MV-algebras with an additional internal unary operation representing upper probability (these algebras will be called UMV-algebras).

6.
7.
8.
Probability theory has become the standard framework in the field of mobile robotics because of the inherent uncertainty associated with sensing and acting. In this paper, we show that the theory of belief functions, with its ability to distinguish between different types of uncertainty, can provide significant advantages over probabilistic approaches in the context of robotics. We do so by presenting solutions to the essential problems of simultaneous localization and mapping (SLAM) and planning based on belief functions. For SLAM, we show how the joint belief function over the map and the robot's poses can be factored and efficiently approximated using a Rao-Blackwellized particle filter, resulting in a generalization of the popular probabilistic FastSLAM algorithm. Our SLAM algorithm produces occupancy grid maps in which belief functions explicitly represent additional information about missing and conflicting measurements compared to probabilistic grid maps. This SLAM algorithm rests on forward and inverse sensor models, and we present general evidential models for range sensors such as sonar and laser scanners. Using the generated evidential grid maps, we show how optimal decisions can be made for path planning and active exploration. To demonstrate the effectiveness of our evidential approach, we apply it to two real-world datasets in which a mobile robot has to explore unknown environments and solve different planning problems. Finally, we provide a quantitative evaluation and show that the evidential approach outperforms a probabilistic one in terms of both map quality and navigation performance.
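The evidential occupancy-grid idea can be sketched for a single cell carrying masses on "occupied", "free" and the whole frame ("unknown"), fused by Dempster's rule. This toy update is not the paper's forward/inverse sensor model; the masses below are hypothetical:

```python
def combine_cell(m1, m2):
    """Fuse two evidential states of one grid cell with Dempster's rule.

    Each state is (m_occ, m_free, m_unknown), where m_unknown is the
    mass assigned to the whole frame {occupied, free}.
    """
    o1, f1, u1 = m1
    o2, f2, u2 = m2
    k = o1 * f2 + f1 * o2              # conflicting mass
    norm = 1.0 - k
    occ = (o1 * o2 + o1 * u2 + u1 * o2) / norm
    free = (f1 * f2 + f1 * u2 + u1 * f2) / norm
    unknown = (u1 * u2) / norm
    return occ, free, unknown

# Two range measurements hitting the same cell (hypothetical masses):
occ, free, unknown = combine_cell((0.6, 0.0, 0.4), (0.5, 0.2, 0.3))
```

The surviving mass on the whole frame is what lets such a map distinguish "never observed" from "observed but conflicting", unlike a probabilistic grid.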

9.
Civil engineering projects and designs are commonly developed in a systems framework that includes different types of uncertainty. In general, uncertainty can be of the ambiguity or vagueness type. The theory of probability and statistics has been used extensively in civil engineering to deal with the ambiguity type of uncertainty, while the theory of fuzzy sets and systems has been used to model the vagueness type. In this paper, the role of fuzzy sets in civil engineering systems is described using several example applications, e.g., quality assessment of wildlife habitat, construction engineering and management, structural reliability, and damage assessment of existing structures.
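As a generic illustration of the vagueness modeling described (not a construction from the paper), a triangular membership function grades a measured quantity into a vague class; the breakpoints below are hypothetical:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 outside (a, c), rising to 1 at b.

    Hypothetical use: grading a crack width x (in mm) into a vague
    class such as 'moderate damage' with breakpoints a, b, c.
    """
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)
```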

10.
The main goal of this paper is to describe a new graphical structure called ‘Bayesian causal maps’ to represent and analyze domain knowledge of experts. A Bayesian causal map is a causal map, i.e., a network-based representation of an expert’s cognition. It is also a Bayesian network, i.e., a graphical representation of an expert’s knowledge based on probability theory. Bayesian causal maps enhance the capabilities of causal maps in many ways. We describe how the textual analysis procedure for constructing causal maps can be modified to construct Bayesian causal maps, and we illustrate it using a causal map of a marketing expert in the context of a product development decision.
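The Bayesian-network side of such a causal map rests on the chain-rule factorization of the joint distribution; for a three-node chain A → B → C with hypothetical conditional probability tables:

```python
def joint(p_a, p_b_a, p_c_b, a, b, c):
    """Chain-rule factorization P(a, b, c) = P(a) P(b|a) P(c|b)
    for a three-node causal map A -> B -> C."""
    return p_a[a] * p_b_a[(b, a)] * p_c_b[(c, b)]

# Hypothetical CPTs for binary variables; keys are (child, parent).
p_a = {0: 0.7, 1: 0.3}
p_b_a = {(0, 0): 0.6, (1, 0): 0.4, (0, 1): 0.2, (1, 1): 0.8}
p_c_b = {(0, 0): 0.5, (1, 0): 0.5, (0, 1): 0.1, (1, 1): 0.9}
```

Because each factor is a proper conditional table, the eight joint values sum to one, which is what makes the graph a compact encoding of the expert's full distribution.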

11.
This paper investigates a model of decision making under uncertainty comprising opposite epistemic states of complete ignorance and probability. In the first part, a new utility theory under complete ignorance is developed that combines Hurwicz–Arrow's theory of decision under ignorance with Anscombe–Aumann's idea of reversibility and monotonicity used to characterize subjective probability. The main result is a representation theorem for preference under ignorance by a particular one-parameter function – the τ-anchor utility function. In the second part, we study decision making under uncertainty comprising an ignorant variable and a probabilistic variable. We show that even if the variables are independent, they are not reversible in Anscombe–Aumann's sense. This insight leads to a new proposal for decision under uncertainty represented by a preference relation that satisfies the weak-order and monotonicity assumptions but rejects the reversibility assumption. A distinctive feature of the new proposal is that the certainty equivalent of a mapping from the state space of uncertain variables to the prize space depends on the order in which the variables are revealed. Explicit modeling of the order of variables explains some of the puzzles in the multiple-prior model and in models for decision making with Dempster–Shafer belief functions.
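The Hurwicz–Arrow ingredient can be sketched by the classical α-criterion, which anchors a decision's value between its worst and best outcomes; the paper's τ-anchor utility refines this idea rather than coinciding with it:

```python
def hurwicz(outcomes, alpha):
    """Classical Hurwicz alpha-criterion under complete ignorance:
    alpha weights the best possible outcome, 1 - alpha the worst."""
    return alpha * max(outcomes) + (1.0 - alpha) * min(outcomes)
```

For alpha = 0 this reduces to the pessimistic maximin rule, and for alpha = 1 to the optimistic maximax rule.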

12.
An argument graph is a graph where each node denotes an argument, and each arc denotes an attack by one argument on another. It offers a valuable starting point for theoretical analysis of argumentation following the proposals by Dung. However, the definition of an argument graph does not take into account the belief in the attacks. In particular, when constructing an argument graph from informal arguments, where each argument is described in free text, it is often evident that there is uncertainty about whether some of the attacks hold. This might be because there is some expressed doubt that an attack holds or because there is some imprecision in the language used in the arguments. In this paper, we use the set of spanning subgraphs of an argument graph as a sample space. A spanning subgraph contains all the arguments, and a subset of the attacks, of the argument graph. We assign a probability value to each spanning subgraph such that the sum of the assignments is 1. This means we can reflect the uncertainty over which is the actual subgraph using this probability distribution. Using the probability distribution over subgraphs, we can then determine the probability that a set of arguments is admissible or an extension. We can also obtain the probability of an attack relationship in the original argument graph as a marginal distribution (i.e. it is the sum of the probability assigned to each subgraph containing that attack relationship). We investigate some of the features of this proposal, and we consider the utility of our framework for capturing some practical argumentation scenarios.
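One concrete way to assign the subgraph probabilities (the framework itself allows any distribution summing to 1) is to include each attack independently; the marginal probability of an attack is then recovered by summation, as in this sketch:

```python
from itertools import combinations

def subgraph_distribution(attacks, p_attack):
    """Distribution over spanning subgraphs, assuming each attack is kept
    independently with probability p_attack[a].  Independence is one
    illustrative choice, not required by the framework."""
    dist = {}
    atts = list(attacks)
    for r in range(len(atts) + 1):
        for kept in combinations(atts, r):
            p = 1.0
            for a in atts:
                p *= p_attack[a] if a in kept else 1.0 - p_attack[a]
            dist[frozenset(kept)] = p
    return dist

def attack_marginal(dist, attack):
    """P(attack holds): sum over subgraphs containing that attack."""
    return sum(p for sub, p in dist.items() if attack in sub)
```

With independent inclusion the marginal of each attack simply recovers its inclusion probability, confirming the summation formula quoted in the abstract.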

13.
We consider the problem of deciding the best action time when observations are made sequentially. Specifically, we address a special type of optimal stopping problem in which observations are made from state-contingent distributions and there exists uncertainty about the state. In this paper, the decision-maker's belief about the state is revised sequentially based on the previous observations. By using the independence property of the observations from a given distribution, the sequential Bayesian belief revision process is represented in a simple recursive form. The methodology developed in this paper provides a new theoretical framework for addressing state uncertainty in the action-timing problem context. By conducting a simulation analysis, we demonstrate the value of applying the Bayesian strategy, which uses the sequential belief revision process. In addition, we evaluate the value of perfect information to gain more insight into the effects of using the Bayesian strategy in the problem.
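The recursive form of sequential belief revision is ordinary Bayesian updating fed back into itself; the states, observations and likelihoods below are toy values, not the paper's model:

```python
def revise(prior, likelihood, obs):
    """One step of recursive belief revision over states:
    posterior(s) is proportional to prior(s) * P(obs | s)."""
    post = {s: prior[s] * likelihood[s][obs] for s in prior}
    z = sum(post.values())
    return {s: p / z for s, p in post.items()}

# Toy model: two states, observations 'h'/'t' independent given the state.
like = {'s1': {'h': 0.8, 't': 0.2}, 's2': {'h': 0.3, 't': 0.7}}
belief = {'s1': 0.5, 's2': 0.5}
for o in ['h', 'h']:          # each posterior becomes the next prior
    belief = revise(belief, like, o)
```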

14.
Partially consonant belief functions (pcb), studied by Walley, are the only class of Dempster-Shafer belief functions that are consistent with the likelihood principle of statistics. Structurally, the set of foci of a pcb is partitioned into non-overlapping groups and, within each group, foci are nested. The pcb class includes both probability functions and Zadeh’s possibility functions as special cases. This paper studies decision making under uncertainty described by pcb. We prove a representation theorem for a preference relation over pcb lotteries satisfying an axiomatic system similar in spirit to von Neumann and Morgenstern’s axioms of linear utility theory. The closed-form expression for the utility of a pcb lottery is a combination of linear utility for the probabilistic lottery and two-component (binary) utility for the possibilistic lottery. In our model, the uncertainty information, risk attitude and ambiguity attitude are represented separately. A tractable technique for extracting ambiguity attitude from a decision maker’s behavior is also discussed.

15.
Traditionally, an insurance risk process describes an insurance company’s risk through some criteria using historical data, under the framework of probability theory, with the prerequisite that the estimated distribution function is close enough to the true frequency. However, because of the complexity and changeability of the world, and for economic and technological reasons, in many cases enough historical data are unavailable and we must rely on belief degrees given by domain experts. This motivates us to include human uncertainty in the insurance risk process by regarding interarrival times and claim amounts as uncertain variables within uncertainty theory. Noting the expansion of insurance companies’ operation scale and the increase of businesses with different risk natures, in this paper we extend the uncertain insurance risk process with a single class of claims to one with multiple classes of claims, and derive expressions for the ruin index and the uncertainty distribution of the ruin time. As the ruin time can be infinite, we propose a proper uncertain variable and its corresponding proper uncertainty distribution. Some numerical examples are given to illustrate our results. Finally, our method is applied to a real-world problem with satellite insurance data provided by the global insurance brokerage MARSH.

16.
We describe the Dempster–Shafer belief structure and provide some of its basic properties. We introduce the plausibility and belief measures associated with a belief structure, and note that these are not the only measures that can be associated with one. We describe a general approach for generating a class of measures associated with a belief structure using a monotonic function on the unit interval, called a weight-generating function. We study a number of these functions and the measures that result, and show how to use weight-generating functions to obtain dual measures from a belief structure. We show the role of belief structures in representing imprecise probability distributions, and describe the use of dual measures, other than plausibility and belief, to provide alternative bounding intervals for the imprecise probabilities associated with a belief structure. We investigate the problem of decision making under belief-structure uncertainty and discuss two approaches to this decision problem. The first is based on the expected value of the OWA aggregation of the payoffs associated with the focal elements; the second uses the Choquet integral of a measure generated from the belief structure. We show the equivalence of these approaches.
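The plausibility and belief measures, the two endpoints that weight-generating functions interpolate between, can be computed directly from a belief structure; the focal elements and masses below are illustrative:

```python
def belief(m, event):
    """Bel(A): total mass of focal elements contained in A."""
    return sum(w for focal, w in m.items() if focal <= event)

def plausibility(m, event):
    """Pl(A): total mass of focal elements intersecting A."""
    return sum(w for focal, w in m.items() if focal & event)

# Illustrative belief structure over the frame {x, y, z}:
m = {frozenset({'x'}): 0.5,
     frozenset({'x', 'y'}): 0.3,
     frozenset({'y', 'z'}): 0.2}
```

The interval [Bel(A), Pl(A)] is the bounding interval for the imprecise probability of A, and the duality Pl(A) = 1 - Bel(complement of A) is the relationship the dual measures generalize.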

17.
In this paper we present a generalization of belief functions over fuzzy events. In particular we focus on belief functions defined in the algebraic framework of finite MV-algebras of fuzzy sets. We introduce a fuzzy modal logic to formalize reasoning with belief functions on many-valued events. We prove, among other results, that several different notions of belief functions can be characterized in a quite uniform way, just by slightly modifying the complete axiomatization of one of the modal logics involved in the definition of our formalism.

18.
We define a class of “algebraic” random matrices. These are random matrices for which the Stieltjes transform of the limiting eigenvalue distribution function is algebraic, i.e., it satisfies a (bivariate) polynomial equation. The Wigner and Wishart matrices whose limiting eigenvalue distributions are given by the semicircle law and the Marčenko–Pastur law are special cases. Algebraicity of a random matrix sequence is shown to act as a certificate of the computability of the limiting eigenvalue density function. The limiting moments of algebraic random matrix sequences, when they exist, are shown to satisfy a finite depth linear recursion so that they may often be efficiently enumerated in closed form. In this article, we develop the mathematics of the polynomial method which allows us to describe the class of algebraic matrices by its generators and map the constructive approach we employ when proving algebraicity into a software implementation that is available for download in the form of the RMTool random matrix “calculator” package. Our characterization of the closure of algebraic probability distributions under free additive and multiplicative convolution operations allows us to simultaneously establish a framework for computational (noncommutative) “free probability” theory. We hope that the tools developed allow researchers to finally harness the power of infinite random matrix theory.
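The moment enumeration that algebraicity enables is easiest to see for the semicircle law, whose even limiting moments are the Catalan numbers, generated by a finite-depth linear recursion. This is a standard fact, sketched here outside the RMTool package itself:

```python
def semicircle_moments(n):
    """Limiting moments m_0 .. m_{2n} of Wigner's semicircle law
    (support [-2, 2]): odd moments vanish and m_{2k} is the k-th
    Catalan number, via the recursion C_k = C_{k-1} * 2(2k-1)/(k+1)."""
    catalan = [1]
    for k in range(1, n + 1):
        catalan.append(catalan[-1] * 2 * (2 * k - 1) // (k + 1))
    return [0 if j % 2 else catalan[j // 2] for j in range(2 * n + 1)]
```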

19.
Statistical inference about unknown parameter values that have known constraints is a challenging problem for both frequentist and Bayesian methods. As an alternative, inferential models created with the weak belief method can generate inferential results with desirable frequency properties for constrained parameter problems. To accomplish this, we propose an extension of weak belief called the elastic belief method. Compared to an existing rule for conditioning on constraint information, the elastic belief method produces more efficient probabilistic inference while maintaining desirable frequency properties. The application of this new method is demonstrated in two well-studied examples: inference about a nonnegative quantity measured with Gaussian error and inference about the signal rate of a Poisson count with a known background rate. Compared to several previous interval-forming methods for the constrained Poisson signal rate, the new method gives an interval with better coverage probability or a simpler construction. More importantly, the inferential model provides a post-data predictive measure of uncertainty about the unknown parameter value that is not inherent in other interval-forming methods.

20.
We review the method of spin tomography of quantum states in which we use the standard probability distribution functions to describe spin projections on selected directions, which provides the same information about states as is obtained by the density matrix method. In this approach, we show that satisfaction or violation of Bell's inequalities can be understood as properties of tomographic functions for joint probability distributions for two spins. We compare results obtained using the methods of classical probability theory with those obtained in the framework of traditional quantum mechanics. Translated from Teoreticheskaya i Matematicheskaya Fizika, Vol. 146, No. 1, pp. 172–185, January, 2006.
