Similar Articles
20 similar articles found.
1.
We establish a formula for combining dependent evidential bodies that are conditionally independent given their shared knowledge. Markov examples are provided to illustrate various aspects of our combination formula, including its practicality. We also show that the Dempster–Shafer formula and the conjunctive rule of the Transferable Belief Model can be recovered as special cases of our combination formula.
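The Dempster–Shafer formula recovered as a special case is the standard conjunctive pooling of two mass functions followed by normalization of the conflicting mass. A minimal sketch over a finite frame (the example masses are illustrative, not from the paper):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule for two mass functions given as dicts mapping
    frozenset focal sets to masses: conjunctive combination, then
    normalization that redistributes the conflicting mass."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass assigned to the empty intersection
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence: combination undefined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Two bodies of evidence on the frame {'a', 'b', 'c'}
m1 = {frozenset({'a'}): 0.6, frozenset({'a', 'b', 'c'}): 0.4}
m2 = {frozenset({'b'}): 0.3, frozenset({'a', 'b', 'c'}): 0.7}
m12 = dempster_combine(m1, m2)  # conflict 0.18 is renormalized away
```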

2.
We study a new approach to statistical prediction in the Dempster–Shafer framework. Given a parametric model, the random variable to be predicted is expressed as a function of the parameter and a pivotal random variable. A consonant belief function in the parameter space is constructed from the likelihood function, and combined with the pivotal distribution to yield a predictive belief function that quantifies the uncertainty about the future data. The method boils down to Bayesian prediction when a probabilistic prior is available. The asymptotic consistency of the method is established in the iid case, under some assumptions. The predictive belief function can be approximated to any desired accuracy using Monte Carlo simulation and nonlinear optimization. As an illustration, the method is applied to multiple linear regression.
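The Monte Carlo approximation can be sketched for the simplest instance of this construction: a normal mean with unit variance, where Y = theta + Z with Z ~ N(0, 1) and the relative likelihood gives closed-form alpha-cuts for theta. All model choices below (unit variance, iid normal data, the event Y <= c) are illustrative assumptions, not the paper's general algorithm:

```python
import math
import random

def predictive_bel_pl(data, c, n_mc=50_000, seed=0):
    """Monte Carlo sketch of the predictive belief/plausibility that a
    future observation satisfies Y <= c, for the illustrative model
    Y = theta + Z with Z ~ N(0, 1) and iid N(theta, 1) data.

    The consonant belief function on theta has contour
    pl(theta) = exp(-n * (theta - xbar)**2 / 2), so its alpha-cut is
    the interval xbar +/- sqrt(-2 * ln(alpha) / n)."""
    rng = random.Random(seed)
    n = len(data)
    xbar = sum(data) / n
    bel = pl = 0
    for _ in range(n_mc):
        alpha = 1.0 - rng.random()                 # uniform on (0, 1]
        half = math.sqrt(-2.0 * math.log(alpha) / n)
        z = rng.gauss(0.0, 1.0)                    # pivotal variable
        lo, hi = xbar - half + z, xbar + half + z  # focal interval for Y
        bel += hi <= c    # focal set contained in {Y <= c}
        pl += lo <= c     # focal set intersects {Y <= c}
    return bel / n_mc, pl / n_mc

bel, pl = predictive_bel_pl([0.1, -0.2, 0.3, 0.0], c=0.5)
```

The belief/plausibility pair brackets the Bayesian predictive probability one would obtain with a probabilistic prior.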

3.
The book that launched the Dempster–Shafer theory of belief functions appeared 40 years ago. This intellectual autobiography looks back on how I came to write the book and how its ideas played out in my later work.

4.
Knowledge about the quality of a source can take several forms: it may for instance relate to its truthfulness or to its relevance, and may even be uncertain. Of particular interest in this paper is that such knowledge may also be contextual; for instance, the reliability of a sensor may be known to depend on the actual object observed. Various tools, called correction mechanisms, have been developed within the theory of belief functions to take into account knowledge about the quality of a source. Yet only a single tool is available to account for contextual knowledge about the quality of a source, and specifically about its relevance. This is a lack of flexibility, since contextual knowledge about the quality of a source need not be restricted to its relevance. The first aim of this paper is thus to enlarge the set of tools available in belief function theory for dealing with contextual knowledge about source quality. This aim is achieved by (1) providing an interpretation for each of two contextual correction mechanisms introduced initially from purely formal considerations, and (2) deriving extensions, essentially by uncovering contextual forms, of two interesting non-contextual correction mechanisms. The second aim of this paper concerns the origin of contextual knowledge about the quality of a source: due to the lack of dedicated approaches, it is not clear how to obtain such specific knowledge in practice. A sound, easy-to-interpret and computationally simple method is therefore provided to learn from data the contextual knowledge associated with the contextual correction mechanisms studied in this paper.
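The non-contextual baseline that such correction mechanisms generalize is classical discounting, which scales every focal mass by a reliability coefficient and transfers the remainder to total ignorance. A minimal sketch (the example masses are illustrative):

```python
def discount(m, alpha, frame):
    """Classical discounting of a mass function m (dict: frozenset -> mass):
    a source judged reliable with probability alpha keeps a fraction alpha
    of each focal mass; the remaining 1 - alpha is transferred to the
    whole frame, i.e. to total ignorance."""
    omega = frozenset(frame)
    out = {a: alpha * v for a, v in m.items() if a != omega}
    out[omega] = alpha * m.get(omega, 0.0) + (1.0 - alpha)
    return out

# A source asserting 'a' with mass 0.8, discounted at reliability 0.9
m = {frozenset({'a'}): 0.8, frozenset({'a', 'b'}): 0.2}
md = discount(m, 0.9, {'a', 'b'})  # {a}: 0.72, {a, b}: 0.28
```

A contextual mechanism, by contrast, would let alpha vary with the true value being reported; the sketch above uses one global coefficient.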

5.
Hidden Markov fields (HMFs) have been successfully used in many areas to take spatial information into account. In such models, the hidden process of interest X is a Markov field that is to be estimated from an observable process Y. Such estimation is possible because the conditional distribution of the hidden process given the observed one remains Markovian. The latter property remains valid when the pairwise process (X,Y) is Markov; such models, called pairwise Markov fields (PMFs), have been shown to offer larger modeling capabilities while exhibiting similar processing cost. Further extensions lead to a family of more general models called triplet Markov fields (TMFs), in which the triplet (U,X,Y) is Markov, where U is an underlying process that may have different meanings according to the application. A link has also been established between these models and the theory of evidence, opening new possibilities for achieving Dempster–Shafer fusion in the Markov field context. The aim of this paper is to propose a unifying general formalism allowing all conventional modeling and processing possibilities regarding information imprecision, sensor unreliability and data fusion in the Markov field context. The generality of the proposed formalism is shown theoretically through illustrative examples dealing with image segmentation, and experimentally on hand-drawn and SAR images.

6.
7.
This note is a rejoinder to comments by Dubois and Moral about my paper “Likelihood-based belief function: justification and some extensions to low-quality data” published in this issue. The main comments concern (1) the axiomatic justification for defining a consonant belief function in the parameter space from the likelihood function and (2) the Bayesian treatment of statistical inference from uncertain observations, when uncertainty is quantified by belief functions. Both issues are discussed in this note, in response to the discussants' comments.

8.
A method is proposed to quantify uncertainty on statistical forecasts using the formalism of belief functions. The approach is based on two steps. In the estimation step, a belief function on the parameter space is constructed from the normalized likelihood given the observed data. In the prediction step, the variable Y to be forecasted is written as a function of the parameter θ and an auxiliary random variable Z with known distribution not depending on the parameter, a model initially proposed by Dempster for statistical inference. Propagating beliefs about θ and Z through this model yields a predictive belief function on Y. The method is demonstrated on the problem of forecasting innovation diffusion using the Bass model, yielding a belief function on the number of adopters of an innovation in some future time period, based on past adoption data.
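The Bass model itself gives the cumulative number of adopters in closed form, which is the quantity the predictive belief function bounds. A sketch with illustrative parameters (not estimated from real adoption data):

```python
import math

def bass_cumulative(t, p, q, M):
    """Cumulative number of adopters at time t under the Bass diffusion
    model: p is the coefficient of innovation, q the coefficient of
    imitation, and M the market potential."""
    e = math.exp(-(p + q) * t)
    return M * (1.0 - e) / (1.0 + (q / p) * e)

# Illustrative parameters, not estimates from real adoption data
adopters_year5 = bass_cumulative(5, p=0.03, q=0.38, M=1_000_000)
```

In the paper's setting, beliefs about (p, q, M) propagated through this function would yield lower and upper bounds on future adoption rather than a single point forecast.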

9.
In machine learning problems, the availability of several classifiers trained on different data or features makes the combination of pattern classifiers of great interest. To combine distinct sources of information, it is necessary to represent the outputs of classifiers in a common space via a transformation called calibration. The most classical approach is to use class membership probabilities. However, a single probability measure may be insufficient to model the uncertainty induced by the calibration step, especially when training data are scarce. In this paper, we extend classical probabilistic calibration methods to the evidential framework. Experimental results from the calibration of SVM classifiers show the benefit of using belief functions in classification problems.

10.
Given a parametric statistical model, evidential methods of statistical inference aim at constructing a belief function on the parameter space from observations. The two main approaches are Dempster's method, which regards the observed variable as a function of the parameter and an auxiliary variable with known probability distribution, and the likelihood-based approach, which considers the relative likelihood as the contour function of a consonant belief function. In this paper, we revisit the latter approach and prove that it can be derived from three basic principles: the likelihood principle, compatibility with Bayes' rule and the minimal commitment principle. We then show how this method can be extended to handle low-quality data. Two cases are considered: observations that are only partially relevant to the population of interest, and data acquired through an imperfect observation process.
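In the likelihood-based approach, the contour function is the relative likelihood pl(theta) = L(theta) / L(theta_hat), and for a consonant belief function the plausibility of a composite hypothesis is the supremum of the contour over it. A sketch for a binomial sample (the grid-based supremum is an illustrative simplification):

```python
def relative_likelihood_binomial(k, n):
    """Contour function pl(theta) = L(theta) / L(theta_hat) for a binomial
    sample with k successes in n trials; the MLE is theta_hat = k / n."""
    def L(theta):
        return theta ** k * (1.0 - theta) ** (n - k)
    l_max = L(k / n)
    return lambda theta: L(theta) / l_max

def plausibility(contour, thetas):
    """Plausibility of the composite hypothesis {theta in thetas}: for a
    consonant belief function, Pl(A) = sup of the contour over A
    (approximated here by a maximum over a finite grid)."""
    return max(contour(t) for t in thetas)

pl = relative_likelihood_binomial(7, 10)
# Plausibility that theta lies in [0, 0.5], via a coarse grid
pl_low_half = plausibility(pl, [i / 100 for i in range(51)])
```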

11.
The theory of belief functions is a generalization of probability theory; a belief function is a set function more general than a probability measure but whose values can still be interpreted as degrees of belief. Dempster's rule of combination is a rule for combining two or more belief functions; when the belief functions combined are based on distinct or “independent” sources of evidence, the rule corresponds intuitively to the pooling of evidence. As a special case, the rule yields a rule of conditioning which generalizes the usual rule for conditioning probability measures. The rule of combination was studied extensively, but only in the case of finite sets of possibilities, in the author's monograph A Mathematical Theory of Evidence. The present paper describes the rule for general, possibly infinite, sets of possibilities. We show that the rule preserves the regularity conditions of continuity and condensability, and we investigate the two distinct generalizations of probabilistic independence which the rule suggests.

12.
In insurance, the analyst is often faced with a large number of inter-related variables for which correlations need to be estimated. Clearly, all correlations lie in the interval [-1, 1], but the numbers cannot be assigned independently. Here, the choices left to the analyst are considered from both a geometric and a probabilistic viewpoint. In practice, the underwriter or risk manager may fix some of the correlations, and this paper considers, from both these viewpoints, what effect this has on the analyst's choice of correlations between the remaining variables.
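For three variables the constraint is explicit: once two correlations are fixed, the third must lie in an interval determined by positive semidefiniteness of the 3x3 correlation matrix. A sketch of that interval (a standard fact, not this paper's general result):

```python
import math

def rho13_bounds(rho12, rho23):
    """Feasible interval for the remaining correlation rho13 once rho12
    and rho23 are fixed, so that the 3x3 correlation matrix remains
    positive semidefinite (equivalently, its determinant is >= 0)."""
    slack = math.sqrt((1.0 - rho12 ** 2) * (1.0 - rho23 ** 2))
    return rho12 * rho23 - slack, rho12 * rho23 + slack

lo, hi = rho13_bounds(0.9, 0.9)  # approximately (0.62, 1.0)
```

Note that fixing two strong correlations sharply restricts the third: with both at 0.9, negative values for rho13 are no longer feasible.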

13.
14.
A model and method are proposed for dealing with noisy and dependent features in classification problems. The knowledge base consists of uncertain logical rules forming a probabilistic argumentation system. Assumption-based reasoning is the inference mechanism that is used to derive information about the correct class of the object. Given a hypothesis regarding the correct class, the system provides a symbolic expression of the arguments for that hypothesis as a logical disjunctive normal form. These arguments turn into degrees of support for the hypothesis when numerical weights are assigned to them, thereby creating a support function on the set of possible classes. Since a support function is a belief function, the pignistic transformation is then applied to the support function and the object is placed into the class with maximal pignistic probability.
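The final decision step, the pignistic transformation, splits each focal mass equally among the elements of its focal set and then selects the class with maximal resulting probability. A minimal sketch (the example mass function is illustrative):

```python
def pignistic(m):
    """Pignistic transformation BetP of a normalized mass function
    (dict: frozenset -> mass, with m(emptyset) assumed 0): each focal
    mass is split equally among the elements of its focal set."""
    betp = {}
    for focal, mass in m.items():
        share = mass / len(focal)
        for x in focal:
            betp[x] = betp.get(x, 0.0) + share
    return betp

m = {frozenset({'a'}): 0.5, frozenset({'a', 'b'}): 0.3,
     frozenset({'a', 'b', 'c'}): 0.2}
betp = pignistic(m)  # the object is assigned to the argmax class 'a'
```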

15.
We discuss several problems on the structure of nil rings from the linear algebra point of view. Among others, a number of questions and results are presented concerning algebras of infinite matrices over nil algebras, and nil algebras of infinite matrices over fields, which are related to Koethe's famous problem. Some questions on radicals of tensor products of algebras related to Koethe's problem are also discussed.

16.
We study the properties of multiple life annuity and insurance premiums for general symmetric and survival statuses in the case when the joint distribution of future lifetimes has a dependence structure belonging to some nonparametric neighbourhood of independence. The size of the neighbourhood is controlled by a single parameter, which enables us to model both very weak and stronger dependencies. We provide bounds on the difference of multiple life premiums for vectors of dependent and independent future lifetimes with the same univariate marginal distributions. Each such upper bound can be treated as a premium loading related to the strength of the lifetimes' dependence.

17.
A DS/AHP-Based Method for Supplier Selection
Liang Changyong, Chen Zengming, Ding Yong. Operations Research and Management Science, 2005, 14(6): 33-38, 56
Many methods exist for supplier selection; among them, the Analytic Hierarchy Process (AHP) is widely used because it can quantify qualitative criteria. Since supplier selection involves considerable uncertainty, and evidence theory has distinctive advantages in handling uncertain problems, this paper adopts the DS/AHP decision method, which combines AHP with evidence theory, and applies it to the supplier selection problem. The method integrates the strengths of both AHP and evidence theory and thus supports more scientifically sound supplier selection decisions. Finally, an example illustrates the application of the method to supplier selection.

18.
19.
In a recent note [B. Gao, M.F. Ni, A note on article “The evidential reasoning approach for multiple attribute decision analysis using interval belief degrees”, European Journal of Operational Research, in press, doi:10.1016/j.ejor.2007.10.038], Gao and Ni argued that Yen’s combination rule [J. Yen, Generalizing the Dempster–Shafer theory to fuzzy sets, IEEE Transactions on Systems, Man and Cybernetics 20 (1990) 559–570], which normalizes the combination of multiple pieces of evidence at the end of the combination process, was incorrect. If this were the case, the nonlinear programming models we proposed in [Y.M. Wang, J.B. Yang, D.L. Xu, K.S. Chin, The evidential reasoning approach for multiple attribute decision analysis using interval belief degrees, European Journal of Operational Research 175 (2006) 35–66] would also be incorrect. In this reply to Gao and Ni, we re-examine their numerical illustrations and reconsider their analysis of Yen’s combination rule. We conclude that Yen’s combination rule is correct and our nonlinear programming models are valid.

20.
We consider a mathematical model which describes the bilateral quasistatic contact of a viscoelastic body with a rigid obstacle. The contact is modelled with a modified version of Coulomb's law of dry friction and, moreover, the coefficient of friction is assumed to depend either on the total slip or on the current slip. In the first case, the problem depends upon contact history. We present the classical formulations of the problems, the variational formulations and establish the existence and uniqueness of a weak solution to each of them, when the coefficient of friction is sufficiently small. The proofs are based on classical results for elliptic variational inequalities and fixed point arguments. We also study the dependence of the solutions on the perturbations of the friction coefficient and obtain a uniform convergence result. Copyright © 1999 John Wiley & Sons, Ltd.
