Similar Documents
20 similar documents found.
1.
Obtaining reliable estimates of the parameters of a probabilistic classification model is often a challenging problem because the amount of available training data is limited. In this paper, we present a classification approach based on belief functions that makes the uncertainty resulting from limited amounts of training data explicit and thereby improves classification performance. In addition, we model classification as an active information acquisition problem where features are sequentially selected by maximizing the expected information gain with respect to the current belief distribution, thus reducing uncertainty as quickly as possible. For this, we consider different measures of uncertainty for belief functions and provide efficient algorithms for computing them. As a result, only a small subset of features needs to be extracted without negatively impacting the recognition rate. We evaluate our approach on an object recognition task where we compare different evidential and Bayesian methods for obtaining likelihoods from training data, and we investigate the influence of different uncertainty measures on the feature selection process.
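As background for the kind of machinery such an approach rests on, here is a minimal sketch of Dempster's rule for pooling evidence and of a pignistic-entropy uncertainty measure that a greedy feature selector could try to reduce. The representation and function names are illustrative, not the authors' implementation.

```python
import math
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions,
    each a dict mapping frozenset (focal element) -> mass."""
    raw, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass falling on the empty set
    # renormalise; assumes the two sources are not totally conflicting
    return {s: w / (1.0 - conflict) for s, w in raw.items()}

def pignistic_entropy(m):
    """Shannon entropy of the pignistic transform BetP -- one possible
    uncertainty measure over a belief distribution."""
    bet = {}
    for s, w in m.items():
        for x in s:  # spread each focal element's mass evenly
            bet[x] = bet.get(x, 0.0) + w / len(s)
    return -sum(p * math.log2(p) for p in bet.values() if p > 0)

# Example: pool two pieces of evidence and measure remaining uncertainty.
m1 = {frozenset({"cup"}): 0.6, frozenset({"cup", "bowl"}): 0.4}
m2 = {frozenset({"bowl"}): 0.3, frozenset({"cup", "bowl"}): 0.7}
print(pignistic_entropy(dempster_combine(m1, m2)))
```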

2.
In machine learning problems, the availability of several classifiers trained on different data or features makes the combination of pattern classifiers of great interest. To combine distinct sources of information, it is necessary to represent the outputs of classifiers in a common space via a transformation called calibration. The most classical way is to use class membership probabilities. However, using a single probability measure may be insufficient to model the uncertainty induced by the calibration step, especially when training data are scarce. In this paper, we extend classical probabilistic calibration methods to the evidential framework. Experimental results from the calibration of SVM classifiers demonstrate the benefit of using belief functions in classification problems.
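For reference, the most classical probabilistic calibration the abstract alludes to is Platt scaling, which maps an SVM decision value to a class membership probability; a minimal sketch follows (the evidential extension itself is the paper's contribution and is not reproduced here):

```python
import math

def platt_scale(score: float, A: float, B: float) -> float:
    """Platt scaling: sigmoid map from an SVM decision value to a
    probability. A and B are fitted by maximum likelihood (typically
    with cross-validation) on held-out scores and labels."""
    return 1.0 / (1.0 + math.exp(A * score + B))
```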

3.
This paper deals with face detection and tracking by computer vision for multimedia applications. Contrary to current techniques that are based on huge learning databases and complex algorithms to get generic face models (e.g., active appearance models), the proposed method handles simple contextual knowledge representative of the application background thanks to a quick supervised initialization. The transferable belief model is used to counteract the incompleteness of the prior model, due first to a lack of exhaustiveness of the learning stage and secondly to the subjectivity of the task of face segmentation. The algorithm contains two main steps: detection and tracking. In the detection phase, an evidential face model is estimated by merging basic beliefs elaborated from the Viola and Jones face detector and from a skin colour detector, for the assignment of mass functions. These functions are computed by merging the sources in a specific nonlinear colour space. In order to deal with colour information dependence in the fusion process, the Denœux cautious rule is used. The pignistic probabilities stemming from the face model guarantee the compatibility between the belief framework and the probabilistic framework. They are the inputs of a bootstrap particle filter which yields face tracking at video rate. We show that proper tuning of the evidential model parameters improves the tracking performance in real time. Quantitative evaluation of the proposed method gives a detection rate reaching 80%, comparable to what can be found in the literature; however, the proposed method requires only a weak initialization.

4.
Simultaneous Localization and Mapping (SLAM) has received a great deal of attention in recent decades because of its relevance for many applications centered on a mobile observer, such as service robotics and intelligent transportation systems. This paper focuses on the use of recursive Bayesian filtering, as implemented by the Extended Kalman Filter (EKF), to tackle the Visual SLAM problem, i.e., when using data from visual sources. In Monocular SLAM, which uses a single camera as the sole source of information, it is not possible to directly estimate the depth of a feature from a single image. To handle the severely non-normal distribution representing such uncertainty, inverse parametrizations were developed that cope with this uncertainty while still relying on Gaussian variables. In the paper, after an introduction to EKF-SLAM, we provide a review of different inverse parametrizations, and we introduce a novel proposal, the Framed Inverse Depth (FID) parametrization, which, in terms of consistency, performs similarly to state-of-the-art Monocular SLAM parametrizations, but at a reduced computational cost. All these parametrizations can be used in stereo and multi-camera settings too. An extensive analysis is presented for both Monocular and stereo SLAM, on a simulated environment widely used in the literature as well as on a widely used real dataset.
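The FID parametrization is the paper's own contribution, so it is not reproduced here; as an illustration of the general idea, the following sketch decodes the classical six-parameter inverse-depth representation, in which a feature is anchored at the camera position from which it was first observed:

```python
import numpy as np

def inverse_depth_to_point(f):
    """Decode a classical inverse-depth feature
    (x0, y0, z0, azimuth theta, elevation phi, inverse depth rho):
    the 3-D point is the anchor camera centre plus a unit ray scaled
    by 1/rho. Parametrizing rho instead of depth keeps the EKF's
    Gaussian assumption reasonable even for distant, low-parallax
    features whose depth distribution is severely non-normal."""
    x0, y0, z0, theta, phi, rho = f
    ray = np.array([np.cos(phi) * np.sin(theta),
                    -np.sin(phi),
                    np.cos(phi) * np.cos(theta)])
    return np.array([x0, y0, z0]) + ray / rho
```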

5.
Many multiple attribute decision analysis (MADA) problems are characterised by both quantitative and qualitative attributes with various types of uncertainties. Incompleteness (or ignorance) and vagueness (or fuzziness) are among the most common uncertainties in decision analysis. The evidential reasoning (ER) approach has been developed since the 1990s to support the solution of MADA problems with ignorance, a kind of probabilistic uncertainty. In this paper, the ER approach is further developed to deal with MADA problems with both probabilistic and fuzzy uncertainties. In this newly developed ER approach, precise data, ignorance and fuzziness are all modelled under the unified framework of a distributed fuzzy belief structure, leading to a fuzzy belief decision matrix. A utility-based grade match method is proposed to transform both numerical data and qualitative (fuzzy) assessment information of various formats into the fuzzy belief structure. A new fuzzy ER algorithm is developed to aggregate multiple attributes using the information contained in the fuzzy belief matrix, resulting in an aggregated fuzzy distributed assessment for each alternative. Different from the existing ER algorithm, which is of a recursive nature, the new fuzzy ER algorithm provides an analytical means for combining all attributes without iteration, thus providing scope and flexibility for sensitivity analysis and optimisation. A numerical example is provided to illustrate the detailed implementation process of the new ER approach and its validity and wide applicability.
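For context, the existing recursive ER algorithm that the new analytical fuzzy version departs from can be sketched as follows; this is a simplified crisp version in the style of Yang and Xu, with the fuzzy extension and grade utilities omitted:

```python
def er_aggregate(weights, beliefs):
    """Recursive evidential-reasoning (ER) aggregation.
    weights: attribute weights summing to 1.
    beliefs[i][n]: belief degree of attribute i in grade n; each row may
    sum to less than 1, the remainder being local ignorance.
    Returns (combined belief degrees per grade, residual ignorance)."""
    N = len(beliefs[0])
    m = [0.0] * N                # combined probability masses per grade
    mH_bar, mH_til = 1.0, 0.0    # unassigned mass: weight residue / ignorance
    for w, bel in zip(weights, beliefs):
        mi = [w * b for b in bel]
        miH_bar, miH_til = 1.0 - w, w * (1.0 - sum(bel))
        mH, miH = mH_bar + mH_til, miH_bar + miH_til
        conflict = sum(m[t] * mi[j] for t in range(N)
                       for j in range(N) if t != j)
        K = 1.0 / (1.0 - conflict)           # conflict normalisation
        m = [K * (m[n] * mi[n] + m[n] * miH + mH * mi[n]) for n in range(N)]
        mH_bar, mH_til = (K * mH_bar * miH_bar,
                          K * (mH_til * miH + mH_bar * miH_til))
    # reassign the weight-induced residue to obtain final belief degrees
    return [x / (1.0 - mH_bar) for x in m], mH_til / (1.0 - mH_bar)
```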

6.
We construct alternative frames of discernment from input belief functions. We assume that the core of each belief function is a subset of a so-far-unconstructed frame of discernment. The alternative frames are constructed as different cross products of unions of different cores. With the frames constructed, the belief functions are combined for each alternative frame. The appropriateness of each frame is evaluated in two ways: (i) we measure the aggregated uncertainty (an entropy measure) of the combined belief functions for that frame to find whether the belief functions interact in interesting ways; (ii) we measure the conflict in Dempster’s rule when combining the belief functions to make sure they do not exhibit too much internal conflict. A small frame typically yields a small aggregated uncertainty but a large conflict, and vice versa. The most appropriate frame of discernment is the one that minimizes a probabilistic sum of the conflict and a normalized aggregated uncertainty of all combined belief functions for that frame of discernment.
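A minimal sketch of the selection criterion, assuming the "probabilistic sum" is the usual noisy-or combination a + b − ab and that the aggregated uncertainty is normalised by its maximum value for the frame:

```python
def frame_score(conflict, aggregated_uncertainty, max_uncertainty):
    """Score one candidate frame of discernment: the probabilistic sum
    of the Dempster conflict and the normalised aggregated uncertainty.
    The frame minimising this score is deemed most appropriate."""
    u = aggregated_uncertainty / max_uncertainty   # normalise to [0, 1]
    return conflict + u - conflict * u             # probabilistic sum
```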

7.
Optimization models are increasingly being used in agricultural planning. However, the inherent uncertainties present in agriculture make such planning difficult. In recent years, robust optimization has emerged as a methodology for dealing with uncertainty in optimization models, even when probabilistic knowledge of the phenomenon is incomplete. In this paper, we consider a wine grape harvesting scheduling optimization problem subject to several uncertainties, such as the actual productivity that can be achieved when harvesting. We study how effectively robust optimization solves this problem in practice. We develop alternative robust models and show results for test problems derived from actual wine industry problems.

8.
In this paper, parametric regression analyses including both linear and nonlinear regressions are investigated in the case of imprecise and uncertain data, represented by a fuzzy belief function. The parameters in both the linear and nonlinear regression models are estimated using the fuzzy evidential EM algorithm, a straightforward fuzzy version of the evidential EM algorithm. The nonlinear regression model is derived by introducing a kernel function into the proposed linear regression model. An unreliable sensor experiment is designed to evaluate the performance of the proposed linear and nonlinear parametric regression methods, called parametric evidential regression (PEVREG) models. The experimental results demonstrate the high prediction accuracy of the PEVREG models in regressions with crisp inputs and a fuzzy belief function as output.

9.
We show that Pearl's causal networks can be described using causal compositional models (CCMs) in the valuation-based systems (VBS) framework. One major advantage of using the VBS framework is that, as VBS is a generalization of several uncertainty theories (e.g., probability theory, a version of possibility theory where combination is the product t-norm, Spohn's epistemic belief theory, and Dempster–Shafer belief function theory), CCMs, initially described in probability theory, can now be described in all uncertainty calculi that fit in the VBS framework. We describe conditioning and interventions in CCMs. Another advantage of using CCMs in the VBS framework is that both conditioning and intervention can be easily described in an elegant and unifying algebraic way for the same CCM without having to do any graphical manipulations of the causal network. We describe how conditioning and intervention can be computed for a simple example with a hidden (unobservable) variable. We also illustrate the algebraic results using numerical examples in some of the specific uncertainty calculi mentioned above.

10.
The construction of topological index maps for equivariant families of Dirac operators requires factoring a general smooth map through maps of a very simple type: zero sections of vector bundles, open embeddings, and vector bundle projections. Roughly speaking, a normally non-singular map is a map together with such a factorisation. These factorisations are models for the topological index map. Under some assumptions concerning the existence of equivariant vector bundles, any smooth map admits a normal factorisation, and two such factorisations are unique up to a certain notion of equivalence. To prove this, we generalise the Mostow Embedding Theorem to spaces equipped with proper groupoid actions. We also discuss orientations of normally non-singular maps with respect to a cohomology theory and show that oriented normally non-singular maps induce wrong-way maps on the chosen cohomology theory. For K-oriented normally non-singular maps, we also get a functor to Kasparov's equivariant KK-theory. We interpret this functor as a topological index map.

11.
In telecommunications, operators usually use market surveys and statistical models to estimate traffic evolution in networks or to approximate queuing delay functions in routing strategies. Much research has concentrated on handling traffic uncertainty in network design. Measurements on real-world networks have shown significant errors in delay approximations, leading to weak management decisions in network planning. In this work, we introduce elements of robust optimization theory for delay modeling in routing problems. Different types of data uncertainty are considered and linked to corresponding robust models. We study a special case of constraints featuring separable additive functions. Specifically, we consider that each term of the sum is disturbed by a random parameter. These constraints are frequent in network-based problems, where functions reflecting real-world measurements on links are summed up over end-to-end paths. While classical robust formulations have to deal with the introduction of new variables, we show that, under specific hypotheses, the deterministic robust counterpart can be formulated in the space of the original variables. This offers the possibility of constructing tractable robust models. Starting from Soyster’s conservative model, we write and compare different uncertainty sets and formulations, each offering a different protection level for the delay-constrained routing problem. Computational experiments are presented in order to evaluate the “price of robustness” and to assess the quality of the new formulations.
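For orientation, the prototype of such a counterpart in the original variables is Soyster's model: with each coefficient ranging in an interval $a_j \in [\bar a_j - \delta_j,\, \bar a_j + \delta_j]$, an uncertain linear constraint becomes

$$\sum_j a_j x_j \le b \;\; \forall a \in U \quad\Longleftrightarrow\quad \sum_j \bar a_j x_j + \sum_j \delta_j \lvert x_j \rvert \le b,$$

which is again linear whenever the signs of the variables are fixed (e.g. $x \ge 0$). Less conservative uncertainty sets, as compared in the paper, trade some of this worst-case protection for a smaller "price of robustness".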

12.
The mathematical theory of evidence was introduced by Glenn Shafer in 1976 as a new approach to the representation of uncertainty. This theory can be represented under several distinct but more or less equivalent forms. Probabilistic interpretations of evidence theory have their roots in Arthur Dempster's multivalued mappings of probability spaces. This leads to random set and, more generally, to random filter models of evidence. In this probabilistic view, evidence is seen as more or less probable arguments for certain hypotheses, which can be used to support those hypotheses to certain degrees. These degrees of support are in fact the reliabilities with which the hypotheses can be derived from the evidence. Alternatively, the mathematical theory of evidence can be founded axiomatically on the notion of belief functions or on the allocation of belief masses to subsets of a frame of discernment. These approaches aim to present evidence theory as an extension of probability theory. Evidence theory has been used to represent uncertainty in expert systems, especially in the domain of diagnostics. It can be applied to decision analysis and it gives a new perspective for statistical analysis. Among its further applications are image processing, project planning and scheduling, and risk analysis. The computational problems of evidence theory are well understood and, even though the problem is complex, efficient methods are available. (Research partly supported by Grants No. 21-30186.90 and 21-32660.91 of the Swiss National Foundation for Scientific Research.)
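A minimal sketch of the two set functions at the heart of the theory, belief and plausibility, for a mass allocation represented as a dict from frozensets (focal elements) to masses; the representation is illustrative:

```python
def bel(m, A):
    """Bel(A): total mass committed to subsets of A (provable support)."""
    return sum(w for s, w in m.items() if s <= A)

def pl(m, A):
    """Pl(A): total mass consistent with A; equals 1 - Bel(complement of A)."""
    return sum(w for s, w in m.items() if s & A)

# Example: diagnostic evidence over the frame {flu, cold, covid}.
m = {frozenset({"flu"}): 0.5,
     frozenset({"flu", "cold"}): 0.3,
     frozenset({"flu", "cold", "covid"}): 0.2}
print(bel(m, frozenset({"flu", "cold"})))  # 0.8
print(pl(m, frozenset({"cold"})))          # 0.5
```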

13.
Graphical models are efficient and simple ways to represent dependencies between variables. We introduce in this paper the so-called belief causal networks, where dependencies are uncertain causal links and where the uncertainty is represented by belief masses. Through these networks, we propose to represent the results of passively observing the spontaneous behavior of the system and also to evaluate the effects of external actions. Since interventions are very useful for representing causal relations, we propose to compute their effects using a generalization of the “do” operator. Even though the belief chain rule is different from the Bayesian chain rule, we show that the joint distributions of the altered structures used to graphically describe interventions are equivalent. This paper also addresses new issues that arise when handling interventions: we argue that in real-world applications external manipulations may be imprecise, and we show that these have a natural encoding under the belief function framework.
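For reference, in the purely probabilistic case the "do" operator acts on the Bayesian chain-rule factorization by truncation; the belief-function version in the paper generalizes this idea:

$$P\big(v \mid \mathrm{do}(X = x)\big) \;=\; \prod_{V_j \notin X} P\big(v_j \mid \mathrm{pa}_j\big)\Big|_{X = x},$$

i.e. the factors of the manipulated variables are removed and the remaining factors are evaluated with $X$ clamped to the forced value $x$.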

14.
In this paper we develop analytical techniques for proving the existence of chaotic dynamics in systems where the dynamics is generated by infinite sequences of maps. These are generalizations of the Conley-Moser conditions that are used to show that a (single) map has an invariant Cantor set on which it is topologically conjugate to a subshift on the space of symbol sequences. The motivation for developing these methods is to apply them to the study of chaotic advection in fluid flows arising from velocity fields with aperiodic time dependence, and we show how dynamics generated by infinite sequences of maps arises naturally in that setting. Our methods do not require the existence of a homoclinic orbit in order to conclude the existence of chaotic dynamics. This is important for the class of fluid mechanical examples considered, since one cannot readily identify a homoclinic orbit from the structure of the equations.

We study three specific fluid mechanical examples: the Aref blinking vortex flow, Samelson's tidal advection model, and Min's rollup-merge map that models kinematics in the mixing layer. Each of these flows is modelled as a type of "blinking flow", which mathematically has the form of a linked twist map, or an infinite sequence of linked twist maps. We show that the nature of these blinking flows is such that it is possible to have a variety of "patches" of chaos in the flow corresponding to different length and time scales.

15.
Multiple attribute decision analysis (MADA) problems having both quantitative and qualitative attributes under uncertainty can be modelled and analysed using the evidential reasoning (ER) approach. Several types of uncertainty, such as ignorance and fuzziness, can be consistently modelled in the ER framework. In this paper, both interval weight assignments and interval belief degrees are considered, as could be incurred in many decision situations such as group decision making. Based on the existing ER algorithm, several pairs of preference programming models are constructed to support global sensitivity analysis based on the interval values and to generate the upper and lower bounds of the combined belief degrees for distributed assessment, as well as the expected values for ranking alternatives. A post-optimisation procedure is developed to identify non-dominated solutions, examine the robustness of the partial ranking orders generated, and provide guidance for the elicitation of additional information for generating more desirable assessment results. A car evaluation problem is examined to show the implementation process of the proposed approach.

16.
This paper proposes solution approaches to belief linear programming (BLP). The BLP problem is an uncertain linear program where uncertainty is expressed by belief functions. The theory of belief functions provides an uncertainty measure that takes into account ignorance about the occurrence of single states of nature, as is the case in many decision situations such as medical diagnosis, mechanical design optimization and investigation problems. We extend stochastic programming approaches, namely the chance-constrained approach and the recourse approach, to obtain a certainty equivalent program. A generic solution strategy for the resulting certainty equivalent is presented.
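As an illustration of the chance-constrained route, one natural certainty-equivalent form (an assumption about the construction, not a quotation from the paper) requires each uncertain constraint to hold with belief at least a chosen level $\alpha$:

$$\min_x \; c^\top x \quad \text{s.t.} \quad \mathrm{Bel}\big(\{\omega : A(\omega)\,x \le b\}\big) \;\ge\; \alpha, \qquad \alpha \in (0, 1],$$

mirroring the probabilistic chance constraint $P(Ax \le b) \ge \alpha$, with belief replacing probability.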

17.
This paper derives a particle filter algorithm within the Dempster–Shafer framework. Particle filtering is a well-established Bayesian Monte Carlo technique for estimating the current state of a hidden Markov process using a fixed number of samples. When dealing with incomplete information or qualitative assessments of uncertainty, however, Dempster–Shafer models with their explicit representation of ignorance often turn out to be more appropriate than Bayesian models. The contribution of this paper is twofold. First, the Dempster–Shafer formalism is applied to the problem of maintaining a belief distribution over the state space of a hidden Markov process by deriving the corresponding recursive update equations, which turn out to be a strict generalization of Bayesian filtering. Second, it is shown how the solution of these equations can be efficiently approximated via particle filtering based on importance sampling, which makes the Dempster–Shafer approach tractable even for large state spaces. The performance of the resulting algorithm is compared to exact evidential as well as Bayesian inference.
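The evidential update equations are the paper's contribution and are not reproduced here; for the Bayesian baseline they strictly generalize, one predict-update-resample step of a bootstrap particle filter can be sketched as follows (transition and likelihood are user-supplied models; the function shape is illustrative):

```python
import numpy as np

def bootstrap_pf_step(particles, weights, transition, likelihood, z, rng):
    """One step of a classical bootstrap particle filter for a hidden
    Markov process. particles: (N, d) array; weights: (N,) array.
    Sample the motion model, reweight by the observation likelihood,
    then resample to counter weight degeneracy."""
    particles = transition(particles, rng)         # x_t ~ p(x_t | x_{t-1})
    weights = weights * likelihood(z, particles)   # w_t ∝ w_{t-1} p(z_t | x_t)
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))
```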

18.
Due to increasing demand, natural gas is playing a more important role in the energy system, and its system expansion planning is drawing more attention. In this paper, we propose expansion planning models which include both natural gas transmission network expansion and LNG (Liquefied Natural Gas) terminal location planning. These models take into account the uncertainties of demands and supplies in the future, which makes them stochastic mixed integer programs with discrete subproblems. We also consider risk control in our models by including probabilistic constraints, such as a limit on CVaR (Conditional Value at Risk). In order to solve large-scale problems, especially those with a large number of scenarios, we propose the embedded Benders decomposition algorithm, which applies Benders cuts in both the first and second stages to tackle the discrete subproblems. Numerical results show that our algorithm is efficient for large-scale stochastic natural gas transportation system expansion planning problems.
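A CVaR limit of this kind is typically imposed through the Rockafellar–Uryasev reformulation, which keeps the risk constraint convex; for a scenario-dependent loss $L(x,\xi)$ and confidence level $\alpha$:

$$\mathrm{CVaR}_\alpha\big(L(x,\xi)\big) \;=\; \min_{t \in \mathbb{R}} \left\{ t + \frac{1}{1-\alpha}\, \mathbb{E}\big[(L(x,\xi) - t)_+\big] \right\} \;\le\; \eta.$$

With finitely many scenarios the expectation becomes a weighted sum and the positive parts linearise via auxiliary variables, so the risk constraint fits directly into a mixed integer program.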

19.
Chaotic dynamics have been observed in example piecewise-affine models of gene regulatory networks. Here we show how the underlying Poincaré maps can be explicitly constructed. To do this, we proceed in two steps. First, we consider a limit case, where some parameters tend to ∞, and then consider the case with finite parameters as a perturbation of the previous one. We provide a detailed example of this construction, in 3-d, with several thresholds per variable. This construction is essentially a topological horseshoe map. We show that the limit situation is conjugate to the golden mean shift, and is thus chaotic. Then, we show that chaos is preserved for large parameters, relying on the structural stability of the return map in the limit case. We also describe a method to embed systems with several thresholds into binary systems of higher dimension. This shows that all results found for systems having several thresholds remain valid in the binary case.

20.
This paper describes how to treat hard uncertainties, defined by so-called uncertainty maps, in multiobjective optimization problems. For the uncertainty map being set-valued, a Taylor formula is shown under appropriate assumptions. The hard uncertainties are modeled using parametric set optimization problems, for which a scalarization result is given. The presented new approach for the solution of multiobjective optimization problems with hard uncertainties is then applied to the layout optimization of photovoltaic power plants. Since good weather forecasts are difficult to obtain for future years, weather data are genuinely hard uncertainties arising in the planning process. Numerical results are presented for a real-world problem on the Galapagos island Isabela.

