Similar Documents
20 similar documents found
1.
Partially consonant belief functions (pcb), studied by Walley, are the only class of Dempster-Shafer belief functions that is consistent with the likelihood principle of statistics. Structurally, the set of foci of a pcb is partitioned into non-overlapping groups, and within each group the foci are nested. The pcb class includes both probability functions and Zadeh's possibility functions as special cases. This paper studies decision making under uncertainty described by pcb. We prove a representation theorem for preference relations over pcb lotteries that satisfy an axiomatic system similar in spirit to von Neumann and Morgenstern's axioms of linear utility theory. The closed-form expression for the utility of a pcb lottery combines linear utility for the probabilistic lottery with two-component (binary) utility for the possibilistic lottery. In our model, the uncertainty information, the risk attitude and the ambiguity attitude are represented separately. A tractable technique for extracting the ambiguity attitude from a decision maker's behavior is also discussed.

2.
Fuzzy Sets and Systems 24(3) (1987) 363-375
Since fuzzy data can be regarded as possibility distributions, this paper proposes fuzzy data analysis by possibilistic linear models. Possibilistic linear systems are defined via the extension principle. Fuzzy parameter estimation in possibilistic linear systems is discussed, and possibilistic linear models are employed for fuzzy data analysis with non-fuzzy inputs and fuzzy outputs given by fuzzy numbers. The estimated possibilistic linear system is obtained by solving a linear programming problem. This approach can be regarded as fuzzy interval analysis.
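As a stdlib-only sketch of the idea (not the paper's exact linear program, which minimizes the total spread of the fuzzy parameters), the following fits a crisp centre line by least squares and then takes the smallest constant spread d that makes every observation fall inside its predicted interval; the function and variable names are illustrative, not from the paper.

```python
# Simplified interval-regression sketch: each prediction is the interval
# [a + b*x - d, a + b*x + d]; d is the smallest constant spread that
# covers all observations once the centre line is fixed by least squares.

def interval_regression(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    d = max(abs(y - (a + b * x)) for x, y in zip(xs, ys))
    return a, b, d

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
a, b, d = interval_regression(xs, ys)
# By construction, every y_i lies inside its predicted interval.
assert all(a + b * x - d <= y <= a + b * x + d for x, y in zip(xs, ys))
```

The actual possibilistic formulation instead treats the spread coefficients as decision variables of an LP, which allows non-constant (input-dependent) spreads.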

3.
Standard type spaces induce belief structures defined by precise beliefs. This paper proposes and analyzes simple procedures for constructing perturbations of such belief structures in which beliefs have a degree of ambiguity. Specifically, we construct ambiguous type spaces whose induced (ambiguous) belief hierarchies approximate the standard, precise belief hierarchies corresponding to the initial type space. Based on a metric that captures the resulting approximation, two alternative procedures for constructing such perturbations are introduced and are shown to yield a simple and intuitive characterization of convergence to the initial, unperturbed environment. As a special case, one of these procedures is shown to characterize the set of all finite perturbations. The introduced perturbations and their convergence properties provide conceptual foundations for analyzing the robustness of various solution concepts, and of various decision rules, to ambiguity.

4.
DNA sequence data provide a good source of information on the evolutionary history of organisms. Among the proposed methods, maximum likelihood methods require an explicit probabilistic model of nucleotide substitution, which makes the assumptions clear. However, procedures for testing hypotheses about topologies have not been well developed. We propose a revised version of the maximum likelihood estimator of a tree and derive some of its properties. We then present tests to compare given trees and to derive the most likely candidates for the true topology, applying to maximum likelihoods the notion of contrast, as defined in the framework of the analysis of variance, together with the procedures used in multiple comparison. Finally, an example is presented.

5.
The scrap charge optimization problem in the brass casting process is a critical management concern that aims to reduce the charge while preventing specification violations. Uncertainties in scrap material compositions often cause violations of product standards. In this study, we discuss aleatory and epistemic uncertainties and model them using probability and possibility distributions, respectively. Mathematical models including both probabilistic and possibilistic parameters are generally solved by transforming one type of parameter into the other. However, such transformations have drawbacks, including loss of knowledge and the production of artificial information. In this paper, we propose a new solution approach that requires no transformation and thus avoids these drawbacks. The proposed approach combines chance-constrained stochastic programming with possibilistic programming. The solution of a numerical example shows that the blending problem with both probabilistic and possibilistic uncertainties can be successfully handled and solved by the proposed approach.

6.
A method is presented for characterizing the family of overall systems reconstructable from a given possibilistic structure system. The technique is elaborated in terms of fuzzy relation equations. It is demonstrated that the reconstruction family of a given structure system is equivalent to the set of solutions of a special type of fuzzy relation equation. The solution set is partially ordered, and contains both minimal solutions and a unique maximum solution. When these elements are identified, all members of the reconstruction family are determined. Another characteristic of the reconstruction family is its reconstruction uncertainty, a measure of which is also developed in this paper. This measure is used to define an identifiability quotient that expresses the degree of confidence with which we may identify a single overall system given a particular structure system.
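The maximum solution mentioned above can be made concrete for the max-min composition: a small self-contained sketch (standard possibility-theory material, not code from the paper) computes the greatest solution of a ∘ R = b via the Gödel implication and verifies it by composition.

```python
# Goedel implication (alpha operator): a -> b = 1 if a <= b, else b.
# For the max-min equation  a o R = b, the relation
# R_hat[i][j] = a[i] -> b[j] is the unique maximum solution whenever
# the equation is solvable at all.

def alpha(a, b):
    return 1.0 if a <= b else b

def max_solution(a, b):
    return [[alpha(ai, bj) for bj in b] for ai in a]

def compose(a, R):
    # max-min composition: b_j = max_i min(a_i, R_ij)
    return [max(min(ai, rij) for ai, rij in zip(a, col))
            for col in zip(*R)]

a = [0.9, 0.5, 0.7]
b = [0.5, 0.7]
R_max = max_solution(a, b)
assert compose(a, R_max) == b   # solvable, and R_max solves it
```

Any solution R of the equation satisfies R <= R_max elementwise, which is the partial-order structure the abstract refers to.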

7.
By analogy with Feller's general probabilistic scheme, used in the construction of many classical convergent sequences of linear operators, this paper considers a Feller-type scheme based on the possibilistic integral for the construction of convergent sequences of nonlinear operators. In particular, in the discrete case, all the so-called max-product Bernstein-type operators and their qualitative convergence properties are recovered. Discrete non-periodic nonlinear possibilistic convergent operators of Picard, Gauss–Weierstrass and Poisson–Cauchy type are also studied, and the introduction of discrete periodic (trigonometric) nonlinear possibilistic operators of de la Vallée–Poussin, Fejér and Jackson type is mentioned as a direction for future research.
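For readers unfamiliar with max-product operators, here is a minimal sketch of the max-product Bernstein operator referred to above (the standard definition; the helper name and test function are ours): the sum in the classical Bernstein operator is replaced by a maximum, normalized by the largest basis value.

```python
from math import comb

def max_product_bernstein(f, n, x):
    # Bernstein basis p_{n,k}(x) = C(n,k) x^k (1-x)^(n-k); the classical
    # sum over k is replaced by a maximum, normalized by max_k p_{n,k}(x).
    p = [comb(n, k) * x ** k * (1 - x) ** (n - k) for k in range(n + 1)]
    return max(pk * f(k / n) for k, pk in enumerate(p)) / max(p)

f = lambda t: t * (1 - t) + 0.1          # a nonnegative test function
# Like its linear counterpart, the operator interpolates at the endpoints.
assert abs(max_product_bernstein(f, 50, 0.0) - f(0.0)) < 1e-12
```

Unlike the classical Bernstein operator, this one is only positively homogeneous and monotone (not linear), which is why its convergence theory is qualitative rather than linear-operator based.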

8.
When information about uncertainty cannot be quantified in a simple, probabilistic way, possibilistic decision theory is often a natural framework to consider. Its development has led to the proposal of a series of possibilistic criteria, namely: optimistic and pessimistic possibilistic qualitative criteria [7], possibilistic likely dominance [2], [9], binary possibilistic utility [11] and possibilistic Choquet integrals [24]. This paper focuses on sequential decision making in possibilistic decision trees. It proposes a theoretical study of the complexity of finding an optimal strategy depending on the monotonicity property of the optimization criterion: when the criterion is transitive, this property allows the problem to be solved in polynomial time by dynamic programming. We show that most possibilistic decision criteria, with the exception of possibilistic Choquet integrals, satisfy monotonicity, and that the corresponding optimization problems can be solved in polynomial time by dynamic programming. For the possibilistic likely dominance criterion, which is quasi-transitive but not fully transitive, we propose an extended version of dynamic programming that remains polynomial in the size of the decision tree. We also show that for the particular case of possibilistic Choquet integrals, the problem of finding an optimal strategy is NP-hard; it can be solved by a branch and bound algorithm. Experiments show that, even when not necessarily optimal, the strategies built by dynamic programming are generally very good.
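To illustrate why monotonicity matters here, the following is a hedged sketch of backward induction with the pessimistic qualitative criterion, one of the criteria the paper shows to be monotonic; the tree encoding and names are our own, not the paper's.

```python
# Backward induction on a possibilistic decision tree with the
# pessimistic qualitative utility: a chance node is worth
# min over outcomes of max(1 - possibility, child value); a decision
# node is worth the max over its actions. Monotonicity of the criterion
# is what makes this local, bottom-up evaluation sound.

def pess_value(node):
    kind = node[0]
    if kind == 'leaf':
        return node[1]                      # utility in [0, 1]
    if kind == 'chance':                    # node = ('chance', [(pi, child), ...])
        return min(max(1.0 - pi, pess_value(child)) for pi, child in node[1])
    return max(pess_value(child) for child in node[1])   # decision node

tree = ('decision', [
    ('chance', [(1.0, ('leaf', 0.4)), (0.3, ('leaf', 0.0))]),  # action A
    ('chance', [(1.0, ('leaf', 0.3))]),                        # action B
])
assert pess_value(tree) == 0.4   # action A is preferred
```

For Choquet integrals this node-by-node substitution is invalid, which is the source of the NP-hardness result.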

9.
The purpose of this paper is to discuss some procedures that are available for testing non-nested (or separate) hypotheses in the statistics and econometrics literature. Since many of these techniques may also be exploited in other disciplines, it is hoped that an elaboration of the principal theoretical findings may make them more readily accessible to researchers in other disciplines. Several simple examples are used to illustrate the concepts of nested and non-nested hypotheses and, within the latter category, “global” and “partial” non-nested hypotheses. Two alternative methods of testing non-nested hypotheses are discussed and contrasted: the first of these is Cox's modification of the likelihood-ratio statistic, and the second is Atkinson's comprehensive model approach. A major emphasis is placed on the role of the Cox principle of hypothesis testing, which enables a broad range of hypotheses to be tested within the same framework. The problem associated with the application of the comprehensive model approach to composite non-nested hypotheses is also highlighted; Roy's union-intersection principle is presented as a viable method of dealing with this problem. Simulation results concerning the finite-sample properties of various tests are discussed, together with an analysis of some attempts to correct the poor size of the Cox and related tests.

10.
Fuzzy Sets and Systems 24(2) (1987) 197-219
It is demonstrated, through a series of theorems, that the U-uncertainty (introduced by Higashi and Klir in 1982) is the only possibilistic measure of uncertainty and information that satisfies possibilistic counterparts of the axioms of the well-established Shannon and Hartley measures of uncertainty and information. Two complementary forms of the possibilistic counterpart of the probabilistic branching (or grouping) axiom, which is usually used in proofs of the uniqueness of the Shannon measure, are introduced in this paper for the first time. A one-to-one correspondence between possibility distributions and basic probability assignments (introduced by Shafer in his mathematical theory of evidence) is instrumental in most of the proofs in this paper. The uniqueness proof is based on possibilistic formulations of the axioms of symmetry, expansibility, additivity, branching, monotonicity, and normalization.
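The U-uncertainty itself has a short closed form: for an ordered possibility distribution r_1 ≥ … ≥ r_n (with r_{n+1} = 0), U(r) = Σ_i (r_i − r_{i+1}) log2 i. A minimal sketch, with our own function name:

```python
from math import log2

def u_uncertainty(r):
    # Sort the possibility degrees in decreasing order and append 0,
    # then apply U(r) = sum_i (r_i - r_{i+1}) * log2(i).
    r = sorted(r, reverse=True) + [0.0]
    return sum((r[i - 1] - r[i]) * log2(i) for i in range(1, len(r)))

# For a crisp set of n equally possible alternatives, U reduces to the
# Hartley measure log2(n):
assert u_uncertainty([1.0, 1.0, 1.0, 1.0]) == 2.0
```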

11.
12.
Fuzzy Sets and Systems 143(3) (2004) 335-353
We take up again the possibilistic (strictly non-probabilistic) model for information sources and information coding put forward in Fuzzy Sets and Systems 132(1) (2002) 11-32; there, the coding-theoretic possibilistic entropy is defined as the asymptotic rate of compression codes that are optimal with respect to a possibilistic (not probabilistic) criterion. By proving a uniqueness theorem, in this paper we also provide an axiomatic derivation of this possibilistic entropy, and are thus able to support its use as an adequate measure of non-specificity, or rather of “possibilistic ignorance”, as we shall prefer to say. We compare our possibilistic entropy with two well-known measures of non-specificity: the Hartley measure from set theory and the U-uncertainty from possibility theory. The comparison allows us to show that the latter also possesses a coding-theoretic meaning.

13.
For a class of infinite-horizon optimal control problems arising in studies of economic growth processes, we study the properties of the adjoint variable in the relations of the Pontryagin maximum principle when it is defined by a formula analogous to the Cauchy formula for the solutions of linear differential systems. It is shown that, under a dominating discount type condition, the adjoint variable defined in this way satisfies both the core relations of the maximum principle (the adjoint system and the maximum condition) in normal form and the complementary stationarity condition for the Hamiltonian. Moreover, a new economic interpretation of the adjoint variable based on this formula is presented.

14.
The probabilistic traveling salesman problem is a paradigmatic example of a stochastic combinatorial optimization problem. For this problem, an estimation-based local search algorithm using delta evaluation has recently been proposed. In this paper, we adopt two well-known variance reduction procedures in the estimation-based local search algorithm: the first is an adaptive sampling procedure that selects the appropriate size of the sample to be used in the Monte Carlo evaluation; the second adopts importance sampling to reduce the variance involved in the cost estimation. We investigate several possible strategies for applying these procedures to the given problem and identify the most effective one. Experimental results show that a particular heuristic customization of the two procedures significantly increases the effectiveness of the estimation-based local search.
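A generic way to realize the adaptive-sampling idea (a common sequential scheme, not necessarily the authors' exact procedure) is to keep drawing cost-difference samples until the confidence interval around the estimated delta excludes zero, so that the sign of the move's effect is statistically clear:

```python
import random
import statistics

def adaptive_delta(sample_delta, n_min=5, n_max=1000, z=1.96):
    # Draw realisations of the cost difference of a candidate move until
    # the ~95% confidence interval excludes zero, or the budget runs out.
    xs = [sample_delta() for _ in range(n_min)]
    while len(xs) < n_max:
        half = z * statistics.stdev(xs) / len(xs) ** 0.5
        if abs(statistics.fmean(xs)) > half:
            break                     # sign of the delta is clear enough
        xs.append(sample_delta())
    return statistics.fmean(xs), len(xs)

# Hypothetical move whose true cost difference is +2 with unit noise:
random.seed(42)
est, n_used = adaptive_delta(lambda: random.gauss(2.0, 1.0))
assert est > 0 and n_used <= 1000
```

Clearly improving or clearly worsening moves are resolved with few samples; only near-neutral moves consume the full budget, which is the point of adapting the sample size.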

15.
In this study, a two-stage fuzzy robust integer programming (TFRIP) method is developed for planning environmental management systems under uncertainty. The approach integrates techniques of robust programming and two-stage stochastic programming within a mixed integer linear programming framework. It can facilitate dynamic analysis of capacity-expansion planning for waste management facilities within a multi-stage context. In the modeling formulation, uncertainties can be presented in terms of both possibilistic and probabilistic distributions, such that the robustness of the optimization process can be enhanced. In the solution process, the fuzzy decision space is delimited into a more robust one by specifying the uncertainties through dimensional enlargement of the original fuzzy constraints. The TFRIP method is applied to a case study of long-term waste-management planning under uncertainty. The generated solutions for continuous and binary variables provide the desired waste-flow-allocation and capacity-expansion plans with a minimized system cost and a maximized system feasibility.

16.
We prove that the ENO reconstruction and ENO interpolation procedures are stable in the sense that the jump of the reconstructed ENO point values at each cell interface has the same sign as the jump of the underlying cell averages across that interface. Moreover, we prove that the size of these jumps after reconstruction relative to the jump of the underlying cell averages is bounded. Similar sign properties and the boundedness of the jumps hold for the ENO interpolation procedure. These estimates, which are shown to hold for ENO reconstruction and interpolation of arbitrary order of accuracy and on nonuniform meshes, indicate a remarkable rigidity of the piecewise polynomial ENO procedure.

17.
In certain signal processing problems, it is customary to estimate parameters in distorted signals by approximating what is termed a cross ambiguity function and estimating where it attains its maximum modulus. To unify and generalize these procedures, we consider a generalized form of the cross ambiguity function and give error bounds for estimating the parameters, showing that these bounds are lower if we maximize the real part rather than the modulus. We also reveal a connection between these bounds and certain uncertainty principles, which leads to a new type of uncertainty principle.

18.
Fuzzy data given by expert knowledge can be regarded as possibility distributions, by which possibilistic linear systems are defined. Recently, it has become important to deal with fuzzy data in connection with expert knowledge. Three formulations of possibilistic linear regression analysis are proposed here to deal with fuzzy data. Since our formulations can be reduced to linear programming problems, their merit is that fuzzy parameters in possibilistic linear models can be obtained easily, and that further constraint conditions, which might be obtained from expert knowledge of the fuzzy parameters, can be added. This approach can be regarded as fuzzy interval analysis in a fuzzy environment.

19.
We explore an approach to possibilistic fuzzy clustering that avoids a severe drawback of the conventional approach, namely that the objective function is truly minimized only if all cluster centers are identical. Our approach is based on the idea that this undesired property can be avoided by introducing a mutual repulsion of the clusters, so that they are forced away from each other. We develop this approach for the possibilistic fuzzy c-means algorithm and the Gustafson–Kessel algorithm. In our experiments we found that in this way we can combine the partitioning property of the probabilistic fuzzy c-means algorithm with the advantages of a possibilistic approach with respect to the interpretation of the membership degrees.
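For context, a typicality value in the possibilistic c-means has the standard form u = 1 / (1 + (d²/η)^(1/(m−1))). Because these values are absolute rather than relative, nearby clusters can all claim a point strongly, which is why coinciding cluster centers can minimize the objective. A small sketch of the standard PCM formula (the paper's repulsion term is not implemented here; names are ours):

```python
def pcm_typicality(dist2, eta, m=2.0):
    # dist2: squared distance of a point to a cluster centre;
    # eta: reference distance at which typicality drops to 0.5;
    # m: fuzzifier. Returns 1 at the centre, decreasing with distance.
    return 1.0 / (1.0 + (dist2 / eta) ** (1.0 / (m - 1.0)))

# A point near two overlapping clusters is highly typical of both:
# typicalities need not sum to 1 across clusters, unlike probabilistic
# fuzzy c-means memberships.
u1 = pcm_typicality(0.25, 1.0)
u2 = pcm_typicality(0.25, 1.0)
assert u1 == u2 == 0.8 and u1 + u2 > 1.0
```

Since nothing couples the clusters, every cluster is free to settle on the same dense region; the repulsion term proposed in the paper adds exactly such a coupling.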

20.
The maximization of the Gaussian probability of a multidimensional set is considered when the expectations of the Gaussian distribution vary and the covariance matrix is fixed. It is shown that the necessary condition for a maximum is a centering condition, i.e. the optimal point coincides with the weight center of the set. Iterative methods for solving this problem are developed: an analytical centering procedure and a stochastic one based on series of Monte Carlo samples of fixed or regulated size. The convergence of these procedures is considered, and the properties of their application in the computer-aided design of systems are discussed.
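The stochastic centering procedure can be sketched as a fixed-point iteration: repeatedly replace the mean by the empirical weight center of the Monte Carlo samples that fall inside the set. The names and parameters below are illustrative, and the example is one-dimensional for brevity:

```python
import random

def mc_centering(inside, mu0, sigma=1.0, iters=25, n=4000, seed=7):
    # Iterate mu <- conditional mean of N(mu, sigma^2) samples inside
    # the set; a fixed point satisfies the centering condition.
    rng = random.Random(seed)
    mu = mu0
    for _ in range(iters):
        hits = [x for x in (rng.gauss(mu, sigma) for _ in range(n))
                if inside(x)]
        if hits:
            mu = sum(hits) / len(hits)
    return mu

# For the symmetric set [0, 1] the weight centre, and hence the optimum,
# is 0.5; the iteration drifts there from a poor start mu0 = 2.0.
mu = mc_centering(lambda x: 0.0 <= x <= 1.0, mu0=2.0)
assert abs(mu - 0.5) < 0.1
```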
