Similar Literature
20 similar documents found.
1.
It is widely acknowledged that understanding and prioritizing the voice of the customer is a critical step in new product development. In this work, we propose a novel approach for handling missing and incomplete data while combining information from different surveys for prioritizing customer voices. Our new approach comprises the following stages: estimating and representing missing and incomplete data; estimating intervals for the criteria used in analyzing data; mapping data on criteria to a common scale; modeling interval data using an interval belief structure; and aggregating evidence and ranking customer voices using the interval evidential reasoning algorithm. We demonstrate our approach using a case study from the automotive domain with a given criteria hierarchy for analyzing data from three different surveys. We propose new optimization formulations for estimating intervals of the criteria used in our case study and logical yet pragmatic transformation functions for mapping criteria values to a common scale.
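As an illustration of the mapping step only, the sketch below linearly rescales an interval-valued criterion score onto a common [0, 1] scale; the paper's actual transformation functions are not given in this abstract, so the min-max mapping and the numbers are assumptions.

```python
# Hypothetical sketch: map an interval-valued criterion score onto a common
# [0, 1] scale with a linear min-max transformation.  The paper's own
# transformation functions are not described here; this is an assumed stand-in.

def map_interval_to_unit_scale(lo, hi, c_min, c_max):
    """Map the interval [lo, hi] of a criterion with range [c_min, c_max] to [0, 1]."""
    span = c_max - c_min
    return (lo - c_min) / span, (hi - c_min) / span

# Example: a survey item on a 1-7 scale whose missing value was estimated as [3, 5].
print(map_interval_to_unit_scale(3, 5, 1, 7))  # (0.333..., 0.666...)
```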

2.
Multivariate Gaussian criteria in SMAA
We consider stochastic multicriteria decision-making problems with multiple decision makers. In such problems, the uncertainty or inaccuracy of the criteria measurements and the partial or missing preference information can be represented through probability distributions. In many real-life problems the uncertainties of criteria measurements may be dependent. However, it is often difficult to quantify these dependencies. Also, most existing methods are unable to handle such dependency information. In this paper, we develop a method for handling dependent uncertainties in stochastic multicriteria group decision-making problems. We measure the criteria, their uncertainties and dependencies using a stochastic simulation model. The model is based on decision variables and stochastic parameters with given distributions. Based on the simulation results, we determine for the criteria measurements a joint probability distribution that quantifies the uncertainties and their dependencies. We then use the SMAA-2 stochastic multicriteria acceptability analysis method for comparing the alternatives based on the criteria distributions. We demonstrate the use of the method in the context of a strategic decision support model for a retailer operating in the liberalized European electricity market.
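A minimal Monte Carlo sketch of the SMAA-2 idea under assumed data: criteria measurements are drawn from a dependent multivariate Gaussian per alternative, weights are drawn uniformly from the simplex, and rank acceptability indices are estimated as the fraction of draws in which each alternative attains each rank. Means, covariance, and the additive utility model are illustrative assumptions, not the case study's values.

```python
# Illustrative Monte Carlo estimate of SMAA-2-style rank acceptability indices for
# alternatives whose criteria measurements follow dependent multivariate Gaussian
# distributions.  Means, covariance, and the additive utility model are invented.
import numpy as np

rng = np.random.default_rng(0)
means = np.array([[0.6, 0.5],      # 3 alternatives x 2 criteria
                  [0.5, 0.7],
                  [0.4, 0.6]])
cov = np.array([[0.02, 0.01],      # shared covariance: correlated measurement errors
                [0.01, 0.02]])

n_alt, n_crit, n_iter = means.shape[0], means.shape[1], 10_000
rank_counts = np.zeros((n_alt, n_alt))       # rows: alternatives, cols: ranks

for _ in range(n_iter):
    x = np.array([rng.multivariate_normal(m, cov) for m in means])  # dependent criteria
    w = rng.dirichlet(np.ones(n_crit))                              # uniform weights on the simplex
    order = np.argsort(-(x @ w))                                    # best alternative first
    for rank, alt in enumerate(order):
        rank_counts[alt, rank] += 1

print(rank_counts / n_iter)   # estimated rank acceptability indices
```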

3.
Models developed to analyze facility location decisions have typically optimized one or more objectives, subject to physical, structural, and policy constraints, in a static or deterministic setting. Because of the large capital outlays that are involved, however, facility location decisions are frequently long-term in nature. Consequently, there may be considerable uncertainty regarding the way in which relevant parameters in the location decision will change over time. In this paper, we propose two approaches for analyzing these types of dynamic location problems, focusing on situations where the total number of facilities to be located is uncertain. We term this type of location problem NOFUN (Number Of Facilities Uncertain). We analyze the NOFUN problem using two well-established decision criteria: the minimization of expected opportunity loss (EOL), and the minimization of maximum regret. In general, these criteria assume that there are a finite number of decision options and a finite number of possible states of nature. The minisum EOL criterion assumes that one can assign probabilities for the occurrence of the various states of nature and, therefore, find the initial set of facility locations that minimizes the sum of expected losses across all future states. The minimax regret criterion finds the pattern of initial facility locations whose maximum loss is minimized over all possible future states.
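A small numerical sketch (with an invented regret matrix and state probabilities) of the two criteria named above: minimum expected opportunity loss and minimax regret.

```python
# Invented regret matrix (plans x future states) and state probabilities, used to
# compute the two criteria named in the abstract: expected opportunity loss (EOL)
# and minimax regret.
import numpy as np

regret = np.array([[ 0, 40, 25],
                   [15,  0, 30],
                   [20, 10,  0]])
p_state = np.array([0.5, 0.3, 0.2])

eol = regret @ p_state              # expected opportunity loss of each plan
max_regret = regret.max(axis=1)     # worst-case regret of each plan

print("minisum EOL plan:", int(np.argmin(eol)), eol)
print("minimax regret plan:", int(np.argmin(max_regret)), max_regret)
```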

4.
Passive and Active Compensability Multicriteria ANalysis (PACMAN) is a multiple criteria methodology based on a decision-maker-oriented notion of compensation, called compensability. An important feature of PACMAN is a possible asymmetry of the associated decision procedure, since compensability is determined for each ordered pair of criteria, distinguishing the compensating criterion from the compensated one. Here we introduce a notion of implementation of PACMAN, which allows a concrete modelling of a multiple criteria decision problem. We study regular implementations of PACMAN and their monotonicity properties. We also examine several regular implementations that satisfy some additional properties. Particular emphasis is given to a regular implementation of PACMAN that produces the lexicographic ordering.

5.
The cards procedure was designed in the early 1990s as a simple way to elicit weights for multiple criteria decision analysis outranking methods. It is based on the elicitation of the difference of importance between successive pairs of criteria. We propose to extend its use in two directions:

6.
We study the local surjectivity and openness properties of mappings and correspondences by using coderivatives. We recall local criteria but concentrate on point criteria. Our study relies on a compactness condition which is of independent interest, for instance for the study of the behavior of the injectivity constant of a linear map or of a convex process under stabilization procedures. Several known criteria for openness are shown to be consequences of this new compactness condition.

7.
Multiple criteria analysis (MCA) is a framework for evaluating decision options against multiple criteria. Numerous techniques for solving an MCA problem are available. This paper applies MCA to six water management decision problems. The MCA methods tested include weighted summation, range of value, PROMETHEE II, Evamix and compromise programming. We show that the different MCA methods were in strong agreement, with high correlations amongst rankings. In the few cases where strong disagreement between MCA methods did occur, it was due to the presence of mixed ordinal-cardinal data in the evaluation matrix. The results suggest that, whilst selection of the MCA technique is important, more emphasis is needed on the initial structuring of the decision problem, which involves choosing criteria and decision options.
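As a toy illustration of comparing rankings across techniques (not the paper's data), the sketch below ranks three options by weighted summation and by a simple weighted distance to the ideal point, then correlates the two rankings; the evaluation matrix and weights are invented.

```python
# Invented data: rank three options by weighted summation and by a simple
# compromise-programming-style weighted distance to the ideal point, then
# compare the two rankings with a rank correlation.
import numpy as np

scores = np.array([[0.2, 0.9, 0.5],      # options x criteria, scaled to [0, 1],
                   [0.6, 0.4, 0.7],      # all treated as benefit criteria
                   [0.8, 0.3, 0.6]])
w = np.array([0.5, 0.3, 0.2])

ws = scores @ w                                                   # higher is better
cp = np.linalg.norm(w * (scores.max(axis=0) - scores), axis=1)    # lower is better

rank_ws = np.argsort(np.argsort(-ws))     # 0 = best
rank_cp = np.argsort(np.argsort(cp))      # 0 = best
print(rank_ws, rank_cp)
print(np.corrcoef(rank_ws, rank_cp)[0, 1])  # Spearman correlation of the two rankings
```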

8.
In multicriteria decision problems, many values must be assigned, such as the importance of the different criteria and the values of the alternatives with respect to subjective criteria. Since these assignments are approximate, it is very important to analyze the sensitivity of the results when small modifications of the assignments are made. When solving a multicriteria decision problem, it is desirable to choose a decision function that leads to a solution that is as stable as possible. We propose here a method based on genetic programming that produces better decision functions than the commonly used ones. The theoretical expectations are validated by case studies.

9.
This paper presents an extension of the comprehensive (overall) concordance index of ELECTRE methods which takes the interaction between criteria into account. In real-world decision-aiding situations, it is reasonable to consider interaction between only a small number of criterion pairs. Three types of interaction are considered: mutual strengthening, mutual weakening, and antagonistic. The new concordance index takes such interactions into account correctly by imposing boundary, monotonicity, and continuity conditions. We demonstrate that the generalized index is able to handle the three types of interaction, or dependency, between criteria satisfactorily, first using quasi-criteria and then using pseudo-criteria. We also examine the links between the new concordance index and the Choquet integral.
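The fragment below is only a simplified, assumed illustration of how a mutual-strengthening term can enter a comprehensive concordance index under quasi-criteria (partial concordances in {0, 1}); the paper's actual index is more general (pseudo-criteria, mutual weakening, antagonistic effects, and the boundary, monotonicity and continuity conditions) and its exact formula is not reproduced here.

```python
# Simplified, assumed illustration (not the paper's exact index): a comprehensive
# concordance index with one mutual-strengthening pair {criterion 1, criterion 2}
# of coefficient k12, active only when both criteria concord (Z = min).

def concordance(partial, weights, k12):
    """partial: per-criterion concordance c_i(a, b) in {0, 1} (quasi-criteria)."""
    num = sum(w * c for w, c in zip(weights, partial))
    bonus = k12 * min(partial[0], partial[1])   # strengthening counts only if both concord
    return (num + bonus) / (sum(weights) + k12)

weights = [0.4, 0.3, 0.3]
print(concordance([1, 1, 0], weights, k12=0.2))   # pair concordant: index is boosted
print(concordance([1, 0, 1], weights, k12=0.2))   # pair broken: no bonus
```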

10.
Multiple criteria sorting aims at assigning alternatives evaluated on several criteria to predefined ordered categories. In this paper, we consider a well-known multiple criteria sorting method, Electre Tri, which involves three types of preference parameters: (1) category limits defining the frontiers between consecutive categories, (2) weights and a majority level specifying which coalitions form a majority, and (3) veto thresholds characterizing discordance effects. We propose an elicitation procedure to infer category limits from assignment examples provided by multiple decision makers. The procedure computes a set of category limits and vetoes common to all decision makers, with variable weights for each decision maker. Hence, the method helps reach a consensus among decision makers on the category limits and veto thresholds, whereas finding a consensus on weights is left aside. The inference procedure is based on mixed integer linear programming and performs well even for datasets corresponding to real-world decision problems. We provide an illustrative example of the use of the method and analyze the performance of the proposed algorithms.
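For orientation only, here is a stripped-down, assumed sketch of a pessimistic Electre-Tri-style assignment with crisp concordance and no veto, showing how category limits, weights, and the majority level interact; the profiles, weights, and cutting level are invented.

```python
# Invented numbers: a stripped-down pessimistic Electre-Tri-style assignment with
# crisp concordance and no veto, to show how category limits, weights, and the
# majority (cutting) level interact.
import numpy as np

profiles = np.array([[0.3, 0.3],   # lower limit of category C2
                     [0.6, 0.6]])  # lower limit of category C3
weights = np.array([0.6, 0.4])
lam = 0.7                          # majority level

def assign(a):
    cat = 0                                  # start in the worst category (C1)
    for k, b in enumerate(profiles):
        concordance = weights[a >= b].sum()  # weight of criteria supporting "a outranks b"
        if concordance >= lam:
            cat = k + 1                      # a outranks this limit: move up one category
    return cat

print(assign(np.array([0.7, 0.5])))          # -> 1, i.e. category C2 in this toy setup
```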

11.
12.
We establish the splitting of homoclinic orbits for a near-integrable lattice modified KdV (mKdV) equation with periodic boundary conditions. We use the Bäcklund transformation to construct homoclinic orbits of the lattice mKdV equation. We build the Melnikov function with the gradient of the invariant defined through the discrete Floquet discriminant evaluated at critical points. The criteria for the persistence of homoclinic solutions of the perturbed lattice mKdV equation are established.

13.
Case-based preference elicitation methods for multiple criteria sorting problems have the advantage of posing rather small cognitive demands on a decision maker, but they may lead to ambiguous results when preference parameters are not uniquely determined. We use a simulation approach to determine the extent of this problem and to study the impact of additional case information on the quality of results. Our experiments compare two decision analysis tools, case-based distance sorting and the simple additive weighting method, in terms of the effects of additional case information on sorting performance, depending on problem dimension – number of groups, number of criteria, etc. Our results confirm the expected benefit of additional case information on the precision of estimates of the decision maker’s preferences. Problem dimension, however, has some unexpected effects.

14.
In a paper by Chang [D.Y. Chang, Applications of the extent analysis method on fuzzy AHP, European Journal of Operational Research 95 (1996) 649–655], an extent analysis method on fuzzy AHP was proposed to obtain a crisp priority vector from a triangular fuzzy comparison matrix. It is found that the extent analysis method cannot estimate the true weights from a fuzzy comparison matrix and has led to quite a number of misapplications in the literature. In this paper, we show by examples that the priority vectors determined by the extent analysis method do not represent the relative importance of decision criteria or alternatives, and that misapplying the extent analysis method to fuzzy AHP problems may lead to a wrong decision being made and to useful decision information, such as decision criteria and fuzzy comparison matrices, being disregarded. We highlight these problems to help avoid possible misapplications in the future.
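For context, the sketch below implements Chang's degree-of-possibility comparison for triangular fuzzy numbers, the core step of the criticized extent analysis method; the synthetic extents are made up, and the example shows how a criterion can end up with a zero "weight", which is exactly why such weights need not reflect relative importance.

```python
# Chang's degree-of-possibility comparison for triangular fuzzy numbers (l, m, u),
# the core step of the criticized extent analysis method.  The synthetic extents
# below are invented; note how one of them receives a possibility degree of 0.

def possibility_geq(M2, M1):
    """V(M2 >= M1) for triangular fuzzy numbers M = (l, m, u)."""
    l1, m1, u1 = M1
    l2, m2, u2 = M2
    if m2 >= m1:
        return 1.0
    if l1 >= u2:
        return 0.0
    return (l1 - u2) / ((m2 - u2) - (m1 - l1))

S1, S2 = (0.2, 0.3, 0.5), (0.5, 0.6, 0.8)   # made-up synthetic extents
print(possibility_geq(S1, S2))               # 0.0 -> criterion 1 would get zero weight
print(possibility_geq(S2, S1))               # 1.0
```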

15.
When the information about uncertainty cannot be quantified in a simple, probabilistic way, possibilistic decision theory is often a natural framework to consider. Its development has led to the proposal of a series of possibilistic criteria, namely: optimistic and pessimistic possibilistic qualitative criteria [7], possibilistic likely dominance [2], [9], binary possibilistic utility [11] and possibilistic Choquet integrals [24]. This paper focuses on sequential decision making in possibilistic decision trees. It proposes a theoretical study of the complexity of finding an optimal strategy depending on the monotonicity property of the optimization criteria: when the criterion is transitive, this property allows the problem to be solved in polynomial time by Dynamic Programming. We show that most possibilistic decision criteria, with the exception of possibilistic Choquet integrals, satisfy monotonicity and that the corresponding optimization problems can be solved in polynomial time by Dynamic Programming. For the possibilistic likely dominance criterion, which is quasi-transitive but not fully transitive, we propose an extended version of Dynamic Programming that remains polynomial in the size of the decision tree. We also show that for the particular case of possibilistic Choquet integrals, the problem of finding an optimal strategy is NP-hard; it can be solved by a Branch and Bound algorithm. Experiments show that, even when not necessarily optimal, the strategies built by Dynamic Programming are generally very good.
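To make the Dynamic Programming idea concrete, here is a toy backward-induction sketch with the pessimistic qualitative possibilistic criterion on a single decision node; the tree, possibility degrees, and utilities are invented, and the qualitative scale is taken to be [0, 1].

```python
# Toy backward-induction (Dynamic Programming) sketch with the pessimistic
# qualitative possibilistic criterion: a chance node is worth min_i max(1 - pi_i, u_i),
# a decision node is worth the best of its actions.  Tree, possibility degrees, and
# utilities are invented; everything lives on a qualitative [0, 1] scale.

def chance_value_pessimistic(branches):
    """branches: list of (possibility degree pi, value u of the subtree)."""
    return min(max(1 - pi, u) for pi, u in branches)

def best_action(actions):
    """actions: dict mapping an action label to the value of its chance node."""
    label = max(actions, key=actions.get)
    return label, actions[label]

actions = {
    "a1": chance_value_pessimistic([(1.0, 0.4), (0.3, 0.9)]),   # -> 0.4
    "a2": chance_value_pessimistic([(0.5, 0.7), (1.0, 0.8)]),   # -> 0.7
}
print(best_action(actions))   # ('a2', 0.7)
```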

16.
Evaluating the economic attractiveness of large projects often requires the development of large and complex financial models. Model complexity can prevent management from obtaining crucial information, with the risk of a suboptimal exploitation of the modelling effort. We propose a methodology based on the so-called “differential importance measure” (D) to enhance the managerial insights obtained from financial models. We illustrate the methodology by applying it to a project finance case study. We show that the additivity property of D grants analysts and managers full flexibility in combining parameters into any group and at the desired aggregation level. We analyze investment criteria related to both the investors’ and the lenders’ perspectives. Results indicate that exogenous factors affect investors (sponsors and lenders) in different ways, whether the exogenous variables are considered individually or in groups.
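A small numerical sketch of the differential importance measure and its additivity property, using a made-up stand-in for a financial model rather than the case study's model.

```python
# Numerical sketch of the differential importance measure D and its additivity:
# D_i = (df/dx_i * dx_i) / sum_j (df/dx_j * dx_j), so the importance of a group of
# parameters is the sum of the individual D_i.  The model f and all numbers are a
# made-up stand-in, not the case study's financial model.
import numpy as np

def f(x):                                  # hypothetical model output (e.g. an NPV proxy)
    price, volume, cost = x
    return price * volume - cost

x0 = np.array([10.0, 1_000.0, 4_000.0])    # base-case parameter values
dx = 0.01 * x0                             # proportional parameter changes

grad = np.array([x0[1], x0[0], -1.0])      # analytical gradient of f at x0
contrib = grad * dx                        # first-order contribution of each parameter
D = contrib / contrib.sum()

print(D)                                   # individual importances; they sum to 1
print(D[[0, 1]].sum())                     # group {price, volume} = D_price + D_volume
print(f(x0 + dx) - f(x0), contrib.sum())   # exact vs first-order change in the output
```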

17.
18.
We discuss the problem of estimating the number of principal components in principal components analysis (PCA). Despite the importance of the problem and the multitude of solutions proposed in the literature, it comes as a surprise that there does not exist a coherent asymptotic framework that would justify different approaches depending on the actual size of the dataset. In this article, we address this issue by presenting an approximate Bayesian approach based on Laplace approximation and introducing a general method of developing criteria for model selection, called PEnalized SEmi-integrated Likelihood (PESEL). Our general framework encompasses a variety of existing approaches based on probabilistic models, like the Bayesian Information Criterion for Probabilistic PCA (PPCA), and enables the construction of new criteria, depending on the size of the dataset at hand and additional prior information. Specifically, we apply PESEL to derive two new criteria for datasets where the number of variables substantially exceeds the number of observations, which is out of the scope of currently existing approaches. We also report results of extensive simulation studies and real data analysis, which illustrate the desirable properties of our proposed criteria as compared to state-of-the-art methods and very recent proposals. In particular, these simulations show that PESEL-based criteria can be quite robust against deviations from the assumptions of a probabilistic model. Selected PESEL-based criteria for the estimation of the number of principal components are implemented in the R package pesel, which is available on GitHub (https://github.com/psobczyk/pesel). Supplementary material for this article, with additional simulation results, is available online. The code to reproduce all simulations is available at https://github.com/psobczyk/pesel_simulations.

19.
The purpose of this paper is to learn the order of criteria in lexicographic decision making under various reasonable assumptions. We give a sample evaluation and an oracle-based algorithm. In the worst-case analysis we deal with adversarial models. We show that if the distances of the samples are less than 4, then the order is not learnable, but samples at distance 4 are polynomially learnable.
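Independently of the learning question, here is a minimal sketch of how a learned criteria order is used to compare alternatives lexicographically; the criteria names, values, and order are invented.

```python
# Invented example: once a criteria order is known, two alternatives are compared
# lexicographically, i.e. the first criterion on which they differ decides
# (higher value assumed better on every criterion).

def lex_prefers(a, b, order):
    """True if alternative a is lexicographically preferred to b under the given order."""
    for c in order:
        if a[c] != b[c]:
            return a[c] > b[c]
    return False   # identical on every criterion

a = {"c1": 3, "c2": 2, "c3": 5}
b = {"c1": 3, "c2": 4, "c3": 1}
print(lex_prefers(a, b, ["c1", "c2", "c3"]))   # False: tie on c1, b wins on c2
```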

20.
In this paper we study bargaining models where the agents consider several criteria to evaluate the results of the negotiation process. We propose a new solution concept for multicriteria bargaining games based on the distance to a utopian minimum level vector. This solution is a particular case of the class of generalized leximin solutions and can be characterized as the solution of a finite sequence of minimax programming problems.
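As a rough illustration of the leximin idea only (the paper obtains its solution through a sequence of minimax programs over a continuous feasible set), the sketch below selects the leximin-best vector from a small, invented finite set of candidate agreements.

```python
# Invented finite set of candidate agreements (utility vectors of the criteria).
# Leximin selection: sort each vector in nondecreasing order and compare
# lexicographically, so the worst-off component is improved first, then the next.
# The paper instead characterizes its solution via a sequence of minimax programs.

candidates = {
    "x": (0.4, 0.9, 0.5),
    "y": (0.5, 0.5, 0.8),
    "z": (0.5, 0.6, 0.6),
}

leximin_best = max(candidates, key=lambda k: tuple(sorted(candidates[k])))
print(leximin_best)   # 'z': sorted profiles (0.4,0.5,0.9) < (0.5,0.5,0.8) < (0.5,0.6,0.6)
```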
