Similar Documents
20 similar documents found (search time: 31 ms)
1.
Linear stochastic programming provides a flexible toolbox for analyzing real-life decision situations, but it can become computationally cumbersome when recourse decisions are involved. The latter are usually modeled as decision rules, i.e., functions of the uncertain problem data. It has recently been argued that stochastic programs can quite generally be made tractable by restricting the space of decision rules to those that exhibit a linear data dependence. In this paper, we propose an efficient method to estimate the approximation error introduced by this rather drastic means of complexity reduction: we apply the linear decision rule restriction not only to the primal but also to a dual version of the stochastic program. By employing techniques that are commonly used in modern robust optimization, we show that both resulting approximate problems are equivalent to tractable linear or semidefinite programs of moderate sizes. The gap between their optimal values estimates the loss of optimality incurred by the linear decision rule approximation. Our method remains applicable if the stochastic program has random recourse and multiple decision stages. It also extends to cases involving ambiguous probability distributions.

2.
For statistical decision problems, there are two well-known methods of randomization: on the one hand, randomization by means of mixtures of nonrandomized decision functions (randomized decision rules) in the game “statistician against nature,” on the other hand, randomization by means of randomized decision functions. In this paper, we consider the problem of risk-equivalence of these two procedures, i.e., imposing fairly general conditions on a nonsequential decision problem, it is shown that to each randomized decision rule, there is a randomized decision function with uniformly the same risk, and vice versa. The crucial argument is based on rewriting risk-equivalence in terms of Choquet's integral representation theorem. It is shown, in addition, that for certain special cases that do not fulfill the assumptions of the Main Theorem, risk-equivalence holds at least partially.

3.
The pricing and selling of industrial products are considered in sections on decision theory, bidding and personal selling. It is suggested that the marketing of consumer durables and semi-durables can be studied by means of models which use aggregate data but that consumer panel data are needed for modelling consumer expendable markets. Finally, some problems which might be suboptimized are indicated.

4.
The quadratic discriminant function is often used to separate two classes of points in a multidimensional space. When the two classes are normally distributed, this results in the optimum separation. In some cases, however, the assumption of normality is a poor one and the classification error is increased. The current paper derives an upper bound on the classification error due to a quadratic decision surface. The bound is strict when the class means, covariances and the quadratic discriminant surface satisfy certain specified symmetry conditions.
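The discriminant rule above is easy to illustrate in one dimension. The following is a minimal sketch, not the paper's construction: two normal classes with different variances, each point assigned to the class with the largest quadratic score (all class parameters are invented for illustration).

```python
import math

def qdf_score(x, mu, var, prior):
    """Log of the 1-D normal class density at x, plus the log prior."""
    return -0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var) + math.log(prior)

def classify(x, params):
    """Assign x to the class with the largest quadratic score; with unequal
    variances the implied decision boundary is quadratic in x."""
    scores = [qdf_score(x, mu, var, prior) for (mu, var, prior) in params]
    return max(range(len(scores)), key=scores.__getitem__)

# Hypothetical class parameters: (mean, variance, prior probability).
classes = [(0.0, 1.0, 0.5), (4.0, 2.0, 0.5)]
```

When the true class densities are not normal, this rule is still applicable but no longer optimal, which is the error the paper's bound quantifies.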

5.
Utility or value functions play an important role as preference models in multiple-criteria decision making. We investigate the relationships between these models and the decision-rule preference model obtained from the Dominance-based Rough Set Approach. The relationships are established by means of special “cancellation properties” used in conjoint measurement as axioms for the representation of aggregation procedures. We consider a general utility function and three of its important special cases: the associative operator, the Sugeno integral and the ordered weighted maximum. For each of these aggregation functions we give a representation theorem establishing the equivalence between a very weak cancellation property, the specific utility function and a set of rough-set decision rules. Each result is illustrated by a simple example of multiple-criteria decision making. The results show that the decision rule model we propose has clear advantages over a general utility function and its particular cases.

6.
In this paper the problem of verifying the Pareto-optimality of a given solution to a dynamic multiple-criterion decision (DMCD) problem is investigated. For this purpose, some new conditions are derived for Pareto-optimality of DMCD problems. In the literature, Pareto-optimality is characterized by means of Euler-Lagrange differential equations. There exist problems in production and inventory control to which these conditions cannot be applied directly (Song 1997). Thus, it is necessary to explore new conditions for Pareto-optimality of DMCD problems. With some mild assumptions on the objective functionals, we develop necessary and/or sufficient conditions for Pareto-optimality in the spirit of optimization theory. Both linear and non-linear cases are considered.

7.
This paper discusses multiple criteria models of decision analysis with finite sets of alternatives. A weighted sum of criteria is used to evaluate the performance of alternatives. Information about the weights is assumed to be in the form of arbitrary linear constraints. Conditions for checking dominance and potential optimality of decision alternatives are presented. In the case of testing potential optimality, the proposed approach leads to the consideration of a pair of mutually dual linear programming problems. The analysis of these problems gives valuable information for the decision maker. In particular, if a decision alternative is not potentially optimal, then a mixed alternative dominating it is defined by a solution to one of the LP problems. This statement generalizes similar results known for some special cases. The interpretation of the mixed alternative is discussed and compared to its analogue in a data envelopment analysis context.
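The dominance check described above can be sketched for the simplest case of two criteria with an interval constraint on the first weight (a hypothetical special case, not the paper's general LP formulation). Because the weighted difference is linear in the free weight, its minimum over the interval is attained at an endpoint, so no LP solver is needed:

```python
def dominates(a, b, w1_lo, w1_hi):
    """True if alternative a weakly dominates b for every weight vector
    (w1, 1 - w1) with w1 in [w1_lo, w1_hi]. The weighted difference is
    linear in w1, so checking the interval endpoints suffices."""
    def diff(w1):
        return w1 * (a[0] - b[0]) + (1 - w1) * (a[1] - b[1])
    return min(diff(w1_lo), diff(w1_hi)) >= 0
```

With arbitrary linear constraints on many weights, the same question becomes a linear program (minimize the weighted difference over the weight polytope), which is where the pair of dual LPs enters.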

8.
The original rough set approach proved to be very useful in dealing with inconsistency problems following from information granulation. It operates on a data table composed of a set U of objects (actions) described by a set Q of attributes. Its basic notions are: the indiscernibility relation on U, lower and upper approximations of either a subset or a partition of U, dependence and reduction of attributes from Q, and decision rules derived from lower approximations and boundaries of subsets identified with decision classes. The original rough set idea fails, however, when preference orders of attribute domains (criteria) are to be taken into account. Precisely, it cannot handle inconsistencies following from violation of the dominance principle. This inconsistency is characteristic of preferential information used in multicriteria decision analysis (MCDA) problems, like sorting, choice or ranking. In order to deal with this kind of inconsistency, a number of methodological changes to the original rough set theory are necessary. The main change is the substitution of the indiscernibility relation by a dominance relation, which permits the approximation of ordered sets in multicriteria sorting. To approximate preference relations in multicriteria choice and ranking problems, another change is necessary: substitution of the data table by a pairwise comparison table, where each row corresponds to a pair of objects described by binary relations on particular criteria. In all those MCDA problems, the new rough set approach ends with a set of decision rules playing the role of a comprehensive preference model. It is more general than the classical functional or relational model and it is more understandable for the users because of its natural syntax. In order to work out a recommendation in one of the MCDA problems, we propose exploitation procedures for the set of decision rules.
Finally, some other recently obtained results are given: rough approximations by means of similarity relations, rough set handling of missing data, comparison of the rough set model with Sugeno and Choquet integrals, and results on equivalence of a decision rule preference model and a conjoint measurement model which is neither additive nor transitive.

9.
The cost of obtaining good information regarding the various probability distributions needed for the solution of most stochastic decision problems is considerable. It is important to consider questions such as: (1) what minimal amounts of information are sufficient to determine optimal decision rules; (2) what is the value of obtaining knowledge of the actual realization of the random vectors; and (3) what is the value of obtaining some partial information regarding the actual realization of the random vectors. This paper is primarily concerned with questions two and three when the decision maker has an a priori knowledge of the joint distribution function of the random variables. Some remarks are made regarding results along the lines of question one. Mention is made of assumptions sufficient so that knowledge of means, or of means, variances, co-variances and n-moments are sufficient for the calculation of optimal decision rules. The analysis of the second question leads to the development of bounds on the value of perfect information. For multiperiod problems it is important to consider when the perfect information is available. Jensen's inequality is the key tool of the analysis. The calculation of the bounds requires the solution of nonlinear programs and the numerical evaluation of certain functions. Generally speaking, tighter bounds may be obtained only at the expense of additional information and computational complexity. Hence, one may wish to compute some simple bounds to decide upon the advisability of obtaining more information. For the analysis of the value of partial information it is convenient to introduce the notion of a signal. Each signal represents the receipt of certain information, and these signals are drawn from a given probability distribution. When a signal is received, it alters the decision maker's perception of the probability distributions inherent in his decision problem. 
The choice between different information structures must then take into account these probability distributions as well as the decision maker's preference function. A hierarchy of bounds may be determined for partial information evaluation utilizing the tools of the multiperiod perfect information case. However, the calculation of these bounds is generally considerably more difficult than the calculation of similar bounds in the perfect information case. Most of the analysis is directed towards problems in which the decision maker has a linear utility function over profits, costs or some other numerical variable. However, some of the bounds generalize to the case when the utility function is strictly increasing and concave.
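The value of perfect information discussed above can be computed exactly for a small discrete example: compare the best expected profit chosen before the uncertainty resolves with the expected profit when the realized scenario is known in advance. This is a minimal sketch with an invented profit table, not data from the paper:

```python
# Hypothetical profit table: profits[d][s] is the profit of decision d in scenario s.
profits = [[10, 10, 10],   # safe decision
           [25, 5, -5]]    # risky decision
probs = [0.3, 0.4, 0.3]    # scenario probabilities

def expected(values):
    return sum(p * v for p, v in zip(probs, values))

# Best expected profit when deciding before the uncertainty resolves.
v_no_info = max(expected(row) for row in profits)
# Expected profit when the scenario is known before deciding (perfect information).
v_perfect = sum(p * max(row[s] for row in profits) for s, p in enumerate(probs))
evpi = v_perfect - v_no_info   # nonnegative, since max of expectations <= expectation of max
```

The inequality behind `evpi >= 0` is exactly the Jensen-type argument the abstract refers to: the expectation of a maximum dominates the maximum of expectations.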

10.
Inventory control is a typical problem of decision making. In this paper, periodic replenishment of a stock of a single kind of spare part is discussed for some cases in which the demand rate is uncertain. The first decision, before all others in the sequence, is made by assuming an a priori distribution of the demand rate. In time, as the demand process goes on, the parameters of the a priori distribution are corrected according to the accumulated knowledge about past demand. This Bayesian approach to decision making based on learning about the uncertain demand rate is known for the case when the demand rate is unknown but constant. It is shown that the same approach can be used in some cases when the demand rate is unknown and not constant. Results are given and used for inventory control.
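For the constant-but-unknown-rate case mentioned above, the learning step is typically a conjugate Bayesian update. A minimal sketch, assuming Poisson per-period demand with a Gamma prior on the rate (the specific distributions and prior parameters are assumptions, not taken from the paper):

```python
# Hypothetical Gamma(alpha, beta) prior on a Poisson demand rate.
alpha, beta = 2.0, 1.0               # prior mean = alpha / beta = 2 units per period

def update(alpha, beta, demands):
    """Conjugate Bayesian update after observing one demand count per period:
    add the total observed demand to alpha and the number of periods to beta."""
    return alpha + sum(demands), beta + len(demands)

alpha, beta = update(alpha, beta, [3, 1, 4, 2])
posterior_mean = alpha / beta        # updated estimate of the demand rate
```

Each replenishment decision would then use the current posterior rather than the original prior, which is the "correction of parameters" the abstract describes.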

11.
In this paper we consider the insurance of assets with experience rating in the framework of a discrete-time dynamic decision model. The goal of the risk-averse agent (insurance buyer) is to maximize the expected utility of wealth at the finite planning horizon. First, it has to be decided whether a contract should be bought or not. The contract gives the possibility to choose in each period between three alternatives: to buy insurance and file a claim, to buy insurance and not file a claim, or to suspend insurance. For this case, structural properties of the optimal strategy are obtained by means of dynamic programming. In particular, we present a condition on the experience rating scheme such that the decision to suspend insurance is irreversible during the planning period (stopping rule). In the final section we present some numerical experiments. Among other things, it is shown that the optimal decision functions are generally not monotone with respect to the agent's claim history.

12.
Empirical Bayes tests for variance components in random effects models   (cited 4 times: 0 self-citations, 4 by others)
We give the decision function for the Bayes test of variance components in a two-way classification random effects model and, using kernel estimation, construct the decision function of the corresponding empirical Bayes (EB) test. Under suitable conditions, the EB decision function is shown to be asymptotically optimal with a convergence rate. Special cases and extensions of the model are given. Finally, an example satisfying the conditions of the theorem is presented.

13.
The unrestrained expansion of urbanization and the increasing development of new means of transport result in urban land use and transportation systems that are socially, economically and environmentally unsustainable. Hence the major challenge for decision makers regarding transportation policy is to choose the alternative-fuel vehicles that result in a sustainable transportation system. In real-life situations it is difficult to get exact data, so intuitionistic fuzzy data are used to express the uncertainty. The problem is to select the best fuel technology for land transportation, subject to multiple criteria, resulting in a sustainable transportation system in an uncertain environment. Here, similarity measures of intuitionistic fuzzy sets (IFSs) are applied to develop a methodology for identifying the best option. The weights of the attributes may be known, partially known or unknown. The unknown weights are determined by normalizing the average score functions of the intuitionistic fuzzy data for the criteria. Algorithms are given for handling the different situations, and numerical examples illustrate the various cases.
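As a sketch of the similarity-measure machinery above: one common Hamming-type similarity for IFSs compares membership and non-membership degrees pairwise. The specific measure used in the paper is not stated here, so this formula, and the ratings below, are illustrative assumptions:

```python
def ifs_similarity(A, B):
    """Hamming-type similarity between two intuitionistic fuzzy sets given as
    lists of (membership, non-membership) pairs over the same universe."""
    n = len(A)
    dist = sum(abs(ma - mb) + abs(na - nb) for (ma, na), (mb, nb) in zip(A, B))
    return 1 - dist / (2 * n)

# Hypothetical ratings of two fuel options on two criteria.
A = [(0.7, 0.2), (0.5, 0.4)]
B = [(0.6, 0.3), (0.5, 0.4)]
```

In the selection methodology, each alternative would be scored by its similarity to an ideal alternative, with the criteria contributions weighted as described in the abstract.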

14.
Discount utility, based on utility theory, is used to study human decision behavior under time preference. It assumes that, by means of an axiomatic system of rationality, human utilities for intertemporal decision making can be quantified by explicable models. Recent studies have been based on two basic models: the exponential and the hyperbolic discount models. These two types of model have been shown either to discount too quickly or to be too restrictive to fit human discounting behavior. In this study, a power law discount model is proposed. An axiomatic approach is used to establish the existence of the power law discount utility, and empirical investigations are carried out to verify the effectiveness of the proposed model.
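The three discount families contrasted above can be sketched side by side. The exact parameterizations, in particular the `(1 + t) ** -r` form for the power law, are assumptions for illustration; the paper's axiomatized form may differ:

```python
def exponential(t, delta=0.9):
    """Exponential discounting: constant per-period discount factor delta."""
    return delta ** t

def hyperbolic(t, k=0.5):
    """Hyperbolic discounting with rate parameter k."""
    return 1 / (1 + k * t)

def power_law(t, r=0.5):
    """Assumed power-law form (1 + t) ** -r; the paper's exact
    parameterization is not given here."""
    return (1 + t) ** (-r)
```

All three equal 1 at t = 0 and decrease in t; the exponential model falls off fastest at long horizons, which is the "too fast" behavior the abstract mentions.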

15.
Cluster analysis is an important data mining technique and an unsupervised learning method that groups data according to their degree of similarity. The competitive decision algorithm is a new optimization algorithm based on the ideas that competition drives optimization and that decisions shape outcomes. Taking the characteristics of cluster analysis into account, we design a competitive decision algorithm to solve the clustering problem. Experimental tests and comparisons with the results of other algorithms show that good results are obtained.

16.
Soft set theory, originally proposed by Molodtsov, can be used as a general mathematical tool for dealing with uncertainty. The interval-valued intuitionistic fuzzy soft set is a combination of an interval-valued intuitionistic fuzzy set and a soft set. The aim of this paper is to investigate decision making based on interval-valued intuitionistic fuzzy soft sets. By means of level soft sets, we develop an adjustable approach to decision making based on interval-valued intuitionistic fuzzy soft sets, and some numerical examples are provided to illustrate the developed approach. Furthermore, we also define the concept of the weighted interval-valued intuitionistic fuzzy soft set and apply it to decision making.
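A minimal sketch of the level-soft-set idea above: a pair of thresholds (s, t) turns each interval-valued intuitionistic fuzzy entry into 0 or 1, and the row sums of the resulting binary table rank the objects. The thresholding rule, data and thresholds below are all illustrative assumptions, not the paper's definitions:

```python
def level_soft_set(table, s, t):
    """Binary table from interval-valued intuitionistic fuzzy data. Each entry
    is ((mu_lo, mu_hi), (nu_lo, nu_hi)); it passes the (s, t)-level if the
    lower membership bound reaches s and the upper non-membership bound
    stays within t."""
    return [[1 if mu[0] >= s and nu[1] <= t else 0 for (mu, nu) in row]
            for row in table]

def choice_values(level):
    """Row sums of the level soft set; the object with the largest value wins."""
    return [sum(row) for row in level]

# Two hypothetical objects evaluated on two parameters.
table = [
    [((0.6, 0.8), (0.1, 0.2)), ((0.5, 0.7), (0.2, 0.3))],
    [((0.3, 0.5), (0.3, 0.4)), ((0.7, 0.9), (0.0, 0.1))],
]
```

The "adjustable" aspect is that different (s, t) choices yield different level soft sets, and hence possibly different recommended objects.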

17.
Quantitative decision support for personnel planning is often restricted to either rostering or staffing. Some approaches exist in which aspects at the staffing level and the rostering level are treated sequentially. Obviously, such practice risks producing suboptimal solutions at both decision levels. These arguments justify an integrated approach to improving the overall quality of personnel planning. This contribution introduces (1) the roster quality staffing problem and (2) a three-step methodology that enables assessing the appropriateness of a personnel structure for achieving high-quality rosters, while relying on an existing rostering algorithm. Based on the rostering assessment result, specific modifications to the personnel structure can be suggested at the staffing level. The approach is demonstrated by means of two different hospital cases, both of which are subject to complex rostering constraints. Experimental results show that the three-step methodology indeed points out alternative personnel structures that better comply with the rostering requirements. The roster analysis approach and the corresponding staffing recommendations integrate personnel planning needs at the operational and tactical levels.

18.
Professionals in neuropsychology usually express diagnoses of patients' behaviour in a verbal rather than a numerical form. This fact generates interest in decision support systems that process verbal data, and it motivates us to develop methods for the classification of such data. In this paper, we describe ways of aiding the classification of a discrete set of objects, evaluated on a set of criteria that may have verbal estimates, into ordered decision classes. In some situations there is no explicit additional information available, while in others it is possible to order the criteria lexicographically; we consider both cases. The proposed Dichotomic Classification (DC) method is based on the principles of Verbal Decision Analysis (VDA). VDA methods are especially helpful when verbal data, in criteria values, are to be handled. Compared to previously developed VDA classification methods, the Dichotomic Classification method performs better on the same data sets and is able to cope with larger object sets to be classified. We present an interactive classification procedure, estimate the effectiveness and computational complexity of the new method, and compare it to one of the previously developed VDA methods. The methods developed and studied are implemented in the framework of a decision support system, and the results of testing on artificial data sets are reported.

19.
This paper presents a procedure for performance benchmarking that extends the performance measurement technique Data Envelopment Analysis (DEA) to incorporate the interactive decision procedure Interactive Multiple Goal Programming (IMGP). The resulting procedure is called Interactive Data Envelopment Analysis (IDEA). It is a decision support tool that helps decision makers select performance benchmarks that are both feasible and desirable, and identify benchmark partners that may be helpful in uncovering ways of achieving the selected performance standards. The IDEA concepts and characteristics are illustrated by means of an example IDEA assessment, using previously reported operating performance data of UK university departments.
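The DEA component above can be sketched for the degenerate single-input, single-output case, where efficiency scores need no linear programming: each unit's output/input ratio is compared with the best observed ratio. The data are invented for illustration; the general multi-factor DEA model requires solving one LP per unit:

```python
def dea_efficiency(inputs, outputs):
    """CCR-style efficiency for one input and one output: each unit's
    output/input ratio relative to the best ratio observed. A score of 1.0
    marks a unit on the efficient frontier (a potential benchmark partner)."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Three hypothetical units (e.g., departments): input spent vs. output produced.
scores = dea_efficiency([2.0, 4.0, 5.0], [4.0, 6.0, 10.0])
```

In IDEA, the interactive IMGP layer would then let the decision maker steer which efficient units serve as benchmarks, instead of taking the frontier as given.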

20.
Metric rationalization of social decision rules has been intensively investigated when the social outcome is a nonempty subset of alternatives. The present paper proposes a similar framework for social welfare functions (SWFs), that is, when each social outcome is a ranking of alternatives. A metric rationalizable SWF reports, as an approximation of the unanimity consensus, the relative ranking of any pair of alternatives as in the closest profile where individuals all agree on those alternatives, closeness being measured with respect to a metric on profiles. Two notions of unanimity are considered: top unanimity on an alternative holds when individuals all agree that it is top ranked, while pairwise unanimity on a pair occurs when individuals all prefer one alternative to the other. Without strong requirements on metrics, the characterizations provided in both cases show that metric rationalizations of SWFs are essentially equivalent to the Pareto principle for SWFs. Furthermore, two interesting classes of metric rationalizable SWFs, multi-valued scoring SWFs and pairwise scoring SWFs, are each uniquely identified by means of appropriate and appealing properties of metrics, among which decomposability, neutrality and monotonicity are known.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号