Similar Documents
Found 20 similar documents (search time: 31 ms)
1.
闫英  锁斌  甘蜜 《运筹与管理》2019,28(8):41-47
Because of the complexity of evaluation problems and the limits of expert knowledge and experience, group decision information often contains several kinds of uncertainty at once. Within the framework of interval evidence theory, mixed uncertain information (epistemic uncertainty, multi-interval probability information and incomplete information) is given a unified representation, and the group decision information is then fused according to the similarity between pieces of interval evidence. Interval numbers are used to quantify the linguistic rating grades, and probability distribution functions for the importance of each attribute are constructed from the fusion result. Finally, the attribute weights are determined by Monte Carlo random sampling from these probability distribution functions. A numerical example demonstrates the effectiveness of the new method.
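A minimal sketch of the method's final step may help. The example below assumes a hypothetical fusion result (the rating grades and per-attribute distributions are invented placeholders, not the paper's data) and derives attribute weights by Monte Carlo sampling, as the abstract describes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fusion result: for each attribute, a discrete probability
# distribution over quantified rating grades (e.g. interval midpoints).
grades = np.array([0.2, 0.4, 0.6, 0.8, 1.0])          # quantified rating levels
pmf = np.array([[0.1, 0.2, 0.4, 0.2, 0.1],            # attribute 1
                [0.0, 0.1, 0.3, 0.4, 0.2],            # attribute 2
                [0.3, 0.3, 0.2, 0.1, 0.1]])           # attribute 3

def sample_weights(n_draws=100_000):
    # Draw an importance score for every attribute, then normalise each
    # score vector so every draw yields a valid weight vector.
    scores = np.stack([rng.choice(grades, size=n_draws, p=row) for row in pmf])
    weights = scores / scores.sum(axis=0)
    return weights.mean(axis=1)          # Monte Carlo estimate of the weights

w = sample_weights()
print(w, w.sum())                        # weight vector summing to 1
```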

2.
The use of Monte Carlo simulation for evaluation of financial risk of an information technology project selection decision is described. A major Thai bank considered the opportunity to expand credit card operations through information technology (IT). Alternatives considered were in-house development and outsourcing. There were many strategic reasons for the initiative. However, there were also many risks associated with the proposal. A Monte Carlo simulation spreadsheet model was used to model risk parameters and to analyze key variables of financial performance. Key output variables were the number of cardholders expected, project net present value, net profit, and expected return on investment. The spreadsheet model made entry of model elements transparent, and Monte Carlo simulation provided clear visual display of the financial output variables. The bank used this information in its decision to outsource its credit card operations.
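The kind of spreadsheet risk model described here is easy to sketch in code. All figures below (uptake distribution, margin, investment, discount rate, horizon) are invented for illustration and do not come from the bank's case:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10_000                        # simulation trials
rate = 0.12                       # illustrative discount rate
years = np.arange(1, 6)           # 5-year evaluation horizon (assumed)

# Hypothetical risk parameters:
cardholders = rng.triangular(60_000, 100_000, 150_000, N)   # uncertain uptake
margin      = rng.normal(25.0, 5.0, N)                      # profit per card/yr
invest      = 8e6                                           # upfront IT cost

annual = cardholders * margin                               # annual net profit
npv = (annual[:, None] / (1 + rate) ** years).sum(axis=1) - invest

print(f"mean NPV  : {npv.mean():,.0f}")
print(f"P(NPV < 0): {(npv < 0).mean():.2%}")                # downside risk
```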

3.
This paper describes the prioritisation of an IT budget within a department of a local authority. The decision problem is cast as a simple multiattribute evaluation but from two perspectives. First, as an exercise in group decision making. Here the emphasis is on a shared process wherein the object is to obtain consensus. The use of an explicit evaluation framework and the ability to interact with the evaluation data in real time via a simple spreadsheet model were found to improve the decision making. Second, the prioritisation is made analytically. The motivation is to determine the degree to which the rankings are the result of the structural characteristics of the projects themselves rather than of the differences in importance attached to the achievement of the goals represented by the project attributes. Three methods are used: Monte Carlo simulation of ranks, cluster analysis based on attributes and an approach based on entropy maximisation. It is found that in the case studied the structure inherent in the data is high and so the results of the analyses are robust. Finally, a procedure is suggested for the appropriate use of these analyses via a facilitator to aid prioritisation decisions.
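The first of the three analytical methods, Monte Carlo simulation of ranks, can be sketched briefly. The project scores and the choice of Dirichlet weight sampling below are assumptions for illustration; a peaked row in the rank-frequency table signals structure inherent in the project data rather than in the weights:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical attribute scores (rows: IT projects, cols: goals), scaled 0-1.
scores = np.array([[0.9, 0.4, 0.7],
                   [0.5, 0.8, 0.6],
                   [0.3, 0.9, 0.2],
                   [0.7, 0.5, 0.9]])

n_proj = scores.shape[0]
rank_freq = np.zeros((n_proj, n_proj))

for _ in range(10_000):
    w = rng.dirichlet(np.ones(scores.shape[1]))     # random goal weights
    order = np.argsort(-scores @ w)                 # best-first ranking
    for rank, proj in enumerate(order):
        rank_freq[proj, rank] += 1

# Row i gives how often project i attains each rank.
print(rank_freq / 10_000)
```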

4.
The risks and uncertainties inherent in most enterprise resources planning (ERP) investment projects are vast. Decision making in multistage ERP projects investment is also complex, due mainly to the uncertainties involved and the various managerial and/or physical constraints to be enforced. This paper tackles the problem using a real-option analysis framework, and applies multistage stochastic integer programming in formulating an analytical model whose solution will yield optimum or near-optimum investment decisions for ERP projects. Traditionally, such decision problems were tackled using lattice simulation or finite difference methods to compute the value of simple real options. However, these approaches are incapable of dealing with the more complex compound real options, and their use is thus limited to simple real-option analysis. Multistage stochastic integer programming is particularly suitable for sequential decision making under uncertainty, and is used in this paper to find near-optimal strategies for complex decision problems. Compared with the traditional approaches, multistage stochastic integer programming is a much more powerful tool in evaluating such compound real options. This paper describes the proposed real-option analysis model and uses an example case study to demonstrate the effectiveness of the proposed approach.
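A drastically reduced, two-stage version of such a model can illustrate why the integer recourse captures option value. All probabilities and cash figures below are invented, and brute-force enumeration stands in for a proper stochastic integer programming solver:

```python
# Two-stage sketch: stage 1 decides whether to run an ERP pilot; stage 2,
# after demand is revealed, decides on full rollout (allowed only with pilot).
scenarios = {"high": 0.4, "low": 0.6}          # assumed demand probabilities
pilot_cost, rollout_cost = 2.0, 10.0           # invented costs (millions)
rollout_payoff = {"high": 18.0, "low": 7.0}    # invented payoffs (millions)

best = None
for pilot in (0, 1):                           # stage-1 integer decision
    value = -pilot_cost * pilot
    for s, p in scenarios.items():
        # Stage-2 integer recourse: roll out only if it pays in scenario s.
        recourse = max(0.0, (rollout_payoff[s] - rollout_cost) * pilot)
        value += p * recourse
    if best is None or value > best[1]:
        best = (pilot, value)

print(f"build pilot: {bool(best[0])}, expected value: {best[1]:.2f}")
```

The pilot has negative expected value if rollout were mandatory; it is the freedom to abandon in the low scenario, the compound-option structure, that makes it worth building.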

5.
Physicians use clinical guidelines to inform judgment about therapy. Clinical guidelines do not address three important uncertainties: (1) uncertain relevance of tested populations to the individual patient, (2) the patient’s uncertain preferences among possible outcomes, and (3) uncertain subjective and financial costs of intervention. Unreliable probabilistic information is available for some of these uncertainties; no probabilities are available for others. The uncertainties are in the values of parameters and in the shapes of functions. We explore the usefulness of info-gap decision theory in patient-physician decision making in managing cholesterol level using clinical guidelines. Info-gap models of uncertainty provide versatile tools for quantifying diverse uncertainties. Info-gap theory provides two decision functions for evaluating alternative therapies. The robustness function assesses the confidence—in light of uncertainties—in attaining acceptable outcomes. The opportuneness function assesses the potential for better-than-anticipated outcomes. Both functions assist in forming preferences among alternatives. Hypothetical case studies demonstrate that decisions using the guidelines and based on best estimates of the expected utility are sometimes, but not always, consistent with robustness and opportuneness analyses. The info-gap analysis provides guidance when judgment suggests that a deviation from the guidelines would be productive. Finally, analysis of uncertainty can help resolve ambiguous situations.
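A toy info-gap calculation can make the two decision functions concrete. The fractional-error uncertainty model and all clinical numbers below are hypothetical, not drawn from the guidelines studied:

```python
import numpy as np

# A therapy lowers cholesterol by an uncertain amount e; the best estimate is
# e0, and the info-gap model is U(h) = {e : |e - e0| <= h * e0}, horizon h unknown.
e0 = 40.0            # best-estimate reduction (mg/dL), invented
target = 25.0        # reduction judged clinically acceptable, invented

def robustness(e0, target, h_grid=np.linspace(0, 1, 1001)):
    # Largest h such that even the worst case in U(h) still meets the target.
    worst = e0 * (1 - h_grid)
    ok = worst >= target
    return h_grid[ok].max() if ok.any() else 0.0

def opportuneness(e0, windfall, h_grid=np.linspace(0, 1, 1001)):
    # Smallest h at which the best case in U(h) could reach a windfall outcome.
    best = e0 * (1 + h_grid)
    ok = best >= windfall
    return h_grid[ok].min() if ok.any() else np.inf

print("robustness h^ =", robustness(e0, target))        # ~0.375
print("opportuneness b^ =", opportuneness(e0, 50.0))    # ~0.25
```

A therapy with the larger robustness tolerates more estimation error before the outcome becomes unacceptable, which is how the two functions rank alternatives.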

6.
For large international companies with their own simulation team, it is often hard to make a decision related to selection of new discrete-event simulation software. This paper presents a comprehensive discrete-event simulation software selection methodology that has been successfully used for decision making at Accenture consulting company. Accenture already used a simulation tool at the start of the project, but wanted to find out whether the tool it was using was still the most appropriate one for its needs, and to evaluate the latest discrete-event simulation tools. The developed methodology consists of two phases: phase 1 quickly reduces the long list to a short list of packages, and phase 2 matches the requirements of the company with the features of the simulation package in detail. Successful application of the proposed methodology indicates its possible application for decision making in other large organisations, provided that the study is performed by a third party to avoid risks of influencing the outcome of the selection process.

7.
In this paper we deal with group decision-making problems where several decision makers elicit their own preferences separately. The decision makers’ preferences are quantified using a decision support system, which admits incomplete information concerning the decision makers’ responses to the questions they are asked. Consequently, each decision maker proposes classes of utility functions and attribute weight intervals for the different attributes. We introduce an approach based on Monte Carlo simulation techniques for aggregating decision maker preferences that could be the starting point for a negotiation process, if necessary. The negotiation process would basically involve the decision maker tightening the imprecise component utilities and weights to output more meaningful results and achieve a consensus alternative. We focus on how attribute weights and the component utilities associated with a consequence are randomly generated in the aggregation process taking into account the decision makers’ preferences, i.e., their respective attribute weight intervals and classes of utility functions. Finally, an application to the evaluation of intervention strategies for restoring a radionuclide contaminated lake illustrates the usefulness and flexibility of this iterative process.
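The random generation step the abstract focuses on can be sketched as follows. One decision maker, invented weight intervals, and a utility class bracketed by two assumed bounding functions are used; the real procedure would repeat this across decision makers and aggregate:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical elicited information: weight intervals per attribute, and a
# utility class bracketed by lower/upper member functions (assumed forms).
w_lo = np.array([0.2, 0.1, 0.3])
w_hi = np.array([0.5, 0.4, 0.6])
u_lo = lambda x: x ** 2          # most risk-averse member of the class
u_hi = lambda x: np.sqrt(x)      # least risk-averse member of the class

x = np.array([[0.9, 0.3, 0.6],   # consequences of alternative A per attribute
              [0.5, 0.7, 0.8]])  # consequences of alternative B

def sample_values(n=20_000):
    vals = np.zeros((n, len(x)))
    for i in range(n):
        w = rng.uniform(w_lo, w_hi)
        w /= w.sum()                          # normalised random weights
        t = rng.uniform()                     # random member of the class
        u = t * u_lo(x) + (1 - t) * u_hi(x)   # convex combination of bounds
        vals[i] = u @ w
    return vals

v = sample_values()
print("mean values :", v.mean(axis=0))        # aggregate value per alternative
print("P(A beats B):", (v[:, 0] > v[:, 1]).mean())
```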

8.
There are three traditional methods for estimating VaR: the variance-covariance method, historical simulation, and Monte Carlo simulation. The literature generally credits the Monte Carlo method with many advantages for measuring VaR. Our empirical tests find, however, that the VaR estimated by the traditional Monte Carlo method is biased low, and its backtesting performance is poor. This paper introduces Copula functions to improve the traditional Monte Carlo method. A Copula function links the individual marginal distributions to the multivariate joint distribution; it can handle non-normal marginals, and the dependence it captures is not restricted to linear correlation. Empirical tests show that the Copula-based Monte Carlo method measures portfolio VaR more accurately.
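A compact sketch of the Copula-based approach: a Gaussian copula joins heavy-tailed Student-t marginals, and portfolio VaR is read off the simulated loss distribution. The correlation, degrees of freedom and scales are invented parameters, and the paper's own copula family may differ:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
N = 100_000

# Hypothetical two-asset portfolio: heavy-tailed (Student-t) marginal returns
# linked by a Gaussian copula with correlation rho.
rho = 0.6
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))

z = rng.standard_normal((N, 2)) @ L.T          # correlated standard normals
u = stats.norm.cdf(z)                          # copula step: uniforms on [0,1]^2
r1 = stats.t.ppf(u[:, 0], df=4) * 0.010        # non-normal marginal, asset 1
r2 = stats.t.ppf(u[:, 1], df=4) * 0.015        # non-normal marginal, asset 2

port = 0.5 * r1 + 0.5 * r2                     # equally weighted portfolio
var_99 = -np.quantile(port, 0.01)              # one-day 99% VaR
print(f"99% VaR: {var_99:.4f}")
```

Swapping the normal marginals of the traditional method for t marginals fattens the loss tail, which is exactly where the plain Monte Carlo VaR was found to be too small.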

9.
Logistics systems have to cope with uncertainties in demand, in lead times, in transport times, in availability of resources and in quality. Management decisions have to take these uncertainties into consideration. An evaluation of decisions may be done by means of simulation. However, not all stochastic phenomena are of equal importance. Through designed simulation experiments and the use of response surfaces, the most important phenomena are detected and their influence on performance is estimated. Once the influence of the phenomena is known, this knowledge may be used to determine the optimal values of some decision parameters. An illustration is given of how to use response surfaces in a real-world case. A model is built in logistics modelling software. The decision parameters have to be optimised for a specific objective function. Experiments are run to estimate the response surface. The validity of the response surface with few observations is also tested.
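The workflow can be sketched end to end on a toy problem. The "simulator" below is a stand-in cost function, not the logistics model from the case; a 3^2 factorial design feeds a second-order response surface, which is then optimised cheaply in place of the expensive simulator:

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate(s, q, n_rep=30):
    # Stand-in for the logistics simulation: noisy cost as a function of a
    # reorder point s and an order quantity q (toy model, invented numbers).
    noise = rng.normal(0, 5, n_rep)
    return np.mean(20 + 0.04 * (s - 50) ** 2 + 0.02 * (q - 120) ** 2
                   + 0.01 * (s - 50) * (q - 120) + noise)

# 3^2 factorial design over the two decision parameters.
design = [(s, q) for s in (30, 50, 70) for q in (90, 120, 150)]
y = np.array([simulate(s, q) for s, q in design])

# Fit y ~ b0 + b1 s + b2 q + b3 s^2 + b4 q^2 + b5 sq by least squares.
X = np.array([[1, s, q, s * s, q * q, s * q] for s, q in design])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Optimise the fitted surface on a fine grid instead of the costly simulator.
S, Q = np.meshgrid(np.linspace(30, 70, 81), np.linspace(90, 150, 121))
surf = (beta[0] + beta[1] * S + beta[2] * Q + beta[3] * S ** 2
        + beta[4] * Q ** 2 + beta[5] * S * Q)
i = np.unravel_index(surf.argmin(), surf.shape)
print("estimated optimum:", S[i], Q[i])
```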

10.
The Kolmogorov-Smirnov (KS), Cramer-von Mises (CM) and Anderson-Darling (AD) tests, which are based on the empirical distribution function (EDF), are well-known statistics for testing univariate normality. In this paper, we focus on the high-dimensional case and propose a family of generalized EDF-based statistics that test high-dimensional normality by reducing the dimension of the variable. The corresponding critical values of the three statistics can be approximated by the Monte Carlo method, and the approximate null distributions of the proposed statistics can be investigated using approximation formulas from the univariate case. Monte Carlo simulation demonstrates that the proposed statistics are more competitive than existing methods under several alternative hypotheses. Finally, the proposed tests are applied to real data to illustrate their utility.
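As a sketch of the Monte Carlo calibration step in the univariate case: critical values for a KS-type statistic with estimated parameters are obtained by simulating the null distribution. The sample size, replication count and heavy-tailed alternative below are arbitrary choices:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

def ks_stat(x):
    # KS distance between the EDF of x and the normal CDF with fitted parameters.
    x = np.sort(x)
    z = stats.norm.cdf(x, loc=x.mean(), scale=x.std(ddof=1))
    n = len(x)
    return max((np.arange(1, n + 1) / n - z).max(),
               (z - np.arange(n) / n).max())

n, B = 50, 10_000
null = np.array([ks_stat(rng.standard_normal(n)) for _ in range(B)])
crit = np.quantile(null, 0.95)           # Monte Carlo 5%-level critical value

sample = rng.standard_t(df=3, size=n)    # data from a heavy-tailed alternative
print("critical value  :", round(crit, 4))
print("reject normality:", ks_stat(sample) > crit)
```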

11.
The real utility of simulation lies in comparing different alternatives that might represent competing system designs. Conventional statistical techniques are not directly applicable to the analysis of simulation output data in the evaluation of competing alternatives, since the usual assumptions of normality and common variance are difficult to justify in simulation experiments. This paper revisits a known nonparametric test whose application has recently become feasible due to considerable increases in computing power: randomization tests assess the significance of the observed value of the test statistic by evaluating different permutations of the data. The procedure only requires invariance of the data under all permutations.
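A self-contained randomization test on hypothetical simulation output looks like this; note that nothing in it assumes normality or common variance:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical output: average waiting times from independent replications
# of two competing system designs (numbers invented).
a = np.array([12.1, 11.4, 13.0, 12.6, 11.9, 12.8, 12.2, 11.7])
b = np.array([10.9, 11.6, 11.1, 10.4, 11.3, 10.8, 11.5, 11.0])

obs = a.mean() - b.mean()
pooled = np.concatenate([a, b])

# Randomization test: re-label the observations at random many times and ask
# how often the permuted mean difference is as extreme as the observed one.
B, count = 20_000, 0
for _ in range(B):
    perm = rng.permutation(pooled)
    diff = perm[:len(a)].mean() - perm[len(a):].mean()
    if abs(diff) >= abs(obs):
        count += 1

print(f"observed difference: {obs:.3f}, randomization p-value: {count / B:.4f}")
```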

12.
In this paper, the stochastic collocation method (SCM) is applied to investigate the nonlinear behavior of an aeroelastic system with uncertainties in the system parameter and the initial condition. Numerical case studies for problems with uncertainties are carried out. In particular, the performance of the SCM is compared with solutions based on other computational techniques such as Monte Carlo simulation, Wiener chaos expansion and wavelet chaos expansion. From the computational results, we conclude that the SCM is an effective tool to study a nonlinear aeroelastic system with random parameters.
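The idea behind the SCM can be shown on a scalar toy problem: the "solver" is evaluated only at a handful of quadrature nodes, yet matches a much larger Monte Carlo run. The response function and parameter distribution below are invented and far simpler than an aeroelastic system:

```python
import numpy as np

rng = np.random.default_rng(8)

def response(k):
    # Stand-in for the aeroelastic solver: a nonlinear response amplitude as
    # a function of an uncertain stiffness-like parameter k (toy model).
    return np.tanh(2.0 * k) + 0.1 * k ** 2

mu, sigma = 1.0, 0.2             # assume k ~ N(mu, sigma^2)

# Stochastic collocation: evaluate the solver only at Gauss-Hermite nodes
# and combine with the quadrature weights to get moments of the response.
nodes, weights = np.polynomial.hermite_e.hermegauss(7)   # probabilists' rule
mean_sc = (weights * response(mu + sigma * nodes)).sum() / np.sqrt(2 * np.pi)

# Reference: plain Monte Carlo with many more solver calls.
mean_mc = response(rng.normal(mu, sigma, 200_000)).mean()
print(f"collocation (7 calls): {mean_sc:.6f}   Monte Carlo: {mean_mc:.6f}")
```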

13.
Mergers and acquisitions (M&A), private equity and leveraged buyouts, securitization and project finance are characterized by the presence of contractual clauses (covenants). These covenants trigger the technical default of the borrower even in the absence of insolvency. Therefore, borrowers may default on loans even when they have sufficient available cash to repay outstanding debt. This condition is not captured by the net present value (NPV) distribution obtained through a standard Monte Carlo simulation. In this paper, we present a methodology for including the consequences of covenant breach in a Monte Carlo simulation, extending traditional risk analysis in investment planning. We introduce a conceptual framework for modeling technical and material breaches from the standpoint of both lenders and shareholders. We apply this framework to a real case study concerning the project financing of a 64-million euro biomass power plant. The simulation is carried out on the actual model developed by the financial advisor of the project and made available to the authors. Results show that both technical and material breaches have a statistically significant impact on the net present value distribution, and this impact is more relevant when leverage and cost of debt increase.
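A stripped-down version of extending NPV simulation with a covenant test is sketched below. The DSCR covenant, cash-flow distribution and breach consequence are deliberately crude placeholders for the paper's richer technical/material-breach modelling:

```python
import numpy as np

rng = np.random.default_rng(9)
N, T = 50_000, 10                 # trials, project years (figures invented)

debt_service = 6.0                # annual debt service, millions
dscr_covenant = 1.10              # technical default if DSCR falls below this
rate = 0.08

# Uncertain operating cash flow of the plant per year, in millions.
cash = rng.normal(9.0, 2.5, (N, T))

dscr = cash / debt_service
breach_year = np.where(dscr < dscr_covenant, np.arange(T), T).min(axis=1)

# Crude breach consequence: on breach the lenders accelerate and equity
# receives no further cash flows from that year onward.
alive = np.arange(T) < breach_year[:, None]
disc = (1 + rate) ** -np.arange(1, T + 1)
npv = (cash * alive * disc).sum(axis=1) - 40.0     # upfront equity, millions

print(f"P(covenant breach): {(breach_year < T).mean():.2%}")
print(f"mean NPV with breach effect: {npv.mean():.2f}m")
```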

14.
The analytic hierarchy process can be used for group decision making by aggregating individual judgments or individual priorities. The most commonly used aggregation methods are the geometric mean method and the weighted arithmetic mean method. While it is known that the weighted geometric mean comparison matrix is of acceptable consistency if all individual comparison matrices are of acceptable consistency, this paper addresses the following question: Under what conditions would an aggregated geometric mean comparison matrix be of acceptable consistency if some (or all) of the individual comparison matrices are not of acceptable consistency? Using Monte Carlo simulation, results indicate that given a sufficiently large group size, consistency of the aggregate comparison matrix is guaranteed, regardless of the consistency measures of the individual comparison matrices, if the geometric mean is used to aggregate. This result implies that consistency at the aggregate level is a non-issue in group decision making when group size exceeds a threshold value and the geometric mean is used to aggregate individual judgments. This paper determines threshold values for various dimensions of the aggregated comparison matrix.
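The two ingredients, element-wise geometric-mean aggregation and Saaty's consistency ratio, are easy to sketch. With only two (hypothetical) judges, one of them inconsistent, the aggregate is not guaranteed acceptable; the paper's point is that it becomes so once group size passes a threshold:

```python
import numpy as np

RI = {3: 0.58, 4: 0.90, 5: 1.12}       # Saaty's random consistency indices

def consistency_ratio(A):
    # CR = ((lambda_max - n) / (n - 1)) / RI, the standard Saaty measure.
    n = A.shape[0]
    lam = np.max(np.linalg.eigvals(A).real)
    return (lam - n) / (n - 1) / RI[n]

def aggregate(matrices):
    # Element-wise geometric mean of the individual comparison matrices.
    return np.exp(np.mean(np.log(np.array(matrices)), axis=0))

# Two hypothetical pairwise comparison matrices, the second one inconsistent.
A1 = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])
A2 = np.array([[1, 1/4, 6], [4, 1, 2], [1/6, 1/2, 1]])

G = aggregate([A1, A2])
print("individual CRs:", [round(consistency_ratio(A), 3) for A in (A1, A2)])
print("aggregate CR  :", round(consistency_ratio(G), 3))
```

Repeating this with randomly generated judges of varying group size is essentially the paper's Monte Carlo experiment.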

15.
In the selection of investment projects, it is important to account for exogenous uncertainties (such as macroeconomic developments) which may impact the performance of projects. These uncertainties can be addressed by examining how the projects perform across several scenarios; but it may be difficult to assign well-founded probabilities to such scenarios, or to characterize the decision makers’ risk preferences through a uniquely defined utility function. Motivated by these considerations, we develop a portfolio selection framework which (i) uses set inclusion to capture incomplete information about scenario probabilities and utility functions, (ii) identifies all the non-dominated project portfolios in view of this information, and (iii) offers decision support for rejection and selection of projects. The proposed framework enables interactive decision support processes where the implications of additional probability and utility information or further risk constraints are shown in terms of corresponding decision recommendations.
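The dominance check at the heart of step (ii) can be posed as a small linear program: portfolio i dominates j only if its expected value is at least j's for every probability vector in the admitted set. The values and probability bounds below are invented:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical portfolio values in three macro scenarios (rows: portfolios).
V = np.array([[10.0, 6.0, 2.0],
              [ 8.0, 7.0, 4.0],
              [ 9.0, 4.0, 5.0]])

# Incomplete information: scenario probabilities are only bounded.
p_lo = np.array([0.2, 0.2, 0.1])
p_hi = np.array([0.6, 0.5, 0.4])

def dominates(i, j):
    # i dominates j if min over feasible p of p.(Vi - Vj) is non-negative.
    c = V[i] - V[j]
    res = linprog(c, A_eq=np.ones((1, 3)), b_eq=[1.0],
                  bounds=list(zip(p_lo, p_hi)))
    return res.fun >= -1e-9

nondominated = [i for i in range(len(V))
                if not any(dominates(j, i) for j in range(len(V)) if j != i)]
print("non-dominated portfolios:", nondominated)
```

Tightening the probability bounds shrinks the feasible set and typically prunes the non-dominated list, which is the interactive mechanism the framework exploits.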

16.
This paper proposes an iterative simulation algorithm, based on the least-squares method, for the accurate valuation of long-horizon real options, and illustrates its implementation with an example: the investment decision for on-orbit servicing of a commercial communications satellite. The algorithm converts a problem that would require one very large computation into a sequence of runs, each with a comparatively small computational load, which largely resolves the computing-resource bottleneck that massive simulation faces when parallel computing is unavailable. It yields fairly accurate point and interval estimates of the real option value and also makes it straightforward to derive the optimal investment strategy.
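The algorithm is in the least-squares Monte Carlo family, so a Longstaff-Schwartz-style sketch conveys the idea; the satellite case's cash-flow model is replaced by a generic GBM project value, and all parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(10)

# Option to invest: pay I to receive a project worth V_t, exercisable once a
# year for T years. Risk-neutral GBM paths for the project value.
N, T = 20_000, 10
r, sigma, V0, I = 0.05, 0.30, 100.0, 100.0
dt = 1.0
disc = np.exp(-r * dt)

z = rng.standard_normal((N, T))
V = V0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                          + sigma * np.sqrt(dt) * z, axis=1))

payoff = np.maximum(V[:, -1] - I, 0.0)          # value if we wait to the end
for t in range(T - 2, -1, -1):
    itm = V[:, t] > I                           # regress only where exercise pays
    X, Y = V[itm, t], disc * payoff[itm]
    beta = np.polyfit(X, Y, 2)                  # continuation value ~ quadratic
    cont = np.polyval(beta, X)
    exercise = (X - I) > cont                   # exercise beats continuing?
    payoff[itm] = np.where(exercise, X - I, disc * payoff[itm])
    payoff[~itm] = disc * payoff[~itm]

value = disc * payoff.mean()                    # point estimate
se = disc * payoff.std(ddof=1) / np.sqrt(N)     # crude interval estimate
print(f"real option value ~ {value:.2f}, 95% CI "
      f"[{value - 1.96 * se:.2f}, {value + 1.96 * se:.2f}]")
```

The regression coefficients also define the exercise boundary, which is how the optimal investment strategy is read off the fitted model.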

17.
The Hooke and Jeeves algorithm (HJ) is a pattern search procedure widely used to optimize non-linear functions that are not necessarily continuous or differentiable. The algorithm repeatedly performs two types of search routine: an exploratory search and a pattern search. The HJ algorithm requires deterministic evaluation of the function being optimized. In this paper we consider situations where the objective function is stochastic and can be evaluated only through Monte Carlo simulation. To avoid the expense of repeated Monte Carlo function evaluations, a likelihood ratio performance extrapolation (LRPE) technique is used: we extrapolate the performance measure for different values of the decision parameters while simulating a single sample path of the underlying system. Our modified Hooke and Jeeves algorithm uses likelihood ratio performance extrapolation for simulation optimization. Computational results are provided to demonstrate the performance of the proposed modified HJ algorithm.
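A plain Hooke and Jeeves loop on a stochastic objective is sketched below. Note one deliberate substitution: instead of the paper's LRPE single-sample-path extrapolation, each evaluation simply averages replications, which is exactly the expensive behaviour LRPE is designed to avoid:

```python
import numpy as np

rng = np.random.default_rng(11)

def noisy_obj(x, n_rep=50):
    # Stochastic objective observable only through simulation (toy stand-in).
    x = np.asarray(x, dtype=float)
    true = (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2
    return true + rng.normal(0, 0.5, n_rep).mean()

def hooke_jeeves(f, x0, step=1.0, shrink=0.5, tol=1e-2):
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    while step > tol:
        # Exploratory search: probe each coordinate in both directions.
        base, fbase = x.copy(), fx
        for i in range(len(x)):
            for d in (step, -step):
                trial = base.copy()
                trial[i] += d
                ft = f(trial)
                if ft < fbase:
                    base, fbase = trial, ft
                    break
        if fbase < fx:
            # Pattern move: jump further along the successful direction.
            pattern = base + (base - x)
            fp = f(pattern)
            x, fx = (pattern, fp) if fp < fbase else (base, fbase)
        else:
            step *= shrink          # no improvement: refine the mesh
    return x, fx

print(hooke_jeeves(noisy_obj, [0.0, 0.0]))   # should approach (2, -1)
```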

18.
Engineers and scientists often identify robust parameter design as one of the most important process and quality improvement methods. Focused on statistical modeling and numerical optimization strategies, most researchers typically assume a process with reasonably small variability. Realistically, however, industrial processes often exhibit larger variability, particularly in mass production lines. In such cases, many of the modeling assumptions behind the robust parameter design models available in the literature do not hold. Accordingly, the results and recommendations provided to decision makers could generate suboptimal modifications to processes and products. As manufacturers seek improved methods for ensuring quality in resource-constrained environments, experimenters should examine trade-offs to achieve the levels of precision that best support their decision making. In contrast to previous research, this paper proposes a trade-off analysis between the cost of replication and the desired precision of generated solutions. We consider several techniques in the early stages of experimental design, using Monte Carlo simulation as a tool, for revealing potential options to the decision maker. This is perhaps the first study to show an avenue that may lead to more effective robust parameter design models with an optimal combination of cost constraints and desired precision of solutions.
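The core trade-off is simple to tabulate: the confidence-interval half-width on a mean response shrinks like 1/sqrt(n) while cost grows linearly in the replicates. The process variability and per-replicate cost below are assumed numbers:

```python
import numpy as np

rng = np.random.default_rng(12)

process_sd = 8.0          # large process variability (assumed)
cost_per_rep = 15.0       # monetary cost of one replicate (assumed)

print(" n   est. 95% CI half-width   cost")
for n in (5, 10, 20, 40, 80, 160):
    reps = rng.normal(100.0, process_sd, n)          # simulated replicates
    half = 1.96 * reps.std(ddof=1) / np.sqrt(n)      # precision achieved
    print(f"{n:3d}   {half:21.2f}   {n * cost_per_rep:6.0f}")
```

Scanning such a table against a budget is a minimal version of the trade-off analysis the paper proposes for the early design stage.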

19.
We present a case study on the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods. Graphics cards, containing multiple Graphics Processing Units (GPUs), are self-contained parallel computational devices that can be housed in conventional desktop and laptop computers and can be thought of as prototypes of the next generation of many-core processors. For certain classes of population-based Monte Carlo algorithms they offer massively parallel simulation, with the added advantage over conventional distributed multicore processors that they are cheap, easily accessible, easy to maintain, easy to code, dedicated local devices with low power consumption. On a canonical set of stochastic simulation examples including population-based Markov chain Monte Carlo methods and Sequential Monte Carlo methods, we find speedups from 35- to 500-fold over conventional single-threaded computer code. Our findings suggest that GPUs have the potential to facilitate the growth of statistical modeling into complex data-rich domains through the availability of cheap and accessible many-core computation. We believe the speedup we observe should motivate wider use of parallelizable simulation methods and greater methodological attention to their design. This article has supplementary material online.

20.
This paper introduces a risk-based optimization method to schedule projects. The method uses risk mitigation and optimal control techniques to minimize variables such as the project duration or the cost estimate at completion. Mitigation actions reduce the risk impacts that may affect the system. A model predictive control approach is used to determine the set of mitigation actions to be executed and the times at which they are taken. A real-life project in the field of semiconductor manufacturing is taken as an example to show the benefits of the method in a deterministic case, and a Monte Carlo simulation has also been carried out.
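A single optimisation step of such a receding-horizon scheme can be sketched by enumeration. The risk impacts, mitigation effects, costs and budget are invented, and a real model predictive control implementation would re-solve this at every period as uncertainty resolves:

```python
import numpy as np
from itertools import combinations

base_duration = 52.0                       # weeks, before any risk impact
risk_impact = np.array([6.0, 4.0, 3.0])    # expected delay of each open risk
mitigation = np.array([[4.0, 0.0, 0.0],    # delay removed by action 0
                       [0.0, 3.0, 1.0],    # ... by action 1
                       [2.0, 1.0, 2.0]])   # ... by action 2
cost = np.array([10.0, 6.0, 8.0])
budget = 15.0

def best_actions():
    # Enumerate feasible action sets (small problems only) and keep the one
    # with the lowest predicted duration: the per-period optimisation step.
    best, best_dur = (), base_duration + risk_impact.sum()
    for k in range(1, len(cost) + 1):
        for acts in combinations(range(len(cost)), k):
            if cost[list(acts)].sum() > budget:
                continue
            residual = np.maximum(
                risk_impact - mitigation[list(acts)].sum(axis=0), 0)
            dur = base_duration + residual.sum()
            if dur < best_dur:
                best, best_dur = acts, dur
    return best, best_dur

print(best_actions())   # actions to execute now, and predicted duration
```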
