Similar Articles (20 results)
1.
Key components of the multiple constraint satisfaction framework are explored in a series of experiments set in complex and ambiguous domains. All cases show the prevalence and importance of purposeful structuring of the information by the participants. The participants gradually generate coherence, even in cases without increasing information. In accordance with multiple constraint satisfaction predictions, the assessments of inferences increasingly spread apart. The correlations between the dependent variable (the decision) and the independent variables, as well as among the independent variables, consistently grow stronger as the participants progress through the decision stages. The information structuring, a gradual simplification of the component structure, is captured as principal components associated with the various decision stages. Neural networks predict the judgments in the various decision stages relatively well. Finally, the role of the ongoing structuring of the underlying information is explored by applying trained networks to data from other decision stages.

2.
Organizational leaders increasingly recognize process management as an essential element of organizational performance. Two key tools for process management, Statistical Process Control and Maintenance Management, can create profound economic benefits, particularly when they are coordinated. This paper demonstrates the value of integrating Statistical Process Control and maintenance by jointly optimizing their policies to minimize the total costs associated with quality, maintenance, and inspection. While maintenance is often scheduled periodically, this analysis encourages "adaptive" maintenance, in which the maintenance schedule accelerates when the process becomes unstable. This paper presents a number of models to demonstrate the economic behavior and value of coordinating process control and maintenance. Finally, a sensitivity analysis is conducted to develop insights into the economic and process variables that influence the integration effort.
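As a rough illustration of the "adaptive" idea, not the paper's actual models: an out-of-control signal on an X-bar control chart can be used to shorten the next maintenance interval. All limits, data, and the acceleration factor below are invented.

```python
import statistics

def xbar_limits(subgroup_means, sigma_xbar):
    """3-sigma control limits for an X-bar chart from historical subgroup means."""
    center = statistics.mean(subgroup_means)
    return center - 3 * sigma_xbar, center + 3 * sigma_xbar

def next_maintenance_interval(base_interval, subgroup_mean, lcl, ucl, accel=0.5):
    """Adaptive rule: shorten the maintenance interval when the chart signals."""
    if subgroup_mean < lcl or subgroup_mean > ucl:
        return base_interval * accel  # process looks unstable: maintain sooner
    return base_interval

history = [10.1, 9.9, 10.0, 10.2, 9.8]       # in-control subgroup means
lcl, ucl = xbar_limits(history, sigma_xbar=0.1)  # limits are 9.7 and 10.3
print(next_maintenance_interval(30, 10.6, lcl, ucl))  # out of control -> 15.0
```

In the paper's spirit, the interval and acceleration factor would themselves be decision variables in the joint cost minimization rather than fixed constants.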

3.
王珂  张玲珍  周建 《运筹与管理》2022,31(10):33-39
For the problem of supplier selection and order allocation under uncertainty with heterogeneous supply contracts, this paper builds a fuzzy two-stage multi-period integrated optimization model based on mean-risk analysis. Unlike traditional studies of this problem, which do not fully account for the interaction between the two decision stages of supplier selection and order allocation, in this model the evaluation objective of the first-stage supplier selection depends on the order-allocation decisions made in subsequent operations. Accounting for uncertainty in future demand and actual operating costs, two decision criteria, value-at-risk and expected value, are introduced to evaluate the performance of supplier-selection plans. An analytical solution method for the model is proposed: the value-at-risk is evaluated exactly, while the expected value is controlled within a specified error bound and can reach any required accuracy.

4.
System dynamics models are becoming increasingly common in the analysis of policy and managerial issues. The usefulness of these models is predicated on their ability to link observable patterns of behavior to micro-level structure and decision-making processes. This paper posits that model calibration, the process of estimating the model parameters (structure) to obtain a match between observed and simulated structures and behaviors, is a stringent test of a hypothesis linking structure to behavior, and proposes a framework for using calibration as a form of model testing. It tackles the issue at three levels: theoretical, methodological, and technical. First, it explores the nature of model testing and suggests that the modeling process be recast as an experimental approach to gaining confidence in the hypothesis articulated in the model. At the methodological level, it proposes heuristics to guide the testing strategy and to take advantage of the strengths of automated calibration algorithms. Finally, it presents a set of techniques to support the hypothesis-testing process. The paper concludes with an example and a summary of the argument for the proposed approach.

5.
Firms face uncertain sales responses even when they advertise appropriately. To help marketing managers make optimal budget decisions in this situation, we develop a stochastic model that casts the advertising budget decision as a special Markov decision process with a new objective: maximizing expected market utility. The model uses a two-dimensional state variable comprising accumulated sales, which vary randomly with the advertising budget, and the predicted probability that an advertising campaign obtains a full sales response. We analyze the model under the premise of infinitely growing market potential, deriving properties of the optimal policies and of the optimal value function. These results are successfully applied to advertising budget decisions for a private university in Xi'an, China.

6.
Every human system faces the problem of choosing between alternative options, and methods of interactive programming have been suggested as the best way to help decision makers reach decisions that are consistent with their preferences. However, even though a large number of interactive algorithms have been proposed for multiobjective decision making (MODM), there is as yet no truly interactive goal programming (GP) algorithm, despite the preference for GP over other MODM methodologies. The current paper presents an algorithm for interactive GP modelling called SWIGP (systems welfare interactive GP), which ensures that the overall welfare of the system under consideration is adequately taken into account in the interactive process. To achieve this, the paper distinguishes between technical, allocative, and economic efficiencies and combines an economic efficiency index with the interactive GP process. Besides being widely applicable, the algorithm exerts little cognitive burden on the decision maker (DM). Indeed, even if the DM is assumed to operate under conditions of complete ignorance, SWIGP provides the direction for searching for the "best" compromise solution. Moreover, the algorithm converges very quickly because the economic efficiency index complements the interactive process in helping the DM arrive at a most preferred solution.

7.
Data envelopment analysis (DEA) is a method for measuring the efficiency of peer decision making units (DMUs) in which the internal structures of DMUs are treated as a black box. Recently, DEA has been extended to examine the efficiency of DMUs that have two-stage network structures or processes, where all the outputs from the first stage are intermediate measures that make up the inputs to the second stage. The resulting two-stage DEA model not only provides an overall efficiency score for the entire process but also yields an efficiency score for each of the individual stages. The current paper develops a Nash bargaining game model to measure the performance of DMUs that have a two-stage structure. Under Nash bargaining theory, the two stages are viewed as players and the DEA efficiency model is a cooperative game model. It is shown that when only one intermediate measure exists between the two stages, the newly developed Nash bargaining game approach yields the same results as applying the standard DEA approach to each stage separately. Two real-world data sets are used to demonstrate the bargaining game model.
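The single-intermediate-measure special case can be illustrated without a linear-program solver: with one input, one intermediate measure, and one output, constant-returns-to-scale (CCR) efficiency reduces to each DMU's productivity ratio relative to the best observed ratio, and an overall score can be formed as the product of the two stage scores. A minimal sketch under that special case (all data invented, not from the paper's data sets):

```python
def crs_efficiency(inputs, outputs):
    """Single-input/single-output CRS efficiency: productivity relative to the best DMU."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Stage 1: input -> intermediate; Stage 2: intermediate -> output (hypothetical DMUs)
x = [4.0, 5.0, 6.0]   # stage-1 inputs
z = [8.0, 10.0, 9.0]  # intermediate measures
y = [4.0, 4.0, 4.5]   # stage-2 outputs

e1 = crs_efficiency(x, z)                      # stage-1 scores
e2 = crs_efficiency(z, y)                      # stage-2 scores
overall = [a * b for a, b in zip(e1, e2)]      # decomposed overall score
```

With multiple inputs, intermediates, or outputs, each stage score instead requires solving a linear program, which is where the bargaining formulation and the stage-by-stage approach can diverge.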

8.
This paper presents a case study in regional water resources planning that considers the process of designing and using computer-based models and man-computer interaction, especially with respect to the roles of designers and potential users. Key decisions in the design process are identified and analyzed. It is concluded that the integration of normative models into decision processes involving conflict shows limits that may partly be overcome by emphasis on the planning stage of the design process and on the development of planning and decision models that allow flexibility. Interactive procedures should be developed that distribute flexibility over decision levels and also address the problems of asymmetrically distributed information. It is further concluded that there is a need for a more systematic investigation of design processes and decision behavior in relation to the characteristics of decision problems and environments.

9.
Models for decision-making under uncertainty use probability distributions to represent variables whose values are unknown when the decisions are to be made. Often the distributions are estimated from observed data. Sometimes these variables depend on the decisions, but the dependence is ignored in the decision maker's model; that is, the decision maker models these variables as having an exogenous probability distribution independent of the decisions, whereas the probability distribution of the variables actually depends on the decisions. It has been shown in the context of revenue management problems that such modeling error can lead to systematic deterioration of decisions as the decision maker attempts to refine the estimates with observed data. Many questions remain to be addressed. Motivated by revenue management, newsvendor, and a number of other problems, we consider a setting in which the optimal decision for the decision maker's model is given by a particular quantile of the estimated distribution, and the empirical distribution is used as the estimator. We give conditions under which the estimation and control process converges, and show that although in the limit the decision maker's model appears to be consistent with the observed data, the modeling error can cause the limit decisions to be arbitrarily bad.
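In the newsvendor instance of this setting, the decision maker's quantile is the critical fractile c_u/(c_u + c_o), and using the empirical distribution as estimator makes the decision simply an order statistic of observed demand. A minimal sketch (cost figures and demand observations are invented); the paper's warning applies when the observations are themselves affected by the decision, e.g. censored by stock-outs:

```python
import math

def newsvendor_order(demand_obs, underage_cost, overage_cost):
    """Order the critical-fractile quantile of the empirical demand distribution."""
    q = underage_cost / (underage_cost + overage_cost)  # critical fractile
    sorted_obs = sorted(demand_obs)
    k = math.ceil(q * len(sorted_obs)) - 1  # index of the q-quantile order statistic
    return sorted_obs[max(k, 0)]

# Shortages three times as costly as leftovers -> order the 0.75-quantile of demand
print(newsvendor_order([80, 95, 100, 110, 120], underage_cost=3, overage_cost=1))
```

If stock-outs censor the recorded demand, re-estimating from such data pulls the empirical quantile, and hence the order, downward over successive periods, which is the feedback loop the paper analyzes.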

10.
For decision makers in the electricity sector, the decision process is complex, with several different levels that have to be taken into consideration. These include, for instance, the planning of facilities and the optimal day-to-day operation of the power plant. These decisions address widely different time horizons and aspects of the system. Load forecasts are very important for accomplishing these tasks, so finding an appropriate approach and model is at the core of the decision process. Due to the deregulation of energy markets, load forecasting has gained even more importance. In this article, we give an overview of the various models and methods used to predict future load demands.

11.
Post-decision activities have received little research attention within the decision-making cycle, perhaps because they have been considered trivial or unimportant in the past. However, without appropriate follow-up, important decisions made in the previous phase may get lost or be implemented incorrectly. This paper describes computer-based support for decision implementation activities. The support includes linking each activity to the meeting decisions that originated it. The proposed system follows a process modeling approach to design the decision implementation activities and uses a workflow management system for process enactment.

12.
Organizational ambidexterity, defined as the pursuit of both exploitation and exploration, has become an important topic in the study of organizations, especially in innovation management theory. Previous literature has not focused on the strategic (game-theoretic) aspects of organizational ambidexterity or on its decision-making aspects. Little is known about how, or even whether, the decision to adopt ambidexterity is competitively advantageous in the presence of the diverse strategies that competitors may adopt. This facet of the subject is inherently game-theoretic: the value of a decision by one firm depends in part on decisions made by other firms. This paper initiates a systematic investigation of these strategic aspects, including the overall performance of available strategies. Specifically, the study examines questions of ambidexterity-related strategy performance in the context of new product development. The main contributions are (1) to introduce and make available to the research community an agent-based model and decision support system that captures many of the key aspects and tradeoffs of the exploration-exploitation dilemma identified in the literature, as faced by firms in the new product development process, with a focus on organizations' product investment decisions, and (2) to report results obtained from the model, calibrated with available data from the literature and augmented by new data collected from interviews with practitioners.

13.
When should one refinance a mortgage loan? It is one of the most common finance questions today, yet there have been surprisingly few attempts to answer it in a structured manner. Moreover, the existing guidelines for refinancing consist of a short list of very simple rules that have limited application. This article addresses the question through a dynamic programming model coupled with an analysis of historical interest rates. The analysis reveals a more complex set of rules for an optimal refinance decision, oftentimes conflicting with the conventionally accepted rule that the rate difference must be greater than two percent.
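As a point of contrast with the paper's dynamic program, the conventional simple rules reduce to break-even arithmetic on the standard amortization formula: refinance if you expect to hold the loan longer than it takes the monthly saving to repay the closing costs. A crude sketch with invented figures:

```python
def monthly_payment(principal, annual_rate, years):
    """Fixed-rate amortization payment: P * r / (1 - (1 + r)^-n)."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

def breakeven_months(balance, old_rate, new_rate, years_left, closing_costs):
    """Months of payment savings needed to recover the cost of refinancing."""
    saving = (monthly_payment(balance, old_rate, years_left)
              - monthly_payment(balance, new_rate, years_left))
    return closing_costs / saving if saving > 0 else float("inf")

# e.g. $200k balance, 7% -> 5.5%, 25 years remaining, $4k closing costs
months = breakeven_months(200_000, 0.07, 0.055, 25, closing_costs=4_000)
```

This rule ignores exactly what the paper's model captures: the option value of waiting for rates to fall further and the distribution of future rates, which is why the optimal policy can disagree with the two-percent heuristic.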

14.
In an experiment, decision-makers used time-series information on the past demand for products to decide on production levels to meet the next period's demand. Either shortages cost more per unit than surpluses, or the asymmetry of losses was in the opposite direction. The decision-makers were either (i) unsupported, (ii) provided with statistical point forecasts of the next period's demand, or (iii) asked to estimate probability distributions of the next period's demand using the fractile method, with the decisions inferred from these distributions (decomposition). Providing statistical forecasts led to decisions incurring significantly lower expected costs than those achieved by the unsupported decision-makers. However, the decomposition procedure did not significantly reduce expected costs because, contrary to earlier evidence, the fractile method generally led to distributions that were too wide and flat. Decision-makers in both treatments (i) and (ii) also performed significantly better when shortages were more costly than surpluses.

15.
A repetitive testing process is commonly used in the final testing stage of semiconductor manufacturing to ensure high outgoing product quality and to reduce testing errors. The decisions on testing lot size and the number of testing repetitions ultimately determine the effectiveness of the testing process. Setting the retest rule is often difficult in practice due to uncertainties in the incoming product quality and the condition of the testing equipment. In this paper, we study a repetitive testing process in which the testing equipment may shift randomly to an inferior state. We develop a cost model that supports optimal decisions on the retest rule. Through numerical analysis, we provide practical insights into the effects of the equipment shift rate, testing errors, and various costs, such as the cost of testing and the cost of rejecting conforming products, on the optimal decision and system performance. We find that a significant penalty may result if the potential shift of the testing equipment is ignored.

16.
This paper provides a new structure in data envelopment analysis (DEA) for assessing the performance of decision making units (DMUs). It proposes a technique to estimate the DEA efficient frontier, based on the Arash Method, in a way that differs from statistical inference. The technique allows decisions over target regions instead of points to benchmark DMUs, without requiring the additional information needed by interval/fuzzy DEA methods. It suggests three efficiency indexes, called the lowest, technical, and highest efficiency scores, for each DMU when small errors occur in both the input and output components of the Farrell frontier, even if the data are accurate. These efficiency indexes provide a sensitivity index for each DMU and rank inefficient and technically efficient DMUs together, while simultaneously detecting and benchmarking outliers. Two numerical examples demonstrate the validity of the proposed method.

17.
Value-focused thinking, using the dialogue decision process (DDP), and interactive planning appear to be two totally unrelated processes for making decisions. As this paper shows, new results on the interpretation of utility functions and new ways of thinking about downstream decisions allow us to reinterpret interactive planning as an ideal-focused decision process that is theoretically equivalent to DDP's value-focused decision process. Ackoff's ideal-focused decision process may, however, be more natural for certain organizational decision settings.

18.
A general problem in applying Markov decision processes to real-world problems is the curse of dimensionality: the size of the state space grows to prohibitive levels when information on all relevant traits of the modeled system is included. In herd management, we face a hierarchy of decisions made at different levels with different time horizons, and the decisions made at different levels are mutually dependent. Furthermore, decisions have to be made without certainty about the future state of the system. These aspects contribute further to the dimensionality problem. A new notion of a multilevel hierarchic Markov process, specially designed to solve dynamic decision problems involving decisions with varying time horizons, is presented. The method contributes significantly to circumventing the curse of dimensionality, and it provides a framework for general herd management support instead of very specialized models concerned only with a single decision such as replacement. The application perspectives of the technique are illustrated by potential examples relating to the management of a sow herd and a dairy herd.
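The building block that the hierarchic construction nests is an ordinary Markov decision process. A toy keep-or-replace problem in the spirit of herd management can be solved by plain value iteration; all states, rewards, and transition probabilities below are invented for illustration, and the paper's contribution is precisely how to chain many such small processes instead of building one huge one:

```python
def value_iteration(states, actions, P, R, gamma=0.9, tol=1e-9):
    """Solve a small Markov decision process by value iteration."""
    v = {s: 0.0 for s in states}
    while True:
        v_new = {s: max(R[s][a] + gamma * sum(p * v[t] for t, p in P[s][a].items())
                        for a in actions[s]) for s in states}
        if max(abs(v_new[s] - v[s]) for s in states) < tol:
            break
        v = v_new
    # Greedy policy with respect to the converged value function
    policy = {s: max(actions[s], key=lambda a: R[s][a] +
                     gamma * sum(p * v[t] for t, p in P[s][a].items()))
              for s in states}
    return v, policy

states = ["young", "old"]
actions = {s: ["keep", "replace"] for s in states}
R = {"young": {"keep": 10, "replace": 2}, "old": {"keep": 4, "replace": 2}}
P = {"young": {"keep": {"old": 1.0}, "replace": {"young": 1.0}},
     "old":   {"keep": {"old": 1.0}, "replace": {"young": 1.0}}}
v, policy = value_iteration(states, actions, P, R)
print(policy)  # keep young animals, replace old ones
```

In the hierarchic formulation, a "replace" transition at the top level would hand control to a child process describing the new animal's lifetime, which keeps each individual state space small.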

19.
Semu Mitiku 《PAMM》2007,7(1):2060003-2060004
In many decision processes there is a hierarchy of decision-makers, and decisions are taken at different levels in this hierarchy. In business (and many other practical activities) decision making has changed over the last decades: from a single person (the boss!) and a single criterion (e.g. profit), decision environments have increasingly become multi-person, multi-criteria, and even multi-level (or hierarchical). In organizations with hierarchical decision systems, the sequential and preemptive nature of the decision process makes the problem of selecting an optimum strategy and action very different from that addressed by the usual operations research methods. Therefore, a multilevel programming approach is considered for modeling such problems. In particular, a three-level mathematical programming model is proposed for an optimal resource allocation problem in Ethiopian universities. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

20.
This paper discusses the use of multi-criteria decision analysis for supporting strategic decision making in organisations. It begins by exploring the notions of strategic decisions and the strategic decision-making process. We suggest that structuring strategic objectives, dealing with high levels of uncertainty about the future, and considering the interconnectedness of strategic options and their long-term consequences are key aspects of strategic decision-making support. We then consider the discursive nature of the processes within which strategic decisions are created and negotiated. Our exploration of these concepts leads us to propose a number of adaptations to the standard multi-criteria decision analysis approach if it is to provide effective strategic decision support, particularly in strategy workshops. We make suggestions on how to implement these proposals and illustrate their potential with examples drawn from real-world interventions in which we have provided strategic decision support.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号