Similar Literature
 20 similar documents found (search time: 46 ms)
1.
An expert system was desired for a group decision-making process. A highly variable data set from previous groups' decisions, containing much missing information and many possible errors, was available to simulate past group decisions. Classification and regression trees (CART) were selected for rule induction and compared with multiple linear regression and discriminant analysis. We conclude that CART's decision rules can be used for rule induction. CART uses all available information and can predict observations with missing data. Errors in results from CART compare well with those from multiple linear regression and discriminant analysis, and CART results are easier to understand.
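A minimal sketch of this kind of comparison, assuming a toy data set with injected missing values and scikit-learn models; it is not the original study, scikit-learn trees do not use CART's surrogate splits (values are simply imputed for all models here), and logistic regression stands in for the regression model:

    # Compare a CART-style tree with a linear model and discriminant analysis
    # on synthetic data with missing values.  Illustrative assumptions only.
    import numpy as np
    from sklearn.impute import SimpleImputer
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 5))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)      # hypothetical group decision
    X[rng.random(X.shape) < 0.15] = np.nan             # inject missing values

    X_imp = SimpleImputer(strategy="median").fit_transform(X)
    for name, model in [("CART", DecisionTreeClassifier(max_depth=3)),
                        ("logistic regression", LogisticRegression()),
                        ("LDA", LinearDiscriminantAnalysis())]:
        score = cross_val_score(model, X_imp, y, cv=5).mean()
        print(f"{name}: mean CV accuracy = {score:.2f}")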

2.
This paper proposes a dynamic programming approach to modeling and determining batch sizes in a single-period, multi-stage production process with random yields at each stage. To improve the computational performance of the proposed approach, a statistical bound is developed. A key decision incorporated into the model is whether to continue on to the next stage of processing or to scrap the entire current batch of product. This decision is based on the expected total profit from the items remaining for processing after all defectives have been removed. Decisions on the locations of test stations after stages are also incorporated into the modeling approach.
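As a rough illustration of such a continue-or-scrap recursion (a toy sketch with binomial yields and made-up costs, not the paper's model or its statistical bound):

    # Toy dynamic program for a multi-stage process with random (binomial) yields.
    # At each stage we either scrap the remaining batch or pay a per-unit
    # processing cost and continue; good units after the last stage earn revenue.
    # All numbers are illustrative assumptions.
    from functools import lru_cache
    from math import comb

    yields = [0.9, 0.8, 0.95]     # per-stage probability that a unit survives
    costs = [2.0, 3.0, 1.0]       # per-unit processing cost at each stage
    revenue = 20.0                # revenue per good unit after the last stage

    @lru_cache(maxsize=None)
    def value(stage, n):
        """Maximum expected profit with n good units entering `stage`."""
        if n == 0:
            return 0.0
        if stage == len(yields):
            return revenue * n
        p = yields[stage]
        cont = -costs[stage] * n + sum(
            comb(n, k) * p**k * (1 - p)**(n - k) * value(stage + 1, k)
            for k in range(n + 1))
        return max(0.0, cont)     # 0.0 = scrap the whole batch now

    print(value(0, 10))           # expected profit of starting a batch of 10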

3.
This article introduces a numerical method for finding optimal or approximately optimal decision rules and the corresponding expected losses in Bayesian sequential decision problems. The method, based on the classical backward induction method, constructs a grid approximation to the expected loss at each decision time, viewed as a function of certain statistics of the posterior distribution of the parameter of interest. In contrast with most existing techniques, this method has a computation time that is linear in the number of stages of the sequential problem. It can also be applied to problems with insufficient statistics for the parameters of interest. Furthermore, it is well suited to implementation on parallel processors.
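A stripped-down sketch of grid-based backward induction for one simple sequential problem, assuming a normal-normal model, a "decide the sign of the parameter" terminal loss, and a fixed cost per observation; parameters and the loss are illustrative stand-ins, not the article's algorithm:

    # Backward induction on a grid over the posterior mean for a sequential
    # sampling problem: at each stage, stop (decide the sign of theta) or pay c
    # and take one more observation.  Illustrative assumptions only.
    import numpy as np
    from scipy.stats import norm

    sigma2, tau2, c, T = 1.0, 4.0, 0.02, 25   # obs. var., prior var., sampling cost, horizon
    grid = np.linspace(-4, 4, 201)            # grid over the posterior mean

    def post_var(t):                          # posterior variance after t observations
        return 1.0 / (1.0 / tau2 + t / sigma2)

    def stop_loss(m, v):                      # Bayes risk of deciding sign(theta) now
        s = np.sqrt(v)
        return s * norm.pdf(m / s) - np.abs(m) * norm.cdf(-np.abs(m) / s)

    V = stop_loss(grid, post_var(T))          # at the horizon we must stop
    z = np.linspace(-3, 3, 41)                # nodes for the predictive expectation
    w = norm.pdf(z); w /= w.sum()
    for t in range(T - 1, -1, -1):
        spread = np.sqrt(post_var(t) - post_var(t + 1))   # std. dev. of the updated mean
        cont = np.array([c + w @ np.interp(m + spread * z, grid, V) for m in grid])
        V = np.minimum(stop_loss(grid, post_var(t)), cont)
    print("expected loss of the optimal procedure at the prior:", np.interp(0.0, grid, V))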

4.
We examine normal form solutions of decision trees under typical choice functions induced by lower previsions. For large trees, finding such solutions is hard as very many strategies must be considered. In an earlier paper, we extended backward induction to arbitrary choice functions, yielding far more efficient solutions, and we identified simple necessary and sufficient conditions for this to work. In this paper, we show that backward induction works for maximality and E-admissibility, but not for interval dominance and Γ-maximin. We also show that, in some situations, a computationally cheap approximation of a choice function can be used, even if the approximation violates the conditions for backward induction; for instance, interval dominance with backward induction will yield at least all maximal normal form solutions.

5.
With new treatments and novel technology available, precision medicine has become a key topic in the new era of healthcare. Traditional statistical methods for precision medicine focus on subgroup discovery by identifying interactions between a few markers and treatment regimes. However, given the large scale and high dimensionality of modern datasets, it is difficult to detect interactions between treatment and high-dimensional covariates. Recently, novel approaches have emerged that seek to directly estimate individualized treatment rules (ITR) by maximizing the expected clinical reward using, for example, support vector machines (SVM) or decision trees. The latter enjoy great popularity in clinical practice due to their interpretability. In this article, we propose a new reward function and a novel decision tree algorithm to directly maximize rewards. We further improve the single-tree decision rule with an ensemble decision tree algorithm, ITR random forests. Our final decision rule is an average over single decision trees, and it is a soft probability rather than a hard choice: depending on how strong the treatment recommendation is, physicians can make decisions based on our model along with their own judgment and experience. Performance of the ITR forest and tree methods is assessed through simulations and through applications to a randomized controlled trial (RCT) of 1385 patients with diabetes and an EMR cohort of 5177 patients with diabetes. The ITR forest and tree methods are implemented in the statistical software R (https://github.com/kdoub5ha/ITR.Forest). Supplementary materials for this article are available online.
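The referenced R package is the authors' own implementation. As a rough, generic sketch of the underlying idea (fit the reward as a function of covariates and treatment, then report the fraction of trees preferring one treatment as a soft recommendation), one might write the following; this is not the ITR.Forest algorithm, and the data and names are hypothetical:

    # Generic forest-based individualized treatment rule sketch.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(1)
    n = 1000
    X = rng.normal(size=(n, 5))                       # hypothetical patient covariates
    A = rng.integers(0, 2, size=n)                    # randomized treatment (0/1)
    reward = X[:, 0] * (2 * A - 1) + rng.normal(scale=0.5, size=n)  # benefit depends on X[:, 0]

    forest = RandomForestRegressor(n_estimators=300, min_samples_leaf=20, random_state=0)
    forest.fit(np.column_stack([X, A]), reward)

    def treatment_probability(x):
        """Soft recommendation: fraction of trees preferring treatment 1 for x."""
        x = np.asarray(x, dtype=float)
        votes = [tree.predict([np.append(x, 1)])[0] > tree.predict([np.append(x, 0)])[0]
                 for tree in forest.estimators_]
        return float(np.mean(votes))

    print(treatment_probability(np.array([1.5, 0, 0, 0, 0])))   # close to 1
    print(treatment_probability(np.array([-1.5, 0, 0, 0, 0])))  # close to 0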

6.
An important approach to decision modeling is the induction of knowledge structures—such as rules, trees, and graphs—from empirical data describing previous conditions and the resulting decisions. We examine here a specific knowledge structure, a logic tree, in which the conditions are leaves, the decision is the root, and the intermediate nodes are logical operators. We then use genetic algorithms (GAs) to construct logic trees that best represent the correspondence between conditions and decisions described by the empirical data. We also investigate an important characteristic of the GA search, the fitness distance correlation. Finally, we comment on the usefulness of GAs in knowledge modeling.

7.
The management of certain systems, such as manufacturing facilities, supply chains, or communication networks, involves assessing the consequences of decisions aimed at the most efficient operation. Such systems usually show complex behaviors in which subsystems evolve in parallel and synchronize with one another. Furthermore, the existence of global objectives for the operation of the systems, and the changes that the systems or their environment undergo during their evolution, imply a more or less strong dependence between decisions made at different points of the life cycle. This paper addresses a complex problem that is scarcely present in the scientific literature: sequences of decisions aimed at achieving several objectives simultaneously, where each decision strongly influences the rest. In this case, the formal statement of the decision problem should take the whole decision sequence into account, making the “divide and conquer” solving paradigm impractical; only an integrated methodology can afford a realistic solution to such a decision problem. In this paper, an approach based on the formalism of Petri nets is described, several considerations related to this problem are presented, and a solution methodology based on the authors' previous work is proposed, together with a case study that illustrates the main concepts.

8.
Hurricane forecasts are intended to convey information that helps individuals and organizations make decisions: for example, whether a mandatory evacuation should be issued, where emergency evacuation shelters should be located, and what quantities of emergency supplies should be stockpiled at various locations. This paper incorporates one of the National Hurricane Center's official prediction models into a Bayesian decision framework to address complex decisions made in response to an observed tropical cyclone. The Bayesian decision process accounts for the trade-off between improving forecast accuracy and deteriorating cost efficiency (with respect to implementing a decision) as the storm evolves, which is characteristic of the above-mentioned decisions. The specific application addressed in this paper is a single-supplier, multi-retailer supply chain system in which demand at each retailer location is a random variable affected by the trajectory of an observed hurricane. The solution methodology is illustrated through numerical examples, and the benefit of the proposed approach over a traditional approach is discussed.

9.
Researchers from various fields have long had a steady interest in how humans make decisions. Based on this interest, many approaches, such as rational choice theory or the expected utility hypothesis, have been proposed. Although these approaches provide suitable ground for modeling the human decision-making process, they are unable to explain the corresponding irrationalities and existing paradoxes and fallacies. Recently, a new formulation of decision theory has been proposed that can correctly describe these paradoxes and possibly provide a unified and general theory of decision making. This new formulation is founded on the application of the mathematical structure of quantum theory to the fields of human decision making and cognition. It has been shown that by applying these quantum-like models, one can better describe the uncertainty, ambiguity, emotions, and risks involved in the human decision-making process. Even in computational environments, an agent that follows the correct patterns of human decision making will function better as a proxy for a real user. In this paper, we present a comprehensive survey of this research and the corresponding recent developments. Finally, we discuss the benefits of leveraging quantum-like modeling approaches in computational domains, along with the existing challenges and limitations currently facing the field.

10.
Cellular manufacturing (CM) is an approach that can be used to enhance both flexibility and efficiency in today's small-to-medium lot production environment. The design of a CM system (CMS) often involves three major decisions: cell formation, group layout, and group scheduling. Ideally, these decisions should be addressed simultaneously in order to obtain the best results. However, due to the complexity and NP-complete nature of each decision and the limitations of traditional approaches, most researchers have addressed these decisions only sequentially or independently. In this study, a hierarchical genetic algorithm is developed to simultaneously form manufacturing cells and determine the group layout of a CMS. The intrinsic features of the proposed algorithm include a hierarchical chromosome structure that encodes two important cell design decisions, a new selection scheme that dynamically considers two correlated fitness functions, and a group mutation operator that increases the probability of mutation. The computational analyses show that the proposed structure and operators are effective in improving solution quality as well as accelerating convergence.

11.
This paper presents a general model for the economic evaluation of treatments of chronic conditions. The model is applicable where multiple possible treatments are available, none of which is completely satisfactory, owing either to lack of effect or to adverse effects occurring in some patients. Treatments may include any process designed to enable patients to manage their condition. Since none are completely satisfactory, the model cannot determine the ‘best’ treatment; it merely helps identify an order in which treatments should be tried. This economic model must be used in conjunction with clinical or other relevant information. The model should be used to make comparisons between treatment options within one discrete category. The information provided can thus assist allocative microeconomic decisions, where funds must be allocated between specific competing alternatives. These decisions may apply to either purchasers or providers of health care services.

12.
Models for decision-making under uncertainty use probability distributions to represent variables whose values are unknown when the decisions are to be made. Often the distributions are estimated from observed data. Sometimes these variables depend on the decisions, but the dependence is ignored in the decision maker's model; that is, the decision maker models these variables as having an exogenous probability distribution independent of the decisions, whereas the probability distribution of the variables actually depends on the decisions. It has been shown in the context of revenue management problems that such modeling error can lead to systematic deterioration of decisions as the decision maker attempts to refine the estimates with observed data. Many questions remain to be addressed. Motivated by revenue management, newsvendor, and a number of other problems, we consider a setting in which the optimal decision for the decision maker's model is given by a particular quantile of the estimated distribution, and the empirical distribution is used as the estimator. We give conditions under which the estimation and control process converges, and show that although in the limit the decision maker's model appears to be consistent with the observed data, the modeling error can cause the limit decisions to be arbitrarily bad.
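A small simulation sketch of the phenomenon, using the classic censored-demand (newsvendor) variant rather than the paper's exact setting: the decision is a quantile of the empirical distribution, but the observations depend on the decision itself, so the order level typically settles below the true quantile even though it looks consistent with the recorded data. All numbers are illustrative:

    # Spiral-down sketch with censored sales and an empirical-quantile policy.
    import numpy as np

    rng = np.random.default_rng(2)
    tau = 0.8                          # target service level (critical fractile)
    order = 150.0                      # initial order quantity
    history = []
    for period in range(200):
        demand = rng.normal(100, 20)   # true demand, unknown to the decision maker
        history.append(min(demand, order))          # only censored sales are recorded
        order = float(np.quantile(history, tau))    # "optimal" quantile of the estimate
    print("final order level:", round(order, 1))
    print("true 80% demand quantile:", round(100 + 20 * 0.8416, 1))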

13.
The paper develops a decision aid for replacement actions for a key furnace component subject to condition monitoring. A state space model is used to predict the erosion condition of the inductors in an induction furnace, in which a measure of the conductance ratio (CR) is used to indirectly assess the relative condition of the inductors and to guide replacement decisions. This study seeks to improve on this decision process by establishing the relationship between CR and the erosion condition of the inductors. To establish such a relationship, a state space model has been formulated and its parameters estimated from CR data. A replacement cost model that balances, at any time, costly replacements against possible catastrophic failure is also proposed, based upon the predicted probability of inductor erosion conditional upon all available information. The well-known Kalman filter is employed to derive the predicted and updated probabilities of the inductor erosion level conditional upon the CR data observed to date. This is the first time the condition monitoring decision process has been modelled for a real plant based upon filtering theory. The model fits the data well, gives a sensible answer to the actual problem, and is transferable to other condition monitoring contexts. Possible extensions are discussed in the paper.
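A bare-bones sketch of the filtering step for a scalar linear-Gaussian state space model of this general kind; the actual CR model, its parameters, and the replacement cost model are in the paper, and everything below is an illustrative stand-in:

    # Scalar Kalman filter: latent erosion state x_t evolving as a random walk
    # with drift, observed through a noisy conductance-ratio-like measurement
    # y_t = h * x_t + noise.  All parameters are made-up placeholders.
    import numpy as np

    a, drift, q = 1.0, 0.05, 0.01   # state transition, erosion drift, process noise var.
    h, r = 1.0, 0.04                # observation gain and measurement noise variance
    x_hat, p = 0.0, 1.0             # initial state estimate and its variance

    rng = np.random.default_rng(3)
    true_x = 0.0
    for t in range(50):
        true_x = a * true_x + drift + rng.normal(scale=np.sqrt(q))
        y = h * true_x + rng.normal(scale=np.sqrt(r))          # simulated CR reading
        # predict
        x_pred = a * x_hat + drift
        p_pred = a * p * a + q
        # update with the new measurement
        k = p_pred * h / (h * p_pred * h + r)                  # Kalman gain
        x_hat = x_pred + k * (y - h * x_pred)
        p = (1 - k * h) * p_pred
    # a replacement rule could then threshold P(x_t > erosion_limit | data)
    print(f"filtered erosion estimate {x_hat:.2f} (std {np.sqrt(p):.2f}), true {true_x:.2f}")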

14.
A motorist involved in an accident must decide whether or not to claim from his insurance company when he is at fault. An optimal decision rule can only be determined in the light of future developments and future decisions, since the consequences of claiming or not claiming are felt in the subsequent year's premiums. In this paper, optimal no-claim limits are determined for a common Dutch type of insurance policy with bonus-malus structures, using generalized Markovian programming. Computational results are given for various values of the expected number of accidents per year.
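An illustrative toy of the underlying calculation, not the Dutch bonus-malus scale or the paper's generalized Markovian programme: for each bonus class, compare the discounted future cost of claiming with paying the damage out of pocket, and read the no-claim limit off the break-even point. The scale, premiums, transition rules, and the one-accident-per-year simplification are all assumptions:

    # Toy no-claim-limit calculation for a 5-class bonus-malus scale with at most
    # one accident per year (probability p) and exponentially distributed damage.
    import numpy as np

    prem = np.array([1000.0, 800.0, 650.0, 550.0, 500.0])  # class 0 = worst, 4 = best
    up = lambda c: min(c + 1, 4)       # claim-free year: move one class up
    down = lambda c: max(c - 2, 0)     # a claim: drop two classes
    p, mu, beta = 0.1, 400.0, 0.95     # accident prob., mean damage, discount factor

    V = np.zeros(5)
    for _ in range(1000):              # value iteration over expected discounted costs
        newV = np.empty(5)
        for c in range(5):
            A, B = beta * V[up(c)], beta * V[down(c)]      # future cost if pay / claim
            L = max(B - A, 0.0)                            # no-claim limit for class c
            accident_cost = A + mu * (1.0 - np.exp(-L / mu)) if L > 0 else B
            newV[c] = prem[c] + (1 - p) * A + p * accident_cost
        V = newV
    limits = [max(beta * (V[down(c)] - V[up(c)]), 0.0) for c in range(5)]
    print("optimal no-claim limits by class:", np.round(limits, 1))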

15.
Stochastic programs with recourse provide an effective modeling paradigm for sequential decision problems with uncertain or noisy data, when the uncertainty can be modeled by a discrete set of scenarios. In two-stage problems the decision variables are partitioned into two groups: a set of structural, first-stage decisions and a set of second-stage, recourse decisions. The structural decisions are scenario-invariant, but the recourse decisions are scenario-dependent and can vary substantially across scenarios. In several applications it is important to restrict the variability of recourse decisions across scenarios, or to investigate the trade-offs between the stability of recourse decisions and the expected cost of a solution. We present formulations of stochastic programs with restricted recourse that trade off recourse stability against expected cost. The models generate a sequence of solutions in which recourse robustness is progressively enforced via parameterized satisficing constraints. We investigate the behavior of the models on several test cases and examine the performance of solution procedures based on the primal-dual interior point method.

16.
Group decision making is an active area of research within multiple attribute decision making. This paper assumes that the decision makers (DMs) are not all equally qualified to contribute equitably to the decision process. The aim of this paper is to develop an approach for determining the weights of the DMs when the decision information on alternatives with respect to attributes, provided by each DM, is given in the form of interval data. We define the average of all individual decisions as the positive ideal decision (PID) and the maximum separation from the PID as the negative ideal decision (NID), each characterized by a matrix. The weight of each DM is determined according to the Euclidean distances between the individual decision and the ideal decisions. Using the obtained weights, all individual decisions are aggregated into a collective decision, and the alternatives are then ranked based on the collective decision. This paper also gives a humanized decision method using an optimistic coefficient, which is used to adjust the relative importance of profit and risk. Finally, an example is given to illustrate the developed approach.
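A compact sketch of the weighting step, using point-valued (rather than interval) decision matrices to keep it short, a most-distant individual decision as the NID, and one plausible relative-closeness formula; the paper's exact definitions and formula may differ, and the data are made up:

    # Derive decision-maker weights from distances to ideal decisions.
    # Three DMs each rate 4 alternatives on 3 attributes.
    import numpy as np

    decisions = np.array([                      # shape (n_DMs, n_alternatives, n_attributes)
        [[7, 5, 6], [4, 6, 5], [8, 4, 7], [5, 5, 5]],
        [[6, 5, 7], [5, 6, 4], [7, 5, 6], [5, 4, 6]],
        [[3, 8, 2], [8, 2, 8], [2, 9, 3], [8, 8, 1]],
    ], dtype=float)

    pid = decisions.mean(axis=0)                               # positive ideal decision
    d_plus = np.linalg.norm(decisions - pid, axis=(1, 2))      # distance to PID
    nid = decisions[np.argmax(d_plus)]                         # most separated decision as NID
    d_minus = np.linalg.norm(decisions - nid, axis=(1, 2))     # distance to NID
    closeness = d_minus / (d_plus + d_minus + 1e-12)
    weights = closeness / closeness.sum()
    collective = np.tensordot(weights, decisions, axes=1)      # weighted aggregate decision
    print("DM weights:", np.round(weights, 3))
    print("collective scores (sum over attributes):", np.round(collective.sum(axis=1), 2))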

17.
Iwamoto recently established a formal transformation, via invariant imbedding, to construct a controlled Markov chain that can be solved in a backward manner, as in backward induction for finite-horizon Markov decision processes (MDPs), from a given controlled Markov chain with a non-additive forward recursive objective function criterion. Chang et al. presented formal methods, called “parallel rollout” and “policy switching,” for combining given multiple policies in MDPs and showed that the policies generated by both methods improve all of the policies that the methods combine. This brief paper extends the parallel rollout and policy switching methods to forward recursive objective function criteria and shows that a similar property holds as in MDPs. We further discuss how to implement these methods via simulation.

18.
Many decision support systems for feedstock companies include an option for the solution of large linear programming problems. A three-level decomposition algorithm is presented which substantially improves the solution times for such linear programming problems. When decisions must be made on the addition of new raw materials or extra quantities of existing raw materials to feed mixes, the usual approach is to use parametric linear programming. A new approach to this decision problem, based on the results of the three-level decomposition algorithm, is presented in the paper. Finally, implementation issues and the computational performance of the new approaches on real-world problems are discussed.
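The decomposition itself is beyond a snippet, but the kind of feed-mix linear programme being decomposed can be stated compactly. This is a generic blending LP with made-up nutrient data, not the paper's model:

    # Generic feed-mix (blending) LP: minimize raw-material cost subject to
    # nutrient requirements.  Ingredient data are illustrative placeholders.
    import numpy as np
    from scipy.optimize import linprog

    cost = np.array([0.30, 0.22, 0.45])        # cost per kg of three raw materials
    nutrients = np.array([[0.40, 0.09, 0.02],  # protein content per kg of each material
                          [2.8,  3.4,  1.5]])  # energy content (MJ/kg)
    minimum = np.array([0.18, 2.6])            # required per kg of finished feed

    res = linprog(cost,
                  A_ub=-nutrients, b_ub=-minimum,            # nutrients >= requirements
                  A_eq=np.ones((1, 3)), b_eq=[1.0],          # proportions sum to 1
                  bounds=[(0, 1)] * 3, method="highs")
    print("optimal mix:", np.round(res.x, 3), " cost per kg:", round(res.fun, 4))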

19.
We study six real-world major strategic decisions and discuss the role that analytic Multiple Criteria Decision Making (MCDM) models could play in helping decision makers structure and solve such problems. We interviewed successful and well-educated managers who had access to quantitative decision models but did not use them as part of their decision process. Our approach is a clinical one that takes a close look at the decision processes. We believe that the normative MCDM framework is oversimplified and does not always fit well with complex, real-world organizational decision processes. This may be one reason why decision tools are not used more widely for solving high-level decision problems. We believe that it would be worthwhile to revise some of the MCDM mainstream postulates and practices to make existing models and tools more suitable for practical purposes. Mainstream MCDM research has so far focused on the choice among alternatives. One should realize that MCDM models could also be used to create alternatives, to assess the importance of criteria, to provide decision makers with “post-commitment support”, and as part of a devil's advocate approach.

20.
This paper describes the construction of a graphical decision tool to aid the placement decisions of a multidisciplinary review panel for admissions to long-term care in a London borough in the UK. First, we construct a prediction model of placement decisions based on an applicant's attributes. Using data from the London borough, a composite model comprising syndromic decision rules followed by a two-stage hierarchical logistic regression model is proposed. The model proved to be robust in differentiating between cases needing residential home care and those needing nursing home care. Placement outcomes generated by the model are then represented graphically on a triangle plot. This approach could potentially be used as a decision support tool by managers of long-term care for continuous monitoring and assessment of the appropriateness of placements with respect to residents’ needs.
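A schematic of the prediction stage only, using an ordinary single-stage logistic regression on synthetic attributes as a stand-in for the paper's syndromic rules plus two-stage hierarchical model; the data, features, and coefficients are invented for illustration:

    # Sketch: classify applicants into residential-home vs nursing-home care
    # from a few attributes.  Synthetic data, not the paper's model.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(4)
    n = 600
    X = np.column_stack([
        rng.integers(65, 100, n),          # age
        rng.integers(0, 2, n),             # lives alone (0/1)
        rng.integers(0, 21, n),            # functional independence score (0-20)
    ]).astype(float)
    # older, living alone, and less independent applicants lean towards nursing care
    logit = 0.05 * (X[:, 0] - 80) + 0.5 * X[:, 1] - 0.25 * (X[:, 2] - 10)
    y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)   # 1 = nursing home

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    proba = model.predict_proba(X_te)[:, 1]          # P(nursing home | attributes)
    print("held-out accuracy:", round(model.score(X_te, y_te), 2))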
