Similar Documents
20 similar documents retrieved
1.
In this paper we provide evidence of the benefits of an approach that combines data mining and mathematical programming to determine the premium to charge automobile insurance policy holders in order to arrive at an optimal portfolio. A non-linear integer programming formulation is proposed to determine optimal premiums based on the insurer's need to balance profitability and market share. The non-linear integer programming approach to solving this problem is used within a data mining framework consisting of three components: classifying policy holders into homogeneous risk groups and predicting the claim cost of each group using k-means clustering; determining the price sensitivity (propensity to pay) of each group using neural networks; and combining the results of the first two components to determine the optimal premium to charge. We have earlier presented the results of the first two components. In this paper we present the results of the third component. Using our approach, we have been able to increase revenue without affecting termination rates and market share.
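As a rough illustration of the first and third components, the sketch below clusters policy holders into risk groups with k-means and then grid-searches a premium per group that trades off expected profit against a retention curve standing in for the neural-network price-sensitivity model. All data, the retention function, and the market-share threshold are assumptions for illustration, not the paper's.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical policy-holder features (e.g. age, vehicle age, bonus-malus) and claim costs.
X = rng.normal(size=(1000, 3))
claims = np.exp(rng.normal(6.0, 0.5, size=1000))

# Component 1: homogeneous risk groups and their expected claim cost.
groups = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
expected_claim = np.array([claims[groups == g].mean() for g in range(5)])

def retention(premium, cost):
    # Hypothetical price-sensitivity curve standing in for the neural-network model.
    return 1.0 / (1.0 + np.exp(0.01 * (premium - 1.5 * cost)))

# Component 3: per group, pick the premium maximising expected profit
# subject to a minimum retention rate (a crude proxy for market share).
candidate = np.linspace(0.8, 2.0, 121)          # premium as a multiple of expected claim cost
best = []
for cost in expected_claim:
    premiums = candidate * cost
    keep = retention(premiums, cost)
    profit = keep * (premiums - cost)
    feasible = keep >= 0.6                      # assumed market-share constraint
    best.append(premiums[np.argmax(np.where(feasible, profit, -np.inf))])
print(np.round(best, 2))
```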

2.
The empirical part of this article is based on a study of car insurance data that explores how global and local geographical effects on the frequency and size of claims can be assessed with appropriate statistical methodology. Because these spatial effects have to be modeled and estimated simultaneously with linear and possibly nonlinear effects of further covariates such as age of policy holder, age of car or bonus-malus score, generalized linear models cannot be applied. Also, compared to previous analyses, the geographical information is given by the exact location of the residence of policy holders. Therefore, we employ a new class of geoadditive models, where the spatial component is modeled based on stationary Gaussian random fields, as is common in geostatistics (Kriging). Statistical inference is carried out by an empirical Bayes or penalized likelihood approach using mixed model technology. The results confirm that the methodological concept provides useful tools for exploratory analyses of the data at hand and in similar situations.
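A minimal stand-in for the spatial (Kriging) component: a stationary Gaussian-process regression of claim frequency on policy-holder coordinates, ignoring the additional linear/nonlinear covariate effects that the geoadditive model estimates jointly. The coordinates and frequencies below are simulated placeholders.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, size=(200, 2))           # residence locations (km), simulated
spatial_effect = np.sin(coords[:, 0] / 15) + 0.5 * np.cos(coords[:, 1] / 20)
freq = spatial_effect + rng.normal(0, 0.2, 200)       # observed (noisy) claim frequency

# Stationary Gaussian random field prior on the spatial effect, as in geostatistical Kriging.
kernel = 1.0 * RBF(length_scale=10.0) + WhiteKernel(noise_level=0.05)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(coords, freq)

grid = np.array([[x, y] for x in range(0, 101, 10) for y in range(0, 101, 10)])
mean, sd = gp.predict(grid, return_std=True)          # smoothed spatial surface + uncertainty
```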

3.
Precise Large Deviations and Finite-Time Ruin Probability for a Compound Binomial Risk Model
马学敏 胡亦钧 《数学学报》2008,51(6):1119-113
We discuss a compound binomial risk model based on customer arrivals. In this model, the claim sizes are assumed to be a sequence of i.i.d. heavy-tailed random variables, and different policies may have different probabilities of producing an actual claim. Under the condition that the claim-size distribution belongs to the class ERV, precise large deviations of the loss process are obtained; furthermore, a Lundberg-type limit result for the finite-time ruin probability is derived.

4.
A Poisson-process risk model based on customer arrivals is proposed, in which different policies have different probabilities of producing an actual claim. Under the assumption that the potential claim sizes form a sequence of negatively dependent, identically distributed heavy-tailed random variables whose common distribution belongs to the intersection class L ∩ D, an asymptotic expression for the finite-time ruin probability is obtained.

5.
In this paper, we propose a customer-based individual risk model, in which potential claims by customers are described as i.i.d. heavy-tailed random variables, but different insurance policy holders are allowed to have different probabilities to make actual claims. Some precise large deviation results for the prospective-loss process are derived under certain mild assumptions, with emphasis on the case of the heavy-tailed distribution function class ERV (extended regular variation). Lundberg-type limiting results on the finite time ruin probabilities are also investigated.
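For orientation only, the classical form that precise large-deviation results take for i.i.d. heavy-tailed claims (stated here as background, without the customer-based model's policy-specific claim probabilities) is, with $S_n=\sum_{i=1}^n X_i$, $\mu=\mathbb{E}X_1$ and $\bar F$ the claim-size tail:

```latex
% Standard precise large deviation asymptotics for heavy-tailed sums
% (background only; the paper's result concerns its customer-based model).
\[
  \Pr\bigl(S_n - n\mu > x\bigr) \;\sim\; n\,\bar F(x), \qquad n \to \infty,
\]
% holding uniformly for x >= \gamma n, for any fixed \gamma > 0, when F \in ERV.
```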

6.
Pathology ordering by general practitioners (GPs) is a significant contributor to rising health care costs both in Australia and worldwide. A thorough understanding of the nature and patterns of pathology utilization is an essential requirement for effective decision support for pathology ordering. In this paper a novel methodology for integrating data mining and case-based reasoning for decision support for pathology ordering is proposed. It is demonstrated how this methodology can facilitate intelligent decision support that is both patient-oriented and deeply rooted in practical peer-group evidence. Comprehensive data collected by professional pathology companies provide a system-wide profile of patient-specific pathology requests by various GPs, as opposed to one limited to an individual GP practice. Using real data provided by XYZ Pathology Company in Australia, which contain more than 1.5 million records of pathology requests by GPs, we illustrate how knowledge extracted from these data through data mining with Kohonen's self-organizing maps constitutes the base that, with further assistance of modern data visualization tools and on-line processing interfaces, can provide "peer-group consensus" evidence support for solving new cases of the pathology test ordering problem. The conclusion is that a formal methodology integrating case-based reasoning principles, which are inherently close to GPs' daily practice, with data-driven, computationally intensive knowledge discovery mechanisms, which can be applied to the massive amounts of pathology request data routinely available at professional pathology companies, can facilitate more informed evidential decision making by doctors in the area of pathology ordering.
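To make the self-organizing-map step concrete, here is a from-scratch toy SOM in NumPy (a stand-in for whatever SOM toolkit the study used); the request profiles are simulated and the grid size, learning schedule and feature set are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical pathology-request profiles: one row per request episode,
# columns = indicator/count features for individual tests (simulated here).
data = rng.random((500, 8))

rows, cols, dim, n_iter = 6, 6, data.shape[1], 3000
weights = rng.random((rows, cols, dim))
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

for t in range(n_iter):
    lr = 0.5 * (1 - t / n_iter)                    # decaying learning rate
    radius = max(1.0, 3.0 * (1 - t / n_iter))      # decaying neighbourhood radius
    x = data[rng.integers(len(data))]
    dist = np.linalg.norm(weights - x, axis=-1)
    bmu = np.unravel_index(np.argmin(dist), (rows, cols))   # best-matching unit
    # Gaussian neighbourhood update pulls nearby units towards the sample.
    h = np.exp(-np.sum((grid - np.array(bmu)) ** 2, axis=-1) / (2 * radius ** 2))
    weights += lr * h[..., None] * (x - weights)

# Map each request profile to its winning unit; units group similar ordering patterns,
# which is the basis for "peer-group consensus" views of a new case.
winners = [np.unravel_index(np.argmin(np.linalg.norm(weights - x, axis=-1)), (rows, cols))
           for x in data]
```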

7.
This paper presents an application of knowledge discovery via rough sets to a real-life case study of global investing risk in 52 countries using 27 indicator variables. The aim is to explain the classification of the countries according to financial risks assessed by Wall Street Journal international experts, and to discover knowledge from the data via decision rule mining, rather than to predict; i.e. to capture the explicit or implicit knowledge or policy of international financial experts, rather than to predict the actual classifications. Suggestions are made about the most significant attributes for each risk class and country, as well as the minimal set of decision rules needed. Our results compared favorably with those from discriminant analysis and several variations of preference disaggregation MCDA procedures. The same approach could be adapted to other problems with missing data in data mining, knowledge extraction, and different multi-criteria decision problems, such as sorting, choice and ranking.
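A toy illustration of the rough-set building blocks behind such rule mining: the lower and upper approximations of a decision class with respect to a set of condition attributes. The country names, discretised indicators and risk labels below are invented placeholders, not the study's data.

```python
import pandas as pd

# Hypothetical discretised indicators and an expert risk class per country.
df = pd.DataFrame({
    "country": ["A", "B", "C", "D", "E", "F"],
    "debt":    ["high", "high", "low", "low", "high", "low"],
    "growth":  ["low", "low", "high", "high", "low", "high"],
    "risk":    ["bad", "bad", "good", "good", "good", "good"],
})

conditions = ["debt", "growth"]

# Indiscernibility classes: countries identical on all condition attributes.
blocks = [set(g["country"]) for _, g in df.groupby(conditions)]
target = set(df.loc[df["risk"] == "good", "country"])

lower = set().union(*[b for b in blocks if b <= target])     # certainly 'good'
upper = set().union(*[b for b in blocks if b & target])      # possibly 'good'
boundary = upper - lower                                     # inconsistent (boundary) region
print(lower, upper, boundary)
```

Decision rules in the rough-set sense are read off the blocks inside the lower approximation (certain rules) and the boundary region (possible rules).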

8.
There has been some work, e.g. Carriere (1998), Valdez (2000b), and Valdez (2001), leading to the development of statistical models for understanding the mortality pattern of terminated policies. However, there is scant literature providing empirical evidence of the true nature of the relationship between survivorship and persistency in life insurance. When a life insurance contract terminates due to voluntary non-payment of premiums, there is a possible hidden cost resulting from mortality antiselection. This refers to the tendency of policyholders who are generally healthy to select against the insurance company by voluntarily terminating their policies. In this article, we explore the empirical results on the survival pattern of terminated policies, using a follow-up study of the mortality of policies that terminated from a portfolio of life insurance contracts. The data were obtained from a major insurer that traced the mortality of its withdrawn policies, for the purpose of understanding mortality antiselection, by obtaining dates of death from the Social Security Administration. Using a representative sample of this follow-up data, we modeled the time until a policy lapses and its subsequent mortality pattern. We find some evidence of mortality selection, and we consequently examined the financial cost of policy termination.
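As a sketch of the follow-up analysis, the fragment below fits a Kaplan–Meier survival curve to simulated post-lapse mortality data using the lifelines package (assumed available). The article models time-until-lapse and subsequent mortality jointly, which this toy example does not attempt; the durations, censoring horizon and sample size are invented.

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(3)
n = 500
# Simulated years from policy termination to death, with administrative censoring at 15 years.
true_time = rng.exponential(scale=20.0, size=n)
observed = true_time <= 15.0
duration = np.minimum(true_time, 15.0)

kmf = KaplanMeierFitter()
kmf.fit(durations=duration, event_observed=observed, label="lapsed policies")
print(kmf.survival_function_.tail())   # estimated survival of terminated policy holders
```

Comparing such a curve against the survival of persisting policies is one simple way to look for the antiselection effect described above.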

9.
Extension Data Mining of the Effect of CPI Index Changes on Product Sales
Current research on data mining focuses mainly on static data. In practice, however, the contradictory problems that frequently have to be handled must be resolved through extension transformations and the operations on such transformations; this requires knowledge about transformations and calls for dynamic data mining or extension data mining. Using the theory of extension logic and extension data mining, this paper studies the mining of conductive knowledge in extension data mining, based on how changes in the national consumer price index (CPI) affect product sales data, thereby providing enterprise decision makers with a basis for formulating more reasonable sales strategies in the current market environment.

10.
Optimization models have been used to support decision making in the forest industry for a long time. However, several of those models are deterministic and do not address the variability that is present in some of the data. Robust Optimization is a methodology that deals with uncertainty or variability in optimization problems by computing a solution that is feasible for all possible scenarios of the data within a given uncertainty set. This paper presents the application of the Robust Optimization methodology to a sawmill planning problem. In this particular problem, variability is assumed in the yield coefficients associated with the cutting patterns used. The main results show that the loss in objective function value (the "Price of Robustness") due to computing robust solutions is not excessive. Moreover, the computed solutions remain feasible for a large proportion of randomly generated scenarios and tend to preserve the structure of the nominal solution. We believe that these results identify an application area for Robust Optimization in which several sources of uncertainty are present.
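A minimal robust-counterpart sketch under box (interval) uncertainty in the yield coefficients: demand constraints are enforced with the worst-case (lowest) yields, so the resulting plan stays feasible for every yield realisation in the interval. The product/pattern structure, numbers and uncertainty level are assumptions for illustration, not the paper's instance.

```python
import numpy as np
from scipy.optimize import linprog

# Nominal yields: units of product i obtained per log cut with pattern j (assumed).
yields = np.array([[2.0, 1.0, 0.0],
                   [1.0, 2.0, 1.0],
                   [0.0, 1.0, 3.0]])
demand = np.array([80.0, 100.0, 60.0])
cost = np.array([1.0, 1.2, 1.1])          # cost per log, by cutting pattern
rho = 0.10                                 # +/-10% interval uncertainty on every yield

def plan(y):
    # minimise cost subject to  y @ x >= demand, x >= 0  (linprog uses <=, so negate).
    res = linprog(c=cost, A_ub=-y, b_ub=-demand, bounds=[(0, None)] * 3, method="highs")
    return res.x, res.fun

nominal_x, nominal_cost = plan(yields)
robust_x, robust_cost = plan(yields * (1 - rho))   # worst-case yields inside the box

print("price of robustness:", robust_cost - nominal_cost)
```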

11.
In this paper, we discuss combining expert knowledge and computer simulators in order to provide decision support for policy makers managing complex physical systems. We allow future states of the complex system to be viewed after initial policy is made, and for those states to influence the revision of policy. The potential for future observations and interventions impacts heavily on the optimal policy for today, and this is handled within our approach. We show how deriving policy-dependent system uncertainty using computer models leads to an intractable backwards induction problem for the resulting decision tree. We introduce an algorithm for emulating an upper bound on our expected loss surface for all possible policies and discuss how this might be used in policy support. To illustrate our methodology, we look at choosing an optimal CO2 abatement strategy, combining an intermediate complexity climate model and an economic utility model with climate data.

12.
When facing dual-channel retailing, pharmaceutical consumers are strongly influenced by the medical insurance reimbursement policy available in the physical channel. Motivated by this observation, this paper builds a dual-channel pharmaceutical supply chain game model that accounts for both the insurance regulation policy and consumer utility, and studies how the insurance reimbursement ratio and the e-commerce platform commission ratio affect dual-channel pricing, supply chain performance and social welfare. The results show that dual-channel drug prices decrease as both the platform commission rate and the reimbursement ratio increase; a higher reimbursement ratio raises social welfare and lowers supply chain performance, while weakening the low-price advantage of the online channel and thus effectively curbing channel cannibalization; and although a higher platform commission ratio raises social welfare to some extent, it severely erodes channel profits, intensifies channel competition and reduces supply chain performance.

13.
Stakeholders and decision makers often develop visions of the ideal-type future as a response to complex societal problems and design their actions accordingly. However, these actors sometimes have a limited understanding as to whether their visions are feasible, what action is required and what the potential consequences are. This paper presents a methodology for linking visions with quantitative resource allocation scenarios which show different options in implementing the visions. The consequences are then appraised by multi-criteria assessment in order to find optimal and acceptable ways of implementation. As a result, stakeholders and decision makers learn about their visions and may even rethink them before decision making. The methodology thus couples visionary ideas with analytical information, providing a novel approach using quantitative techniques in a soft framework. The methodology is illustrated via a real-world case study concerning the future energy system in a small Swiss community.

14.
Robust Optimization (RO) is a modeling methodology, combined with computational tools, for processing optimization problems in which the data are uncertain and only known to belong to some uncertainty set. The paper surveys the main results of RO as applied to uncertain linear, conic quadratic and semidefinite programming. For these cases, computationally tractable robust counterparts of uncertain problems are explicitly obtained, or good approximations of these counterparts are proposed, making RO a useful tool for real-world applications. We discuss some of these applications, specifically: antenna design, truss topology design and stability analysis/synthesis in uncertain dynamic systems. We also describe a case study of 90 LPs from the NETLIB collection. The study reveals that the feasibility properties of the usual solutions of real-world LPs can be severely affected by small perturbations of the data, and that the RO methodology can be successfully used to overcome this phenomenon. Received: May 24, 2000 / Accepted: September 12, 2001 / Published online: February 14, 2002
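For reference, the kind of computationally tractable robust counterpart surveyed there: an uncertain linear constraint $a^\top x \le b$ with $a$ ranging over an ellipsoid $\{\bar a + Pu : \|u\|_2 \le 1\}$ is equivalent to a single conic quadratic constraint.

```latex
% Ellipsoidal uncertainty set: a in { \bar a + P u : ||u||_2 <= 1 }
\[
  a^\top x \le b \quad \forall a \in \mathcal{U}
  \;\Longleftrightarrow\;
  \bar a^\top x + \bigl\| P^\top x \bigr\|_2 \le b .
\]
```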

15.
In the present study, a novel computational method to optimize window design for thermal comfort in naturally ventilated buildings is described. The methodology is demonstrated by means of a prototype case, which corresponds to a single-room, rural-type building. Initially, the airflow in and around the building is simulated using a Computational Fluid Dynamics (CFD) model. Local climate data are recorded by a weather station and the prevailing conditions are imposed in the CFD model as inlet boundary conditions. The produced airflow patterns are utilized to predict thermal comfort indices, i.e. the PMV and its modifications for non-air-conditioned buildings, with respect to various occupant activities. Mean values of these indices (output/objective variables) within the occupied zone are calculated for different window-to-door configurations and building orientations (input/design variables), to generate a database of input-output data pairs. The database is then used to train and validate Radial Basis Function Artificial Neural Network (RBF ANN) input-output "meta-models". The produced meta-models are used to formulate an optimization problem, which takes into account thermal comfort constraints recommended by design guidelines. It is concluded that the proposed methodology provides the optimal window designs, i.e. those yielding the best values of the objective variables, for both single and multiple activity levels.
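A compact stand-in for the meta-modelling step: fit a radial-basis-function interpolant to (design variable, comfort index) pairs and query it inside an optimisation loop. Here SciPy's RBFInterpolator replaces the trained RBF ANN, and the input-output data are random placeholders rather than CFD results; the design variables and their scaling are also assumptions.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(4)
# Design variables: window-to-door area ratio and building orientation, scaled to [0, 1].
designs = rng.random((60, 2))
# Placeholder for the mean PMV computed by the CFD model for each design.
pmv = (0.8 * (designs[:, 0] - 0.4) ** 2
       + 0.3 * np.sin(2 * np.pi * designs[:, 1])
       + rng.normal(0, 0.02, 60))

meta = RBFInterpolator(designs, pmv, kernel="thin_plate_spline")

# Crude optimisation over a dense grid of candidate designs, keeping |PMV| small.
grid = np.array([[a, b] for a in np.linspace(0, 1, 51) for b in np.linspace(0, 1, 51)])
pred = meta(grid)
best = grid[np.argmin(np.abs(pred))]
print("best design (scaled):", best, "predicted PMV:", pred[np.argmin(np.abs(pred))])
```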

16.
This work focuses on finding an optimal barrier policy for an insurance risk model when dividends are paid to the shareholders according to a barrier strategy. A new approach based on stochastic optimization methods is developed. Compared with the existing results in the literature, more general surplus processes are considered. Precise models of the surplus need not be known; only noise-corrupted observations of the dividends are used. Using barrier-type strategies, a class of stochastic optimization algorithms is developed. Convergence of the algorithm is analyzed, and the rate of convergence is also provided. Numerical results are reported to demonstrate the performance of the algorithm.
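The flavour of such simulation-based barrier optimisation, as a toy Kiefer–Wolfowitz recursion: the barrier level is nudged along a finite-difference gradient estimate built from noisy evaluations of the expected discounted dividends. The "dividend value" function below is a made-up smooth surrogate with known optimum, not an actual surplus-process simulator, and the step-size choices are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

def noisy_dividend_value(b):
    # Placeholder for a noise-corrupted observation of expected discounted dividends
    # under barrier level b; in practice this comes from simulating the surplus process.
    return -(b - 4.0) ** 2 + 16.0 + rng.normal(0, 0.5)

b = 1.0                                   # initial barrier level
for n in range(1, 2001):
    a_n = 2.0 / n                         # step sizes satisfying the usual SA conditions
    c_n = 1.0 / n ** 0.25
    grad = (noisy_dividend_value(b + c_n) - noisy_dividend_value(b - c_n)) / (2 * c_n)
    b = max(0.0, b + a_n * grad)          # ascend the estimated gradient, keep barrier >= 0

print("estimated optimal barrier:", round(b, 3))   # should approach 4.0 for this surrogate
```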

17.
We consider price-driven dispatch planning under price uncertainty: a storable commodity is optimally sold and purchased over time. First, we consider models where the storage level is constrained in expectation. The dual of the corresponding optimization problem is related to the newsvendor problem. Exact solutions of bang-bang type are given. The second methodology is for high-frequency dispatch decisions in multistage stochastic programming models: to overcome the curse of dimensionality, prices are modeled by occupation times at price levels. In a case study, we consider a pumped-storage hydropower plant: numerical solutions are given, which have similar patterns as for the first, exactly solvable problems.
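As background on the newsvendor connection mentioned above: with underage cost $c_u$, overage cost $c_o$ and demand distribution $F$, the classical newsvendor problem has the critical-fractile solution

```latex
\[
  q^{*} \;=\; F^{-1}\!\left(\frac{c_u}{c_u + c_o}\right),
\]
```

and, per the abstract, the dual of the expectation-constrained storage model is related to this structure.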

18.
Data mining involves extracting interesting patterns from data and can be found at the heart of operational research (OR), as its aim is to create and enhance decision support systems. Even in the early days, some data mining approaches relied on traditional OR methods such as linear programming and forecasting, and modern data mining methods are based on a wide variety of OR methods including linear and quadratic optimization, genetic algorithms and concepts based on artificial ant colonies. The use of data mining has rapidly become widespread, with applications in domains ranging from credit risk, marketing, and fraud detection to counter-terrorism. In all of these, data mining is increasingly playing a key role in decision making. Nonetheless, many challenges still need to be tackled, ranging from data quality issues to the problem of how to include domain experts' knowledge, or how to monitor model performance. In this paper, we outline a series of upcoming trends and challenges for data mining and its role within OR.

19.
20.
Variations in service delivery have been identified as a major challenge to the success of process improvement studies in hospital service departments such as radiology. Largely, these variations are due to inherent system-level factors, i.e., system variations such as the unavailability of resources (nurses, beds, doctors, and equipment). These system variations are largely unnecessary or unwarranted and mostly lead to longer waiting times, delays, and lowered productivity of the service units. There is limited research on identifying system variations and modelling them for service improvements within hospitals. Therefore, this paper proposes a modelling methodology to model system variations in radiology based on real-time locating system (RTLS) tracking data. The methodology employs concepts from graph theory to identify and represent system variations. In particular, edge-coloured directed multigraphs (ECDMs) are used to model system variations, which are reflected in the paths adopted by staff, i.e., the sequence of rooms/areas traversed while delivering services. The main steps of the methodology are: (i) identifying the most standard path followed by staff for service delivery; (ii) filtering the redundant events in the RTLS tracking database for analysis; (iii) identifying the rooms/areas of the hospital site involved in the service delivery; (iv) determining patterns of paths adopted by staff from the filtered tracking database; and (v) representing these patterns in a graph-based model, the edge-coloured directed multigraph (ECDM) of a role. A case study of an MR scanning process is utilized to illustrate the implementation of the proposed methodology for modelling system variations reflected in the paths adopted by staff.
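A small sketch of step (v), representing staff paths as an edge-coloured directed multigraph with networkx (assumed available). The rooms, paths and colour-per-pattern convention here are illustrative, not the paper's actual coding scheme.

```python
import networkx as nx
from collections import Counter

# Hypothetical room sequences (paths) reconstructed from RTLS tracking events for one role.
paths = [
    ["prep", "scanner", "control", "report"],              # standard path
    ["prep", "scanner", "control", "report"],
    ["prep", "waiting", "scanner", "control", "report"],   # variation: detour via waiting area
]
# One colour per distinct path pattern; edges of the same traversal share a colour.
palette = ["black", "black", "red"]

G = nx.MultiDiGraph()
for path, colour in zip(paths, palette):
    for src, dst in zip(path, path[1:]):
        G.add_edge(src, dst, color=colour)

# Parallel edges between the same rooms encode how often each transition occurs, per pattern,
# which is how the standard path and its variations can be told apart.
transition_counts = Counter((u, v, d["color"]) for u, v, d in G.edges(data=True))
print(transition_counts)
```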
