Similar Documents
A total of 20 similar documents were found.
1.
In this paper, we look for the financial alliance structure that is the best compromise between the executives of banks and insurance companies and the bank and insurance supervisory authorities in Finland. First, we studied alternative alliance structures between banks and insurance companies from the point of view of the supervisory authorities. Together with leaders and experts of the supervisory authorities, we introduced eight criteria for the evaluation of the six alternative alliance structures. The evaluation was carried out by an expert panel consisting of representatives of the supervisory authorities. In our earlier research, bank and insurance executives preferred the financial conglomerate to the other alternatives. In the supervisory authorities' evaluation, the alliance models based on plain cross-selling agreements received the highest ranks. Under certain conditions, the financial conglomerate might be an acceptable compromise alternative for the supervisory authorities as well.

2.
Additive utility function models are widely used in multiple criteria decision analysis. In such models, a numerical value is associated with each alternative involved in the decision problem; it is computed by aggregating the scores of the alternative on the different criteria of the decision problem. The score on each criterion is determined by a marginal value function that evolves monotonically with the performance of the alternative on that criterion. Determining the shape of the marginals is not easy for a decision maker; it is easier for him/her to make statements such as "alternative a is preferred to b". In order to help the decision maker, UTA disaggregation procedures use linear programming to approximate the marginals by piecewise linear functions based only on such statements. In this paper, we propose to infer polynomials and splines instead of piecewise linear functions for the marginals. To this end, we use semidefinite programming instead of linear programming. We illustrate this new elicitation method and present some experimental results.
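A minimal sketch of the baseline UTA linear program that this approach generalises, assuming two criteria, breakpoint-valued marginals, and made-up alternatives and preference statements; the paper's semidefinite step (polynomial/spline marginals) is not reproduced here.

```python
# UTA-style LP: piecewise-linear marginals inferred from "a preferred to b"
# statements. All data below are illustrative, not from the paper.
import numpy as np
from scipy.optimize import linprog

K = 4                                 # segments per marginal
# alternatives as breakpoint indices on (criterion 1, criterion 2)
alts = {"a": (4, 1), "b": (2, 2), "c": (1, 3)}
prefs = [("a", "b"), ("b", "c")]      # decision maker: a > b, b > c
delta = 1e-3

nv = 2 * (K + 1)                      # v1_0..v1_K then v2_0..v2_K
ne = len(prefs)                       # one slack/error per statement

def util_row(alt):
    """Row vector selecting v1_i + v2_j for the given alternative."""
    row = np.zeros(nv + ne)
    i, j = alts[alt]
    row[i] = 1.0
    row[K + 1 + j] = 1.0
    return row

A_ub, b_ub = [], []
# preference: U(x) - U(y) + err >= delta  <=>  U(y) - U(x) - err <= -delta
for p, (x, y) in enumerate(prefs):
    row = util_row(y) - util_row(x)
    row[nv + p] = -1.0
    A_ub.append(row); b_ub.append(-delta)
# monotonicity: v_k <= v_{k+1} on each marginal
for base in (0, K + 1):
    for k in range(K):
        row = np.zeros(nv + ne)
        row[base + k] = 1.0; row[base + k + 1] = -1.0
        A_ub.append(row); b_ub.append(0.0)

# v1_0 = v2_0 = 0 and v1_K + v2_K = 1 (normalisation)
A_eq = np.zeros((3, nv + ne)); b_eq = [0.0, 0.0, 1.0]
A_eq[0, 0] = 1.0
A_eq[1, K + 1] = 1.0
A_eq[2, K] = 1.0; A_eq[2, nv - 1] = 1.0

c = np.zeros(nv + ne); c[nv:] = 1.0   # minimise total preference violation
res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)
print("marginal 1:", res.x[:K + 1].round(3))
print("marginal 2:", res.x[K + 1:nv].round(3))
```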

3.
Stochastic multicriteria acceptability analysis (SMAA) is a family of methods for aiding multicriteria group decision making. These methods are based on exploring the weight space in order to describe the preferences that make each alternative the most preferred one. The main results of the analysis are rank acceptability indices, central weight vectors and confidence factors for the different alternatives. The rank acceptability indices describe the variety of preferences resulting in a certain rank for an alternative; the central weight vectors represent the typical preferences favouring each alternative; and the confidence factors measure whether the criteria data are sufficiently accurate for making an informed decision. In some cases, when the problem involves a large number of efficient alternatives, the analysis may fail to discriminate between them; this situation is revealed by low confidence factors. In this paper we develop cross confidence factors, which are based on computing confidence factors for alternatives using each other’s central weight vectors. The cross confidence factors can be used for classifying efficient alternatives into sets of similar and competing alternatives. These sets are related to the concept of reference sets in Data Envelopment Analysis (DEA), but generalized for stochastic models. Forming these sets is useful when trying to identify one or more most preferred alternatives, or suitable compromise alternatives. The reference sets can also be used for evaluating whether criteria need to be measured more accurately, and at which alternatives the measurements should be focused; this may yield considerable savings in measurement costs. We demonstrate the use of the cross confidence factors and reference sets using a real-life example.
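A Monte Carlo sketch of the core SMAA quantities (rank acceptability indices and central weight vectors), assuming an additive value model, a uniform weight prior, and a made-up criteria matrix; the paper's cross confidence factors would then re-evaluate each alternative under the other alternatives' central weights.

```python
# SMAA by weight-space sampling: uniform weights are Dirichlet(1,...,1).
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0.9, 0.3, 0.5],      # rows: alternatives, cols: criteria,
              [0.6, 0.8, 0.4],      # values already scaled to [0, 1]
              [0.4, 0.6, 0.9]])
n_alt, n_crit = X.shape
n_draws = 100_000

w = rng.dirichlet(np.ones(n_crit), size=n_draws)   # uniform on the simplex
scores = w @ X.T                                   # (n_draws, n_alt)
ranks = (-scores).argsort(axis=1).argsort(axis=1)  # 0 = best

# rank acceptability index b_i^r: share of weights giving alt i rank r
b = np.array([[np.mean(ranks[:, i] == r) for r in range(n_alt)]
              for i in range(n_alt)])
# central weight vector: mean of the weights that rank alt i first
central = np.array([w[ranks[:, i] == 0].mean(axis=0) for i in range(n_alt)])
print("rank acceptabilities:\n", b.round(3))
print("central weights:\n", central.round(3))
```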

4.
We consider a problem faced by a buying office for one of the largest retail distributors in the world. The buying office plans the distribution of goods from Asia to various destinations across Europe. The goods are transported along shipping lanes by shipping companies, many of which have collaborated to form strategic alliances; each lane must be serviced by a minimum number of companies belonging to a minimum number of alliances. The task involves purchasing freight capacity from shipping companies for each lane based on projected demand, subject to minimum quantity requirements for each selected shipping company, such that the total transportation cost is minimized. In addition, the allocation must not assign an overly high proportion of freight to the more expensive shipping companies servicing any particular lane, which we call the lane cost balancing constraint. This study is the first to consider the lane cost balancing constraint in the context of freight allocation. We formulate the freight allocation problem with this constraint as a mixed integer programming model, and show that even finding a feasible solution to this problem is computationally intractable. Hence, in order to produce high-quality solutions in practice, we devised a meta-heuristic approach based on tabu search. Experiments show that our approach significantly outperforms the branch-and-cut approach of CPLEX 11.0 when the problem grows to practical size and the lane cost balancing constraint is tight. Our approach was developed into an application that is currently employed by decision-makers at the buying office in question.
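A generic tabu-search skeleton of the kind such an approach builds on; the neighbourhood, cost function and tenure below are placeholders, not the authors' actual moves or their handling of the lane cost balancing constraint.

```python
# Best-improvement tabu search with an attribute-based tabu list and a
# standard aspiration criterion (tabu moves that beat the incumbent pass).
def tabu_search(init, neighbours, cost, n_iter=1000, tenure=20):
    current, best = init, init
    tabu = {}                                  # move attribute -> expiry iter
    for it in range(n_iter):
        candidates = []
        for move, sol in neighbours(current):
            if tabu.get(move, -1) < it or cost(sol) < cost(best):
                candidates.append((cost(sol), move, sol))
        if not candidates:
            break
        c, move, current = min(candidates, key=lambda t: t[0])
        tabu[move] = it + tenure               # attribute tabu for `tenure` iters
        if c < cost(best):
            best = current
    return best

# toy demo: minimise (x - 7)^2 over integers with +/-1 moves
demo_cost = lambda x: (x - 7) ** 2
demo_neigh = lambda x: [(("step", d), x + d) for d in (-1, +1)]
print(tabu_search(0, demo_neigh, demo_cost, n_iter=50, tenure=3))  # -> 7
```

In the freight setting a move would, for instance, shift one lane's allocation between two shipping companies while a penalty term in the cost tracks violated quantity and balancing constraints.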

5.
Various features influence the formulation of a locational decision problem, including the problem representation, performance criteria, availability of data, the restrictions imposed, and computational aspects. Although there is a great need for in-depth analyses of the impact of these factors, both individually and jointly, we have as a first step focused primarily on the influence of distance on modelling, with respect to, e.g., computational tractability and the quality of the solutions obtained. While most papers on locational decision problems are concerned with the mathematical and technical aspects, we have preferred a non-mathematical exposition aimed at a target group of O.R. practitioners and regional planners.

6.
With the decline in the mortality level of populations, national social security systems and insurance companies in most developed countries are reconsidering their mortality tables to take account of longevity risk. The Lee-Carter model was the first discrete-time stochastic model to capture the increasing life-expectancy trend in mortality rates and is still broadly used today. In this paper, we propose an alternative to the Lee-Carter model: an AR(1)-ARCH(1) model. More specifically, we compare the performance of these two models with respect to forecasting age-specific mortality in Italy. We fit the two models, with Gaussian and Student-t innovations, to the matrix of Italian death rates from 1960 to 2003. We compare the forecasting ability of the two approaches in an out-of-sample analysis for the period 2004-2006 and find that the AR(1)-ARCH(1) model with Student-t innovations provides the best fit among the models studied in this paper.
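For reference, a minimal Lee-Carter fit via singular value decomposition, the benchmark against which the AR(1)-ARCH(1) model is compared; the death-rate matrix below is synthetic stand-in data, not the Italian rates for 1960-2003.

```python
# Lee-Carter: log m_{x,t} = a_x + b_x k_t, fitted by SVD of the row-centred
# log-rate matrix; k_t then forecast as a random walk with drift.
import numpy as np

rng = np.random.default_rng(1)
ages, years = 20, 44                          # e.g. 1960-2003 as in the paper
m = np.exp(-4 + 0.05 * np.arange(ages)[:, None]
           - 0.01 * np.arange(years)[None, :]
           + 0.05 * rng.standard_normal((ages, years)))   # fake m_{x,t}

log_m = np.log(m)
a = log_m.mean(axis=1)                        # a_x: average log rate by age
U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
b = U[:, 0] / U[:, 0].sum()                   # b_x, normalised to sum to 1
k = U[:, 0].sum() * s[0] * Vt[0]              # k_t (sums to ~0 by construction)

drift = (k[-1] - k[0]) / (len(k) - 1)         # random walk with drift
k_fcst = k[-1] + drift * np.arange(1, 4)      # 3-year horizon, cf. 2004-2006
m_fcst = np.exp(a[:, None] + b[:, None] * k_fcst[None, :])
print("drift:", round(drift, 4), " forecast m at youngest age:", m_fcst[0].round(4))
```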

7.
In automobile insurance, a priori ratemaking is usefully carried out with generalized linear models, and the Poisson regression model constitutes the most widely accepted basis. However, insurance companies distinguish between claims with or without bodily injuries, or between claims with full or partial liability of the insured driver. This paper examines an a priori ratemaking procedure that includes two different types of claim. When independence between claim types is assumed, the premium can be obtained by summing the premiums for each type of guarantee and depends on the rating factors chosen. If the independence assumption is relaxed, it is unclear how the tariff system might be affected. To answer this question, we introduce bivariate Poisson regression models, which are suitable for paired count data exhibiting correlation. It is shown that the usual independence assumption is unrealistic here. These models are applied to an automobile insurance claims database containing 80,994 contracts belonging to a Spanish insurance company. Finally, the consequences for pure and loaded premiums when the independence assumption is relaxed by using a bivariate Poisson regression model are analysed.
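A short sketch of the trivariate-reduction construction underlying bivariate Poisson models: X1 = Y1 + Y0 and X2 = Y2 + Y0 with independent Poisson components, so that Cov(X1, X2) = λ0. The rates below are illustrative, not the estimates from the Spanish portfolio.

```python
# Bivariate Poisson by trivariate reduction: a shared Poisson component
# induces positive correlation between the two claim-type counts.
import numpy as np

rng = np.random.default_rng(2)
lam1, lam2, lam0 = 0.08, 0.05, 0.02           # lam0 drives the dependence
n = 200_000
y0 = rng.poisson(lam0, n)
x1 = rng.poisson(lam1, n) + y0                # e.g. claims with bodily injury
x2 = rng.poisson(lam2, n) + y0                # e.g. material-damage claims

print("means:", x1.mean(), x2.mean())          # ~ lam1+lam0, lam2+lam0
print("cov:  ", np.cov(x1, x2)[0, 1])          # ~ lam0, not 0
# The pure premium per guarantee still sums as under independence, but the
# variance of the total claim count picks up an extra 2*lam0 term.
print("var of total:", np.var(x1 + x2), " vs independent:", x1.var() + x2.var())
```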

8.
Traditionally, an insurance risk process describes an insurance company’s risk through criteria estimated from historical data under the framework of probability theory, with the prerequisite that the estimated distribution function is close enough to the true frequency. However, because of the complexity and changeability of the world, and for economic and technological reasons, in many cases enough historical data are unavailable and we have to rely on belief degrees given by domain experts. This motivates us to include human uncertainty in the insurance risk process by regarding interarrival times and claim amounts as uncertain variables in the sense of uncertainty theory. Noting the expansion of insurance companies’ operating scale and the increase in businesses with different risk natures, in this paper we extend the uncertain insurance risk process with a single class of claims to one with multiple classes of claims, and derive expressions for the ruin index and the uncertainty distribution of the ruin time. As the ruin time can be infinite, we propose a proper uncertain variable and the corresponding proper uncertainty distribution. Some numerical examples are documented to illustrate our results. Finally, our method is applied to a real-world problem with satellite insurance data provided by the global insurance brokerage MARSH.

9.
Fusing multiple Bayesian knowledge sources
We address the problem of information fusion in uncertain environments. Imagine that multiple experts build probabilistic models of the same situation and we wish to aggregate the information they provide. Several problems arise if we naively merge the information from each. For example, the experts may disagree on the probability of a certain event, or they may disagree on the direction of causality between two events (e.g., one thinks A causes B while another thinks B causes A). They may even disagree on the entire structure of dependencies among a set of variables in a probabilistic network. In our proposed solution, we represent the probabilistic models as Bayesian Knowledge Bases (BKBs) and propose an algorithm called Bayesian knowledge fusion that fuses multiple BKBs into a single BKB retaining the information from all input sources. This allows easy aggregation and de-aggregation of information from multiple expert sources and facilitates multi-expert decision making by providing a framework in which all opinions can be preserved and reasoned over.
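A toy sketch of the source-preserving idea: every expert's conditional probability rules are retained and guarded by a source variable with a reliability prior, so conflicting rules (including opposite causal directions) coexist rather than being averaged. This is a simplification for illustration, not the paper's fusion algorithm, and the rules and reliabilities are made up.

```python
# Rules kept per source: reasoning would condition on the source variable
# rather than forcing the experts' models into one agreed structure.
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    head: str          # e.g. "B=true"
    tail: tuple        # conditioning assignments, e.g. ("A=true",)
    prob: float
    source: str

expert1 = [Rule("B=true", ("A=true",), 0.9, "expert1")]   # thinks A causes B
expert2 = [Rule("A=true", ("B=true",), 0.8, "expert2")]   # thinks B causes A
reliability = {"expert1": 0.6, "expert2": 0.4}            # source priors

def fuse(*rule_sets):
    """Union of all rules; each rule stays guarded by its source."""
    return [r for rs in rule_sets for r in rs]

for r in fuse(expert1, expert2):
    print(f"P({r.head} | {', '.join(r.tail)}, S={r.source}) = {r.prob}"
          f"   with P(S={r.source}) = {reliability[r.source]}")
```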

10.
In this paper we demonstrate how to develop analytic closed-form solutions to optimal multiple stopping time problems that arise when the value function acts on a compound process modified by the actions taken at the stopping times. This class of problem is particularly relevant in insurance and risk management, and we demonstrate it on an important application domain: insurance strategies in Operational Risk management for financial institutions. In this area, the most prevalent class of loss process models is the Loss Distribution Approach (LDA) framework, which models annual losses via a compound process. Given an LDA model, we consider Operational Risk insurance products that mitigate the risk of such loss processes and may reduce capital requirements. In particular, we consider insurance products that grant the policy holder the right to insure k of its annual Operational losses over a horizon of T years. We consider two insurance product structures and two general model settings: first, families of LDA loss models for which we can obtain closed-form optimal stopping rules under each generic insurance mitigation structure; and second, classes of LDA models for which we can develop closed-form approximations of the optimal stopping rules. In particular, for losses following a compound Poisson process with jump sizes given by an Inverse-Gaussian distribution and two generic types of insurance mitigation, we derive analytic expressions for the loss process modified by the insurance application, as well as closed-form solutions for the optimal multiple stopping rules in discrete time (annually). When the combination of insurance mitigation and jump size distribution does not lead to tractable stopping rules, we develop a principled class of closed-form approximations to the optimal decision rule. These approximations are based on orthogonal Askey polynomial series expansions of the distribution of the annual loss compound process and of functions of this annual loss.
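A Monte Carlo sketch of the LDA building block described above: an annual loss given by a compound Poisson sum with Inverse-Gaussian severities, plus a hindsight bound on the value of insuring k of T annual losses. Parameters are illustrative, and the closed-form stopping rules themselves are not reproduced.

```python
# Compound Poisson annual losses with Inverse-Gaussian severities, and the
# value of insuring the k largest of T annual losses chosen with hindsight
# (an upper bound on any adapted optimal-stopping rule's value).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
lam, T, n_paths = 10.0, 5, 10_000             # 10 losses/year, 5-year horizon
sev = stats.invgauss(mu=1.5, scale=2.0)       # severity mean = mu*scale = 3.0

annual = np.array([[sev.rvs(rng.poisson(lam), random_state=rng).sum()
                    for _ in range(T)] for _ in range(n_paths)])

print("mean annual loss:", annual.mean(), " (theory:", lam * 1.5 * 2.0, ")")
k = 2
hindsight = np.sort(annual, axis=1)[:, -k:].sum(axis=1)
print("hindsight value of insuring k=2 of T=5 years:", hindsight.mean())
```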

11.
Multi-period guarantees are often embedded in life insurance contracts. In this paper we consider the problem of hedging these multi-period guarantees in the presence of transaction costs. We derive the strategies for the cheapest hedge portfolio that, with certainty, enables the insurance company to meet the obligations from the insurance policies it has issued. We find that when transaction costs are imposed, the insurance company reduces the rebalancing of the hedge portfolio. The cost of establishing the hedge portfolio also increases as the transaction cost increases. For the multi-period guarantee there is a rather large rebalancing of the hedge portfolio from one period to the next; introducing transaction costs reduces the size of this rebalancing. Transaction costs may therefore be one possible explanation for why insurance companies do not perform a large rebalancing of their investment portfolios at the end of each year.

12.
Multiple objectives and dynamics characterize many sequential decision problems. In this paper we consider returns in a partially ordered criteria space as a way of generalizing single-criterion dynamic programming models to the multiobjective case. In our problem, evaluations of alternatives with respect to criteria are represented by distribution functions, so the overall comparison of two alternatives is equivalent to the comparison of two vectors of probability distributions. We assume that the decision maker tries to find a solution preferred to all other solutions (the most preferred solution). We propose a new interactive procedure for this stochastic, dynamic multiple-criteria decision-making problem. The procedure consists of two steps: first, the Bellman principle is used to identify the set of efficient solutions; next, an interactive approach is employed to find the most preferred solution. A numerical example and a real-world application illustrate the applicability of the proposed technique.

13.
Predicting the overall number of automobile insurance accidents has long been a key research topic for vehicle insurers, and the most commonly used approaches are models related to the Poisson distribution. Based on the structural features of claim data in vehicle insurance, we construct a capture-recapture model and, using a set of vehicle insurance data, fit both the capture-recapture model and commonly used alternatives such as the zero-inflated Poisson model. We reach some new conclusions, namely that the capture-recapture model provides the better overall fit, thereby offering vehicle insurers a theoretical basis for more accurately predicting the accident population.
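A sketch of the zero-inflated Poisson benchmark mentioned above, fitted on synthetic claim counts with statsmodels; the capture-recapture model itself is not shown, and the data-generating parameters below are made up.

```python
# Zero-inflated Poisson fit on synthetic per-policy claim counts with one
# rating covariate and a 30% share of structural zeros.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(4)
n = 5_000
x = rng.standard_normal(n)
lam = np.exp(-1.0 + 0.5 * x)                  # Poisson rate per policy
is_zero = rng.random(n) < 0.3                 # structural (excess) zeros
y = np.where(is_zero, 0, rng.poisson(lam))

X = sm.add_constant(x)
zip_fit = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1))).fit(disp=0)
print(zip_fit.params)                         # inflation logit + count betas
```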

14.
Short-sighted asset/liability strategies of the seventies left financial intermediaries (banks, insurance and pension fund companies, and government agencies) facing a severe mismatch between the two sides of their balance sheet. A more holistic view was introduced with a generation of portfolio immunization techniques. These techniques have served the financial services community well over the last decade. However, increased interest rate volatilities and the introduction of complex interest rate contingencies and asset-backed securities during the same period brought to light the shortcomings of the immunization approach. This paper describes a series of optimization models that take a global view of the asset/liability management problem using interest rate contingencies. Portfolios containing mortgage-backed securities provide the typical example of the complexities faced by asset/liability managers in a volatile financial world, and we use this class of instruments as the running example for introducing the models. Empirical results illustrate the effectiveness of the models, which become increasingly complex but also afford the manager increasing flexibility.

15.
We study the problem of optimal insurance contract design for risk management under a budget constraint. The contract holder takes into consideration that the loss distribution is not entirely known and therefore faces an ambiguity problem. For a given set of models, we formulate a minimax optimization problem of finding an optimal insurance contract that minimizes the distortion risk functional of the retained loss subject to a premium limitation. We demonstrate that under the average value-at-risk measure, entrance-excess of loss contracts are optimal under ambiguity, and we solve the distributionally robust optimal contract-design problem. It is assumed that the insurance premium is calculated according to a given baseline loss distribution and that the ambiguity set of possible distributions forms a neighborhood of the baseline distribution. To this end, we introduce a contorted Wasserstein distance, which is finer in the tails of the distributions than the usual Wasserstein distance.
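An empirical sketch of the average value-at-risk of a retained loss under an entrance-excess-of-loss style layer; the loss distribution and contract points are illustrative, and the distributionally robust optimisation over the ambiguity set is not reproduced.

```python
# Empirical AVaR (a.k.a. CVaR): mean of the worst (1 - alpha) share of
# outcomes, applied to the loss before and after an insurance layer.
import numpy as np

def avar(losses, alpha=0.95):
    """Average of the losses at or beyond the alpha-quantile."""
    q = np.quantile(losses, alpha)
    return losses[losses >= q].mean()

rng = np.random.default_rng(5)
loss = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)

d, m = 1.0, 4.0                               # entrance and exit points
insured = np.clip(loss - d, 0.0, m - d)       # layer paid by the insurer
retained = loss - insured

print("AVaR of raw loss:     ", round(avar(loss), 4))
print("AVaR of retained loss:", round(avar(retained), 4))
```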

16.
In this study, we propose a modelling framework for evaluating companies financed by random liabilities, such as insurance companies or commercial banks. In this approach, earnings and costs are driven by double exponential jump-diffusion processes, and bankruptcy is declared when income falls below a default threshold proportional to the charges. A change of numeraire under the Esscher risk-neutral measure is used to reduce the dimension. A closed-form expression for the value of equity is obtained in terms of expected present value operators, with and without disinvestment delay. In both cases, we determine the default threshold that maximizes shareholder equity. Subsequently, the probabilities of default are obtained by inverting the Laplace transform of the bankruptcy time. In numerical applications of the proposed model, we apply a calibration procedure based on market and accounting data to explain the behaviour of shares for two real-world examples of insurance companies.
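A simulation sketch of a double-exponential (Kou-type) jump-diffusion with a crude, discretely monitored first-passage check against a fixed default threshold; all parameters are illustrative, and the Esscher change of measure and Laplace-transform inversion used in the paper are not reproduced.

```python
# Log-process with Gaussian diffusion and double-exponential jumps
# (up-jumps ~ Exp(eta_u) w.p. p, down-jumps ~ -Exp(eta_d) w.p. 1-p).
import numpy as np

rng = np.random.default_rng(6)
mu, sigma = 0.05, 0.2
lam, p, eta_u, eta_d = 1.0, 0.4, 10.0, 5.0
x0, barrier, T, n, n_paths = 1.0, 0.6, 10.0, 1_000, 20_000
dt = T / n

X = np.full(n_paths, np.log(x0))
hit = np.zeros(n_paths, dtype=bool)
for _ in range(n):
    X += (mu - 0.5 * sigma**2) * dt \
         + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    n_jumps = rng.poisson(lam * dt, n_paths)
    for i in np.nonzero(n_jumps)[0]:
        for _ in range(n_jumps[i]):
            if rng.random() < p:
                X[i] += rng.exponential(1 / eta_u)
            else:
                X[i] -= rng.exponential(1 / eta_d)
    hit |= X <= np.log(barrier)               # discrete monitoring: biased low

print("P(default before T) ~", hit.mean())
```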

17.
The problem of designing filter banks for multidimensional multirate systems using a lifting technique is considered. To solve it, we develop a design method for multidimensional digital filters with fractional shift. A symmetric structure is defined for τ = (1/2, 1/2), and a new structure is designed based on multidimensional Taylor series. Frequency and impulse responses are given for filters with fractional space shift, and their L2 norm is found. The relevant wavelet functions are calculated, and results of image compression by the designed filter banks are presented.
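A minimal one-dimensional fractional-delay FIR built by Lagrange interpolation, a standard construction for fractional-shift filters; the paper's multidimensional Taylor-series design and the symmetric τ = (1/2, 1/2) structure are not reproduced here.

```python
# Lagrange fractional-delay FIR: h[j] = prod_{k != j} (D - k) / (j - k).
import numpy as np

def lagrange_fd(delay, order):
    """FIR coefficients interpolating a signal at fractional delay D."""
    n = np.arange(order + 1)
    h = np.ones(order + 1)
    for k in n:
        mask = n != k
        h[mask] *= (delay - k) / (n[mask] - k)
    return h

h = lagrange_fd(1.5, 3)   # half-sample shift (plus integer part 1), order 3
t = np.arange(64)
x = np.sin(2 * np.pi * 0.05 * t)
y = np.convolve(x, h)[: len(x)]               # x shifted by 1.5 samples
print("coefficients:", h, " sum =", h.sum())  # [-1, 9, 9, -1]/16, sums to 1
```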

18.
In this paper, we give a method for computing the fair insurance fee associated with the guaranteed minimum death benefit (GMDB) clause included in many variable annuity contracts. We allow for partial withdrawals, a common feature in most GMDB contracts, and determine how this affects the fair GMDB insurance charge. Our method models the GMDB pricing problem as an impulse control problem. The resulting quasi-variational inequality is solved numerically using a fully implicit penalty method. Numerical results are obtained under both constant-volatility and regime-switching models. A complete analysis of the numerical procedure is included: we show that the discrete equations are stable, monotone and consistent, and hence obtain convergence to the unique continuous viscosity solution, assuming it exists. Our results show that adding the partial withdrawal feature significantly increases the fair insurance charge for GMDB contracts.

19.
The p-median problem is one of the basic models in discrete location theory. As with most location problems, it is NP-hard, and so heuristic methods are usually used to solve it. Metaheuristics are frameworks for building heuristics. In this survey, we examine the p-median problem with the aim of providing an overview of advances in solving it using recent procedures based on metaheuristic rules.
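A greedy-plus-interchange (Teitz-Bart style) sketch for the p-median problem, the classic local search most of the surveyed metaheuristics build on; random points stand in for a real instance.

```python
# p-median: pick p facilities minimising the sum of each client's distance
# to its nearest chosen facility. Greedy construction, then 1-swap search.
import numpy as np

rng = np.random.default_rng(7)
pts = rng.random((60, 2))
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)   # distance matrix
p = 5

def cost(medians):
    return D[:, list(medians)].min(axis=1).sum()

medians = set()
while len(medians) < p:                       # add the facility helping most
    best = min(set(range(len(pts))) - medians,
               key=lambda j: cost(medians | {j}))
    medians.add(best)

improved = True
while improved:                               # swap while it improves
    improved = False
    for out in list(medians):
        for inc in set(range(len(pts))) - medians:
            cand = (medians - {out}) | {inc}
            if cost(cand) < cost(medians):
                medians, improved = cand, True
                break
        if improved:
            break

print("medians:", sorted(medians), " cost:", round(cost(medians), 3))
```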

20.
The estimation of loss reserves for incurred but not reported (IBNR) claims is an important task for insurance companies in predicting their liabilities. Recently, individual claim loss models, which overcome some shortcomings of aggregated claim loss models, have attracted a great deal of interest in the actuarial literature. The dependence of the event times on the reporting delays is a crucial issue in estimating the claim loss reserve. In this article, we propose semi-competing risks copula and semi-survival copula models to fit the dependence structure between event times and delays in the individual claim loss model. A nonstandard two-step procedure is applied to our setting, in which the association parameter and one margin are estimated based on an ad hoc estimator of the other margin. The asymptotic properties of the estimators are established. A simulation study is carried out to evaluate the performance of the proposed methods.
