Similar documents
Found 20 similar documents (search time: 781 ms)
1.
Using a standard reduction argument based on conditional expectations, this paper argues that risk sharing is always beneficial (with respect to convex order or second-degree stochastic dominance) provided the risk-averse agents share the total losses appropriately (whatever the distribution of the losses, their correlation structure, and the individual degrees of risk aversion). Specifically, all agents hand their individual losses over to a pool, and each of them is liable for the conditional expectation of his own loss given the total loss of the pool. We call this risk sharing mechanism the conditional mean risk sharing. If all the conditional expectations involved are non-decreasing functions of the total loss, then the conditional mean risk sharing is shown to be Pareto-optimal. Explicit expressions for the individual contributions to the pool are derived in some special cases of interest: independent and identically distributed losses, comonotonic losses, and mutually exclusive losses. In particular, conditions under which this payment rule leads to a comonotonic risk sharing are examined.
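A minimal sketch of the conditional mean risk sharing rule, estimating each agent's contribution E[X_i | S] by Monte Carlo with quantile binning; the lognormal loss parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two agents' losses (hypothetical lognormal parameters).
n = 200_000
x1 = rng.lognormal(mean=0.0, sigma=0.5, size=n)
x2 = rng.lognormal(mean=0.2, sigma=0.8, size=n)
s = x1 + x2  # total pooled loss

# Estimate E[X1 | S] and E[X2 | S] by binning the total loss into
# quantile bins and averaging each agent's loss within every bin.
bins = np.quantile(s, np.linspace(0, 1, 51))
idx = np.clip(np.digitize(s, bins) - 1, 0, 49)
contrib1 = np.array([x1[idx == k].mean() for k in range(50)])
contrib2 = np.array([x2[idx == k].mean() for k in range(50)])
bin_total = np.array([s[idx == k].mean() for k in range(50)])
```

By construction the two conditional-mean contributions add back up to the pooled loss in every bin, which is the full-allocation property of the rule.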

2.
传统理论认为个体是风险厌恶的,展望理论提出个体是损失厌恶的,对于损失的感受程度比赢得要强烈得多。本文通过实验表明,大部分个体对于股票投资是风险追求,而不是风险厌恶的,而且在两个参考点之外的区间个体更偏好风险,而在两个参考点之内的区间对风险的偏好程度相对要小。  相似文献   

3.
We develop a methodology for the estimation of extreme loss event probability and the value at risk which takes into account both the magnitudes and the intensity of the extreme losses. Specifically, the extreme loss magnitudes are modeled with a generalized Pareto distribution, whereas their intensity is captured by an autoregressive conditional duration model, a type of self-exciting point process. This allows for an explicit interaction between the magnitude of past losses and the intensity of future extreme losses. The intensity is further used in the estimation of extreme loss event probability. The method is illustrated and backtested on 10 assets and compared with established and baseline methods. The results show that our method outperforms the baseline methods, competes with an established method, and provides additional insight into and interpretation of the prediction of extreme loss event probability. Copyright © 2012 John Wiley & Sons, Ltd.
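The magnitude component of such a model can be sketched with a peaks-over-threshold fit; the Student-t loss series and the 95% threshold are assumptions for illustration, and the duration/intensity component is omitted.

```python
import numpy as np
from scipy.stats import genpareto, t as student_t

rng = np.random.default_rng(1)

# Hypothetical heavy-tailed daily losses (Student-t with 4 df).
losses = -student_t.rvs(df=4, size=100_000, random_state=rng)

# Peaks-over-threshold: model exceedances above a high threshold with a GPD.
u = np.quantile(losses, 0.95)
exceedances = losses[losses > u] - u
xi, loc, beta = genpareto.fit(exceedances, floc=0.0)

# Tail-based VaR at level p: invert P(X > x) = P(X > u) * (1 - F_GPD(x - u)).
p = 0.99
p_u = (losses > u).mean()
var_p = u + genpareto.ppf(1 - (1 - p) / p_u, xi, loc=0.0, scale=beta)
```

The fitted shape parameter should come out positive, reflecting the heavy tail of the simulated losses.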

4.
In this paper, we consider four common types of ruin probabilities for a discrete-time multivariate risk model, where the insurer is assumed to be exposed to a vector of net losses resulting from a number of business lines over each period. By assuming a large initial capital for the risk model and regularly varying distributions for the net losses, we establish some interesting asymptotic estimates for ruin probabilities in terms of the upper tail dependence function of the net loss vector. Our results insightfully characterize how the dependence structure among the individual net losses affects the ruin probabilities in an asymptotic sense; more importantly, from our main results, explicit asymptotic estimates for those ruin probabilities can be obtained by specifying a copula for the net loss vectors. Copyright © 2013 John Wiley & Sons, Ltd.

5.
6.
This paper deals with the estimation of loss severity distributions arising from historical data on univariate and multivariate losses. We present an innovative theoretical framework in which a closed-form expression for the tail conditional expectation (TCE) is derived for the skewed generalised hyperbolic (GH) family of distributions. The skewed GH family is especially suitable for equity losses because it allows one to capture the asymmetry in the distribution of losses, which tends to have a heavy right tail. As opposed to the widely used Value-at-Risk, TCE is a coherent risk measure, which takes into account the expected loss in the tail of the distribution. Our theoretical TCE results are verified for different distributions from the skewed GH family including its special cases: Student-t, variance gamma, normal inverse Gaussian and hyperbolic distributions. The GH family and its special cases turn out to provide an excellent fit to univariate and multivariate data on equity losses. The TCE risk measure computed for the skewed family of GH distributions provides a conservative estimator of risk, addressing the main challenge faced by financial companies of how to reliably quantify the risk arising from the loss distribution. We extend our analysis to the multivariate framework when modelling portfolios of losses, allowing the multivariate GH distribution to capture the combination of correlated risks, and demonstrate how the TCE of the portfolio can be decomposed into individual components representing the individual risks in the aggregate (portfolio) loss.
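For the Student-t special case of the GH family, the TCE has a well-known closed form, TCE_p(X) = f(t_p) / (1 - p) * (nu + t_p^2) / (nu - 1), which can be checked against a Monte Carlo tail mean; the degrees of freedom and confidence level below are illustrative.

```python
import numpy as np
from scipy.stats import t as student_t

# Closed-form tail conditional expectation for a standard Student-t loss.
nu, p = 5.0, 0.975
t_p = student_t.ppf(p, nu)  # VaR at level p
tce_closed = student_t.pdf(t_p, nu) / (1 - p) * (nu + t_p**2) / (nu - 1)

# Monte Carlo check: average of the losses beyond the VaR.
rng = np.random.default_rng(2)
x = student_t.rvs(nu, size=2_000_000, random_state=rng)
tce_mc = x[x > t_p].mean()
```

The two estimates should agree to within simulation error, illustrating why TCE gives a more conservative figure than the VaR itself.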

7.
The objective of analysing a company's risk exposures is to gain an understanding of the risks that the company faces. Only then can the likely level of future losses be estimated, and decisions about how best to manage these risks be made. To gain a full understanding, we first need to adjust for a number of external factors to ensure that all data are on a consistent basis. The historic data can then be analysed and the level of variability determined. After identifying appropriate probability distributions for the frequency and severity of the risks, simulations can be run to make forecasts. Once forecasts have been made, the best way to manage and finance the risks can be considered. As such decisions typically depend upon many factors, utility theory can be used to summarize the advantage that the company will obtain from each alternative in a given situation. This will involve defining a utility function for the company. Methods of eliciting these utility functions exist, including influence diagrams. Decision theory can consequently be applied to determine the best course of action using the company's utility function and its beliefs about the future. Uncertainty inherent in the information can therefore be incorporated in the decision process rather than be ignored. The decision will also depend upon the ability of the company to sustain a loss from retained risks and regulatory requirements relating to the risks.
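The frequency/severity simulation step described above can be sketched as follows; the Poisson and lognormal parameters are assumptions standing in for values fitted to historic data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed fitted distributions: Poisson loss frequency, lognormal severity.
lam = 12.0             # expected number of losses per year (assumption)
mu, sigma = 10.0, 1.2  # lognormal severity parameters (assumption)

# Monte Carlo forecast of the annual aggregate loss.
n_years = 100_000
counts = rng.poisson(lam, size=n_years)
annual = np.array([rng.lognormal(mu, sigma, k).sum() for k in counts])

mean_loss = annual.mean()
var_99 = np.quantile(annual, 0.99)  # 1-in-100-year loss forecast
```

Such simulated forecasts are the inputs that the utility- and decision-theoretic analysis described above would then act on.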

8.
It often happens that one or more aeroplanes from an airline fleet are taken out of operation for technical reasons, and the airline has to operate the existing network with a reduced number of planes. This paper presents the results of an effort to define a new ad hoc schedule for this situation, so that the total passenger delay on the airline network is minimized. A network is formed in which nodes represent flights on the given airline network and arcs are the total time losses on individual flights. The problem of determining a new routing and scheduling plan for the airline fleet is solved by branch-and-bound methods. A numerical example illustrates the efficiency of the model.

9.
Internal fraud is the most serious operational risk event type for Chinese commercial banks. However, owing to the intrinsic characteristics of operational risk and the short history of internal fraud loss data collection at Chinese commercial banks, data are scarce, and small samples easily yield unstable parameter estimates. To measure the risk more accurately under small samples, this paper adopts a Bayesian Markov chain Monte Carlo simulation approach within the loss distribution approach framework, assuming that loss frequency follows a Poisson-gamma distribution and loss severity follows a generalized Pareto-mixed gamma distribution. The forms of the posterior distributions are analyzed to obtain posterior estimates of internal fraud loss frequency and severity for the different business lines of Chinese commercial banks, and Monte Carlo simulation is then performed to obtain the joint risk distribution of internal fraud across business lines. The results show a good fit. Compared with traditional extreme value analysis, the posterior distribution obtained from the Bayesian analysis can serve as a prior for the future, which helps obtain more realistic parameter estimates under small samples, and the method can help banks reduce regulatory capital requirements.
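The Poisson-gamma part of the frequency model admits a conjugate update that illustrates the Bayesian small-sample idea without full MCMC; the prior parameters and yearly loss counts below are illustrative assumptions, and the severity model is omitted.

```python
import numpy as np
from scipy.stats import gamma

# Poisson likelihood with a Gamma(a0, b0) prior on the rate (rate
# parameterization) gives a Gamma(a0 + sum(counts), b0 + n) posterior.
a0, b0 = 2.0, 1.0                   # weakly informative prior (assumption)
counts = np.array([3, 1, 4, 2, 5])  # yearly internal-fraud counts (illustrative)

a_post = a0 + counts.sum()
b_post = b0 + len(counts)

post_mean = a_post / b_post  # posterior mean of the Poisson rate
ci = gamma.ppf([0.025, 0.975], a_post, scale=1.0 / b_post)  # 95% credible interval
```

The posterior from one period can be plugged in as the prior for the next, which is the mechanism the abstract credits with stabilizing small-sample estimates.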

10.
Operational risk refers to the risk of loss to a financial institution arising from inadequate internal processes, people, technology, or external events. Chinese commercial banks are in transition, and loss events triggered by operational risk occur frequently. Using collected operational risk loss events of domestic commercial banks, this paper analyzes them from the perspective of the four factors influencing operational risk, quantitatively analyzes the frequency and severity of operational risk loss events, and gives a preliminary overview of the situation currently facing China's banking industry.

11.
Tail risk measurement is a key issue in risk management. Using the buffered probability of exceedance model, this paper quantifies the risk probability distribution of different expected losses and constructs a hedging strategy that minimizes the probability of "fat-tail events" under a conditional value-at-risk constraint, thereby extending the existing research perspective to the dual dimensions of expected loss and risk probability. Using both empirical data statistics and parametric fitted distributions, we provide stable solution sets of buffered exceedance probabilities for different risk thresholds. The results show that whether expected losses follow a fat-tailed or a normal distribution, the buffered probability of exceedance model significantly reduces market risk and the probability of potential "fat-tail events", and provides a more stable hedge ratio than minimum-variance hedging.

12.
Motivated by real-world critical applications such as aircraft, medical devices, and military systems, this paper models non-repairable systems subject to a delay-time failure process involving hidden and fatal failures in two stages during their missions. A hidden failure cannot cause the system to stop functioning, while a fatal failure causes the loss of the entire system. The system undergoes scheduled inspections to detect the hidden failure. In the case of a positive inspection result, the system's main mission is aborted and a rescue operation is started to mitigate the risk of losing the entire system. The inspections are imperfect and may produce false positives and false negatives. We propose probabilistic models for evaluating performance metrics of the system considered, including mission success probability, system survival probability, expected number of inspections during the mission, and total expected losses. Based on the evaluation models, we formulate and solve an optimization problem of finding the optimal inspection schedule on a fixed mission time horizon to minimize the total expected loss. Examples are provided to demonstrate the proposed methodology and the effects of key system parameters on system performance and optimization solutions.

13.
The aim of this paper is to propose the first mathematical model for spillover effects caused by operational losses and to calibrate it based on an extensive empirical study of spillover effects and their influencing factors in the US and European banking and insurance industry. Our event study shows significant spillover effects due to operational losses, with more firms facing contagion effects than competitive effects. A regression analysis further reveals that spillover effects are information-based rather than pure, as event and firm characteristics have a significant impact, specifically external fraud, the return on equity of the announcing firm, and the similarity between the announcing and the non-announcing firm in terms of size. Based on the empirical findings, we fit a distribution and model spillover effects and underlying operational losses to assess the respective risk measures by means of a simulation analysis. The results show that spillover risk can be considerable for non-announcing firms as well as from a portfolio view, which has important risk management implications.

14.
This paper studies an equilibrium model between an insurance buyer and an insurance seller, where both parties' risk preferences are given by convex risk measures. The interaction is modeled through a Stackelberg-type game, where the insurance seller plays first by offering prices, in the form of safety loadings. Then the insurance buyer chooses his optimal proportional insurance share and his optimal prevention effort in order to minimize his risk measure. The loss distribution is given by a family of stochastically ordered probability measures, indexed by the prevention effort. We give special attention to the problems of self-insurance and self-protection, and show that if the buyer's risk measure decreases faster in effort than his expected loss, optimal effort is non-decreasing in the safety loading, with a potential discontinuity when optimal coverage switches from full to zero. On the contrary, if the buyer's risk measure decreases more slowly than the expected loss, optimal effort may or may not be non-decreasing in the safety loading. In the case of Pareto-distributed losses, the seller sets the highest possible price under which the buyer still prefers full insurance over no insurance. We also analyze the case of discrete distributions: on the one hand, for self-insurance, under the assumption that the marginal impact of the effort is higher on small losses than on catastrophic losses, the optimal effort is non-decreasing in the safety loading. On the other hand, in the case of self-protection, more conditions are needed; in particular, we obtain sufficient conditions for the optimal effort to be non-decreasing or non-monotone in the safety loading.

15.
Measuring the risk of intraday trading activity with high-frequency data is one of the most challenging research topics in intraday financial data and risk management. From the perspective of real-time trading, using tick-by-tick transaction data from the Chinese stock market and an autoregressive conditional duration (ACD) model of price durations, this paper studies risk measures for irregularly spaced intraday data, estimates the instantaneous conditional volatility of intraday trading with an irregularly spaced intraday volatility model, and forecasts and backtests the irregularly spaced intraday value at risk. The empirical results show that the irregularly spaced intraday value-at-risk model captures intraday trading risk well; stock investors and market regulators can use it to make reasonable forecasts of intraday risk for stop-loss, hedging, and risk-control purposes.
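The ACD(1,1) duration dynamics underlying such intraday models can be sketched as follows, with unit-mean exponential innovations; the parameter values are illustrative, not estimates from Chinese market data.

```python
import numpy as np

rng = np.random.default_rng(5)

# ACD(1,1): psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1},
# with durations x_i = psi_i * eps_i, eps_i ~ Exp(1).
omega, alpha, beta = 0.1, 0.1, 0.8  # illustrative parameters
n = 100_000
x = np.empty(n)
psi = np.empty(n)
psi[0] = omega / (1 - alpha - beta)  # unconditional mean duration
x[0] = psi[0] * rng.exponential()
for i in range(1, n):
    psi[i] = omega + alpha * x[i - 1] + beta * psi[i - 1]
    x[i] = psi[i] * rng.exponential()

mean_duration = x.mean()  # should be near omega / (1 - alpha - beta)
```

Short expected durations (high trading intensity) signal elevated instantaneous risk, which is what an irregularly spaced intraday VaR forecast conditions on.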

16.
Models developed to analyze facility location decisions have typically optimized one or more objectives, subject to physical, structural, and policy constraints, in a static or deterministic setting. Because of the large capital outlays involved, however, facility location decisions are frequently long-term in nature. Consequently, there may be considerable uncertainty regarding the way in which relevant parameters in the location decision will change over time. In this paper, we propose two approaches for analyzing these types of dynamic location problems, focusing on situations where the total number of facilities to be located is uncertain. We term this type of location problem NOFUN (Number Of Facilities Uncertain). We analyze the NOFUN problem using two well-established decision criteria: the minimization of expected opportunity loss (EOL) and the minimization of maximum regret. In general, these criteria assume that there are a finite number of decision options and a finite number of possible states of nature. The minisum EOL criterion assumes that one can assign probabilities to the occurrence of the various states of nature and, therefore, find the initial set of facility locations that minimizes the sum of expected losses across all future states. The minimax regret criterion finds the pattern of initial facility locations whose maximum loss is minimized over all possible future states.
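The two decision criteria can be illustrated on a small regret matrix; the costs and state probabilities below are invented for illustration only.

```python
import numpy as np

# Cost of each candidate initial location pattern (rows) under each
# future state of nature (columns); numbers are illustrative.
cost = np.array([
    [10.0, 14.0, 20.0],
    [12.0, 11.0, 18.0],
    [15.0, 13.0, 12.0],
])
probs = np.array([0.5, 0.3, 0.2])  # assumed state probabilities

# Opportunity loss (regret): excess cost over the best decision per state.
regret = cost - cost.min(axis=0)

best_eol = np.argmin(regret @ probs)          # minimize expected opportunity loss
best_minimax = np.argmin(regret.max(axis=1))  # minimize maximum regret
```

Note the two criteria need not agree: EOL favors the pattern that is good on average, while minimax regret hedges against the single worst future state.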

17.
Index-linked catastrophic loss instruments represent an alternative to traditional reinsurance to hedge against catastrophic losses. The use of these instruments comes with benefits, such as a reduction of moral hazard and higher transparency. However, at the same time, it introduces basis risk as a crucial key risk factor, since the index and the company’s losses are usually not fully dependent. The aim of this paper is to examine the impact of basis risk on an insurer’s solvency situation when an industry loss warranty contract is used for hedging. Since previous literature has consistently stressed the importance of a high degree of dependence between the company’s losses and the industry index, we extend previous studies by allowing for non-linear dependencies between relevant processes (high-risk and low-risk assets, insurance company’s loss and industry index). The analysis shows that both the type and degree of dependence play a considerable role with regard to basis risk and solvency capital requirements and that other factors, such as relevant contract parameters of index-linked catastrophic loss instruments, should not be neglected to obtain a comprehensive and holistic view of their effect upon risk reduction.

18.
陈倩  梁力军 《运筹与管理》2019,28(8):174-181
Integrated measurement across multiple risk cells is a key step in bank operational risk management. Starting from the "heavy-tailed" and "truncated" nature of operational risk, this paper explores a piecewise loss distribution approach to integrated operational risk measurement and its numerical methods. First, a two-stage loss distribution approach is introduced to fit the marginal loss distribution of each individual risk cell: a doubly truncated distribution replaces the conventional complete distribution to capture the double truncation of "high-frequency, low-severity" loss data, while a peaks-over-threshold (POT) model captures the heavy tail of "low-frequency, high-severity" events. Next, based on the piecewise modeling idea, the traditional copula model, in which the margins are single complete distributions, is extended: the framework and steps for integrated measurement with copula functions when the margins are piecewise, truncated distributions are studied, and a Monte Carlo simulation algorithm is designed. Finally, the model is validated empirically. Analysis of 416 operational risk loss observations from Chinese commercial banks shows that piecewise, truncated distributions fit the margins of individual risk cells better and reduce the model risk caused by an inappropriate choice of distribution. Introducing copula functions under the piecewise measurement perspective handles the dependence structure among multiple operational risk cells flexibly and yields more reasonable risk measures.

19.
Banks and other financial institutions try to compute the necessary amount of total capital they need for absorbing stochastically dependent losses from different risk types (e.g., credit risk and market risk). Two sophisticated procedures of this so-called integrated risk management are the top-down and the bottom-up approaches. When banks apply a sophisticated risk-integration approach at all, it is usually the top-down approach, where copula functions are employed for linking the marginal distributions of profits and losses resulting from different risk types. However, it is not at all clear how accurate this approach is. Assuming that the bottom-up approach corresponds to the real-world data-generating process and using a comprehensive simulation study, it is shown that the top-down approach can underestimate the necessary amount of total capital for lower credit qualities. Furthermore, the direction and strength of the stochastic dependence between the risk types, the copula function employed, and the loss definitions all have an impact on the performance of the top-down approach. In addition, a goodness-of-fit test shows that, based on time series of loss data of realistic length, it is rather difficult to decide which copula function is the right one.
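The top-down linking step can be sketched with a Gaussian copula over assumed marginal loss distributions; the correlation, lognormal margins, and confidence level are illustrative assumptions, not the paper's calibration.

```python
import numpy as np
from scipy.stats import norm, lognorm

rng = np.random.default_rng(6)

# Gaussian copula with correlation rho linking two risk types.
rho, n = 0.5, 200_000
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
u = norm.cdf(z)  # copula sample on the unit square

# Assumed marginal loss distributions for credit and market risk.
credit = lognorm.ppf(u[:, 0], s=1.0, scale=100.0)
market = lognorm.ppf(u[:, 1], s=0.5, scale=80.0)
total = credit + market

var_total = np.quantile(total, 0.999)  # total-capital proxy at 99.9%
var_sum = np.quantile(credit, 0.999) + np.quantile(market, 0.999)
```

With imperfect dependence the total-capital figure comes in below the sum of stand-alone VaRs, the diversification benefit whose accuracy the paper's simulation study questions.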

20.
索玮岚  陈锐 《运筹与管理》2015,24(2):140-145
To address the interdependence of characteristic indicators and the diversity of information formats in selecting response plans for urban lifeline risks, this paper proposes a hybrid decision-making method that accounts for the matching of interdependent characteristics. First, the definitions of the actual domain, the set domain, and the common domain are given; the characteristic indicator values of risk events and of response plans, expressed as interval numbers, linguistic terms, and other formats, are mapped to the actual domain and the set domain respectively, and the common domain is determined by the overlap of their areas. Then, axiomatic design is extended to the hybrid decision environment to compute the information content reflecting the degree of match between a risk event and a response plan on each characteristic indicator, and an initial screening of response plans is carried out. Further, a 2-additive Choquet integral operator aggregates the indicator weights, interaction coefficients, and the information content of the remaining plans into a total information content reflecting the overall match between the risk event and each plan, on which basis the optimal response plan is selected. Finally, a numerical example verifies the effectiveness and feasibility of the method. The results show that the method can provide effective decision support for regulators to respond rapidly to urban lifeline risks and to minimize risk losses and harm.
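The 2-additive Choquet aggregation admits a compact form in the Möbius representation, C(f) = sum_i m_i * f_i + sum_{i<j} m_ij * min(f_i, f_j); the weights, interaction coefficients, and indicator scores below are illustrative assumptions.

```python
# 2-additive Choquet integral in the Mobius representation.
# Singleton weights and pairwise interaction terms (illustrative);
# they must sum to 1 for a normalized capacity.
m_single = {0: 0.3, 1: 0.4, 2: 0.2}
m_pair = {(0, 1): 0.2, (0, 2): -0.1, (1, 2): 0.0}
assert abs(sum(m_single.values()) + sum(m_pair.values()) - 1.0) < 1e-12

def choquet_2additive(f):
    """C(f) = sum_i m_i*f_i + sum_{i<j} m_ij*min(f_i, f_j)."""
    total = sum(m_single[i] * f[i] for i in m_single)
    total += sum(m_pair[(i, j)] * min(f[i], f[j]) for (i, j) in m_pair)
    return total

scores = [0.8, 0.6, 0.9]  # information content per indicator (illustrative)
value = choquet_2additive(scores)
```

A positive pairwise term rewards plans that score well on both indicators simultaneously, while a negative term models redundancy between indicators.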
