Similar documents (20 results)
1.
Tail data are often modelled by fitting a generalized Pareto distribution (GPD) to the exceedances over a high threshold. In practice, a threshold is fixed and a GPD is fitted to the data exceeding it. A difficulty in this approach is the selection of the threshold above which the GPD assumption is appropriate. Moreover, the estimates of the parameters of the GPD may depend significantly on the threshold selected. Sensitivity with respect to the threshold choice is normally studied, but typically its effects on the properties of the estimators are not accounted for. In this paper, to overcome the difficulties of the fixed-threshold approach, we propose to model extreme and non-extreme data with a distribution composed of a piecewise constant density from a low threshold up to an unknown end point, and a GPD, with that end point as its threshold, for the remaining tail part. Since we estimate the threshold together with the other parameters of the GPD, we naturally take the threshold uncertainty into account. We discuss this model from a Bayesian point of view, and the method is illustrated using simulated data and a real data set.
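As a rough illustration of the fixed-threshold approach that this paper sets out to improve on, the sketch below fits a GPD to the exceedances over a pre-chosen threshold with scipy; the toy data and the 95% threshold are arbitrary choices, not the authors' Bayesian model.

```python
# Minimal sketch of the fixed-threshold POT fit (not the authors' Bayesian model):
# choose a high threshold u, then fit a GPD to the exceedances by maximum likelihood.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.standard_t(df=3, size=5000)      # heavy-tailed toy data

u = np.quantile(data, 0.95)                 # fixed threshold (a modelling choice)
exceedances = data[data > u] - u

# Fit GPD(shape=xi, scale=sigma) to the exceedances; location pinned at 0.
xi, _, sigma = stats.genpareto.fit(exceedances, floc=0)
print(f"threshold u = {u:.3f}, shape xi = {xi:.3f}, scale sigma = {sigma:.3f}")
```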

2.
Traditionally, claim counts and amounts are assumed to be independent in non-life insurance. This paper explores how this often unwarranted assumption can be relaxed in a simple way while incorporating rating factors into the model. The approach consists of fitting generalized linear models to the marginal frequency and the conditional severity components of the total claim cost; dependence between them is induced by treating the number of claims as a covariate in the model for the average claim size. In addition to being easy to implement, this modeling strategy has the advantage that when Poisson counts are assumed together with a log-link for the conditional severity model, the resulting pure premium is the product of a marginal mean frequency, a modified marginal mean severity, and an easily interpreted correction term that reflects the dependence. The approach is illustrated through simulations and applied to a Canadian automobile insurance dataset.
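A hedged sketch of the modelling strategy described above, using statsmodels: a Poisson GLM for claim counts and a log-link Gamma GLM for average claim size that includes the observed number of claims as a covariate. The file name and column names (policies.csv, nclaims, avg_size, age, region, exposure) are hypothetical, not taken from the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

policies = pd.read_csv("policies.csv")          # assumed: one row per policy

# Marginal frequency model: Poisson GLM with exposure offset.
freq = smf.glm("nclaims ~ age + region", data=policies,
               family=sm.families.Poisson(),
               offset=np.log(policies["exposure"])).fit()

# Conditional severity model: log-link Gamma GLM for the average claim size,
# with the number of claims entering as a covariate to induce dependence.
claimants = policies[policies["nclaims"] > 0]
sev = smf.glm("avg_size ~ age + region + nclaims", data=claimants,
              family=sm.families.Gamma(link=sm.families.links.Log())).fit()

# The pure premium then factors into the mean frequency, a modified mean
# severity, and a correction term reflecting the count-severity dependence.
```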

3.
An analysis of the performance of the POT method for estimating the tail of a loss distribution
In reinsurance, the selection and pricing of high excess-of-loss layers are of considerable importance, which raises the question of how to choose a good statistical model for fitting the largest observations. To address this question, we consider the POT (peaks over threshold) approach based on extreme value theory (EVT), evaluate it by simulation, and point out several strengths and weaknesses of the method.
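For concreteness, here is a sketch of the standard POT high-quantile estimator that such a simulation study would evaluate; the toy loss data and the threshold choice are illustrative, not taken from the paper.

```python
# With a GPD(xi, sigma) fitted to the N_u exceedances of threshold u out of n
# losses, the p-quantile (p close to 1) is estimated as
#   x_p = u + (sigma/xi) * ((n/N_u * (1-p))**(-xi) - 1).
import numpy as np
from scipy import stats

def pot_quantile(losses, u, p):
    exc = losses[losses > u] - u
    xi, _, sigma = stats.genpareto.fit(exc, floc=0)
    n, n_u = len(losses), len(exc)
    return u + sigma / xi * ((n / n_u * (1 - p)) ** (-xi) - 1.0)

rng = np.random.default_rng(1)
losses = stats.lomax.rvs(c=2.0, size=10_000, random_state=rng)  # Pareto-type toy losses
print(pot_quantile(losses, u=np.quantile(losses, 0.9), p=0.999))
```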

4.
We propose a new model – we call it a smoothed threshold life table (STLT) model – to generate life tables incorporating information on advanced ages. Our method allows a smooth mortality transition from non-extreme to extreme ages, and provides objectively determined highest attained ages with which to close the life table. We proceed by modifying the threshold life table (TLT) model developed by Li et al. (2008). In the TLT model, extreme value theory (EVT) is used to make optimal use of the relatively small number of observations at high ages, while the traditional Gompertz distribution is assumed for earlier ages. Our novel contribution is to constrain the hazard function of the two-part lifetime distribution to be continuous at the changeover point between the Gompertz and EVT models. This simple but far-reaching modification not only guarantees a smooth transition from non-extreme to extreme ages, but also provides a better and more robust fit than the TLT model when applied to a high quality Netherlands dataset. We show that the STLT model also compares favourably with other existing methods, including the Gompertz–Makeham model, logistic models, the Heligman–Pollard model and the Coale–Kisker method, and that a further generalisation, a time-dependent dynamic smooth threshold life table (DSTLT) model, generally has superior in-sample fit as well as better out-of-sample forecasting performance, compared, for example, with the Cairns et al. (2006) model.
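A hedged illustration of the continuity idea (not the authors' exact parameterisation): a Gompertz hazard below a changeover age and a GPD hazard above it, with the GPD scale fixed by requiring the hazard to be continuous at the changeover. All numerical values are illustrative.

```python
import numpy as np

def stlt_hazard(x, a, b, N, xi):
    """Gompertz hazard a*exp(b*x) below age N; GPD hazard 1/(sigma + xi*(x-N))
    above N, where sigma = 1/(a*exp(b*N)) enforces continuity at the changeover."""
    x = np.asarray(x, dtype=float)
    sigma = 1.0 / (a * np.exp(b * N))
    gompertz = a * np.exp(b * x)
    gpd = 1.0 / (sigma + xi * (x - N))
    return np.where(x < N, gompertz, gpd)

ages = np.array([80.0, 95.0, 105.0, 110.0])
print(stlt_hazard(ages, a=4e-5, b=0.09, N=98.0, xi=-0.2))
```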

5.
Wind storm and hurricane risks are attracting increased attention as a result of recent catastrophic events. The aim of this paper is to select, tailor, and develop extreme value methods for use in wind storm insurance. The methods are applied to the 1982-2005 losses for the largest Swedish insurance company, the Länsförsäkringar group. Both a univariate and a new bivariate Generalized Pareto Distribution (GPD) gave models which fitted the data well. The bivariate model led to lower estimates of risk, except for extreme cases, but taking statistical uncertainty into account the two models led to qualitatively similar results. We believe that the bivariate model provided the most realistic picture of the real uncertainties. It additionally made it possible to explore the effects of changes in the insurance portfolio, and showed that loss distributions are rather insensitive to portfolio changes. We found a small trend in the sizes of small individual claims, but no other trends. Finally, we believe that companies should develop systematic ways of thinking about “not yet seen” disasters.

6.
In this paper, we continue the development of the ideas introduced in England and Verrall (2001) by suggesting the use of a reparameterized version of the generalized linear model (GLM) that is frequently used in stochastic claims reserving. This model enables us to smooth the origin, development and calendar year parameters in a way similar to what is often done in practice, while keeping the GLM structure. Specifically, we use this model structure to obtain reserve estimates and to systematize the model selection procedure that arises in the smoothing process. Moreover, we provide a bootstrap procedure to achieve a full predictive distribution.
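For orientation, a minimal sketch of the kind of GLM the paper builds on: an over-dispersed Poisson model with origin-year and development-year factors fitted to an incremental run-off triangle. The tiny triangle below is made up, and the paper's reparameterisation, smoothing and bootstrap are not reproduced here.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Made-up incremental triangle in long format: one row per (origin, development) cell.
tri = pd.DataFrame({
    "origin": [0, 0, 0, 1, 1, 2],
    "dev":    [0, 1, 2, 0, 1, 0],
    "incr":   [100.0, 60.0, 20.0, 110.0, 70.0, 120.0],
})

# Cross-classified over-dispersed Poisson GLM (quasi-Poisson via Pearson scale).
fit = smf.glm("incr ~ C(origin) + C(dev)", data=tri,
              family=sm.families.Poisson()).fit(scale="X2")

# Predict the unobserved future cells and sum them to get the reserve estimate.
future = pd.DataFrame({"origin": [1, 2, 2], "dev": [2, 1, 2]})
print(f"estimated reserve: {fit.predict(future).sum():.1f}")
```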

7.
A hybrid Pareto model for asymmetric fat-tailed data: the univariate case
Density estimators that can adapt to asymmetric heavy tails are required in many applications such as finance and insurance. Extreme value theory (EVT) has developed principled methods based on asymptotic results to estimate the tails of most distributions. However, the finite sample approximation might introduce a severe bias in many cases. Moreover, the full range of the distribution is often needed, not only the tail area. On the other hand, non-parametric methods, while being powerful where data are abundant, fail to extrapolate properly in the tail area. We put forward a non-parametric density estimator that brings together the strengths of non-parametric density estimation and of EVT. A hybrid Pareto distribution that can be used in a mixture model is proposed to extend the generalized Pareto (GP) to the whole real axis. Experiments on simulated data show the following. On one hand, the mixture of hybrid Paretos converges faster in terms of log-likelihood and provides good estimates of the tail of the distributions when compared with other density estimators including the GP distribution. On the other hand, the mixture of hybrid Paretos offers an alternate way to estimate the tail index which is comparable to the one estimated with the standard GP methodology. The mixture of hybrids is also evaluated on the Danish fire insurance data set.
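A simplified, hedged illustration of splicing a Gaussian body with a GP tail into a single continuous density on the whole real line. This is not the authors' hybrid Pareto (which also matches the first derivative and lets the junction point follow from the parameters); here only the density value is matched at a chosen junction u.

```python
import numpy as np
from scipy import stats

def spliced_pdf(x, mu, s, u, xi):
    """Continuous density: Gaussian below the junction u, GP tail above it."""
    x = np.asarray(x, dtype=float)
    tail_mass = stats.norm.sf(u, mu, s)            # probability carried by the GP tail
    beta = tail_mass / stats.norm.pdf(u, mu, s)    # GP scale forcing continuity at u
    body = stats.norm.pdf(x, mu, s)
    tail = tail_mass * stats.genpareto.pdf(x - u, xi, scale=beta)
    return np.where(x <= u, body, tail)

xs = np.linspace(-3.0, 10.0, 5)
print(spliced_pdf(xs, mu=0.0, s=1.0, u=1.5, xi=0.3))
```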

8.
In this paper the mathematical modeling of extremes under power normalization is developed. An estimate of the shape parameter within the generalized extreme value distribution under power normalization is suggested. The statistical inference about the upper tail of a distribution function by using power normalization is studied. Two models for the generalized Pareto distribution under power normalization (GPDP) are given. Estimates for the shape and scale parameters within these GPDPs are obtained. Finally, a simulation study illustrates and corroborates the theoretical results.

9.
The connection between extreme values and record-low values is exploited to derive simply the limiting joint distribution of the r largest order statistics. The use of this distribution in the modelling of corrosion phenomena is considered, and the extrapolation of maxima in space and time is described in this context. There has been recent emphasis on movement away from classical extreme value theory to more efficient estimation procedures. This shift is continued with the illustration of the extra precision of predicted maxima obtained from a model based on extreme order statistics over the classical extreme value approach.

10.
Backward recurrence times in stationary renewal processes and current durations in dynamic populations observed at a cross-section may yield estimates of underlying interarrival times or survival distributions under suitable stationarity assumptions. Regression models have been proposed for these situations, but accelerated failure time models have the particularly attractive feature that they are preserved when going from the backward recurrence times to the underlying survival distribution of interest. This simple fact has recently been noticed in a sociological context and is here illustrated by a study of current duration of time to pregnancy.

11.
Bounds for the convergence, uniformly over all Borel sets, of the largest order statistic as well as of the joint distribution of extremes are established, which reveal in which way these rates are determined by the distance of the underlying density from the density of the corresponding generalized Pareto distribution. The results are highlighted by several examples, among which there is a bound for the rate at which the joint distribution of the k largest order statistics from a normal distribution converges uniformly to its limit.

12.
In this paper, we derive the exact uniform convergence rate of the asymmetric normal distribution of the maximum and minimum to its extreme value limit.

13.
In the Wicksell corpuscle problem, the maximum size of random spheres in a volume part is to be predicted from the sectional circular distribution of spheres cut by a plane. The size of the spheres is assumed to follow the generalized gamma distribution. Some prediction methods according to measurement methods on the sectional plane are proposed, and their performances are evaluated by simulation. The prediction method based on the r largest sizes and the total number of the sectional circles is recommended, because of its satisfactory performance.
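A hedged sketch of the sampling mechanism underlying the Wicksell problem, as it might appear in such a simulation study: sphere radii follow a generalized gamma law, a sphere is cut with probability proportional to its radius, and the observed section radius is sqrt(R^2 - d^2) with d uniform on (0, R). All parameter values are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
R = stats.gengamma.rvs(a=2.0, c=1.5, scale=1.0, size=100_000, random_state=rng)

hit = rng.random(R.size) < R / R.max()         # size-biased selection by radius
Rhit = R[hit]
d = rng.uniform(0.0, Rhit)                     # distance of cutting plane from each centre
sections = np.sqrt(Rhit**2 - d**2)             # observed circle radii on the plane

print(np.sort(sections)[-5:])                  # the r largest sectional radii (r = 5 here)
```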

14.
This paper continues the authors' earlier work (1998, Ann. Inst. Statist. Math., 50, 361–377). In the Wicksell corpuscle problem, the maximum size of random spheres in a volume part is to be predicted from the sectional circular distribution of spheres cut by a plane. The size of the spheres is assumed to follow the three-parameter generalized gamma distribution. Prediction methods based on moment estimation are proposed and their performances are evaluated by simulation. For a practically probable case, one of these prediction methods is as good as a method previously proposed by the authors in which the two shape parameters are assumed to be known.

15.
The last few years have seen a significant increase in publicly available software specifically targeted to the analysis of extreme values. This reflects the increase in the use of extreme value methodology by the general statistical community. The software that is available for the analysis of extremes has evolved in essentially independent units, with most forming extensions of larger software environments. An inevitable consequence is that these units are spread about the statistical landscape. Scientists seeking to apply extreme value methods must spend considerable time and effort in determining whether the currently available software can be usefully applied to a given problem. We attempt to simplify this process by reviewing the current state, and suggest future approaches for software development. These suggestions aim to provide a basis for an initiative leading to the successful creation and distribution of a flexible and extensible set of tools for extreme value practitioners and researchers alike. In particular, we propose a collaborative framework for which cooperation between developers is of fundamental importance.

16.
We consider an extended version of a model proposed by Ledford and Tawn [Ledford, A.W., Tawn, J.A., 1997. Modelling dependence within joint tail regions. J. R. Stat. Soc. 59 (2), 475-499] for the joint tail distribution of a bivariate random vector, which essentially assumes an asymptotic power scaling law for the probability that both components of the vector are jointly large. After discussing how to fit the model, we devise a graphical tool that analyzes the differences between certain empirical probabilities and model-based estimates of the same probabilities. The asymptotic normality of these differences allows the construction of statistical tests for the model assumption. The results are applied to claims from a Danish fire insurance portfolio and to medical claims from US health insurers.
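For intuition, a hedged sketch of the coefficient of tail dependence eta that underlies the Ledford-Tawn model: transform the margins to approximately unit Fréchet via ranks, take the componentwise minimum, and apply a Hill-type estimator to its upper tail. The Gaussian example is only a sanity check (its theoretical value is eta = (1 + rho)/2).

```python
import numpy as np

def eta_hill(x, y, k=200):
    """Hill-type estimate of the Ledford-Tawn coefficient of tail dependence eta."""
    n = len(x)
    to_frechet = lambda v: -1.0 / np.log((np.argsort(np.argsort(v)) + 1.0) / (n + 1.0))
    t = np.minimum(to_frechet(x), to_frechet(y))   # structure variable for the joint tail
    t_sorted = np.sort(t)
    top, thresh = t_sorted[-k:], t_sorted[-k - 1]
    return np.mean(np.log(top) - np.log(thresh))   # eta near 1 suggests asymptotic dependence

rng = np.random.default_rng(3)
x, y = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=20_000).T
print(eta_hill(x, y))                              # theory for this example: eta = 0.8
```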

17.
A theorem of this paper proves that if the size distribution of random spheres is generalized gamma, its Wicksell transform and other related distributions belong to the domain of attraction of the Gumbel distribution. The theorem also gives the attraction coefficients of the distributions. The fatigue strength of high-strength steel is closely related to the maximum size of nonmetallic inclusions in the region of maximum stress of the steel. Murakami and others developed a method, making use of the Gumbel QQ-plot, for predicting the maximum size from the size distribution of inclusion circles in microscopic view-fields. Based on the Gumbel approximation of the maximum of Wicksell transforms, a modified and extended version of Murakami's method is justified, and its performance is evaluated by simulation.
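A hedged sketch of the Gumbel QQ-plot procedure that Murakami-type predictions build on: fit a straight line to the ordered view-field maxima against Gumbel reduced variates, then read off the size expected once in T inspection fields (T being the ratio of the target region to one field). The simulated maxima and return period are illustrative.

```python
import numpy as np

def gumbel_qq_predict(maxima, T):
    x = np.sort(np.asarray(maxima, dtype=float))
    n = len(x)
    p = np.arange(1, n + 1) / (n + 1.0)        # plotting positions
    y = -np.log(-np.log(p))                    # Gumbel reduced variates
    slope, intercept = np.polyfit(y, x, 1)     # straight line on the QQ-plot
    y_T = -np.log(-np.log(1.0 - 1.0 / T))      # reduced variate for the T-field maximum
    return intercept + slope * y_T

rng = np.random.default_rng(4)
field_maxima = rng.gumbel(loc=30.0, scale=5.0, size=40)   # e.g. max inclusion size per field
print(gumbel_qq_predict(field_maxima, T=1000))
```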

18.
In this paper, a new method for nonlinear system identification via an extreme learning machine neural-network-based Hammerstein model (ELM-Hammerstein) is proposed. The ELM-Hammerstein model consists of a static ELM neural network followed by a linear dynamic subsystem. The identification of the nonlinear system is achieved by determining the structure of the ELM-Hammerstein model and estimating its parameters. The Lipschitz quotient criterion is adopted to determine the structure of the ELM-Hammerstein model from input–output data. A generalized ELM algorithm is proposed to estimate the parameters of the ELM-Hammerstein model, where the parameters of the linear dynamic part and the output weights of the ELM neural network are estimated simultaneously. The proposed method can obtain more accurate identification results with lower computational complexity. Three simulation examples demonstrate its effectiveness.
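A hedged sketch of the ELM component alone: random, fixed hidden-layer weights and a least-squares solve for the output weights. The paper's full ELM-Hammerstein identification (Lipschitz-quotient structure selection and joint estimation with the linear dynamic block) is not shown.

```python
import numpy as np

def elm_fit(X, y, n_hidden=50, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights, kept fixed
    b = rng.normal(size=n_hidden)                 # random hidden biases, kept fixed
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                  # output weights via least squares
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(500, 1))
y = np.sin(3.0 * X[:, 0]) + 0.05 * rng.normal(size=500)   # toy static nonlinearity
W, b, beta = elm_fit(X, y)
print(np.mean((elm_predict(X, W, b, beta) - y) ** 2))      # in-sample MSE
```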

19.
20.
Optimal design of coastal or offshore structures requires the estimation of extreme quantiles of oceanographic data such as wave heights and wave periods. Since there are strong correlations between oceanographic variables, it is necessary to use multivariate models in order to capture their dependence. To achieve this, an approach based on copulas is proposed and compared to a model based on the physical behaviour of waves.
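A hedged sketch of one simple copula-based joint model of the kind described, using a Gaussian copula fitted from normal scores for brevity; the paper itself compares copula models with a physics-based wave model, and the variable names (hs for wave height, tp for wave period) and toy data are illustrative.

```python
import numpy as np
from scipy import stats

def fit_and_simulate(hs, tp, n_sim=10_000, seed=7):
    rng = np.random.default_rng(seed)
    n = len(hs)
    # 1. Pseudo-observations (empirical margins), mapped to normal scores.
    u = (np.argsort(np.argsort(hs)) + 1.0) / (n + 1.0)
    v = (np.argsort(np.argsort(tp)) + 1.0) / (n + 1.0)
    rho = np.corrcoef(stats.norm.ppf(u), stats.norm.ppf(v))[0, 1]
    # 2. Simulate from the Gaussian copula, map back through empirical quantiles.
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n_sim)
    uu = stats.norm.cdf(z)
    return np.quantile(hs, uu[:, 0]), np.quantile(tp, uu[:, 1])

rng = np.random.default_rng(8)
hs = stats.weibull_min.rvs(1.5, scale=2.0, size=2000, random_state=rng)   # toy wave heights
tp = 4.0 + 2.0 * np.sqrt(hs) + rng.normal(0.0, 0.5, size=2000)            # toy wave periods
hs_sim, tp_sim = fit_and_simulate(hs, tp)
print(np.quantile(hs_sim, 0.999), np.quantile(tp_sim, 0.999))
```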

