Similar Documents
20 similar documents found (search time: 203 ms)
1.
Extreme value theory studies low-probability, high-impact extreme events. Compound extreme value distributions are now widely used in hydrology, meteorology, seismology, insurance, and finance. Taking the extreme value types theorem and the PBDH theorem as its theoretical basis, this paper constructs a binomial-generalized Pareto compound extreme value distribution model, derives parameter estimators for the compound model using the method of probability-weighted moments, and obtains critical values of the Kolmogorov-Smirnov (KS) test statistic by computer simulation.
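The KS critical values can be obtained exactly as the abstract describes: simulate many samples from the hypothesized model and take a high quantile of the resulting KS statistics. A minimal sketch (the GPD parameters and sizes are our illustrative choices, and we test a fully specified GPD rather than the fitted compound model):

```python
import math
import random

def gpd_sample(xi, sigma, rng):
    # Inverse-transform sampling from a generalized Pareto distribution.
    u = rng.random()
    if abs(xi) < 1e-12:
        return -sigma * math.log(1.0 - u)
    return sigma * ((1.0 - u) ** (-xi) - 1.0) / xi

def ks_statistic(sample, cdf):
    # Two-sided Kolmogorov-Smirnov distance between empirical CDF and cdf.
    xs = sorted(sample)
    n = len(xs)
    return max(max((i + 1) / n - cdf(x), cdf(x) - i / n)
               for i, x in enumerate(xs))

def simulate_ks_critical(n, cdf, sampler, alpha=0.05, reps=2000, seed=0):
    # Monte Carlo critical value: the (1 - alpha) quantile of the KS
    # statistic under the hypothesized model, for sample size n.
    rng = random.Random(seed)
    stats = sorted(ks_statistic([sampler(rng) for _ in range(n)], cdf)
                   for _ in range(reps))
    return stats[int((1 - alpha) * reps)]

xi, sigma = 0.1, 1.0
gpd_cdf = lambda x: 1.0 - (1.0 + xi * x / sigma) ** (-1.0 / xi)
crit = simulate_ks_critical(50, gpd_cdf, lambda r: gpd_sample(xi, sigma, r))
```

For n = 50 the simulated 5% critical value lands near the classical asymptotic value 1.358/√50 ≈ 0.19.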

2.
To address the inadequate treatment of human error in current probabilistic system safety assessment, this paper focuses on probabilistic safety assessment of flight control systems with human factors taken into account. For the specific domain of pilot handling errors, a CREAM-based method for quantifying pilot error probability is proposed. On this basis, a human-machine probabilistic safety assessment of the flight control system is carried out through dynamic fault tree (DFT) modeling and Monte Carlo simulation. Finally, a worked example verifies the effectiveness and practical value of the method, offering a first solution to the problem of human-machine probabilistic safety assessment.
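The Monte Carlo step of such an assessment can be sketched with a toy fault tree (the gate structure and all probabilities below are illustrative, not the paper's model):

```python
import random

def system_failure_prob(p_pilot, p_sensor, p_actuator, trials=200_000, seed=1):
    # Crude Monte Carlo of a toy fault tree: the top event occurs if the
    # actuator fails, OR the pilot mishandles while a sensor is faulty.
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        pilot = rng.random() < p_pilot      # CREAM-style error probability
        sensor = rng.random() < p_sensor
        actuator = rng.random() < p_actuator
        if actuator or (pilot and sensor):
            failures += 1
    return failures / trials

p = system_failure_prob(0.01, 0.05, 0.001)
# Analytic check: p_a + p_p*p_s - p_a*p_p*p_s ≈ 0.0015
```

A dynamic fault tree adds time-ordered gates, but the estimation principle (sample basic events, evaluate the top event, average) is the same.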

3.
Motivated by the incomplete information and missing data encountered in practice, the principle of information diffusion is used to process and analyze two-dimensional small samples, and the approach is applied to a real project. Both an optimized uniform information diffusion method and a non-uniform method are explored on two-dimensional small samples; the results agree with conclusions drawn from large samples. The effect of the number of sample points on the diffusion results is also derived theoretically. The study shows that a small sample size does not substantially affect the prediction results.
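The core of information diffusion is to spread each scarce observation over a grid of monitoring points instead of treating it as a single count. A one-dimensional sketch (the bandwidth and grid are our illustrative choices; in practice the bandwidth comes from an empirical formula):

```python
import math

def normal_diffusion(samples, grid, h):
    # Normal information diffusion: each observation is spread over the
    # monitoring grid with a Gaussian kernel of width h and normalized,
    # so every observation contributes total weight 1.
    weights = [0.0] * len(grid)
    for x in samples:
        kernel = [math.exp(-(u - x) ** 2 / (2.0 * h * h)) for u in grid]
        total = sum(kernel)
        for i, k in enumerate(kernel):
            weights[i] += k / total
    return weights

samples = [2.1, 3.4, 3.9, 6.2]               # a small sample
grid = [i * 0.5 for i in range(21)]           # monitoring points 0.0 .. 10.0
freq = normal_diffusion(samples, grid, h=0.8)
prob = [w / len(samples) for w in freq]       # diffusion probability estimate
```

The two-dimensional version of the paper replaces the scalar kernel with a product of kernels over a grid of point pairs.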

4.
"The probability of mutually independent events occurring simultaneously" is part of the compulsory senior high school mathematics curriculum, but our teaching surveys found that many teachers still misunderstand the concept of event independence. We analyze this below. 1. The concept. What is independence of events? The textbook definition is: two events are mutually independent if whether event A (or B) occurs has no effect on the probability of event B (or A) occurring. Note that the definition says not "no effect on whether B (or A) occurs" but "no effect on the probability of B (or A) occurring". Many readers fail to pay attention to the word "probability". In particular, when judging two concrete events, intuition is often used, which makes it easy to overlook "probability". In fact, the word "probability" in this…
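The distinction the passage draws can be checked by enumeration. In this example (ours, not from the article), event A visibly restricts which outcomes of B remain possible, yet the probabilities still factor:

```python
from itertools import product

# All 36 equally likely outcomes of rolling two dice.
outcomes = list(product(range(1, 7), repeat=2))
p = lambda event: sum(1 for o in outcomes if event(o)) / len(outcomes)

A = lambda o: o[0] == 6            # first die shows 6
B = lambda o: o[0] + o[1] == 7     # the sum is 7

pA, pB = p(A), p(B)
pAB = p(lambda o: A(o) and B(o))
# pAB = 1/36 = pA * pB, so A and B are independent: independence is a
# statement about probabilities, not about physical influence.
```

This is exactly the textbook's point: "no effect on the probability" is what P(AB) = P(A)P(B) formalizes.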

5.
To address the difficulty of test-based evaluation of the damage inflicted by anti-radiation warhead preformed fragments on phased-array antenna targets, this paper studies how simulation data can reasonably be used for fragment damage effectiveness evaluation under small-sample conditions. A small amount of test data is first obtained in the laboratory; a large amount of additional data is then generated by numerical simulation under the test conditions. The test and simulation data are fused, and the credibility of the data is judged with an approximate credibility measure based on compatibility probability. Finally, building on the principles of Bayesian small-sample statistical inference, a credibility-weighted Bayes estimator is used to evaluate the test data with improved precision.
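The fusion idea can be illustrated with a conjugate update in which simulation data are discounted by their credibility (a generic sketch under our own assumptions, not the paper's exact formulas):

```python
def fused_reliability(k_test, n_test, k_sim, n_sim, credibility):
    # Credibility-weighted Beta-Binomial fusion: simulation successes and
    # failures enter the conjugate update discounted by `credibility` in
    # [0, 1], while the scarce live test data enter at full weight.
    # Returns the posterior mean under a uniform Beta(1, 1) prior.
    a = 1.0 + k_test + credibility * k_sim
    b = 1.0 + (n_test - k_test) + credibility * (n_sim - k_sim)
    return a / (a + b)

# 4 of 5 live tests succeeded; 90 of 100 simulations succeeded and were
# judged 60% credible (all numbers illustrative).
estimate = fused_reliability(4, 5, 90, 100, 0.6)
```

With credibility 0 the estimate reduces to the test-only posterior mean; with credibility 1 the simulation data count as real observations.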

6.
This review introduces two recent advances in stochastic simulation: constructing efficient algorithms for estimating rare-event probabilities, and generating samples from stationary distributions with no closed form. Estimating a very small quantity demands extreme accuracy before a useful confidence interval can be formed, which makes slowly convergent rare-event simulation challenging in both efficiency and accuracy. We present interesting examples of rare events and the difficulties in estimating them, then discuss and evaluate, in order of their development, various approaches to robust and efficient estimators; numerical experiments on estimating ruin probabilities illustrate the quality of these methods. In steady-state simulation, how to generate samples of a stationary stochastic process has long been a key question. The common practice is to discard the data collected during an initial transient period, but how long this warm-up must be is another question with no satisfactory answer. Fortunately, developments over the past two decades have made exact simulation possible for certain stochastic models; we introduce two important methods and their applications.
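A standard route to efficient rare-event estimators is importance sampling with an exponential change of measure. A minimal sketch for P(X > a) with X exponential, a toy stand-in for the ruin probabilities mentioned above (the tilt parameter is our choice):

```python
import math
import random

def tail_prob_is(a, lam=1.0, theta=0.5, n=200_000, seed=0):
    # Importance sampling for P(X > a), X ~ Exp(lam): draw from the
    # exponentially tilted density Exp(lam - theta), which hits the rare
    # set far more often, and correct with the likelihood ratio
    # f(x)/g(x) = (lam / (lam - theta)) * exp(-theta * x).
    rng = random.Random(seed)
    lam_t = lam - theta
    total = 0.0
    for _ in range(n):
        x = rng.expovariate(lam_t)
        if x > a:
            total += (lam / lam_t) * math.exp(-theta * x)
    return total / n

est = tail_prob_is(10.0)   # exact value is exp(-10) ≈ 4.54e-5
```

A naive estimator would see on average only about 9 hits in 200,000 draws; the tilted sampler sees thousands, at a far smaller relative error.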

7.
Because classical statistical methods cannot be applied directly when ammunition test data are scarce, this paper reviews the traditional approach to storage reliability assessment and the existing bootstrap sample-expansion method, and proposes an improved bootstrap expansion algorithm that addresses their shortcomings: it increases the information in the bootstrap sample, keeps the resampled data consistent with the characteristics of the original sample, and reduces the influence of outliers. The algorithm is verified by Matlab simulation. The results show that it performs storage reliability assessment effectively, with clear advantages over the existing methods. The conclusion applies to small-sample storage reliability assessment of ammunition and provides theoretical support for ammunition storage management.
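The general idea of bootstrap sample expansion can be sketched with a smoothed bootstrap (a generic illustration, not the paper's improved algorithm): resampled values are jittered so replicas are not restricted to the original points.

```python
import math
import random
import statistics

def expanded_bootstrap_ci(data, n_boot=4000, seed=0):
    # Smoothed bootstrap: resample with replacement, then add small
    # Gaussian noise to each draw. Returns a 95% percentile interval
    # for the mean.
    rng = random.Random(seed)
    h = statistics.stdev(data) / math.sqrt(len(data))   # shrinking bandwidth
    means = sorted(
        statistics.fmean(rng.choice(data) + rng.gauss(0.0, h) for _ in data)
        for _ in range(n_boot)
    )
    return means[int(0.025 * n_boot)], means[int(0.975 * n_boot)]

# Eight illustrative storage-test measurements.
lo, hi = expanded_bootstrap_ci([9.1, 10.4, 8.7, 11.2, 10.0, 9.6, 10.9, 8.9])
```

The paper's improvement additionally constrains the expanded sample to match the original sample's characteristics and downweights outliers.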

8.
Based on the generalized extreme value distribution model of extreme value theory, an empirical study of the extremes of the daily returns of the Shanghai Composite Index is carried out. The probabilities and waiting times of the extremes observed over the past two years are given, providing a quantitative basis for risk measurement.
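The "waiting time" of an extreme follows directly from its exceedance probability: if a daily return falls below a threshold with probability p, the mean waiting time between such days is 1/p trading days. A toy empirical version (data invented for illustration):

```python
def exceedance_stats(returns, threshold):
    # Empirical probability that a daily return is at or below `threshold`,
    # and the implied mean waiting time in trading days (1/p).
    hits = sum(1 for r in returns if r <= threshold)
    p = hits / len(returns)
    wait = float("inf") if hits == 0 else 1.0 / p
    return p, wait

# Toy data: 100 daily returns, 4 of which are at or below -5%.
returns = [-0.06, -0.055, -0.052, -0.05] + [0.001] * 96
p, wait = exceedance_stats(returns, -0.05)   # p = 0.04, wait = 25 days
```

The GEV model in the paper replaces this raw empirical frequency with a fitted tail, which extrapolates to levels rarely or never observed.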

9.
Threshold selection for estimating bridge strain extremes under vehicle loads   (cited by: 1; self-citations: 1; citations by others: 0)
When the peaks-over-threshold method is used to estimate bridge strain extremes under vehicle loads, choosing a reasonable threshold is critical: too high a threshold leaves too little information, while too low a threshold biases the parameter estimates of the generalized Pareto distribution model. Common threshold selection methods do not work well for strain extremes under vehicle loads. Based on one year of strain data from the Taiping Lake Bridge under vehicle loads, Monte Carlo samples are drawn from three well-fitting mixture distributions, the extreme value estimates of generalized Pareto models with different thresholds are compared on the same sample, and an empirical threshold selection method is proposed. Compared with common threshold selection methods, the weekly strain extreme distribution estimated with the proposed threshold is closer to the measured results.
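A common diagnostic behind threshold choice is the mean excess function, which is linear in the threshold when the excesses follow a generalized Pareto distribution. A minimal sketch (the strain values are invented for illustration):

```python
def mean_excess(data, u):
    # Mean excess e(u): average of (x - u) over observations above u.
    # For a generalized Pareto tail e(u) is linear in u, so a roughly
    # linear mean-residual-life plot above a candidate threshold supports
    # that threshold for the peaks-over-threshold model.
    exc = [x - u for x in data if x > u]
    return sum(exc) / len(exc) if exc else float("nan")

strains = [12.0, 15.5, 18.0, 22.5, 30.0, 41.0]   # illustrative strain peaks
diagnostics = [(u, mean_excess(strains, u)) for u in (10.0, 15.0, 20.0)]
```

The paper's empirical method goes further, comparing GPD extreme value estimates across thresholds on common Monte Carlo samples rather than relying on this plot alone.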

10.
A special class of distributions within the subexponential family is studied. Under the assumption that annual claim amounts follow this special distribution, an asymptotic expression for the ultimate ruin probability is derived, and a stochastic simulation method is proposed for computing ruin probabilities under subexponential claims. The approach is practical to implement and provides the insurance industry with a theoretical basis and a testing method for handling extreme events.
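A crude simulation of ruin under heavy-tailed claims looks as follows (an illustrative compound Poisson model with Pareto claims and parameters of our choosing; the paper derives asymptotics rather than simulating this specific model):

```python
import random

def pareto_claim(rng, alpha=1.5):
    # Pareto(alpha) claim on [1, inf): a subexponential (heavy-tailed) size.
    return (1.0 - rng.random()) ** (-1.0 / alpha)

def ruin_prob_mc(u0, premium_rate, lam, horizon, n_paths=2000, seed=0):
    # Finite-horizon ruin for a compound Poisson surplus process:
    # ruin occurs if u0 + c*t - S(t) < 0 at some claim epoch.
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, claims = 0.0, 0.0
        while True:
            t += rng.expovariate(lam)          # next claim arrival
            if t > horizon:
                break
            claims += pareto_claim(rng)
            if u0 + premium_rate * t - claims < 0.0:
                ruined += 1
                break
    return ruined / n_paths

psi = ruin_prob_mc(u0=5.0, premium_rate=4.0, lam=1.0, horizon=50.0)
```

Because heavy-tailed ruin is typically caused by one huge claim, such crude simulation degrades for large initial capital, which is precisely where the paper's asymptotic expression takes over.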

11.
Estimating financial risk is a critical issue for banks and insurance companies. Recently, quantile estimation based on extreme value theory (EVT) has found a successful domain of application in this context, outperforming other methods. Given a parametric model provided by EVT, a natural approach is maximum likelihood estimation. Although the resulting estimator is asymptotically efficient, the number of observations available to estimate the parameters of EVT models is often too small to make the large-sample properties trustworthy. In this paper, we study a new estimator of the parameters, the maximum Lq-likelihood estimator (MLqE), introduced by Ferrari and Yang (Estimation of tail probability via the maximum Lq-likelihood method, Technical Report 659, School of Statistics, University of Minnesota, 2007). The MLqE is characterized by a distortion parameter q and extends the traditional log-likelihood maximization procedure: when q → 1, the new estimator approaches the traditional maximum likelihood estimator (MLE), recovering its desirable asymptotic properties; when q ≠ 1 and the sample size is moderate or small, the MLqE successfully trades bias for variance, resulting in an overall gain in accuracy (mean squared error). We show that the MLqE outperforms the standard MLE when estimating tail probabilities and quantiles of the generalized extreme value (GEV) and generalized Pareto (GP) distributions. First, we assess the relative efficiency of the MLqE and the MLE for various sample sizes, using Monte Carlo simulations. Second, we analyze the performance of the MLqE for extreme quantile estimation using real-world financial data.
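The Lq-likelihood replaces each log-density term log f(x) with (f(x)^(1-q) - 1)/(1 - q). A one-dimensional sketch for an exponential rate, fit by grid search (the paper applies MLqE to GEV/GP distributions; the exponential keeps the example small, and the data and grid are ours):

```python
import math

def lq(u, q):
    # Lq-transformed likelihood term; recovers log(u) as q -> 1.
    if abs(q - 1.0) < 1e-9:
        return math.log(u)
    return (u ** (1.0 - q) - 1.0) / (1.0 - q)

def mlq_exponential_rate(data, q, grid):
    # Maximum Lq-likelihood estimate of an exponential rate lam, where the
    # density is lam * exp(-lam * x), found by grid search.
    def objective(lam):
        return sum(lq(lam * math.exp(-lam * x), q) for x in data)
    return max(grid, key=objective)

grid = [i / 100.0 for i in range(1, 301)]
data = [0.2, 0.7, 1.1, 1.9, 3.4, 5.2]
mle_like = mlq_exponential_rate(data, q=1.0, grid=grid)   # ≈ 1 / mean(data)
robust = mlq_exponential_rate(data, q=0.9, grid=grid)     # q < 1 downweights tails
```

At q = 1 the grid search recovers the ordinary MLE (the reciprocal of the sample mean); q below 1 distorts the objective toward the bias-variance trade the abstract describes.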

12.
Deep learning (DL) is combined with extreme value theory (EVT) to predict peak loads observed in energy grids. Forecasting energy loads and prices is challenging due to sharp peaks and troughs that arise from supply and demand fluctuations under intraday system constraints. We propose a deep temporal extreme value model to capture these effects, which predicts the tail behavior of load spikes. Deep long short-term memory architectures with rectified linear unit activation functions capture trends and temporal dependencies, while EVT captures highly volatile load spikes above a prespecified threshold. To illustrate our methodology, we develop forecasting models for hourly price and demand from the PJM interconnection. The goal is to show that DL-EVT outperforms traditional methods, both in- and out-of-sample, by capturing the observed nonlinearities in prices and demand spikes. Finally, we conclude with directions for future research.
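The division of labor in such a hybrid can be shown with the data-splitting step alone (toy values, our own helper; the paper's LSTM and GPD fitting are omitted):

```python
def peaks_over_threshold(series, u):
    # Split a load/price series into the "body" (capped at threshold u),
    # which a sequence model such as an LSTM would fit, and the excesses
    # above u, which the EVT (generalized Pareto) component would model.
    excesses = [(i, x - u) for i, x in enumerate(series) if x > u]
    body = [min(x, u) for x in series]
    return body, excesses

hourly_load = [52.0, 55.0, 93.0, 61.0, 88.0, 54.0]   # toy demand values
body, spikes = peaks_over_threshold(hourly_load, u=80.0)
# spikes -> [(2, 13.0), (4, 8.0)]
```

The prespecified threshold u plays the same role as in classical peaks-over-threshold analysis: everything above it is treated as tail data.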

13.
This review article introduces two recent advances in stochastic simulation: the construction of efficient algorithms for estimating rare events, and the generation of samples from a stationary distribution that has no closed form. Estimating a very small quantity requires extreme accuracy to form a useful confidence interval, which makes slowly convergent rare-event simulation a challenging task in terms of both efficiency and accuracy. In this review, we introduce examples of rare events of interest and the difficulties in estimating them. Various approaches for pursuing robust and efficient estimators are then discussed and evaluated in the order of their development, and numerical experiments on estimating ruin probability show the quality of these approaches. In steady-state simulation, how to generate samples from a stationary stochastic process has long been a key subject. The common practice is to discard the data gathered during the initial transient period; however, how long the warm-up period must be raises another problem with no satisfactory answer. Fortunately, owing to developments over the past two decades, exact simulation has become possible for certain stochastic models. We introduce two important methods and their related applications.

14.
Extreme floods cause enormous losses, and extreme flood insurance is an effective means of spreading extreme flood risk. An extreme flood insurance scheme based on cooperation among the government, the market, and the public suits China's conditions. Under this scheme, a stochastic optimization model of the risk portfolio of the insurer and the insured regions, with effective government participation, is built to guarantee effective supply of and demand for extreme flood insurance and to provide a theoretical basis for setting premium rates reasonably. The stochastic optimization model fully accounts for the insurer's ruin probability, its operational stability, and the insured regions' capacity for post-disaster recovery. Finally, a convergence theorem for the model is given.

15.
Michael Falk, Extremes, 2008, 11(1): 55-80
Since the publication of his masterpiece on regular variation and its application to the weak convergence of (univariate) sample extremes in 1970, Laurens de Haan (Thesis, Mathematical Centre Tract vol. 32, University of Amsterdam, 1970) is among the leading mathematicians in the world, with a particular focus on extreme value theory (EVT). On the occasion of his 70th birthday it is a great pleasure and a privilege to follow his route through multivariate EVT, which started only seven years later in 1977, when Laurens de Haan published his first paper on multivariate EVT, jointly with Sid Resnick.

16.
The stability of the stochastic energy supply-demand system of western Jiangsu Province is studied. Based mainly on the singular boundary theory of one-dimensional diffusion processes, the perturbation method is applied to study the stochastic bifurcation behavior of the system. The results show that random factors and the choice of parameters can cause the system to bifurcate, qualitatively changing its stability. The probability of bifurcation can therefore be reduced by tuning the parameters, keeping the system in stable development.

17.
Sufficient conditions are presented for almost surely asymptotic stability, with a certain decay function, of the sample paths given by mild solutions to a class of semilinear stochastic evolution equations. The analysis is based on introducing an approximating system with a strong solution and using a limiting argument to carry properties of the strong solution over to the mild solution. Several examples are studied to illustrate the theory. In particular, by means of the derived results we relax the conditions of certain stochastic evolution systems from Haussmann (1978) to obtain pathwise stability for the mild solution with probability one.

18.
This paper employs a multivariate extreme value theory (EVT) approach to study the limit distribution of the loss of a general credit portfolio with low default probabilities. A latent variable model is employed to quantify the credit portfolio loss, where both heavy tails and tail dependence of the latent variables are realized via a multivariate regular variation (MRV) structure. An approximation formula to implement our main result numerically is obtained. Intensive simulation experiments are conducted, showing that this approximation formula is accurate for relatively small default probabilities, and that our approach is superior to a copula-based approach in reducing model risk.

19.
An analysis is presented of the hydrological risk associated with decisions based on stochastic flood models. The maxima of a stream flow are described by a marked Poisson process with a cyclic trend and exponentially distributed marks. Typical design criteria, such as the expected largest exceedance of a fixed level in a given period, are derived from the extreme value process. The approach adopted is based on the whole record of flood data, which consists of the number, the occurrence times, and the exceedances of the maxima in the observation period. Thus, compared to the series of largest annual exceedances, more information is extracted, yielding an improvement in the evaluation of risk.
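For a marked Poisson process with rate λ and exponential marks (the cyclic trend omitted for brevity, and all parameters our own), Poisson thinning gives the distribution of the largest exceedance in [0, T] in closed form, which a simulation can verify:

```python
import math
import random

def poisson_sample(mean, rng):
    # Knuth's method for a Poisson variate (fine for small means).
    limit, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def largest_exceedance(lam, mu, T, rng):
    # Poisson(lam*T) floods in [0, T], each carrying an Exp(mean mu)
    # exceedance mark; return the largest mark (0 if no flood).
    n = poisson_sample(lam * T, rng)
    return max((rng.expovariate(1.0 / mu) for _ in range(n)), default=0.0)

# By thinning, marks above x form a Poisson process with rate lam*exp(-x/mu),
# so P(max <= x) = exp(-lam * T * exp(-x / mu)) -- a Gumbel law.
rng = random.Random(0)
sims = [largest_exceedance(lam=2.0, mu=1.0, T=5.0, rng=rng) for _ in range(20_000)]
empirical = sum(1 for m in sims if m <= 3.0) / len(sims)
theoretical = math.exp(-2.0 * 5.0 * math.exp(-3.0))   # ≈ 0.61
```

Design criteria such as the expected largest exceedance then follow from this Gumbel-type distribution.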

20.
We propose a new model – we call it a smoothed threshold life table (STLT) model – to generate life tables incorporating information on advanced ages. Our method allows a smooth mortality transition from non-extreme to extreme ages, and provides objectively determined highest attained ages with which to close the life table. We proceed by modifying the threshold life table (TLT) model developed by Li et al. (2008). In the TLT model, extreme value theory (EVT) is used to make optimal use of the relatively small number of observations at high ages, while the traditional Gompertz distribution is assumed for earlier ages. Our novel contribution is to constrain the hazard function of the two-part lifetime distribution to be continuous at the changeover point between the Gompertz and EVT models. This simple but far-reaching modification not only guarantees a smooth transition from non-extreme to extreme ages, but also provides a better and more robust fit than the TLT model when applied to a high-quality Netherlands dataset. We show that the STLT model also compares favourably with other existing methods, including the Gompertz–Makeham model, logistic models, the Heligman–Pollard model and the Coale–Kisker method, and that a further generalisation, a time-dependent dynamic smooth threshold life table (DSTLT) model, generally has superior in-sample fitting as well as better out-of-sample forecasting performance, compared, for example, with the Cairns et al. (2006) model.
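The continuity constraint has a simple consequence: once the Gompertz hazard and the changeover age are fixed, the GPD scale is pinned down. A sketch under our own parameterization (hazard a·exp(b·x); the paper's exact parameterization may differ):

```python
import math

def gpd_scale_for_continuity(a, b, u):
    # Gompertz hazard h(x) = a * exp(b * x); a generalized Pareto tail
    # above the changeover age u has hazard 1 / (sigma + xi * (x - u)),
    # which equals 1/sigma at x = u. Matching the two hazards at u gives
    # sigma = 1 / (a * exp(b * u)).
    return 1.0 / (a * math.exp(b * u))

# Illustrative parameters: hazard 1e-5 * exp(0.1 * x), changeover at age 95.
sigma = gpd_scale_for_continuity(1e-5, 0.1, 95.0)
```

This is why the constrained model has one fewer free tail parameter than the original TLT model while guaranteeing a smooth hazard.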
