Similar Articles
20 similar articles found
1.
A completely dependent risk process with perturbation and phase-type distributed claim sizes is analyzed. Claim arrivals are modeled by a Markovian arrival process. Using a vector-valued martingale, the Laplace transform of the time to ruin is derived algorithmically. The conditional memoryless property of the phase-type distribution yields the distribution of the deficit at ruin as a corollary.

2.
It is proved that generalized excursion measures can be constructed via time change of Itô’s Brownian excursion measure. A tightness-like condition on strings is introduced to prove a convergence theorem of generalized excursion measures. The convergence theorem is applied to obtain a conditional limit theorem, a kind of invariance principle where the limit is the Bessel meander.

3.
We study the mean sojourn times in two M/G/1 weighted round-robin systems: in the first system the weight of a customer at any given point in time is a function of its age (imparted service), while in the second system the weight is a function of the customer’s remaining processing time (RPT). We provide a sufficient condition under which the sojourn time of a customer that arrives in steady state with a large service requirement (say, x) is close to that of a customer that starts a busy period with the same service requirement. A sufficient condition is then provided for continuity of the performance metric (the mean sojourn time) as the quantum size in the discrete-time system converges to 0. We then consider a multi-class system and provide a relative ordering of the mean sojourn times among the various classes.
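As a rough illustration of the first (age-weighted) system, the discrete-time sketch below simulates an M/M/1 weighted round-robin queue in which a customer's per-turn service grant grows with its attained service. The weight function, exponential service times, and all parameter values are arbitrary choices of mine, not the model analysed in the paper.

```python
import numpy as np
from collections import deque

def simulate_wrr(weight_fn, lam=0.5, mean_service=1.0, quantum=0.05, horizon=5000.0, seed=0):
    """Toy discrete-time weighted round-robin M/M/1 simulation: on its turn,
    a customer receives quantum * weight_fn(attained service) units of service."""
    rng = np.random.default_rng(seed)
    t, next_arrival = 0.0, rng.exponential(1.0 / lam)
    queue = deque()                    # entries: [arrival time, remaining work, attained service]
    sojourns = []
    while t < horizon:
        while next_arrival <= t:       # admit all arrivals up to the current time
            queue.append([next_arrival, rng.exponential(mean_service), 0.0])
            next_arrival += rng.exponential(1.0 / lam)
        if not queue:                  # system idle: jump to the next arrival
            t = next_arrival
            continue
        cust = queue.popleft()
        served = min(quantum * weight_fn(cust[2]), cust[1])
        cust[1] -= served
        cust[2] += served
        t += served
        if cust[1] > 1e-12:
            queue.append(cust)         # rejoin the back of the round-robin cycle
        else:
            sojourns.append(t - cust[0])
    return float(np.mean(sojourns))

print("age-weighted mean sojourn   :", simulate_wrr(lambda age: 1.0 + age))
print("constant-weight mean sojourn:", simulate_wrr(lambda age: 1.0))
```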

4.
The statements in the title are explained and proved, as a little exercise in elementary normed vector space theory at the level of Chap. 5 of Dieudonné’s Foundations of Modern Analysis. A connection to recent moment bounds for submartingales is sketched.

5.
In this paper, two new tests for heteroscedasticity in nonparametric regression are presented and compared. The first test consists in first estimating the unknown conditional variance function nonparametrically and then using a classical least-squares test for a general linear model to test whether this function is constant. The second test is based on an overall distance between a nonparametric estimator of the conditional variance function and a parametric estimator of the variance of the model under the assumption of homoscedasticity. A bootstrap algorithm is used to approximate the distribution of this test statistic. Extensions of both procedures in two directions are also described: first, to the context of dependent data, and second, to testing whether the variance function is a polynomial of a given degree. A broad simulation study is carried out to illustrate the finite-sample performance of both tests when the observations are independent and when they are dependent.
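A toy version of the second test can be sketched as follows: estimate the conditional variance function by smoothing squared residuals, measure its distance to the constant-variance fit, and calibrate the statistic with a bootstrap that regenerates data under homoscedasticity. The Gaussian kernel, the fixed bandwidth, and drawing bootstrap errors as normals (rather than resampling residuals) are simplifications of mine.

```python
import numpy as np

def nw_smooth(x_grid, x, y, h):
    """Nadaraya-Watson estimator with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x_grid[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def hetero_stat(x, y, h):
    """Distance between a nonparametric variance estimate and the constant-variance fit."""
    m_hat = nw_smooth(x, x, y, h)
    e2 = (y - m_hat) ** 2
    var_hat = nw_smooth(x, x, e2, h)          # nonparametric conditional variance estimate
    return np.mean((var_hat - e2.mean()) ** 2)

def bootstrap_pvalue(x, y, h=0.1, n_boot=200, seed=0):
    rng = np.random.default_rng(seed)
    m_hat = nw_smooth(x, x, y, h)
    sigma = (y - m_hat).std(ddof=1)
    t_obs = hetero_stat(x, y, h)
    t_boot = np.empty(n_boot)
    for b in range(n_boot):
        # Regenerate data with constant error variance: this imposes the null hypothesis.
        y_star = m_hat + sigma * rng.standard_normal(len(y))
        t_boot[b] = hetero_stat(x, y_star, h)
    return (1 + np.sum(t_boot >= t_obs)) / (n_boot + 1)

rng = np.random.default_rng(4)
x = rng.uniform(0, 1, 200)
y = np.sin(2 * np.pi * x) + (0.2 + 0.8 * x) * rng.standard_normal(200)  # heteroscedastic errors
print(bootstrap_pvalue(x, y))
```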

6.
In this paper, an extension of the standard newsboy problem is presented, involving an extraordinary order and a variable mixture of backorders and lost sales. The backlogged demand ratio is given by a nonincreasing function of the quantity of shortage. Some general properties for the expected cost are derived under weak assumptions about the backorder rate function. When the backorder rate is a linear function, some sufficient conditions for the global convexity of the expected cost are derived. A sufficient condition for the local concavity of this function is also provided. Numerical examples are presented to illustrate the theoretical results and a specific practical case is proposed and solved. Moreover, a sensitivity analysis of the optimal solution with respect to the parameters of the backorder rate function is included. Finally, some extensions of the proposed model are suggested as possible directions for future research.
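The role of the backorder rate function can be illustrated numerically: with a linear, nonincreasing backlogged-demand ratio, the expected single-period cost can be estimated by Monte Carlo and minimized over a grid of order quantities. The demand distribution, the cost components, and the parameters b0, b1 below are illustrative assumptions of mine, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(3)
demand = rng.gamma(shape=4.0, scale=25.0, size=100_000)  # illustrative demand distribution

h, c_b, c_l = 1.0, 4.0, 9.0   # holding, backorder and lost-sale unit costs (assumed)
b0, b1 = 0.9, 0.004            # linear, nonincreasing backorder rate beta(s) = max(0, b0 - b1*s)

def expected_cost(q):
    """Monte Carlo estimate of the expected single-period cost for order quantity q."""
    shortage = np.maximum(demand - q, 0.0)
    leftover = np.maximum(q - demand, 0.0)
    beta = np.clip(b0 - b1 * shortage, 0.0, 1.0)   # fraction of the shortage that is backordered
    return np.mean(h * leftover + c_b * beta * shortage + c_l * (1.0 - beta) * shortage)

grid = np.arange(0, 301)
costs = np.array([expected_cost(q) for q in grid])
print("optimal Q ~", grid[costs.argmin()], "  expected cost ~", round(costs.min(), 2))
```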

7.
The mathematical literature contains several quite different ways to determine the area of a parabolic segment. In this article they are brought together, compared and varied. Finally, two new proofs are added to the collection. Each proof is illustrated by an expressive figure, and a colorful compilation of these figures accompanies the article as an appendix.

8.
A lattice point in the plane is a point with integer coordinates. A lattice polygon K is a polygon whose vertices are lattice points. In this note we prove that any convex lattice 11-gon contains at least 15 interior lattice points.
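For a concrete polygon, the number of interior lattice points can be computed from Pick's theorem, A = I + B/2 − 1, relating the area A, the interior count I, and the boundary count B. A minimal sketch (the square in the example is an arbitrary test case, not the extremal 11-gon of the note):

```python
from math import gcd

def interior_lattice_points(vertices):
    """Count interior lattice points of a simple lattice polygon via Pick's theorem."""
    n = len(vertices)
    # Twice the signed area (shoelace formula).
    twice_area = sum(vertices[i][0] * vertices[(i + 1) % n][1]
                     - vertices[(i + 1) % n][0] * vertices[i][1] for i in range(n))
    area = abs(twice_area) / 2
    # Lattice points on the boundary: gcd of the coordinate differences along each edge.
    boundary = sum(gcd(abs(vertices[(i + 1) % n][0] - vertices[i][0]),
                       abs(vertices[(i + 1) % n][1] - vertices[i][1])) for i in range(n))
    # Pick's theorem: area = interior + boundary/2 - 1.
    return int(area - boundary / 2 + 1)

# Example: a 4x4 lattice square has 9 interior lattice points.
print(interior_lattice_points([(0, 0), (4, 0), (4, 4), (0, 4)]))  # -> 9
```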

9.
In this paper, we derive an optimal leverage function for Constant Proportion Debt Obligations (CPDOs) by using stochastic control techniques. The investor’s goal is to maximise redemption of capital at maturity. The control variable of the problem is the leverage process, i.e. the time-dependent notional exposure to the underlying risky index/portfolio. The control problem is solved explicitly with the help of the Legendre transform applied to the HJB equation of stochastic control. A closed-form solution is given for the optimal leverage. Contrary to industry practice, the optimal leverage derived in this paper is a non-linear, bell-shaped function of the CPDO asset value.

10.
Nonparametric copula models are based on observations whose distributions are generally unknown. Estimation of these copula models is based on pseudo-observations consisting of the ranked data. To determine distributional properties (e.g., the variance) of the models and their estimators, resampling methods such as bootstrapping are involved. These methods require drawing samples with replacement from the ranked data. The newly generated samples have to be reranked, and existing ties have to be resolved by mid-ranks. Since a large number of samples has to be generated in order to attain a suitable accuracy of the estimate, the speed of the algorithm for reranking the samples strongly affects the overall computation time. However, commonly used ranking procedures are computationally expensive, with running time of order O(n* log(n*) + n*). We discuss a faster, more feasible approach that exploits the specific copula setting and whose running time is only of order O(n + n*), where n denotes the sample size and n* the size of the bootstrap sample. In a simulation study, the algorithm performs up to 9 times faster than Matlab’s tiedrank.m procedure.
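The O(n + n*) idea can be illustrated with a counting-based rerank: since every bootstrap draw is one of the original ranks 1, …, n, a histogram over that grid replaces comparison sorting, and mid-ranks for ties follow directly from the cumulative counts. The sketch below is my own illustration of this principle, not the authors' implementation.

```python
import numpy as np

def rerank_bootstrap(sample, n):
    """Rerank a bootstrap sample drawn from the original ranks 1..n,
    resolving ties by mid-ranks, in O(n + n_star) time (no sorting)."""
    counts = np.bincount(sample, minlength=n + 1)   # histogram over the known rank grid
    cum = np.cumsum(counts)                         # number of draws <= each value
    # mid-rank of value v = (#draws smaller than v) + (count(v) + 1) / 2
    midrank = cum - counts + (counts + 1) / 2.0
    return midrank[sample]

rng = np.random.default_rng(0)
n, n_star = 1000, 1500
boot = rng.choice(np.arange(1, n + 1), size=n_star, replace=True)
print(rerank_bootstrap(boot, n)[:5])
```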

11.
Based on a modified variance-ratio function, a statistic is proposed for testing for a change in persistence in heavy-tailed series. The asymptotic distribution of the statistic is derived under the hypothesis of no change point. To avoid the tail index appearing in the asymptotic distribution of the test, a bootstrap sampling method is constructed to determine empirical critical values. Numerical simulation results demonstrate the effectiveness of the modified variance-ratio statistic and the bootstrap sampling method.
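The general device can be sketched as follows: compute a ratio-of-partial-sums statistic over candidate break fractions and calibrate it with resamples drawn under the no-change null, so that the unknown tail index never has to be estimated. The simplified ratio statistic, the i.i.d. resampling scheme, and the Student-t toy series below are my own simplifications, not the paper's modified variance-ratio statistic.

```python
import numpy as np

def ratio_stat(y, lo=0.2, hi=0.8):
    """Simplified ratio-of-partial-sums statistic for a change in persistence."""
    n = len(y)
    y = y - y.mean()
    vals = []
    for k in range(int(lo * n), int(hi * n) + 1):
        s1, s2 = np.cumsum(y[:k]), np.cumsum(y[k:])
        num = np.sum(s2 ** 2) / (n - k) ** 2
        den = np.sum(s1 ** 2) / k ** 2
        vals.append(max(num / den, den / num))   # detect a change in either direction
    return max(vals)

def bootstrap_critical_value(y, level=0.05, n_boot=299, seed=0):
    """Empirical critical value from i.i.d. resampling under the no-change null,
    which sidesteps estimating the tail index of the innovations."""
    rng = np.random.default_rng(seed)
    stats = np.array([ratio_stat(rng.choice(y, size=len(y), replace=True))
                      for _ in range(n_boot)])
    return np.quantile(stats, 1 - level)

rng = np.random.default_rng(5)
y = rng.standard_t(df=3, size=300)   # heavy-tailed, stationary: no change in persistence
print(ratio_stat(y), bootstrap_critical_value(y))
```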

12.
周杰  吴婷 《中国科学:数学》2011,41(6):559-576
For observational data with random errors, this paper discusses statistical inference on the stability of the parameters of constant-coefficient linear ordinary differential equations. Via a Karhunen–Loève decomposition of the residual term, a change-point testing procedure is given together with its limiting distribution under the null hypothesis. Under the alternative hypothesis, an estimator of the change point is defined, and the consistency of both the test and the estimator is proved. Statistical simulations for constant-coefficient second-order ordinary differential equations show that the limiting distribution under the null hypothesis is a very good approximation of the true distribution; under the alternative, even a 0.75% change in the frequency of the input function is rejected by the test with high probability. Finally, the method is applied to temperature data from central England, revealing some new features of the data.

13.
Inference for the Mean Difference in the Two-Sample Random Censorship Model
Inference for the mean difference in the two-sample random censorship model is an important problem in comparative survival and reliability test studies. This paper develops an adjusted empirical likelihood inference and a martingale-based bootstrap inference for the mean difference. A nonparametric version of Wilks' theorem for the adjusted empirical likelihood is derived, and the corresponding empirical likelihood confidence interval of the mean difference is constructed. Also, it is shown that the martingale-based bootstrap gives a correct first order asymptotic approximation of the corresponding estimator of the mean difference, which ensures that the martingale-based bootstrap confidence interval has asymptotically correct coverage probability. A simulation study is conducted to compare the adjusted empirical likelihood, the martingale-based bootstrap, and Efron's bootstrap in terms of coverage accuracies and average lengths of the confidence intervals. The simulation indicates that the proposed adjusted empirical likelihood and the martingale-based bootstrap confidence procedures are comparable, and both seem to outperform Efron's bootstrap procedure.
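For orientation, here is a sketch of the Efron-type bootstrap that the proposed procedures are compared against, applied to the difference of restricted Kaplan–Meier means of two censored samples. The restriction to a horizon tau, the exponential lifetimes, and the uniform censoring are simplifications of mine; the paper's adjusted empirical likelihood and martingale-based bootstrap are not reproduced here.

```python
import numpy as np

def km_restricted_mean(time, event, tau):
    """Restricted mean survival time: area under the Kaplan-Meier curve on [0, tau]."""
    order = np.lexsort((1 - event, time))        # sort by time, events before censorings at ties
    t, d = time[order], event[order]
    n = len(t)
    at_risk = n - np.arange(n)
    surv = np.cumprod(1.0 - d / at_risk)         # S(t) just after each ordered observation
    times = np.concatenate(([0.0], np.minimum(t, tau)))
    s_left = np.concatenate(([1.0], surv))[:-1]  # survival on the interval starting at each point
    return float(np.sum(s_left * np.diff(times)))

def efron_bootstrap_ci(t1, d1, t2, d2, tau, n_boot=1000, seed=0):
    """Efron's bootstrap percentile CI for the difference of restricted means."""
    rng = np.random.default_rng(seed)
    diffs = np.empty(n_boot)
    for b in range(n_boot):
        i = rng.integers(0, len(t1), len(t1))
        j = rng.integers(0, len(t2), len(t2))
        diffs[b] = km_restricted_mean(t1[i], d1[i], tau) - km_restricted_mean(t2[j], d2[j], tau)
    return np.quantile(diffs, [0.025, 0.975])

rng = np.random.default_rng(7)
def simulate(n, rate):                           # exponential lifetimes, uniform censoring
    life = rng.exponential(1.0 / rate, n)
    cens = rng.uniform(0.0, 3.0, n)
    return np.minimum(life, cens), (life <= cens).astype(float)

t1, d1 = simulate(150, 1.0)
t2, d2 = simulate(150, 1.5)
print("95% bootstrap CI for the mean difference:", efron_bootstrap_ci(t1, d1, t2, d2, tau=2.5))
```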

14.
We describe and contrast several different bootstrap procedures for penalized spline smoothers. The bootstrap methods considered are variations on existing methods, developed under two different probabilistic frameworks. Under the first framework, penalized spline regression is considered as an estimation technique to find an unknown smooth function. The smooth function is represented in a high-dimensional spline basis, with spline coefficients estimated in a penalized form. Under the second framework, the unknown function is treated as a realization of a set of random spline coefficients, which are then predicted in a linear mixed model. We describe how bootstrap methods can be implemented under both frameworks, and we show theoretically and through simulations and examples that bootstrapping provides valid inference in both cases. We compare the inference obtained under both frameworks, and conclude that the latter generally produces better results than the former. The bootstrap ideas are extended to hypothesis testing, where parametric components in a model are tested against nonparametric alternatives.

Datasets and computer code are available in the online supplements.
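Under the first (penalized regression) framework, a bare-bones residual bootstrap for pointwise confidence bands might look as follows; the truncated-power basis, the ridge penalty on the knot coefficients, the fixed smoothing parameter, and the neglect of smoothing bias are all simplifications of mine rather than the paper's procedures.

```python
import numpy as np

def tp_basis(x, knots):
    """Truncated-power (linear) spline basis: [1, x, (x - k)_+ for each knot]."""
    return np.column_stack([np.ones_like(x), x] + [np.maximum(x - k, 0.0) for k in knots])

def fit_pspline(x, y, knots, lam):
    """Penalized least squares; only the knot coefficients carry the ridge penalty."""
    B = tp_basis(x, knots)
    D = np.diag([0.0, 0.0] + [1.0] * len(knots))
    return np.linalg.solve(B.T @ B + lam * D, B.T @ y)

def residual_bootstrap_band(x, y, knots, lam, x_grid, n_boot=500, seed=0):
    """Pointwise 95% bootstrap band from resampled, centered residuals."""
    rng = np.random.default_rng(seed)
    coef = fit_pspline(x, y, knots, lam)
    fit = tp_basis(x, knots) @ coef
    resid = y - fit
    resid = resid - resid.mean()
    G = tp_basis(x_grid, knots)
    curves = np.empty((n_boot, len(x_grid)))
    for b in range(n_boot):
        y_star = fit + rng.choice(resid, size=len(y), replace=True)
        curves[b] = G @ fit_pspline(x, y_star, knots, lam)
    lower, upper = np.percentile(curves, [2.5, 97.5], axis=0)
    return G @ coef, lower, upper

rng = np.random.default_rng(8)
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(200)
knots = np.linspace(0.05, 0.95, 15)
fit_grid, lower, upper = residual_bootstrap_band(x, y, knots, lam=0.1, x_grid=np.linspace(0, 1, 50))
print(fit_grid[:3], lower[:3], upper[:3])
```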

15.
Several techniques for resampling dependent data have already been proposed. In this paper we use missing values techniques to modify the moving blocks jackknife and bootstrap. More specifically, we consider the blocks of deleted observations in the blockwise jackknife as missing data which are recovered by missing values estimates incorporating the observation dependence structure. Thus, we estimate the variance of a statistic as a weighted sample variance of the statistic evaluated in a “complete” series. Consistency of the variance and the distribution estimators of the sample mean are established. Also, we apply the missing values approach to the blockwise bootstrap by including some missing observations among two consecutive blocks and we demonstrate the consistency of the variance and the distribution estimators of the sample mean. Finally, we present the results of an extensive Monte Carlo study to evaluate the performance of these methods for finite sample sizes, showing that our proposal provides variance estimates for several time series statistics with smaller mean squared error than previous procedures.
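For context, the plain moving blocks bootstrap that the missing-values modification builds on can be sketched in a few lines: overlapping blocks of fixed length are resampled, glued together, and the variance of the sample mean is read off the resampled means. The block length and the AR(1) toy series below are arbitrary illustrative choices of mine.

```python
import numpy as np

def mbb_variance_of_mean(x, block_len, n_boot=2000, seed=0):
    """Moving blocks bootstrap estimate of Var(sample mean) for a stationary series."""
    rng = np.random.default_rng(seed)
    n = len(x)
    blocks = np.lib.stride_tricks.sliding_window_view(x, block_len)  # all overlapping blocks
    k = int(np.ceil(n / block_len))                                  # blocks per resampled series
    means = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, len(blocks), size=k)
        series = blocks[idx].ravel()[:n]                             # glue blocks, trim to length n
        means[b] = series.mean()
    return means.var(ddof=1)

# Toy AR(1) series with moderate positive dependence.
rng = np.random.default_rng(1)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()
print(mbb_variance_of_mean(x, block_len=20))
```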

16.
The German proposal for a Solvency II-compatible standard model for the life insurance branch calculates the risk capital that is necessary for a sufficient risk capitalisation of the company at hand. This capital is called “target capital” or Solvency Capital Requirement (SCR for short). To obtain it, the book value of the actuarial reserve is mapped onto the well-known market value formula, and the market value (or present value) is derived by means of the classical duration concept as a global approach (cf. the documentation of the standard model of the GDV, p. 26). This formula takes into account the impact of the interest rate but leaves aside all the other actuarial assumptions; in particular, the influence of the biometric assumptions is not considered. This is at least one reason why this ansatz is, at present, no longer compatible with the Solvency II requirements and thus no longer satisfies its own claims. The present work proposes and develops a concept that overcomes this drawback. The result is a formula by which the present value of the actuarial liabilities is calculated from their book value, taking into account both the interest rate and the biometric assumptions. It should be remarked that the proposed two-dimensional duration concept can be developed completely along the lines of the classical one-dimensional analogue, leaving some arbitrariness only in the choice of the biometric gauge, i.e., the mapping of the vector that represents the pattern of remaining active lives onto its average value. To do so, one has to consider the underlying business in force. The broader relevance of such a two-dimensional ansatz lies in the fact that the developments of the Solvency II project during the last months have shown that its success depends crucially on the availability of efficient and well-elaborated approximation procedures.

17.
After the initial care level has been determined according to the regulations of the German statutory long-term care insurance, the care level may change over the course of time. Care assistance may even be cancelled owing to a reduction in the need for care or due to the death of the recipient. Based on a cohort of 88,575 long-term care patients of the statutory health insurance fund “Deutsche BKK”, this report describes the probabilities of a change or loss of care level and of death with regard to age, gender and the length of care in each level. This study thus provides an empirical basis for actuarial calculations of long-term care insurance.

18.
Empirical processes with estimated parameters are a well established subject in nonparametric statistics. In the classical theory they are based on the empirical distribution function which is the nonparametric maximum likelihood estimator for a completely unknown distribution function. In the presence of some “nonparametric” auxiliary information about the distribution, like a known mean or a known median, for example, the nonparametric maximum likelihood estimator is a modified empirical distribution function which puts random masses on the observations in order to take the available information into account [see Owen, Biometrika 75 (1988) 237–249, Ann. Statist. 18 (1990) 90–120, Empirical Likelihood, Chapman & Hall/CRC, London/Boca Raton, FL; Qin and Lawless, Ann. Statist. 22 (1994) 300–325]. Zhang [Metrika 46 (1997) 221–244] has proved a functional central limit theorem for the empirical process pertaining to this modified empirical distribution function. We will consider the corresponding empirical process with estimated parameters here and derive its asymptotic distribution. The limiting process is a centered Gaussian process with a complicated covariance function depending on the unknown parameter. The result becomes useful in practice through the bootstrap, which is shown to be consistent in case of a known mean. The performance of the resulting bootstrap goodness-of-fit test based on the Kolmogorov–Smirnov statistic is studied through simulations.
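For orientation, the unconstrained cousin of this problem — an empirical process with estimated parameters calibrated by a parametric bootstrap (a Lilliefors-type Kolmogorov–Smirnov test) — can be sketched as below. Incorporating the known-mean auxiliary information as in the paper would replace the plain empirical distribution function by the modified, reweighted one, which is not done here; the normal null family and sample sizes are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def ks_estimated_params(x):
    """KS statistic against a normal law with parameters estimated from the data."""
    mu, sd = x.mean(), x.std(ddof=1)
    return stats.kstest(x, 'norm', args=(mu, sd)).statistic

def bootstrap_ks_pvalue(x, n_boot=1000, seed=0):
    """Parametric bootstrap calibration of the KS statistic, needed because
    estimating parameters changes the null distribution of the empirical process."""
    rng = np.random.default_rng(seed)
    d_obs = ks_estimated_params(x)
    mu, sd = x.mean(), x.std(ddof=1)
    d_boot = np.array([ks_estimated_params(rng.normal(mu, sd, len(x)))
                       for _ in range(n_boot)])
    return (1 + np.sum(d_boot >= d_obs)) / (n_boot + 1)

rng = np.random.default_rng(9)
print(bootstrap_ks_pvalue(rng.standard_normal(200)))   # truly normal data: large p-value expected
print(bootstrap_ks_pvalue(rng.exponential(1.0, 200)))  # skewed data: small p-value expected
```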

19.
This study considers the bootstrap cumulative sum (CUSUM) test for a parameter change in location-scale time series models with heteroscedasticity. The CUSUM test has been popular for detecting an abrupt change in time series models because it performs well in many applications. However, it has severe size distortions in many situations. As a remedy, we consider the bootstrap CUSUM test, particularly focusing on the CUSUM test based on score vectors, and demonstrate the weak consistency of the bootstrap test for its justification. A simulation study and data analysis are conducted for illustration.
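A stripped-down version of the general recipe — compute a CUSUM statistic and recompute it on resampled series to approximate its null distribution — can be sketched for a simple mean-change setting. The i.i.d. resampling of centered observations and the Gaussian toy data are simplifications of mine and do not reproduce the score-vector CUSUM for location-scale models studied in the paper.

```python
import numpy as np

def cusum_stat(x):
    """Standardized CUSUM statistic: max_k |S_k - (k/n) S_n| / (sigma_hat * sqrt(n))."""
    n = len(x)
    s = np.cumsum(x)
    k = np.arange(1, n + 1)
    return np.max(np.abs(s - k / n * s[-1])) / (x.std(ddof=1) * np.sqrt(n))

def bootstrap_cusum_pvalue(x, n_boot=999, seed=0):
    """i.i.d. bootstrap approximation of the null distribution of the CUSUM statistic."""
    rng = np.random.default_rng(seed)
    obs = cusum_stat(x)
    centered = x - x.mean()                      # centering mimics the no-change null
    boot = np.array([cusum_stat(rng.choice(centered, size=len(x), replace=True))
                     for _ in range(n_boot)])
    return (1 + np.sum(boot >= obs)) / (n_boot + 1)

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0, 1, 150), rng.normal(0.8, 1, 150)])  # mean shift at t = 150
print(bootstrap_cusum_pvalue(x))
```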

20.
It is well known that the empirical copula process converges weakly to a centered Gaussian field. Because the covariance structure of the limiting process depends on the partial derivatives of the unknown copula, several bootstrap approximations for the empirical copula process have been proposed in the literature. We present a brief review of these procedures. Because some of them also require estimating the derivatives of the unknown copula, we propose an alternative approach which circumvents this problem. Finally, a simulation study is presented in order to compare the different bootstrap approximations for the empirical copula process.
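One simple derivative-free resampling scheme resamples the observation pairs with replacement, recomputes the ranks, and re-evaluates the empirical copula, so no copula derivatives are estimated. The sketch below illustrates this idea with pointwise bootstrap standard errors; the grid points and the toy dependent sample are my own choices, and the scheme is not necessarily the alternative proposed in the paper.

```python
import numpy as np
from scipy.stats import rankdata

def empirical_copula(u_grid, x, y):
    """Empirical copula C_n evaluated at the points (u, v) in u_grid (shape (m, 2))."""
    n = len(x)
    u = rankdata(x) / n   # pseudo-observations
    v = rankdata(y) / n
    return np.array([np.mean((u <= a) & (v <= b)) for a, b in u_grid])

def bootstrap_copula_se(x, y, u_grid, n_boot=200, seed=0):
    """Direct bootstrap: resample pairs, re-rank, recompute C_n; no derivative estimation."""
    rng = np.random.default_rng(seed)
    n = len(x)
    c_hat = empirical_copula(u_grid, x, y)
    reps = np.empty((n_boot, len(u_grid)))
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)
        reps[b] = empirical_copula(u_grid, x[idx], y[idx])
    # Pointwise bootstrap standard errors of sqrt(n) * (C_n - C).
    return c_hat, np.sqrt(n) * reps.std(axis=0, ddof=1)

rng = np.random.default_rng(6)
z = rng.standard_normal(300)
x = z + 0.5 * rng.standard_normal(300)   # dependent pair sharing the common factor z
y = z + 0.5 * rng.standard_normal(300)
grid = np.array([(0.25, 0.25), (0.5, 0.5), (0.75, 0.75)])
c_hat, se = bootstrap_copula_se(x, y, grid)
print(c_hat, se)
```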
