1.
Qi Yongcheng, Chinese Annals of Mathematics, Series B, 1998, 19(4): 499-510
1. Introduction. Let {X_n, n ≥ 1} be a sequence of i.i.d. random variables with a nondegenerate distribution function F(x). Suppose there exist constants a_n > 0, b_n ∈ R and some γ ∈ R such that F^n(a_n x + b_n) → G_γ(x), where G_γ stands for one of the extreme value distributions G_γ(x) = exp{-(1 + γx)^{-1/γ}}. Here the index γ ∈ R is a real parameter (interpret (1 + γx)^{-1/γ} as e^{-x} for γ = 0). The estimation of the extreme-value index γ is very important both in extreme value theory and in practice. Many statistics, such as the Hill estimator (for the case γ > 0), the Pickands estimator and Dekkers-Einmahl-de Haan's moment estimator, which are based on a finite sample, …
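The Hill estimator mentioned in the abstract (for the case γ > 0) has a simple closed form based on the k largest order statistics. A minimal sketch follows; the choice of k and the Pareto simulation are illustrative, not taken from the paper:

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimator of the extreme-value index gamma > 0,
    based on the k largest order statistics of the sample x."""
    x = np.sort(x)
    n = len(x)
    tail = x[n - k:]             # the k largest observations
    threshold = x[n - k - 1]     # the (k+1)-th largest observation
    return np.mean(np.log(tail) - np.log(threshold))

# Illustration: Pareto samples with tail index 2, i.e. gamma = 0.5,
# drawn by inverting the CDF F(x) = 1 - x^(-2).
rng = np.random.default_rng(0)
sample = (1.0 - rng.uniform(size=50_000)) ** (-0.5)
gamma_hat = hill_estimator(sample, k=1_000)
```

For exact Pareto data the Hill estimator is unbiased, so `gamma_hat` should land close to 0.5; in practice the choice of k trades bias against variance.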
2.
3.
In continuous time, rates of convergence of density estimators fluctuate with the nature of the observed sample paths. In this paper, we give a family of rates reached by the kernel estimator and we show that these rates are minimax. Finally, we study applications of these results for specific classes of processes, including Gaussian ones.
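The kernel estimator discussed above has the familiar discrete-sample form f̂_h(x) = (1/nh) Σ K((x − X_i)/h). A minimal sketch with a Gaussian kernel; the bandwidth and the simulated data are illustrative:

```python
import numpy as np

def kernel_density(x_grid, sample, h):
    """Gaussian-kernel density estimate evaluated on x_grid."""
    # Pairwise standardized distances, shape (len(x_grid), len(sample)).
    u = (x_grid[:, None] - sample[None, :]) / h
    k = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)  # Gaussian kernel
    return k.mean(axis=1) / h

# Illustration: estimate the standard normal density.
rng = np.random.default_rng(1)
sample = rng.standard_normal(5_000)
grid = np.linspace(-4.0, 4.0, 801)
fhat = kernel_density(grid, sample, h=0.3)
```

The estimate is nonnegative by construction and integrates to (approximately) one over the grid.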
4.
Applications of uniform design sampling. Cited by: 3 (self-citations: 0, others: 3)
Wang Zhaojun, Applied Mathematics: A Journal of Chinese Universities, Series A, 1997, (3): 299-310
Uniform design sampling was proposed by Zhang Runchu and Wang Zhaojun, who also proved its good properties theoretically. This paper considers applications of uniform design sampling to maximizing a function, approximate computation of integrals, fitting regression lines, and computing maximum likelihood estimates. Simulation results again demonstrate the good performance of uniform design sampling.
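Zhang and Wang's specific construction is not reproduced in the abstract. As a hedged illustration of the integral-approximation use case, the sketch below averages a function over a rank-1 good-lattice point set on the unit square, a standard uniform-design device; the Fibonacci generator (n = 144, generator 89) is an illustrative choice, not taken from the paper:

```python
import numpy as np

def good_lattice_points(n, generator):
    """Rank-1 lattice point set on [0,1]^2 (a common uniform-design device)."""
    i = np.arange(n)
    return np.column_stack(((i + 0.5) / n,
                            ((i * generator) % n + 0.5) / n))

def approx_integral(f, points):
    """Approximate the integral of f over the unit square by the design average."""
    return np.mean(f(points[:, 0], points[:, 1]))

# Fibonacci lattice: n = 144 points with generator 89 (both illustrative).
pts = good_lattice_points(144, 89)
est = approx_integral(lambda x, y: x * y, pts)  # true integral is 1/4
```

The uniformly spread design points give a much smaller worst-case error than plain random sampling with the same budget.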
5.
Strong laws of large numbers for sums of α-mixing sequences and their applications. Cited by: 1 (self-citations: 0, others: 1)
Yang Shanchao, Applied Mathematics: A Journal of Chinese Universities, Series A, 1996, (4)
We study strong laws of large numbers for weighted sums of α-mixing sequences, and apply the results to the weighted kernel estimator of a nonparametric regression function proposed by Priestley, M.B. and Chao, M.T. (1972), obtaining fairly sharp conclusions.
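The Priestley-Chao weighted kernel estimator referenced above takes, for a fixed sorted design x_1 < … < x_n, the form m̂(x) = h⁻¹ Σ_i (x_i − x_{i−1}) K((x − x_i)/h) Y_i. A minimal sketch; the Gaussian kernel, bandwidth, and test function are illustrative:

```python
import numpy as np

def priestley_chao(x_eval, x_design, y, h):
    """Priestley-Chao kernel regression estimate at the points x_eval,
    for a sorted fixed design x_design with responses y."""
    spacings = np.diff(x_design, prepend=0.0)       # x_i - x_{i-1}
    u = (x_eval[:, None] - x_design[None, :]) / h
    k = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)  # Gaussian kernel
    return (k * (spacings * y)).sum(axis=1) / h

# Illustration: recover m(x) = sin(2*pi*x) from noisy equispaced observations.
rng = np.random.default_rng(2)
xs = np.linspace(0.0, 1.0, 400)
ys = np.sin(2.0 * np.pi * xs) + 0.1 * rng.standard_normal(400)
m_hat = priestley_chao(np.array([0.25, 0.75]), xs, ys, h=0.05)
```

At the peak x = 0.25 and trough x = 0.75 the estimate should be close to +1 and -1, respectively, up to smoothing bias and noise.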
6.
Wang Bingzhang, Applied Mathematics: A Journal of Chinese Universities, Series A, 1997, (2): 157-162
We study the distribution approximation problem for a nearest-neighbor regression estimate. Using the random weighting method, we obtain the approximating distribution of the error of the nearest-neighbor regression estimate together with its approximation accuracy, thereby improving the results of reference [1].
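A plain k-nearest-neighbor regression estimate, together with Dirichlet-type random weighting of the neighbors in the spirit of random weighting methods, can be sketched as follows; every detail here (k, the Dirichlet(1,…,1) weights, the simulated model) is an illustrative assumption, not the paper's construction:

```python
import numpy as np

def knn_regression(x0, x, y, k):
    """Plain k-nearest-neighbor regression estimate at x0."""
    idx = np.argsort(np.abs(x - x0))[:k]  # indices of the k nearest design points
    return y[idx].mean(), idx

def random_weighted_estimates(y_neighbors, n_rep, rng):
    """Random-weighting replicates: Dirichlet(1,...,1) weights on the neighbors,
    whose spread approximates the variability of the estimate."""
    w = rng.dirichlet(np.ones(len(y_neighbors)), size=n_rep)
    return w @ y_neighbors

rng = np.random.default_rng(3)
x = rng.uniform(0.0, 1.0, 500)
y = x**2 + 0.05 * rng.standard_normal(500)
m_hat, idx = knn_regression(0.5, x, y, k=25)
reps = random_weighted_estimates(y[idx], n_rep=1000, rng=rng)
```

Since the Dirichlet weights have mean 1/k, the replicates center on the plain k-NN estimate while their dispersion mimics its error distribution.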
7.
For the normal growth curve model with a serial covariance structure, we prove that under certain conditions no uniformly minimum variance unbiased estimator of the variance exists. We also give another necessary and sufficient condition for every linearly estimable function of the regression coefficient matrix to admit a uniformly minimum risk unbiased estimator under an arbitrary convex loss.
8.
Ding Bangjun, Applied Mathematics: A Journal of Chinese Universities, Series A, 1998, 13(2): 159-166
Let X1, X2, …, Xn be nonnegative, mutually independent random variables with common distribution function F(t), and let Y1, Y2, …, Yn be the corresponding censoring random variables, nonnegative, mutually independent with common distribution G(t), with each Xi independent of Yi. Assuming that only Zi = min(Xi, Yi) and δi = I(Xi ≤ Yi), i = 1, 2, …, n are observable and that G is known, the paper defines estimators of the mean and variance of F and derives their approximate distributions.
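The abstract does not reproduce the paper's estimators. As a hedged illustration only: when G is known, independence gives P(Z > t) = F̄(t)·Ḡ(t), so F̄(t) can be recovered as Ŝ_Z(t)/Ḡ(t) and the mean of F obtained from E[X] = ∫₀^∞ F̄(t) dt; the estimator below and the exponential simulation are assumptions, not necessarily the paper's construction:

```python
import numpy as np

def mean_under_known_censoring(z, G_surv):
    """Estimate E[X] from censored observations z = min(x, y), when the
    censoring survival function G_surv(t) = P(Y > t) is known.
    Uses E[X] = integral of F_bar(t) dt with F_bar = S_Z / G_surv."""
    z = np.sort(z)
    n = len(z)
    s_z = 1.0 - np.arange(1, n + 1) / n          # empirical P(Z > z_(i))
    grid = np.concatenate(([0.0], z))
    widths = np.diff(grid)                        # spacings between order statistics
    # S_Z / G_surv evaluated at the left endpoints of each spacing.
    f_bar = np.concatenate(([1.0], s_z[:-1])) / G_surv(grid[:-1])
    return np.sum(widths * f_bar)

# Illustration: X ~ Exp(1) (true mean 1), censoring Y ~ Exp(rate 0.5), G known.
rng = np.random.default_rng(4)
x = rng.exponential(1.0, 5_000)
y = rng.exponential(2.0, 5_000)                   # rate 0.5 -> scale 2
z = np.minimum(x, y)
mu_hat = mean_under_known_censoring(z, lambda t: np.exp(-0.5 * t))
```

The integral is truncated at the largest observed Z, which introduces a small downward bias in the tail.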
9.
In this paper, we present a deviation inequality for a common estimator of the conditional value-at-risk for bounded random variables. The result improves a deviation inequality which is obtained by Brown [D.B. Brown, Large deviations bounds for estimating conditional value-at-risk, Operations Research Letters 35 (2007) 722-730].
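A common plug-in estimator of the conditional value-at-risk at level α, of the kind such deviation bounds apply to, is the average of the ⌈n(1−α)⌉ largest observed losses. A minimal sketch:

```python
import numpy as np

def cvar_estimate(losses, alpha):
    """Empirical CVaR at level alpha: the mean of the ceil(n*(1-alpha))
    largest observed losses (one common plug-in estimator)."""
    losses = np.sort(np.asarray(losses, dtype=float))
    k = int(np.ceil(len(losses) * (1.0 - alpha)))
    return losses[-k:].mean()

losses = np.arange(1, 101)            # losses 1, 2, ..., 100
cvar_90 = cvar_estimate(losses, 0.9)  # mean of the 10 largest: 95.5
```

For bounded losses, deviation inequalities of the kind in the paper control how far this tail average strays from the true CVaR.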
10.
Frank Nielsen, Entropy (Basel, Switzerland), 2021, 23(11)
The Jeffreys divergence is a renowned arithmetic symmetrization of the oriented Kullback–Leibler divergence, broadly used in the information sciences. Since the Jeffreys divergence between Gaussian mixture models is not available in closed form, various techniques with different advantages and disadvantages have been proposed in the literature to estimate, approximate, or lower- and upper-bound this divergence. In this paper, we propose a simple yet fast heuristic to approximate the Jeffreys divergence between two univariate Gaussian mixtures with an arbitrary number of components. Our heuristic relies on converting the mixtures into pairs of dually parameterized probability densities belonging to an exponential-polynomial family. To measure with a closed-form formula the goodness of fit between a Gaussian mixture and an exponential-polynomial density approximating it, we generalize the Hyvärinen divergence to α-Hyvärinen divergences. In particular, the 2-Hyvärinen divergence allows us to perform model selection by choosing the order of the exponential-polynomial densities used to approximate the mixtures. We experimentally demonstrate that our heuristic for approximating the Jeffreys divergence between mixtures improves on the computational time of stochastic Monte Carlo estimations by several orders of magnitude, while approximating the Jeffreys divergence reasonably well, especially when the mixtures have a very small number of modes.
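The stochastic Monte Carlo baseline that the paper's heuristic is compared against can be sketched directly: J(p, q) = KL(p‖q) + KL(q‖p) is estimated by sampling from each mixture and averaging log-density ratios. The mixture parameters below are illustrative:

```python
import numpy as np

def gmm_logpdf(x, weights, means, stds):
    """Log-density of a univariate Gaussian mixture at the points x."""
    comp = (-0.5 * ((x[:, None] - means) / stds) ** 2
            - np.log(stds * np.sqrt(2.0 * np.pi)))
    return np.log(np.exp(comp) @ weights)

def gmm_sample(n, weights, means, stds, rng):
    """Draw n samples from a univariate Gaussian mixture."""
    idx = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[idx], stds[idx])

def jeffreys_mc(p, q, n, rng):
    """Monte Carlo estimate of J(p,q) = KL(p||q) + KL(q||p)."""
    xp = gmm_sample(n, *p, rng)
    xq = gmm_sample(n, *q, rng)
    kl_pq = np.mean(gmm_logpdf(xp, *p) - gmm_logpdf(xp, *q))
    kl_qp = np.mean(gmm_logpdf(xq, *q) - gmm_logpdf(xq, *p))
    return kl_pq + kl_qp

rng = np.random.default_rng(5)
p = (np.array([0.5, 0.5]), np.array([-2.0, 2.0]), np.array([1.0, 1.0]))
q = (np.array([0.3, 0.7]), np.array([-1.0, 3.0]), np.array([1.0, 0.5]))
j_hat = jeffreys_mc(p, q, 20_000, rng)
```

The estimator converges at the slow O(n^{-1/2}) Monte Carlo rate, which is what the paper's closed-form heuristic avoids.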