Similar Documents
20 similar documents retrieved.
1.
The construction of the Bayesian credible (confidence) interval for a Poisson observable, including both signal and background, with and without systematic uncertainties is presented. The use of a conditional probability that satisfies the requirement that the background be no larger than the number of observed events to construct the Bayesian credible interval is also discussed. A Fortran routine, BPOCI, has been developed to implement the calculation.
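A minimal numerical sketch of this kind of construction, assuming a flat prior on the signal s >= 0, a known background b > 0, and no systematic uncertainties (the grid, cutoff, and credible level are illustrative choices; this is not the BPOCI routine itself):

import numpy as np
from scipy import integrate

def poisson_sb_credible_interval(n_obs, b, cl=0.90, s_max=None, grid=20000):
    """Central Bayesian credible interval for a Poisson signal s with known background b.

    Flat prior on s >= 0; posterior p(s | n_obs) is proportional to (s + b)**n_obs * exp(-(s + b)).
    """
    if s_max is None:
        s_max = n_obs + 10.0 * np.sqrt(n_obs + 1.0) + 10.0   # crude upper cutoff for the grid
    s = np.linspace(0.0, s_max, grid)
    log_post = n_obs * np.log(s + b) - (s + b)               # unnormalized log posterior (b > 0 assumed)
    post = np.exp(log_post - log_post.max())
    cdf = integrate.cumulative_trapezoid(post, s, initial=0.0)
    cdf /= cdf[-1]                                           # normalize to a proper CDF
    lo = np.interp((1.0 - cl) / 2.0, cdf, s)
    hi = np.interp((1.0 + cl) / 2.0, cdf, s)
    return lo, hi

# Example: 5 observed events over an expected background of 1.2
print(poisson_sb_credible_interval(5, 1.2, cl=0.90))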

2.
A maximum likelihood method is used to deal with the combined estimation of multiple measurements of a branching ratio, where each result can be presented as an upper limit. The joint likelihood function is constructed using the observed spectra of all measurements, and the combined estimate of the branching ratio is obtained by maximizing the joint likelihood function. The Bayesian credible interval or upper limit of the combined branching ratio is given both with and without the inclusion of systematic error.
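In a much-simplified form, the combination step amounts to maximizing the product of per-measurement likelihoods over a common branching ratio. The sketch below uses plain counting experiments rather than full observed spectra, and every number (counts, sensitivities, backgrounds) is a made-up placeholder:

import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

# Each measurement: observed counts n_i, expected signal sensitivity s_i
# (events per unit branching ratio), and expected background b_i.
n   = np.array([3, 0, 1])
s_i = np.array([50.0, 30.0, 80.0])
b_i = np.array([2.0, 0.5, 1.5])

def neg_log_joint_likelihood(br):
    mu = br * s_i + b_i                       # expected counts in each experiment
    return -np.sum(poisson.logpmf(n, mu))     # joint likelihood = product over experiments

res = minimize_scalar(neg_log_joint_likelihood, bounds=(0.0, 1.0), method='bounded')
print("combined branching-ratio estimate:", res.x)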

3.
Dynamic cumulative residual (DCR) entropy is a valuable randomness metric that may be used in survival analysis. The Bayesian estimator of the DCR Rényi entropy (DCRRéE) for the Lindley distribution using a gamma prior is discussed in this article. Using several loss functions, the Bayesian estimator and the Bayesian credible interval are calculated. A Monte Carlo simulation experiment is carried out to compare the theoretical results. In general, we note that for a small true value of the DCRRéE, the Bayesian estimates under the linear exponential loss function are favorable compared to the others in this simulation study. Furthermore, for large true values of the DCRRéE, the Bayesian estimate under the precautionary loss function is more suitable than the others. The Bayesian estimates of the DCRRéE improve as the sample size increases. Real-world data are evaluated for further clarification, allowing the theoretical results to be validated.

4.
Entropy measures the uncertainty associated with a random variable. It has important applications in cybernetics, probability theory, astrophysics, the life sciences, and other fields. Recently, many authors have focused on the estimation of entropy for different lifetime distributions. However, the estimation of entropy for the generalized Bilal (GB) distribution has not yet been addressed. In this paper, we consider the estimation of the entropy and the parameters of the GB distribution based on adaptive Type-II progressive hybrid censored data. Maximum likelihood estimates of the entropy and the parameters are obtained using the Newton–Raphson iteration method. Bayesian estimates under different loss functions are provided with the help of Lindley's approximation. The approximate confidence intervals and the Bayesian credible intervals of the parameters and entropy are obtained using the delta method and Markov chain Monte Carlo (MCMC) methods, respectively. Monte Carlo simulation studies are carried out to observe the performance of the different point and interval estimates. Finally, a real data set is analyzed for illustrative purposes.

5.
The point and interval estimates for the unknown parameters of an exponentiated half-logistic distribution based on adaptive Type-II progressive censoring are obtained in this article. First, the maximum likelihood estimators are derived. Then the observed and expected Fisher information matrices are obtained to construct asymptotic confidence intervals. The percentile bootstrap method and the bootstrap-t method are also used to construct confidence intervals. For Bayesian estimation, the Lindley method is used under three different loss functions. The importance sampling method is also applied to calculate Bayesian estimates and to construct the corresponding highest posterior density (HPD) credible intervals. Finally, numerous simulation studies based on Markov chain Monte Carlo (MCMC) samples are conducted to compare the performance of the estimates, and a real data set is analyzed for illustration.
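The percentile bootstrap step, stripped of the censoring scheme and the exponentiated half-logistic likelihood, reduces to resampling the data and re-applying the estimator. A generic sketch (a complete exponential sample and its rate MLE stand in for the paper's setup):

import numpy as np

rng = np.random.default_rng(0)

def percentile_bootstrap_ci(data, estimator, level=0.95, n_boot=2000):
    """Percentile bootstrap confidence interval for a scalar estimator.

    Resamples the data with replacement, re-applies the estimator, and takes
    the empirical (alpha/2, 1 - alpha/2) quantiles of the bootstrap estimates.
    """
    stats = np.array([estimator(rng.choice(data, size=data.size, replace=True))
                      for _ in range(n_boot)])
    alpha = 1.0 - level
    return np.quantile(stats, [alpha / 2.0, 1.0 - alpha / 2.0])

# Toy example: CI for the rate of an exponential sample via its MLE 1/mean
sample = rng.exponential(scale=2.0, size=60)
print(percentile_bootstrap_ci(sample, lambda x: 1.0 / x.mean()))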

6.
The statistical inference of the reliability and parameters of the stress–strength model has received great attention in the field of reliability analysis. Under the generalized progressive hybrid censoring (GPHC) scheme, it is important to discuss the point and interval estimates of the reliability of the multicomponent stress–strength (MSS) model, in which the stress and strength variables come from different distributions: the stress is assumed to follow the Chen distribution and the strength the Gompertz distribution. In the present study, the Newton–Raphson method is adopted to derive the maximum likelihood estimates (MLEs) of the model parameters, and the corresponding asymptotic distribution is used to construct the asymptotic confidence interval (ACI). Subsequently, the exact confidence interval (ECI) of the parameters is calculated. A hybrid Markov chain Monte Carlo (MCMC) method is adopted to obtain the approximate Bayesian estimates (BEs) of the unknown parameters and the highest posterior density credible interval (HPDCI). A simulation study together with a real dataset compares the BEs under the squared error loss function (SELF) with the MLEs of the model parameters and reliability in terms of bias and mean squared error (MSE). In addition, the three interval estimates are compared in terms of average interval length (AIL) and coverage probability (CP).
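Given posterior draws from an MCMC run, the HPDCI of a scalar quantity can be computed with the usual shortest-window construction. A minimal sketch (the gamma draws below merely stand in for real posterior samples):

import numpy as np

def hpd_interval(samples, level=0.95):
    """Highest posterior density interval from (approximately unimodal) MCMC draws.

    Sorts the draws and returns the shortest window containing `level` of them.
    """
    x = np.sort(np.asarray(samples))
    n = len(x)
    k = int(np.ceil(level * n))             # number of draws inside the interval
    widths = x[k - 1:] - x[:n - k + 1]      # width of every candidate window
    i = np.argmin(widths)
    return x[i], x[i + k - 1]

# Toy usage with draws from a skewed "posterior"
draws = np.random.default_rng(1).gamma(shape=3.0, scale=1.0, size=20000)
print(hpd_interval(draws, 0.95))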

7.
Solving the inverse problem of nanoparticle characterization has the potential to advance science and benefit society. While considerable progress has been made within a framework based on the scattering of surface plasmon-polaritons, an aspect not heretofore considered is the quantification of uncertainty in the estimation of a nanoparticle characteristic. Therefore, the present article offers a technique by which an investigator may augment an estimate of a nanoparticle characteristic with a companion “credible interval”. Analogous to the familiar confidence interval but arising from within the Bayesian statistical paradigm, a credible interval allows the investigator to make a statement such as “the nanoparticle diameter lies between 36 and 48 nm with 95% probability” instead of merely “the nanoparticle diameter is estimated to be 42 nm”. Our technique may even be applied outside of the surface plasmon-polariton scattering framework, as long as the investigator specifies his/her prior beliefs about the nanoparticle characteristic and indicates which potential outcomes are likely or unlikely in whatever experiment he/she designs to estimate the nanoparticle characteristic. Two numerical studies illustrate the implementation and performance of our technique in constructing ranges of likely values for nanoparticle diameters and agglomeration levels, respectively.
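A grid-based sketch of this kind of credible-interval construction: the investigator supplies a prior over the diameter and a likelihood linking the diameter to the measured quantity, and the interval comes from the posterior quantiles. The linear forward model and all numbers below are stand-ins, not the surface plasmon-polariton scattering model:

import numpy as np
from scipy.stats import norm

d = np.linspace(10.0, 100.0, 2001)                 # candidate diameters (nm)
prior = norm.pdf(d, loc=45.0, scale=15.0)          # prior belief about the diameter
forward = 0.8 * d                                  # assumed instrument response (placeholder)
y_obs, sigma = 34.0, 3.0                           # measurement and its noise level
likelihood = norm.pdf(y_obs, loc=forward, scale=sigma)

post = prior * likelihood
post /= post.sum()                                 # normalize to probability masses on the grid
cdf = np.cumsum(post)
lo, hi = np.interp([0.025, 0.975], cdf, d)
print(f"diameter in [{lo:.1f}, {hi:.1f}] nm with 95% probability")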

8.
In this paper, we discuss the statistical analysis of a simple step-stress accelerated competing-failure model under progressive Type-II censoring. It is assumed that there is more than one cause of failure and that the lifetime of the experimental units at each stress level follows an exponential distribution. The distribution functions under different stress levels are connected through the cumulative exposure model. The maximum likelihood, Bayesian, expected Bayesian, and hierarchical Bayesian estimates of the model parameters are derived under different loss functions. Based on Monte Carlo simulations, we also obtain the average length and coverage probability of the 95% confidence intervals and highest posterior density credible intervals of the parameters. The numerical studies show that the proposed expected Bayesian and hierarchical Bayesian estimates perform better in terms of average estimates and mean squared errors, respectively. Finally, the methods of statistical inference discussed here are illustrated with a numerical example.

9.
In this paper, we study the statistical inference of the generalized inverted exponential distribution with a common scale parameter and different shape parameters based on joint progressively Type-II censored data. The expectation–maximization (EM) algorithm is applied to calculate the maximum likelihood estimates (MLEs) of the parameters. We obtain the observed information matrix based on the missing-value principle. Interval estimates are computed by the bootstrap method. We provide Bayesian inference under both informative and non-informative priors. The importance sampling technique is used to derive the Bayesian estimates and credible intervals under the squared error loss function and the LINEX loss function, respectively. Finally, we conduct a Monte Carlo simulation and a real data analysis. Moreover, we consider parameters with order restrictions and provide the corresponding maximum likelihood estimates and Bayesian inference.

10.
This paper presents a Bayesian approach to force reconstruction that can deal with both measurement noise and model uncertainty. In particular, an uncertain model is considered for inversion in the form of a matrix of frequency response functions whose modal parameters originate either from measurements or from a finite element model. The model uncertainty and the regularization parameter are jointly determined with the unknown force through Markov chain Monte Carlo methods. Bayesian credible intervals of the force are built from its posterior probability density function, taking into account the quantified model uncertainty and measurement noise. The proposed approach is illustrated and validated on numerical and experimental examples.

11.
This paper investigates the statistical inference of the inverse power Lomax distribution parameters under progressive first-failure censored samples. The maximum likelihood estimates (MLEs) and the asymptotic confidence intervals are derived based on an iterative procedure and the asymptotic normality of the MLEs, respectively. Bayesian estimates of the parameters under the squared error loss and generalized entropy loss functions are obtained using independent gamma priors. For the Bayesian computation, Tierney–Kadane's approximation method is used. In addition, the highest posterior density credible intervals of the parameters are constructed based on an importance sampling procedure. A Monte Carlo simulation study is carried out to compare the behavior of the various estimates developed in this paper. Finally, a real data set is analyzed for illustration purposes.

12.
A general probabilistic technique for estimating background contributions to measured spectra is presented. A Bayesian model is used to capture the defining characteristic of the problem, namely, that the background is smoother than the signal. The signal is allowed to have positive and/or negative components. The background is represented in terms of a cubic spline basis. A variable degree of smoothness of the background is attained by allowing the number of knots and the knot positions to be chosen adaptively on the basis of the data. The fully Bayesian approach provides a natural way to handle knot adaptivity and allows uncertainties in the background to be estimated. Our technique is demonstrated on a particle-induced X-ray emission spectrum from a geological sample and an Auger spectrum from iron, which contains signals with both positive and negative components.
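A fixed-knot, least-squares stand-in for the cubic-spline background model is sketched below (the fully Bayesian, adaptive-knot treatment with uncertainty estimates is not reproduced; the synthetic spectrum and knot placement are arbitrary):

import numpy as np
from scipy.interpolate import LSQUnivariateSpline

rng = np.random.default_rng(2)
x = np.linspace(0.0, 10.0, 400)
background = 50.0 * np.exp(-0.2 * x)                           # smooth synthetic background
signal = 80.0 * np.exp(-0.5 * ((x - 6.0) / 0.15) ** 2)         # one narrow peak
y = rng.poisson(background + signal).astype(float)             # simulated spectrum

knots = np.linspace(1.0, 9.0, 6)                     # few interior knots => smooth background
spline_bg = LSQUnivariateSpline(x, y, knots, k=3)    # least-squares cubic spline
residual_signal = y - spline_bg(x)                   # crude estimate of the signal component
print(residual_signal[(x > 5.5) & (x < 6.5)].sum())  # counts attributed to the peak region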

13.
Conditional Source-term Estimation (CSE) obtains the conditional species mass fractions by inverting a Fredholm integral equation of the first kind. In the present work, a Bayesian framework is used to compare two different regularisation methods: zeroth-order temporal Tikhonov regularisation and first-order spatial Tikhonov regularisation. The objectives of the current study are: (i) to elucidate the ill-posedness of the inverse problem; (ii) to understand the origin of the perturbations in the data and quantify their magnitude; (iii) to quantify the uncertainty in the solution using different priors; and (iv) to determine the regularisation method best suited to this problem. A singular value decomposition shows that the current inverse problem is ill-posed. Perturbations to the data may be caused by the use of a discrete mixture fraction grid for calculating the mixture fraction PDF. The magnitude of the perturbations is estimated using a box filter, and the uncertainty in the solution is determined from the width of the credible intervals. The width of the credible intervals is significantly reduced with the inclusion of a smoothing prior, and the recovered solution is in better agreement with the exact solution. The credible intervals for temporal and spatial smoothing are shown to be similar. Credible intervals for temporal smoothing depend on the solution from the previous time step, and a smooth solution is not guaranteed. For spatial smoothing, the credible intervals are not dependent upon a previous solution and better predict characteristics for higher mixture fraction values. These characteristics make spatial smoothing a promising alternative method for recovering a solution from the CSE inversion process.
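Zeroth-order Tikhonov regularisation, one of the two regularisers compared, can be sketched via the SVD, which also exposes the ill-posedness through the decay of the singular values; first-order smoothing would replace the identity penalty with a difference operator. The kernel and noise level below are illustrative placeholders, not the CSE operator:

import numpy as np

def tikhonov_solve(A, b, lam):
    """Zeroth-order Tikhonov solution of an ill-posed linear system A x ~ b.

    Minimizes ||A x - b||^2 + lam^2 ||x||^2 using the SVD of A.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    filt = s / (s ** 2 + lam ** 2)            # filtered inverse singular values
    return Vt.T @ (filt * (U.T @ b))

# Toy ill-posed problem: a smoothing (convolution-like) kernel matrix
n = 100
t = np.linspace(0.0, 1.0, n)
A = np.exp(-((t[:, None] - t[None, :]) ** 2) / (2 * 0.05 ** 2))
x_true = np.sin(2 * np.pi * t)
b = A @ x_true + 1e-3 * np.random.default_rng(3).standard_normal(n)
print(np.linalg.norm(tikhonov_solve(A, b, lam=1e-2) - x_true))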

14.
Higher-order cumulants suppress Gaussian noise and provide virtual array-element extension. By introducing higher-order cumulants into the bearing estimation of underwater acoustic signals, a direction-finding algorithm based on off-grid sparse Bayesian learning reconstruction of higher-order cumulants is proposed. The method exploits the natural blindness of higher-order cumulants to Gaussian noise: the fourth-order cumulant of the array signal is computed to filter out the Gaussian noise, doubling the number of array elements relative to the original structure. A selection matrix is then constructed to remove the redundant terms in the fourth-order cumulant, extending the array once more, and the resulting new observation model has better statistical properties. Finally, exploiting spatial sparsity, an off-grid sparse representation model under the fourth-order cumulant is derived, and Bayesian learning is used to solve for the maximum a posteriori probability of the source signal, realizing target bearing estimation. Numerical simulations and sea-trial data show that the method achieves a resolution probability above 95% when the bearing separation between adjacent sources is 4°, and a root-mean-square error of bearing estimation within 1° when the signal-to-noise ratio is above -5 dB. It can significantly suppress background noise interference and can estimate the bearings of underwater acoustic targets accurately and robustly even when multiple sources are densely distributed.
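The fourth-order cumulant matrix that the method builds on can be estimated from array snapshots as sketched below, assuming zero-mean, circularly symmetric data (so the E[x x^T] terms vanish); the selection matrix and the off-grid sparse Bayesian learning stage of the paper are not reproduced here. For complex Gaussian noise this matrix tends to zero as the number of snapshots grows, which is the noise-blindness property referred to above.

import numpy as np

def fourth_order_cumulant_matrix(X):
    """Sample fourth-order cumulant matrix of array snapshots.

    X : (M, N) complex array, M sensors, N snapshots (zero-mean, circularly
        symmetric assumed).  Returns the M^2 x M^2 matrix whose
        ((i,j),(k,l)) entry estimates cum(x_i, x_j*, x_k*, x_l).
    """
    M, N = X.shape
    Y = np.einsum('in,jn->ijn', X, X.conj()).reshape(M * M, N)  # y_n = x_n kron conj(x_n)
    R = X @ X.conj().T / N                                      # sample covariance
    m4 = Y @ Y.conj().T / N                                     # fourth-order moment matrix
    u = Y.mean(axis=1, keepdims=True)                           # vectorized sample covariance
    return m4 - u @ u.conj().T - np.kron(R, R.T)                # subtract Gaussian moment terms

# For pure complex Gaussian noise, the result is close to the zero matrix.
rng = np.random.default_rng(6)
noise = (rng.standard_normal((8, 5000)) + 1j * rng.standard_normal((8, 5000))) / np.sqrt(2)
print(np.abs(fourth_order_cumulant_matrix(noise)).max())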

15.
The maximum-likelihood and optimal (Bayesian) algorithms for detecting a signal of arbitrary shape observed against a background of Gaussian white noise, and for measuring its duration, are synthesized. Exact expressions for the characteristics of the maximum-likelihood algorithms are found. The characteristics of the Bayesian algorithms are obtained using computer simulations.
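For a known-shape signal in white Gaussian noise, the likelihood-based detector reduces to correlating the data with the template and comparing the statistic to a threshold; scanning over candidate durations and taking the maximizer gives a crude duration estimate. A toy sketch (shape, amplitude, and threshold are arbitrary choices):

import numpy as np

rng = np.random.default_rng(4)

n, sigma = 500, 1.0
template = np.zeros(n)
template[200:320] = 1.0                         # the assumed signal shape
x = 0.4 * template + sigma * rng.standard_normal(n)

stat = template @ x / (sigma * np.linalg.norm(template))   # normalized correlation (N(0,1) under noise only)
threshold = 3.0                                            # rough false-alarm control
print("detection statistic:", stat, "-> detected" if stat > threshold else "-> not detected")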

16.
This paper discusses the estimation of the stress-strength reliability parameter R=P(Y<X) based on complete samples when the stress and strength are two independent Poisson half-logistic random variables (PHLD). We address the estimation of R in the general case and when the scale parameter is common. Classical and Bayesian estimation (BE) techniques for R are studied. The maximum likelihood estimator (MLE) and its asymptotic distribution are obtained, and an approximate asymptotic confidence interval of R is computed using this distribution. The non-parametric percentile bootstrap and Student's-t bootstrap confidence intervals of R are discussed. The Bayes estimators of R are computed using a gamma prior and discussed under various loss functions, such as the squared error loss function (SEL), the absolute error loss function (AEL), the linear exponential loss function (LINEX), the generalized entropy loss function (GEL), and the maximum a posteriori (MAP) estimator. The Metropolis–Hastings algorithm is used to estimate the posterior distribution of R. The highest posterior density (HPD) credible interval is constructed based on the SEL. Monte Carlo simulations are used to numerically analyze the performance of the MLE and the Bayes estimators; the results are quite satisfactory in terms of mean squared error (MSE) and confidence intervals. Finally, two real data studies demonstrate the performance of the proposed estimation techniques in practice and illustrate that the PHLD is a good candidate in reliability studies.
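A Metropolis–Hastings sketch of the posterior of R is given below with an exponential stand-in model rather than the Poisson half-logistic distribution of the paper (for exponential stress Y and strength X with rates ly and lx, R = P(Y < X) has the closed form ly / (lx + ly)); priors, proposal scale, and data are illustrative:

import numpy as np

rng = np.random.default_rng(5)

x = rng.exponential(scale=1 / 0.8, size=40)   # "strength" data, true rate 0.8
y = rng.exponential(scale=1 / 1.5, size=40)   # "stress"  data, true rate 1.5

def log_post(lx, ly):
    # Exponential likelihoods plus gamma(1, 1) priors on both rates.
    if lx <= 0 or ly <= 0:
        return -np.inf
    ll = len(x) * np.log(lx) - lx * x.sum() + len(y) * np.log(ly) - ly * y.sum()
    return ll - lx - ly

cur = np.array([1.0, 1.0])
draws = []
for _ in range(20000):
    prop = cur * np.exp(0.1 * rng.standard_normal(2))          # log-scale random walk
    log_acc = (log_post(*prop) - log_post(*cur)
               + np.log(prop).sum() - np.log(cur).sum())       # Jacobian of the multiplicative proposal
    if np.log(rng.uniform()) < log_acc:
        cur = prop
    draws.append(cur.copy())

lx_d, ly_d = np.array(draws[5000:]).T                          # discard burn-in
R = ly_d / (lx_d + ly_d)                                       # P(Y < X) for the exponential stand-in
print("posterior mean of R:", R.mean(), " 95% CI:", np.quantile(R, [0.025, 0.975]))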

17.
I review methods for modeling gravitational lens systems comprising multiple images of a background source surrounding a foreground galaxy. In a Bayesian framework, the likelihood is driven by the nature of the data, which in turn depends on whether the source is point-like or extended. The prior encodes astrophysical expectations about lens galaxy mass distributions, either through a careful choice of model families, or through an explicit Bayesian prior applied to under-constrained free-form models. We can think about different lens modeling methods in terms of their choices of likelihoods and priors.

18.
19.
Improved Bayesian compressive sensing for target direction-of-arrival estimation
周明阳  郭良浩  闫超 《声学学报》2019,44(6):961-969
To address the problem that Bayesian compressive sensing based on a Gaussian prior model may produce pronounced random spurious peaks in target direction-of-arrival (DOA) estimation, the Gaussian prior model is improved, and on this basis a Bayesian compressive sensing DOA estimation method is proposed. The method pre-estimates the noise background of the beam output and marks it with binary indicator variables, introduces a noise-variance estimation approach based on the prior variance of the signal, and combines these with variational Bayesian inference to improve DOA estimation performance and the convergence of the iterative process. Numerical simulation and analysis of the improved algorithm with a 32-element line array show that the improved method not only estimates the bearings of target signals accurately but also markedly reduces the number of spurious peaks in the spatial spectrum. Processing results for real sea-trial data show that DOA estimation with the improved Bayesian compressive sensing method significantly suppresses random spurious peaks in the spatial spectrum, improves the ratio of the beam output peak to the background, and provides stronger target detection capability.

20.
Research on detection-range analysis methods for infrared imaging detection systems
Based on the requirements that reliable tracking and measurement in video tracking-measurement engineering practice place on the image size, illuminance, and contrast of the target on the detector focal plane, and taking into account the effects of background radiation and the blurring of the target image, the original formula for calculating the detection range of infrared detection systems is improved, and a new detection-range analysis method suitable for infrared focal-plane imaging detection systems is given. Specific calculation formulas are derived for application conditions such as horizontal observation and non-horizontal observation through the atmosphere. A comparative analysis with application examples leads to the conclusion that the new analysis method is more credible than the original one.
