Similar Documents
6 similar documents found
1.
For semiparametric survival models with interval-censored data and a cure fraction, it is often difficult to obtain the nonparametric maximum likelihood estimate because the complex likelihood function is hard to maximize. In this article, we propose a computationally efficient EM algorithm, facilitated by a gamma-Poisson data augmentation, for maximum likelihood estimation in a class of generalized odds rate mixture cure (GORMC) models with interval-censored data. The gamma-Poisson data augmentation greatly simplifies the EM estimation and enhances the convergence speed of the EM algorithm. The empirical properties of the proposed method are examined through extensive simulation studies and compared with numerical maximum likelihood estimates. An R package, “GORCure”, implements the proposed method; its use is illustrated by an application to the Aerobic Center Longitudinal Study dataset. Supplementary material for this article is available online.
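Poisson-type data augmentations for cure models typically build on the promotion-time (bounded cumulative hazard) structure, in which a Poisson(θ) number of latent risks, each with event-time CDF F, yields population survival exp(−θF(t)) and cure fraction exp(−θ). The sketch below illustrates only that generic structure with an arbitrary exponential baseline; it is not the GORMC model or the GORCure implementation.

```python
import math

def promotion_time_survival(t, theta, base_cdf):
    """Population survival S(t) = exp(-theta * F(t)) when a Poisson(theta)
    number of latent risks each has event-time CDF F. The limiting value
    exp(-theta) is the cure fraction (the cured never experience the event)."""
    return math.exp(-theta * base_cdf(t))

# Illustrative exponential(1) baseline CDF (an assumption, not the paper's)
exp_cdf = lambda t: 1.0 - math.exp(-t)

s_at_zero = promotion_time_survival(0.0, 2.0, exp_cdf)   # starts at 1
cure_fraction = math.exp(-2.0)                            # plateau as t grows
```

The plateau exp(−θ) > 0 is what distinguishes a cure model from an ordinary survival model, whose survival function decays to zero.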

2.
We analyze the reliability of NASA composite pressure vessels by using a new Bayesian semiparametric model. The data set consists of lifetimes of pressure vessels, wrapped with a Kevlar fiber, grouped by spool, subject to different stress levels; 10% of the data are right censored. The model that we consider is a regression on the log-scale for the lifetimes, with fixed (stress) and random (spool) effects. The prior on the spool parameters is nonparametric: they are a sample from a normalized generalized gamma process, which encompasses the well-known Dirichlet process. The nonparametric prior is adopted to robustify inferences against misspecification of a parametric prior. Here, this choice of likelihood and prior yields a new Bayesian model in reliability analysis. Via a Bayesian hierarchical approach, it is easy to analyze the reliability of the Kevlar fiber by predicting quantiles of the failure time when a new spool is selected at random from the population of spools. Moreover, for comparative purposes, we review the most interesting frequentist and Bayesian models analyzing this data set. Our credibility intervals of the quantiles of interest for a new random spool are narrower than those derived in the previous Bayesian parametric literature, although the predictive goodness-of-fit performances are similar. Finally, as an original feature of our model, by means of the discreteness of the random-effects distribution, we are able to cluster the spools into three different groups. Copyright © 2012 John Wiley & Sons, Ltd.

3.
A hybrid Pareto model for asymmetric fat-tailed data: the univariate case   (Cited by 1: 0 self-citations, 1 by others)
Density estimators that can adapt to asymmetric heavy tails are required in many applications such as finance and insurance. Extreme value theory (EVT) has developed principled methods based on asymptotic results to estimate the tails of most distributions. However, the finite sample approximation might introduce a severe bias in many cases. Moreover, the full range of the distribution is often needed, not only the tail area. On the other hand, non-parametric methods, while being powerful where data are abundant, fail to extrapolate properly in the tail area. We put forward a density estimator that brings together the strengths of non-parametric density estimation and of EVT. A hybrid Pareto distribution that can be used in a mixture model is proposed to extend the generalized Pareto (GP) to the whole real axis. Experiments on simulated data show the following. On one hand, the mixture of hybrid Paretos converges faster in terms of log-likelihood and provides good estimates of the tail of the distributions when compared with other density estimators, including the GP distribution. On the other hand, the mixture of hybrid Paretos offers an alternative way to estimate the tail index, comparable to the one estimated with the standard GP methodology. The mixture of hybrids is also evaluated on the Danish fire insurance data set.
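The body-plus-tail stitching idea can be sketched as follows: a light-tailed body below a junction point, a generalized Pareto tail above it, with the two pieces scaled to match at the junction and the whole renormalized to integrate to one. This is a simplified illustrative construction (standard normal body; junction, scale, and shape fixed arbitrarily), not the paper's exact hybrid Pareto parameterization, in which such quantities are tied together and learned.

```python
import math

def hybrid_density(x, u=1.0, beta=1.0, xi=0.3):
    """Continuous density on the real line: standard normal body for x <= u,
    generalized Pareto tail for x > u, scaled so the two densities agree at u.
    Z = Phi(u) + beta*phi(u) is the total unnormalized mass (the GP tail,
    scaled to phi(u) at its origin, carries mass beta*phi(u))."""
    phi = lambda z: math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)
    Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    Z = Phi(u) + beta * phi(u)
    if x <= u:
        return phi(x) / Z
    # GP density shape (1 + xi*(x-u)/beta)^(-1/xi - 1), matched to phi(u) at x = u
    return phi(u) * (1.0 + xi * (x - u) / beta) ** (-1.0 / xi - 1.0) / Z
```

With ξ > 0 the tail decays polynomially, so the stitched density is heavy-tailed on the right while retaining a smooth unimodal body.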

4.
A Bayesian approach is used to analyze the seismic events with magnitudes of at least 4.7 in Taiwan. Following the idea proposed by Ogata (1988, Journal of the American Statistical Association, 83, 9–27), an epidemic model for the process of occurrence times given the observed magnitude values is considered, incorporating gamma prior distributions for the parameters in the model, while the hyper-parameters of the prior are essentially determined by the seismic data in an earlier period. Bayesian inference is made on the conditional intensity function via Markov chain Monte Carlo methods. The results yield acceptable accuracy in predicting large earthquake events within short time periods.
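Ogata's epidemic-type (ETAS) model has a standard closed-form conditional intensity: a background rate plus magnitude-weighted, power-law-decaying contributions from every past event. The sketch below evaluates it for a given catalog; the parameter values are purely illustrative, not the posterior estimates for the Taiwan data.

```python
import math

def etas_intensity(t, history, mu=0.2, K=0.05, alpha=1.5, c=0.01, p=1.1, M0=4.7):
    """Ogata's ETAS conditional intensity
        lambda(t | H_t) = mu + K * sum_{t_i < t} exp(alpha*(M_i - M0)) * (t - t_i + c)**(-p)
    where history is a list of (t_i, M_i) pairs and M0 is the magnitude cutoff.
    All parameter values here are illustrative placeholders."""
    rate = mu
    for t_i, M_i in history:
        if t_i < t:
            rate += K * math.exp(alpha * (M_i - M0)) * (t - t_i + c) ** (-p)
    return rate
```

A large event sharply raises the intensity just after its occurrence time, and the modified Omori factor (t − t_i + c)^(−p) then decays it back toward the background rate μ.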

5.
In reliability and life-testing experiments, the researcher is often interested in the effects of extreme or varying stress factors such as temperature, voltage and load on the lifetimes of experimental units. A step-stress test, a special class of accelerated life-test, allows the experimenter to increase the stress levels at fixed times during the experiment in order to obtain information on the parameters of the life distributions more quickly than under normal operating conditions. In this paper, we consider the simple step-stress model from the exponential distribution when there is a time constraint on the duration of the experiment. We derive the maximum likelihood estimators (MLEs) of the parameters assuming a cumulative exposure model with lifetimes being exponentially distributed. The exact distributions of the MLEs of the parameters are obtained through the use of conditional moment generating functions. We also derive confidence intervals for the parameters using these exact distributions, asymptotic distributions of the MLEs and the parametric bootstrap methods, and assess their performance through a Monte Carlo simulation study. Finally, we present two examples to illustrate all the methods of inference discussed here.
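Under the cumulative exposure model with exponential lifetimes, the MLE of each mean lifetime is the total time on test accumulated at that stress level divided by the number of failures observed at that level. A textbook-style sketch for the simple (two-level) case with stress change at τ and termination at T (variable names are ours; the paper's exact conditional inference via moment generating functions is not reproduced here):

```python
def step_stress_mle(times, status, tau, T):
    """MLEs (theta1, theta2) of the exponential means in a simple step-stress
    test under cumulative exposure. times[i] is the failure time (status 1)
    or the censoring time T (status 0) of unit i; stress rises at tau.
    Each MLE is total time on test at that stress / number of failures there."""
    n1 = sum(1 for t, d in zip(times, status) if d and t <= tau)
    n2 = sum(1 for t, d in zip(times, status) if d and tau < t <= T)
    # Exposure at stress 1: each unit contributes min(t, tau)
    D1 = sum(min(t, tau) for t in times)
    # Exposure at stress 2: units surviving past tau contribute min(t, T) - tau
    D2 = sum(min(t, T) - tau for t in times if t > tau)
    theta1 = D1 / n1 if n1 else float("nan")  # MLE exists only with >= 1 failure
    theta2 = D2 / n2 if n2 else float("nan")
    return theta1, theta2
```

For example, with failures at 0.5, 1.0, 2.0, 3.0, stress change at τ = 1.5 and termination at T = 4, two failures fall in each phase, giving θ̂₁ = 4.5/2 and θ̂₂ = 2.0/2.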

6.
The recovery of 3D left ventricle (LV) shape from 2D echocardiography is an attractive topic in the field of ultrasound imaging. In this paper, we propose a mathematical model to determine the 3D positions of LV contours extracted from multiple 2D echocardiography images. We formulate the proposed model as a non-convex constrained minimization problem. To solve it, we propose a proximal alternating minimization algorithm, using the OPTI solver for the quadratically constrained quadratic programs. To validate the proposed model, numerical experiments are performed with real ultrasound data. The experimental results show that the proposed model is promising and applicable to real echocardiography data.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司) · 京ICP备09084417号