Similar Documents
20 similar documents found (search time: 15 ms).
1.
As a compromise between the nonhomogeneous Poisson process and the renewal process, the modulated power law process is more appropriate for modeling the failures of repairable systems. In this article, objective Bayesian methods are proposed to analyze the modulated power law process. Seven reference priors, one of which is also the Jeffreys prior, are derived; however, only four of them are considered because of their practicality. Propriety of the posterior densities under the four reference priors is proved, and the predictive distribution of the future failure time is also obtained. For comparison, simulations and a real data analysis are carried out based on both the objective Bayesian and maximum likelihood approaches. These show that objective Bayesian estimation and prediction have much better frequentist properties and outperform the maximum likelihood method even with small or moderate sample sizes.
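As a point of reference for the maximum likelihood baseline discussed above, the plain power law process (the κ = 1 special case of the MPLP) admits closed-form MLEs for failure-truncated data. The following Python sketch illustrates them; the failure times are hypothetical, and this is the classical MLE, not the paper's objective Bayesian procedure.

```python
import numpy as np

def plp_mle(failure_times):
    """MLEs for the power law process (failure-truncated data).

    Intensity lambda(t) = (beta/theta) * (t/theta)**(beta - 1);
    beta_hat and theta_hat have the standard closed forms, with
    theta_hat chosen so that Lambda(t_n) = n.
    """
    t = np.sort(np.asarray(failure_times, dtype=float))
    n = len(t)
    beta_hat = n / np.sum(np.log(t[-1] / t[:-1]))
    theta_hat = t[-1] / n ** (1.0 / beta_hat)
    return beta_hat, theta_hat

# Hypothetical failure epochs of a repairable system (hours)
times = [33.3, 76.0, 145.0, 347.0, 555.0, 811.0, 1212.0, 1499.0]
beta_hat, theta_hat = plp_mle(times)
print(f"beta_hat = {beta_hat:.3f}, theta_hat = {theta_hat:.1f}")
```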

2.
In this paper, the objective Bayesian method is applied to investigate the competing risks model involving both catastrophic and degradation failures. By modeling soft failures with a Wiener degradation process and hard failures with a Weibull distribution, we obtain the noninformative priors (the Jeffreys prior and two reference priors) for the parameters. Moreover, we show that their posterior distributions have good properties, and we propose Gibbs sampling algorithms for Bayesian inference based on the Jeffreys prior and the two reference priors. Simulation studies are conducted to illustrate the superiority of the objective Bayesian method. Finally, we apply our methods to two real data examples and compare the objective Bayesian estimates with the other estimates.
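To make the competing risks setup concrete, the sketch below simulates the data-generating mechanism the abstract describes: a soft failure time from the first passage of a Wiener degradation path over a threshold, competing against a Weibull hard failure time. All parameter values (drift, volatility, threshold, Weibull shape and scale) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_unit(mu=0.5, sigma=0.2, threshold=10.0,
                  k=1.5, lam=25.0, dt=0.01, t_max=100.0):
    """Observed failure time and cause for one unit.

    Soft failure: first passage of a Wiener process X(t) = mu*t + sigma*B(t)
    over `threshold` (simulated on a grid). Hard failure: Weibull(k, lam).
    The observed time is the minimum; the cause is whichever mode fired first.
    """
    t_hard = lam * rng.weibull(k)              # catastrophic failure time
    x, t = 0.0, 0.0
    while x < threshold and t < t_max:         # Euler walk of degradation path
        x += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
    t_soft = t
    return (t_soft, "soft") if t_soft < t_hard else (min(t_hard, t_max), "hard")

sample = [simulate_unit() for _ in range(500)]
soft_share = np.mean([c == "soft" for _, c in sample])
print(f"fraction of soft (degradation) failures: {soft_share:.2f}")
```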

3.
The objective of studying software reliability is to assist software engineers in understanding more of the probabilistic nature of software failures during the debugging stages and to construct reliability models. In this paper, we consider modeling of a multiplicative failure rate whose components are evolving stochastically over testing stages and discuss its Bayesian estimation. In doing so, we focus on the modeling of parameters such as the fault detection rate per fault and the number of faults. We discuss how the proposed model can account for “imperfect debugging” under certain conditions. We use actual inter-failure data to carry out inference on model parameters via Markov chain Monte Carlo methods and present additional insights from Bayesian analysis.
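The paper's multiplicative failure rate model is not reproduced here, but the general workflow, placing a prior on the parameters of an NHPP intensity and sampling the posterior with Markov chain Monte Carlo, can be sketched on the simpler Goel-Okumoto model m(t) = a(1 - e^(-bt)). The random-walk Metropolis sampler below assumes flat priors on (log a, log b) and synthetic inter-failure data.

```python
import numpy as np

rng = np.random.default_rng(0)

def loglik(a, b, t, T):
    """Goel-Okumoto NHPP log-likelihood: intensity a*b*exp(-b*t),
    mean value function m(t) = a*(1 - exp(-b*t)), data observed on [0, T]."""
    if a <= 0 or b <= 0:
        return -np.inf
    return np.sum(np.log(a * b) - b * t) - a * (1 - np.exp(-b * T))

# Hypothetical inter-failure data turned into cumulative failure epochs
t = np.cumsum(rng.exponential(10.0, size=40)); T = t[-1]

# Random-walk Metropolis on (log a, log b) with flat priors on the logs
theta = np.log([50.0, 0.01]); samples = []
ll = loglik(*np.exp(theta), t, T)
for _ in range(20000):
    prop = theta + 0.1 * rng.standard_normal(2)
    ll_prop = loglik(*np.exp(prop), t, T)
    if np.log(rng.uniform()) < ll_prop - ll:   # Metropolis accept/reject
        theta, ll = prop, ll_prop
    samples.append(np.exp(theta))
a_post, b_post = np.mean(samples[5000:], axis=0)
print(f"posterior means: a = {a_post:.1f}, b = {b_post:.4f}")
```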

4.
The nonhomogeneous Poisson process (NHPP) is a commonly used stochastic model for describing the pattern of repeated occurrence of certain events or conditions. An inhomogeneous gamma process arises as a generalization of the NHPP, in which the observed failure epochs correspond to every successive κ-th event of the underlying Poisson process, κ being an unknown parameter to be estimated from the data. This article focuses on a special class of inhomogeneous gamma process, called the modulated power law process (MPLP), which assumes the Weibull form of the intensity function. The traditional power law process is a popular stochastic formulation of certain empirical relationships between the time to failure and the cumulative number of failures, often observed in industrial experiments. The MPLP retains this underlying physical basis and provides a more flexible modeling environment, potentially leading to a better fit to the failure data at hand. In this paper, we investigate inference issues related to the MPLP. The maximum likelihood estimators (MLEs) of the model parameters are not in closed form and enjoy the curious property that they are asymptotically normal with a singular variance-covariance matrix. Consequently, the derivation of the large-sample results requires non-standard modifications of the usual arguments. We also propose a set of simple closed-form estimators that are asymptotically equivalent to the MLEs. Extensive simulations are carried out to supplement the theoretical findings. Finally, we apply our inference results to a failure dataset arising from a repairable system.
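The MPLP likelihood itself is easy to state: with Λ(t) = (t/θ)^β, the transformed gaps Λ(t_i) - Λ(t_{i-1}) are i.i.d. Gamma(κ, 1), and each observed epoch contributes λ(t_i)·ΔΛ_i^(κ-1)·exp(-ΔΛ_i)/Γ(κ) under failure truncation. Since the MLEs are not in closed form, the sketch below maximizes this log-likelihood numerically; the failure epochs are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def neg_loglik(params, t):
    """Negative log-likelihood of the MPLP for failure-truncated epochs t.

    Lambda(t) = (t/theta)**beta; the transformed gaps
    Lambda(t_i) - Lambda(t_{i-1}) are iid Gamma(kappa, 1).
    """
    beta, theta, kappa = np.exp(params)        # optimize on the log scale
    L = (t / theta) ** beta
    gaps = np.diff(np.concatenate(([0.0], L)))
    log_intensity = np.log(beta / theta) + (beta - 1) * np.log(t / theta)
    ll = np.sum(log_intensity + (kappa - 1) * np.log(gaps) - gaps) \
         - len(t) * gammaln(kappa)
    return -ll

# Hypothetical failure epochs (hours), sorted
t = np.sort(np.array([194., 209., 250., 279., 312., 493., 573., 589., 608.,
                      640., 737., 791., 816., 867., 893., 930., 943., 972.]))
res = minimize(neg_loglik, x0=np.log([1.0, t[-1] / len(t), 1.0]),
               args=(t,), method="Nelder-Mead")
beta, theta, kappa = np.exp(res.x)
print(f"beta = {beta:.2f}, theta = {theta:.1f}, kappa = {kappa:.2f}")
```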

5.
In this paper, we consider a latent Markov process governing the intensity rate of a Poisson process model for software failures. The latent process enables us to infer performance of the debugging operations over time and allows us to deal with the imperfect debugging scenario. We develop the Bayesian inference for the model and also introduce a method to infer the unknown dimension of the Markov process. We illustrate the implementation of our model and the Bayesian approach by using actual software failure data.

6.
7.
The reliability of the Weibull distribution with homogeneous, heavily censored data is analyzed in this study. The universal model of heavily censored data and the existing methods, including maximum likelihood, least squares, E-Bayesian estimation, and hierarchical Bayesian methods, are introduced. An improved method is proposed based on Bayesian inference and the least-squares method. In this method, the Bayes estimates of the failure probabilities are obtained for all samples. A conjugate prior distribution of the failure probability is set, and an optimization model is developed by maximizing the information entropy of the prior distribution to determine the hyperparameters. By integrating the likelihood function, the posterior distribution of the failure probability is then derived to yield the Bayes estimate of the failure probability. Estimates of the reliability parameters are obtained by fitting the distribution curve with the least-squares method. The four existing methods are compared with the proposed method in terms of applicability, precision, efficiency, robustness, and simplicity; in particular, closed-form expressions for the E-Bayesian and hierarchical Bayesian estimates are derived and used. The comparisons demonstrate that the improved method is superior. Finally, three illustrative examples are presented to show the application of the proposed method.
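The least-squares fitting step the abstract builds on can be sketched as classical median rank regression: linearize ln(-ln(1 - F(t))) = β·ln t - β·ln η and regress. In the proposed method the median ranks would be replaced by Bayes estimates of the failure probabilities; the version below uses Bernard's approximation and hypothetical lifetimes.

```python
import numpy as np

def weibull_lsq(failure_times):
    """Least-squares (median rank regression) fit of a Weibull distribution.

    Linearization: ln(-ln(1 - F(t))) = beta*ln(t) - beta*ln(eta), with the
    failure probabilities F_i approximated by median ranks.
    """
    t = np.sort(np.asarray(failure_times, dtype=float))
    n = len(t)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Bernard's median ranks
    x = np.log(t)
    y = np.log(-np.log(1.0 - F))
    beta, intercept = np.polyfit(x, y, 1)         # slope = shape parameter
    eta = np.exp(-intercept / beta)
    return beta, eta

times = [16., 34., 53., 75., 93., 120.]           # hypothetical lifetimes
beta, eta = weibull_lsq(times)
print(f"shape beta = {beta:.2f}, scale eta = {eta:.1f}")
```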

8.
A new method for predicting failures of a partially observable system is presented. System deterioration is modeled as a hidden, three-state, continuous-time homogeneous Markov process. States 0 and 1, which are not observable, represent the good and warning conditions, respectively; only the failure state 2 is assumed to be observable. The system is subject to condition monitoring at equidistant, discrete time epochs, and the vector observation process is stochastically related to the system state. The objective is to develop a method for optimally predicting impending system failures. Model parameters are estimated using the EM algorithm, and a cost-optimal Bayesian fault prediction scheme is proposed. The method is illustrated using real data obtained from spectrometric analysis of oil samples collected at regular time epochs from the transmission units of heavy hauler trucks used in the mining industry. A comparison with other methods is given, which illustrates the effectiveness of our approach.
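A discrete-time caricature of the filtering step is sketched below: given Gaussian emission densities for the hidden good and warning states and a transition matrix with failure as an absorbing state, the posterior probability of the warning state is updated at each monitoring epoch, conditioned on no failure having occurred. All parameters are hypothetical stand-ins for the EM estimates.

```python
import numpy as np
from scipy.stats import norm

# Discrete-time approximation (hypothetical parameters): hidden states
# 0 = good, 1 = warning; state 2 = observable failure (absorbing).
P = np.array([[0.95, 0.04, 0.01],     # transition matrix between epochs
              [0.00, 0.90, 0.10],
              [0.00, 0.00, 1.00]])
emission = [norm(0.0, 1.0), norm(2.0, 1.0)]   # observation model per state

def filter_warning_prob(obs):
    """Posterior state probabilities after each monitoring epoch, given
    that no failure has been observed yet (states 0 and 1 only)."""
    belief = np.array([1.0, 0.0])             # start in the good state
    history = []
    for y in obs:
        pred = belief @ P[:2, :2]             # predict, conditioned on survival
        upd = pred * [d.pdf(y) for d in emission]
        belief = upd / upd.sum()              # renormalize after conditioning
        history.append(belief[1])             # probability of warning state
    return history

obs = [0.1, -0.3, 0.8, 1.7, 2.2, 2.5]         # hypothetical monitoring data
for k, p1 in enumerate(filter_warning_prob(obs), 1):
    print(f"epoch {k}: P(warning) = {p1:.3f}")
```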

9.
Importance measures are a common tool for identifying weak links in modern networks; they quantify how strongly each edge of a network affects network reliability. Existing methods for computing K-terminal network importance require that the reliabilities of the edges be known and that edge failures be mutually independent, conditions that real networks often cannot satisfy. To remove these restrictions, this paper develops a method for computing K-terminal network importance given only the probability distribution of the number of failed edges, and a worked example on a dodecahedron network verifies the validity and correctness of the method.
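For orientation, the sketch below estimates a K-terminal Birnbaum importance I(e) = R(e up) - R(e down) by Monte Carlo under the classical assumptions (known, independent edge reliabilities) that this paper sets out to relax; the Petersen graph and terminal set are hypothetical stand-ins for the dodecahedron example.

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(1)

def k_terminal_ok(G, up_edges, K):
    """True if all terminals in K lie in one component of the surviving graph."""
    H = nx.Graph(); H.add_nodes_from(G.nodes); H.add_edges_from(up_edges)
    comp = nx.node_connected_component(H, next(iter(K)))
    return all(v in comp for v in K)

def birnbaum_importance(G, K, p=0.9, n_sim=2000):
    """Monte Carlo Birnbaum importance I(e) = R(e up) - R(e down),
    assuming independent edges that each work with probability p."""
    edges = list(G.edges)
    imp = {}
    for e in edges:
        others = [f for f in edges if f != e]
        ok_up = ok_down = 0
        for _ in range(n_sim):
            up = [f for f in others if rng.uniform() < p]
            ok_up += k_terminal_ok(G, up + [e], K)   # same randomness both ways
            ok_down += k_terminal_ok(G, up, K)
        imp[e] = (ok_up - ok_down) / n_sim
    return imp

G = nx.petersen_graph()                  # stand-in for the dodecahedron example
K = {0, 5, 7}                            # hypothetical terminal set
top = sorted(birnbaum_importance(G, K).items(), key=lambda kv: -kv[1])[:3]
print("most important edges:", top)
```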

10.
Recently, a Bayesian network model for inferring non-stationary regulatory processes from gene expression time series has been proposed. The Bayesian Gaussian Mixture (BGM) Bayesian network model divides the data into disjoint compartments (data subsets) by a free allocation model and infers network structures, which are kept fixed for all compartments. Fixing the network structure allows for some information sharing among compartments, and each compartment is modelled separately and independently with the Gaussian BGe scoring metric for Bayesian networks. The BGM model can equally be applied to both static (steady-state) and dynamic (time series) gene expression data. However, it is this flexibility that renders its application to time series data suboptimal. To improve the performance of the BGM model on time series data, we propose a revised approach in which the free allocation of data points is replaced by a changepoint process so as to take the temporal structure into account. The practical inference follows the Bayesian paradigm and approximately samples the network, the number of compartments and the changepoint locations from the posterior distribution with Markov chain Monte Carlo (MCMC). Our empirical results show that the proposed modification leads to a more efficient inference tool for analysing gene expression time series.

11.
A burn‐in study is applied to demonstrate compliance with a targeted early life failure probability of semiconductor products. This is achieved by investigating a sample of the produced chips for reliability‐relevant failures. Usually, a burn‐in study is carried out for a specific reference product with the aim to scale the reference product's failure probability to follower products with different chip sizes. It also appears, however, that there are multiple, differently sized reference products for which burn‐in studies are performed. In this paper, we present a novel model for estimating the failure probability of a chip, which is capable of handling burn‐in studies on multiple reference products. We discuss the model from a combinatorial and a Bayesian perspective; both approaches are shown to provide more accurate estimation results in comparison with a simple area‐based approach. Moreover, we discuss the required modifications of the model if the observed failures are tackled by countermeasures implemented in the chip production process. Finally, the model is applied to the problem of determining the failure probabilities of follower products on the basis of multiple reference products.
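The simple area-based approach mentioned above can be written in a couple of lines: assume a constant defect density, so survival probabilities scale with the chip-area ratio. The sketch below is that baseline with hypothetical numbers, not the paper's combinatorial or Bayesian model.

```python
def scale_failure_prob(p_ref, area_ref, area_follower):
    """Simple area-based scaling of an early life failure probability:
    assuming a constant defect density, survival probabilities scale as
    (1 - p_ref) ** (area_follower / area_ref)."""
    return 1.0 - (1.0 - p_ref) ** (area_follower / area_ref)

# Hypothetical: 30 ppm on a 50 mm^2 reference chip, scaled to 70 mm^2
p_follower = scale_failure_prob(30e-6, 50.0, 70.0)
print(f"follower failure probability ~ {p_follower * 1e6:.1f} ppm")
```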

12.
Synthesized E-Bayesian estimation of the failure rate
This paper proposes a new estimation method for reliability parameters: the synthesized E-Bayesian estimation method. In the zero-failure data case, the definition of the E-Bayesian estimate of the failure rate is given, together with the estimate itself. After failure information is introduced, the E-Bayesian estimate of the failure rate is derived again, and on this basis the synthesized E-Bayesian estimates of the failure rate and of the other parameters are obtained. Finally, computations on a practical problem show that the proposed method is feasible and convenient to apply.
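One common concrete version of the zero-failure E-Bayesian estimate (an assumption here, not necessarily this paper's exact setup) takes a Beta(1, b) prior for the failure probability and a Uniform(1, c) hyperprior for b; averaging the Bayes estimate 1/(1 + b + n) over b yields a closed form:

```python
import numpy as np

def e_bayes_zero_failure(n, c=5.0):
    """E-Bayesian estimate of a failure probability from zero-failure data.

    Assumed setup (one common variant): p ~ Beta(1, b) prior with hyperprior
    b ~ Uniform(1, c). With 0 failures in n trials, the Bayes estimate is
    1/(1 + b + n); integrating over b gives the closed form below.
    """
    return np.log((n + 1 + c) / (n + 2)) / (c - 1)

for n in [10, 50, 200]:
    print(f"n = {n:4d}: p_hat = {e_bayes_zero_failure(n):.5f}")
```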

13.
Since the late 1970s, various software reliability growth models (SRGMs) have been developed to estimate different measures related to software quality, such as the number of remaining faults, the software failure rate, reliability, cost, and release time. Most of the existing SRGMs are probabilistic and have been developed under various assumptions. The entire software development process is performed by human beings, and software can be executed in different environments. Because human behavior is fuzzy and the environment is changing, fuzzy set theory is applicable to developing software reliability models. In this paper, two fuzzy-time-series-based software reliability models are proposed: the first predicts the time between failures (TBF) of software, and the second predicts the number of errors present in the software. Both models treat the software failure data as a linguistic variable. The usefulness of the models is demonstrated using real failure data.

14.
It is often the case that some information is available on the parameter of a failure time distribution from previous experiments or analyses of failure time data. The Bayesian approach provides the methodology for incorporating this previous information into the current data analysis. In this paper, given a progressively type-II censored sample from a Rayleigh distribution, Bayesian estimators and credible intervals are obtained for the parameter and the reliability function. We also derive the Bayes predictive estimator and the highest posterior density prediction interval for future observations. Two numerical examples are presented for illustration, and a simulation study and comparisons are performed. Copyright © 2006 John Wiley & Sons, Ltd.
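A minimal conjugate version of this analysis is easy to sketch. With the Rayleigh density written as f(t) = 2λt·exp(-λt²), a progressively type-II censored sample contributes the likelihood λ^m·exp(-λ·Σ(1 + R_i)t_i²), so a Gamma prior on λ is conjugate. The data and the Gamma(1, 1) prior below are hypothetical.

```python
import numpy as np
from scipy.stats import gamma

def rayleigh_progressive_bayes(t, R, a=1.0, b=1.0, t0=1.0, level=0.95):
    """Conjugate Bayesian analysis of a Rayleigh sample under progressive
    type-II censoring (hedged sketch; Gamma(a, b) prior on lam, where the
    density is f(t) = 2*lam*t*exp(-lam*t**2)).

    t: observed failure times; R[i]: units removed at the i-th failure.
    Posterior: lam | data ~ Gamma(a + m, rate = b + sum((1 + R_i) * t_i**2)).
    """
    t, R = np.asarray(t, float), np.asarray(R, float)
    m, W = len(t), np.sum((1.0 + R) * t ** 2)
    post = gamma(a + m, scale=1.0 / (b + W))
    lam_hat = post.mean()
    lo, hi = post.ppf([(1 - level) / 2, (1 + level) / 2])
    # posterior mean of the reliability R(t0) = exp(-lam * t0**2)
    rel_hat = ((b + W) / (b + W + t0 ** 2)) ** (a + m)
    return lam_hat, (lo, hi), rel_hat

t = [0.19, 0.78, 0.96, 1.31, 2.78]      # hypothetical failure times
R = [0, 0, 1, 0, 2]                     # progressive censoring scheme
lam, ci, rel = rayleigh_progressive_bayes(t, R)
print(f"lam = {lam:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f}), R(1) = {rel:.3f}")
```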

15.
In reliability theory, the notion of monotone failure rates plays a central role. When prior information indicates that such monotonicity is meaningful, it must be incorporated into the prior distribution whenever inference about the failure rates needs to be made. In this paper we show how this can be done in a straightforward and intuitively pleasing manner. The time interval is partitioned into subintervals of equal width, and the numbers of failures and censorings in each interval are recorded. By defining a Dirichlet distribution as the joint prior for the forward or backward differences of the conditional probabilities of survival in each interval, we find that monotonicity is preserved in the posterior estimate of the failure rates. A posterior estimate of the survival function can also be obtained. We illustrate our method by applying it to real medical data.
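The construction can be sketched directly: placing a Dirichlet prior on the nonnegative differences of the conditional failure probabilities makes every prior, and hence every posterior, draw monotone by construction. The sketch below, with hypothetical hyperparameters, draws such an increasing sequence and the implied survival function.

```python
import numpy as np

rng = np.random.default_rng(7)

def sample_monotone_hazard(alpha, q_max=0.9):
    """Draw an increasing sequence of conditional failure probabilities.

    Putting a Dirichlet prior on the forward differences d_i >= 0 and
    setting q_i = q_max * (d_1 + ... + d_i) guarantees q_1 <= ... <= q_k,
    so monotonicity is built into every draw.
    """
    d = rng.dirichlet(alpha)
    return q_max * np.cumsum(d)

q = sample_monotone_hazard(alpha=[1.0] * 6)
S = np.cumprod(1.0 - q)                 # survival function over the intervals
print("conditional failure probs:", np.round(q, 3))
print("survival function:        ", np.round(S, 3))
```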

16.
Increasingly large volumes of space–time data are collected everywhere by mobile computing applications, and in many of these cases, temporal data are obtained by registering events, for example, telecommunication or Web traffic data. Having both the spatial and temporal dimensions adds substantial complexity to data analysis and inference tasks. The computational complexity increases rapidly for fitting Bayesian hierarchical models, as such a task involves repeated inversion of large matrices. The primary focus of this paper is on developing space–time autoregressive models under the hierarchical Bayesian setup. To handle large data sets, a recently developed Gaussian predictive process approximation method is extended to include autoregressive terms of latent space–time processes. Specifically, a space–time autoregressive process, supported on a set of a smaller number of knot locations, is spatially interpolated to approximate the original space–time process. The resulting model is specified within a hierarchical Bayesian framework, and Markov chain Monte Carlo techniques are used to make inference. The proposed model is applied for analysing the daily maximum 8‐h average ground level ozone concentration data from 1997 to 2006 from a large study region in the Eastern United States. The developed methods allow accurate spatial prediction of a temporally aggregated ozone summary, known as the primary ozone standard, along with its uncertainty, at any unmonitored location during the study period. Trends in spatial patterns of many features of the posterior predictive distribution of the primary standard, such as the probability of noncompliance with respect to the standard, are obtained and illustrated. Copyright © 2012 John Wiley & Sons, Ltd.

17.
In the competing risks/multiple decrement model, the joint distribution is often not identifiable given only the observed time of failure and the cause of failure. The traditional approach is consequently to assume a parametric model. In this paper we shall not do this, but rather take a Bayesian stance, use a Dirichlet process as the prior distribution, and then calculate the posterior distribution given the data. We show that in dimensions ≥ 2, the posterior mean yields an inconsistent estimator of the joint probability law, contrary to the common assumption that the prior law ‘washes out’ with large samples. For single decrement mortality tables, however, the nonparametric Bayesian method allows a flexible way of adjusting a standard mortality table to reflect mortality experience, or covariate information.

18.
We analyze the reliability of NASA composite pressure vessels by using a new Bayesian semiparametric model. The data set consists of lifetimes of pressure vessels, wrapped with a Kevlar fiber, grouped by spool, subject to different stress levels; 10% of the data are right censored. The model that we consider is a regression on the log‐scale for the lifetimes, with fixed (stress) and random (spool) effects. The prior of the spool parameters is nonparametric, namely they are a sample from a normalized generalized gamma process, which encompasses the well‐known Dirichlet process. The nonparametric prior is assumed to robustify inferences to misspecification of the parametric prior. Here, this choice of likelihood and prior yields a new Bayesian model in reliability analysis. Via a Bayesian hierarchical approach, it is easy to analyze the reliability of the Kevlar fiber by predicting quantiles of the failure time when a new spool is selected at random from the population of spools. Moreover, for comparative purposes, we review the most interesting frequentist and Bayesian models analyzing this data set. Our credibility intervals of the quantiles of interest for a new random spool are narrower than those derived by previous Bayesian parametric literature, although the predictive goodness‐of‐fit performances are similar. Finally, as an original feature of our model, by means of the discreteness of the random‐effects distribution, we are able to cluster the spools into three different groups. Copyright © 2012 John Wiley & Sons, Ltd.

19.
We consider systems of serial-parallel elements with time to failure following an exponential reliability law. Assuming a testing plan with a bounded number of failures, we obtain an analytical representation of the exact lower confidence limit for the probability of failure-free operation of the system. Translated from Statisticheskie Metody, pp. 164–168, 1980.
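The system-level limit derived in the paper is not reproduced here, but its classical single-subsystem building block is standard: for a time-truncated exponential test with r failures in T unit-hours, θ_L = 2T/χ²(γ; 2r + 2) is a one-sided lower confidence limit on the mean life, giving R_L(t) = exp(-t/θ_L). A sketch with hypothetical numbers:

```python
import numpy as np
from scipy.stats import chi2

def exp_reliability_lcl(total_time, n_failures, t_mission, conf=0.90):
    """Classical lower confidence limit for exponential reliability.

    For a time-truncated test with `n_failures` failures in `total_time`
    unit-hours, a one-sided lower limit on the mean life is
    theta_L = 2*total_time / chi2(conf; 2*n_failures + 2), and the
    reliability limit follows as R_L = exp(-t_mission / theta_L).
    """
    theta_l = 2.0 * total_time / chi2.ppf(conf, 2 * n_failures + 2)
    return np.exp(-t_mission / theta_l)

# Hypothetical: 5000 unit-hours of testing, 2 failures, 100-hour mission
print(f"90% lower limit on R(100): {exp_reliability_lcl(5000, 2, 100):.3f}")
```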

20.
This paper focuses on the estimation of some models in finance, in particular for interest rates. We analyse discretized versions of constant elasticity of variance (CEV) models in which the normal law appearing in the usual discretization of the diffusion part is replaced by a range of heavy‐tailed distributions. A further extension of the model is to treat the elasticity of variance as a parameter itself. This generalized model allows great flexibility in modelling and simplifies implementation considerably via the scale-mixture representation. The mixing parameters provide a means to identify possible outliers and protect inference by down‐weighting the distorting effects of these outliers. For parameter estimation, a Bayesian approach is adopted and implemented using the software WinBUGS (Bayesian inference Using Gibbs Sampling). Results from a real data analysis show that an exponential power distribution with a random shape parameter, which is highly leptokurtic compared with the normal distribution, forms the best CEV model for the data. Copyright © 2006 John Wiley & Sons, Ltd.
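A discretized CEV model with heavy tails is straightforward to simulate via the scale-mixture representation mentioned above: replacing the normal innovation by z/sqrt(w) with w ~ Gamma(ν/2, ν/2) yields Student-t(ν) shocks, and small mixing weights w flag potential outliers. The sketch below uses hypothetical parameters; the abstract's preferred exponential power version is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_cev(r0=0.05, alpha=0.01, beta=-0.2, sigma=0.1, gamma_=1.0,
                 nu=5.0, dt=1.0 / 252, n=2520):
    """Euler-discretized CEV short-rate path with heavy-tailed innovations.

    dr = (alpha + beta*r) dt + sigma * r**gamma_ dW, with the normal
    innovation replaced by a Student-t(nu) built from its scale-mixture
    representation: eps = z / sqrt(w), z ~ N(0,1), w ~ Gamma(nu/2, nu/2).
    The mixing variable w flags potential outliers (small w = heavy shock).
    """
    r = np.empty(n + 1); r[0] = r0
    w = rng.gamma(nu / 2.0, 2.0 / nu, size=n)    # mean-one mixing weights
    z = rng.standard_normal(n)
    for i in range(n):
        eps = z[i] / np.sqrt(w[i])
        r[i + 1] = max(r[i] + (alpha + beta * r[i]) * dt
                       + sigma * r[i] ** gamma_ * np.sqrt(dt) * eps, 1e-8)
    return r, w

path, w = simulate_cev()
print(f"final rate: {path[-1]:.4f}; heaviest shock weight: {w.min():.3f}")
```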
