Similar Articles
20 similar articles found (search time: 0 ms)
2.
Exact confidence intervals and regions are proposed for the location and scale parameters of the Rayleigh distribution. These sets are valid for complete data, and also for standard and progressive failure censoring. Constrained optimization problems are solved to find the minimum-size confidence sets for the Rayleigh parameters with the required confidence level. The smallest-area confidence region is derived by simultaneously solving four nonlinear equations. Three numerical examples regarding remission times of leukemia patients, strength data and the titanium content in an aircraft-grade alloy, as well as a simulation study, are included for illustrative purposes. Further applications in hypothesis testing and the construction of pointwise and simultaneous confidence bands are also pointed out.
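For the complete-data case, an exact interval for the Rayleigh scale parameter can be sketched from the standard chi-square pivot sum(x_i^2)/sigma^2 ~ chi2(2n). This is a minimal illustration only, not the paper's minimum-size construction, and the function name and data are hypothetical:

```python
# Exact two-sided CI for the Rayleigh scale sigma from a complete sample,
# using the pivot sum(x_i^2)/sigma^2 ~ chi2(2n).
from scipy.stats import chi2

def rayleigh_scale_ci(data, alpha=0.05):
    n = len(data)
    s = sum(x * x for x in data)                      # pivot numerator
    lo = (s / chi2.ppf(1 - alpha / 2, 2 * n)) ** 0.5  # divide by upper quantile
    hi = (s / chi2.ppf(alpha / 2, 2 * n)) ** 0.5      # divide by lower quantile
    return lo, hi
```

Since the chi-square quantiles bracket the degrees of freedom 2n, the interval always contains the point estimate sqrt(s/(2n)).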

3.
The coefficient of variation (CV) of a population is defined as the ratio of the population standard deviation to the population mean. It is regarded as a measure of stability or uncertainty, and can indicate the relative dispersion of data in the population to the population mean. CV is a dimensionless measure of scatter or dispersion and is readily interpretable, as opposed to other commonly used measures such as standard deviation, mean absolute deviation or error factor, which are only interpretable for the lognormal distribution. CV is often estimated by the ratio of the sample standard deviation to the sample mean, called the sample CV. Even for the normal distribution, the exact distribution of the sample CV is difficult to obtain, and hence it is difficult to draw inferences regarding the population CV in the frequentist framework. Different methods of estimating the sample standard deviation as well as the sample mean result in different shapes of the sampling distribution of the sample CV, from which inferences about the population CV can be made. In this paper we propose a simulation-based Bayesian approach to tackle this problem. A set of real data is used to generate the sampling distribution of the CV under the assumption that the data follow the three-parameter Gamma distribution. A probability interval is then constructed. The method also applies easily to lognormal and Weibull distributions.
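The simulation-based idea can be illustrated with a much simpler stand-in: a plain nonparametric bootstrap of the sample CV (the paper itself uses a Bayesian scheme under a three-parameter Gamma model; the data and function name here are hypothetical):

```python
# Bootstrap probability interval for the population CV: resample the data,
# recompute the sample CV = s / x-bar each time, and take empirical quantiles.
import random
import statistics

def bootstrap_cv_interval(data, n_boot=2000, level=0.95, seed=1):
    rng = random.Random(seed)
    cvs = []
    for _ in range(n_boot):
        resample = [rng.choice(data) for _ in data]
        m = statistics.mean(resample)
        cvs.append(statistics.stdev(resample) / m)   # sample CV of the resample
    cvs.sort()
    k = int((1 - level) / 2 * n_boot)                # tail counts on each side
    return cvs[k], cvs[-k - 1]
```

The interval endpoints are simply the empirical 2.5% and 97.5% quantiles of the simulated CV distribution.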

4.
Based on progressively type-II censored samples, this paper considers progressive stress accelerated life tests when the lifetime of an item under use conditions follows the Weibull distribution with a scale parameter satisfying the inverse power law. It is assumed that the progressive stress is directly proportional to time and that the cumulative exposure model for the effect of changing stress holds. Point estimation of the model parameters is obtained graphically, using a Weibull probability paper plot that also serves as a tool for model identification, and by the maximum likelihood method. Interval estimation is performed by finding approximate confidence intervals (CIs) for the parameters, as well as studentized-t and percentile bootstrap CIs. A Monte Carlo simulation study is carried out to investigate the precision of the estimates and to compare the performance of the CIs obtained. Finally, two examples are presented to illustrate our results.

5.
This work deals with an asymptotic almost-sure representation of the quantile process under type-II progressive censoring. A convergence rate of the law-of-the-iterated-logarithm type is obtained for this representation. To cite this article: S. Alvarez-Andrade, C. R. Acad. Sci. Paris, Ser. I 347 (2009).

6.
Accelerated life testing (ALT) provides a feasible and efficient way to obtain lifetime information on products quickly by testing them at higher-than-use operating conditions. In this paper, the lifetime of products is assumed to follow a lower truncated family of distributions. When both the resilience and threshold parameters are nonconstant and affected by operating stress, inference is discussed for simple constant-stress ALT under progressive Type-II censoring. Point estimates for the unknown parameters are presented based on maximum likelihood and pivotal-quantity-based estimation methods. In addition, generalized, asymptotic and bootstrap confidence intervals for the parameters of interest are constructed. Simulation studies and illustrative examples are carried out to investigate the performance of the proposed methods.

7.
This paper focuses on the problem of the estimation of a distribution on an arbitrary complete separable metric space when the data points are subject to censoring by a general class of random sets. If the censoring mechanism is either totally observable or totally ordered, a reverse probability estimator may be defined in this very general framework. Functional central limit theorems are proven for the estimator when the underlying space is Euclidean. Applications are discussed, and the validity of bootstrap methods is established in each case.

8.
Shiang-Tai Liu, TOP, 2016, 24(1):1-18
The coefficient of variation is a useful statistical measure that has long been widely used in many areas. In real-world applications, there are situations where the observations are inexact and imprecise in nature and have to be estimated. This paper investigates the sample coefficient of variation (CV) with uncertain observations, which are represented by interval values. Since the observations are interval-valued, the derived CV should be interval-valued as well. A pair of mathematical programs is formulated to calculate the lower and upper bounds of the CV. As initially formulated, the pair of mathematical programs consists of nonlinear fractional programming problems, which are not guaranteed to have globally optimal solutions. By model reduction and variable substitution, the mathematical programs are transformed into a pair of quadratic programs. Solving the pair of quadratic programs produces the global optimum solutions and constructs the interval of the CV. The given example shows that the proposed model is indeed able to help a manufacturer select the most suitable manufacturing process with interval-valued observations.
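The interval-valued CV problem can be approximated numerically without the paper's exact quadratic-program reformulation: run a bounded local optimizer on the sample CV from a few start points. The bounds below are therefore approximate, and all names are illustrative:

```python
# Approximate lower/upper bounds of the sample CV when each observation x_i
# is only known to lie in an interval [lows[i], highs[i]].
import numpy as np
from scipy.optimize import minimize

def cv(x):
    x = np.asarray(x, dtype=float)
    return x.std(ddof=1) / x.mean()          # sample CV = s / x-bar

def cv_bounds(lows, highs):
    bounds = list(zip(lows, highs))
    lo_arr, hi_arr = np.array(lows, float), np.array(highs, float)
    starts = [lo_arr, hi_arr, (lo_arr + hi_arr) / 2]
    best_lo, best_hi = np.inf, -np.inf
    for x0 in starts:
        r_min = minimize(cv, x0, bounds=bounds, method="L-BFGS-B")
        r_max = minimize(lambda x: -cv(x), x0, bounds=bounds, method="L-BFGS-B")
        # keep the best value seen, including the start point itself
        best_lo = min(best_lo, cv(r_min.x), cv(x0))
        best_hi = max(best_hi, cv(r_max.x), cv(x0))
    return best_lo, best_hi
```

Unlike the quadratic-program route, a local optimizer can in principle miss the global bound, which is why several start points are used.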

9.
This paper focuses on the problem of estimating a bivariate distribution when the data points are subject to censoring by a general class of random sets. A path-dependent estimator for the distribution is proposed. The estimator is sequential in the sense that at any fixed point, it depends only on the data preceding the point. If the censoring mechanism is totally ordered, the paths may be chosen in such a way that the estimate of the distribution is an increasing function. In this case, a functional central limit theorem is proven for the estimator. Applications are discussed, and the validity of bootstrap methods is established.

10.
A new weighted version of the Gompertz distribution is introduced. It is noted that the model represents a mixture of the classical Gompertz density and the density of the second upper record value of the Gompertz distribution, and that under a certain transformation it yields a new version of the two-parameter Lindley distribution. The model can also be regarded as a dual member of the log-Lindley-X family. Various properties of the model are obtained, including the hazard rate function, moments, moment generating function, quantile function, skewness, kurtosis, conditional moments, mean deviations, several types of entropy, mean residual lifetime and stochastic orderings. Estimation of the model parameters by the method of maximum likelihood is justified. Two real data sets are used to assess the performance of the model against some classical and recent distributions using several goodness-of-fit statistics. The variance-covariance matrix, confidence intervals for the parameters and some theoretical measures are calculated for these data under the proposed model, with discussion.

11.
In this paper we consider stochastic programming problems where the objective function is given as an expected value function. We discuss Monte Carlo simulation based approaches to the numerical solution of such problems. In particular, we discuss in detail, and present numerical results for, two-stage stochastic programming with recourse where the random data have a continuous (multivariate normal) distribution. We think that the novelty of the numerical approach developed in this paper is twofold. First, various variance reduction techniques are applied in order to enhance the rate of convergence. Successful application of those techniques is what makes the whole approach numerically feasible. Second, statistical inference is developed and applied to estimation of the error, validation of optimality of a calculated solution, and statistically based stopping criteria for an iterative algorithm. © 1998 The Mathematical Programming Society, Inc. Published by Elsevier Science B.V. Supported by CNPq (Conselho Nacional de Desenvolvimento Científico e Tecnológico), Brasília, Brazil, through a Doctoral Fellowship under grant 200595/93-8.
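The combination of sample-average approximation with variance reduction can be illustrated on a toy expected-value objective (not the paper's two-stage recourse model): estimate g(x) = E[(D - x)^+] with lognormal demand D = exp(mu + sigma*Z), using antithetic variates. All names and the demand model are illustrative assumptions:

```python
# SAA estimate of g(x) = E[max(D - x, 0)] with antithetic variates:
# each standard normal draw z is paired with -z, and the two function
# values are averaged, which reduces the variance of the sample mean.
import math
import random

def saa_antithetic(x, mu=0.0, sigma=1.0, n_pairs=5000, seed=7):
    rng = random.Random(seed)
    vals = []
    for _ in range(n_pairs):
        z = rng.gauss(0.0, 1.0)
        h1 = max(math.exp(mu + sigma * z) - x, 0.0)   # evaluate at z
        h2 = max(math.exp(mu - sigma * z) - x, 0.0)   # ... and at -z
        vals.append(0.5 * (h1 + h2))
    m = sum(vals) / n_pairs
    var = sum((v - m) ** 2 for v in vals) / (n_pairs - 1)
    return m, var / n_pairs    # point estimate and variance of the estimator
```

The returned variance estimate is exactly the kind of quantity the paper's statistical inference uses for error estimation and stopping criteria.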

12.
Thomas and Wilson (Technometrics 14 (1972) 679) developed a computational method for calculating the single and product moments of order statistics from progressively censored samples by making use of the corresponding moments of the usual order statistics. The absence of an explicit representation for the marginal and joint density function of order statistics under progressive censoring makes their method extremely tedious. By deriving the required marginal and joint density functions in explicit form, we obtain an alternative, highly efficient, method for computing the desired moments.
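As a sanity check on either computational method, the single moments can also be estimated by directly simulating the progressive Type-II censoring experiment: observe each failure, then withdraw R_i surviving units at random. A hedged sketch for unit-exponential lifetimes (function names are illustrative):

```python
# Monte Carlo estimates of E[X_{i:m:n}] under progressive Type-II censoring
# with removal scheme (R_1, ..., R_m), sum(R_i) + m = n.
import random

def progressive_sample(n, scheme, rng, draw):
    alive = [draw() for _ in range(n)]
    observed = []
    for r in scheme:
        x = min(alive)                       # next observed failure
        observed.append(x)
        alive.remove(x)
        for _ in range(r):                   # withdraw r survivors at random
            alive.pop(rng.randrange(len(alive)))
    return observed

def moment_estimates(n, scheme, reps=20000, seed=3):
    rng = random.Random(seed)
    draw = lambda: rng.expovariate(1.0)      # unit-exponential lifetimes
    sums = [0.0] * len(scheme)
    for _ in range(reps):
        for i, x in enumerate(progressive_sample(n, scheme, rng, draw)):
            sums[i] += x
    return [s / reps for s in sums]
```

For the exponential distribution the exact means follow from the independent normalized spacings, which gives a closed-form check on the simulation.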

14.
Observation of lifetimes by means of cross-sectional surveys typically results in left-truncated, right-censored data. In some applications, it may be assumed that the truncation variable is uniformly distributed on some time interval, leading to so-called length-biased sampling. This information is relevant, since it allows for more efficient estimation of survival and related parameters. In this work we introduce and analyze new empirical methods in the referred scenario, when the sampled lifetimes are at risk of Type I censoring from the right. We illustrate the method with real economic data. Work supported by Grants PGIDIT02PXIA30003PR and BFM2002-03213.

15.
Censoring models are frequently used in reliability analysis to reduce experimental time. The three main types of censoring models are type-I, type-II and random censoring. In this study, we focus on the right random censoring model, in which a failure time that exceeds its associated censoring time becomes a censored observation. Many authors (see Lee, Statistical Methods for Survival Data Analysis, 2nd Edition, Wiley, New York, 1992; Lawless, Statistical Models and Methods for Lifetime Data, Wiley, New York, 1982; Miller, Survival Analysis, Wiley, New York, 1981, among others) have considered using the observed censoring time to impute the censored observation, which, however, underestimates the true failure time. Herein, two methods to impute the censored observations in a right random censoring model are proposed for the two-parameter Weibull distribution. In a Monte Carlo simulation, quantile estimates are calculated to assess the performance of the proposed imputation methods with respect to their relative mean square error. Simulation results indicate that the two imputation methods proposed herein are superior to the conventional censoring-time imputation if the shape parameter of the Weibull distribution exceeds 1, except for the lower quantiles.
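One natural imputation idea of this kind (a hedged sketch, not necessarily either of the paper's two exact estimators) replaces a censored value c by the Weibull conditional mean E[T | T > c] = c + (integral of S(t) from c to infinity) / S(c), with S(t) = exp(-(t/scale)^shape):

```python
# Conditional-mean imputation for a right-censored Weibull observation:
# always larger than the naive imputation by the censoring time c itself.
import math
from scipy.integrate import quad

def conditional_mean_impute(c, shape, scale):
    surv = lambda t: math.exp(-((t / scale) ** shape))   # Weibull survival fn
    tail, _ = quad(surv, c, math.inf)                    # integral of S over (c, inf)
    return c + tail / surv(c)
```

For shape = 1 the Weibull reduces to the exponential, where memorylessness gives the closed form c + scale, a convenient correctness check.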

16.
This paper considers estimation of the unknown parameter of the Rayleigh distribution under Type II progressive censoring with binomial removals, where the number of units removed at each failure time follows a binomial distribution. Maximum likelihood and Bayesian procedures are used to derive both point and interval estimates of the parameters involved in the model. The expected termination time of the censoring test is computed and analyzed under the binomial censoring scheme. Numerical examples based on Monte Carlo simulation are given to illustrate the approach, and a real-life data set is used for further illustration.
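Conditionally on the realized removals, the Rayleigh likelihood under progressive Type-II censoring has a closed-form maximizer: with density f(x) = (x/sigma^2) exp(-x^2/(2 sigma^2)) and removals R_i after the i-th of m failures, sigma_hat^2 = sum((1 + R_i) x_i^2) / (2m). A minimal sketch of this standard MLE (function name is illustrative):

```python
# Closed-form MLE of the Rayleigh scale under progressive Type-II censoring:
# each observed failure x_i contributes x_i^2, and each of its R_i withdrawn
# survivors contributes x_i^2 through the survival-function term.
import math

def rayleigh_mle(failures, removals):
    m = len(failures)
    s = sum((1 + r) * x * x for x, r in zip(failures, removals))
    return math.sqrt(s / (2 * m))
```

With all removals zero this reduces to the complete-sample MLE sqrt(sum(x_i^2)/(2n)).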

17.
In many real-world problems, the design space is huge and the estimation of performance measures has to rely on simulation, which is time-consuming. Hence, finding the optimal design in the design space based on simulation output is not trivial. It is important to have a computing-time allocation rule to decide how much effort to spend sampling the design space, how many designs to sample, and how long to run each design alternative within a given computing budget. In this paper, we propose a framework for making these allocation decisions. We use the problem of assemble-to-order (ATO) systems to demonstrate how this framework can be applied. The sample average approximation (SAA) method is chosen as the sampling method in this application example. The numerical results show that this framework provides a good basis for allocation decisions.

18.
This paper describes a simulation-based project to help North Mersey Community National Health Service Trust design and plan the operation of an NHS Walk-in Centre. The simulation model developed for this multi-service facility helped managers and health professionals recognize existing and potential future problems, and investigate ideas for their 'solution'. In the fast-moving NHS, where initiatives to improve access such as walk-in centres are a recent development and no two centres are the same, ideas for best practice borrowed from elsewhere can be quickly tested for suitability in the local situation.

19.
Conditional inference about the mean of an inverse Gaussian distribution with known coefficient of variation is discussed. For a random sample from the distribution, the sufficient statistics with respect to the mean parameter include an ancillary statistic. The effects of conditioning on the ancillary statistic are investigated. It is shown that the model provides a good illustration of R. A. Fisher's recommendation concerning use of the observed second derivative of the log-likelihood function in normal approximations. This work was started while Kosei Iwase was visiting the Institute of Statistical Mathematics in Spring 1987, and was partly supported by the ISM Cooperative Research Program (88-ISM·CRP-7) and by Scientific Research Fund No. 62540173 from the Ministry of Education, Science and Culture of Japan.

20.
Joint progressive censoring schemes are quite useful for conducting comparative life-testing experiments on competing products. Recently, Mondal and Kundu ("A New Two Sample Type-II Progressive Censoring Scheme," Commun Stat Theory Methods; 2018) introduced a joint progressive censoring scheme on two samples known as the balanced joint progressive censoring (BJPC) scheme. Optimal planning of such a progressive censoring scheme is an important issue for the experimenter. This article considers optimal life-testing plans under the BJPC scheme using the Bayesian precision and D-optimality criteria, assuming that the lifetimes follow the Weibull distribution. To obtain the optimal BJPC life-testing plans, one needs to carry out an exhaustive search within the set of all admissible plans; however, for large sample sizes, determination of the optimal plan by exhaustive search is difficult. A metaheuristic algorithm based on the variable neighborhood search method is therefore employed to compute the optimal life-testing plan. Optimal plans are provided under different scenarios; they depend on the values of the hyperparameters of the prior distribution, and the effect of different prior information on the optimal scheme is studied.
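The variable-neighborhood-search idea can be sketched generically for this kind of discrete design problem: search over removal vectors R = (R_1, ..., R_m) with a fixed total, alternating random "shaking" in growing neighborhoods with a simple local search. The criterion below is a placeholder for the paper's Bayesian precision or D-optimality objective, and every name is illustrative:

```python
# Minimal VNS over integer vectors r with sum(r) = total: neighborhood k
# moves k units between stages; local search tries all single-unit moves.
import random

def vns(m, total, criterion, k_max=3, iters=200, seed=0):
    rng = random.Random(seed)

    def shake(r, k):
        r = r[:]
        for _ in range(k):                    # move k units between stages
            src = rng.choice([i for i in range(m) if r[i] > 0])
            dst = rng.randrange(m)
            r[src] -= 1
            r[dst] += 1
        return r

    def local_search(r):
        improved = True
        while improved:
            improved = False
            best, best_val = r, criterion(r)
            for i in range(m):                # best single-unit move, if any
                if r[i] == 0:
                    continue
                for j in range(m):
                    if i == j:
                        continue
                    cand = r[:]
                    cand[i] -= 1
                    cand[j] += 1
                    if criterion(cand) < best_val:
                        best, best_val, improved = cand, criterion(cand), True
            r = best
        return r

    cur = local_search([total] + [0] * (m - 1))
    for _ in range(iters):
        k = 1
        while k <= k_max:                     # escalate neighborhood on failure
            cand = local_search(shake(cur, k))
            if criterion(cand) < criterion(cur):
                cur, k = cand, 1              # success: restart from k = 1
            else:
                k += 1
    return cur
```

Exhaustive search enumerates all compositions of the total over m stages, which explodes combinatorially; VNS only ever evaluates the criterion along shaking and local-search trajectories.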
