Similar Documents (20 results)
1.
Step-stress accelerated degradation testing (SSADT) has become a common approach to predicting the lifetime of highly reliable products that are unlikely to fail in a reasonable time under use conditions, or even under elevated stress conditions. In the literature, the planning of SSADT has been widely investigated for stochastic degradation processes such as Wiener processes and gamma processes. In this paper, we model the optimal SSADT planning problem from a Bayesian perspective and optimize test plans by determining both the stress levels and the allocation of inspections. A large-sample approximation is used to derive the asymptotic Bayesian utility functions under three planning criteria. A revisited LED lamp example is presented to illustrate our method. A comparison with optimal plans from previous studies demonstrates the necessity of considering stress levels and inspection allocations simultaneously.
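As a hedged illustration of the kind of model this abstract refers to (not the authors' code), the sketch below simulates one step-stress Wiener degradation path in which the drift rises with each stress level through an assumed log-linear link; all parameter values are invented.

```python
# A minimal sketch: one step-stress Wiener degradation path. The log-linear
# stress link and all parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def ssadt_wiener_path(stress_levels, hours_per_level, dt=1.0,
                      a=-6.0, b=0.05, sigma=0.02):
    """Drift at stress s is exp(a + b*s) (assumed link); diffusion is constant."""
    t, y, path = 0.0, 0.0, [(0.0, 0.0)]
    for s in stress_levels:
        drift = np.exp(a + b * s)
        for _ in range(int(hours_per_level / dt)):
            y += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
            t += dt
            path.append((t, y))
    return np.array(path)

path = ssadt_wiener_path(stress_levels=[50, 65, 80], hours_per_level=500)
print(path[-1])  # (final time, final degradation level)
```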

2.
This paper explores inferential procedures for the Wiener constant-stress accelerated degradation model under degradation mechanism invariance. Exact confidence intervals are obtained for the parameters of the proposed accelerated degradation model. Generalized confidence intervals are also proposed for the reliability function and the pth quantile of the lifetime at the normal operating stress level. In addition, prediction intervals are developed for the degradation characteristic, lifetime, and remaining useful life of the product at the normal operating stress level. The performance of the proposed generalized confidence intervals and prediction intervals is assessed by Monte Carlo simulation. Furthermore, a new optimum criterion is proposed based on minimizing the mean of the upper prediction limit for the degradation characteristic at the design stress level, and the exact optimum plan is derived for the Wiener accelerated degradation model under this criterion. The proposed interval procedures and optimum plan are free of the equal-testing-interval assumption. Finally, two examples illustrate the proposed interval procedures and exact optimum plan: based on the degradation data of LEDs, interval estimates of quantities related to reliability indicators are obtained, and for the degradation data of carbon-film resistors, the optimal allocation of test units is derived under the proposed criterion.
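A hedged sketch of the generalized-confidence-interval idea for Wiener degradation models: generalized pivotal quantities (GPQs) for the drift and diffusion are built from observed increments and propagated through the inverse-Gaussian (first-passage) reliability. The data, failure threshold, and inspection interval below are fabricated.

```python
# GPQ construction for normal increments, then a 95% generalized CI for R(t0).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
dt, D = 1.0, 10.0                                        # interval, threshold
inc = rng.normal(0.05 * dt, 0.2 * np.sqrt(dt), size=30)  # fake increments

n, xbar, s2 = len(inc), inc.mean(), inc.var(ddof=1)
B = 20000
chi2 = rng.chisquare(n - 1, B)
z = rng.standard_normal(B)
sigma2_g = (n - 1) * s2 / chi2 / dt                      # GPQ for sigma^2
mu_g = xbar / dt - z * np.sqrt(sigma2_g * dt / n) / dt   # GPQ for drift mu

t0 = 150.0
# Wiener first-passage (inverse Gaussian) reliability at t0 for threshold D:
R_g = (stats.norm.cdf((D - mu_g * t0) / np.sqrt(sigma2_g * t0))
       - np.exp(2 * mu_g * D / sigma2_g)
       * stats.norm.cdf((-D - mu_g * t0) / np.sqrt(sigma2_g * t0)))
print(np.percentile(R_g, [2.5, 97.5]))  # generalized CI for R(t0)
```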

3.
Acceptance sampling has been one of the practical tools for quality assurance, providing the producer and the consumer with a general rule for determining product acceptance. It has been shown that variables sampling plans require less sampling than attributes sampling plans, so variables plans become more attractive and desirable, especially when the required quality level is very high or the allowable fraction nonconforming is very small. This paper develops an efficient and economical sampling scheme, the variables repetitive group sampling (RGS) plan, by incorporating the concept of the Taguchi loss function. The OC curve of the proposed plan is derived from the exact sampling distribution, and the plan parameters are determined by minimizing the average sample number subject to two constraints specified by the producer and the consumer. The efficiency of the proposed variables RGS plan is examined and compared with the existing variables single sampling plan in terms of the sample size required for inspection. In addition, tables of plan parameters for various combinations of entry parameters are provided, and an example is presented for illustration.
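Under a normal quality characteristic with known sigma and an upper specification limit, the RGS acceptance probability has a simple closed form; the sketch below computes the OC curve and average sample number for made-up plan parameters (n, ka, kr), not the paper's optimized values.

```python
# OC curve and ASN of a variables repetitive group sampling (RGS) plan.
import numpy as np
from scipy.stats import norm

def oc_rgs(p, n, ka, kr):
    """P(accept lot) at fraction nonconforming p under the RGS rule:
    accept if (U - xbar)/sigma >= ka, reject if < kr, else resample."""
    w = norm.ppf(1 - p)                       # standardized spec distance
    pa = norm.cdf(np.sqrt(n) * (w - ka))      # accept on a single group
    pr = 1 - norm.cdf(np.sqrt(n) * (w - kr))  # reject on a single group
    return pa / (pa + pr)

def asn_rgs(p, n, ka, kr):
    """Average sample number: n / P(a decision is reached per group)."""
    w = norm.ppf(1 - p)
    pa = norm.cdf(np.sqrt(n) * (w - ka))
    pr = 1 - norm.cdf(np.sqrt(n) * (w - kr))
    return n / (pa + pr)

for p in (0.001, 0.005, 0.01, 0.05):
    print(p, round(oc_rgs(p, n=12, ka=2.2, kr=1.8), 4),
          round(asn_rgs(p, n=12, ka=2.2, kr=1.8), 1))
```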

4.
In this paper, optimal constant-stress accelerated degradation test plans are developed under the assumption that the degradation characteristic follows a gamma process. The test stress levels and the proportion of units allocated to each stress level are determined by the D-criterion and the V-criterion. The general equivalence theorem (GET) is used to verify that the optimized test plans are globally optimum. In addition, compromise test plans are studied. Finally, an example is provided to illustrate the proposed method, and a sensitivity analysis is conducted to investigate the robustness of the optimal plans.
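A minimal sketch of gamma-process degradation at one stress level, with an assumed log-linear link between stress and the shape rate; the parameterization and values are illustrative, not the paper's.

```python
# Gamma-process degradation path: independent gamma-distributed increments.
import numpy as np

rng = np.random.default_rng(2)

def gamma_path(stress, n_steps=100, dt=1.0, a=-4.0, b=0.03, beta=0.5):
    """Increments ~ Gamma(shape=exp(a + b*stress)*dt, scale=beta)."""
    shape = np.exp(a + b * stress) * dt
    increments = rng.gamma(shape, beta, size=n_steps)
    return np.concatenate([[0.0], np.cumsum(increments)])

print(gamma_path(stress=80)[-1])  # terminal degradation under stress 80
```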

5.
Reliability sampling plans for exponentially distributed products under time-truncated (Type-I censored) testing
This paper presents a statistical method for constructing reliability sampling plans for exponentially distributed products under time-truncated (Type-I) censoring. The test statistic is the maximum likelihood estimator of the reciprocal of the mean lifetime. A method for choosing the censoring time is proposed. The sample size and the acceptance constant are determined approximately using the Cornish-Fisher expansion of the distribution quantiles. Simulation results show that the proposed method is feasible.
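The sketch below illustrates the generic second-order Cornish-Fisher quantile expansion mentioned in the abstract, applied to a chi-square distribution so the approximation can be checked against the exact quantile; it is not the paper's specific derivation.

```python
# Cornish-Fisher approximation of a quantile from the first four cumulants.
import numpy as np
from scipy.stats import norm, chi2

def cornish_fisher_quantile(p, mean, sd, skew, exkurt):
    """Second-order Cornish-Fisher approximation of the p-quantile."""
    z = norm.ppf(p)
    w = (z
         + (z**2 - 1) * skew / 6
         + (z**3 - 3*z) * exkurt / 24
         - (2*z**3 - 5*z) * skew**2 / 36)
    return mean + sd * w

k = 20  # chi-square degrees of freedom
approx = cornish_fisher_quantile(0.95, mean=k, sd=np.sqrt(2*k),
                                 skew=np.sqrt(8/k), exkurt=12/k)
print(approx, chi2.ppf(0.95, k))  # approximation vs exact quantile
```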

6.
Economical sampling plans for assuring the quality of products with Burr type XII distributed lifetimes were established using a truncated life test. A Bayesian inference method was used to address the lot-to-lot variation of products. The sampling plan is characterized by the sample size and the acceptance number that minimize the expected total cost. A simple empirical Bayesian estimation method was provided to estimate the hyperparameters of the prior distribution, and simulation studies were conducted to validate it. Lastly, the application of the proposed method is illustrated with two examples.
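One hedged way to see the Bayesian ingredient: if the fraction failing before the truncation time varies lot to lot with a Beta(a, b) prior, the failure count is beta-binomial and the expected acceptance probability has a closed form. Under Burr type XII lifetimes the failure fraction at truncation time t0 is 1 - (1 + t0^c)^(-k). All constants below are illustrative.

```python
# Expected acceptance probability under a beta prior on the failure fraction.
from scipy.stats import betabinom

def burr_xii_fail_prob(t0, c, k):
    """P(T <= t0) for Burr type XII with inner shape c and outer shape k."""
    return 1 - (1 + t0**c) ** (-k)

n, acc_num = 20, 2          # sample size and acceptance number (made up)
a, b = 2.0, 30.0            # assumed beta prior hyperparameters on p
print("mean failure prob:", a / (a + b))
print("avg P(accept):", betabinom.cdf(acc_num, n, a, b))
print("failure prob at t0 for one lot:", burr_xii_fail_prob(0.5, 2.0, 1.0))
```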

7.
In this paper, a statistical analysis method is proposed to study the life characteristics of products based on partially accelerated life testing. We discuss the statistical analysis of constant-stress partially accelerated life tests with the Lomax distribution based on interval-censored samples. The EM algorithm is used to obtain maximum likelihood estimates (MLEs) and interval estimates for the shape parameter and the acceleration factor. The average relative errors (AREs), mean square errors (MSEs), confidence intervals for the parameters, and the influence of the sample size are discussed. The results show that the AREs and MSEs of the MLEs decrease as the sample size increases. Finally, a simulated sample is used to estimate the reliability under different stress levels.
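The paper fits the model with the EM algorithm; as a simpler hedged stand-in, the sketch below maximizes the same kind of interval-censored Lomax log-likelihood directly. The data, the fixed scale parameter, and the starting value are all made up.

```python
# Direct maximization of an interval-censored Lomax log-likelihood.
import numpy as np
from scipy.optimize import minimize

def lomax_cdf(t, alpha, lam):
    return 1 - (1 + t / lam) ** (-alpha)

# (left, right) inspection intervals containing each failure (fake data)
intervals = np.array([(0.0, 2.0), (2.0, 4.0), (0.0, 2.0), (4.0, 8.0),
                      (2.0, 4.0), (0.0, 2.0), (4.0, 8.0), (8.0, 16.0)])

def neg_loglik(theta, lam=5.0):
    alpha = np.exp(theta[0])        # keep the shape parameter positive
    l, r = intervals[:, 0], intervals[:, 1]
    prob = lomax_cdf(r, alpha, lam) - lomax_cdf(l, alpha, lam)
    return -np.sum(np.log(prob))

res = minimize(neg_loglik, x0=[0.0], method="Nelder-Mead")
print("alpha_hat:", np.exp(res.x[0]))
```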

8.
This paper proposes a methodology for resource allocation and target setting based on DEA (data envelopment analysis). It deals with organizations that can be modeled as consisting of several production units, each of which has parallel production lines. Previous studies in the DEA literature only deal with reallocating/allocating organizational resources to production units and setting targets for them; in those studies, the production unit is treated as a black box, so how to arrange production at the production-unit level remains unclear. This paper generates a resource allocation and target setting plan for each production unit by opening the black box. The proposed model exploits production information of the production lines in generating production plans. The resulting plan has the following characteristics: (1) the performance of the production lines is evaluated under common weights; (2) the weights chosen for evaluation keep the efficiency of the entire unit from worsening; and (3) the worst-performing production line in the production unit under evaluation is improved as much as possible. Finally, real data on a production system extracted from the extant literature are used to demonstrate the proposed method.
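The paper's allocation model is not reproduced here; as a hedged sketch of the DEA machinery it builds on, the code below solves the basic input-oriented CCR efficiency linear program for fabricated unit data.

```python
# Input-oriented CCR DEA efficiency scores via linear programming.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0, 5.0],    # inputs:  rows = input type, cols = units
              [3.0, 2.0, 5.0, 4.0]])
Y = np.array([[1.0, 2.0, 3.0, 3.0]])   # outputs: rows = output type

def ccr_efficiency(j0):
    m, n = X.shape
    s = Y.shape[0]
    # decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(n + 1)
    c[0] = 1.0                                       # minimize theta
    A_ub = np.zeros((m + s, n + 1))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[:, j0]
    A_ub[:m, 1:] = X                                 # X @ lam <= theta * x0
    A_ub[m:, 1:] = -Y
    b_ub[m:] = -Y[:, j0]                             # Y @ lam >= y0
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.fun

for j in range(X.shape[1]):
    print(f"unit {j}: efficiency = {ccr_efficiency(j):.3f}")
```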

9.
A pension plan is said to be exactly vested if it provides, in addition to the benefit available upon retirement, a benefit upon termination for any cause prior to retirement that is exactly equivalent to the actuarial accrued liability for the terminating participant. The concept of exact vesting has a simple application in defined contribution plans such as those of the Teachers Insurance and Annuity Association. It is also feasible to develop the exact vesting concept for a defined benefit plan that uses an individual type of actuarial cost method. An exactly vested plan would have more individual equity than is available under the customary vesting and early retirement provisions of defined benefit plans. In this paper, theory is developed for an exactly vested model plan in parallel to the theory for a pure pension model plan discussed in previous papers on pension funding dynamics.
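A toy numeric illustration (not from the paper) of the exact-vesting idea: the termination benefit equals the actuarial accrued liability. Mortality and other decrements are ignored to keep the sketch minimal, and all inputs are invented.

```python
# Unit-credit accrued liability, the amount an exactly vested plan pays out.
def accrued_liability(age, entry_age, ret_age, salary, accrual_rate,
                      annuity_factor=12.0, interest=0.05):
    """Benefit earned to date, valued at retirement, discounted to current age."""
    service = age - entry_age
    accrued_benefit = accrual_rate * service * salary   # annual pension earned
    value_at_ret = accrued_benefit * annuity_factor
    return value_at_ret / (1 + interest) ** (ret_age - age)

# Exactly vested plan: a participant terminating at 45 takes this amount.
print(accrued_liability(age=45, entry_age=30, ret_age=65,
                        salary=60000, accrual_rate=0.015))
```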

10.
As a new reliability test plan, generalized progressive hybrid censoring can improve test efficiency by allowing experimenters to observe a pre-specified number of failures before the final termination point. Based on a widely used life distribution in life data analysis, the generalized exponential distribution, this paper discusses parameter inference under the generalized progressive hybrid censoring scheme. The EM algorithm is used to estimate the parameters of the considered model. Simulation studies and a real-data analysis are carried out to illustrate the finite-sample performance of the proposed procedure.
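A hedged sketch of one building block: simulating a progressively Type-II censored sample from a generalized exponential (GE) distribution via the Balakrishnan-Sandhu algorithm. The paper's generalized hybrid stopping rule is omitted, and the removal scheme and parameters are illustrative.

```python
# Progressive Type-II censored sample from a generalized exponential law.
import numpy as np

rng = np.random.default_rng(4)

def ge_ppf(u, alpha, lam):
    """Inverse CDF of GE: F(x) = (1 - exp(-lam*x))**alpha."""
    return -np.log(1 - u ** (1 / alpha)) / lam

def progressive_type2_sample(R, alpha, lam):
    """Balakrishnan-Sandhu algorithm for removal scheme R = (R_1,...,R_m)."""
    m = len(R)
    W = rng.uniform(size=m)
    V = W ** (1 / (np.arange(1, m + 1) + np.cumsum(R[::-1])))
    U = 1 - np.cumprod(V[::-1])         # uniform censored order statistics
    return ge_ppf(U, alpha, lam)

R = [2, 0, 1, 0, 2]                     # removals after each observed failure
print(progressive_type2_sample(R, alpha=1.5, lam=0.8))
```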

11.
This paper proposes a systematic method for modeling accelerated degradation data based on the acceleration factor constant principle. The Wiener stochastic process is considered because it is the most extensively used process for degradation modeling. For Wiener processes with three different time functions, the parameter relationships that must hold between any two stress levels are deduced from the acceleration factor constant principle. The deduced relationships identify the stress-related parameters, which are then used to establish accurate accelerated degradation models; they also provide guidance for testing the consistency of the degradation mechanisms under different stress levels. A hypothesis test based on analysis of variance is adopted to identify accelerated stress levels with a different degradation mechanism; degradation data from such stress levels should not be used to assess the product's reliability. Methods for validating accelerated degradation models and for reliability assessment are also proposed. Simulation results demonstrate the feasibility and effectiveness of the proposed methods, and a numerical example shows that the accelerated degradation model established from the acceleration factor constant principle is more credible and accurate.
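A small simulation check (assumed linear time function, invented parameters) of the principle's implication for Wiener models: if stress k accelerates time by a constant factor A relative to stress l, then both the drift and the squared diffusion scale by A, so the two ratios estimate the same constant.

```python
# Drift and diffusion ratios between two stress levels should both equal A.
import numpy as np

rng = np.random.default_rng(5)
A, mu, sig2, dt, n = 3.0, 0.02, 0.01, 1.0, 200000

inc_low = rng.normal(mu * dt, np.sqrt(sig2 * dt), n)
inc_high = rng.normal(mu * A * dt, np.sqrt(sig2 * A * dt), n)

print("drift ratio:    ", inc_high.mean() / inc_low.mean())            # ~ A
print("diffusion ratio:", inc_high.var(ddof=1) / inc_low.var(ddof=1))  # ~ A
```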

12.
In many reliability analyses, the probability of obtaining a defective unit in a production process should not be considered constant, even when the process is stable and in control. Engineering experience or previous data from similar or related products may often be used to select a proper prior model describing the random fluctuations in the fraction defective. A generalized beta family of priors, several maximum entropy priors, and other prior models are considered for this purpose. In order to determine the acceptability of a product based on the lifelengths of some test units, failure-censored reliability sampling plans for location-scale distributions are designed using average producer and consumer risks. Our procedure allows practitioners to incorporate a restricted parameter space into the reliability analysis, and it is reasonably insensitive to small disturbances in the prior information. Impartial priors are used to reflect prior neutrality between the producer and the consumer when a consensus on the elicited prior model is required; nonetheless, our approach also enables the producer and the consumer to assume their own prior distributions. The use of substantial prior information can, in many cases, significantly reduce the amount of testing required. However, the main advantage of utilizing a prior model for the fraction defective is not necessarily a reduced sample size but an improved assessment of the true sampling risks. An example involving shifted exponential lifetimes illustrates the results.
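As a reduced hedged sketch of the average-risk idea, the code below computes average producer and consumer risks for a simple attributes plan when the fraction defective follows a beta prior; the paper's failure-censored location-scale setting is not reproduced, and all constants are invented.

```python
# Average producer/consumer risks under a beta prior on the fraction defective.
from scipy import integrate
from scipy.stats import binom, beta

n, c = 50, 2            # plan: sample 50, accept if <= 2 defectives
AQL, RQL = 0.01, 0.08   # quality levels defining the two risks

def avg_risk(a, b, lo, hi, risk):
    """Average a risk function over the prior restricted to [lo, hi]."""
    num, _ = integrate.quad(lambda p: risk(p) * beta.pdf(p, a, b), lo, hi)
    den, _ = integrate.quad(lambda p: beta.pdf(p, a, b), lo, hi)
    return num / den

producer = avg_risk(2, 60, 0.0, AQL, lambda p: 1 - binom.cdf(c, n, p))
consumer = avg_risk(2, 60, RQL, 1.0, lambda p: binom.cdf(c, n, p))
print("average producer risk:", producer)
print("average consumer risk:", consumer)
```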

13.
14.
Joint progressive censoring schemes are quite useful for conducting comparative life-testing experiments on different competing products. Recently, Mondal and Kundu ("A New Two Sample Type-II Progressive Censoring Scheme," Commun Stat-Theory Methods, 2018) introduced a joint progressive censoring scheme on two samples known as the balanced joint progressive censoring (BJPC) scheme. Optimal planning of such a progressive censoring scheme is an important issue for the experimenter. This article considers optimal life-testing plans under the BJPC scheme using the Bayesian precision and D-optimality criteria, assuming that the lifetimes follow a Weibull distribution. Obtaining the optimal BJPC life-testing plan requires an exhaustive search within the set of all admissible plans; for large sample sizes this becomes impractical, so a metaheuristic algorithm based on the variable neighborhood search method is employed to compute the optimal plan. Optimal plans are provided under different scenarios. The optimal plans depend on the values of the hyperparameters of the prior distribution, and the effect of different prior information on the optimal scheme is studied.
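A schematic reduced variable-neighborhood-search loop of the kind the article employs; the objective below is a dummy stand-in for the Bayesian design criterion, and the neighborhood move (shifting removed units between failure stages) is an assumption.

```python
# Reduced VNS over censoring plans with a fixed total number of removals.
import random

random.seed(6)

def objective(plan):
    # dummy criterion that prefers early removals (illustration only)
    return sum(i * r for i, r in enumerate(plan))

def shake(plan, k):
    """Move k units between randomly chosen positions, keeping the total."""
    plan = plan[:]
    for _ in range(k):
        i, j = random.sample(range(len(plan)), 2)
        if plan[i] > 0:
            plan[i] -= 1
            plan[j] += 1
    return plan

def vns(plan, k_max=3, iters=200):
    best = plan
    for _ in range(iters):
        k = 1
        while k <= k_max:
            cand = shake(best, k)
            if objective(cand) < objective(best):
                best, k = cand, 1     # improvement: restart neighborhoods
            else:
                k += 1                # enlarge the neighborhood
    return best

print(vns([2, 2, 2, 2, 2]))          # ten removals over five failure stages
```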

15.
Lio et al (2010a, 2010b) introduced two single acceptance sampling plans (SASPs) for the percentiles of the Birnbaum-Saunders and Burr type XII distributions under a truncated censoring scheme, and showed that acceptance sampling plans for percentiles significantly improve the traditional ones based on mean life. In this paper, a double-sampling procedure is developed for Burr type XII distribution percentiles to save sample resources under a truncated censoring scheme. Minimum sample sizes for implementing the proposed double-sampling method are determined under specified levels of consumer's risk and producer's risk. Illustrative examples demonstrate the applications of the proposed method. Compared with the SASP, the proposed double-sampling acceptance plan uses fewer sample resources on average for truncated life testing.
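A hedged sketch of the double-sampling acceptance probability under a truncated life test, with the per-unit failure probability taken from a Burr type XII lifetime; the plan constants are illustrative, not the paper's tabulated minima.

```python
# Acceptance probability of a double sampling plan under truncated life testing.
from scipy.stats import binom

def accept_prob_double(p, n1, c1, n2, c2):
    """P(accept): pass at stage 1, or continue and pass on the combined count."""
    pa = binom.cdf(c1, n1, p)
    for d1 in range(c1 + 1, c2 + 1):
        pa += binom.pmf(d1, n1, p) * binom.cdf(c2 - d1, n2, p)
    return pa

def burr_fail_prob(t0, c, k):
    """P(T <= t0) for Burr type XII lifetimes."""
    return 1 - (1 + t0**c) ** (-k)

p0 = burr_fail_prob(t0=0.3, c=2.0, k=2.0)
print("failure prob at truncation:", round(p0, 4))
print("P(accept):", round(accept_prob_double(p0, n1=15, c1=0, n2=15, c2=2), 4))
```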

16.
The design of single sampling plans in which the lot acceptance decision is based on both variables and attribute measurements of quality is discussed. A new plan, called the combined attributes–variables plan, is proposed, incorporating an acceptance number into the regular variables plan for consumer protection. A design approach for the new plan is also developed for food manufacturing applications in which the sample size cannot be predetermined because of short production runs and other analytical testing issues.
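A simulation sketch (assumptions throughout, including the form of the variables criterion) of a combined attributes-variables rule: accept only if (U - xbar)/s >= k holds and the count of nonconforming items in the same sample is at most c.

```python
# Acceptance probability of a combined attributes-variables rule by simulation.
import numpy as np

rng = np.random.default_rng(7)

def p_accept(mu, sigma, U, n, k, c, reps=20000):
    x = rng.normal(mu, sigma, size=(reps, n))
    xbar, s = x.mean(axis=1), x.std(axis=1, ddof=1)
    variables_ok = (U - xbar) / s >= k       # variables criterion
    attributes_ok = (x > U).sum(axis=1) <= c  # acceptance-number criterion
    return np.mean(variables_ok & attributes_ok)

print(p_accept(mu=0.0, sigma=1.0, U=3.0, n=30, k=2.2, c=0))
```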

17.
Although multiscale cancer modeling gives a realistic view of the process of tumor growth, its numerical algorithm is time-consuming; it is therefore problematic to run it and to find the best chemotherapy treatment plan, even for a small tissue size. Using an artificial neural network, this paper simulates the multiscale cancer model faster than its numerical algorithm. To find the best treatment plan, it suggests applying a simpler avascular model, the Gompertz model. With these proposed methods, multiscale cancer modeling may be extendable to chemotherapy for a realistic tissue size. To simulate the multiscale model, a hierarchical neural network called the Nested Hierarchical Self-Organizing Map (NHSOM) is used. The basis of the NHSOM is an enhanced version of the SOM with an adaptive vigilance parameter; this parameter and the overall bottom-up design guarantee the quality of clustering, while the embedded top-down architecture reduces computational complexity. Although applying the NHSOM makes the simulation run faster than the numerical algorithm, it is still not possible to examine even a simple search space exhaustively; as a result, a set containing the best treatment plans of the simpler Gompertz model is used. Additionally, it is assumed in this paper that the distribution of drug in the vessels has a linear relation with the blood flow rate; the technical advantage of this assumption is that, through a simple linear relation, a given diffusion of a drug dosage may be scaled to the desired one. By extracting a proper feature vector from the multiscale model and using the NHSOM, the scaled best treatment plans of the Gompertz model are applied to a small tissue size. In addition, simulating the effect of stress reduction on normal tissue after chemotherapy, a kind of emergent behavior, is another advantage of using the NHSOM.
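A minimal sketch of the avascular Gompertz growth law the paper falls back on for treatment-plan search, with an illustrative instantaneous log-kill term for chemotherapy doses; all parameter values are invented.

```python
# Euler integration of Gompertz growth with log-kill chemotherapy doses.
import numpy as np

def gompertz_with_therapy(v0, K, alpha, kill, dose_times, t_end, dt=0.01):
    """dV/dt = alpha * V * ln(K/V); each dose multiplies V by (1 - kill)."""
    t, v, doses = 0.0, v0, sorted(dose_times)
    while t < t_end:
        v += alpha * v * np.log(K / v) * dt
        t += dt
        while doses and t >= doses[0]:
            v *= (1 - kill)      # instantaneous log-kill at each dose
            doses.pop(0)
    return v

print(gompertz_with_therapy(v0=1.0, K=1000.0, alpha=0.1,
                            kill=0.4, dose_times=[20, 40, 60], t_end=100))
```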

18.
We consider the problem of designing single and double sampling plans for monitoring dependent production processes. Based on simulated samples from the process, Nelson proposed a new approach for estimating the characteristics of single sampling plans and, using these estimates, designing optimal plans. In this paper, we extend his approach to the design of optimal double sampling plans. We first propose a simple methodology for obtaining unbiased estimators of various characteristics of single and double sampling plans, achieved by defining these characteristics as explicit random variables. Some important properties of double sampling plans are established, and using these results an efficient algorithm is developed to obtain optimal double sampling plans. A comparison with a crude search shows that our algorithm saves about 90% of the computation time on average. The procedure is also explained through a suitable example for the ARMA(1,1) model; it is observed, for instance, that an optimal double sampling plan reduces the average sample number by about 23% compared with an optimal single sampling plan. Tables for choosing the optimal plans for certain autoregressive moving average processes, at practically useful values of the acceptable quality level and rejectable quality level, are also presented.
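A hedged sketch of the simulation-based idea: generate an ARMA(1,1) quality stream and estimate a single sampling plan's acceptance probability from runs of consecutive items, so the serial dependence is preserved in each sample; the model and plan constants are illustrative.

```python
# Simulation-based estimate of P(accept) for a dependent ARMA(1,1) process.
import numpy as np

rng = np.random.default_rng(8)

def arma11(n, phi=0.6, theta=0.3, sigma=1.0):
    """Simulate an ARMA(1,1) quality characteristic."""
    e = rng.normal(0, sigma, n)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t] + theta * e[t - 1]
    return x

def estimate_pa(n_sample, c, spec=4.0, reps=5000, series_len=200000):
    """Draw runs of consecutive items so serial dependence is kept."""
    x = arma11(series_len)
    defective = np.abs(x) > spec
    starts = rng.integers(0, series_len - n_sample, reps)
    counts = np.array([defective[s:s + n_sample].sum() for s in starts])
    return np.mean(counts <= c)

print("estimated P(accept):", estimate_pa(n_sample=50, c=1))
```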

19.
Subset simulation is an efficient Monte Carlo technique originally developed for structural reliability problems and later modified to solve single-objective optimization problems, based on the idea that an extreme event (an optimization problem) can be considered a rare event (a reliability problem). In this paper, subset simulation is extended to solve multi-objective optimization problems by taking advantage of Markov chain Monte Carlo and a simple evolutionary strategy. In the optimization process, a non-dominated sorting algorithm is introduced to judge the priority of each sample and to handle the constraints, and a reordering strategy is proposed to improve the diversification of samples. A Pareto set can be generated after a limited number of iterations by combining the two sorting algorithms. Eight numerical multi-objective optimization benchmark problems are solved to demonstrate the efficiency and robustness of the proposed algorithm, and a parametric study on the sample size per simulation level and the proportion of seed samples is performed to investigate its performance. Comparisons are made with three existing algorithms. Finally, the proposed algorithm is applied to the conceptual design optimization of a civil jet.
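A hedged single-objective skeleton of subset simulation used as an optimizer; the paper's multi-objective extension with non-dominated sorting and reordering is not reproduced. Low objective values play the role of the rare event, the best p0 fraction seeds each level, and the random-walk move is an assumption.

```python
# Subset-simulation-style optimization: elite seeds + conditional resampling.
import numpy as np

rng = np.random.default_rng(9)

def f(x):                      # toy objective: sphere function
    return np.sum(x**2, axis=1)

def subset_optimize(dim=5, n=500, p0=0.1, levels=15, step=0.3):
    x = rng.uniform(-5, 5, size=(n, dim))
    for _ in range(levels):
        idx = np.argsort(f(x))[: int(p0 * n)]       # elite seeds
        seeds = x[idx]
        thresh = f(seeds).max()                      # current level threshold
        x = np.repeat(seeds, int(1 / p0), axis=0)
        cand = x + rng.normal(0, step, x.shape)      # random-walk move
        accept = f(cand) <= thresh                   # stay inside the level
        x[accept] = cand[accept]
    return x[np.argmin(f(x))]

print(subset_optimize())       # should approach the optimum at the origin
```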

20.
This paper considers an attribute acceptance sampling problem in which inspection errors can occur. Unlike many common situations, the source of the inspection errors is the uncertainty associated with statistical sampling. Consider a lot that consists of N containers, each containing a large number of units; it is desired to sample some of the containers and inspect a sample of units from the selected containers to determine the proper disposition of the entire lot. Results presented in the paper demonstrate significant shortcomings of traditional sampling plans when applied in this context. Alternative sampling plans designed to address the risk of statistical classification error are presented; these plans estimate the rate of classification errors and set plan parameters to reduce the potential impact of such errors. Results comparing traditional plans with the proposed alternatives are provided, and limitations of the new plans are discussed.
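A small hedged sketch of how classification error distorts an attributes plan's OC curve: with false-positive rate e1 and false-negative rate e2, the apparent defective rate is p(1 - e2) + (1 - p)e1. The error rates and plan constants are illustrative, not the paper's estimates.

```python
# OC curve of an attributes plan with and without classification error.
from scipy.stats import binom

def apparent_p(p, e1, e2):
    """Observed defective rate given true rate p and error rates e1, e2."""
    return p * (1 - e2) + (1 - p) * e1

def oc(p, n, c, e1=0.0, e2=0.0):
    return binom.cdf(c, n, apparent_p(p, e1, e2))

for p in (0.01, 0.03, 0.06):
    print(p, round(oc(p, 80, 3), 3), round(oc(p, 80, 3, e1=0.02, e2=0.10), 3))
```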

