Similar Literature
20 similar articles found.
1.
This paper proposes an innovative Bayesian sequential censored sampling inspection method to improve the inspection level and reduce the sample size in acceptance test plans for continuous lots. A mathematical model of Bayesian sequential censored sampling is built, in which a new inspection parameter is introduced and the two types of risk are modified. As the core of the Bayesian risk formulas, a new method for constructing the prior distribution is presented by combining the empirical distribution with the uncertainty of the estimation. To improve the fitting accuracy of parameter estimation, an adaptive genetic algorithm is applied and compared with other parameter estimation methods. Within the prior distribution, a prior estimator is introduced to design a sampling plan for continuous lots. Three types of producer's and consumer's risks are then derived and compared. The simulation results indicate that the modified Bayesian sampling method performs well, with the lowest risks and the smallest sample size. Finally, a new sequential censored sampling plan for continuous lots is designed for the accuracy acceptance test of an aircraft. The test results show that, compared with the traditional single sampling plan, the sample size is reduced by 66.7%, substantially reducing test costs.
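As a rough illustration of the sequential Bayesian decision logic described above (not the paper's exact model, which uses an empirical prior and modified risk definitions), the sketch below inspects items one at a time under a Beta prior on the fraction defective and stops once the posterior confidence that the lot meets a hypothetical quality limit p0 is high or low enough, truncating at a maximum sample size. All plan constants are illustrative assumptions.

```python
from math import lgamma, log, exp
import random

def beta_cdf(x, a, b, steps=4000):
    """P(p <= x) for p ~ Beta(a, b), by midpoint-rule integration (fine for a, b >= 1)."""
    if x <= 0.0:
        return 0.0
    if x >= 1.0:
        return 1.0
    log_c = lgamma(a + b) - lgamma(a) - lgamma(b)
    h = x / steps
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) * h
        total += exp(log_c + (a - 1.0) * log(t) + (b - 1.0) * log(1.0 - t))
    return min(1.0, total * h)

def bayes_sequential_plan(items, p0=0.02, prior=(1.0, 49.0), n_max=50,
                          accept_conf=0.95, reject_conf=0.05):
    """Inspect 0/1 outcomes one at a time (1 = defective) and stop early when the
    posterior P(p <= p0) crosses either confidence bound; truncate at n_max."""
    a, b = prior
    n = 0
    for x in items:
        n += 1
        a, b = a + x, b + (1 - x)
        conf = beta_cdf(p0, a, b)
        if conf >= accept_conf:
            return "accept", n
        if conf <= reject_conf:
            return "reject", n
        if n >= n_max:
            break
    return ("accept" if beta_cdf(p0, a, b) >= 0.5 else "reject"), n

# Example: a simulated stream of items with a true defect rate of 1%
random.seed(0)
stream = (1 if random.random() < 0.01 else 0 for _ in range(1000))
print(bayes_sequential_plan(stream))
```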

2.
Economical sampling plans to ensure the quality of Burr type XII distributed lifetimes were established using a truncated life test. The Bayesian inference method was used to address the lot-to-lot variation of products. The sampling plan was characterized by the sample size and the acceptance number that minimize the expected total cost. A simple empirical Bayesian estimation method was provided to estimate the hyperparameters of the prior distribution, and simulation studies were conducted to validate the proposed empirical Bayesian estimation method. Lastly, the application of the proposed method was illustrated using two examples.
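For orientation, the sketch below shows the basic acceptance-probability calculation that underlies such truncated life-test plans: the chance that an item fails before the truncation time t0 under an assumed Burr type XII distribution, and the resulting probability of accepting a lot tested with sample size n and acceptance number c. The shape parameters and plan values are illustrative only; the paper chooses n and c by minimizing an expected total cost under a prior, which is not reproduced here.

```python
from math import comb

def burr_xii_cdf(t, c_shape, k_shape):
    """F(t) = 1 - (1 + t**c)**(-k) for the (standardized) Burr type XII distribution."""
    return 1.0 - (1.0 + t ** c_shape) ** (-k_shape)

def lot_acceptance_prob(t0, n, c_accept, c_shape=2.0, k_shape=1.5):
    """Probability of accepting the lot in a truncated life test: the lot is accepted
    if at most c_accept of the n tested items fail before the truncation time t0."""
    p_fail = burr_xii_cdf(t0, c_shape, k_shape)
    return sum(comb(n, d) * p_fail ** d * (1.0 - p_fail) ** (n - d)
               for d in range(c_accept + 1))

print(lot_acceptance_prob(t0=0.5, n=20, c_accept=2))
```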

3.
Method of sequential mesh on Koopman-Darmois distributions
For costly and/or destructive tests, a sequential method with a proper maximum sample size is needed. Based on Koopman-Darmois distributions, this paper proposes the method of sequential mesh, which has an acceptable maximum sample size. In comparison with the popular truncated sequential probability ratio test, our method has the advantage of a smaller maximum sample size and is especially applicable to costly and/or destructive tests.
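For comparison, here is a minimal sketch of the truncated sequential probability ratio test that this abstract uses as its benchmark (not the sequential mesh method itself), written for Bernoulli observations, a member of the Koopman-Darmois family. The hypotheses, error rates and truncation rule are illustrative assumptions.

```python
from math import log
import random

def truncated_sprt(observations, p0=0.05, p1=0.15, alpha=0.05, beta=0.10, n_max=100):
    """Wald's SPRT for H0: p = p0 vs H1: p = p1 on 0/1 data, truncated at n_max.
    Returns the decision and the number of observations used."""
    lower, upper = log(beta / (1 - alpha)), log((1 - beta) / alpha)
    llr, n = 0.0, 0
    for x in observations:
        n += 1
        llr += log(p1 / p0) if x else log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "reject H0", n
        if llr <= lower:
            return "accept H0", n
        if n >= n_max:
            break
    # at truncation, decide by which boundary the log-likelihood ratio is closer to
    return ("reject H0" if llr > 0 else "accept H0"), n

random.seed(1)
data = (1 if random.random() < 0.05 else 0 for _ in range(1000))
print(truncated_sprt(data))
```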

4.
Application of probabilistic methods to quality inspection of textile products
Using statistical analysis, this paper examines the original product quality inspection scheme of a knitting mill and points out where it is unreasonable. Starting from the mill's actual conditions, new inspection schemes are established for three different lot sizes. Practice shows that the new schemes both ensure outgoing product quality (keeping it within the AQL) and help monitor the quality level of production.

5.
This paper discusses empirical studies with both the adaptive correlated sequential sampling method and the adaptive importance sampling method, which can be used in solving matrix and integral equations. Both methods achieve geometric convergence (provided the number of random walks per stage is large enough) in the sense that e_ν ≤ cλ^ν, where e_ν is the error at stage ν, λ ∈ (0,1) is a constant, and c > 0 is also a constant. Thus, both methods converge much faster than the conventional Monte Carlo method. Our extensive numerical test results show that the adaptive importance sampling method converges faster than the adaptive correlated sequential sampling method, even with many fewer random walks per stage for the same problem. The methods can be applied to problems involving large-scale matrix equations with non-sparse coefficient matrices. We also provide an application of the adaptive importance sampling method to the numerical solution of integral equations, where the integral equations are converted into matrix equations (with order up to 8192×8192) after discretization. By using Niederreiter's sequence, instead of a pseudo-random sequence, when generating the nodal point set used in discretizing the phase space Γ, we find that the average absolute errors or relative errors at nodal points can be reduced by a factor of more than one hundred.
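As background, the sketch below implements the conventional Monte Carlo random-walk (von Neumann-Ulam) estimator for one component of the solution of x = Hx + b, the baseline that both adaptive methods above are reported to outperform. The transition probabilities, the stopping probability and the small test system are illustrative assumptions; the sketch presumes the Neumann series for (I - H)^(-1) converges.

```python
import random

def mc_component(H, b, i, n_walks=100_000, p_stop=0.3):
    """Estimate x_i for x = H x + b by random walks (von Neumann-Ulam scheme).
    Each walk starts at i, moves to a uniformly chosen state with probability
    1 - p_stop per step, and carries an importance weight for the H entries."""
    n = len(b)
    total = 0.0
    for _ in range(n_walks):
        state, weight, score = i, 1.0, b[i]
        while random.random() > p_stop:
            nxt = random.randrange(n)
            weight *= H[state][nxt] / ((1.0 - p_stop) / n)
            state = nxt
            score += weight * b[state]
        total += score
    return total / n_walks

H = [[0.1, 0.2],
     [0.3, 0.1]]
b = [1.0, 2.0]
random.seed(0)
print(mc_component(H, b, 0))   # exact solution component x_0 = 26/15, about 1.733
```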

6.
Single and repetitive sampling schemes are conventional methods for evaluating the quality of lots or batches of products. Truncated repetitive sampling plans are introduced in this paper in order to significantly reduce the size of samples drawn from the lot. Under this type of scheme, if a decision about the acceptance or rejection of the lot cannot be made in the original inspection, it can be reinspected, at most, a prefixed number of times. The Poisson distribution is assumed for the number of defects found in the samples drawn from the lot. The best truncated repetitive sampling plans are determined by solving integer nonlinear programming problems. A simplified methodology is suggested to obtain plans with optimal inspection effort and controlled risks by using an iterative procedure. According to the results obtained, optimal truncated plans are shown to be better than the optimal single and repetitive schemes in reducing the average sample number of the inspection. An application to the manufacturing of glass is presented for illustrative purposes.
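The sketch below computes the OC (lot acceptance probability) of a truncated repetitive plan of the kind described above, under the stated Poisson assumption for the defect count. The decision rule at the final permitted inspection and all plan parameters are illustrative assumptions (here an undecided lot after the last round is rejected), not the optimal plans derived in the paper.

```python
from math import exp, factorial

def poisson_cdf(c, mu):
    """P(D <= c) for D ~ Poisson(mu)."""
    return sum(exp(-mu) * mu ** d / factorial(d) for d in range(c + 1))

def truncated_repetitive_oc(defect_rate, n, c_accept, c_reject, max_rounds=3):
    """Acceptance probability: each round inspects n items; with D defects the lot is
    accepted if D <= c_accept, rejected if D >= c_reject, otherwise reinspected.
    After max_rounds rounds without a decision the lot is rejected (an assumption)."""
    mu = n * defect_rate
    p_acc = poisson_cdf(c_accept, mu)
    p_rej = 1.0 - poisson_cdf(c_reject - 1, mu)
    p_again = 1.0 - p_acc - p_rej
    return p_acc * sum(p_again ** k for k in range(max_rounds))

# Rough OC curve for an illustrative plan (n = 50, accept at <= 1 defect, reject at >= 4)
for rate in (0.01, 0.03, 0.05, 0.10):
    print(rate, round(truncated_repetitive_oc(rate, 50, 1, 4), 4))
```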

7.
Lio et al. (2010a, 2010b) introduced two single acceptance sampling plans (SASPs) for the percentiles of the Birnbaum-Saunders and Burr type XII distributions under a truncated censoring scheme. They asserted that acceptance sampling plans for percentiles significantly improve on the traditional ones based on mean life. In this paper, a double-sampling procedure is developed for Burr type XII distribution percentiles under a truncated censoring scheme in order to save sampling resources. Minimum sample sizes for implementing the proposed double-sampling method are determined under specified levels of consumer's risk and producer's risk. Illustrative examples are used to demonstrate the applications of the proposed method. Compared with the SASP, the proposed double-sampling acceptance plan requires fewer samples on average for truncated life testing.

8.
We propose new sequential importance sampling methods for sampling contingency tables with given margins. The proposal for each method is based on asymptotic approximations to the number of tables with fixed margins. These methods generate tables that are very close to the uniform distribution. The tables, along with their importance weights, can be used to approximate the null distribution of test statistics and calculate the total number of tables. We apply the methods to a number of examples and demonstrate an improvement over other methods in a variety of real problems. Supplementary materials are available online.

9.
This paper investigates double sampling series derivatives for bivariate functions defined on R^2 that belong to the Bernstein space. For this sampling series, we estimate pointwise and uniform bounds when the function satisfies certain decay conditions. The truncated series of this formula allows us to approximate partial derivatives of any order of a function from the Bernstein space using only a finite number of samples of the function itself. This sampling formula will be useful in approximation theory and its applications, especially now that the truncation error is well established. Examples with tables and figures are given at the end of the paper to illustrate the advantages of this formula.
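To make the idea concrete in one dimension, the sketch below evaluates a truncated Whittaker-Shannon sampling series and its term-by-term derivative for a band-limited test function, using only finitely many samples f(n). This is only a univariate illustration of the truncated-sampling-series idea; the paper's bivariate series, its derivative formulas and its error bounds are not reproduced.

```python
from math import sin, cos, pi

def sinc(x):
    """Normalized sinc: sin(pi x) / (pi x), with sinc(0) = 1."""
    return 1.0 if x == 0.0 else sin(pi * x) / (pi * x)

def dsinc(x):
    """Derivative of the normalized sinc; dsinc(0) = 0."""
    if x == 0.0:
        return 0.0
    return (pi * x * cos(pi * x) - sin(pi * x)) / (pi * x * x)

def truncated_series(f, t, n_terms=40):
    """Truncated cardinal series: f(t) ~ sum_{n=-N}^{N} f(n) sinc(t - n)."""
    return sum(f(n) * sinc(t - n) for n in range(-n_terms, n_terms + 1))

def truncated_series_derivative(f, t, n_terms=40):
    """Term-by-term derivative of the truncated cardinal series."""
    return sum(f(n) * dsinc(t - n) for n in range(-n_terms, n_terms + 1))

# Band-limited test function (bandwidth pi/2 < pi, so unit-spaced samples suffice)
f = lambda t: sinc(t / 2.0)
df = lambda t: 0.5 * dsinc(t / 2.0)

t = 0.37
print(truncated_series(f, t), f(t))                # values should be close
print(truncated_series_derivative(f, t), df(t))    # derivatives should be close
```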

10.
High-dimensional reliability analysis is still an open challenge in the structural reliability community. To address this problem, a new sampling approach, good lattice point method based partially stratified sampling, is proposed within the fractional-moments-based maximum entropy method. In this approach, the original sample space is first partitioned into several orthogonal low-dimensional sample spaces, say of 2 and 1 dimensions. Then, the samples in each low-dimensional sample space are generated by the good lattice point method; these are deterministic points with a strong variance-reduction property. Finally, the samples in the original space are obtained by randomly pairing the samples in the low-dimensional spaces, which can also significantly reduce the variance in high-dimensional cases. This sampling approach is then applied to evaluate the low-order fractional moments in the maximum entropy method, trading off efficiency and accuracy for high-dimensional reliability problems. In this way, the probability density function of a performance function involving a large number of random inputs can be derived, and the reliability can be evaluated straightforwardly by a simple integral over that probability density function. Numerical examples are studied to validate the proposed method and indicate that it is accurate and efficient for high-dimensional reliability analysis.
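The sketch below illustrates only the sample-generation step described above: two-dimensional good lattice point sets (here Fibonacci lattices with a random shift) are built for each 2-D subspace and then randomly paired across subspaces to form points in the full space, which are finally mapped to standard normal inputs. The generating vector, the random shift and the use of Fibonacci lattices are illustrative assumptions; the fractional-moment and maximum-entropy parts of the method are not shown.

```python
import random
from statistics import NormalDist

def glp_2d(n_points, generator, shift=(0.0, 0.0)):
    """Two-dimensional good-lattice-point set {(k/N, k*g/N mod 1)} with a random shift."""
    return [(((k / n_points) + shift[0]) % 1.0,
             ((k * generator / n_points) + shift[1]) % 1.0)
            for k in range(n_points)]

def paired_samples(n_points=144, generator=89, n_subspaces=3, seed=0):
    """Partition a 2*n_subspaces-dimensional space into 2-D subspaces, fill each with a
    randomly shifted GLP set, and randomly pair the points across subspaces."""
    rng = random.Random(seed)
    norm = NormalDist()
    blocks = []
    for _ in range(n_subspaces):
        shift = (rng.random(), rng.random())
        pts = glp_2d(n_points, generator, shift)
        rng.shuffle(pts)                 # random pairing across subspaces
        blocks.append(pts)
    samples = []
    for k in range(n_points):
        u = [coord for block in blocks for coord in block[k]]
        # map uniform coordinates to independent standard normal inputs
        samples.append([norm.inv_cdf(min(max(ui, 1e-12), 1 - 1e-12)) for ui in u])
    return samples

xs = paired_samples()
print(len(xs), len(xs[0]))   # 144 samples in 6 dimensions
```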

11.
In this paper, we develop integrated inventory-inspection models with and without replacement of nonconforming items. The inspection policies include no inspection, sampling inspection, and 100% inspection. We consider a buyer who places an order with a supplier when his inventory level drops to a certain point, owing to demand that is stochastic in nature. When a lot is received, the buyer applies one of these inspection policies. The fraction nonconforming is assumed to be a random variable following a beta distribution. The order quantity, the reorder point, and the inspection policy are decision variables. In the inspection policy that involves determining sampling plan parameters, constraints on the buyer's and manufacturer's risks are set in order to obtain a fair plan for both parties. A solution procedure for determining the operating policies for inventory and inspection, consisting of the order quantity, sample size, and acceptance number, is proposed. Numerical examples are presented to conduct a sensitivity analysis of important model parameters and to illustrate important issues about the developed models.
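Given the beta-distributed fraction nonconforming assumed above, the probability that a received lot passes a sampling inspection can be averaged over that prior, which makes the defect count in the sample beta-binomial. The short sketch below computes this average acceptance probability; the sampling plan (n, c) and the beta hyperparameters are illustrative assumptions rather than values from the paper.

```python
from math import comb, lgamma, exp

def log_beta(a, b):
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def beta_binomial_pmf(d, n, a, b):
    """P(D = d) when D | p ~ Binomial(n, p) and p ~ Beta(a, b)."""
    return comb(n, d) * exp(log_beta(d + a, n - d + b) - log_beta(a, b))

def average_acceptance_prob(n, c, a, b):
    """Acceptance probability P(D <= c), averaged over the beta prior on the
    fraction nonconforming."""
    return sum(beta_binomial_pmf(d, n, a, b) for d in range(c + 1))

# Illustrative plan n = 32, c = 1 with a prior mean defect fraction of 2/(2+98) = 2%
print(round(average_acceptance_prob(32, 1, 2.0, 98.0), 4))
```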

12.
In this paper, we propose a procedure for detecting multiple change-points in a mean-shift model, where the number of change-points is allowed to increase with the sample size. A theoretical justification for our new method is also given. We first convert the change-point problem into a variable selection problem by partitioning the data sequence into several segments. Then, we apply a modified variance inflation factor regression algorithm to each segment in sequential order. When a segment suspected of containing a change-point is found, we use a weighted cumulative sum to test whether there is indeed a change-point in this segment. The proposed procedure is implemented in an algorithm which, compared to two popular methods via simulation studies, demonstrates satisfactory performance in terms of accuracy, stability and computation time. Finally, we apply our new algorithm to analyze two real data examples.
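As a small illustration of the confirmation step described above, the sketch below applies a plain (unweighted) CUSUM statistic to a single segment to locate the most likely mean shift and compare the standardized statistic with a fixed critical value. The weighting scheme, the VIF regression stage and the critical value used in the paper are not reproduced; the threshold here is an illustrative assumption.

```python
from statistics import stdev
import random

def cusum_change_point(segment, threshold=1.36):
    """Return (index, statistic, significant) for the largest standardized CUSUM
    deviation max_k |S_k - (k/n) S_n| / (sigma * sqrt(n)) within one segment."""
    n = len(segment)
    sigma = stdev(segment)
    total = sum(segment)
    best_k, best_stat = 0, 0.0
    running = 0.0
    for k in range(1, n):
        running += segment[k - 1]
        stat = abs(running - (k / n) * total) / (sigma * n ** 0.5)
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k, best_stat, best_stat > threshold

# Segment with a mean shift of +2 after the 30th observation
random.seed(2)
data = [random.gauss(0.0, 1.0) for _ in range(30)] + [random.gauss(2.0, 1.0) for _ in range(30)]
print(cusum_change_point(data))
```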

13.
We propose a sequential importance sampling strategy to estimate subgraph frequencies and detect network motifs. The method is developed by sampling subgraphs sequentially node by node using a carefully chosen proposal distribution. Viewing the subgraphs as rooted trees, we propose a recursive formula that approximates the number of subgraphs containing a particular node or set of nodes. The proposal used to sample nodes is proportional to this estimated number of subgraphs. The method generates subgraphs from a distribution close to uniform, and performs better than competing methods. We apply the method to four real-world networks and demonstrate outstanding performance in practical examples. Supplemental materials for the article are available online.

14.
In this paper, we present a new approach to the railway rescheduling problem. This problem deals with repairing a disturbed railway timetable after incidents in such a way as to minimize the difference between the original plan and the new provisional plan. We use a mixed integer linear programming (MIP) formulation that models this problem correctly. However, the large number of variables and constraints makes it impossible to solve the problem efficiently with a standard MIP solver. A new approach called SAPI (Statistical Analysis of Propagation of Incidents) has been developed to tackle the problem. The key point of SAPI is to estimate the probability that an event, one step of the itinerary of a train, is affected by a set of incidents. Using these probabilities, the search space is reduced and very good solutions are obtained in a short time. The method has been tested on two different networks located in France and Chile. The numerical results show that our procedure is viable in practice.

15.
Reliability sampling inspection plans for exponentially distributed products under Type-I (time) censoring
This paper presents a statistical method for constructing reliability sampling inspection plans for exponentially distributed products under Type-I (time) censoring. The test statistic is the maximum likelihood estimate of the reciprocal of the mean life. A method for choosing the censoring time is proposed, and the sample size and acceptance constant are determined approximately using a Cornish-Fisher expansion of the quantiles of the distribution of the test statistic. Simulation results show that the proposed method is feasible.
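The sketch below mirrors the basic decision rule described in this abstract: under Type-I (time) censoring of exponential lifetimes, the maximum likelihood estimate of the reciprocal of the mean life is the number of observed failures divided by the total accumulated test time, and the lot is accepted when this estimate does not exceed an acceptance constant. The sample data, censoring time and acceptance constant are illustrative assumptions; the paper's Cornish-Fisher-based choice of sample size and acceptance constant is not reproduced.

```python
import random

def censored_exp_decision(lifetimes, t_censor, accept_const):
    """Type-I censored test of exponential lifetimes.
    Returns (estimate of 1/theta, decision), where theta is the mean life."""
    failures = [t for t in lifetimes if t <= t_censor]
    r = len(failures)
    total_time = sum(failures) + (len(lifetimes) - r) * t_censor
    if total_time == 0.0:
        raise ValueError("no test time accumulated")
    rate_hat = r / total_time          # MLE of 1/theta under Type-I censoring
    return rate_hat, ("accept" if rate_hat <= accept_const else "reject")

# Illustrative test: 20 items on test for 500 hours, acceptance constant 1/1000 per hour
random.seed(3)
sample = [random.expovariate(1.0 / 2000.0) for _ in range(20)]   # true mean life 2000 h
print(censored_exp_decision(sample, t_censor=500.0, accept_const=1.0 / 1000.0))
```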

16.
This paper deals with the optimal design of step-stress partially accelerated life tests (PALTs) in which items are run at both accelerated and use conditions under censoring. It is assumed that the lifetimes of the items follow a logistic distribution truncated at zero. Truncated distributions arise when sample selection is not possible in some sub-region of the sample space. The ordinary logistic distribution is considered inappropriate for modeling lifetime data because the left-hand side of its support extends to negative infinity, which could conceivably result in modeling negative times to failure; this necessitates the use of the logistic distribution truncated at zero for modeling lifetime data. Unlike the widely studied exponential, Weibull and lognormal life distributions, the failure rate of the truncated logistic distribution is increasing and, more realistically, bounded below and above by non-zero finite quantities. The optimal change time for the step PALT is determined by minimizing either the generalized asymptotic variance of the maximum likelihood estimates (MLEs) of the acceleration factor and the hazard rate at the use condition, or the asymptotic variance of the MLE of the acceleration factor. Inferential procedures involving the model parameters and the acceleration factor are studied. A sensitivity analysis is also performed.
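The claim that the hazard of the zero-truncated logistic distribution is increasing and bounded between two positive finite values is easy to check numerically: truncation at zero rescales the density and survival function by the same factor, so for t >= 0 the hazard equals the ordinary logistic hazard F(t)/s, rising from F(0)/s toward 1/s. The sketch below evaluates it at a few points with illustrative location and scale parameters.

```python
from math import exp

def logistic_cdf(t, mu, s):
    return 1.0 / (1.0 + exp(-(t - mu) / s))

def truncated_logistic_hazard(t, mu, s):
    """Hazard of the logistic distribution truncated at zero (valid for t >= 0).
    Truncation rescales density and survival alike, so the hazard equals F(t)/s."""
    return logistic_cdf(t, mu, s) / s

mu, s = 5.0, 2.0        # illustrative location and scale
for t in (0.0, 2.0, 5.0, 10.0, 50.0):
    print(t, round(truncated_logistic_hazard(t, mu, s), 4))
# increases from F(0)/s ~ 0.038 toward the finite upper bound 1/s = 0.5
```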

17.
By augmenting the data with partially missing lifetime variables, a relatively simple likelihood function is obtained for the failure-rate change-point model under censoring and truncation. The probability distribution of the augmented missing-data variables and methods for sampling from it are discussed. The unknown parameters are updated iteratively with a Monte Carlo EM algorithm. Combined with the Metropolis-Hastings algorithm, Gibbs sampling from the full conditional distributions of the parameters is carried out, the parameters are estimated from the Gibbs samples, and the implementation steps of the MCMC method are described in detail. Simulation results show that the Bayes estimates of the parameters are highly accurate.

18.
In our previous work published in this journal, we showed how the Hit-And-Run (HAR) procedure enables efficient sampling of criteria weights from a space formed by restricting a simplex with arbitrary linear inequality constraints. In this short communication, we note that the method for generating a basis of the sampling space can be generalized to also handle arbitrary linear equality constraints. This enables the application of HAR to sampling spaces that do not coincide with the simplex, thereby allowing the combined use of imprecise and precise preference statements. In addition, it has come to our attention that one of the methods we proposed for generating a starting point for the Markov chain was flawed. To correct this, we provide an alternative method that is guaranteed to produce a starting point that lies within the interior of the sampling space.

19.
As a new reliability test plan, generalized progressive hybrid censoring can improve test efficiency by allowing experimenters to observe a pre-specified number of failures before the final termination point. Based on a life distribution widely used in lifetime data analysis, the generalized exponential distribution, this paper discusses parameter inference under the generalized progressive hybrid censoring scheme. The EM algorithm is used to estimate the parameters of the considered model. Simulation studies and a real-data analysis are carried out to illustrate the finite-sample performance of the proposed procedure.

20.
Acceptance sampling is a practical tool for quality assurance; it provides the producer and the consumer with a general rule for determining product acceptance. It has been shown that variables sampling plans require less sampling than attributes sampling plans. Thus, variables sampling plans become more attractive and desirable, especially when the required quality level is very high or the allowable fraction nonconforming is very small. This paper develops an efficient and economic sampling scheme, a variables repetitive group sampling (RGS) plan, by incorporating the concept of the Taguchi loss function. The OC curve of the proposed plan is derived based on the exact sampling distribution, and the plan parameters are determined by minimizing the average sample number subject to two constraints specified by the producer and the consumer. The efficiency of the proposed variables RGS plan is examined and compared with the existing variables single sampling plan in terms of the sample size required for inspection. In addition, tables of the plan parameters for various combinations of entry parameters are provided, and an example is presented for illustration.
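For a feel of how a variables RGS plan behaves, the sketch below computes the acceptance probability and the average sample number under the simplifying assumption of a normally distributed quality characteristic with known standard deviation and a single lower specification limit (the paper works with the exact sampling distribution and a Taguchi-loss-based design, which are not reproduced). The plan constants are illustrative.

```python
from statistics import NormalDist

_norm = NormalDist()

def variables_rgs(p, n, k_accept, k_reject):
    """Acceptance probability and ASN of a variables repetitive group sampling plan,
    known-sigma case with a lower specification limit: take n items, compute
    v = (xbar - L) / sigma; accept if v >= k_accept, reject if v < k_reject,
    otherwise resample.  p is the lot fraction nonconforming."""
    z_p = _norm.inv_cdf(1.0 - p)                    # (mu - L) / sigma implied by p
    p_acc = _norm.cdf(n ** 0.5 * (z_p - k_accept))
    p_rej = 1.0 - _norm.cdf(n ** 0.5 * (z_p - k_reject))
    prob_accept = p_acc / (p_acc + p_rej)
    asn = n / (p_acc + p_rej)
    return prob_accept, asn

# Illustrative plan: n = 12, k_accept = 2.2, k_reject = 1.8
for p in (0.005, 0.01, 0.03, 0.05):
    pa, asn = variables_rgs(p, 12, 2.2, 1.8)
    print(p, round(pa, 3), round(asn, 1))
```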
