20 similar documents were retrieved.
1.
This paper proposes a multiple dependent (or deferred) state sampling plan by variables for the inspection of normally distributed quality characteristics. The decision upon the acceptance of the lot is based on the states of the preceding lots (dependent state plan) or on the states of the forthcoming lots (deferred state plan). The lot acceptance probability is derived and the two-point approach to determining the plan parameters is described. The advantages of this new variables plan over conventional sampling plans are discussed. Tables are constructed for the selection of parameters of this plan under specified values of the producer's and consumer's risks, indexed by acceptable quality level and limiting quality level, when the standard deviation is known or unknown.
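A minimal sketch of the two-point search this abstract describes, assuming the known-sigma case with a single upper specification limit and an MDS-type conditional rule; the acceptance criterion, the parameter names (k_a, k_r, m) and the grids are illustrative assumptions rather than the authors' tabulated procedure.

```python
# Sketch of a two-point design search for a variables multiple dependent
# state (MDS) plan, assuming a known sigma and a single upper spec limit U.
# Assumed rule: compute v = (U - xbar)/sigma from a sample of size n; accept
# if v >= k_a, reject if v < k_r, and otherwise accept only if the preceding
# m lots were accepted outright.  All names and grids are illustrative.
from math import sqrt
import numpy as np
from scipy.stats import norm

def lot_acceptance_prob(p, n, k_a, k_r, m):
    """Acceptance probability at fraction nonconforming p under the assumed rule."""
    w = norm.ppf(1.0 - p)                    # standardized distance of U from the mean
    A = norm.cdf(sqrt(n) * (w - k_a))        # P(outright acceptance)
    B = norm.cdf(sqrt(n) * (w - k_r)) - A    # P(conditional region)
    return A + B * A ** m

def search_plan(aql, lql, alpha=0.05, beta=0.10, m=2, n_max=100):
    """Smallest n (with a crude grid over k_a > k_r) meeting both risk points."""
    for n in range(2, n_max + 1):
        for k_a in np.arange(1.0, 4.0, 0.05):
            for k_r in np.arange(0.0, k_a, 0.1):
                if (lot_acceptance_prob(aql, n, k_a, k_r, m) >= 1 - alpha and
                        lot_acceptance_prob(lql, n, k_a, k_r, m) <= beta):
                    return n, round(float(k_a), 2), round(float(k_r), 2)
    return None

print(search_plan(aql=0.01, lql=0.05))
```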
2.
M. Aslam, Y. Mahmood, Y. L. Lio, T.-R. Tsai, M. A. Khan 《The Journal of the Operational Research Society》2012,63(7):1010-1017
Lio et al (2010a, 2010b) introduced two single acceptance sampling plans (SASPs) for percentiles of the Birnbaum-Saunders and Burr type XII distributions under a truncated censoring scheme. They showed that acceptance sampling plans for percentiles significantly improve the traditional plans for mean life. In this paper, a double-sampling procedure is developed for Burr type XII distribution percentiles under a truncated censoring scheme in order to save sampling resources. Minimum sample sizes for implementing the proposed double-sampling method are determined so that specified levels of consumer's risk and producer's risk are satisfied. Illustrative examples demonstrate the application of the proposed method. Compared with the SASP, the proposed double-sampling plan uses fewer sampled units on average for truncated life testing.
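A sketch of how the lot-acceptance probability of such a double plan could be evaluated under a time-truncated test, assuming the per-item failure probability comes from a Burr XII CDF; the plan constants and shape parameters below are illustrative, not values from the paper's tables.

```python
# Sketch of the lot-acceptance probability of a double sampling plan under a
# time-truncated life test, with the per-item failure probability taken from
# a Burr type XII CDF.  Plan constants (n1, n2, c1, c2) and the shape values
# are illustrative assumptions.
from scipy.stats import binom

def burr12_cdf(t, c, k):
    """Burr XII CDF: F(t) = 1 - (1 + t^c)^(-k)."""
    return 1.0 - (1.0 + t ** c) ** (-k)

def accept_prob_double(p, n1, c1, n2, c2):
    """Double plan: accept if d1 <= c1; reject if d1 > c2; otherwise draw n2
    more items and accept if d1 + d2 <= c2 (a common double-sampling rule)."""
    prob = binom.cdf(c1, n1, p)                      # accept on the first sample
    for d1 in range(c1 + 1, c2 + 1):                 # undecided region
        prob += binom.pmf(d1, n1, p) * binom.cdf(c2 - d1, n2, p)
    return prob

# Example: the test is truncated at time t0 (in the distribution's scale); an
# item "fails the test" if its lifetime falls below t0.
c_shape, k_shape, t0 = 2.0, 1.5, 0.8
p_fail = burr12_cdf(t0, c_shape, k_shape)
print(p_fail, accept_prob_double(p_fail, n1=20, c1=1, n2=20, c2=3))
```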
3.
M. S. Aminzadeh 《Computational Statistics》2009,24(1):95-111
This paper presents variable acceptance sampling plans based on the assumption that consecutive observations on a quality characteristic (X) are autocorrelated and are governed by a stationary autoregressive moving average (ARMA) process. The sampling plans are obtained under the assumption that an adequate ARMA model can be identified from historical data on the process. Two types of acceptance sampling plans are presented. (1) Non-sequential acceptance sampling: historical data are available from which an ARMA model is identified; parameter estimates are used to determine the action limit (k) and the sample size (n), and a decision regarding acceptance of a process is made after a complete sample of size n has been selected. (2) Sequential acceptance sampling: here too an ARMA model is identified from historical data, but a decision regarding whether or not to accept a process is made after each individual sample observation becomes available; the concept of the sequential probability ratio test (SPRT) is used to derive the sampling plans. Simulation studies are used to assess the effect of uncertainties in parameter estimates and the effect of model misidentification (based on historical data) on the sample size for the sampling plans. Macros for computing the required sample size using both methods for several ARMA models can be found on the author's web page.
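A hedged sketch of the sequential idea, simplified to an AR(1) process with known parameters so that the SPRT log-likelihood ratio can be accumulated from one-step conditional densities; the AR(1) restriction, the hypotheses and all numbers are assumptions, not the paper's ARMA machinery.

```python
# Sketch of sequential process acceptance via an SPRT whose log-likelihood
# ratio is accumulated from one-step conditional densities of an AR(1) process
# with known coefficient phi and innovation sd sigma, testing H0: mu = mu0
# against H1: mu = mu1.  The AR(1) restriction and all numbers are assumptions.
import numpy as np
from scipy.stats import norm

def sprt_ar1(x, mu0, mu1, phi, sigma, alpha=0.05, beta=0.10):
    log_A = np.log((1 - beta) / alpha)      # upper boundary -> decide for H1 (reject)
    log_B = np.log(beta / (1 - alpha))      # lower boundary -> decide for H0 (accept)
    llr = 0.0
    for t in range(1, len(x)):
        m0 = mu0 + phi * (x[t - 1] - mu0)   # conditional means given x[t-1]
        m1 = mu1 + phi * (x[t - 1] - mu1)
        llr += norm.logpdf(x[t], m1, sigma) - norm.logpdf(x[t], m0, sigma)
        if llr >= log_A:
            return "reject process", t
        if llr <= log_B:
            return "accept process", t
    return "keep sampling", len(x) - 1

rng = np.random.default_rng(1)
x = [10.0]
for _ in range(200):                        # simulate an on-target AR(1) path
    x.append(10.0 + 0.5 * (x[-1] - 10.0) + rng.normal(0.0, 1.0))
print(sprt_ar1(np.array(x), mu0=10.0, mu1=11.0, phi=0.5, sigma=1.0))
```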
4.
Bing Zheng Li 《Acta Mathematica Sinica, English Series》2008,24(3):511-528
The purpose of this paper is to provide an error analysis for multicategory support vector machine (MSVM) classification problems. We establish a uniform convergence approach for MSVMs and estimate the misclassification error. The main difficulty we overcome here is bounding the offset vector. As a result, we confirm that the MSVM classification algorithm with polynomial kernels is always efficient when the degree of the kernel polynomial is large enough. Finally, the rate of convergence and examples are given to demonstrate the main results.
5.
《Applied Mathematics Letters》2006,19(4):351-355
When the sampled values are corrupted by noise, error estimates are obtained for the localized sampling series used to approximate a band-limited function. The result provides error bounds for practical cases, including errors caused by average sampling, jitter, and amplitude error.
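A small sketch of a localized (truncated) sampling series applied to noisy samples of a band-limited signal; the signal, noise level and window size are illustrative choices, not the setting analyzed in the paper.

```python
# Sketch of a localized (truncated) Shannon sampling series: a band-limited
# signal is reconstructed from noisy integer-spaced samples using only the
# 2N+1 samples nearest to each evaluation point.  The signal, noise level and
# window size N are illustrative choices.
import numpy as np

def localized_series(t, samples, N=10):
    """Approximate f(t) from samples f(n) + noise, n = 0..len(samples)-1."""
    n0 = int(round(t))
    n = np.arange(max(0, n0 - N), min(len(samples), n0 + N + 1))
    return float(np.sum(samples[n] * np.sinc(t - n)))   # np.sinc(x) = sin(pi x)/(pi x)

f = lambda t: np.sin(0.3 * np.pi * t) + 0.5 * np.cos(0.1 * np.pi * t)  # band-limited to pi
grid = np.arange(0, 200)
rng = np.random.default_rng(0)
noisy = f(grid) + rng.normal(0.0, 0.01, size=grid.size)  # samples with amplitude error

t_eval = 100.37
print(f(t_eval), localized_series(t_eval, noisy, N=15))
```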
6.
7.
Ingolf Terveer 《Mathematical Methods of Operations Research》1995,41(3):359-380
Multistage Bayesian decision procedures in statistical quality control are known from attribute sampling. In this paper they are introduced in a more general framework occurring in lot control by using the theory of Bayesian sequentially planned decision procedures. We show that, under sufficiency and transitivity assumptions and monotonicity properties concerning the distribution and cost set-up, these Bayes procedures have a $(z, c^{-}, c^{+})$-structure which, on the one hand, generalizes results of K.-H. Waldmann and, on the other hand, reduces the computational effort significantly. Finally, examples taken from attribute sampling and life testing for an outgoing lot are presented. The theory presented in this paper is taken from [11].
8.
In this article, an acceptance sampling procedure for the gamma lifetime model is established under the interval censored test. Both producer and consumer risks are considered to develop the ordinary and approximate sampling plans with grouped data. Some of the proposed sampling plans are tabulated and the use of the tables is illustrated by an example. A sensitivity study is conducted to evaluate the influence of the length of the time interval on the proposed sampling plans.
9.
A technique is described for determining the most economical single-sample inspection schemes in situations where the costs associated with inspection and with failure to detect defective items can be estimated. A computational procedure is outlined, and the schemes determined by this method are compared with those previously in use. A least-cost decision rule is also derived for sequential sampling schemes.
10.
11.
S. Ejaz Ahmed, Abdulkadir Hussein, Sévérien Nkurunziza 《Statistics & Probability Letters》2010,80(7-8):726-732
In this paper, we consider a statistical model where samples are subject to measurement errors. Further, we propose a shrinkage estimation strategy using the maximum empirical likelihood estimator (MELE) as the base estimator. Our asymptotic results clearly demonstrate the superiority of the proposed shrinkage strategy over the MELE, and Monte Carlo simulation results show that this performance carries over to finite samples. We apply our method to a real data set.
12.
13.
L. Bessegato, R. Quinino, L. L. Ho, L. Duczmal 《The Journal of the Operational Research Society》2011,62(7):1365-1375
Online process control consists of inspecting a single item at every mth item produced, where m is an integer greater than two. Based on the result of the inspection, one decides whether the process is in control (the fraction of conforming items is p1—state I) or out of control (the fraction of conforming items is p2—state II). If the inspected item is non-conforming, the process is declared out of control and production is stopped for possible adjustment; otherwise, production goes on. In this paper, a contribution to online process control is presented in which the inspection system is subject to classification errors. After every adjustment the sampling interval is L units (L ≥ m), and in the case of non-adjustment the sampling interval is m units. The expression for the average cost per produced item is derived, and the optimum parameters (the sampling intervals L and m) are obtained by a direct search. The procedure is illustrated by a numerical example.
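A Monte Carlo sketch of the kind of cost evaluation described, with an assumed shift mechanism, assumed cost constants and assumed classification error rates; the paper itself derives the average cost per item analytically rather than by simulation.

```python
# Monte Carlo sketch of online process control with inspection errors: one item
# is inspected every m items (every L items right after an adjustment), and a
# "non-conforming" classification stops the process for adjustment.  The shift
# mechanism, cost constants and error rates are assumptions made for
# illustration only.
import numpy as np

def avg_cost_per_item(m, L, n_items=100_000, seed=0,
                      p1=0.99, p2=0.90,              # conforming fractions: state I, state II
                      pi_shift=0.002,                # P(shift to state II) per item produced
                      alpha_e=0.02, beta_e=0.05,     # P(false alarm), P(missed detection)
                      c_insp=1.0, c_nc=20.0, c_adj=50.0):
    rng = np.random.default_rng(seed)
    in_control, cost, until_insp = True, 0.0, m
    for _ in range(n_items):
        if in_control and rng.random() < pi_shift:
            in_control = False
        conforming = rng.random() < (p1 if in_control else p2)
        if not conforming:
            cost += c_nc                             # penalty for a non-conforming item
        until_insp -= 1
        if until_insp == 0:                          # inspect this item
            cost += c_insp
            judged_nc = ((not conforming and rng.random() > beta_e) or
                         (conforming and rng.random() < alpha_e))
            if judged_nc:                            # stop, adjust, restart in control
                cost += c_adj
                in_control, until_insp = True, L
            else:
                until_insp = m
    return cost / n_items

# crude direct search over the two sampling intervals
best = min((avg_cost_per_item(m, L), m, L) for m in (5, 10, 20, 40) for L in (10, 20, 40, 80))
print(best)
```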
14.
Assuming a beta prior distribution on the fraction defective, $p$, failure-censored sampling plans for Weibull lifetime models using classical (or average) and Bayesian (or posterior) producer's and consumer's risks are designed to determine the acceptability of lots of a given product. The average risk criterion provides a certain assurance that good (bad) lots will be accepted (rejected), whereas the posterior risk criterion provides a determined confidence that an accepted (rejected) lot is indeed good (bad). The performance of the classical and Bayesian risks is analyzed in developing sampling plans when the lifetime variable follows the Weibull distribution. Several figures and tables illustrate the sensitivity of the risks and optimal sample sizes for selected censoring levels and specifications according to the available prior information on $p$. The analysis clarifies the distinction among the different risks for a given sampling plan, and the effect of the prior knowledge on the required sample size. The study shows that, under uncertainty in the prior variance of $p$, the designs using Bayesian risks are more appropriate.
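A sketch of the two risk notions using a simplified attributes version of the problem (accept the lot if at most c of n tested items fail the lifetime requirement) with a Beta prior on p; the risk definitions coded here and every numerical value are illustrative assumptions, not the paper's failure-censored Weibull formulation.

```python
# Sketch of average (classical) versus posterior (Bayesian) risks for a lot
# sentencing rule, using a simplified attributes version of the problem:
# accept the lot if at most c of the n tested items are "defective"
# (lifetime below the requirement), with p ~ Beta(a, b).  The risk
# definitions and all numerical values are illustrative assumptions.
from scipy.stats import beta, binom
from scipy.integrate import quad

n, c = 25, 1                 # plan
a, b = 1.0, 19.0             # beta prior on the fraction defective p
p_good, p_bad = 0.02, 0.10   # lot is "good" if p <= p_good, "bad" if p >= p_bad

P_acc = lambda p: binom.cdf(c, n, p)
prior = lambda p: beta.pdf(p, a, b)

# average producer's risk: P(reject | lot is good)
num, _ = quad(lambda p: (1 - P_acc(p)) * prior(p), 0, p_good)
den, _ = quad(prior, 0, p_good)
avg_producer_risk = num / den

# posterior producer's risk: P(lot is good | lot was rejected)
rej_good, _ = quad(lambda p: (1 - P_acc(p)) * prior(p), 0, p_good)
rej_all, _ = quad(lambda p: (1 - P_acc(p)) * prior(p), 0, 1)
post_producer_risk = rej_good / rej_all

print(avg_producer_risk, post_producer_risk)
```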
15.
16.
A.M. Abd-Elfattah, E.A. El-Sherpieny, S.M. Mohamed, O.F. Abdou 《Applied Mathematics and Computation》2010,215(12):4198-3099
This paper proposes some estimators for the population mean by adapting the estimator in Singh et al. (2008) [5] to the ratio estimators presented in Kadilar and Cingi (2006) [2]. We obtain mean square error (MSE) equations for all proposed estimators and show that they are always more efficient than the ratio estimators in Naik and Gupta (1996) [3] and Singh et al. (2008) [5]. The results are illustrated numerically using empirical populations considered in the literature.
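A simulation sketch of a classical ratio estimator compared with the plain sample mean, together with the usual first-order MSE approximation; the synthetic population and the sample size are made up, and none of the paper's specific estimators are coded.

```python
# Simulation sketch for a classical ratio estimator of the population mean,
# compared with the plain sample mean, plus the usual first-order MSE
# approximation.  The synthetic population and sample size are illustrative.
import numpy as np

rng = np.random.default_rng(42)
N, n = 2000, 50
x = rng.gamma(shape=4.0, scale=5.0, size=N)        # auxiliary variable
y = 2.0 * x + rng.normal(0, 5.0, size=N)           # study variable correlated with x
Y_bar, X_bar, R = y.mean(), x.mean(), y.mean() / x.mean()

est_ratio, est_mean = [], []
for _ in range(20_000):
    idx = rng.choice(N, size=n, replace=False)      # simple random sample without replacement
    est_mean.append(y[idx].mean())
    est_ratio.append(y[idx].mean() / x[idx].mean() * X_bar)

mse = lambda e: np.mean((np.array(e) - Y_bar) ** 2)
# first-order approximation: MSE(ratio) ~ (1-f)/n * (S_y^2 + R^2 S_x^2 - 2 R S_xy)
f = n / N
S_y2, S_x2 = y.var(ddof=1), x.var(ddof=1)
S_xy = np.cov(x, y, ddof=1)[0, 1]
approx = (1 - f) / n * (S_y2 + R ** 2 * S_x2 - 2 * R * S_xy)
print(mse(est_mean), mse(est_ratio), approx)
```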
17.
The majority of actions designed to improve processes and quality include an assessment of the capability of the measurement system. The statistical model relating the measured value to the true, but not observable, value of a product characteristic is usually Gaussian and additive. In this paper we propose to extend this model to a more general formulation by introducing the structure of the two-component error model. An approximate method for evaluating the misclassification rates under the two-component error model is proposed and assessed.
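A Monte Carlo sketch of misclassification rates when the measured value follows a two-component (additive plus proportional) error structure of the Rocke-Lorenzato form; the specification limits, process distribution and error variances are illustrative assumptions, and the paper's own approximation method is not reproduced.

```python
# Monte Carlo sketch of misclassification rates for a gauge whose measured
# value follows a two-component error model, y = x * exp(eta) + eps, with a
# proportional log-normal component eta and an additive Gaussian component eps.
# Spec limits, process distribution and error variances are illustrative.
import numpy as np

rng = np.random.default_rng(7)
LSL, USL = 9.0, 11.0                                # specification limits
x = rng.normal(10.0, 0.4, size=1_000_000)           # true product values
eta = rng.normal(0.0, 0.02, size=x.size)            # proportional error component
eps = rng.normal(0.0, 0.05, size=x.size)            # additive error component
y = x * np.exp(eta) + eps                           # measured values

true_ok = (x >= LSL) & (x <= USL)
meas_ok = (y >= LSL) & (y <= USL)
false_failure = np.mean(true_ok & ~meas_ok) / np.mean(true_ok)   # P(reject | conforming)
missed_fault = np.mean(~true_ok & meas_ok) / np.mean(~true_ok)   # P(accept | non-conforming)
print(false_failure, missed_fault)
```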
18.
As a special class of shift-invariant spaces, spline subspaces offer many advantages and find many practical applications in signal and image processing. In this paper, we consider the sampling and reconstruction problem in spline subspaces. We improve the lower bound in the sampling-set conditions for spline subspaces and, based on the improved explicit lower bound, obtain an improved explicit convergence ratio for the reconstruction algorithm; the improved ratio yields a faster convergence rate than the previous one. Finally, some numerical examples are given to validate our results.
19.
Ranked set sampling (RSS) is a statistical technique that uses auxiliary ranking information on unmeasured sample units in an attempt to select a more representative sample, providing better estimation of population parameters than simple random sampling. However, the use of RSS can be hampered by the fact that a complete ranking of the units in each set must be specified when implementing it. Recently, to allow ties to be declared as needed, Frey (Environ Ecol Stat 19(3):309–326, 2012) proposed a modification of RSS that simply breaks ties at random, so that a standard ranked set sample is obtained, while recording the tie structure for use in estimation. Under this RSS variation, several mean estimators were developed and their performance was compared via simulation, with a focus on continuous outcome variables. We extend the work of Frey (2012) to binary outcomes and investigate three nonparametric and three likelihood-based proportion estimators (with and without utilizing tie information), among which four are directly extended from existing estimators and the other two are novel. Under different tie-generating mechanisms, we compare the performance of these estimators and draw conclusions based on both simulation and a data example about breast cancer prevalence. Suggestions are made about the choice of proportion estimator in general.
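A simulation sketch of balanced RSS for a binary outcome in which ranking uses a continuous auxiliary variable and the proportion is estimated by the plain mean of the measured units; the set size, number of cycles and latent model are illustrative, and the tie-aware estimators from the paper are not coded.

```python
# Simulation sketch of ranked set sampling (RSS) for a binary outcome, where
# ranking uses a continuous auxiliary variable correlated with the outcome and
# the proportion is estimated by the plain mean of the measured units (the
# simplest nonparametric estimator).  All design choices are illustrative.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

def rss_binary_sample(k, cycles):
    """Balanced RSS: for each cycle and each rank r, draw a set of k units,
    rank them on the auxiliary variable, and measure only the r-th ranked unit."""
    out = []
    for _ in range(cycles):
        for r in range(k):
            aux = rng.normal(0, 1, size=k)                          # ranking variable
            y = (aux + rng.normal(0, 1, size=k) > 0.5).astype(int)  # correlated binary outcome
            out.append(y[np.argsort(aux)[r]])
    return np.array(out)

k, cycles, reps = 3, 10, 5_000
true_p = 1 - norm.cdf(0.5 / np.sqrt(2))          # P(latent > 0.5), latent ~ N(0, 2)
rss_est = [rss_binary_sample(k, cycles).mean() for _ in range(reps)]
srs_y = lambda size: (rng.normal(0, 1, size) + rng.normal(0, 1, size) > 0.5).astype(int)
srs_est = [srs_y(k * cycles).mean() for _ in range(reps)]
print(true_p, np.var(rss_est), np.var(srs_est))  # RSS variance should be the smaller one
```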
20.
To use a control chart, the quality engineer must specify three decision variables: the sample size, the sampling interval and the critical region of the chart. A significant part of recent research has relaxed the constraint of fixed design parameters, opening the way to a new type of control chart, the adaptive chart, in which at least one of the decision variables may change in real time based on the most recent data. These adaptive schemes have proven their effectiveness from economic and statistical points of view. In this paper, the economic design of an attribute np control chart using a variable sampling interval (VSI) is treated. A sensitivity analysis is conducted to search for optimal design parameters minimizing the expected total cost per hour and to reveal the impact of the process and cost parameters on the behavior of the optimal solutions. An economic comparison between the classical np chart, the variable sample size (VSS) np control chart and the VSI chart is conducted. It is found that switching from the classical attribute chart to the VSI sampling strategy results in notable cost savings and in a reduction of the average time to signal and the average number of false alarms. In most cases of the sensitivity analysis, the VSI np chart outperforms the VSS np chart on both economic and statistical grounds.
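A sketch of the statistical side only: the ARL and an approximate average time to signal for an np chart whose next sampling interval depends on a warning limit; the independent-samples approximation and all limits, interval lengths and fractions are assumptions, and no cost model is included.

```python
# Sketch of the statistical behaviour of a variable sampling interval (VSI)
# np chart: samples of size n are taken, the count of non-conforming items d
# is compared with a warning limit w and a control limit UCL, and the next
# sampling interval is long (h1) when d <= w, short (h2) when w < d < UCL.
# Uses the usual independent-samples approximation ATS ~ ARL * E[interval |
# no signal]; all limits, interval lengths and fractions are illustrative.
from scipy.stats import binom

def np_chart_vsi(n, UCL, w, h1, h2, p):
    p_signal = 1.0 - binom.cdf(UCL - 1, n, p)        # d >= UCL -> out-of-control signal
    p_safe = binom.cdf(w, n, p)                      # d <= w  -> long interval next
    p_warn = binom.cdf(UCL - 1, n, p) - p_safe       # w < d < UCL -> short interval next
    arl = 1.0 / p_signal
    mean_interval = (h1 * p_safe + h2 * p_warn) / (p_safe + p_warn)
    return arl, arl * mean_interval                  # ARL and approximate ATS

n, UCL, w, h1, h2 = 50, 6, 2, 2.0, 0.25
print("in control :", np_chart_vsi(n, UCL, w, h1, h2, p=0.02))
print("shifted    :", np_chart_vsi(n, UCL, w, h1, h2, p=0.06))
```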