Similar literature (20 records found)
1.
The design of an experiment for evaluating sampling uncertainty within the fitness-for-purpose framework is described in terms of the probabilities (the user's risks) of type 1 and type 2 errors in decisions about the significance of effects influencing the sampling uncertainty and the measurement results. As a case study, an experiment based on the duplicate method for quantifying the sampling uncertainty and inhomogeneity (heterogeneity) of a melt of tin-free bronze produced in a 10-ton reverberatory furnace is analyzed; the melt is defined as the sampling target. It is shown that the number of such targets (melts), the number of samples analyzed and the number of replicate analyses can be minimized, i.e., the size and cost of the experiment can be reduced, if the user knows which risks are acceptable. When the inhomogeneity of the sampling target is systematic, such as the decrease in the mass fraction of aluminum from the beginning to the end of the melt pouring in the case study, the inhomogeneity effect can be separated from the sampling uncertainty and evaluated according to the user's needs.

2.
Lyn JA  Ramsey MH  Coad DS  Damant AP  Wood R  Boon KA 《The Analyst》2007,132(11):1147-1152
This paper presents methods for calculating confidence intervals for estimates of sampling uncertainty (s(samp)) and analytical uncertainty (s(anal)) using the chi-squared distribution. These uncertainty estimates are derived from application of the duplicate method, which recommends a minimum of eight duplicate samples. The methods are applied to two case studies: moisture in butter and nitrate in lettuce. Use of the recommended minimum of eight duplicate samples is justified for both case studies, as the confidence intervals calculated using more than eight duplicates showed no appreciable reduction in width. Eight duplicates are considered to provide estimates of uncertainty that are both acceptably accurate and cost-effective.
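The construction of such intervals can be sketched directly from the chi-squared distribution. This is a minimal illustration under assumed names (the function and the use of SciPy are ours; the effective degrees of freedom for s(samp) depend on the ANOVA design, which the paper details):

```python
from scipy.stats import chi2

def stdev_confidence_interval(s, dof, alpha=0.05):
    """Two-sided confidence interval for a true standard deviation,
    given an estimate s with `dof` degrees of freedom: the pivot
    dof * s^2 / sigma^2 follows a chi-squared distribution."""
    lower = s * (dof / chi2.ppf(1 - alpha / 2, dof)) ** 0.5
    upper = s * (dof / chi2.ppf(alpha / 2, dof)) ** 0.5
    return lower, upper

# With 8 degrees of freedom the interval is still wide:
lo, hi = stdev_confidence_interval(1.0, 8)   # roughly (0.68, 1.92)
```

The narrowing of the interval as `dof` grows is what the cost-effectiveness argument rests on: beyond eight duplicates the extra reduction in width is marginal.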

3.
There is an increasing appreciation that the uncertainty in environmental measurements is vitally important for their reliable interpretation. However, the adoption of methods to estimate this uncertainty has been limited by the extra cost of implementation. One method that can be used to estimate the random components of uncertainty in the sampling and analytical processes requires the collection of duplicate samples at 10% of the primary sampling locations and duplicating the analyses of these samples. A new program has been written and applied to a modified experimental design to enable a 33% reduction in the cost of analysing this 10% subset, whilst accommodating outlying values. This unbalanced robust analysis of variance (U-RANOVA) uses an unbalanced rather than the balanced experimental design usually employed. Simulation techniques have been used to validate the results of the program, by comparison of the results between the proposed unbalanced and the established balanced designs. Comparisons are also made against the seed parameters (mean and standard deviation) used to simulate the parent population, prior to the addition of a proportion (up to 10%) of outlying values. Application to a large number of different simulated populations shows that U-RANOVA produces results that are effectively indistinguishable from the results produced by the accepted balanced approach and are equally close to the true (seed) parameters of the parent normal population.
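For orientation, the balanced duplicate design that U-RANOVA is benchmarked against can be written out with plain (non-robust) ANOVA algebra. A sketch under assumed names, not the U-RANOVA program itself:

```python
import numpy as np

def duplicate_method(x):
    """Classical balanced ANOVA for the duplicate method.
    x has shape (n_locations, 2, 2): two duplicate samples per
    location, each analysed twice.  Returns (s_samp, s_anal)."""
    x = np.asarray(x, dtype=float)
    # Analytical variance from the analysis pairs within each sample
    s2_anal = np.mean((x[:, :, 0] - x[:, :, 1]) ** 2) / 2.0
    # Variance between duplicate-sample means within each location
    m = x.mean(axis=2)
    s2_means = np.mean((m[:, 0] - m[:, 1]) ** 2) / 2.0
    # Remove the analytical contribution carried into the sample means
    s2_samp = max(s2_means - s2_anal / 2.0, 0.0)
    return s2_samp ** 0.5, s2_anal ** 0.5
```

The `max(..., 0)` guards against the negative variance estimates that arise when analytical scatter dominates; the robust, unbalanced version replaces these plain means with outlier-resistant estimates.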

4.
Calculation of measurement uncertainty is a requirement for all laboratories accredited to ISO/IEC 17025, including those carrying out microbiological analyses. Today, calculation of measurement uncertainty in microbiological analyses from precision data, following global-approach principles, is widely accepted by microbiologists because of the difficulty of quantifying individual uncertainty sources. In food microbiology, precision data obtained from different samples usually show over-dispersion, and the use of over-dispersed data may result in a large variance. The current ISO standard on measurement uncertainty in food microbiology proposes the use of log-transformed precision data to overcome this problem. This paper proposes an alternative procedure for calculating measurement uncertainty in food microbiology using non-logarithmic precision data. The calculations in this procedure, based on the relative range of duplicate analyses, can be applied to intra-laboratory reproducibility data from microbiological analyses whose duplicate results show relatively low variation.
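A range-based estimator of this kind can be sketched as follows, assuming the standard control-chart constant d2 = 1.128 for pairs; the exact formula in the proposed procedure may differ:

```python
def relative_range_uncertainty(pairs):
    """Relative standard uncertainty from duplicate analysis pairs
    using the relative range.  For a pair the expected range is
    d2 * sigma, with d2 = 1.128 (control-chart constant for n = 2),
    so the mean relative range divided by d2 estimates the RSD."""
    d2 = 1.128
    rel_ranges = [abs(a - b) / ((a + b) / 2.0) for a, b in pairs]
    return (sum(rel_ranges) / len(rel_ranges)) / d2
```

Because each range is normalised by the pair mean, the estimator works directly on non-logarithmic counts, provided the duplicate results show the relatively low variation the abstract stipulates.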

5.
The evidentiary weight attributed to forensic breath alcohol results in drunk-driving prosecutions requires that measurement uncertainty be established and shown to be fit for purpose. The principal components contributing to breath alcohol measurement uncertainty are: (1) biological/sampling, (2) instrumental, (3) traceability and (4) the water/air partition coefficient for control standards. Employing duplicate breath results from over 92,000 subjects to estimate the biological/sampling component, and assuming reasonable forensic values for the other components, the combined and expanded uncertainty is determined for a practical example. The combined standard uncertainty for an unbiased single-determination breath alcohol measurement was approximately 0.0027 g/210 L. Employing the expanded uncertainty (k = 2.58), the 99% confidence interval for a mean breath alcohol concentration of 0.0935 g/210 L was 0.0866 to 0.1004 g/210 L. The proportion of combined uncertainty associated with each component was determined to be: biological/sampling 73%, analytical 10%, traceability 13% and water/air partition coefficient 4%. These are forensically acceptable estimates and demonstrate fitness for purpose of breath alcohol measurement when appropriate elements of quality control are employed.
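The reported interval can be reproduced from expanded-uncertainty arithmetic alone, using only the figures quoted above (a sketch; the paper's exact combined-uncertainty value is not reproduced here):

```python
def expanded_interval(mean, u_combined, k):
    """Expanded uncertainty interval: mean +/- k * u_combined."""
    U = k * u_combined
    return mean - U, mean + U

# Recover u_c from the reported 99% interval (k = 2.58):
# U = 0.1004 - 0.0935 = 0.0069 g/210 L, so u_c = U / 2.58.
u_c = (0.1004 - 0.0935) / 2.58
low, high = expanded_interval(0.0935, u_c, 2.58)
```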

6.
The combined uncertainties in the analytical results for solid materials obtained by two methods (ET-AAS after prior sample digestion, and direct solid sampling) are derived by applying the Guide to the Expression of Uncertainty in Measurement of the International Organization for Standardization. For the analysis of solid materials, three uncertainty components must generally be considered: (i) the calibration, (ii) the measurement of the unknown sample and (iii) the analytical quality control (AQC) process. The expanded uncertainty limits for the contents of cadmium and lead in biological samples are calculated from analytical data with the derived statistical estimates. For both methods the expanded uncertainty intervals are generally of similar width if all sources of uncertainty are included: the relative uncertainty limits range from 6% to 10% for the determination of cadmium and from 8% to 16% for the determination of lead. However, the individual uncertainty components contribute to different degrees. Whereas with calibration based on reference solutions (digestion method) the respective contribution may be negligible (precision < 3%), the uncertainty from calibration based directly on a certified reference material (CRM) (solid sampling) may contribute significantly (precision about 10%). In contrast, the AQC measurement required when calibration is based on reference solutions contributes an additional uncertainty component, whereas for CRM calibration the AQC is "built in". For both methods, the uncertainty in the certified content of the CRM used for AQC must be considered. Estimation of the uncertainty components is shown to be a suitable tool for experimental design aimed at a small uncertainty in the analytical result.

7.
Three non-specific methods for the extraction of total petroleum hydrocarbons (TPH) from soil into organic solvent were compared: Soxhlet extraction, closed-vessel microwave-assisted extraction, and CEN shake extraction. The total concentrations of extracted compounds in the boiling-point range C10–C40 were determined by gas chromatography with flame ionization detection. The best recovery (99%) and repeatability (±3%) from standard oil mixtures were obtained with microwave-assisted extraction. However, the extraction methods behaved differently when spiked soil samples were extracted: the best repeatability was obtained with CEN shake extraction (±6%), whereas the repeatability values for the Soxhlet and microwave-assisted methods were quite high (>20%). The larger uncertainties of the latter methods do not necessarily limit their applicability to the determination of petroleum hydrocarbons in soil, because in the assessment of soil contamination the expanded uncertainty of the result is usually limited not by the analytical uncertainty but by the uncertainty of the primary sampling stage. However, the distinctive variation found in the chromatographic profiles shows that discretion should be exercised when chromatograms obtained with different extraction methods on petroleum-contaminated samples are used in fingerprinting or age-dating studies; otherwise, misleading conclusions about the age of a spillage could be drawn.

8.
Tsukakoshi Y 《The Analyst》2011,136(3):533-539
Here, the uncertainty budget for a total diet study (TDS) was clarified by separating the total measurement uncertainty into the uncertainty arising from the compositional heterogeneity of food items between cities (inter-city variance), the heterogeneity of food items within cities (intra-city variance), and the chemical analysis of the food samples (analytical variance), for a single study design. TDS samples were collected from 14 cities in Japan. Duplicate samples collected in each city were prepared from food items purchased from different shops, and the cadmium concentrations were measured individually to obtain the intra-city variance. These results were used to show the importance of sampling design in TDSs by evaluating a multi-stage sampling design, in which multiple samples are collected from several cities; such schemes have been applied to TDSs, but the uncertainty involved had not previously been assessed. An intra-city correlation was observed between the cadmium concentrations in samples from the same city, demonstrating that the effective sample size is not simply the number of cities and shops sampled. The TDS results showed a high intra-city variance, greater than the inter-city variance for all of the food groups studied, particularly the bean and potato groups. By combining the sampling and analytical uncertainties obtained, the sampling uncertainty across different primary and secondary sampling unit sizes was derived. As suggested by the analysis of potatoes and beans, grouping food samples from different shops in the same city can improve the representativeness of the results.
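The inter-/intra-city split from duplicate samples is classical one-way ANOVA; a minimal sketch under assumed names (the paper's multi-stage design adds an analytical stage on top of this):

```python
import numpy as np

def city_variance_components(dups):
    """One-way ANOVA for duplicate samples per city.
    dups has shape (n_cities, 2): two results per city.
    Returns (inter_city_var, intra_city_var)."""
    dups = np.asarray(dups, dtype=float)
    # Within-city (intra-city) variance from the duplicate differences
    intra = np.mean((dups[:, 0] - dups[:, 1]) ** 2) / 2.0
    # Between-city variance of city means, less the intra-city carry-over
    means = dups.mean(axis=1)
    inter = max(np.var(means, ddof=1) - intra / 2.0, 0.0)
    return inter, intra
```

When the intra-city component dominates, as found here for beans and potatoes, adding shops within a city reduces the variance of the composite more effectively than adding cities.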

9.
Lyn JA  Ramsey MH  Damant AP  Wood R 《The Analyst》2005,130(9):1271-1279
Uncertainty estimates from routine sampling and analytical procedures can be assessed as fit for purpose using the optimised uncertainty (OU) method. The OU method recommends an optimal level of uncertainty that should be reached in order to minimise the expected financial loss arising from misclassification of a batch as a result of the uncertainty. Sampling theory can be used as a predictive tool when a change in sampling uncertainty is recommended by the OU method. The OU methodology has been applied iteratively for the first time, using a case study of wholesale butter and the determination of five quality indicators: moisture, fat, solids-not-fat (SNF), peroxide value (PV) and free fatty acid (FFA). The sampling uncertainty (s(samp)) was found to be sub-optimal for moisture and PV determination for 3-fold composite samples. A revised sampling protocol was devised using Gy's sampling theory, which predicted that an increase in sample mass would reduce the sampling uncertainty to the optimal level, resulting in an expected saving in loss of over £2000 per 20 tonne batch compared with current methods. Application of the optimal protocol did not, however, achieve the desired reduction in s(samp), owing to limitations in sampling theory. The OU methodology proved to be a useful tool for identifying broad weaknesses within a routine protocol and assessing fitness for purpose. However, successful routine application of sampling theory as part of the optimisation process requires substantial prior knowledge of the sampling target.
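The OU method rests on a trade-off: measurement cost rises as uncertainty is pushed down, while expected misclassification loss rises with uncertainty. One common quadratic loss model admits a closed-form optimum (a sketch with illustrative constants, not the paper's exact cost function):

```python
def expected_loss(u, A, B):
    """Expected loss as a function of measurement uncertainty u:
    A / u**2 models measurement cost (halving u quadruples cost),
    B * u**2 models the expected cost of batch misclassification."""
    return A / u ** 2 + B * u ** 2

def optimal_uncertainty(A, B):
    """Minimiser of the loss above:
    d/du (A/u**2 + B*u**2) = 0 gives u_opt = (A / B) ** 0.25."""
    return (A / B) ** 0.25
```

In the butter case study, s(samp) above this optimum is what triggers the recommendation to increase sample mass.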

10.
Sampling and the uncertainty of sampling are important tasks when industrial processes are monitored. Missing values and unequal sources can cause problems in almost all industrial fields. One major problem is that samples may not be collected during weekends; alternatively, a single composite sample may be collected over the weekend. These systematically occurring missing values (gaps) affect the uncertainties of the measurements. Another type of missing value is random, caused, for example, by instrument failures. Pierre Gy's sampling theory includes tools to evaluate all error components involved in sampling heterogeneous materials. Variograms, introduced in Gy's sampling theory, have been developed to estimate the uncertainty of auto-correlated process measurements, and variographic experiments are used to estimate the variance of different sample selection strategies: random sampling, stratified random sampling and systematic sampling. In this paper both systematic and random gaps were studied using simulations and real process data taken from bark boilers of pulp and paper mills (combustion processes). Systematic gaps were filled by linear interpolation, and cases involving composite sampling were also studied. The aims of this paper are to determine: (1) how reliably a variogram calculated from data with systematic gaps estimates the process variogram, and (2) how the uncertainty due to missing gaps can be estimated when reporting time-averages of auto-correlated time series measurements. The results show that when systematic gaps were filled by linear interpolation, only minor changes in the variogram values were observed, and the differences between the variograms were consistently smallest with composite samples.
For random gaps, the results show that for non-periodic processes the stratified random sampling strategy gives more reliable results than the systematic sampling strategy; stratified random sampling should therefore be used when estimating the uncertainty of random gaps in reporting time-averages of auto-correlated time series measurements.
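The empirical variogram underlying this kind of analysis is simple to compute; a minimal sketch (function name ours):

```python
import numpy as np

def variogram(x, max_lag):
    """Empirical variogram of a 1-D process series:
    V(j) = sum_i (x[i+j] - x[i])**2 / (2 * (N - j))
    for lags j = 1 .. max_lag."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    return {j: float(np.sum((x[j:] - x[:n - j]) ** 2) / (2 * (n - j)))
            for j in range(1, max_lag + 1)}
```

For an uncorrelated process the variogram is flat at the process variance; autocorrelation shows up as lower values at short lags, which is what makes the sample-selection strategies above differ in variance.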

11.
Arsenic is a toxic element that causes several problems in human beings, especially when inhaled. Accurate and precise measurement of arsenic in suspended particulate matter (SPM) is therefore of prime importance, as it indicates the level of toxicity in the environment so that preventive measures can be taken in the affected areas. Quality assurance is equally important in the measurement of arsenic in SPM samples before any decision is made. The quality and reliability of data for such volatile elements depend on the measurement of the uncertainty of each step involved, from sampling to analysis; analytical results with a quantified uncertainty give a measure of the confidence level of the laboratory concerned. The main objectives of this study were therefore to determine the arsenic content in SPM samples with an uncertainty budget and to identify the various potential sources of uncertainty that affect the results. Seven diverse sites in Delhi (the national capital of India) were selected for quantification of the arsenic content in SPM samples, with an uncertainty budget covering sampling by high-volume sampler (HVS) through analysis by atomic absorption spectrometry with hydride generation (AAS-HG). Since many steps are involved from sampling to the final result, various potential sources of uncertainty were considered; the uncertainty calculation is based on the ISO/IEC 17025:2005 document and the EURACHEM guideline. The final results were found to depend mainly on the uncertainty due to repeatability, the final volume prepared for analysis, the weighing balance and sampling by HVS. From the data for the seven sites, it was concluded that during the period 31 January to 7 February 2008 the arsenic concentration varied from 1.44 ± 0.25 to 5.58 ± 0.55 ng/m3 at the 95% confidence level (k = 2).

12.
Existing methods have been applied to estimate the uncertainty of measurement, caused by both sampling and analysis, and the fitness for purpose of these measurements. A new approach has been taken to modify the measurement uncertainty by changing the contribution made by the sampling process. A case study on nitrate in lettuce is used to demonstrate the applicability of this new generic approach. Gy's sampling theory was used to predict the alterations in the sampling protocol required to achieve the necessary change in sampling uncertainty, and an experimental application of the altered protocol demonstrated that the predicted change was achieved in practice. For the lettuce case study, this approach showed that composite samples containing 40 heads, rather than the usual ten, produced measurements of nitrate that were more fit for purpose.
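The kind of prediction used here follows from the inverse scaling of random sampling variance with composite size (a sketch of the scaling argument only; Gy's full theory also accounts for particle size and constitution):

```python
def predicted_sampling_sd(s_current, n_current, n_proposed):
    """Under Gy-type scaling, random sampling variance falls inversely
    with the number of increments in a composite: s**2 is proportional
    to 1/n, so s_new = s_current * sqrt(n_current / n_proposed)."""
    return s_current * (n_current / n_proposed) ** 0.5

# Going from 10-head to 40-head composites halves the sampling SD.
s_new = predicted_sampling_sd(1.0, 10, 40)   # -> 0.5
```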

13.
Heydorn K  Nørgård K 《Talanta》1973,20(9):835-842
The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials is tested by the statistic T, which is shown to be approximated by a chi-squared distribution. Application of this test to the results of determinations of manganese in human serum by a method of established precision led to the detection of airborne contamination of the serum during the sampling process. The subsequent improvement in sampling conditions was shown to give not only increased precision but also improved accuracy of the results.
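The T statistic of this "analysis of precision" can be sketched as follows, assuming the a priori variance of each duplicate difference is known (the names and the SciPy dependence are ours):

```python
from scipy.stats import chi2

def precision_test(differences, variances):
    """Test whether duplicate differences d_i are consistent with their
    a priori variances var_i: T = sum(d_i**2 / var_i) is approximately
    chi-squared with n degrees of freedom when the stated precision
    accounts for the observed scatter.  Returns (T, p-value)."""
    T = sum(d * d / v for d, v in zip(differences, variances))
    n = len(differences)
    return T, chi2.sf(T, n)
```

A small p-value signals excess scatter, such as the airborne contamination detected in the serum study; an inflated T thus flags an unrecognised uncertainty source rather than mere bad luck.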

15.
As part of the European Commission (EC)'s revision of the Sewage Sludge Directive and the development of a Biowaste Directive, there was recognition of the difficulty of comparing data from Member States (MSs) because of differences in sampling and analytical procedures. The 'HORIZONTAL' initiative, funded by the EC and MSs, seeks to address these differences in approach and to produce standardised procedures in the form of CEN standards. This article is a preliminary investigation into aspects of the sampling of biosolids, composts and soils with a history of biosolid application. It provides information on the measurement uncertainty associated with sampling from heaps, large bags, pipes and soils in the landscape under a limited set of conditions, using sampling approaches in space and time and sample numbers based on procedures widely used in the relevant industries and when sampling similar materials. These preliminary results suggest that considerably more information is required before the appropriate sample design, optimum number of samples, number of samples comprising a composite, and temporal and spatial frequency of sampling can be recommended to achieve consistent results with a high level of precision and confidence.

16.
Complex analytical procedures are often required to prove non-compliance with a specific piece of legislation. When a limit value is only slightly exceeded, integration of the method uncertainty into the decision-making process is essential. The decision rule proposed in Wallonia, Belgium, for the non-compliance of waste incineration plants with the EU limit value for PCDD and PCDF emissions is presented. The method uncertainty was estimated annually over 6 years from duplicate measurements using two top-down approaches. Depending on the congener, the standard uncertainty varies from 30 to 85%, with good agreement between the two calculations. The analytical contribution was estimated using a bottom-up evaluation; the impact of the sampling step was deduced from the whole estimation and represents more than 80% of the total uncertainty budget. No optimisation is foreseen at this time because of practical field constraints. Based on the average fraction of each congener, the uncertainty associated with the measurement result has been established and shows high stability over the years. Using this value, a guard band has been calculated and will be proposed to the regulatory body. Presented at the Measurement Uncertainty Symposium, Berlin, Germany, April 2008.
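A guard band of this kind converts a relative measurement uncertainty into a guarded decision threshold. A sketch with illustrative numbers: the 0.1 ng TEQ/Nm3 limit and the one-sided coverage factor k = 1.64 (about 5% false non-compliance risk under a normal error model) are assumptions here, not the values adopted in Wallonia:

```python
def noncompliance_threshold(limit, u_rel, k=1.64):
    """Guarded decision threshold: declare non-compliance only when the
    measured result exceeds the limit by more than the guard band
    g = k * u, applied multiplicatively because u_rel is a relative
    standard uncertainty."""
    return limit * (1.0 + k * u_rel)

# 30% relative uncertainty (the best congener case above):
t = noncompliance_threshold(0.1, 0.30)
```

With 85% relative uncertainty the threshold grows much further, which is why the dominant sampling contribution matters so much to the regulator.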

18.
The 3M Petrifilm Staph Express Count plate method was compared with AOAC Official Method 975.55 for the enumeration of Staphylococcus aureus in selected foods. Four foods (cooked diced chicken, cured ham, smoked salmon and pepperoni) were analyzed for S. aureus by 12 collaborating laboratories. For each food tested, the collaborators received 8 blind test samples consisting of a control sample, a low inoculation level, a medium inoculation level, and a medium inoculation level with background flora, each in duplicate. The mean log10 counts for the two methods were comparable for all 4 foods, and the repeatability and reproducibility variances of the 24 h Petrifilm Staph Express Count plate method were similar to those of the 72 h standard method.

19.
For minimum-variance estimation of parameters by the method of least squares, heteroscedastic data should be weighted inversely as their variance, w_i ∝ 1/σ_i². Here the instrumental data variance for a commercial high-performance liquid chromatography (HPLC) instrument is estimated from 5 to 11 replicate measurements on more than 20 samples for each of four different analytes. The samples span a range of over four orders of magnitude in concentration and HPLC peak area, over which the sampling variance estimates s² are well represented as the sum of a constant term and a term proportional to the square of the peak area. The latter contribution is dominant over most of the range used in routine HPLC analysis and represents approximately 0.2% of peak area for all four analytes studied here. It includes a contribution from uncertainty in the syringe injection volume, found to be ±0.008 μL. The dominance of proportional error justifies the use of 1/x² or 1/y² weighting in routine calibration with such data; however, the constant variance term means that these weighting formulas are not correct in the low-signal limit relevant for analysis at trace levels. Least-squares methods for both direct and logarithmic fitting of the variance sampling estimates are described. Since such estimates themselves have proportional uncertainty, direct fitting requires iterative adjustment of the weights, whereas logarithmic fitting does not.
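The iterative reweighting described in the last sentence can be sketched as follows (assumed names; a direct weighted fit of s² = a + b·A², with the weights refreshed from the current fit because the variance estimates have roughly proportional uncertainty):

```python
import numpy as np

def fit_variance_model(area, s2, n_iter=5):
    """Fit s^2 = a + b * area^2 by weighted least squares.
    Since sampling variance estimates have ~proportional uncertainty,
    weights 1/pred^2 are recomputed from the current fit each pass."""
    area = np.asarray(area, dtype=float)
    s2 = np.asarray(s2, dtype=float)
    X = np.column_stack([np.ones_like(area), area ** 2])
    w = np.ones_like(s2)                      # start unweighted
    for _ in range(n_iter):
        sw = np.sqrt(w)
        coef, *_ = np.linalg.lstsq(X * sw[:, None], s2 * sw, rcond=None)
        pred = X @ coef
        w = 1.0 / np.clip(pred, 1e-30, None) ** 2
    return coef                               # (a, b)
```

With exact model data the fit is recovered immediately; with real variance estimates the weights stabilise after a few iterations, which is the iterative adjustment the abstract contrasts with logarithmic fitting.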

