Similar Literature
20 similar articles found (search time: 15 ms)
1.
Accreditation and Quality Assurance - Measurement uncertainty that arises from primary sampling can be expressed as an uncertainty factor, which recognises its sometimes approximately log-normal...
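The uncertainty-factor idea above can be sketched numerically. A minimal illustration, assuming the common convention F_U = exp(k·s), where s is the standard deviation of the log-transformed values and k = 2 gives roughly 95 % coverage; the function name and data are hypothetical, not from the paper:

```python
import math

def uncertainty_factor(values, coverage=2.0):
    """Expanded uncertainty factor for approximately log-normally
    distributed sampling data: F_U = exp(k * s), where s is the
    standard deviation of the natural-log-transformed values."""
    logs = [math.log(v) for v in values]
    n = len(logs)
    mean = sum(logs) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in logs) / (n - 1))
    return math.exp(coverage * s)

# Hypothetical duplicate measurements: the true value is bracketed by
# x / F_U and x * F_U at ~95 % confidence (multiplicative interval).
fu = uncertainty_factor([8.2, 11.5, 9.7, 14.1, 7.9, 10.4])
```

The multiplicative interval is what makes the factor suited to log-normal sampling distributions, where a symmetric ± interval could go negative.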

2.
The variability associated with the aflatoxin test procedure used to estimate aflatoxin levels in bulk shipments of hazelnuts was investigated. Sixteen 10 kg samples of shelled hazelnuts were taken from each of 20 lots that were suspected of aflatoxin contamination. The total variance associated with testing shelled hazelnuts was estimated and partitioned into sampling, sample preparation, and analytical variance components. Each variance component increased as aflatoxin concentration (either B1 or total) increased. With the use of regression analysis, mathematical expressions were developed to model the relationship between aflatoxin concentration and the total, sampling, sample preparation, and analytical variances. The expressions for these relationships were used to estimate the variance for any sample size, subsample size, and number of analyses for a specific aflatoxin concentration. The sampling, sample preparation, and analytical variances associated with estimating aflatoxin in a hazelnut lot at a total aflatoxin level of 10 ng/g and using a 10 kg sample, a 50 g subsample, dry comminution with a Robot Coupe mill, and a high-performance liquid chromatographic analytical method are 174.40, 0.74, and 0.27, respectively. The sampling, sample preparation, and analytical steps of the aflatoxin test procedure accounted for 99.4, 0.4, and 0.2% of the total variability, respectively.
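The reported variance components can be turned into percentage contributions directly; a small arithmetic check of the figures quoted above (values taken from the abstract):

```python
# Variance components at 10 ng/g total aflatoxin, as reported in the study.
variances = {"sampling": 174.40, "sample_prep": 0.74, "analytical": 0.27}

total = sum(variances.values())  # total variance of the test procedure
shares = {step: 100 * v / total for step, v in variances.items()}
# Sampling dominates at ~99.4 % of the total variability; sample
# preparation and analysis contribute well under 1 % each.
```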

3.
A workshop on uncertainty in sampling was held in Hillerød, Denmark, on 12–13 April 2007 to launch a new handbook on sampling quality assurance and uncertainty estimation. The participants of the workshop were approximately 60 delegates from 15 European countries, representing institutions performing sampling, users of the data, research institutions, as well as accreditation bodies. Materials from the workshop, including examples, tools, and calculation aids for the work can be found at http://www.samplersguide.com. The Nordtest handbook Uncertainty from sampling will be made available on the Nordtest web site at http://www.nordicinnovation.net/nordtest.cfm under NT technical reports, report number NT tec 604. Until the final report is available on the Nordtest web site, an advance draft of the Nordtest handbook is available from http://www.samplersguide.com.

4.
A practical approach to assessment of sampling uncertainty   (Cited by: 1; self-citations: 0, other citations: 1)
The paper reports the approach followed in the SOILSAMP project, funded by the National Environmental Protection Agency (ANPA) of Italy. SOILSAMP is aimed at assessing uncertainties associated with soil sampling in agricultural, semi-natural, urban, and industrial environments. The uncertainty assessment is based on a bottom-up approach, according to the Guide to the Expression of Uncertainty in Measurement published by the International Organization for Standardization (ISO). A designated agricultural area, which has been characterized in terms of elemental spatial distribution, will be used in future as a reference site for soil sampling intercomparison exercises. Received: 19 November 2001 Accepted: 6 January 2002

5.
Accreditation and Quality Assurance - Unfortunately, the references 12 and 13 were incorrectly published in the original publication. The correct references are:

6.
Samples of lyophilised porcine kidney were scanned by photon transmission tomography. The tomographs were used in the selection of sub-samples for subsequent trace element determination by 2 MeV PIXE analysis. Major and trace element concentrations for Ca, Cr, Mn, Fe, Ni, Cu, Zn, As, Se, Br, Rb, Sr, Cd and Pb for whole kidney, cortex and medulla are derived. These are discussed, together with the relevant sampling factors, to emphasise the need for rigorously enforced sampling protocols.

7.
Domestic and international regulatory limits have been established for aflatoxin in almonds and other tree nuts. It is difficult to obtain an accurate and precise estimate of the true aflatoxin concentration in a bulk lot because of the uncertainty associated with the sampling, sample preparation, and analytical steps of the aflatoxin test procedure. To evaluate the performance of aflatoxin sampling plans, the uncertainty associated with sampling lots of shelled almonds for aflatoxin was investigated. Twenty lots of shelled almonds were sampled for aflatoxin contamination. The total variance associated with measuring B1 and total aflatoxins in bulk almond lots was estimated and partitioned into sampling, sample preparation, and analytical variance components. All variances were found to increase with an increase in aflatoxin concentration (both B1 and total). By using regression analysis, mathematical expressions were developed to predict the relationship between each variance component (total, sampling, sample preparation, and analysis variances) and aflatoxin concentration. Variance estimates were the same for B1 and total aflatoxins. The mathematical relationships can be used to estimate each variance for a given sample size, subsample size, and number of analyses other than that measured in the study. When a lot with total aflatoxins at 15 ng/g was tested by using a 10 kg sample, a vertical cutter mixer type of mill, a 100 g subsample, and high-performance liquid chromatography analysis, the sampling, sample preparation, analytical, and total variances (coefficient of variation, CV) were 394.7 (CV, 132.4%), 14.7 (CV, 25.5%), 0.8 (CV, 6.1%), and 410.2 (CV, 135.0%), respectively. The percentages of the total variance associated with sampling, sample preparation, and analytical steps were 96.2, 3.6, and 0.2, respectively.
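The quoted coefficients of variation follow from the variances and the 15 ng/g concentration as CV = 100·√variance/mean; a quick reproduction (small discrepancies against the published figures are rounding in the original):

```python
import math

mean = 15.0  # ng/g total aflatoxins in the tested lot
variances = {"sampling": 394.7, "sample_prep": 14.7, "analytical": 0.8}

total_var = sum(variances.values())  # 410.2
cv = {step: 100 * math.sqrt(v) / mean for step, v in variances.items()}
cv["total"] = 100 * math.sqrt(total_var) / mean  # ~135 %
```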

8.
Existing methods have been applied to estimate the uncertainty of measurement, caused by both sampling and analysis, and fitness-for-purpose of these measurements. A new approach has been taken to modify the measurement uncertainty by changing the contribution made by the sampling process. A case study on nitrate in lettuce has been used to demonstrate the applicability of this new generic approach. The sampling theory of Gy was used to predict the alterations in the sampling protocol required to achieve the necessary change in sampling uncertainty. An experimental application of this altered sampling protocol demonstrated that the predicted change in sampling uncertainty was achieved in practice. For the lettuce case study, this approach showed that composite samples containing 40 heads, rather than the usual ten heads, produced measurements of nitrate that were more fit-for-purpose.
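The Gy-theory prediction used here, that quadrupling the number of increments halves the sampling standard uncertainty, rests on the 1/√n scaling of independent increments; a minimal sketch (the function and numbers are illustrative, not from the study):

```python
import math

def scaled_sampling_uncertainty(s_current, n_current, n_new):
    """Under Gy-type assumptions (independent increments, fundamental
    sampling error dominating), the sampling standard deviation of a
    composite sample scales as 1 / sqrt(number of increments)."""
    return s_current * math.sqrt(n_current / n_new)

# Going from 10 to 40 lettuce heads per composite should halve the
# sampling uncertainty (hypothetical starting value of 20 units).
s_new = scaled_sampling_uncertainty(s_current=20.0, n_current=10, n_new=40)
```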

9.
On three fields of arable land of (3–6)×104 m2, simple reference sampling was performed by taking up to 195 soil increments from each field applying a systematic sampling strategy. From the analytical data reference values for 15 elements were established, which should represent the average analyte mass fraction of the areas. A “point selection standard deviation” was estimated, from which a prediction of the sampling uncertainty was calculated for the application of a standard sampling protocol (X-path across the field, 20 increments in total for a composite sample). Predicted mass fractions and associated uncertainties are compared with the results of a collaborative trial of 18 experienced samplers, who had applied the standard sampling protocol on these fields. In some cases, bias between reference and collaborative values is found. Most of these biases can be explained by analyte heterogeneity across the area, in particular on one field, which was found to be highly heterogeneous for most nutrient elements. The sampling uncertainties estimated from the reference sampling were often somewhat smaller compared to those from the collaborative trial. It is suspected that the influence of sample preparation and the variation due to sampler were responsible for these differences. For the applied sampling protocol, the uncertainty contribution from sampling generally is in the same range as the uncertainty contribution from analysis. From these findings, some conclusions were drawn, especially about the consequences for a sampling protocol if a demanded “certainty of trueness” for the measurement result is to be met in routine sampling.
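The two calculations this abstract relies on, shrinking the point-selection standard deviation by √n for a composite sample and combining sampling and analytical contributions in quadrature (per the GUM), can be sketched as follows (all numbers hypothetical):

```python
import math

def composite_sampling_sd(point_selection_sd, n_increments):
    # Standard deviation of the mean of n independent increments.
    return point_selection_sd / math.sqrt(n_increments)

def combined_uncertainty(u_sampling, u_analysis):
    # Uncorrelated contributions combine in quadrature (GUM).
    return math.sqrt(u_sampling**2 + u_analysis**2)

# X-path protocol: 20 increments, hypothetical point-selection sd of 18.
u_samp = composite_sampling_sd(point_selection_sd=18.0, n_increments=20)
u_comb = combined_uncertainty(u_samp, u_analysis=4.0)
# Here sampling and analysis contribute in the same range, as the
# abstract reports for the applied protocol.
```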

10.
Ingamells CO. Talanta. 1974;21(2):141–155
Methods now being devised for the design of sampling schemes and for data evaluation promise to increase the certainty with which large, inhomogeneous, and segregated masses of material (mountains) may be analysed for interesting constituents. These methods are directed toward the integration of work originating within several disciplines, and utilize different kinds of sampling constant to control error.
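Ingamells' best-known sampling constant, Ks = R²·w, relates the relative subsampling standard deviation R (in %) to the subsample mass w (in grams); a minimal sketch with a hypothetical Ks value:

```python
import math

def relative_sd_percent(Ks, w):
    """Ingamells' sampling constant: Ks = R^2 * w, where R is the
    relative standard deviation (%) of the subsampling error and w
    the subsample mass in grams. Ks is the mass giving R = 1 %."""
    return math.sqrt(Ks / w)

# Hypothetical material with Ks = 50 g: a 2 g subsample gives R = 5 %,
# while a full 50 g subsample would bring R down to 1 %.
R = relative_sd_percent(Ks=50.0, w=2.0)
```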

11.
Accurate and reliable sampling and analysis of mercury forms is an overriding aim of any atmospheric monitoring effort that seeks to understand the fate and transport of the metal in the environment. Although only a fraction of the total mercury found in the atmosphere, particulate-phase mercury, Hg(p), is believed to play a prominent role in both wet and dry deposition to terrestrial and aquatic environments. Currently, microwave acid extraction and thermoreductive methodologies for analysis of Hg(p) samples are widely used. We report on the potential of a thermoreductive method for Hg(p) analysis, evaluating and optimizing it for use in routine monitoring networks. Pre-baked quartz filters can be placed in particulate samplers with well-characterized size cuts, such as dichotomous samplers and micro-orifice impactors. The thermoreductive methodology facilitates rapid analysis after sample collection. It requires no chemical extraction, thereby eliminating the potential for contamination and the generation of hazardous waste. Our results indicate that, on average, the thermoreductive method yields 30% lower values for fine-fraction Hg(p) when compared with microwave acid digestion. This may be due to matrix interferents that reduce the collection efficiency of mercury onto gold preconcentration traps. Results for total particulate mercury samples indicate that, on average, the thermoreductive method yields 56% lower values for the coarse fraction when compared with microwave acid digestion. Experiments were also conducted in Detroit, MI, USA to investigate whether elevated reactive gaseous mercury (Hg(2+)(g)) in an urban environment can lead to an artifact during the collection of filters for Hg(p) analysis. Our results indicate a significantly higher amount of Hg(p) collected onto a filter using the conventional methodology as compared to a filter collected downstream of KCl-coated annular denuders in the absence of Hg(2+)(g). These results point to the presence of Hg(2+)(g) as an artifact during Hg(p) measurement and indicate that a denuder must be used upstream of the filter during Hg(p) collection to prevent significant Hg(2+)(g) artifact formation.

12.
In many cases compositional requirements for foodstuffs (e.g. limits for the fat, protein, dry matter, or water content) are established by legislation. Adequate compliance testing is possible only if limits are clearly defined, taking measurement and sampling uncertainty into consideration. Furthermore, decisions on compliance must be based on samples which reflect the composition of the quantity to be evaluated. The resulting sample sizes are normally regarded by food inspection authorities as being much larger than acceptable. Consequently, an alternative strategy should be developed. Autocontrol data (i.e. inspection results obtained by the factory) in principle provide an adequate data basis for decisions on compliance. However, they must be reliable and the food inspection authority must have access to these data on request. Using these data and on condition that they show an approximate normal distribution, an inspection strategy based on arithmetic mean and standard deviation can be developed. Reliable and transparent decisions on compliance can thus be made. In many cases an adequate verification of food authenticity requires a comparison of raw material and product composition. Maximum acceptable differences, taking the relevant sources of variation into consideration, have to be defined and should be used instead of limits. Received: 17 April 2002 Accepted: 23 June 2002
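An inspection rule of the kind described, based on the arithmetic mean and standard deviation of autocontrol data under approximate normality, might look like this one-sided check (the data, limit, and coverage factor are illustrative, not from the paper):

```python
import statistics

def complies_with_max_limit(autocontrol_data, limit, k=1.64):
    """One-sided compliance check assuming approximate normality:
    the lot is judged compliant if mean + k*sd stays at or below the
    maximum limit (k = 1.64 for ~95 % one-sided coverage).
    Illustrative sketch only, not the paper's exact rule."""
    mean = statistics.mean(autocontrol_data)
    sd = statistics.stdev(autocontrol_data)
    return mean + k * sd <= limit

# Hypothetical water-content results (%) against a 60 % maximum.
ok = complies_with_max_limit([57.1, 58.3, 56.9, 57.8, 58.0], limit=60.0)
```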

13.
Obtaining rate constants and concentration profiles from spectroscopy is important in reaction monitoring. In this paper, we combined kinetic equations and Iterative Target Transformation Factor Analysis (ITTFA) to resolve spectroscopic data acquired during the course of a reaction. This approach is based on the fact that ITTFA needs a first guess (test vectors) of the parameters that will be estimated (target vectors). Three methods are compared. In the first, originally proposed by Furusjö and Danielsson, kinetic modelling is only used to provide the initial test vectors for ITTFA. In the second the rate constant used to provide the test vectors is optimised until a best fit is reached. In the third, a guess of the rate constant is used to provide the test vectors to ITTFA. The outcome of ITTFA is then used to fit the kinetic model and obtain a new guess of the rate constant. With this constant new concentration profiles are generated and provided to the ITTFA algorithm as new test vectors, in an iterative manner, minimising the residuals of the predicted dataset, until convergence. The second and third methods are new implementations of ITTFA and are compared to the first, established, method. First order (both one and two step) and second order reactions were simulated and instrumental noise was introduced. An experimental second order reaction was also employed to test the methods.
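The kinetics-constrained fitting that the second and third methods iterate on can be illustrated in miniature. The sketch below is not ITTFA itself: it simulates first-order A→B spectra and recovers the rate constant by a hard-modelling least-squares grid search, which is the "fit the kinetic model, regenerate the profiles" step in its simplest form (all data here are simulated):

```python
import numpy as np

def profiles_first_order(k, t, c0=1.0):
    # A -> B, first order: columns are [A], [B] concentration profiles.
    a = c0 * np.exp(-k * t)
    return np.column_stack([a, c0 - a])

# Simulated dataset D = C @ S + noise (hypothetical pure spectra S_true).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)
S_true = np.array([[1.0, 0.2, 0.05],
                   [0.1, 0.8, 0.60]])
D = profiles_first_order(0.5, t) @ S_true + rng.normal(0, 1e-3, (50, 3))

def residual(k):
    # Given trial profiles C(k), estimate spectra by least squares
    # and measure how well the kinetic model reproduces the data.
    C = profiles_first_order(k, t)
    S, *_ = np.linalg.lstsq(C, D, rcond=None)
    return np.linalg.norm(D - C @ S)

ks = np.linspace(0.1, 1.0, 91)
k_best = ks[np.argmin([residual(k) for k in ks])]  # recovers k ~ 0.5
```

In the real methods the grid search is replaced by ITTFA iterations, but the core idea, alternating between kinetically modelled profiles and spectra fitted to the data, is the same.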

14.
15.
Sampling and sampling strategies for environmental analysis   (Cited by: 1; self-citations: 0, other citations: 1)
Sampling errors are generally believed to dominate the errors of analytical measurement during the entire environmental data acquisition process. Unfortunately, environmental sampling errors are hardly quantified and documented even though analytical errors are frequently yet improperly reported to the third decimal point in environmental analysis. There is a significant discrepancy in directly applying traditional sampling theories (such as those developed for the binary particle systems) to trace levels of contaminants in complex environmental matrices with various spatial and temporal heterogeneities. The purpose of this critical review is to address several key issues in the development of an optimal sampling strategy with a primary goal of sample representativeness while minimizing the total number of samples and sampling frequencies, hence the cost for sampling and analysis. Several biased and statistically based sampling approaches commonly employed in environmental sampling (e.g. judgmental sampling and haphazard sampling vs. statistically based approaches such as simple random, systematic random, and stratified random sampling) are examined with respect to their pros and cons for the acquisition of scientifically reliable and legally defensible data. The effects of sample size, sample frequency and the use of compositing are addressed to illustrate the strategies for a cost reduction as well as an improved representativeness of sampling from spatially and temporally varied environmental systems. The discussions are accompanied with some recent advances and examples in the formulation of sampling strategies for the chemical or biological analysis of air, surface water, drinking water, groundwater, soil, and hazardous waste sites.
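The advantage of stratified over simple random sampling discussed in the review, removal of the between-strata component from the estimator variance, is easy to demonstrate on synthetic data (the population and strata here are hypothetical):

```python
import random
import statistics

random.seed(1)
# Two equally sized strata with different contaminant levels (hypothetical).
strata = [[random.gauss(5, 1) for _ in range(500)],
          [random.gauss(20, 2) for _ in range(500)]]
population = strata[0] + strata[1]

def srs_mean(n):
    # Simple random sample of size n from the whole population.
    return statistics.mean(random.sample(population, n))

def stratified_mean(n):
    # Proportional allocation: n/2 from each equally sized stratum.
    per = n // 2
    return statistics.mean(random.sample(strata[0], per) +
                           random.sample(strata[1], per))

# Monte Carlo comparison of the two estimators at n = 20.
srs = [srs_mean(20) for _ in range(2000)]
strat = [stratified_mean(20) for _ in range(2000)]
# The stratified estimator has a much smaller spread because the large
# between-strata difference no longer contributes to its variance.
```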

16.
17.
18.
This paper reviews experience with the use of the Eurachem Guide and gives a brief overview of the principles of evaluating uncertainty. This is followed by discussion of the comments received on the Guide, highlighting some of the issues that need to be considered in the next version. Accepted: 21 October 1997

19.
20.
Reliable conformational sampling and trajectory analysis are always important to the study of the folding or binding mechanisms of biomolecules. Generally, one has to prepare many complicated parameters and follow a lot of steps to obtain the final data. The whole process is too complicated for new users. In this article, we provide a convenient and user-friendly tool that is compatible with AMBER, called the fast sampling and analysis tool (FSATOOL). FSATOOL has some useful features. First, and most important, the whole workflow is greatly simplified into two steps: a fast sampling procedure and a trajectory analysis procedure. Second, it contains several powerful sampling methods for simulation on graphics processing units, including our previous mixing replica exchange molecular dynamics method. The method combines the advantages of biased and unbiased simulations. Finally, it extracts the dominant transition pathways automatically from the folding network by Markov state model. Users do not need to do the tedious intermediate steps by hand. To illustrate the usage of FSATOOL in practice, we perform one simulation for a RNA hairpin in explicit solvent. All the results are presented. © 2019 Wiley Periodicals, Inc.
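The Markov-state-model step that FSATOOL automates starts from a transition count matrix estimated from the discretized trajectory; a minimal, hypothetical sketch of the counting and row-normalization step (not FSATOOL's actual code):

```python
import numpy as np

def transition_matrix(states, n_states, lag=1):
    """Row-normalized transition matrix estimated by counting state
    pairs separated by `lag` steps in a discretized trajectory."""
    C = np.zeros((n_states, n_states))
    for i, j in zip(states[:-lag], states[lag:]):
        C[i, j] += 1
    rows = C.sum(axis=1, keepdims=True)
    rows[rows == 0] = 1.0  # avoid division by zero for unvisited states
    return C / rows

# Hypothetical discretized trajectory over three conformational states.
traj = [0, 0, 1, 1, 2, 2, 2, 1, 0]
T = transition_matrix(traj, n_states=3)
# Each row of T sums to 1; pathways are then extracted from the
# network that T defines (e.g. by transition path theory).
```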


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)