Similar Literature
20 similar documents found.
1.
A methodology for evaluating the performance of an analytical method for incurred samples is presented. Since this methodology is based on intra-laboratory information, it is suitable for analytical fields that lack reference materials with incurred analytes, and it can be used to evaluate the analytical steps prior to the analytical portion, which are usually excluded from proficiency tests or from the certification of reference materials. The methodology can be based on tests performed on routine samples, allowing information to be collected on the most relevant analyte/matrix combinations; it is therefore particularly useful for analytical fields that involve a large number of analyte/matrix combinations, which are difficult to cover even with frequent participation in expensive proficiency tests. The approach rests on a model of the performance of the analytical method, based on the differential approach to the quantification of measurement uncertainty and on the comparison, for spiked and incurred samples, of the recovery associated with each analytical step whose performance can vary with the analyte origin. It was applied to the determination of pesticide residues in apples. For the analytes covered, no evidence was found that the performance of the studied sample-processing and extraction steps for this matrix varies with the analyte origin.

2.
Owing to the importance of clinical analysis for human health, reliable analytical information is essential. Since the reliability of analytical information is a complex function of the uncertainties of the sample, the method, the instrumentation and the data processing, the maximum reliability is obtained by minimising the uncertainty values. Applying this concept to clinical analysis, the role of spectrometric and electrometric methods is highlighted.

3.
The expression of results with an uncertainty through the "bottom-up" approach, which involves estimating and combining all the sources of uncertainty, represents a challenge when the analytical method includes mass transfer steps (MTS). These steps (e.g. extraction, evaporation, digestion), whose recoveries are inherently different from 100%, lack models capable of describing their precision and efficiency. Recently, a new methodology aimed at estimating the performance of these critical steps was published. Comparing the experimental dispersion from replicated analysis of spiked samples with the combined uncertainty of the gravimetric, volumetric and instrumental quantification steps (which are described by well-established models) allows the MTS uncertainty to be estimated. Evaluation of the behaviour of the MTS within the analytical range supports the use of the developed estimates over a wide concentration range. This methodology was applied successfully to the determination of pesticide residues in melon in a proficiency test organised by the Food Analysis Performance Assessment Scheme (FAPAS) between November 2000 and February 2001.
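A minimal numerical sketch of this estimation by difference, assuming invented values for the experimental dispersion of replicated spiked-sample analyses and for the uncertainties of the gravimetric, volumetric and instrumental quantification steps; none of the figures come from the cited work.

```python
import math

# Assumed illustrative values, expressed as relative standard uncertainties
s_exp = 0.12     # relative standard deviation of replicated spiked-sample results
u_grav = 0.002   # gravimetric steps (described by well-established models)
u_vol = 0.010    # volumetric steps
u_instr = 0.050  # instrumental quantification (calibration, repeatability)

# Combined uncertainty of everything except the mass transfer steps
u_no_mts = math.sqrt(u_grav**2 + u_vol**2 + u_instr**2)

# MTS uncertainty estimated by difference: the dispersion not explained by the modelled steps
u_mts = math.sqrt(max(s_exp**2 - u_no_mts**2, 0.0))

print(f"u(non-MTS) = {u_no_mts:.3f}, u(MTS) = {u_mts:.3f}")
```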

4.
The quality of analytical results is expressed by their uncertainty, as estimated on the basis of an uncertainty budget; little effort is often spent, however, on ascertaining the quality of the uncertainty budget itself. The uncertainty budget is based on circumstantial or historical data, so it is essential that its applicability to actual measurement results be verified against current experimental data. This should be done by replicate analysis of samples taken in accordance with the definition of the measurand, but representing the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be verified experimentally, both for sources of variability assumed to be negligible and for the dominant uncertainty components. Agreement between observed and expected variability is tested by means of a test statistic T, which follows a chi-square distribution with a number of degrees of freedom determined by the number of replicates. Significant deviations between predicted and observed variability may be caused by a variety of effects, and examples are presented; both underestimation and overestimation may occur, each calling for a correction of the relevant uncertainty components according to their influence on the variability of the experimental results. Some uncertainty components can be verified only with a very small number of degrees of freedom, because their influence requires samples taken at long intervals, e.g. after the acquisition of a new calibrant. It is therefore recommended to include verification of the uncertainty budget in the continuous QA/QC monitoring; this will eventually provide a test for such rarely occurring effects as well.
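A sketch of this verification, assuming the common form of the test statistic, T = (n - 1) s^2 / u^2, compared with chi-square quantiles for n - 1 degrees of freedom; the replicate results and the budgeted uncertainty are invented for illustration.

```python
import numpy as np
from scipy import stats

# Invented replicate results for a sample analysed under the conditions covered by the budget
replicates = np.array([10.2, 9.8, 10.5, 10.1, 9.7, 10.4, 10.0, 9.9])
u_budget = 0.35                          # standard uncertainty predicted by the budget (assumed)

n = len(replicates)
s = replicates.std(ddof=1)
T = (n - 1) * s**2 / u_budget**2         # follows chi-square with n - 1 degrees of freedom

lo, hi = stats.chi2.ppf([0.025, 0.975], df=n - 1)
print(f"s = {s:.3f}, T = {T:.2f}, acceptance interval = [{lo:.2f}, {hi:.2f}]")
if T < lo:
    print("Observed variability significantly smaller than predicted (budget may be overestimated)")
elif T > hi:
    print("Observed variability significantly larger than predicted (budget may be underestimated)")
else:
    print("No significant deviation between observed and predicted variability")
```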

5.
A model for calculating uncertainty in routine multi-element analysis is described. The model is constructed according to the principles of GUM/EURACHEM. Control-chart results are combined with other existing data and with results from the actual measurement into a concentration-dependent estimate of the combined standard uncertainty. Since possible sources of bias are included in the calculation, the overall bias estimated from the data is used only as a control to identify needs for modification of the model and/or the analytical procedure. For each individual sample, the uncertainty can be calculated automatically from two pre-calculated parameters together with the measured concentration and the instrumental standard deviation. As an example, the model is demonstrated for inductively coupled plasma mass spectrometry (ICP-MS) analysis of sewage sludge, including laboratory sub-sampling, sample preparation and instrumental determination.
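The following sketch shows how such an automatic calculation could look, assuming (as an illustration only) that the two pre-calculated parameters are a constant absolute term and a proportional term, combined in quadrature with the instrumental standard deviation of the actual run; the functional form and all numbers are assumptions, not taken from the cited model.

```python
import math

def combined_uncertainty(c, s_instr, a=0.02, b=0.05):
    """Concentration-dependent standard uncertainty from two pre-calculated parameters:
    'a' as an absolute term (mg/kg) and 'b' as a relative term, combined in quadrature
    with the instrumental standard deviation of the actual run. The functional form and
    the parameter values are assumptions for illustration."""
    return math.sqrt(a**2 + (b * c)**2 + s_instr**2)

# Example: Cd in sewage sludge at 1.8 mg/kg with an instrumental SD of 0.04 mg/kg (invented)
c, s_instr = 1.8, 0.04
u = combined_uncertainty(c, s_instr)
print(f"u = {u:.3f} mg/kg, expanded U (k=2) = {2*u:.3f} mg/kg")
```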

6.
da Silva RJ, Lino MJ, Santos JR, Camões MF. The Analyst, 2000, 125(8): 1459-1464
A 'bottom-up' approach is presented for expressing results obtained from analytical methods that include steps with recovery inherently different from 100% [mass transfer steps (MTS): extraction, evaporation, clean-up procedures, digestion, etc.]. Estimating the combined uncertainty of all MTS involves comparing the experimental dispersion of replicated analyses of spiked samples with the uncertainty obtained by combining all uncertainty sources except the MTS ones (the 'incomplete' estimate). The MTS uncertainty is then estimated by difference, after evaluating the statistical difference between the 'incomplete' estimate and the experimental dispersion (F-test). When the two estimates are statistically equivalent, the MTS uncertainty is considered negligible relative to the budget of the other sources. The assumption that MTS performance is constant within the analytical range is tested through single analyses at several concentration levels and is evaluated by checking whether the expected values fall within the intervals resulting from combining the MTS uncertainty estimated at one concentration level with the 'incomplete' estimate. The developed methodology can also be useful for method optimisation and validation and for the detection of small trends in results. The determination of pesticides in sweet peppers by GC-NPD was used to explore these concepts.
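A sketch of the F-test decision step, with invented values for the experimental dispersion, the number of replicates, the 'incomplete' estimate and its effective degrees of freedom.

```python
from scipy import stats

# Assumed illustrative inputs
s_exp = 0.12            # experimental relative dispersion of replicated spiked samples
n_rep = 10              # number of replicates behind s_exp
u_incomplete = 0.055    # 'incomplete' uncertainty estimate (all sources except MTS)
df_incomplete = 30      # assumed effective degrees of freedom of the incomplete estimate

F = s_exp**2 / u_incomplete**2
F_crit = stats.f.ppf(0.95, dfn=n_rep - 1, dfd=df_incomplete)

if F > F_crit:
    u_mts = (s_exp**2 - u_incomplete**2) ** 0.5
    print(f"F = {F:.2f} > {F_crit:.2f}: MTS uncertainty is significant, u(MTS) = {u_mts:.3f}")
else:
    print(f"F = {F:.2f} <= {F_crit:.2f}: MTS uncertainty negligible relative to the other sources")
```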

7.
Reliability of measurements of pesticide residues in food
This paper accounts for the major sources of error associated with pesticide residue analysis and illustrates their magnitude based on the currently available information. Sampling, sample processing and analysis may significantly influence the uncertainty and accuracy of analytical data, and their combined effects should be considered when deciding on the reliability of results. In the case of plant material, the average random sampling error (coefficient of variation, CV = 28–40%) and sample-processing error (CV up to 100%) are significant components of the combined uncertainty of the results. The average relative uncertainty of the analytical phase alone is about 17–25% in the usual 0.01–10 mg/kg concentration range; the major contributor to this error can be the gas-liquid chromatography (GLC) or high-performance liquid chromatography (HPLC) analysis, especially close to the lowest calibrated level. The expected minimum of the combined relative standard uncertainty of pesticide residue analytical results is in the range of 33–49%, depending on the sample size. Gross and systematic errors may be much larger than the random error. Special attention is required to obtain representative random samples and to avoid the loss of residues during sample preparation and processing.
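Combining the quoted relative errors in quadrature reproduces the stated minimum of the combined relative uncertainty; the sample-processing contribution assumed in the last line is illustrative.

```python
import math

def combined_cv(*cvs):
    """Combine independent relative standard uncertainties (CV, %) in quadrature."""
    return math.sqrt(sum(cv**2 for cv in cvs))

# Lower and upper ends of the quoted sampling and analytical ranges, neglecting sample processing
print(round(combined_cv(28, 17)))      # about 33 %
print(round(combined_cv(40, 25)))      # about 47 %

# An assumed modest sample-processing error of 15 % pushes the upper end towards 49 %
print(round(combined_cv(40, 25, 15)))  # about 49 %
```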

8.
We propose a new procedure for estimating the uncertainty of quantitative routine analysis. The procedure uses the information generated when the trueness of the analytical method is assessed from recovery assays. Here, trueness is assessed by estimating proportional bias (in terms of recovery) and constant bias separately. The advantage of the procedure is that little extra work is needed to estimate the measurement uncertainty associated with routine samples. This uncertainty is considered correct whenever the samples used in the recovery assays are representative of future routine samples (in terms of matrix and analyte concentration). Moreover, these samples should be analysed while varying all the factors that can affect the analytical method; analysed in this fashion, the precision estimates generated in the recovery assays take into account the variability of the routine samples as well as all the sources of variability of the analytical method. Other terms related to sample heterogeneity, sample pretreatments or factors not representatively varied in the recovery assays should be included subsequently only when necessary. These ideas are applied to calculating the uncertainty of results obtained when analysing sulphides in wine by HS-SPME-GC.
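A hedged sketch of one way to fold recovery-assay information into the uncertainty of a routine result: the mean recovery corrects the proportional bias, and the dispersion of the recoveries (obtained while varying the relevant factors) supplies the precision term. The recovery values, the raw result and the exact combination rule are assumptions for illustration, not the authors' procedure.

```python
import numpy as np

# Invented recovery-assay results (fraction recovered), obtained while varying
# all factors that can affect the analytical method
recoveries = np.array([0.91, 0.96, 0.88, 0.94, 0.90, 0.93, 0.89, 0.95])

R = recoveries.mean()                   # mean recovery (proportional bias)
s_run = recoveries.std(ddof=1)          # precision captured by the assays
u_R = s_run / np.sqrt(len(recoveries))  # standard uncertainty of the mean recovery

x_raw = 12.4                            # invented routine result (e.g. microgram/L sulphide in wine)
x_corr = x_raw / R                      # corrected for proportional bias

# Relative combination of run-to-run precision and the uncertainty of the recovery correction
u_rel = np.sqrt((s_run / R)**2 + (u_R / R)**2)
print(f"corrected result = {x_corr:.2f}, relative standard uncertainty = {100*u_rel:.1f} %")
```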

9.
For consistent interpretation of an analytical result it is necessary to evaluate the confidence that can be placed in it, in the form of a measurement uncertainty estimate. The Guide to the Expression of Uncertainty in Measurement issued by ISO establishes rules for evaluating and expressing uncertainty. Carotenoid determination in food is a complex analytical process involving several mass transfer steps (extraction, evaporation, saponification, etc.), which makes the application of these guidelines difficult. The ISO guide was interpreted for analytical chemistry by EURACHEM, which allows the use of intra- and interlaboratory information. Measurement uncertainty was estimated from laboratory validation data, including precision and method-performance studies, and also from laboratory participation in proficiency tests. These methods of uncertainty estimation were applied to analytical results for different fruit and vegetable matrices. The measurement uncertainty of food carotenoid determination was 10–30% of the composition value in the great majority of cases. Higher values were found for measurements near the instrumental quantification limits (e.g. 75% for β-cryptoxanthin and 99% for lutein in pear) or when sample chromatograms presented interferences with the analyte peak (e.g. 44% for α-carotene in orange). Lower relative expanded measurement uncertainty values (3–13%) were obtained for matrices/analytes not requiring the saponification step. Based on these results, the saponification step should be avoided if the food carotenoids are not present in ester form. Food carotenoid content should be expressed taking the measurement uncertainty into account; a result should therefore have at most two significant figures.

10.
The results obtained by a laboratory over a number of proficiency testing/external quality assessment scheme (PT/EQAS) rounds can give information on the uncertainty of its measurements for a given test, provided that conditions such as full coverage of the routine analytical range, traceability, and small uncertainty of the assigned values (compared to the spread of the results) are met, and provided that systematic deviations and any other sources of uncertainty are considered. As organisers of the Italian EQAS (ITEQAS) in occupational and environmental laboratory medicine, we tested this hypothesis using as a model the data from well-performing laboratories taking part in ITEQAS for lead in blood over the last 2 years. We also investigated how different PT/EQAS features (frequency of trials and number of samples) would affect a laboratory's estimate of its uncertainty. Such information can help improve PT/EQAS organisation and, for a given test, (a) define the state of the art of the uncertainty of current measurement procedures, (b) identify needs for improvement of analytical methodologies and (c) set targets for acceptable uncertainty values. Presented at the Eurachem PT Workshop, September 2005, Portorož, Slovenia.

11.
The combined uncertainty in the analytical results for solid materials is derived for two methods (ET-AAS after prior sample digestion, and direct solid sampling) by applying the Guide to the Expression of Uncertainty in Measurement of the International Organization for Standardization. For the analysis of solid materials, three uncertainty components must generally be considered: (i) those in the calibration, (ii) those in the measurement of the unknown sample and (iii) those in the analytical quality control (AQC) process. The expanded uncertainty limits for the cadmium and lead content are calculated from analytical data for biological samples using the derived statistical estimates. For both methods the expanded uncertainty intervals are generally of similar width if all sources of uncertainty are included. The relative uncertainty limits range from 6% to 10% for the determination of cadmium and from 8% to 16% for the determination of lead. However, the different uncertainty components contribute to different degrees. Whereas with calibration based on reference solutions (digestion method) the respective contribution may be negligible (precision < 3%), the uncertainty from a calibration based directly on a certified reference material (CRM) (solid sampling) may contribute significantly (precision about 10%). In contrast, the AQC measurement required when the calibration is based on reference solutions contributes an additional uncertainty component, whereas for the CRM calibration the AQC is “built-in”. For both methods, the uncertainty in the certified content of the CRM used for AQC must be considered. The estimation of the uncertainty components is shown to be a suitable tool for experimental design aimed at obtaining a small uncertainty in the analytical result.

12.
Domestic and international regulatory limits have been established for aflatoxin in almonds and other tree nuts. It is difficult to obtain an accurate and precise estimate of the true aflatoxin concentration in a bulk lot because of the uncertainty associated with the sampling, sample preparation and analytical steps of the aflatoxin test procedure. To evaluate the performance of aflatoxin sampling plans, the uncertainty associated with sampling lots of shelled almonds for aflatoxin was investigated. Twenty lots of shelled almonds were sampled for aflatoxin contamination. The total variance associated with measuring B1 and total aflatoxins in bulk almond lots was estimated and partitioned into sampling, sample preparation and analytical variance components. All variances were found to increase with aflatoxin concentration (both B1 and total). Using regression analysis, mathematical expressions were developed to predict the relationship between each variance component (total, sampling, sample preparation and analysis variances) and aflatoxin concentration. Variance estimates were the same for B1 and total aflatoxins. The mathematical relationships can be used to estimate each variance for sample sizes, subsample sizes and numbers of analyses other than those measured in the study. When a lot with total aflatoxins at 15 ng/g was tested using a 10 kg sample, a vertical cutter mixer type of mill, a 100 g subsample and high-performance liquid chromatography analysis, the sampling, sample preparation, analytical and total variances (coefficient of variation, CV) were 394.7 (CV, 132.4%), 14.7 (CV, 25.5%), 0.8 (CV, 6.1%) and 410.2 (CV, 135.0%), respectively. The percentages of the total variance associated with the sampling, sample preparation and analytical steps were 96.2, 3.6 and 0.2, respectively.
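The reported coefficients of variation and variance shares follow directly from the variance components and the 15 ng/g level; the short check below recomputes them (small rounding differences against the quoted figures are expected).

```python
import math

mean = 15.0                      # total aflatoxins, ng/g
variances = {"sampling": 394.7, "sample preparation": 14.7, "analysis": 0.8}
total = sum(variances.values())  # 410.2

for step, var in variances.items():
    cv = 100 * math.sqrt(var) / mean
    share = 100 * var / total
    print(f"{step:>18s}: CV = {cv:5.1f} %, share of total variance = {share:4.1f} %")
print(f"{'total':>18s}: CV = {100 * math.sqrt(total) / mean:5.1f} %")
```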

13.
Food and feed analysts are confronted with a number of common problems, irrespective of the analytical target. The analytical procedure can be described as a series of successive steps: sampling, sample processing, analyte extraction and, finally, interpretation of an analytical result produced with, e.g., real-time polymerase chain reaction. The final result depends on proper method selection and execution and is only valid if valid methods (modules) are used throughout the analytical procedure. The final step is easy to validate: the measurement uncertainty it adds is relatively limited and can be estimated with a high degree of precision. In contrast, the front-end sampling and processing steps have not evolved much, and the corresponding methods are rarely, if ever, experimentally validated according to internationally harmonized protocols. In this paper we outline a strategy for modular validation of the entire analytical procedure, using an upstream validation approach, illustrated with methods for genetically modified materials that may partially apply to other areas of food and feed analysis as well. We also discuss some implications and consequences of this approach in relation to reference materials, measurement units, thresholds for labelling and enforcement, and the application of the validated methods (modules) in routine food and feed analysis.

14.
The volume-fraction dependence of the static magnetization of two magnetic fluids with different degrees of steric stabilization was measured at low field values (0–10 kA/m) and was found to be nonlinear for both fluids. The nonlinearity is more pronounced for the less stabilized magnetic fluid. The experimental data were processed by nonlinear regression using an analytical model for the formation of chain-like magnetic particle aggregates in magnetic fluids. The mean number of particles per chain, calculated as a function of the degree of steric stabilization, magnetic field and sample concentration, was in the range 1–1.04.

15.
Discussion of the uncertainty in the determination of total lead in paint by ICP-AES
The uncertainty of the determination of total lead in paint by ICP-AES was calculated according to EURACHEM/CITAC 2000. A mathematical model was established, and a cause-and-effect diagram was constructed from the variables that generate uncertainty during the test process. By transforming the mathematical model and the cause-and-effect diagram, the precision and accuracy of the determination of lead in a paint standard sample were tested, and the uncertainty of the method was calculated. It is concluded that an uncertainty of the measurement process calculated by the Type B evaluation approach, using data accumulated from routine analytical work, is more realistic and credible; test data should be accumulated continuously and the uncertainty of the method updated accordingly, giving a more credible and reasonable uncertainty estimate.
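A minimal sketch of deriving a measurement uncertainty from accumulated routine data, using invented ICP-AES control-sample results and an assumed reference value; the combination rule shown (long-run precision and uncorrected relative bias in quadrature, expanded with k = 2) is one common simplification, not necessarily the calculation used in the cited work.

```python
import numpy as np

# Invented long-run results (mg/kg) for a paint control sample analysed routinely by ICP-AES
control_results = np.array([92.1, 95.3, 90.8, 93.6, 94.2, 91.5, 96.0, 92.8, 93.1, 94.7])
certified = 93.0                               # assumed reference value of the control sample

rel_sd = control_results.std(ddof=1) / control_results.mean()
rel_bias = (control_results.mean() - certified) / certified

# Relative standard uncertainty taken from the accumulated data, expanded with k = 2
u_rel = np.sqrt(rel_sd**2 + rel_bias**2)
print(f"relative standard uncertainty = {100*u_rel:.1f} %, expanded (k=2) = {200*u_rel:.1f} %")
```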

16.
In the current drug discovery environment, higher-throughput analytical assays have become essential to keep pace with the screening demands for drug metabolism and pharmacokinetics (DMPK) attributes. This has been dictated primarily by advances in chemical procedures, notably combinatorial and parallel syntheses, which have resulted in many-fold increases in the number of compounds requiring DMPK evaluation. Because of its speed and specificity, liquid chromatography/tandem mass spectrometry (LC/MS/MS) has become the dominant technology for sample analysis in DMPK screening assays. For higher-throughput assays, analytical speed as well as other factors such as method development, data processing, quality control and report generation must be optimized. The four-way multiplexed electrospray interface (MUX), which allows four LC eluents to be analyzed simultaneously, has been adopted to maximize the rate of sample introduction into the mass spectrometer. Generic fast-gradient HPLC methods suitable for approximately 80% of the new chemical entities encountered have been developed. In-house software has been used to streamline information flow within the system and to perform quality control by automatically identifying analytical anomalies. By integrating these components with automated method development and data processing, a system capable of screening 100 compounds per week for Caco-2 permeability has been established.

17.

Chemometrics is a recent discipline concerned principally with the application of mathematics and statistics to laboratory systems. One way in which the chemometrician can aid the environmental analytical chemist is via planned experimental designs. In this paper the importance of experimental design is illustrated and the main considerations prior to experimentation, namely degrees of freedom, analytical errors, coding and modelling, are outlined. This is exemplified by a study of the influence of potentially toxic heavy metals on the growth of barley seedlings. Undesigned univariate experiments suggest that Tl is probably more toxic than Cd. A three-factor central composite design is reported to study the relative toxicities of Tl, Cd and Pb and also of Tl, Fe and Zn. The paper exemplifies how much information can be obtained from the resulting experimental response data. Multilinear regression can be employed to produce a quadratic model, which can be interpreted graphically via reconstructed univariate response curves and three-dimensional response surfaces. Analysis of variance is a statistical method for computing how well the model has been fitted, taking analytical errors into account. With the aid of modern graphical computing, a variety of confidence intervals can be displayed for both univariate and bivariate responses. The usefulness of the design can be visualised by displaying the leverage over and outside the experimental region. Finally, future trends in multivariate response methodology are discussed.
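A sketch of the kind of design and model described above: a three-factor central composite design in coded units and a full quadratic model fitted by multiple linear regression. The response values are simulated from an invented surface; they are not the barley-seedling data.

```python
import itertools
import numpy as np

# Three-factor central composite design in coded units: 2^3 factorial + 6 axial + centre points
alpha = 1.682                                    # rotatable axial distance for 3 factors
factorial = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
axial = np.vstack([alpha * np.eye(3), -alpha * np.eye(3)])
centre = np.zeros((4, 3))
X = np.vstack([factorial, axial, centre])        # 18 runs

# Simulated response (e.g. seedling growth) from an invented quadratic surface plus noise
rng = np.random.default_rng(0)
y = (10 - 2.0 * X[:, 0] - 0.5 * X[:, 1] - 0.8 * X[:, 0]**2
     + 0.6 * X[:, 0] * X[:, 1] + rng.normal(0, 0.3, len(X)))

def quad_terms(X):
    """Design matrix of the full quadratic model: intercept, linear, interaction and squared terms."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3, x1**2, x2**2, x3**2])

coef, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)
for name, b in zip(["b0", "x1", "x2", "x3", "x1x2", "x1x3", "x2x3", "x1^2", "x2^2", "x3^2"], coef):
    print(f"{name:>5s}: {b:7.3f}")
```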

18.
Consistent treatment of measurement bias, including the question of whether or not to correct for it, is essential for the comparability of measurement results. The case for correcting for bias is discussed, and it is shown that instances in which bias is known or suspected, but in which a specific correction cannot be justified, are comparatively common. The ISO Guide to the Expression of Uncertainty in Measurement does not provide well for this situation, and it is concluded that guidance is needed on handling cases of uncorrected bias. Several published approaches to the treatment of uncorrected bias and its uncertainty are critically reviewed with regard to coverage probability and simplicity of execution. On the basis of current studies, and taking into account testing laboratories' need for a simple and consistent approach with a symmetric uncertainty interval, we conclude that for most cases with large degrees of freedom, linear addition of a bias term adjusted for exact coverage ("U(e)"), as described by Synek, is to be preferred. This approach does, however, become more complex if the degrees of freedom are low. For modest bias and low degrees of freedom, summation of the bias, the bias uncertainty and the observed-value uncertainty in quadrature ("RSSu") provides a similar interval and is simpler to adapt to reduced degrees of freedom, at the cost of a more restricted range of application if accurate coverage is desired.
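A hedged numerical comparison of the two approaches named above, using invented values and the simplest textbook forms: linear addition of the absolute bias (without Synek's exact-coverage adjustment) versus quadrature summation of bias, bias uncertainty and observed-value uncertainty (RSSu). It is only meant to show that the two intervals are of similar size for modest bias and large degrees of freedom.

```python
import math

# Invented values for a result with a known but uncorrected bias
u_result = 0.8        # standard uncertainty of the observed value
bias = 1.1            # estimated (uncorrected) bias, same units
u_bias = 0.3          # standard uncertainty of the bias estimate
k = 2                 # coverage factor for large degrees of freedom

# Simplified linear addition of the bias term (the exact-coverage adjustment is omitted)
U_linear = k * math.sqrt(u_result**2 + u_bias**2) + abs(bias)

# RSSu: bias, bias uncertainty and observed-value uncertainty summed in quadrature
U_rssu = k * math.sqrt(u_result**2 + u_bias**2 + bias**2)

print(f"linear addition: U = {U_linear:.2f}")
print(f"RSSu           : U = {U_rssu:.2f}")
```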

19.
Accurate analytical results with known uncertainty are required for the safety assessment of pesticides and for testing the conformity of marketed food and feed with the maximum residue limits. The available information on the various sources of error was examined, with special emphasis on those which may remain unaccounted for under the current practice of many laboratories. Method validation typically covers the steps of pesticide residue determination from the extraction of spiked samples to the instrumental determination, which contribute only 10–40% of the total variance of the results. Although the variability of sampling, sample size reduction and sample processing may amount to 60–90% of the total variance, it generally remains unnoticed, leading to wrong decisions. Another important source of gross error is a mismatch between the residues analysed and those included in the relevant residue definition. Procedures that may be applied to eliminate or reduce these errors are discussed.

20.
We introduce a new formula for the acceleration weight factor in the hyperdynamics simulation method, the use of which correctly provides an exact simulation of the true dynamics of a system. This new form of hyperdynamics is valid and applicable both where transition state theory (TST) is applicable and where it is not. To illustrate the new formulation, we perform hyperdynamics simulations for four systems ranging from one degree of freedom to 591 degrees of freedom. (1) We first analyze free diffusion with one degree of freedom. This system does not have a transition state, so the TST and the original form of hyperdynamics are not applicable. Using the new form of hyperdynamics, we compute the mean square displacement over a range of times; the results agree perfectly with the analytical formula. (2) We then examine the classical Kramers escape-rate problem. The computed rate is in perfect agreement with the Kramers formula over a broad range of temperature. (3) We also study another classical problem: computing the rate of effusion out of a cubic box through a tiny hole. This problem does not involve an energy barrier, so the original form of hyperdynamics excludes the possibility of using a nonzero bias and is inappropriate. However, with the new weight-factor formula, our new form of hyperdynamics can be easily implemented and produces the exact results. (4) To illustrate applicability to systems of many degrees of freedom, we analyze the diffusion of an atom adsorbed on the (001) surface of an fcc crystal. The system is modeled as an atom on top of a slab of six atomic layers, each containing 49 atoms. With the bottom two layers of atoms fixed, the system has 591 degrees of freedom. With very modest computing effort, we are able to characterize its diffusion pathways via the exchange-with-the-substrate and hop-over-the-bridge mechanisms.
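The free-diffusion check in item (1) has a simple reference calculation: unbiased Brownian dynamics reproduces the analytical mean square displacement 2Dt in one dimension. The sketch below illustrates only that reference result, not the hyperdynamics acceleration itself; all parameters are invented.

```python
import numpy as np

# One-dimensional free diffusion: x(t+dt) = x(t) + sqrt(2 D dt) * N(0, 1)
D, dt, n_steps, n_traj = 1.0, 0.01, 1000, 5000
rng = np.random.default_rng(1)
steps = rng.normal(0.0, np.sqrt(2 * D * dt), size=(n_traj, n_steps))
x = np.cumsum(steps, axis=1)                 # trajectories, one row per walker

msd = (x**2).mean(axis=0)                    # mean square displacement over all walkers
t = dt * np.arange(1, n_steps + 1)
for i in (99, 499, 999):
    print(f"t = {t[i]:5.2f}:  MSD = {msd[i]:6.3f},  2Dt = {2 * D * t[i]:6.3f}")
```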
