Similar Literature (20 records)
1.
Ellison SL, Holcombe DG, Burns M. The Analyst, 2001, 126(2): 199-210
Response surface modelling is proposed as an approach to the estimation of uncertainties associated with derivatisation, and is compared with a kinetic study. Fatty acid methyl ester formation is used to illustrate the approach, and kinetic data for acid-catalysed methylation and base-catalysed transesterification are presented. Kinetic effects did not lead to significant uncertainty contributions under normal conditions for base-catalysed transesterification of triglycerides. Uncertainties for acid-catalysed methylation with BF3 approach significance, but could be reduced by extending reaction times from 3 to 5 min. Non-linearity is a common feature of response surface models for derivatisation and compromises first-order estimates of uncertainty; it was necessary to include higher order differential terms in the uncertainty estimate. Simulations were used to examine the general applicability of the approach and to study the effects of poor precision and of change of response surface model. It is concluded that reliable uncertainty estimates are available only when the model is statistically significant, robust, representative of the underlying behaviour of the system, and forms a good fit to the data; arbitrary models are not generally suitable for uncertainty estimation. Where statistically insignificant effects were included in models, they gave negligible uncertainty contributions.
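The need for higher-order terms described above can be illustrated with a small sketch; the quadratic coefficients, reaction time and its uncertainty below are invented for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical quadratic response surface for derivatisation yield vs. reaction time:
# y = a + b*t + c*t**2 (coefficients are illustrative only).
a, b, c = 0.90, 0.060, -0.008
t, u_t = 4.0, 0.5          # reaction time (min) and its standard uncertainty

dydt   = b + 2 * c * t      # first derivative of the model at t
d2ydt2 = 2 * c              # second derivative (constant for a quadratic)

# First-order (GUM) propagation, and the version including the second-order
# term that the abstract says becomes necessary when the surface is non-linear.
u_first  = abs(dydt) * u_t
u_second = np.sqrt((dydt * u_t) ** 2 + 0.5 * (d2ydt2 * u_t ** 2) ** 2)

print(f"first-order u(y)  = {u_first:.5f}")
print(f"second-order u(y) = {u_second:.5f}")
```

Near a stationary point of the response surface the first derivative is small, so the first-order estimate collapses while the second-order term dominates, which is exactly the failure mode the abstract reports.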

2.
The paper describes experiments for the evaluation of uncertainties associated with a number of chromatographic parameters. Studies of the analysis of vitamins by HPLC illustrate the estimation of the uncertainties associated with experimental "input" parameters such as the detector wavelength, column temperature and mobile phase flow-rate. Experimental design techniques, which allow the efficient study of a number of parameters simultaneously, are described. Multiple linear regression was used to fit response surfaces to the data. The resulting equations were used in the estimation of the uncertainties. Three approaches to uncertainty calculation were compared: Kragten's spreadsheet, the symmetric spreadsheet and algebraic differentiation. In cases where non-linearity in the model was significant, agreement between the uncertainty estimates was poor, as the spreadsheet approaches do not include second-order uncertainty terms.
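Kragten's spreadsheet method mentioned above is easy to sketch in a few lines; the HPLC-style measurement equation and all numbers below are hypothetical:

```python
import numpy as np

def kragten(f, x, u):
    """Kragten's spreadsheet approximation: perturb each input by its
    standard uncertainty and combine the resulting changes in quadrature."""
    x = np.asarray(x, dtype=float)
    y0 = f(x)
    contrib = []
    for i, ui in enumerate(u):
        xp = x.copy()
        xp[i] += ui                 # one column of the "spreadsheet"
        contrib.append(f(xp) - y0)
    return y0, np.sqrt(np.sum(np.square(contrib))), contrib

# Hypothetical result: c = A_sample / A_std * c_std (names are illustrative).
f = lambda p: p[0] / p[1] * p[2]
y, u_c, parts = kragten(f, x=[1520.0, 1480.0, 10.0], u=[12.0, 11.0, 0.05])
print(y, u_c)
```

Because each input is perturbed only once by +u_i, the method is a finite-difference version of first-order propagation; it carries no second-order terms, which is why it disagrees with algebraic differentiation when the model is strongly non-linear.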

3.
Lyn JA, Ramsey MH, Damant AP, Wood R. The Analyst, 2007, 132(12): 1231-1237
Measurement uncertainty is a vital issue within analytical science. There are strong arguments that primary sampling should be considered the first and perhaps the most influential step in the measurement process. Increasingly, analytical laboratories are required to report measurement results to clients together with estimates of the uncertainty. Furthermore, these estimates can be used in regulation enforcement to decide whether a measured analyte concentration is above a threshold value. With its recognised importance in analytical measurement, the question arises of 'what is the most appropriate method to estimate the measurement uncertainty?'. Two broad methods for uncertainty estimation are identified: the modelling method and the empirical method. In modelling, the estimation of uncertainty involves the identification, quantification and summation (as variances) of each potential source of uncertainty. This approach has been applied to purely analytical systems, but becomes increasingly problematic in identifying all such sources when it is applied to primary sampling. Applications of this methodology to sampling often utilise long-established theoretical models of sampling and adopt the assumption that a 'correct' sampling protocol will ensure a representative sample. The empirical approach to uncertainty estimation involves replicated measurements from either inter-organisational trials and/or internal method validation and quality control. A simpler method involves duplicating sampling and analysis, by one organisation, for a small proportion of the total number of samples. This has proven to be a suitable alternative to these often expensive and time-consuming trials, in routine surveillance and one-off surveys, especially where heterogeneity is the main source of uncertainty. A case study of aflatoxins in pistachio nuts is used to broadly demonstrate the strengths and weaknesses of the two methods of uncertainty estimation.
The estimate of sampling uncertainty made using the modelling approach (136%, at 68% confidence) is six times larger than that found using the empirical approach (22.5%). The difficulty in establishing reliable estimates for the input variables of the modelling approach is thought to be the main cause of the discrepancy. The empirical approach to uncertainty estimation, with the automatic inclusion of sampling within the uncertainty statement, is recognised as generally the most practical procedure, providing the more reliable estimates. The modelling approach is also shown to have a useful role, especially in choosing strategies to change the sampling uncertainty, when required.
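The empirical "duplicate method" contrasted above can be sketched as a balanced nested ANOVA on duplicated samples with duplicated analyses; the tiny data set below is invented purely for illustration:

```python
import numpy as np

# Invented duplicate design: 3 sampling targets, two samples per target,
# two analyses per sample (shape: targets x samples x analyses).
x = np.array([[[10.0, 10.2], [11.0, 10.8]],
              [[ 9.0,  9.2], [ 8.0,  8.2]],
              [[12.0, 11.8], [12.4, 12.6]]])

# Balanced nested ANOVA: analytical variance from within-sample duplicates,
# sampling variance from between-sample spread corrected for analysis.
s2_anal = np.mean(np.var(x, axis=2, ddof=1))
sample_means = x.mean(axis=2)
s2_samp = np.mean(np.var(sample_means, axis=1, ddof=1)) - s2_anal / 2
u_meas = np.sqrt(s2_samp + s2_anal)   # random part of measurement uncertainty
print(f"s(anal)={np.sqrt(s2_anal):.3f}, s(samp)={np.sqrt(s2_samp):.3f}, u={u_meas:.3f}")
```

Here the sampling term dominates the analytical one, the typical situation when heterogeneity is the main source of uncertainty, as the abstract notes.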

4.
Analytica Chimica Acta, 2004, 520(1-2): 245-255
In recent years the declaration of the estimated uncertainty of measurement has become an integral part of analytical results. This study presents the assessment of results generated in the analysis of selected pesticides, represented by carbamates, pyrethroids and azoles, residues of which may be found in treated apples. The multiresidue method used for the analysis of spiked samples (residues at levels 0.040–0.163 mg/kg) consisted of (i) ethyl acetate extraction, (ii) GPC clean-up and (iii) identification/quantification of residues by GC. Procedures utilizing either conventional (electron-capture, nitrogen–phosphorus) or mass-selective detectors (quadrupole and ion trap analyzers) were evaluated. The results generated through alternative strategies of uncertainty estimation ("bottom-up", "top-down") were compared.

Using the "bottom-up" approach, the uncertainty of extraction, which comprises two components, (i) the repeatability of extraction and (ii) the uncertainty of the extraction recovery, was shown to represent the main source of the combined standard uncertainty (uncertainties of extraction for the tested pesticides ranged from 4.6% to 21.6%). On the other hand, uncertainties associated with the GC calibration (uncertainties of weighing and diluting standards, uncertainties of the purity of standards) were less important (most did not exceed 2%). Combined standard uncertainties associated with the described analytical method ranged for the individual compounds from 9.3% to 24.3%. Similar values of the combined standard uncertainties were obtained using the alternative "top-down" approach.
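The "bottom-up" combination described above is a quadrature sum of relative standard uncertainties; the component values below are hypothetical, merely chosen within the ranges quoted:

```python
import numpy as np

# Illustrative relative standard uncertainties (%) for one pesticide;
# the values are invented, chosen within the ranges quoted in the abstract.
components = {
    "extraction (repeatability + recovery)": 12.0,
    "GC calibration (weighing, dilution, purity)": 1.8,
    "GPC clean-up": 3.0,
}
# Combined standard uncertainty: root-sum-of-squares of the components.
u_combined = np.sqrt(sum(u ** 2 for u in components.values()))
print(f"combined relative standard uncertainty: {u_combined:.1f}%")
```

Because the terms add as squares, the 12% extraction component dominates and the sub-2% calibration terms barely move the total, which mirrors the abstract's conclusion.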


5.
戴骐. 分析试验室, 2007, 26(8): 77-79
A mathematical model was established for the determination of lead in candy by inductively coupled plasma atomic emission spectrometry (ICP-AES). The sources of uncertainty for each parameter in the model were analysed, and the Type A and Type B uncertainties were evaluated. The uncertainty components were combined and expanded to give the uncertainty of the lead mass fraction. The results show that the preparation of the standard solutions, the fitting of the linear calibration equation, and the dilution of the sample solution to volume are the main sources of uncertainty.

6.

The ability to determine the biodegradability of chemicals without resorting to expensive tests is ecologically and economically desirable. Models based on quantitative structure–activity relationships (QSAR) provide some promise in this direction. However, QSAR models in the literature rarely provide uncertainty estimates in more detail than aggregated statistics such as the sensitivity and specificity of the model's predictions. Almost never is there a means of assessing the uncertainty in an individual prediction. Without an uncertainty estimate, it is impossible to assess the trustworthiness of any particular prediction, which leaves the model with a low utility for regulatory purposes. In the present work, a QSAR model with uncertainty estimates is used to predict biodegradability for a set of substances from a publicly available data set. Separation was performed using a partial least squares discriminant analysis model, and the uncertainty was estimated using bootstrapping. The uncertainty prediction allows for confidence intervals to be assigned to any of the model's predictions, allowing for a more complete assessment of the model than would be possible through a traditional statistical analysis. The results presented here are broadly applicable to other areas of modelling as well, because the calculation of the uncertainty will clearly demonstrate where additional tests are needed.
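The bootstrap idea for per-prediction uncertainty can be sketched as follows; for brevity a plain least-squares discriminant stands in for the paper's PLS-DA, and the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the QSAR data: 2 descriptors, classes coded -1/+1.
X = np.vstack([rng.normal(-1, 1, (40, 2)), rng.normal(+1, 1, (40, 2))])
y = np.r_[-np.ones(40), np.ones(40)]
x_new = np.array([0.1, -0.2])          # a borderline substance (hypothetical)

votes = []
for _ in range(500):                   # refit on bootstrap resamples
    idx = rng.integers(0, len(y), len(y))
    A = np.c_[np.ones(len(idx)), X[idx]]
    beta, *_ = np.linalg.lstsq(A, y[idx], rcond=None)
    votes.append(np.sign(beta[0] + x_new @ beta[1:]))

# Fraction of bootstrap models voting "+1" = confidence in this one prediction.
p_degradable = np.mean(np.array(votes) > 0)
print(f"P(class = +1) for x_new: {p_degradable:.2f}")
```

A substance far from the decision boundary gets a vote fraction near 0 or 1; a borderline one gets a value near 0.5, flagging an untrustworthy prediction, which is the per-prediction assessment the abstract argues for.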

7.
Lifetime prediction of polymeric materials often requires extrapolation of accelerated aging data, with the suitability of and confidence in such approaches being subject to ongoing discussion. This paper reviews the evidence for non-Arrhenius behaviour (curvature), as opposed to linear extrapolation, in polymer degradation studies. Several studies have emphasized mechanistic variations in the degradation mechanism and demonstrated changes in activation energies, but often the data have not been fully quantified. To improve predictive capabilities, a simple approach for dealing with curvature in Arrhenius plots is examined on the basis of two competing reactions. This allows excellent fitting of experimental data, as shown for some elastomers, does not require complex kinetic modelling, and the individual activation energies are easily determined. Reviewing literature data for the thermal degradation of polypropylene, a crossover temperature (the temperature at which the two processes contribute equally) of 83 °C was determined, with the high-temperature process having a considerably higher activation energy (107–156 kJ/mol) than the low-temperature process (35–50 kJ/mol). Since low activation energy processes can dominate at low temperatures, and longer extrapolations result in larger uncertainties in lifetime predictions, experiments focused on estimating Ea values at the lowest possible temperature, instead of assuming straight-line extrapolations, will lead to more confident lifetime estimates.
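The two-competing-reactions treatment amounts to a sum of two Arrhenius terms, and the crossover temperature has a closed form; the A and Ea values below are invented, loosely tuned so the crossover lands near the 83 °C quoted for polypropylene:

```python
import numpy as np

R = 8.314  # J mol^-1 K^-1

# Hypothetical pre-exponential factors and activation energies for the
# high- and low-temperature processes (illustrative values, not fitted data).
A1, Ea1 = 5.5e13, 120e3   # high-T process (higher Ea)
A2, Ea2 = 1.0e2,  40e3    # low-T process  (lower Ea)

def k_total(T):
    """Overall rate as the sum of two competing first-order processes."""
    return A1 * np.exp(-Ea1 / (R * T)) + A2 * np.exp(-Ea2 / (R * T))

# Crossover temperature: set the two terms equal,
# A1*exp(-Ea1/RT) = A2*exp(-Ea2/RT)  =>  T = (Ea1 - Ea2) / (R * ln(A1/A2))
T_cross = (Ea1 - Ea2) / (R * np.log(A1 / A2))
print(f"crossover at about {T_cross - 273.15:.0f} deg C")
```

Below the crossover the low-Ea term dominates k_total, so a straight-line Arrhenius extrapolation from high-temperature data overstates the temperature sensitivity, which is the argument the abstract makes for curvature-aware fitting.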

8.
Evaluation of the Uncertainty in the Determination of Zinc in Vegetables and Fruit by ICP-AES
The application of measurement uncertainty [1-3] has developed unevenly across industries in China, and there are few reports on its evaluation for vegetable and fruit testing methods. ICP, as a sensitive, rapid and efficient detection technique, has been widely applied in food, environmental and other fields, and the evaluation of the uncertainty of its results is therefore important. Taking the determination of zinc in tomato by ICP as an example, this paper addresses the determination of heavy metals in vegetables and fruit by ICP…

9.
Analytical Letters, 2012, 45(17): 3322-3342

This work describes the estimation of uncertainty following the "bottom-up" and the "top-down" approaches for the determination of several trace metals in seawater using a classical 1-pyrrolidinedithiocarbamate/diethyldithiocarbamate/freon extraction method followed by electrothermal atomic absorption spectrometry. A detailed analysis of the uncertainty sources of this method is included, which allows the expanded uncertainties to be estimated. The results show that the main contribution to the relative overall uncertainty is the extraction step. The estimation of the uncertainty components is shown to be a suitable tool for experimental design in order to obtain a small uncertainty in the analytical result.

10.
The importance of particle X-ray microanalysis for contemporary metallurgical research is emphasized. Corrections to the formulae of the geometric modelling method and simplifications of the calculation procedure are introduced. Conditions for the successful application of the peak-to-background ratio method are outlined. Besides these two methods, which relate to isolated particles, a new extrapolation method for matrix-embedded particles is developed, and comments on the particle weight fraction estimation method for the same kind of particles are given.

11.
It has been suggested that typical ruggedness tests might lead directly to uncertainty estimates. This assertion is tested using simple experimental studies of uncertainties associated with sample grinding and oven-drying operations. The results are used to predict the outcome of typical ruggedness tests on the same systems. It is concluded that uncertainty estimation from ruggedness tests is appropriate only where a strong effect can be observed. Since current practice in ruggedness testing is predisposed to confirming insignificance, typical ruggedness tests are not likely to lead to reliable uncertainty estimates; instead, lack of statistical significance in ruggedness tests is better interpreted as reason to leave an effect out of the uncertainty budget. Only where the ruggedness study is modified in order to achieve statistically significant change is it useful for uncertainty estimation. Received: 27 November 2000 Accepted: 13 February 2001

12.
Fluorescence correlation spectroscopy (FCS) has emerged as a powerful technique for measuring low concentrations of fluorescent molecules and their diffusion constants. In FCS, the experimental data are conventionally fitted using standard local search techniques, for example the Marquardt-Levenberg (ML) algorithm. A prerequisite for these categories of algorithms is sound knowledge of the behavior of the fit parameters and, in most cases, good initial guesses for accurate fitting, otherwise leading to fitting artifacts. For known fit models, and with user experience of the behavior of the fit parameters, these local search algorithms work extremely well. However, for heterogeneous systems, or where automated data analysis is a prerequisite, there is a need for a procedure which treats FCS data fitting as a black box and generates reliable fit parameters with accuracy for the chosen model in hand. We present a computational approach to analyze FCS data by means of a stochastic algorithm for global search called PGSL, an acronym for Probabilistic Global Search Lausanne. This algorithm does not require any initial guesses and does the fitting by searching for solutions through global sampling. It is flexible and at the same time computationally fast for multiparameter evaluations. We present a performance study of PGSL for two-component fits with triplet states. The statistical study and the goodness-of-fit criterion for PGSL are also presented. The robustness of PGSL for parameter estimation on noisy experimental data is also verified. We further extend the scope of PGSL by a hybrid analysis wherein the output of PGSL is fed as initial guesses to ML. Reliability studies show that PGSL, and the hybrid combination of both, perform better than ML for various thresholds of the mean-squared error (MSE).

13.
A detailed statistical study is presented, based on simulated experimental data, on the estimation of activation parameters using the Arrhenius equation: k = A exp(B/T). The close correlation of the two parameters is shown, which requires the computation of the covariance matrix for the representation of the uncertainties. This matrix facilitates the correct estimation of the confidence interval for interpolated (or extrapolated) values of the rate coefficients. It is proposed that the full correlation matrix should be published in any article dealing with the determination of Arrhenius parameters. The importance of correct weighting is emphasized. Nonlinear fitting to the Arrhenius equation can be carried out without weighting only when the (absolute) error of the rate coefficient is independent of the temperature. Simulated experiments show that incorrect weighting shifts the average values of the fitted parameters and increases the variance of the parameters as well. With respect to the modified Arrhenius equation, k = A · T^n exp(B/T), statistical analysis shows that the physically meaningful estimation of all three parameters is impossible. Nonlinear fitting of the three parameters is suggested for the interpolation (and extrapolation) of rate coefficients, whereas for activation parameter estimation the fixing of "n" on the basis of theoretical considerations is advised, followed by the estimation of the remaining two parameters.
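The parameter correlation discussed above appears directly in the covariance matrix of a linearised Arrhenius fit; the simulated data below follow the constant-relative-error case, where unit weights on ln k are correct (all numbers are invented):

```python
import numpy as np

# Simulated rate data from k = A*exp(B/T) with A = 1e10, B = -8000 K, 2% noise.
rng = np.random.default_rng(3)
T = np.array([300., 320., 340., 360., 380., 400.])
A_true, B_true = 1.0e10, -8000.0
k = A_true * np.exp(B_true / T) * (1 + rng.normal(0, 0.02, T.size))

# Linear fit of ln k = ln A + B*(1/T). Constant *relative* error in k means
# constant absolute error in ln k, so unweighted least squares is appropriate.
X = np.c_[np.ones(T.size), 1.0 / T]
y = np.log(k)
beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
s2 = res[0] / (T.size - 2)                 # residual variance estimate
cov = s2 * np.linalg.inv(X.T @ X)          # covariance matrix of (ln A, B)
r = cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1])
print(f"ln A = {beta[0]:.2f}, B = {beta[1]:.0f} K, corr(ln A, B) = {r:.4f}")
```

The correlation coefficient comes out close to -1, which is why the abstract insists that the full covariance (or correlation) matrix, not just the two standard errors, must be reported and used for interpolated rate coefficients.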

14.
An inside-variance estimation method (IVEM) for binary interaction parameter regression in thermodynamic models is proposed. This maximum likelihood method involves the re-computation of the variance at each iteration of the optimization procedure, automatically re-weighting the objective function. Most of the maximum likelihood approaches currently used to regress the parameters of thermodynamic models fix the variances, converting the problem into a traditional weighted least squares minimization. However, such approaches lead to residual variances (between measured and calculated values) that are inconsistent with the fixed variances and, thus, do not necessarily produce optimum parameters for prediction purposes. The new method (IVEM) substantially improves fluid phase equilibria predictions (as shown by the examples presented) by maintaining consistency between the residual variances and the variance used in the objective function. This results in better parameter estimation and gives a direct measure of the uncertainty in the model prediction.

15.
This paper deals with some important (but often neglected) details of the uncertainty of retention measurement in thin layer chromatography and the propagation of uncertainty when computing simple and more complex values from the retention data, ending with the influence of the retention uncertainty on the regression estimates during extrapolation and lipophilicity estimation. The theoretical considerations are tested on data from a previous study. It can be concluded that when TLC spots are broad and the retention uncertainty exceeds about 0.02 of the RF value, the uncertainty should be taken into account in further computations.
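For the standard TLC lipophilicity quantity RM = log10((1 - RF)/RF), the propagation described above has a closed form; the 0.02 uncertainty echoes the threshold in the abstract, while the RF values are purely illustrative:

```python
import numpy as np

def u_RM(RF, u_RF):
    """Propagate an RF uncertainty into RM = log10((1 - RF) / RF).
    Since dRM/dRF = -1 / (ln(10) * RF * (1 - RF)), the first-order
    standard uncertainty of RM is |dRM/dRF| * u_RF."""
    return u_RF / (np.log(10) * RF * (1 - RF))

for RF in (0.2, 0.5, 0.8):
    print(f"RF = {RF}: u(RM) = {u_RM(RF, 0.02):.3f}")
```

The factor 1/(RF*(1 - RF)) blows up near RF = 0 and RF = 1, so the same spot broadening costs far more RM precision at extreme retentions, which is where neglecting the uncertainty in the subsequent regressions is most damaging.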

16.
The provision of uncertainty estimates along with measurement results, or values computed from them, is metrologically mandatory. This is in particular true for observational data related to climate change, and for thermodynamic properties of geophysical substances derived from them, such as those of air, seawater or ice. The recent International Thermodynamic Equation of Seawater 2010 (TEOS-10) provides such properties in a comprehensive and highly accurate way, derived from empirical thermodynamic potentials released by the International Association for the Properties of Water and Steam (IAPWS). Currently, there are no generally recognised algorithms available for a systematic and comprehensive estimation of uncertainties for arbitrary properties derived from those potentials at arbitrary input values, based on the experimental uncertainties of the laboratory data that were originally used for the correlations during the construction process. In particular, standard formulas for uncertainty propagation which do not account for mutual uncertainty correlations between different coefficients tend to systematically and significantly overestimate the uncertainties of derived quantities, which may lead to practically useless results. In this paper, stochastic ensembles of thermodynamic potentials, derived from randomly modified input data, are considered statistically to provide analytical formulas for the computation of the covariance matrix of the related regression coefficients, from which uncertainty estimates for any derived property can in turn be computed a posteriori. For illustration purposes, simple analytical application examples of the general formalism are briefly discussed.
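The ensemble idea can be sketched on a toy "potential": refit a polynomial to randomly perturbed data, take the empirical covariance of the coefficients, and compare full propagation against the correlation-free formula (all data and coefficients below are invented):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy stand-in for a thermodynamic potential: g(x) = c0 + c1*x + c2*x^2
# fitted to noisy "laboratory" data.
x = np.linspace(0.0, 1.0, 21)
g_true = 1.0 + 0.5 * x - 0.2 * x ** 2
u_data = 0.01                                  # experimental std. uncertainty
X = np.vander(x, 3, increasing=True)           # design matrix [1, x, x^2]

# Stochastic ensemble: refit to randomly perturbed input data, then take the
# empirical covariance of the regression coefficients (cross terms included).
coeffs = np.array([
    np.linalg.lstsq(X, g_true + rng.normal(0, u_data, x.size), rcond=None)[0]
    for _ in range(2000)
])
cov = np.cov(coeffs, rowvar=False)

# Uncertainty of a derived property, here g'(x0) = c1 + 2*c2*x0, via J cov J^T.
x0 = 0.5
J = np.array([0.0, 1.0, 2.0 * x0])
u_full = np.sqrt(J @ cov @ J)
u_nocorr = np.sqrt(np.sum(J ** 2 * np.diag(cov)))  # ignores correlations
print(u_full, u_nocorr)
```

Because fitted polynomial coefficients are strongly anticorrelated, the correlation-free sum overstates the derived-property uncertainty, which is the overestimation problem the abstract describes.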

17.
In preparation for studying the hydrolytic degradation of Estane® 5703 and related poly(ester urethane) elastomers, the absorption (solubility) and diffusion of water in these polymers have been examined experimentally and modeled theoretically. Weight gain and loss experiments have been carried out. The amount of water absorbed per gram of sample was linear at low relative humidities (RHs) but curved upward at higher RHs. This curvature was not fit by Henry's law or the Flory–Huggins equation but was easily fit by a water‐cluster model. Diffusion coefficients were determined by fitting the time dependence of the sample weights, and the diffusion appeared Fickian to within experimental uncertainty. The similarity of related polymers was used to determine the approximate temperature dependence of the absorption. © 2001 John Wiley & Sons, Inc. J Polym Sci Part B: Polym Phys 40: 181–191, 2002  相似文献   

18.
A spreadsheet method allowing rapid calculation of combined standard uncertainties is described. The model used allows explicitly for correlation effects, and requires a user to enter only the parameters, the calculation used to obtain the final result (including relevant influence factors), the individual standard uncertainties for the parameters, and estimates of correlation coefficients where necessary. The estimation of correlation coefficients in common cases is discussed, and it is shown that correlation is likely to be practically significant only when the correlated contribution to the individual standard uncertainties significantly exceeds about 30% of the relevant standard uncertainty, leading to correlation coefficients |r| greater than 0.1. The implementation includes a more robust differentiation algorithm than previously reported for spreadsheet use, and the initial preparation of the spreadsheets has been automated. The principle is illustrated with a simple example. Electronic supplementary material is available in the online version of this article for authorized users.
Stephen L. R. Ellison
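The correlation-aware combination such a spreadsheet implements is the standard GUM expression; a minimal sketch with hypothetical sensitivity coefficients and uncertainties:

```python
import numpy as np

def combined_u(c, u, r=None):
    """u_c^2 = sum_i (c_i*u_i)^2 + 2*sum_{i<j} c_i*c_j*u_i*u_j*r_ij:
    the GUM combination with sensitivity coefficients c, standard
    uncertainties u and an optional correlation matrix r (identity if None)."""
    c, u = np.asarray(c, float), np.asarray(u, float)
    if r is None:
        r = np.eye(len(c))
    cov = np.outer(c * u, c * u) * np.asarray(r, float)
    return np.sqrt(cov.sum())

c = [1.0, -1.0]          # e.g. the result is a difference of two quantities
u = [0.10, 0.10]
print(combined_u(c, u))                          # uncorrelated inputs
print(combined_u(c, u, r=[[1, 0.9], [0.9, 1]]))  # strongly correlated inputs
```

For a difference of two strongly correlated quantities the correlation term cancels most of the uncertainty (0.045 versus 0.141 here), illustrating why the abstract's |r| > 0.1 criterion matters in practice.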

19.
An error analysis for numerically evaluating random uncertainties in X-ray photoelectron spectroscopy has been implemented in version 2003 of the spectra treatment and analysis software UNIFIT in order to improve the understanding of the statistical basis and the reliability of the model parameters for photoelectron spectra. The theoretical basis, as well as two approaches to obtaining error limits for the fit parameters, is considered. Several test spectra have been analysed and discussed. A representative example has been chosen to demonstrate the relevance of the error estimation for practical surface analysis. Suggestions for the minimization of errors in the peak-fitting procedures are presented. Copyright © 2004 John Wiley & Sons, Ltd.

20.
Appropriate sampling, which includes the estimation of measurement uncertainty, is proposed in preference to representative sampling without estimation of the overall measurement quality. To fulfil this purpose the uncertainty estimate must include contributions from all sources, including the primary sampling, sample preparation and chemical analysis. It must also include contributions from systematic errors, such as sampling bias, rather than from random errors alone. Case studies are used to illustrate the feasibility of this approach and to show its advantages for improved reliability of interpretation of the measurements. Measurements with a high level of uncertainty (e.g. 50%) can be shown to be fit for some specified purposes using this approach. Once reliable estimates of the uncertainty are available, a probabilistic interpretation of the results can be made. This allows financial aspects to be considered in deciding what constitutes an acceptable level of uncertainty. In many practical situations "representative" sampling is never fully achieved. This approach recognises this and instead provides reliable estimates of the uncertainty around the concentration values that imperfect appropriate sampling causes. Received: 28 December 2001 Accepted: 25 April 2002
