Similar documents
20 similar documents found.
2.
In the EURACHEM/CITAC draft "Quantifying Uncertainty in Analytical Measurement", estimates of the measurement uncertainty of analytical results obtained by linear calibration are given. In this work, two of these estimates are compared: the uncertainty deduced from repeated observations of the sample versus the uncertainty deduced from the residual standard deviation of the regression. The study shows that an uncertainty estimate based on repeated observations can give more realistic values when the condition of variance homogeneity is not fulfilled over the calibration range. The complete calculation of measurement uncertainty, including an assessment of trueness, is demonstrated with an example concerning the determination of zinc in sediment samples by ICP atomic emission spectrometry. Received: 9 February 2002 / Accepted: 17 April 2002
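The regression branch of this comparison can be sketched for a straight-line calibration. The following is a minimal illustration of the standard uncertainty of a predicted concentration derived from the residual standard deviation of the regression; the calibration data are made up for illustration, not taken from the zinc/ICP-AES example:

```python
import math

# Hypothetical calibration data (concentration x, instrument response y);
# values are illustrative only.
x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [0.02, 1.05, 1.98, 3.10, 3.95, 5.01]

n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx  # slope
b0 = ybar - b1 * xbar                                              # intercept

# Residual standard deviation of the regression (n - 2 degrees of freedom)
s_res = math.sqrt(
    sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y)) / (n - 2)
)

def u_x0(y0_mean, m):
    """Standard uncertainty of a concentration predicted from the mean of
    m replicate sample responses y0_mean (calibration-curve formula)."""
    return (s_res / b1) * math.sqrt(
        1 / m + 1 / n + (y0_mean - ybar) ** 2 / (b1 ** 2 * sxx)
    )

u = u_x0(2.5, 3)
```

Increasing the number of replicate sample readings m shrinks only the first term under the square root, which is why the repeated-observations estimate can diverge from this one when variances are not homogeneous across the range.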

4.
When drugs are poorly soluble, pH-spectrophotometric titration combined with nonlinear regression of the absorbance response-surface data can be used instead of the potentiometric determination of dissociation constants. Regression models are generally very useful for extracting the essential features from a multiwavelength set of data. Regression diagnostics are procedures for examining the regression triplet (data, model, method) in order to check (a) the quality of the data for the proposed model; (b) the quality of the model for the given data; and (c) that all of the assumptions underlying least squares hold. In the interactive, PC-assisted diagnosis of data, models and estimation methods, the examination of data quality involves the detection of influential points, outliers and high-leverage points, which cause many problems when fitting the absorbance response hyperplane. Graphically oriented techniques are all suitable for the rapid detection of influential points. The reliability of the dissociation constants of the acidic drug silybin can be established with goodness-of-fit tests of the multiwavelength spectrophotometric pH-titration data. The uncertainty in the pKa of a weak acid obtained by nonlinear least-squares regression analysis of the absorption spectra is calculated. The procedure takes into account the drift in the pH measurement, the drift in the spectral measurement and the drifts in all analytical operations, as well as the relative importance of each source of uncertainty. In this example, the most important source of uncertainty in the experimental set-up is the pH measurement. The influence of the various sources of uncertainty on accuracy and precision is discussed using the example of the mixed dissociation constants of silybin, obtained with the SQUAD(84) and SPECFIT/32 regression programs.
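A minimal single-wavelength sketch of the underlying model (not the SQUAD(84)/SPECFIT/32 multiwavelength treatment): for a monoprotic acid HA, the absorbance at one wavelength follows a sigmoid in pH, and the pKa can be recovered by least squares. The data are synthetic and the coarse grid search stands in for full nonlinear regression:

```python
# Single-wavelength absorbance model for a monoprotic acid HA:
#   A(pH) = (A_HA + A_A * 10**(pH - pKa)) / (1 + 10**(pH - pKa))
def absorbance(pH, pKa, A_HA, A_A):
    r = 10 ** (pH - pKa)
    return (A_HA + A_A * r) / (1 + r)

# Synthetic titration data generated with pKa = 6.8 (illustrative only)
pH_obs = [4.0, 5.0, 6.0, 6.5, 7.0, 7.5, 8.0, 9.0]
A_obs = [absorbance(p, 6.8, 0.10, 0.90) for p in pH_obs]

# Coarse grid search for the least-squares pKa; the limiting absorbances
# A_HA and A_A are held fixed here for simplicity.
best = min(
    (sum((absorbance(p, k / 100, 0.10, 0.90) - a) ** 2
         for p, a in zip(pH_obs, A_obs)), k / 100)
    for k in range(400, 901)
)
pKa_hat = best[1]
```

In a real treatment the grid search would be replaced by Gauss-Newton or a similar iterative minimiser, all wavelengths would be fitted jointly, and the pH-drift and spectral-drift contributions would be propagated into the pKa uncertainty as the abstract describes.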

5.
Two simple and error-free methods (direct and interpolation) for deriving mathematical models for constructing reference lines were developed and successfully applied to hydrostatic stress-rupture data of polyethylene pipes. Both methods employ an algorithmic process that analyses the observed stress-rupture data together with the mathematical model of its 50% regression (LTHS) line. For each method, a shift value Δc is determined and used to obtain the mathematical model for constructing reference lines that satisfy the requirement of ISO TS 26873: the reference lines so constructed accommodate at least 97.5% of all stress-rupture data points on or above the line, and are parallel to and vertically shifted below the 50% regression line by the amount Δc. In the direct method, the reference line is made to pass directly through the data point at, or the first data point beyond, the 97.5% data position. The interpolation method, in contrast, extracts a shift value corresponding to the 97.5% data position by interpolating between the first data points above and below 97.5%; the reference lines are then made to pass through the interpolated 97.5% position at every temperature. The advantage of the proposed algorithmic methods is that determining the mathematical models for reference lines only involves finding the data position(s) and a vertical shift value Δc that satisfy the 97.5% requirement. These methods eliminate the uncertainties and errors associated with the current trial-and-error approach to constructing reference lines. In this paper, the details of the algorithmic process for obtaining a proper shift value and using it to develop the mathematical model are described for each method, and examples of constructing reference lines with these models are illustrated for polyethylene pipes.
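The direct method can be sketched as follows: compute each point's vertical deviation from the fitted 50% line, then shift the line down by Δc so that at least 97.5% of the points lie on or above it. The fitted line and the (log time, log stress) points below are illustrative, not real pipe data:

```python
import math

# Illustrative (log time, log stress) points and a pre-fitted 50% regression
# line y = b0 + b1 * x (assumed already obtained by ordinary regression).
points = [(0.0, 1.00), (1.0, 0.93), (2.0, 0.82), (3.0, 0.76),
          (1.5, 0.90), (2.5, 0.78), (0.5, 0.99), (3.5, 0.69)]
b0, b1 = 1.0, -0.08

# Vertical deviations of each point from the 50% line, sorted ascending
dev = sorted(y - (b0 + b1 * x) for x, y in points)

# Direct method: at most n - ceil(0.975 * n) points may fall strictly below
# the reference line, so the line passes through the deviation at index k.
n = len(points)
k = n - math.ceil(0.975 * n)
delta_c = -dev[k]   # downward shift Δc (positive value)
```

With these eight points no point may lie below the line (ceil(0.975 × 8) = 8), so the reference line passes through the lowest point and Δc equals the magnitude of the most negative deviation.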

6.
Fourier-transform near-infrared (FT-NIR) spectrometry combined with multivariate chemometric methods has found wide application in agriculture and food analysis. In this paper, we used linear partial least squares (PLS) and nonlinear least squares support vector machine (LS-SVM) regression to establish calibration models for the FT-NIR determination of pectin in shaddock peel samples. The tunable parameters of the linear and nonlinear models were varied over a moderate range and optimally selected in conjunction with a Savitzky–Golay smoother; the smoothing parameters and the linear/nonlinear modelling parameters were optimized simultaneously. To investigate the robustness of the calibration models, the parameter uncertainties were estimated directly for the optimal linear and nonlinear models. Our results show that the nonlinear LS-SVM method gives more accurate predictions and is substantially more robust to spectral noise than linear PLS regression. Furthermore, the optimized LS-SVM model was evaluated on randomly selected test samples, with satisfactory results. We anticipate that these linear and nonlinear methods, and the methodology for determining model parameter uncertainty, will be applied to other analytes in NIR and FT-NIR spectroscopy.

7.
The work of the Joint Committee for Guides in Metrology on the expression of uncertainty in measurement is considered. Statements are made about the current edition of the Guide to the Expression of Uncertainty in Measurement (the "GUM") and the nature of the process of "GUM revision". In particular, the supplemental guides being prepared to give added value to the GUM are outlined. The guides will cover (a) numerical methods for the propagation of distributions, (b) models with more than one measurand, (c) conformity assessment and (d) modelling.

9.
The estimation of physicochemical parameters such as distillation points and relative densities still plays an important role in the quality control of gasoline and similar fuels. Measuring them according to standard ASTM procedures demands specific equipment and is time- and labor-consuming. An alternative method for predicting distillation points and relative density by multivariate analysis of comprehensive two-dimensional gas chromatography with flame ionization detection (GC×GC-FID) data is presented here. Gasoline samples, previously tested according to the standard methods, were used to build regression models, which were evaluated by external validation. The models for the distillation points were built using variable-selection methods, while the model for relative density was built using the whole chromatograms. The root mean square prediction differences (RMSPD) obtained were 0.85%, 0.48%, 1.07% and 1.71% for the 10, 50 and 90% v/v distillation points and the final point of distillation, respectively. For relative density, the RMSPD was 0.24%. These results suggest that GC×GC-FID combined with multivariate analysis can be used to predict these physicochemical properties of gasoline.
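The RMSPD figure of merit used for the external validation is the root mean square of the relative prediction differences, expressed in percent. A minimal sketch, with illustrative reference and predicted values (not the gasoline data of the study):

```python
import math

def rmspd(predicted, reference):
    """Root mean square prediction difference, in percent."""
    return 100 * math.sqrt(
        sum(((p - r) / r) ** 2 for p, r in zip(predicted, reference))
        / len(reference)
    )

# Illustrative 50 % v/v distillation points in degC (reference vs predicted)
t50_ref = [110.0, 112.0, 108.0, 115.0]
t50_pred = [110.5, 111.4, 108.6, 114.6]
err = rmspd(t50_pred, t50_ref)
```

Because the differences are taken relative to each reference value, RMSPD is directly comparable across properties with different units, which is why the study can quote a single percentage per property.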

10.
Lyn JA, Ramsey MH, Damant AP, Wood R. The Analyst, 2007, 132(12): 1231-1237
Measurement uncertainty is a vital issue within analytical science. There are strong arguments that primary sampling should be considered the first, and perhaps the most influential, step in the measurement process. Increasingly, analytical laboratories are required to report measurement results to clients together with estimates of the uncertainty. Furthermore, these estimates can be used in regulation enforcement to decide whether a measured analyte concentration is above a threshold value. Given its recognised importance in analytical measurement, the question arises of what the most appropriate method to estimate the measurement uncertainty is. Two broad methods for uncertainty estimation are identified: the modelling method and the empirical method. In modelling, the estimation of uncertainty involves the identification, quantification and summation (as variances) of each potential source of uncertainty. This approach has been applied to purely analytical systems, but it becomes increasingly problematic to identify all such sources when it is applied to primary sampling. Applications of this methodology to sampling often utilise long-established theoretical models of sampling and adopt the assumption that a 'correct' sampling protocol will ensure a representative sample. The empirical approach to uncertainty estimation involves replicated measurements from inter-organisational trials and/or internal method validation and quality control. A simpler method involves duplicating sampling and analysis, by one organisation, for a small proportion of the total number of samples. This has proven to be a suitable alternative to these often expensive and time-consuming trials, in routine surveillance and one-off surveys, especially where heterogeneity is the main source of uncertainty. A case study of aflatoxins in pistachio nuts is used to demonstrate the strengths and weaknesses of the two methods of uncertainty estimation.
The estimate of sampling uncertainty made using the modelling approach (136%, at 68% confidence) is six times larger than that found using the empirical approach (22.5%). The difficulty in establishing reliable estimates of the input variables for the modelling approach is thought to be the main cause of the discrepancy. The empirical approach to uncertainty estimation, with its automatic inclusion of sampling within the uncertainty statement, is recognised as generally the most practical procedure, providing the more reliable estimates. The modelling approach is also shown to have a useful role, especially in choosing strategies to change the sampling uncertainty when required.
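The empirical "duplicate method" reduces to a short calculation: take duplicate results (x1, x2) for a subset of samples, pool the pair differences into a standard deviation, and expand with a coverage factor. A minimal sketch with illustrative concentrations (not the aflatoxin case-study data):

```python
import math

# Duplicate results (x1, x2) for a small proportion of samples; each pair
# covers sampling + analysis. Values are illustrative only.
pairs = [(10.2, 9.6), (8.8, 9.4), (11.0, 10.1), (9.9, 10.3),
         (10.5, 9.8), (9.2, 9.0), (10.8, 11.3), (9.7, 9.1)]

n = len(pairs)
# Pooled standard deviation from duplicate differences: s = sqrt(sum d^2 / 2n)
s_meas = math.sqrt(sum((a - b) ** 2 for a, b in pairs) / (2 * n))

# Relative expanded uncertainty (coverage factor k = 2) at the overall mean
mean = sum(a + b for a, b in pairs) / (2 * n)
u_rel_expanded = 100 * 2 * s_meas / mean
```

A fuller treatment would split the duplicate design further (duplicate samples each analysed in duplicate) and separate the sampling and analytical variance components by ANOVA, but the pooled-differences form above is the core of the approach.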

11.
Meloun M, Militký J, Kupka K, Brereton RG. Talanta, 2002, 57(4): 721-740
Building a calibration model with detection and quantification capabilities is identical to the task of building a regression model. Although the calibration model is commonly used by analysts, its application first requires careful attention to the three components of the regression triplet (data, model, method), examining (a) the quality of the data for the proposed model; (b) the quality of the model for the given data; and (c) the fulfilment of all the assumptions of the least-squares (LS) method. This paper summarizes these components, describes the effects of deviations from the assumptions, and considers the correction of such deviations. Identifying influential points is the first step in least-squares model building; the calibration task depends on the regression model used; and the LS method rests on the assumptions of normality of errors, homoscedasticity, independence of errors, the absence of overly influential data points, and independent variables that are not subject to error. When some assumptions are violated, ordinary LS is unsuitable and robust M-estimates computed by iteratively reweighted least squares must be used. The effects of influential points, heteroscedasticity and non-normality on the calibration precision limits are also elucidated. This paper also considers the proper construction of the statistical uncertainty, expressed as confidence limits for predicting an unknown concentration (or amount), and its dependence on the regression triplet. The authors' objectives were to provide a thorough treatment that includes pertinent references, consistent nomenclature and the related mathematical formulae, showing by theory and illustrative examples the approaches best suited to typical problems in analytical chemistry.
Two new algorithms, calibration and linear regression, written in S-Plus and enabling regression triplet analysis and the estimation of calibration precision limits, critical levels, detection limits and quantification limits with the statistical uncertainty of unknown concentrations, form the goal of this paper.
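The robust M-estimation step mentioned above can be sketched as iteratively reweighted least squares with Huber weights for a straight-line calibration. This is a generic illustration, not the authors' S-Plus algorithms; the data are made up, with one deliberate outlier at x = 4:

```python
# Straight-line calibration data with an outlier at x = 4 (illustrative)
x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [0.1, 1.0, 2.1, 3.0, 8.0, 5.1]

def wls(x, y, w):
    """Weighted least-squares straight line; returns (intercept, slope)."""
    sw = sum(w)
    xb = sum(wi * xi for wi, xi in zip(w, x)) / sw
    yb = sum(wi * yi for wi, yi in zip(w, y)) / sw
    b1 = (sum(wi * (xi - xb) * (yi - yb) for wi, xi, yi in zip(w, x, y))
          / sum(wi * (xi - xb) ** 2 for wi, xi in zip(w, x)))
    return yb - b1 * xb, b1

w = [1.0] * len(x)
for _ in range(30):
    b0, b1 = wls(x, y, w)
    res = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
    s = 1.4826 * sorted(abs(r) for r in res)[len(res) // 2]  # MAD scale
    k = 1.345 * s                                            # Huber tuning
    w = [1.0 if abs(r) <= k else k / abs(r) for r in res]    # Huber weights
```

Ordinary LS on these data is pulled strongly by the outlier; after reweighting, the outlier receives a small weight and the fit returns close to the trend of the remaining points.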

12.
A recently presented regression technique for linear calibration, based on a variance-component model for univariate quantitative measurement data, is compared with the conventional and widespread techniques of ordinary least-squares and weighted least-squares regression. The associated statistical models and estimators are presented, and the application of the technique is demonstrated with several practical examples. When special causes of variation are taken into account, such as matrix effects or the influence of different operating conditions on the measurement response, the variance-component model proves advantageous.
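The idea of separating special causes of variation can be sketched with the simplest variance-component estimate: a one-way ANOVA splitting the measurement variance into a within-condition and a between-condition component. The grouped responses below are illustrative, not from the cited work:

```python
# Responses measured under g operating conditions (groups), r replicates
# each; estimate within- and between-group variance components by ANOVA.
groups = [
    [10.1, 10.3, 9.9],   # condition 1
    [10.8, 11.0, 10.7],  # condition 2
    [9.6, 9.8, 9.5],     # condition 3
]
g = len(groups)
r = len(groups[0])
grand = sum(sum(gr) for gr in groups) / (g * r)
means = [sum(gr) / r for gr in groups]

# Mean squares
ms_within = sum((v - m) ** 2
                for gr, m in zip(groups, means) for v in gr) / (g * (r - 1))
ms_between = r * sum((m - grand) ** 2 for m in means) / (g - 1)

# Variance components (between-component truncated at zero if negative)
var_within = ms_within
var_between = max(0.0, (ms_between - ms_within) / r)
```

When var_between dominates, pooling all points into one ordinary least-squares fit understates the uncertainty, which is the situation in which the variance-component calibration model pays off.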

13.
The present study summarizes the measurement uncertainty estimations carried out at the Nestlé Research Center since 2002. These estimations cover a wide range of analyses of commercial and regulatory interest. The first part of this study shows that method validation data (repeatability, trueness and intermediate reproducibility) can be used to provide a good estimate of measurement uncertainty. In the second part, measurement uncertainty is compared with collaborative trial data; such data can be used for measurement uncertainty estimation provided that the in-house validation performance is comparable to the method performance obtained in the collaborative trial. Based on these two main observations, the aim of this study is to estimate measurement uncertainty easily from validation data.
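Combining validation data into an uncertainty statement is, in its simplest form, a root-sum-of-squares of the component standard uncertainties followed by expansion with a coverage factor. A minimal sketch with illustrative relative components (not Nestlé data):

```python
import math

# Illustrative relative standard uncertainties from validation data
u_trueness = 0.020   # uncertainty of the bias / trueness check
s_ir = 0.035         # intermediate reproducibility standard deviation

# Combined standard uncertainty (root sum of squares of the components)
u_combined = math.sqrt(u_trueness ** 2 + s_ir ** 2)

# Expanded uncertainty, coverage factor k = 2 (approx. 95 % confidence)
U_expanded = 2 * u_combined
```

Repeatability is already contained within intermediate reproducibility here; a fuller budget would add components that the validation design does not exercise (e.g. sample heterogeneity) as further squared terms.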

14.
Ellison SL, Holcombe DG, Burns M. The Analyst, 2001, 126(2): 199-210
Response-surface modelling is proposed as an approach to estimating the uncertainties associated with derivatisation, and is compared with a kinetic study. Fatty acid methyl ester formation is used to illustrate the approach, and kinetic data for acid-catalysed methylation and base-catalysed transesterification are presented. Kinetic effects did not lead to significant uncertainty contributions under normal conditions for the base-catalysed transesterification of triglycerides. Uncertainties for acid-catalysed methylation with BF3 approached significance, but could be reduced by extending reaction times from 3 to 5 min. Non-linearity is a common feature of response-surface models for derivatisation and compromised first-order estimates of uncertainty; it was necessary to include higher-order differential terms in the uncertainty estimate. Simulations were used to examine the general applicability of the approach and to study the effects of poor precision and of a change of response-surface model. It is concluded that reliable uncertainty estimates are available only when the model is statistically significant, robust, representative of the underlying behaviour of the system, and a good fit to the data; arbitrary models are not generally suitable for uncertainty estimation. Where statistically insignificant effects were included in the models, they gave negligible uncertainty contributions.
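Why curvature compromises first-order uncertainty estimates can be shown numerically: propagate an input uncertainty through a deliberately non-linear response both by the first-order derivative rule and by Monte Carlo sampling. The response function and input values below are illustrative, not a derivatisation model:

```python
import math
import random

def f(x):
    return x ** 2          # deliberately non-linear (curved) response

x0, u_x = 1.0, 0.3         # input value and its standard uncertainty

# First-order (linearised) propagation: u_y = |df/dx| * u_x
u_first_order = abs(2 * x0) * u_x

# Monte Carlo propagation through the full curved response
random.seed(1)
ys = [f(random.gauss(x0, u_x)) for _ in range(200_000)]
m = sum(ys) / len(ys)
u_mc = math.sqrt(sum((yv - m) ** 2 for yv in ys) / (len(ys) - 1))
```

For this response the exact standard deviation is sqrt(4*x0**2*u_x**2 + 2*u_x**4) ≈ 0.613, so the first-order value of 0.600 underestimates it; including the second-order (curvature) term closes the gap, which mirrors the higher-order differential terms the abstract says were needed.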

15.
When more than one parameter is found by a least-squares calculation, the statistical uncertainties of the parameters are generally interdependent. If the uncertainty of one such parameter is quoted as a confidence interval based on the standard error estimate of that parameter and the Student t-statistic, this interval tends to be an underestimate. It is suggested that more conservative parameter uncertainties be quoted as the extreme points of the 95% joint confidence ellipsoid, and it is shown that these joint parametric uncertainties are easily calculated from the standard error estimates. Both linear and nonlinear multiple regression are discussed. Nonlinear parameter uncertainties are found after an iterative search for the minimum sum of squares of residuals; searches by the Gauss and simplex methods are considered. A joint parametric uncertainty calculation is illustrated by a four-parameter nonlinear regression involving a pH-potentiometric titration.
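The contrast between the two intervals can be sketched directly: for p jointly estimated parameters, the extreme points of the 95% joint confidence ellipsoid along one parameter axis lie at b ± se·sqrt(p·F), versus the marginal b ± se·t. The critical values below are standard tabulated constants for p = 4 and 20 residual degrees of freedom; the standard error is illustrative:

```python
import math

p, dof = 4, 20
t_crit = 2.086     # t(0.975; 20), tabulated
F_crit = 2.866     # F(0.95; 4, 20), tabulated

se = 0.015         # illustrative standard error of one parameter

marginal_half_width = t_crit * se                 # single-parameter interval
joint_half_width = math.sqrt(p * F_crit) * se     # joint-ellipsoid extreme
```

Here sqrt(p·F) ≈ 3.39 against t ≈ 2.09, so the joint interval is roughly 60% wider; the gap grows with the number of parameters, which is the paper's point about underestimation.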

16.
The aim of this article is to study tree-based ensemble methods, newly emerging modelling techniques, for the authentication of olive oil blends: both for classifying samples according to the type of oil used in the blend and for predicting the amount of olive oil in the blend. The performance of these methods was investigated on chromatographic fingerprint data of olive oil blended with other vegetable oils, without needing to identify or quantify the chromatographic peaks. Different data-mining methods (classification and regression trees, random forests and M5 rules) were tested for classification and prediction. These tree approaches were also used for feature selection prior to modelling, in order to reduce the number of attributes in the chromatogram. The good results show that these methods yield interpretable models with much more information than traditional chemometric methods, and that they provide valuable information for detecting which vegetable oil is mixed with the olive oil, and in what percentage, from a single chromatogram.

17.
Environmental prognosis by geochemical modelling is a scientific approach to several open questions of general public interest. Two prominent fields in which geochemical modelling plays an important role are the remediation of contaminated former uranium mining areas and the safety assessment of radioactive waste repositories in the geosphere. In both fields, the application of geochemical modelling is stipulated by public authorities. The enormous complexity of the models that computers can now handle raises awareness of the meaningfulness of modelling results and demands an estimate of the dependability of the calculated output. Bias and the over- or underestimation of uncertainty in the input data obviously reduce the relevance of the calculated output. Chemistry contributes important data to geochemical modelling, both from field analysis and through the fundamental physico-chemical quantities contained in the thermodynamic database. Some examples are given where progress in the quality assessment of chemical data may further the predictive power of geochemical modelling.

18.
Fourier-transform mid-infrared (FT-MIR) spectroscopy, combined with partial least-squares (PLS) regression and IPW as a feature-selection method, was used to develop reduced-spectrum calibration models based on a few IR bands, providing near-real-time predictions of two key parameters for the characterization of finished red wines that are essential from a quality-assurance standpoint: total and volatile acidity. Separate PLS calibration models were developed, correlating the IR data (considering only those regions with a high signal-to-noise ratio) with each response studied. Wavenumber selection was also performed by applying IPW-PLS, so that only significant predictors were retained, in an attempt to improve the quality of the final models. Using both PLS and IPW-PLS regression, the two responses were predicted with very high reliability, with RMSECV and RMSEP values on the order of 1% (comparable in accuracy to the results of the respective reference analysis methods). An important advantage of the IPW-PLS method was the low number of original variables needed to model both total acidity (22 significant wavenumbers) and volatile acidity (only 11 selected predictor variables); this variable selection enhanced the stability and parsimony of the final calibration models. The high quality of the proposed calibration models supports their implementation as a fast and reliable tool in routine analysis for the determination of critical wine-quality parameters.

20.
Testing laboratories wishing to comply with the requirements of ISO/IEC 17025:1999 need to estimate the uncertainty of measurement for their quantitative methods. Many microbiological laboratories have long had procedures for monitoring the variability in duplicate results generated by laboratory analysts. These procedures, however, do not necessarily include all possible contributions to uncertainty in the calculations. Procedures for estimating microbiological method uncertainty based on the Poisson distribution have been published, but they can either underestimate the uncertainty or require laboratories to undertake considerable experimental studies and more complex statistical calculations. This paper proposes procedures for estimating the uncertainty of measurement in microbiology whereby routine laboratory quality-control data can be analyzed with simple statistical equations. These procedures are also applied to published data and examples, demonstrating that essentially equivalent results can be obtained.
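Both calculations mentioned above are short. The Poisson component of a single plate count N has relative standard uncertainty 1/sqrt(N); the routine-QC approach pools duplicate log10 counts into a standard deviation, as in the duplicate formula s = sqrt(sum d² / 2n). A minimal sketch with illustrative counts (not the published data the paper analyzes):

```python
import math

# Poisson-only component for a single plate count N
count = 120
u_rel_poisson = 1 / math.sqrt(count)

# Duplicate log10 counts from routine QC (illustrative pairs, e.g. from
# different analysts or days); pooled within-pair standard deviation.
pairs = [(2.08, 2.11), (2.30, 2.26), (1.95, 2.02), (2.48, 2.44)]
s_log = math.sqrt(sum((a - b) ** 2 for a, b in pairs) / (2 * len(pairs)))

# Expanded uncertainty on the log10 scale, coverage factor k = 2
U_log = 2 * s_log
```

The QC-based s_log captures analyst, day and dilution effects that the Poisson term alone misses, which is why duplicate-based estimates generally exceed 1/sqrt(N) for realistic counts.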


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号