Similar Documents
A total of 20 similar documents were found.
1.
About 100 countries have established regulatory limits for aflatoxin in food and feeds. Because these limits vary widely among regulating countries, the Codex Committee on Food Additives and Contaminants began work in 2004 to harmonize aflatoxin limits and sampling plans for aflatoxin in almonds, pistachios, hazelnuts, and Brazil nuts. Studies were developed to measure the uncertainty and distribution among replicated sample aflatoxin test results taken from aflatoxin-contaminated treenut lots. The uncertainty and distribution information is used to develop a model that can evaluate the performance (risk of misclassifying lots) of aflatoxin sampling plan designs for treenuts. Once the performance of aflatoxin sampling plans can be predicted, they can be designed to reduce the risks of misclassifying lots traded in either the domestic or export markets. A method was developed to evaluate the performance of sampling plans designed to detect aflatoxin in hazelnut lots. Twenty hazelnut lots with varying levels of contamination were sampled according to an experimental protocol where 16 test samples were taken from each lot. The observed aflatoxin distribution among the 16 aflatoxin sample test results was compared to lognormal, compound gamma, and negative binomial distributions. The negative binomial distribution was selected to model aflatoxin distribution among sample test results because it gave acceptable fits to observed distributions among sample test results taken from a wide range of lot concentrations. Using the negative binomial distribution, computer models were developed to calculate operating characteristic curves for specific aflatoxin sampling plan designs. The effect of sample size and accept/reject limits on the chances of rejecting good lots (sellers' risk) and accepting bad lots (buyers' risk) was demonstrated for various sampling plan designs.
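As a rough illustration of the kind of calculation described above, and not the authors' actual model, the sketch below computes an operating characteristic curve (the probability of accepting a lot as a function of its true aflatoxin concentration) for a single-sample plan, assuming the sample test result follows a negative binomial distribution whose mean equals the lot concentration. The variance function, accept/reject limit, and concentration range are hypothetical.

```python
import numpy as np
from scipy.stats import nbinom

def acceptance_probability(mean_ppb, accept_limit_ppb, var_fn):
    """P(sample test result <= accept limit) for one sample drawn from a
    negative binomial with the given mean and a variance given by var_fn."""
    var = var_fn(mean_ppb)
    # Negative binomial parameterised by its mean m and variance v (v > m):
    #   n = m^2 / (v - m),  p = m / v
    n = mean_ppb ** 2 / (var - mean_ppb)
    p = mean_ppb / var
    return nbinom.cdf(accept_limit_ppb, n, p)

# Hypothetical variance function: variance grows faster than the mean,
# reflecting the skewed distribution of aflatoxin among test samples.
var_fn = lambda m: m + 0.5 * m ** 2

accept_limit = 10                      # ng/g, hypothetical accept/reject limit
lot_concentrations = np.arange(1, 41)  # ng/g
oc_curve = [acceptance_probability(m, accept_limit, var_fn) for m in lot_concentrations]

for m, p_accept in zip(lot_concentrations[::5], oc_curve[::5]):
    print(f"lot concentration {m:>2} ng/g -> P(accept) = {p_accept:.3f}")
```

Plotting P(accept) against lot concentration for competing designs shows directly how sample size and accept limits trade off sellers' and buyers' risks.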

2.
MRM, multivariate range modeling, is based on models built as parallelepipeds in the space of the original variables and/or of discriminant variables such as those of linear discriminant analysis. The ranges of these variables define the boundary of the model. The ranges are increased by a "tolerance" factor to take into account the uncertainty of their estimate. MRM is compared with UNEQ (the modeling technique based on the hypothesis of a multivariate normal distribution) and with SIMCA (based on principal components) by means of the sensitivities and specificities of the models (the estimates of the type I and type II error rates, respectively), evaluated both with the final model built from all the available objects and by means of cross validation. UNEQ and SIMCA models were obtained with the usual critical significance value of 5% and with the model forced to accept all the objects of the modeled category. The performance parameters of the class models are critically discussed, focusing on their uncertainty.
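A minimal sketch of the range-model idea, under stated assumptions: a single modelled category, a symmetric tolerance expressed as a fraction of each variable's observed range, and plain inside/outside classification. It is not the published MRM implementation, only the geometric idea behind it.

```python
import numpy as np

def fit_range_model(X_class, tolerance=0.05):
    """Per-variable ranges of the training objects, widened by a tolerance factor."""
    lo, hi = X_class.min(axis=0), X_class.max(axis=0)
    pad = tolerance * (hi - lo)
    return lo - pad, hi + pad

def inside(X, lo, hi):
    """True where an object falls inside the parallelepiped on every variable."""
    return np.all((X >= lo) & (X <= hi), axis=1)

rng = np.random.default_rng(0)
X_train = rng.normal(0.0, 1.0, size=(40, 3))       # objects of the modelled category
X_same  = rng.normal(0.0, 1.0, size=(200, 3))      # more objects of the same category
X_other = rng.normal(2.5, 1.0, size=(200, 3))      # objects of a different category

lo, hi = fit_range_model(X_train, tolerance=0.05)
sensitivity = inside(X_same, lo, hi).mean()          # fraction of own-category objects accepted
specificity = 1.0 - inside(X_other, lo, hi).mean()   # fraction of foreign objects rejected
print(f"sensitivity ~ {sensitivity:.2f}, specificity ~ {specificity:.2f}")
```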

3.
Some general reasons for the poor applicability of the statistical approach based on the approximation of a normal data distribution to interlaboratory test results and analytical measurements at high data dispersion are considered. They include the asymmetry of the concentration scale, low-frequency noise, and nonlinear phenomena in atomization processes and chemical reactions. The relationship of 1/f noise and nonlinear phenomena to the uncertainty balance, experimental verification of the assigned uncertainty value, ruggedness tests, and statistical data distribution is briefly discussed.

4.
The ability to determine the biodegradability of chemicals without resorting to expensive tests is ecologically and economically desirable. Models based on quantitative structure–activity relations (QSAR) provide some promise in this direction. However, QSAR models in the literature rarely provide uncertainty estimates in more detail than aggregated statistics such as the sensitivity and specificity of the model’s predictions. Almost never is there a means of assessing the uncertainty in an individual prediction. Without an uncertainty estimate, it is impossible to assess the trustworthiness of any particular prediction, which leaves the model with a low utility for regulatory purposes. In the present work, a QSAR model with uncertainty estimates is used to predict biodegradability for a set of substances from a publicly available data set. Separation was performed using a partial least squares discriminant analysis model, and the uncertainty was estimated using bootstrapping. The uncertainty prediction allows for confidence intervals to be assigned to any of the model’s predictions, allowing for a more complete assessment of the model than would be possible through a traditional statistical analysis. The results presented here are broadly applicable to other areas of modelling as well, because the calculation of the uncertainty will clearly demonstrate where additional tests are needed.
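A hedged sketch of the bootstrap-uncertainty idea described above, using a PLS regression on a 0/1 class label as a stand-in for PLS-DA and entirely synthetic data; the descriptor set, number of latent variables, and percentile interval are arbitrary choices, not those of the paper.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)

# Synthetic stand-in for a descriptor matrix X and binary biodegradability labels y.
n, p = 120, 10
X = rng.normal(size=(n, p))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(float)

X_new = rng.normal(size=(5, p))   # substances to be predicted

# Bootstrap the training set and refit PLS-DA each time; the spread of the
# predicted class scores for each new substance is its uncertainty estimate.
n_boot = 500
scores = np.empty((n_boot, len(X_new)))
for b in range(n_boot):
    idx = rng.integers(0, n, size=n)             # resample with replacement
    model = PLSRegression(n_components=2).fit(X[idx], y[idx])
    scores[b] = model.predict(X_new).ravel()

lo, hi = np.percentile(scores, [2.5, 97.5], axis=0)
for i, (l, h) in enumerate(zip(lo, hi)):
    print(f"substance {i}: 95% bootstrap interval for class score [{l:.2f}, {h:.2f}]")
```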

5.
Conventional statistical analyses of counting measurements with a paired-count blank and sample yield unacceptably large estimates of uncertainty that reduce measurement sensitivity when applied to very-low-background detection systems. An alternative is presented here: Bayesian analysis using longer-duration background measurements, appropriate modeling of the background, and a binomial distribution of decay-induced counts that is valid even for short-lived isotopes. We develop the needed formulae and demonstrate how the decision level and sample measurement duration are optimized jointly to produce the lowest minimum detectable quantity subject to constraints of specified acceptable risks of false detection and false failure to detect. A frequentist’s interpretation is maintained by using equal-likelihood prior distributions.
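The sketch below illustrates the general idea, not the paper's formulation: it uses Poisson counting statistics rather than the binomial decay model, summarises a long background measurement as a gamma posterior on the background rate, places a flat prior on the non-negative source rate, and evaluates the posterior numerically on a grid. All counts and durations are hypothetical.

```python
import numpy as np
from scipy.stats import gamma, poisson

# Hypothetical counting data
t_bkg, n_bkg = 100000.0, 42   # long background measurement: duration (s), counts
t_smp, n_smp = 3600.0, 5      # sample measurement: duration (s), gross counts

# Background rate: gamma posterior from the long background count (flat-rate prior).
bkg_rates = np.linspace(1e-9, 5 * n_bkg / t_bkg, 400)
bkg_pdf = gamma.pdf(bkg_rates, a=n_bkg + 1, scale=1.0 / t_bkg)

# Source rate grid with a flat prior on r_s >= 0; marginalise over the background rate.
src_rates = np.linspace(0.0, 0.02, 600)
post = np.zeros_like(src_rates)
for rb, w in zip(bkg_rates, bkg_pdf):
    post += w * poisson.pmf(n_smp, (src_rates + rb) * t_smp)
post /= np.trapz(post, src_rates)

mean_rs = np.trapz(src_rates * post, src_rates)
cdf = np.cumsum(post) * (src_rates[1] - src_rates[0])
upper95 = src_rates[np.searchsorted(cdf, 0.95)]
print(f"posterior mean source rate {mean_rs:.2e} /s, 95% upper bound {upper95:.2e} /s")
```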

6.
About 100 nations have established regulatory limits for aflatoxin in food and feeds. Because these limits vary widely from one country to another, the Codex Alimentarius Commission, working through the Codex Committee on Food Additives and Contaminants, has initiated work to harmonize aflatoxin limits and sampling plans for almonds, pistachios, hazelnuts, and Brazil nuts. Studies were developed to measure the uncertainty and distribution among test results for replicate samples taken from aflatoxin-contaminated almond shipments. The uncertainty and distribution information was used to develop a model to evaluate the performance of aflatoxin sampling plans so that harmonized sampling plans can be developed for almonds that reduce the misclassification of lots in the export trade. Twenty lots of shelled almonds were sampled according to an experimental protocol in which sixteen 10 kg samples were taken from each lot. The observed aflatoxin distribution among the 16 sample test results was compared with 3 theoretical distributions. The negative binomial distribution was selected to model aflatoxin distribution among sample test results because it gave acceptable fits across all 20 observed sample distributions. By using the variance and distribution information, operating characteristic curves were developed to predict the effect of sample size and accept/reject limits on the probability of rejecting good lots and accepting bad lots.
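To illustrate the kind of distribution comparison mentioned above, the sketch below fits a negative binomial (by the method of moments) and a lognormal distribution to a hypothetical set of 16 replicate test results; the data are invented and the log-likelihood comparison is only indicative, since the lognormal fit excludes zero results.

```python
import numpy as np
from scipy import stats

# Hypothetical set of 16 replicate aflatoxin test results (ng/g) from one lot.
results = np.array([0, 1, 1, 2, 2, 3, 3, 4, 5, 6, 8, 9, 12, 15, 21, 34], dtype=float)

# Method-of-moments negative binomial fit (mean m, variance v > m).
m, v = results.mean(), results.var(ddof=1)
n_nb, p_nb = m**2 / (v - m), m / v
ll_nb = stats.nbinom.logpmf(results.astype(int), n_nb, p_nb).sum()

# Lognormal fit on the positive results only (zeros cannot be handled without a shift).
pos = results[results > 0]
shape, loc, scale = stats.lognorm.fit(pos, floc=0)
ll_ln = stats.lognorm.logpdf(pos, shape, loc, scale).sum()

print(f"negative binomial log-likelihood: {ll_nb:.1f} (all 16 results)")
print(f"lognormal log-likelihood:         {ll_ln:.1f} ({len(pos)} positive results)")
```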

7.
Traceability to the International System of Units (SI) is an important prerequisite for the international comparability and uniformity of chemical measurements and for ensuring mutual recognition of results. In theory, all measurements can be traced back to the seven base units of the SI. Although the traceability system works well for most physical measurements, in many analytical and in some spectrophotometric measurements this system is not satisfactory. This paper describes the particular and practical problems and the contribution of the Romanian National Institute of Metrology in this field. The paper discusses the following concepts: clearly defined targets in the form of a requirement specification, knowledge of trueness and/or measurement uncertainty, and traceability through an unbroken chain of calibrations to primary standards. Traceability and uncertainty being two inherently coupled concepts, two examples of the assessment of the uncertainty of measurement results are given for two spectrophotometric methods currently used in chemical laboratories.

8.
We discuss the interpretation of the usually broad oxidation peaks observed in electronically conducting polymers in terms of the statistical distribution functions of polarons and bipolarons. The analysis is based on examining the chemical capacitance, which relates the change of concentration to a modification of the chemical potential of a given species, for different statistical models. We first review the standard models for single-energy species that provide a Nernstian dependence, and the limitations of these models are discussed. A new model that assumes a Gaussian distribution of energies related to molecular geometry fluctuations is suggested, and this model shows excellent agreement with the results of the electrochemical oxidation of polypyrrole under quasi-equilibrium conditions. From a fit of the data, it is found that the density of conjugated chain segments in polypyrrole, Ns ≈ 10^21 cm^-3, shows a Gaussian distribution of half-width σ ≈ 170 meV, tentatively attributed to bipolaron formation energies.
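A schematic numerical sketch of how a chemical capacitance can be evaluated from an assumed Gaussian distribution of state energies with Fermi-Dirac-type occupation; the occupation statistics, parameter values, and unit handling are assumptions for illustration, not the authors' exact model.

```python
import numpy as np

kT = 0.025      # eV, thermal energy at room temperature
sigma = 0.17    # eV, assumed Gaussian width of the state-energy distribution
Ns = 1e21       # cm^-3, assumed density of conjugated chain segments
E0 = 0.0        # eV, centre of the distribution

E = np.linspace(-1.5, 1.5, 4001)   # energy grid (eV)
g = Ns / (sigma * np.sqrt(2 * np.pi)) * np.exp(-(E - E0) ** 2 / (2 * sigma ** 2))

def chemical_capacitance(mu):
    """C_mu = e^2 * dn/dmu = e^2 * integral g(E) f(1-f)/kT dE, Fermi-Dirac occupation f.
    With energies in eV, the numerical prefactor reduces to e (in coulombs)."""
    f = 1.0 / (1.0 + np.exp((E - mu) / kT))
    e = 1.602e-19   # C
    return e * np.trapz(g * f * (1 - f) / kT, E)   # F per cm^3

for mu in (-0.4, -0.2, 0.0, 0.2, 0.4):
    print(f"mu = {mu:+.1f} eV  ->  C_mu ~ {chemical_capacitance(mu):.2e} F/cm^3")
```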

9.
Validation of complex chemical models relies increasingly on uncertainty propagation and sensitivity analysis with Monte Carlo sampling methods. The utility and accuracy of this approach depend on the proper definition of probability density functions for the uncertain parameters of the model. Taking into account the existing correlations between input parameters is essential to a reliable uncertainty budget for the model outputs. We address here the problem of branching ratios between product channels of a reaction, which are correlated by the unit value of their sum. We compare the uncertainties on predicted time-dependent and equilibrium species concentrations due to input samples, either uncorrelated or explicitly correlated by a Dirichlet distribution. The method is applied to the case of Titan ionospheric chemistry, with the aim of estimating the effect of branching ratio correlations on the uncertainty balance of equilibrium densities in a complex model.
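A small numerical illustration of the sampling question discussed above (not the Titan chemistry model itself): branching ratios of a three-channel reaction drawn either from a Dirichlet distribution, which respects the unit-sum constraint by construction, or sampled independently and renormalised. The nominal ratios, spreads, and sample size are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
n_samples = 10000

# Nominal branching ratios of three product channels of one reaction (hypothetical).
b_nominal = np.array([0.6, 0.3, 0.1])

# Correlated sampling: Dirichlet draws always sum to 1 by construction.
# The concentration parameter controls the spread around the nominal values.
concentration = 50.0
b_dirichlet = rng.dirichlet(concentration * b_nominal, size=n_samples)

# Naive uncorrelated sampling: independent relative uncertainties, then renormalised.
b_uncorr = b_nominal * rng.lognormal(mean=0.0, sigma=0.2, size=(n_samples, 3))
b_uncorr /= b_uncorr.sum(axis=1, keepdims=True)

print("sum check (Dirichlet):", np.allclose(b_dirichlet.sum(axis=1), 1.0))
print("channel std, Dirichlet   :", b_dirichlet.std(axis=0).round(3))
print("channel std, renormalised:", b_uncorr.std(axis=0).round(3))
print("corr(b1, b2), Dirichlet   : %.2f" % np.corrcoef(b_dirichlet[:, 0], b_dirichlet[:, 1])[0, 1])
print("corr(b1, b2), renormalised: %.2f" % np.corrcoef(b_uncorr[:, 0], b_uncorr[:, 1])[0, 1])
```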

10.
This paper reports the use of improved numerical approaches to modelling extraction profiles, and shows that the approach substantially reduces statistical prediction uncertainties compared with those obtained on the basis of a three-point extrapolation from the later part of the extraction curve. Numerical fitting of manually obtained polycyclic aromatic hydrocarbon extraction data to a spherical particle diffusion model showed uncertainties typically reduced by a factor of three (with extremes at 1.02 and 770). Application to a pressurised fluid extraction study of pelletised poly(vinyl chloride) containing 30 mass% di(2-ethylhexyl)phthalate also showed good improvements. However, this high-precision data showed a small but significant lack of fit, resulting in residual correlation and visibly biased prediction (more so than simple extrapolation). Re-fitting and uncertainty estimation using a first-order autoregression approximation to the covariance matrix produced more realistic uncertainty estimates and closer parameter estimates, and is accordingly recommended for treating residual correlation from other causes, but did not entirely alleviate the problem. Different shape models (spherical, plane sheet and cylindrical) were applied without accounting fully for fitting error, and particle size effects were eliminated by modelling a simple size distribution. However, an approximate model based on a linearly concentration-dependent diffusion coefficient showed excellent fit, confirming concentration dependence as the most likely cause. This semiempirical model led to an uncertainty in total extractable material of 0.2% of the total extractable value (with allowance for correlation). This is potentially good enough for recovery estimation and correction in the certification of reference materials for validation purposes.
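A minimal sketch of fitting an extraction profile to a uniform-sphere diffusion model with scipy, using invented data and a lumped rate constant k = D/r^2; the paper's refinements (AR(1) residual correlation, particle-size distributions, concentration-dependent diffusivity) are not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def spherical_diffusion(t, m_inf, k, n_terms=50):
    """Cumulative mass extracted from uniform spheres; k = D / r^2 (lumped)."""
    n = np.arange(1, n_terms + 1)[:, None]
    series = (6 / np.pi**2) * np.sum(np.exp(-n**2 * np.pi**2 * k * t) / n**2, axis=0)
    return m_inf * (1 - series)

# Hypothetical extraction profile: cumulative mass (mg) vs time (min).
t = np.array([2, 5, 10, 15, 20, 30, 45, 60, 90, 120], dtype=float)
m = np.array([1.1, 1.9, 2.9, 3.5, 4.0, 4.6, 5.1, 5.4, 5.7, 5.8])

popt, pcov = curve_fit(spherical_diffusion, t, m, p0=[6.0, 0.01],
                       bounds=([0, 0], [np.inf, np.inf]))
m_inf, k = popt
u_m_inf, u_k = np.sqrt(np.diag(pcov))
print(f"total extractable m_inf = {m_inf:.2f} +/- {u_m_inf:.2f} mg")
print(f"lumped rate k = D/r^2   = {k:.4f} +/- {u_k:.4f} 1/min")
```

Accounting for the residual correlation noted in the abstract would amount to replacing the ordinary least-squares objective with a generalised least-squares one built from an AR(1) covariance matrix.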

11.
Molecular dynamics (MD) simulations can be used to estimate transition rates between conformational substates of the simulated molecule. Such an estimation is associated with statistical uncertainty, which depends on the number of observed transitions. In turn, it induces uncertainties in any property computed from the simulation, such as free energy differences or the time scales involved in the system's kinetics. Assessing these uncertainties is essential for testing the reliability of a given observation and also for planning further simulations in such a way that the most serious uncertainties will be reduced with minimal effort. Here, a rigorous statistical method is proposed to approximate the complete statistical distribution of any observable of an MD simulation provided that one can identify conformational substates such that the transition process between them may be modeled with a memoryless jump process, i.e., Markov or master equation dynamics. The method is based on sampling the statistical distribution of Markov transition matrices that is induced by the observed transition events. It allows physically meaningful constraints to be included, such as sampling only matrices that fulfill detailed balance, or matrices that produce a predefined equilibrium distribution of states. The method is illustrated on μs MD simulations of a hexapeptide for which the distributions and uncertainties of the free energy differences between conformations, the transition matrix elements, and the transition matrix eigenvalues are estimated. It is found that both constraints, detailed balance and a predefined equilibrium distribution, can significantly reduce the uncertainty of some observables.
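A simplified sketch of the transition-matrix sampling idea: each row of the matrix is drawn from a Dirichlet posterior given hypothetical observed transition counts, and the draws are propagated to the slowest implied timescale. The detailed-balance and fixed-equilibrium constraints used in the paper require a more elaborate (e.g., MCMC) sampler and are not shown.

```python
import numpy as np

rng = np.random.default_rng(3)

# Observed transition counts between 3 conformational substates at lag time tau.
tau = 1.0  # ns, hypothetical lag time
counts = np.array([[900,  80,  20],
                   [ 70, 450,  30],
                   [ 15,  35, 150]])

def sample_transition_matrix(counts, prior=1.0):
    """One draw from independent Dirichlet posteriors, one per matrix row."""
    return np.vstack([rng.dirichlet(row + prior) for row in counts])

slowest_timescales = []
for _ in range(2000):
    T = sample_transition_matrix(counts)
    eigvals = np.sort(np.abs(np.linalg.eigvals(T)))[::-1]
    lambda2 = eigvals[1]                       # second-largest eigenvalue magnitude
    slowest_timescales.append(-tau / np.log(lambda2))

lo, med, hi = np.percentile(slowest_timescales, [2.5, 50, 97.5])
print(f"slowest implied timescale: {med:.1f} ns (95% interval {lo:.1f}-{hi:.1f} ns)")
```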

12.
A paramount aspect in the development of a model for a monitoring system is the so-called parameter stability. This is inversely related to the uncertainty, i.e., the variance of the parameter estimates. Noise affects the performance of the monitoring system, reducing its fault detection capability. Low parameter uncertainty is desired to ensure a reduced amount of noise in the model. Nonetheless, there is no sound study of parameter stability in batch multivariate statistical process control (BMSPC). The aim of this paper is to investigate the parameter stability associated with the most commonly used synchronization and principal component analysis-based BMSPC methods. The synchronization methods included in this study are the following: indicator variable, dynamic time warping, relaxed greedy time warping, and time linear expanding/compressing-based synchronization. In addition, different arrangements of the three-way batch data into two-way matrices are considered, namely the single-model, K-models, and hierarchical-model approaches. Results are discussed in connection with previous conclusions in the first two papers of the series.

13.
戴骐 (Dai Qi), 《分析试验室》 (Chinese Journal of Analysis Laboratory), 2007, 26(8): 77-79
A mathematical model was established for the determination of lead in candy by inductively coupled plasma atomic emission spectrometry (ICP-AES). The sources of uncertainty of each parameter in the model were analysed, and Type A or Type B evaluations of uncertainty were carried out as appropriate. The individual uncertainty components were then combined and expanded to obtain the uncertainty of the lead mass fraction. The results show that the preparation of the standard solutions, the fitting of the linear calibration equation, and the dilution of the sample solution to volume are the main sources of uncertainty.
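A generic sketch of the final combination step described above, with invented relative uncertainty components; it shows only how Type A and Type B contributions are combined in quadrature and expanded with a coverage factor, not the paper's actual budget.

```python
import numpy as np

# Hypothetical relative standard uncertainty components (as fractions of the result)
# for the ICP-AES determination of lead, combined in quadrature as in a typical
# GUM-style uncertainty budget.
components = {
    "standard solution preparation": 0.012,   # Type B
    "calibration-curve fit":         0.015,   # Type A (regression)
    "sample dilution to volume":     0.008,   # Type B
    "measurement repeatability":     0.006,   # Type A
}

u_rel = np.sqrt(sum(u**2 for u in components.values()))   # combined relative uncertainty
k = 2                                                     # coverage factor (~95 %)
result = 0.52                                             # mg/kg, hypothetical lead content

print(f"combined relative standard uncertainty: {u_rel:.3f}")
print(f"result: {result:.2f} +/- {k * u_rel * result:.2f} mg/kg (k = {k})")
```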

14.
Evaluation of the uncertainty in the determination of zinc in vegetables and fruit by ICP-AES
The application of measurement uncertainty [1-3] has developed unevenly across different industries in China, and evaluations for vegetable and fruit testing methods have rarely been reported. ICP, as a sensitive, rapid and efficient analytical technique, is widely used in food, environmental and other fields, so the evaluation of the uncertainty of its results is rather important. Taking the determination of zinc in tomatoes by ICP as an example, this paper addresses the determination of heavy metals in vegetables and fruit by ICP…

15.
Traceability implies comparison of the results of measurements, or comparison to national or international measurement standards. One of several approaches that have been used in chemistry to provide for such comparisons is distribution of proficiency evaluation materials which have been measured by a reference laboratory. A newer approach is based on receipt and measurement at a reference laboratory of materials that have been produced and analyzed by other laboratories. Traceability concepts and approaches to realization will be described together with discussion of the relative merits of various approaches. Extension into metrological fields other than chemistry will also be explored.

16.
17.
The main sources of uncertainty in heating-capacity measurements by the air-enthalpy method were analysed and screened. Type A evaluations and Type B analyses of uncertainty were carried out through a mathematical model, and mathematical models were derived and established for the main heating-capacity uncertainty components such as static pressure difference, enthalpy, air specific volume, and air humidity. The heating-capacity uncertainty was then evaluated from experimental data. Limiting values were used to evaluate complex uncertainty components, providing a simple and reliable approach for the evaluation of heating-capacity measurement uncertainty.
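A schematic sketch of propagating input uncertainties through a simplified air-enthalpy expression using numerically estimated sensitivity coefficients; the formula, input values, and uncertainties are placeholders and do not reproduce the standard's full model or the limiting-value treatment mentioned above.

```python
import numpy as np

def heating_capacity(q_v, h_out, h_in, v):
    """Schematic air-enthalpy expression: Q = q_v * (h_out - h_in) / v."""
    return q_v * (h_out - h_in) / v

# Hypothetical best estimates and standard uncertainties of the input quantities.
x  = {"q_v": 0.50, "h_out": 45.0e3, "h_in": 30.0e3, "v": 0.86}   # m3/s, J/kg, J/kg, m3/kg
ux = {"q_v": 0.01, "h_out": 0.30e3, "h_in": 0.25e3, "v": 0.01}

Q = heating_capacity(**x)
u2 = 0.0
for name in x:
    step = 1e-6 * abs(x[name])                  # numerical sensitivity coefficient dQ/dx_i
    xp = dict(x); xp[name] += step
    c_i = (heating_capacity(**xp) - Q) / step
    u2 += (c_i * ux[name]) ** 2

print(f"Q = {Q/1000:.2f} kW, combined standard uncertainty = {np.sqrt(u2)/1000:.2f} kW")
```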

18.
The best estimate and its standard deviation are calculated for the case when the a priori probability that the analyte is absent from the test sample is not zero. In the calculation, a generalization of the Bayesian prior that is used in the ISO 11929 standard is applied. The posterior probability density distribution of the true values, given the observed value and its uncertainty, is a linear combination of the Dirac delta function and the normalized, truncated, normal probability density distribution defined by the observed value and its uncertainty. The coefficients of this linear combination depend on the observed value and its uncertainty, as well as on the a priori probability. It is shown that for a priori probabilities larger than zero the lower level of the uncertainty interval of the best estimate reaches the unfeasible range (i.e., negative activities). However, for a priori probabilities in excess of 0.26 it reaches the unfeasible range even for positive observed values. The upper limit of the confidence interval covering a predefined fraction of the posterior is derived.
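A numerical sketch of working with the mixture posterior described above: a point mass at zero plus a normal distribution truncated to non-negative values. The weight of the point mass is treated here as a given input rather than derived from the a priori probability, since the paper's expression for it is not reproduced.

```python
import numpy as np
from scipy.stats import truncnorm

def best_estimate(x_obs, u_obs, w_zero, q=0.95):
    """Mean and upper q-quantile of the posterior
       w_zero * delta(0) + (1 - w_zero) * Normal(x_obs, u_obs) truncated to [0, inf)."""
    a = (0.0 - x_obs) / u_obs                  # lower truncation point in standard units
    tn = truncnorm(a, np.inf, loc=x_obs, scale=u_obs)
    mean = (1 - w_zero) * tn.mean()
    # Upper limit y_q with P(Y <= y_q) = q; the point mass at zero contributes w_zero.
    if q <= w_zero:
        upper = 0.0
    else:
        upper = tn.ppf((q - w_zero) / (1 - w_zero))
    return mean, upper

for w in (0.0, 0.1, 0.3):
    m, up = best_estimate(x_obs=2.0, u_obs=1.0, w_zero=w)
    print(f"P(analyte absent | data) = {w:.1f}: best estimate {m:.2f}, 95% upper limit {up:.2f}")
```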

19.
The evaluation of measurement uncertainty, and of the uncertainty statements of participating laboratories, will be a challenge to be met in the coming years. The publication of ISO 17025 has led to a situation in which testing laboratories should, to a certain extent, meet the same requirements regarding measurement uncertainty and traceability. As a consequence, proficiency test organizers should deal with the issues of measurement uncertainty and traceability as well. Two common statistical models used in proficiency testing are revisited to explore the options for including the evaluation of the measurement uncertainty of the PTRV (proficiency test reference value). Furthermore, the use of this PTRV and its uncertainty estimate for assessing the uncertainty statements of the participants under the two models is discussed. It is concluded that, in analogy to Key Comparisons, it is feasible to implement proficiency tests in such a way that the new requirements can be met.

20.
A stochastic simulation of simultaneous reaction and diffusion is proposed for the gas-liquid interface formed at the surface of a gas bubble within a liquid. The interface between a carbon dioxide bubble and an aqueous solution of calcium hydroxide was simulated as an application example, taken from the integrated production of calcium carbonate. First, Gillespie's stochastic simulation algorithm was applied in separate reaction and diffusion simulations. The results from these simulations were consistent with deterministic solutions based on differential equations. However, it was observed that stochastic diffusion simulations are extremely slow. The sampling of diffusion events was accelerated by applying a group molecule transfer scheme based on the binomial distribution function. Simulations of the reaction-diffusion process in the gas-liquid interface based on the standard Gillespie stochastic algorithm were also slow. However, the application of the binomial distribution function scheme made it possible to compute the concentration profiles in the gas-liquid interface in a fraction of the time required by the standard Gillespie stochastic algorithm.
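A toy sketch of the two ingredients described above, under stated simplifications: Gillespie's direct method for a single reaction A + B -> C in one compartment, and diffusion between two compartments handled as one binomial group-transfer draw per time slice (a crude operator-splitting approximation, not the paper's interface model). Rate constants and molecule numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy system: two well-mixed compartments exchanging species A by diffusion,
# with the reaction A + B -> C occurring in compartment 2.
A = np.array([500, 0])        # molecules of A in compartments 1 and 2
B, C = 300, 0                 # molecules of B and C in compartment 2
k_diff, k_rxn = 0.05, 0.001   # stochastic rate constants (1/s and 1/(molecule*s))

t, t_end, dt = 0.0, 50.0, 0.1
while t < t_end:
    # Diffusion handled as a grouped transfer over dt: each A molecule in
    # compartment 1 jumps with probability p = 1 - exp(-k_diff * dt); the number
    # that jump is a single binomial draw instead of many individual SSA events.
    p_jump = 1.0 - np.exp(-k_diff * dt)
    movers = rng.binomial(A[0], p_jump)
    A[0] -= movers
    A[1] += movers

    # Reaction in compartment 2, simulated with Gillespie's direct method within dt.
    tau_local = 0.0
    while True:
        a_rxn = k_rxn * A[1] * B
        if a_rxn <= 0:
            break
        tau_local += rng.exponential(1.0 / a_rxn)
        if tau_local > dt:
            break
        A[1] -= 1; B -= 1; C += 1
    t += dt

print(f"after {t_end:.0f} s: A = {A.tolist()}, B = {B}, C = {C}")
```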
