1.
Pavlina Simeonova Costel Sarbu Thomas Spanos Vasil Simeonov Stefan Tsakovski 《Central European Journal of Chemistry》2006,4(1):68-80
The present paper deals with the application of classical and fuzzy principal components analysis to a large data set from
coastal sediment analysis. Altogether 126 sampling sites from the Atlantic Coast of the USA are considered and at each site
16 chemical parameters are measured. It is found that four latent factors are responsible for the data structure (“natural”,
“anthropogenic”, “bioorganic”, and “organic anthropogenic”). Additionally, examining scatter plots of the factor scores revealed similarities between the sampling sites. Geographical and urban factors are found to contribute to the sediment chemical composition. It is shown that fuzzy PCA aids data interpretation, especially in the presence of outliers.
2.
S. Bürger K. J. Mathew P. Mason U. Narayanan 《Journal of Radioanalytical and Nuclear Chemistry》2009,279(2):659-673
The characterized concentrations of 24 impurity elements in New Brunswick Laboratory (NBL) Certified Reference Material (CRM)
124 were reevaluated. A provisional certificate of analysis was issued in September 1983 based upon the “as prepared” values
(gravimetric mixing). The provisional certificate does not state uncertainties for the characterized values, or estimate the
degree of homogeneity. Since the release of the provisional certificate of analysis, various laboratories have reported analytical results for CRM 124. Based upon the reported data, a re-evaluation of the characterized values with an estimate of their uncertainties
was performed in this work. An assessment of the degree of homogeneity was included. The overall difference between the re-evaluated
values for the 24 impurity elements and the “as prepared” values from the provisional certificate of analysis is negligible
compared to the uncertainties. Therefore, NBL will establish the “as prepared” values as the certified values and use the
derived uncertainties from this work for the uncertainties of the certified values. The traceability of the “as prepared”
values was established by the gravimetric mixing procedure employed during the preparation of the CRM. NBL further recommends
a minimum sample size of 1 g of the CRM material to ensure homogeneity. Samples should be dried by heating up to 110 °C for
one hour before use.
3.
The IAEA conducted the IAEA-CU-2006-06 Proficiency Test (PT) on “The determination of major, minor and trace elements in ancient
Chinese ceramic” in 2006. Of the 21 analytes reported by our laboratory, 9 failed the precision criteria. The results reported by our laboratory were therefore studied alongside those of other laboratories that carried out the analysis using neutron activation analysis (NAA). The major factor pushing data into the “Warning” category, i.e. failing the precision criteria, was the high uncertainties cited in the certificates of the reference materials (RMs) used for quantification. It is therefore recommended that synthetic standards be prepared and used on a routine basis, especially for the measurement of the elements K, Eu, Lu, Ta, Tb and Yb.
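The mechanism described above is ordinary quadrature propagation: a large RM certificate uncertainty dominates the combined uncertainty of the result no matter how good the counting statistics are. A hedged sketch (the 1% and 5% relative uncertainties are invented for illustration):

```python
import math

def combined_rel_uncertainty(u_counting, u_rm_certificate):
    """Combine independent relative uncertainty components in quadrature."""
    return math.sqrt(u_counting**2 + u_rm_certificate**2)

# hypothetical: 1% counting statistics vs a 5% RM certificate term
u_c = combined_rel_uncertainty(0.01, 0.05)
# the certificate term dominates; synthetic standards with smaller
# uncertainties would shrink u_c almost proportionally
```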
4.
Henning Wolf 《Accreditation and quality assurance》2008,13(10):587-591
Absolute measurements of water density with very small uncertainties, on the order of 0.001 kg/m³, have previously been a metrological challenge, as shown by measurements of the density of pure water performed in recent decades with different methods. However, using water as a reference liquid with a well-known density, it is possible to perform density measurements relative to this reference liquid by means of an oscillation-type density meter. Using this so-called substitution method, it is possible to obtain uncertainties of about 0.002 kg/m³, or a relative uncertainty of 2 × 10⁻⁶. The conversion from relative to absolute measurements is performed using a water density table. The uncertainty of this absolute measurement is given by the combination of the uncertainty of the relative measurement and the uncertainty given for the density table.
Presented at the PTB Seminar “Conductivity and Salinity”, September 2007, Braunschweig, Germany.
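The final combination step the abstract describes is a simple quadrature sum; a minimal sketch using magnitudes near those quoted (the 0.001 kg/m³ table term is an assumed value, not one stated in the abstract):

```python
import math

def absolute_density_uncertainty(u_relative, u_table):
    """Combine the relative (substitution) measurement uncertainty and
    the water-density table uncertainty in quadrature, in kg/m^3."""
    return math.sqrt(u_relative**2 + u_table**2)

# illustrative: 0.002 kg/m^3 from the relative measurement,
# 0.001 kg/m^3 assumed for the density table
u = absolute_density_uncertainty(0.002, 0.001)
```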
5.
Ryan Fitzgerald 《Journal of Radioanalytical and Nuclear Chemistry》2010,284(1):173-174
In a recent paper, Mathew et al. detailed, for a specific titration-based assay of uranium, a “step-by-step approach to calculate
the GUM uncertainty of the measurand”, in which their uncertainty assessment was based solely on prior knowledge, ignoring
the manifest variability in their replication data. A simple analysis of the variance from their data reveals that the uncertainty
in the average of the replicated quantity (TEF) was at least 3.5 times their estimate. Since the observables that contribute
most to the final uncertainty in their method were not replicated, it is unknown whether the estimates for the uncertainties
of those quantities, and thus of the output quantity, were also underestimated. This comment demonstrates how a better uncertainty
evaluation is possible by extracting as much knowledge as possible from the extant data.
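The comment's core point, that a Type A (replication-based) uncertainty of a mean can exceed a purely prior-knowledge estimate, can be sketched as follows; the replicate values and the prior claim are invented for illustration:

```python
import statistics

def mean_and_type_a_u(replicates):
    """Type A standard uncertainty of the mean: s / sqrt(n)."""
    n = len(replicates)
    m = statistics.mean(replicates)
    u = statistics.stdev(replicates) / n**0.5   # sample std of the mean
    return m, u

# hypothetical replicate results of a titration quantity
reps = [1.0012, 1.0031, 0.9998, 1.0024, 1.0019]
m, u_a = mean_and_type_a_u(reps)

u_prior = 0.0002          # hypothetical prior-only (Type B) claim
ratio = u_a / u_prior     # > 1 means the prior claim understated the scatter
```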
6.
The measurement of temperatures accompanies almost every determination of physical quantities or material properties. This
paper gives an outline of the concept of the traceability of temperature measurements according to the International Temperature
Scale of 1990 (ITS-90) and the determination of measurement uncertainties. Furthermore, differences between ITS-90 and thermodynamic
temperatures are discussed.
Presented at the PTB Seminar “Conductivity and Salinity”, September 2007, Braunschweig, Germany.
7.
Yu. I. Kuznetsov D. B. Vershok S. F. Timashev A. B. Solov’eva P. I. Misurkin V. A. Timofeeva S. G. Lakeev 《Russian Journal of Electrochemistry》2010,46(10):1155-1166
The formation of anticorrosion magnetite coatings (MCs) on low-carbon steel is studied in alkali-free nitrate converting media at temperatures of 70–98°C, lower than those (130–145°C) used in standard steel-bluing technologies, in which such coatings are formed in alkaline nitrate solutions. Alongside conventional corrosion-electrochemical methods of analyzing the formed MCs, the MC surface reliefs were studied by atomic force microscopy combined with flicker-noise spectroscopy (FNS) for processing the digitized images and extracting parameters of the MC surface structure over different nanometer ranges. It is shown that, to obtain MCs with high corrosion stability, additives of metal nitrates with a small cation radius must be introduced into the ammonium nitrate converting solution at the first stage of MC formation, and the final stage must consist of a further “passivation” of the MCs: treatment with aqueous solutions based on nontoxic carboxylates. According to the FNS analysis of the surface structure of the formed MCs, a significant decrease of the FNS “point” factor, an indicator of MC corrosion instability, occurred during the final treatment. On this basis, the results of accelerated corrosion tests could be characterized quantitatively: no steel corrosion occurred on the coatings thus formed for 42 days under standard severe conditions (100% relative humidity and daily “showering”). The study reveals fundamental possibilities for standardizing anticorrosion coating surfaces based on analysis of their surface profile in the nanometer range.
8.
Ricardo J. N. Bettencourt da Silva Maria Filomena G. F. C. Camões 《Accreditation and quality assurance》2010,15(12):691-704
The dispersion of results from proficiency tests for the analysis of pesticide residues in foodstuffs suggests that improvements
in the compatibility of measurement results are needed. Currently observed divergences can make the evaluation conclusion
on foodstuff compliance with legislation dependent on the consulted laboratory. This work discusses the origin and impact of this lack of compatibility, following the metrological concepts presented in the latest version of the “International Vocabulary of Metrology” (VIM3), thus allowing for a clear diagnosis of the problem. Reporting results from different measurement methods uncorrected for the observed analyte recovery makes them traceable to different “operationally defined measurement procedures” (VIM3) and, therefore, not comparable. When results from different measurement methods are reported corrected for analyte recovery, R, and R differs for spiked and incurred residues, measurement results may not be compatible if this effect is not considered in the uncertainty budget. This discussion is illustrated with metrological models for any possible combination of “measurement
performance” and “maximum residue level”. These models are complemented with experimental data of the analysis of pesticide
residues in a sample of ginseng powder from a proficiency test. The adopted experimental design allowed the identification
of additional threats to metrological compatibility in this field. Solutions to the faced problem are discussed for practicability
and impact on regulatory issues. The use of a universal “reference measurement procedure” proves to be the most feasible way
of ensuring comparability of measurements in this field.
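The recovery correction discussed above, with u(R) folded into the budget, can be sketched like this; all numerical values are invented for illustration:

```python
import math

def recovery_corrected(x_obs, u_x_rel, R, u_R_rel):
    """Correct a result for recovery R and include u(R) in the budget
    (independent relative uncertainties combined in quadrature)."""
    x_corr = x_obs / R
    u_rel = math.sqrt(u_x_rel**2 + u_R_rel**2)
    return x_corr, x_corr * u_rel

# hypothetical: 0.080 mg/kg observed, 80% recovery,
# 5% relative uncertainty on the result, 4% on the recovery
x, u = recovery_corrected(0.080, 0.05, 0.80, 0.04)
```

If R differs between spiked and incurred residues, the u_R_rel term above must also cover that difference, which is the compatibility threat the abstract identifies.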
9.
R. R. Greenberg 《Journal of Radioanalytical and Nuclear Chemistry》2008,278(2):231-240
This paper describes some highlights from the author’s efforts to improve neutron activation analysis (NAA) detection limits
through development and optimization of radiochemical separations, as well as to improve the overall accuracy of NAA measurements
by identifying, quantifying and reducing measurement biases and uncertainties. Efforts to demonstrate the metrological basis
of NAA, and to establish it as a “Primary Method of Measurement”, will be discussed.
10.
Kaj Heydorn 《Accreditation and quality assurance》2006,10(9):479-484
Proficiency data with stated uncertainties represent a unique opportunity for testing whether the reported uncertainties are consistent with the Guide to the expression of uncertainty in measurement (GUM). In most proficiency tests, however, this opportunity is forfeited, because proficiency data are processed without regard to their uncertainties. In this paper we present alternative approaches for determining a reference value as the weighted mean of all mutually consistent results and their stated uncertainties. Using an accepted reference value, each reported uncertainty estimate can be expressed as an E_n number, but a value of |E_n| ≤ 1 confirms its validity only if the uncertainty of the reference value is negligible in comparison. Reference values calculated for results from an International Measurement Evaluation Programme (IMEP-9) by “bottom up” as well as “top down” methods were practically identical, although the first strategy yielded the lowest uncertainty. A plot of individual coefficients of variation (CV) versus E_n numbers helps interpretation of the proficiency data, which could be used to validate relative uncertainties down to <1%.
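The two steps the abstract describes, a weighted-mean reference value and the E_n consistency check, can be sketched as follows. All data values are invented; the sketch assumes 1/u² weights and expanded (k = 2) uncertainties in E_n, which are the conventional choices:

```python
import math

def weighted_mean(values, uncs):
    """Reference value as the uncertainty-weighted mean (weights 1/u^2)."""
    w = [1.0 / u**2 for u in uncs]
    xref = sum(wi * xi for wi, xi in zip(w, values)) / sum(w)
    uref = 1.0 / math.sqrt(sum(w))   # standard uncertainty of the mean
    return xref, uref

def en_number(x, U_x, xref, U_ref):
    """E_n score with expanded uncertainties; |E_n| <= 1 is consistent."""
    return (x - xref) / math.sqrt(U_x**2 + U_ref**2)

# hypothetical proficiency results (value, standard uncertainty)
vals = [10.2, 9.9, 10.05, 10.1]
us = [0.1, 0.2, 0.05, 0.1]
xref, uref = weighted_mean(vals, us)
en = en_number(10.2, 2 * 0.1, xref, 2 * uref)
```

Note the abstract's caveat: if uref is not small compared with a participant's uncertainty, |E_n| ≤ 1 alone does not validate that uncertainty.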
11.
Marc Priel 《Accreditation and quality assurance》2009,14(5):235-241
Since the advent of the Guide to the expression of Uncertainty in Measurement (GUM) in 1995, which laid down the principles of uncertainty evaluation, numerous projects have been carried out to develop alternative practical methods that are easier to implement, namely when it is impossible to model the measurement process for technical or economic reasons. In this paper, the author presents the recent evolution of measurement uncertainty evaluation methods. The evaluation of measurement uncertainty can be organized along two axes, based on intralaboratory and interlaboratory approaches. The intralaboratory approach includes “the modelling approach” (application of the procedure described in section 8 of the GUM, known as the GUM uncertainty framework) and “the single laboratory validation approach”. The interlaboratory approaches are based on collaborative studies and are named, respectively, the “interlaboratory validation approach” and the “proficiency testing approach”.
12.
A study of the performance of different uncertainty evaluation strategies among 163 voluntary respondents from food proficiency
schemes is presented. Strategies included use of: single-laboratory validation data, quality control data, past proficiency
testing data, reproducibility data, a measurement equation and the dispersion of replicate observations on the test material.
Most performed reasonably well, but the dispersion of replicate observations underestimated uncertainty by a factor of approximately
3. Intended compliance with accreditation requirements was associated with significantly improved uncertainty evaluation performance,
while intended compliance with the ISO “Guide to the expression of uncertainty in measurement” had no significant effect.
Substituting estimates based on the Horwitz or Horwitz–Thompson models or on PT target standard deviation for the respondents’
own estimates of uncertainty led to a marked reduction in poor zeta scores and significant improvement in dispersion of zeta
scores.
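The zeta score underlying the comparison above divides the deviation from the assigned value by the combined standard uncertainties, so an understated uncertainty claim inflates the score. A sketch with invented numbers (|zeta| ≤ 2 is the conventional satisfactory limit):

```python
import math

def zeta_score(x, u_x, x_assigned, u_assigned):
    """Zeta score: deviation scaled by combined standard uncertainties."""
    return (x - x_assigned) / math.sqrt(u_x**2 + u_assigned**2)

# hypothetical lab result vs assigned value
z_own = zeta_score(5.3, 0.05, 5.0, 0.10)      # small claimed u -> poor score
z_horwitz = zeta_score(5.3, 0.40, 5.0, 0.10)  # Horwitz-like u -> acceptable
```

This is the effect the study reports: substituting Horwitz-model uncertainties for respondents' own underestimates markedly reduced poor zeta scores.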
13.
Yi Zhang Junkang Hao Changjie Zhou Kai Chang 《Journal of mathematical chemistry》2009,46(4):1203-1212
In this article, we propose a new method to measure DNA similarity based on a normalized Lempel-Ziv complexity scheme. The new method weakens the effect of sequence length on the complexity measurement and saves computation time. First, a DNA sequence is transformed into three (0,1)-sequences using a scheme that considers “A” and “non-A”, “G” and “non-G”, and “C” and “non-C” bases respectively. Then, the normalized Lempel-Ziv complexities of the three (0,1)-sequences constitute a 3D vector. Finally, this 3D vector is used to characterize DNA sequences and to compute a similarity matrix for them. The examination of similarities of two sets of DNA sequences illustrates the utility of the method in local and global similarity analysis.
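A sketch of the pipeline the abstract outlines: indicator sequences for A/G/C, Lempel-Ziv (1976) phrase counting, and a 3D descriptor. The n/log₂(n) normalization is a common convention and an assumption here; the paper's exact normalization may differ:

```python
import math

def lz76_complexity(s):
    """Number of phrases in the Lempel-Ziv (1976) exhaustive parsing."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # extend the current phrase while it already occurs earlier
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

def binary_indicator(dna, base):
    """(0,1)-sequence: 1 where the base matches, 0 elsewhere."""
    return "".join("1" if b == base else "0" for b in dna)

def normalized_lz(s):
    """Normalize by the asymptotic bound n / log2(n) for binary strings."""
    n = len(s)
    return lz76_complexity(s) * math.log2(n) / n

def dna_vector(dna):
    """3D descriptor from the A, G and C indicator sequences."""
    return tuple(normalized_lz(binary_indicator(dna, b)) for b in "AGC")
```

Pairwise Euclidean distances between these 3D vectors then fill the similarity matrix.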
14.
Michael K. Holland Joseph V. Cordaro 《Journal of Radioanalytical and Nuclear Chemistry》2009,282(2):555-563
Minimizing plutonium measurement uncertainty is essential to nuclear material control and international safeguards. In 2005,
the International Organization for Standardization (ISO) published ISO 12183 “Controlled-potential coulometric assay of plutonium,”
2nd edition. ISO 12183:2005 recommends a target of ±0.01% for the mass of original sample in the aliquot because it is a critical
assay variable. Mass measurements in radiological containment were evaluated and their uncertainties estimated. The uncertainty estimate for the mass measurement also includes the uncertainty in correcting for buoyancy effects, from air acting as a fluid and from the decreased density of air heated by the plutonium isotopes.
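The air-buoyancy correction mentioned above follows the conventional formula for a balance calibrated against steel weights; all densities below are assumptions for illustration (steel weights ~8000 kg/m³, a dense plutonium-bearing sample set to 19 800 kg/m³), and the self-heating effect on the air density is not modelled:

```python
def buoyancy_corrected_mass(balance_reading, rho_air=1.2,
                            rho_weights=8000.0, rho_sample=19800.0):
    """Conventional air-buoyancy correction: a sample denser than the
    calibration weights displaces less air, so the reading is scaled
    by (1 - rho_air/rho_weights) / (1 - rho_air/rho_sample)."""
    return (balance_reading * (1 - rho_air / rho_weights)
            / (1 - rho_air / rho_sample))

m = buoyancy_corrected_mass(1.000000)  # grams, illustrative aliquot
```

The correction is on the order of 10⁻⁴ relative, which is why it matters against a ±0.01% mass target.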
15.
Vasil Simeonov Costel Sarbu Desire-Luc Massart Stefan Tsakovski 《Mikrochimica acta》2001,137(3-4):243-248
A data set (48×19) consisting of Danube river water analytical data collected at Galati site, Romania, during a four-year
period has been treated by principal components analysis (PCA). The PCA indicated that seven latent factors (“hardness”, “biochemical”,
“waste inlets”, “turbidity”, “acidity”, “soil extracts” and “organic wastes”) are responsible for the data structure and explain
over 80 % of the total variance of the system. Its complexity is further proved by the application of multiple linear regression
analysis to the absolute principal component scores (APCS), where the contribution of each natural or anthropogenic source to the factor formation is shown. The apportioning makes clear that each variable participates to a different extent in each source; thus, no purely natural or purely anthropogenic influence could be determined. No specific seasonality is found for the variables under consideration.
Received January 24, 2001. Revision July 6, 2001.
16.
Multivariate analysis of HT/GC-(IT)MS chromatographic profiles of triacylglycerol for classification of olive oil varieties
Cristina Ruiz-Samblás Luis Cuadros-Rodríguez Antonio González-Casado Francisco de Paula Rodríguez García Paulina de la Mata-Espinosa Juan Manuel Bosque-Sendra 《Analytical and bioanalytical chemistry》2011,399(6):2093-2103
The ability of multivariate analysis methods such as hierarchical cluster analysis, principal component analysis and partial least squares-discriminant analysis (PLS-DA) to classify olive oils by olive fruit variety from their triacylglycerol profiles has been investigated. The variations in the raw chromatographic data sets of 56 olive oil samples were studied by high-temperature gas chromatography with (ion trap) mass spectrometric detection. The olive oil samples were of four different categories (“extra-virgin olive oil”, “virgin olive oil”, “olive oil” and “olive-pomace oil”) and, for the “extra-virgin” category, comprised six well-identified olive oil varieties (“hojiblanca”, “manzanilla”, “picual”, “cornicabra”, “arbequina” and “frantoio”) plus some blends of unidentified varieties. Moreover, chemometric pre-processing methods (to linearise the response of the variables) such as peak-shifting, baseline correction (weighted least squares) and mean centering made it possible to improve the model and the grouping between different varieties of olive oils. The first three principal components accounted for 79.50% of the information in the original data. The fitted PLS-DA model succeeded in classifying the samples; correct classification rates were assessed by cross-validation.
17.
Total arsenic concentrations in groundwater samples determined by hydride generation quartz furnace atomic absorption spectrometry may be underestimated due to a loss of quartz cell surface “conditioning.” This loss of conditioning results from the analysis of many samples that do not contain measurable quantities of the analyte, and the process is further accelerated by the use of high concentrations of sodium tetrahydroborate and hydrochloric acid. Analyzing the highest calibration standard between samples and using low concentrations of sodium tetrahydroborate and hydrochloric acid can minimize the error arising from this source.
18.
S. M. Robinson E. R. Siciliano J. E. Schweppe 《Journal of Radioanalytical and Nuclear Chemistry》2008,276(2):447-453
Developing and testing improved alarm algorithms is a priority of the Radiation Portal Monitor Project (RPMP) at PNNL. Improved
algorithms may reduce the potential impediments that radiation screening presents to the flow of commerce, without affecting
the detection sensitivity to sources of interest. However, assessing alarm-algorithm performance involves calculation of both
detection probabilities and false alarm rates. For statistical confidence, this requires a large amount of data from drive-through
(or “dynamic”) scenarios both with, and without, sources of interest, but this is usually not feasible. Instead, an “injection-study”
procedure is used to approximate the profiles of drive-through commercial data with sources of interest present. This procedure
adds net-counts from a pre-defined set of simulated sources to raw, gross-count drive-through data randomly selected from
archived RPM data. The accuracy of the procedure — particularly the spatial distribution of the injected counts — has not
been fully examined. This report describes the use of previously constructed and validated MCNP computer models to assess the current injection-study procedure. In particular, it focuses on the functions used to distribute the injected counts throughout the raw drive-through spatial profiles, and it suggests a new class of injection spatial distributions that more closely resemble actual cargo scenarios.
19.
U. Kurfürst U. Buczko C. Kleimeier R. O. Kuchenbuch 《Accreditation and quality assurance》2011,16(2):73-81
On three fields of arable land of (3–6)×10⁴ m², simple reference sampling was performed by taking up to 195 soil increments from each field, applying a systematic sampling strategy. From the analytical data, reference values for 15 elements were established, which should represent the average analyte mass fraction of the areas. A “point selection standard deviation” was estimated, from which a prediction of the sampling uncertainty was calculated for the application of a standard sampling protocol (X-path across the field, 20 increments in total for a composite sample). Predicted mass fractions and associated uncertainties are compared with the results of a collaborative
trial of 18 experienced samplers, who had applied the standard sampling protocol on these fields. In some cases, bias between
reference and collaborative values is found. Most of these biases can be explained by analyte heterogeneity across the area,
in particular on one field, which was found to be highly heterogeneous for most nutrient elements. The sampling uncertainties
estimated from the reference sampling were often somewhat smaller than those from the collaborative trial. It is suspected that the influence of sample preparation and the variation among samplers were responsible for these differences. For the applied sampling protocol, the uncertainty contribution from sampling is generally in the same range as that from analysis. From these findings, some conclusions were drawn, especially about the consequences for a sampling protocol if a demanded “certainty of trueness” for the measurement result is to be met in routine sampling.
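The prediction step described above amounts to scaling the point selection standard deviation by the square root of the number of increments in the composite sample. A minimal sketch with an invented s_ps value:

```python
def sampling_uncertainty(point_selection_sd, n_increments=20):
    """Predicted sampling standard uncertainty of a composite-sample mean
    from the point selection standard deviation: s_ps / sqrt(n)."""
    return point_selection_sd / n_increments**0.5

# hypothetical: s_ps = 12 mg/kg, X-path protocol with 20 increments
u_samp = sampling_uncertainty(12.0)
```

Effects not captured by this scaling, such as sample preparation and sampler-to-sampler variation, are the suspected source of the gap versus the collaborative trial.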
20.
In this work a novel graphical method is applied to the presentation of intercomparison results. This is demonstrated with the results of a recent intercomparison measuring the ¹³⁷Cs, ⁴⁰K, and ⁹⁰Sr activity concentrations in milk powder. The “PomPlot”, an intuitive graphical method, is used to produce a summary overview of the participants’ results for a common measurand. The PomPlot displays (relative) deviations of individual results from the reference value on the horizontal axis and (relative) uncertainties on the vertical axis.
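Following the abstract's description of the axes, the coordinates of one PomPlot point can be sketched as follows; the activity values are invented, and combining the two uncertainties in quadrature for the vertical axis is an assumption of this sketch:

```python
def pomplot_point(x, u_x, x_ref, u_ref):
    """PomPlot coordinates: relative deviation from the reference value
    (horizontal axis) vs relative combined uncertainty (vertical axis)."""
    rel_dev = (x - x_ref) / x_ref
    rel_unc = (u_x**2 + u_ref**2) ** 0.5 / x_ref
    return rel_dev, rel_unc

# hypothetical 137Cs result vs reference activity concentration (Bq/kg)
d, u = pomplot_point(105.0, 3.0, 100.0, 2.0)
```

Plotting every participant this way yields the one-glance summary overview the abstract describes.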