Similar Literature
20 similar documents found (search time: 31 ms)
1.
Measurement uncertainty of thermodynamic data
Thermodynamic quantities of chemical reactions are commonly derived from experimental data obtained by chemical analysis. The accuracy of the evaluated thermodynamic quantities is limited by the measurement uncertainty of the analytical techniques applied. Straightforward transfer of metrological rules established for determination of single analytes to the more complex process of evaluating values of thermodynamic quantities is not possible. Computer-intensive statistical methods and Monte Carlo techniques are shown to enable integration of existing metrological concepts. An initial stage of the integration of both concepts is presented, taking solubility data for Am(III) in carbonate media as an illustrative example. A cause and effect diagram is created as a means of identification of sources of uncertainty. The uncertainties are used in a resampling-based Monte Carlo study to produce a probability distribution of the value of a quantity.
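The resampling idea can be sketched numerically as follows; the derived quantity, input values and uncertainties are invented for illustration and are not the paper's Am(III) data:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical inputs: two measured concentrations (mol/L) with standard
# uncertainties taken from a cause-and-effect analysis (illustrative numbers).
c_metal = rng.normal(loc=2.0e-6, scale=0.1e-6, size=100_000)
c_ligand = rng.normal(loc=1.0e-3, scale=0.05e-3, size=100_000)

# Derived thermodynamic quantity (illustrative functional form only):
# a conditional constant log10 K = log10(c_metal / c_ligand).
log_k = np.log10(c_metal / c_ligand)

# The resampled values form an empirical probability distribution of the
# quantity; its spread is the combined measurement uncertainty.
print(f"log10 K = {log_k.mean():.3f} +/- {log_k.std(ddof=1):.3f}")
```

The same scheme extends to any functional relationship between the measured analytical data and the thermodynamic quantity, however non-linear.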

2.
Calibration is an operation whose main objective is to establish the metrological status of a measurement system. In the analytical sciences, however, calibration has special connotations, since it is the basis for quantifying the amount of one or more components (analytes) in a sample, or for obtaining the value of one or more analytical parameters related to that quantity. In this context, the aim of analytical calibration is to find an empirical relationship, called the measurement function, which subsequently permits the amount of a substance in a sample (x-variable) to be calculated from the values of an analytical signal measured on it (y-variable). In this paper, the metrological bases of analytical calibration and quantification are established, and the different work schemes and calibration methodologies that can be applied, depending on the characteristics of the sample (analyte + matrix) to be analysed, are distinguished and discussed. Likewise, the different terms and related names are clarified. Special attention is paid to analytical methods that use separation techniques, in relation to their effect on calibration operations and subsequent analytical quantification.
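A minimal sketch of the measurement-function idea, with hypothetical standards: fit y = a + bx, then invert the function to quantify a sample:

```python
import numpy as np

# Hypothetical external-standard calibration: signal y measured for
# standards of known amount x (interpolative working range).
x_std = np.array([0.0, 1.0, 2.0, 4.0, 8.0])       # e.g. mg/L
y_std = np.array([0.02, 1.05, 1.98, 4.03, 7.99])  # instrument signal

# Fit the measurement function y = a + b*x by ordinary least squares.
b, a = np.polyfit(x_std, y_std, deg=1)

# Quantification inverts the measurement function: x = (y - a) / b.
y_sample = 3.00
x_sample = (y_sample - a) / b

print(f"slope={b:.4f}, intercept={a:.4f}, x_sample={x_sample:.3f}")
```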

3.
The possibility in principle of extending metrological concepts to the characteristics of complex objects, for which the primary information about the state or structure is presented in the form of complex chaotic dependences and cannot be expressed using standard metrological images such as directly measured time, length and other dimensional quantities, is shown. To characterize correctly the dynamic state of such complex objects, including states during nonstationary evolution, or the special features of structures formed under external actions of various intensities, it is necessary first to introduce autocorrelation dependences averaged over time or spatial intervals on the basis of measured dynamic variables, and then to use these dependences to find sets of information parameters, which can be presented as metrological characteristics of the dynamic state under study or of the spatial image to be analyzed. The phenomenological basis of the corresponding analysis is provided by flicker-noise spectroscopy, with its possibilities of developing procedures and algorithms that can be used to obtain metrological characteristics over various frequency (time and spatial) ranges of the signals analyzed. On this basis, unity of metrological measurements with a determined uncertainty (error) can be achieved, standards and reference samples of fluctuation metrology can be created, and methods for the transfer of standard parameters from standards to reference samples and then to working measurement instruments can be developed. This opens up the possibility of solving many practical problems of microelectronics, power engineering, the nanoindustry and chemical technology, including standardization of the state of complex systems and of articles of various functional purposes, and of the quality of the products created.
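As a toy illustration of the first step, a time-averaged autocorrelation can be computed from a measured dynamic variable and reduced to an information parameter. The signal below is synthetic, and the correlation-time estimate is just one possible parameter, not a procedure prescribed by the paper:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical chaotic-looking dynamic variable sampled at n points.
n = 4096
t = np.arange(n)
signal = np.sin(2 * np.pi * t / 64) + 0.5 * rng.standard_normal(n)

def autocorr(x, max_lag):
    """Time-averaged autocorrelation psi(tau) of a measured variable."""
    x = x - x.mean()
    m = len(x)
    return np.array([np.mean(x[:m - lag] * x[lag:]) for lag in range(max_lag)])

psi = autocorr(signal, max_lag=256)
psi_norm = psi / psi[0]          # normalized so psi_norm[0] == 1

# A simple information parameter: the first lag at which the normalized
# autocorrelation decays below 1/2 (a correlation-time estimate).
tau_half = int(np.argmax(psi_norm < 0.5))
print(f"correlation-time estimate: {tau_half} samples")
```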

4.
The suitability of certain technical specifications and test methods in the verification regulation for non-dispersive atomic fluorescence spectrometers is examined with respect to new high-sensitivity, intelligent instruments, and amendments and supplements are proposed. Practical metrological verification has shown these proposals to be effective: they resolve cases encountered in practice where the prescribed tests could not be carried out, or where the verification specifications did not match the actual testing requirements, thereby ensuring the accuracy and impartiality of the analytical results obtained with verified instruments.

5.
Data analysis is an essential tenet of analytical chemistry, extending the possible information obtained from the measurement of chemical phenomena. Chemometric methods have grown considerably in recent years, but their wide use is hindered because some still consider them too complicated. The purpose of this review is to describe a multivariate chemometric method, principal component regression, in a simple manner from the point of view of an analytical chemist, to demonstrate the need for proper quality-control (QC) measures in multivariate analysis and to advocate the use of residuals as a proper QC method.
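A compact numpy-only sketch of principal component regression with a residual-based QC statistic, on simulated two-component spectra. The data, the number of components and the 3-sigma limit on the Q statistic are illustrative choices, not the review's prescription:

```python
import numpy as np

rng = np.random.default_rng(seed=3)

# Hypothetical spectra: 30 calibration samples x 50 wavelengths,
# generated from 2 latent components plus noise.
n_samples, n_vars, n_pc = 30, 50, 2
scores_true = rng.random((n_samples, n_pc))
loadings_true = rng.standard_normal((n_pc, n_vars))
X = scores_true @ loadings_true + 0.01 * rng.standard_normal((n_samples, n_vars))
y = scores_true @ np.array([1.0, 2.0])   # property to calibrate against

# --- Principal component regression (PCR) ---
X_mean = X.mean(axis=0)
Xc = X - X_mean
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
T = Xc @ Vt[:n_pc].T                     # scores on the first n_pc PCs
coef, *_ = np.linalg.lstsq(T, y - y.mean(), rcond=None)

# --- Residual-based quality control ---
# Spectral residuals (what the PC model fails to explain) flag samples
# that do not belong to the calibration population.
X_hat = T @ Vt[:n_pc]
q_residuals = np.sum((Xc - X_hat) ** 2, axis=1)   # Q statistic per sample
q_limit = q_residuals.mean() + 3 * q_residuals.std(ddof=1)

y_pred = T @ coef + y.mean()
rmsec = np.sqrt(np.mean((y - y_pred) ** 2))
print(f"RMSEC={rmsec:.4f}, Q-limit={q_limit:.4f}")
```

A new sample whose Q residual exceeds `q_limit` should not be quantified with this model, which is the QC role the review advocates for residuals.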

6.
This work addresses a metrological approach for the assessment of Se status in humans in terms of serum selenomethionine (SeMet). The quantification of SeMet was carried out using a primary method of chemical analysis, namely species-specific isotope dilution (SSID) in combination with HPLC coupled to collision/reaction cell inductively coupled plasma-mass spectrometry. SeMet was released from serum selenoalbumin (a seleno-containing protein into which SeMet is randomly incorporated) by enzymatic hydrolysis of the whole serum. This study is a follow-up of the analytical method development reported previously, and it focuses primarily on the evaluation of the uncertainty budget and the main uncertainty sources for SeMet determination in three commercial sera, namely BCR-637 (certified for total Se) and two serum standards, SERONORM level 1 (SERO-L1) and 2 (SERO-L2) (with indicative concentrations of total Se). The metrological approach reported here can be considered a pilot study in the metrological determination of SeMet in human serum, and hence is suitable for method validation and inter-laboratory comparison.
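The calculation at the core of a single-dilution isotope-dilution measurement can be sketched as below. All abundances, masses and the measured blend ratio are invented for illustration, and corrections such as molar-mass factors, mass bias and blanks, which a real uncertainty budget must include, are omitted:

```python
# Hypothetical single-dilution IDMS calculation for Se; illustrative numbers only.

def idms_concentration(c_spike, m_spike, m_sample,
                       a1_spike, a2_spike, a1_sample, a2_sample, r_blend):
    """Isotope-dilution equation: analyte content of the sample.

    r_blend is the measured isotope-amount ratio (isotope 1 / isotope 2)
    in the sample-spike blend; a1/a2 are isotopic abundances.
    Molar-mass and mass-bias corrections are deliberately left out.
    """
    return (c_spike * (m_spike / m_sample)
            * (a1_spike - r_blend * a2_spike)
            / (r_blend * a2_sample - a1_sample))

c = idms_concentration(
    c_spike=50.0,      # ng/g, enriched-77Se spike (illustrative)
    m_spike=0.50,      # g of spike added
    m_sample=1.00,     # g of serum
    a1_spike=0.92, a2_spike=0.01,      # 77Se / 78Se abundances in the spike
    a1_sample=0.076, a2_sample=0.236,  # natural 77Se / 78Se abundances
    r_blend=1.20,      # measured 77Se/78Se ratio in the blend
)
print(f"analyte (as Se) in sample: {c:.2f} ng/g")
```

Because the result depends only on a measured ratio and gravimetric quantities, each input's uncertainty can be propagated term by term, which is what makes SSID attractive as a primary method.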

7.
The admissibility of nuclear forensics measurements, and of opinions derived from them, in US Federal and State courts is based on criteria established by the US Supreme Court in Daubert v. Merrell Dow and by the 2000 Amendment of Rule 702 of the Federal Rules of Evidence. These criteria are being addressed by new efforts that include the development of certified reference materials (CRMs) to provide the basis for analytical method development, optimization, calibration, validation, quality control, testing, readiness, and declaration of measurement uncertainties. Quality data are crucial at all stages of the program, from R&D and database development to actual casework: weakness at any point can propagate and reduce the confidence of the final conclusions. The new certified reference materials will provide the means to demonstrate a high level of metrological rigor for nuclear forensics evidence and will form a foundation for legally defensible nuclear chemical analysis. The CRMs will allow scientists to devise validated analytical methods that can be corroborated by independent analytical laboratories. CRMs are also required for ISO accreditation of many of the analytical techniques that may be employed in the analysis of interdicted nuclear materials.

8.
An industrial production process is considered as divided into a working process and an information process. The starting point of the information process is the sampling of a material flow; the meaning of "representative" samples and questions of continuous versus non-continuous sampling methods are discussed. The signals produced by analysis are arranged according to statistical methods, after which the information content is calculated using the rules of information theory. Transmission of this information content within a certain time yields an information flux. The information flux actually required by the working process is compared with the information flux provided by analysis, and the two are brought into equilibrium, leading to considerations of "necessary and sufficient" analytical effort. The aim of analytical information processes is to reduce the entropy of the working process to a minimum.
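The information-content and information-flux calculations described above can be sketched as follows, with an invented stream of discretised analytical signals and an assumed sampling interval:

```python
import math
from collections import Counter

# Hypothetical stream of discretised analytical signals (e.g. a quality
# attribute binned into classes by the analyser).
signals = ["ok", "ok", "high", "ok", "low", "ok", "ok", "high", "ok", "ok"]

# Shannon entropy of the signal distribution: the average information
# content per analysis, in bits.
counts = Counter(signals)
n = len(signals)
entropy_bits = -sum((c / n) * math.log2(c / n) for c in counts.values())

# Information flux: information content transmitted per unit time,
# here assuming one analysis every 30 s (illustrative rate).
analyses_per_hour = 3600 / 30
info_flux = entropy_bits * analyses_per_hour   # bits per hour

print(f"entropy = {entropy_bits:.3f} bit/analysis, flux = {info_flux:.1f} bit/h")
```

Comparing this provided flux with the flux the working process actually needs is the equilibrium consideration the abstract describes.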

9.
The stakes in characterising particles in the size range from 1 to 1000 nm, namely submicron particles, are growing ever higher. Because of the variety of particles, even within a given sample, in terms of dimension, mass, charge or chemical composition, a characterisation as complete as possible is needed, so the possibility of obtaining multidimensional information by relevant analytical methods is of the greatest interest. One very promising strategy is the use of hyphenated techniques, which are intrinsically capable of providing such information rapidly and accurately. This paper summarises the different hyphenated techniques that can be used to characterise submicron particles and focuses on their main applications to illustrate their current and potential uses. To give a relevant overview, various on-line separation techniques are considered comparatively, and various on-line detectors are then presented. Finally, the concepts of multidetection and multidimensional analysis are discussed, and their interest is shown through typical examples of hyphenated techniques applied to submicron particle characterisation in fields such as environmental and nanomaterial sciences.

10.
The role of the human being as part of a measuring system in a chemical analytical laboratory is discussed. It is argued that a measuring system in chemical analysis includes not only measuring instruments and other devices, reagents and supplies, but also the sampling inspector and/or analyst performing a number of important operations; without this human contribution, a measurement cannot be carried out. Human errors therefore influence the measurement result, i.e., the measurand estimate and the associated uncertainty. Consequently, the chemical analytical and metrological communities should devote more attention to the topic of human error, in particular in the design and development of a chemical analytical/test method and measurement procedure. Mapping human errors ought also to be included in the validation program of a measurement procedure (method). It is important to teach specialists in analytical chemistry, and students, how to reduce human errors in a chemical analytical laboratory and how to take the residual risk of error into account. Human errors and their metrological implications are suggested for consideration in future editions of the relevant documents, such as the International Vocabulary of Metrology (VIM) and the Guide to the Expression of Uncertainty in Measurement (GUM).

11.
Chemometrics is the application of statistical and mathematical methods to chemical problems to permit the maximal collection and extraction of useful information. The development of advanced chemical instruments and processes has created a need for advanced methods to design experiments, calibrate instruments, and analyze the resulting data. For many years the prevailing view was that if one needed fancy data analysis, the experiment had not been planned correctly; it is now recognized that most systems are multivariate in nature and that univariate approaches are unlikely to yield optimum solutions. As instruments have grown in complexity, computational capability has advanced in parallel, making it possible to develop and employ increasingly complex and computationally intensive methods. In this paper, the development of chemometrics as a subfield of chemistry, and of analytical chemistry in particular, is presented, with a view of the current state of the art and the prospects for the future.

12.
An original focus on univariate calibration as an experimental process of quantitative analysis is presented. A novel classification system is introduced against the background of the present nomenclature of calibration methods. In particular, it is shown that four methods well known in analytical chemistry (the conventional method, the internal standard method, the indirect method and the dilution method) can each be carried out in either an interpolative or an extrapolative mode. It is then shown that the basic procedures of all these methods can be modified by different approaches, such as the matrix-matched technique, spiking the sample with a reactant, bracketing calibration, and others. For the first time (as compared with monographs on univariate calibration) it is reviewed how these methods can be mixed and integrated with one another, creating new calibration strategies with extended capabilities in terms of resistance to interference and non-linear effects, the main sources of systematic calibration errors. As an additional novelty, rationally possible combinations of the calibration methods not hitherto met in the literature are predicted. Finally, some general rules relating to calibration are formulated and the main calibration problems that still need to be solved are outlined.
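The interpolative and extrapolative modes can be contrasted with hypothetical data: conventional external-standard calibration interpolates the sample signal on a standards curve, while the standard-addition approach spikes the sample itself and extrapolates to the x-intercept, which gives it resistance to multiplicative matrix interference:

```python
import numpy as np

# Conventional (external-standard) calibration: interpolative mode.
x_std = np.array([0.0, 2.0, 4.0, 6.0])
y_std = np.array([0.01, 2.04, 3.98, 6.02])
b, a = np.polyfit(x_std, y_std, 1)
y_sample = 3.10
x_conventional = (y_sample - a) / b            # interpolation

# Standard-addition calibration: extrapolative mode. The analyte is
# added to the sample itself, so the matrix is present in every point,
# and the line is extrapolated to zero signal.
x_added = np.array([0.0, 1.0, 2.0, 3.0])       # analyte added to sample
y_spiked = np.array([3.10, 4.12, 5.09, 6.11])
b2, a2 = np.polyfit(x_added, y_spiked, 1)
x_standard_addition = a2 / b2                  # |x-intercept| by extrapolation

print(f"interpolative: {x_conventional:.2f}, extrapolative: {x_standard_addition:.2f}")
```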

13.
The optimal formation conditions have been found, and the chemical analytical characteristics determined, for ethylenediaminetetraacetates of chromium(III). Techniques have been developed for the photometric determination of large amounts of chromium in chromium ore, alloyed steel, ferrochromium, and chromium(III) oxide. Their metrological characteristics are not inferior to those of conventional methods of analysis, while surpassing them in throughput.

14.
The expanding application of activation analysis in industry has brought to light demands for higher productivity, reliability, level of automation, and metrological support of analytical methods and equipment. Based on neutron generators, radioisotope neutron sources, nuclear reactors and electron accelerators as activating radiation sources, high-productivity activation analytical systems have been constructed for use directly in analytical laboratories and plants. The level of development of this work makes it possible to conclude that industrial activation analysis has formed an independent branch of nuclear analytics with considerable prospects.

15.
This article summarizes the most important and current methods for the determination of total organic carbon (TOC) in solid samples. The methods presented are chemical oxidation, combustion, acid treatment and ashing. We discuss the advantages of the different methods as well as problems that can be encountered during analysis. We give additional information on the detection techniques used for TOC determination and on the topic of volatile organic carbon (VOC) that deserves particular attention when carrying out TOC determinations, in particular with solid samples. We have compiled a concise survey of the literature covering 60 reports and tabulated the relevant analytical parameters.

16.
Chemometrics is the application of statistical and mathematical methods to analytical data to permit the maximum collection and extraction of useful information. The utility of chemometric techniques as tools enabling multivariate calibration of selected spectroscopic, electrochemical, and chromatographic methods is demonstrated. Applied mainly to the interpretation of UV-Vis and near-IR (NIR) spectra, as well as to data obtained by other instrumental methods, this approach makes possible the identification and quantitative analysis of active substances in complex mixtures, especially in the analysis of pharmaceutical preparations on the market. Such analytical work relies on advanced chemical instruments and data processing, which has created a need for advanced methods to design experiments, calibrate instruments, and analyze the resulting data. The purpose of this review is to describe various chemometric methods in combination with UV-Vis spectrophotometry, NIR spectroscopy, fluorescence spectroscopy, electroanalysis, chromatographic separation, and flow-injection analysis for the analysis of drugs in pharmaceutical preparations. Theoretical and practical aspects are described with pharmaceutical examples of chemometric applications. The review concentrates on how chemometrics can be useful in the modern analytical laboratory: a selection of the most challenging problems faced in pharmaceutical analysis is presented, the potential of chemometrics is considered, and some consequent implications for its utilization are discussed. The reader can refer to the citations wherever appropriate.

17.
In this paper, 15 years of experience in teaching chemical metrology in Latin America are presented, covering postgraduate and undergraduate activities developed in eight countries. The combination of theoretical and practical activities, and a learning sequence leading from metrological, statistical, and chemometric backgrounds up to practical work on personal computers, are fundamental and motivate the learning process. Care is taken to promote a metrological approach and way of thinking in analytical chemistry. Learning computing techniques plays an important role, combining graphical and numerical techniques for data analysis, and the role of worked examples in the teaching process is analyzed and recognized. The introduction of a general model of errors permits different topics to be approached on a common metrological basis, and the treatment of uncertainty is developed from the theory of errors. Undergraduate students acquire basic metrological knowledge; other experiences are also presented, and recommendations for undergraduate and postgraduate programs are pointed out.

18.
Balabin RM, Smirnov SV. The Analyst, 2012, 137(7): 1604-1610.
Modern analytical chemistry of industrial products is in need of rapid, robust, and cheap analytical methods to continuously monitor product quality parameters. For this reason, spectroscopic methods are often used to control the quality of industrial products in an on-line/in-line regime. Vibrational spectroscopy, including mid-infrared (MIR), Raman, and near-infrared (NIR), is one of the best ways to obtain information about the chemical structures and the quality coefficients of multicomponent mixtures. Together with chemometric algorithms and multivariate data analysis (MDA) methods, which were especially created for the analysis of complicated, noisy, and overlapping signals, NIR spectroscopy shows great results in terms of its accuracy, including classical prediction error, RMSEP. However, it is unclear whether the combined NIR + MDA methods are capable of dealing with much more complex interpolation or extrapolation problems that are inevitably present in real-world applications. In the current study, we try to make a rather general comparison of linear, such as partial least squares or projection to latent structures (PLS); "quasi-nonlinear", such as the polynomial version of PLS (Poly-PLS); and intrinsically non-linear, such as artificial neural networks (ANNs), support vector regression (SVR), and least-squares support vector machines (LS-SVM/LSSVM), regression methods in terms of their robustness. As a measure of robustness, we will try to estimate their accuracy when solving interpolation and extrapolation problems. Petroleum and biofuel (biodiesel) systems were chosen as representative examples of real-world samples. Six very different chemical systems that differed in complexity, composition, structure, and properties were studied; these systems were gasoline, ethanol-gasoline biofuel, diesel fuel, aromatic solutions of petroleum macromolecules, petroleum resins in benzene, and biodiesel. Eighteen different sample sets were used in total. 
General conclusions are made about the applicability of ANN- and SVM-based regression tools in modern analytical chemistry. The effectiveness of the different multivariate algorithms changes when going from classical accuracy to robustness: neural networks, which are capable of producing very accurate results with respect to classical RMSEP, are not able to solve interpolation problems or, especially, extrapolation problems, whereas the chemometric methods based on the support vector machine (SVM) ideology are capable of solving both classical regression and interpolation/extrapolation tasks.
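Why extrapolation is the hard case for flexible models can be sketched with synthetic data: a linear calibration follows the underlying trend outside the training range, while a purely local non-linear model collapses there. The tiny RBF kernel-ridge regressor below merely stands in for ANN/SVR-style local models and is not one of the authors' exact methods:

```python
import numpy as np

rng = np.random.default_rng(seed=7)

# Hypothetical property that is (nearly) linear in the measured variable.
x_train = np.linspace(0.0, 1.0, 40)
y_train = 2.0 * x_train + 0.3 + 0.02 * rng.standard_normal(40)

# Linear calibration: ordinary least squares.
b, a = np.polyfit(x_train, y_train, 1)

# Minimal RBF kernel-ridge regressor: predictions are weighted sums of
# kernels centred on the training points, so they vanish far from them.
def rbf_kernel(u, v, gamma=50.0):
    return np.exp(-gamma * (u[:, None] - v[None, :]) ** 2)

K = rbf_kernel(x_train, x_train)
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(x_train)), y_train)

# Extrapolation test point well outside the calibration range.
x_new = np.array([1.5])
y_true = 2.0 * 1.5 + 0.3
y_linear = b * 1.5 + a
y_kernel = (rbf_kernel(x_new, x_train) @ alpha).item()

print(f"true={y_true:.2f}, linear={y_linear:.2f}, kernel={y_kernel:.2f}")
```

Inside the training range both models agree; outside it the kernel prediction decays toward zero because no training point contributes, which is the extrapolation failure mode the study probes.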

19.
The mass fraction of potassium hydrogen phthalate (KHP) from a specific batch was certified as an acidimetric standard. Two different analytical methods at the metrological level were used to carry out the certification analysis: precision constant-current coulometry and volumetric titration with NaOH. It could be shown that a commercial automatic titration system, combined with reliable software for end-point detection, can produce results equivalent in accuracy to those of a definitive method performed on a fundamental apparatus for traceable precision coulometry. The prerequisites for the titrations are that a large number of single measurements be made and that they be calibrated against a high-precision certified reference material.
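The volumetric side of such a certification reduces to a simple stoichiometric calculation per replicate; the numbers below are illustrative, not the certified values:

```python
# Hypothetical volumetric determination of KHP purity (mass fraction) by
# titration with standardized NaOH; illustrative numbers only.
M_KHP = 204.22     # g/mol, molar mass of potassium hydrogen phthalate

def khp_mass_fraction(c_naoh, v_eq_ml, m_sample_g):
    """Mass fraction of KHP from a 1:1 acid-base titration.

    n(KHP) = n(NaOH) at the end point, so
    w = c_NaOH * V_eq * M_KHP / m_sample.
    """
    n_naoh = c_naoh * v_eq_ml / 1000.0   # mol of NaOH at the end point
    return n_naoh * M_KHP / m_sample_g

# One replicate: 0.4086 g of sample consumes 20.02 mL of 0.09995 mol/L NaOH.
w = khp_mass_fraction(c_naoh=0.09995, v_eq_ml=20.02, m_sample_g=0.4086)
print(f"mass fraction: {100 * w:.3f} %")
```

Certification then averages many such replicates and propagates the uncertainties of the NaOH concentration, the end-point volume and the weighing.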

20.
Detection limit, reporting limit and limit of quantitation are parameters that describe the power of analytical methods. They are used internally for quality assurance and externally in competition, especially in trace analysis of environmental compartments. The wide variety of ways to compute or obtain these measures in the literature and in legislative rules makes any comparison difficult; in addition, a host of terms has been used within the analytical community to describe detection and quantitation capabilities. Without trying to impose order on this variety of terms, this paper aims to provide a practical proposal for answering the analyst's main questions concerning the quality measures above. These questions and the related parameters are explained and demonstrated graphically. Estimation and verification of these parameters are the two steps needed to obtain real measures; a rule for practical verification is given in a table from which the analyst can read what to measure, what to estimate and which criteria have to be fulfilled. Verified in this manner, the parameters detection limit, reporting limit and limit of quantitation become comparable, and the analyst is responsible for the unambiguity and reliability of these measures.
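One common convention, among the many the paper contrasts, derives these limits from a low-level calibration line. The data and the 3.3/10 factors below are an illustrative choice, not the paper's unique prescription:

```python
import numpy as np

# Hypothetical trace-level calibration data. Convention used here:
# LOD = 3.3*s0/b and LOQ = 10*s0/b, where s0 is the residual standard
# deviation of the low-level calibration line and b its slope. Other
# definitions in the literature yield different numbers.
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])    # concentration, ug/L
y = np.array([0.003, 0.052, 0.101, 0.148, 0.202, 0.251])

b, a = np.polyfit(x, y, 1)
residuals = y - (b * x + a)
s0 = np.sqrt(np.sum(residuals ** 2) / (len(x) - 2))   # residual std dev

lod = 3.3 * s0 / b
loq = 10.0 * s0 / b
print(f"LOD = {lod:.3f} ug/L, LOQ = {loq:.3f} ug/L")
```

Verification, the second step the paper insists on, would then require measuring spiked samples near these estimated limits and checking the stated criteria.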
