Similar articles
 20 similar articles found (search time: 31 ms)
1.
The production of reference materials for environmental analysis started at the Joint Research Centre in Ispra, Italy, in 1972 with the objective of later certification by the BCR. For obvious budget reasons, only a fraction of the total production achieved at Ispra ever reached certification level, although all materials were produced according to the strict quality requirements specified for certified reference materials. The materials not destined for certification are therefore in growing demand as inter-laboratory test materials and as laboratory reference materials for internal quality control, e.g., by control charts. The history of reference material production within the Joint Research Centre is briefly reviewed and the latest additions are described. New developments are presented and the state of the art is discussed: micro-scale reference materials, intended for analytical methods requiring sample intakes at the milligram or sub-milligram level and therefore not supplied on the reference material market, and "wet" environmental reference materials, which match "real-world" environmental analysis conditions more closely.

2.
A data reduction system for the routine instrumental activation analysis of samples is described, with particular emphasis on interactive graphics capabilities for evaluating analytical quality. Graphics procedures have been developed to interactively control the analysis of selected photopeaks during spectral analysis and to evaluate detector performance during a given counting cycle. Graphics algorithms are also used to compare data on reference samples with accepted values, to prepare quality control charts for evaluating long-term precision, and to search for systematic variations in reference-sample data as a function of time.

3.
The roles of the analytical machine and of computer control in automated instrumental analysis are discussed, and the essential role of true and well-adapted reference materials is stressed. The application of control charts in statistical process control (SPC) neglects the analytical potential contained in trends. As an improvement, on-line calibration is suggested, employing a set of high-quality reference samples in a moving manner, with a new regression after each analysis of one of the members of the set. The number of members of the set and the sequence of their analysis determine the corrective force of the system, and weighted regression provides for large concentration ranges. The performance of the correction system was tested by simulation, and results of its application to OES and XRF in practice over long periods of time are communicated. The analysis of steel and of oxide matter in the form of iron ore sinter and blast-furnace slag resulted at all times in negligible bias between the nominal and observed values of the set of reference samples. Residual standard deviations near or equal to the repeatabilities of the methods were observed.
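The moving-calibration idea described in this abstract can be sketched as a weighted linear fit refitted from a rolling window of reference-sample analyses. A minimal Python illustration; the window size, the weighting scheme, and the data are assumptions made for the example, not parameters from the paper:

```python
import numpy as np

def moving_calibration(ref_nominal, ref_measured, window=8):
    """Refit a linear calibration y = a + b*x from the most recent
    reference-sample analyses (rolling window), weighting newer
    results more heavily so the fit tracks instrumental drift."""
    x = np.asarray(ref_nominal[-window:], dtype=float)
    y = np.asarray(ref_measured[-window:], dtype=float)
    w = np.linspace(0.5, 1.0, len(x))          # newer points weigh more (illustrative choice)
    b, a = np.polyfit(x, y, 1, w=np.sqrt(w))   # polyfit applies w to the unsquared residuals
    return a, b

# Correct an unknown sample reading with the current calibration:
nominal  = [10.0, 20.0, 30.0, 40.0, 10.0, 20.0, 30.0, 40.0]
measured = [10.4, 20.5, 30.9, 41.2, 10.5, 20.6, 30.8, 41.1]
a, b = moving_calibration(nominal, measured)
corrected = (35.8 - a) / b    # invert y = a + b*x for an unknown sample
```

Each new reference analysis simply gets appended to the lists before the next call, so the calibration follows the drift instead of being fixed once per shift.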

4.
Most efforts in quality control have been focused on the reduction of intralaboratory variation and the assessment of interlaboratory variation. Over the last few years, the importance of bias in interlaboratory variation and of intralaboratory shifts has become clear. Small shifts can sometimes have a large impact on the number of treated patients, particularly in assays where cut-off values are used, for example cholesterol, HDL-cholesterol, HbA1c and TSH assays. There is an obvious need for adequate calibration material. However, the development of international primary reference materials and reference methods takes time, and even where reference materials exist and are used by in vitro diagnostics manufacturers, significant and clinically relevant interlaboratory variance and intralaboratory shifts remain, as is seen, e.g., in protein chemistry. The harmonization of interlaboratory and intralaboratory results needs an impulse from professional organizations to convince individual laboratories of the importance and significance of bias. This applies to all subdisciplines of laboratory medicine. On the occasion of the 25th anniversary of the Foundation for External Quality Assessment (SKZL), a large interdisciplinary harmonization project called Calibration 2000 was launched in The Netherlands. The strategy and first results are reported in this paper. The project aims at the harmonization of laboratory data across several disciplines, using secondary calibration materials, leading to common reference ranges throughout The Netherlands. Received: 15 April 2000 · Accepted: 15 April 2000

5.
Three analytical methods for determination of uranium in environmental samples by a fluorescence technique have been validated and compared in accordance with the Eurachem Guide on method validation. The first method depends on uranium separation from iron using the mercury anode technique; in the other two methods uranium is separated from iron on an anion exchange column by use of either a solution of hydrochloric acid containing ascorbic acid and hydrazine hydrate or a dilute sulfuric acid solution. Detection limits, repeatability, reproducibility, and recovery coefficient were the main validation characteristics. The results showed that better statistical values can be achieved by using the third method. Control charts for in-house control samples and international intercomparison samples have also shown that the third method is more statistically stable with time. In addition, uncertainties of measurement were estimated and compared for the three methods. It was found that the Eurachem Guide and comparison of quality statistical validation data can be good tools for selection of the appropriate method for an application.

6.
An industrial production process can be considered as divided into a working process and an information process. The starting point of the information process is the sampling of a material flow. The meaning of "representative" samples, as well as questions of continuous and non-continuous sampling methods, is discussed. The signals produced by analysis are arranged according to statistical methods, and the information content is then calculated using the rules of information theory. Transmitting this information content within a certain time yields an information flux. The information flux actually required by the working process is compared with the information flux provided by analysis; the two are brought into equilibrium, followed by considerations on "necessary and sufficient" analytical actions. The aim of analytical information processes is to reduce the entropy of the working process to a minimum.
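The information content mentioned above is conventionally quantified with Shannon's formula, H = −Σ pᵢ log₂ pᵢ. A minimal sketch; the discretization of analysis results into a fixed number of concentration classes is an illustrative assumption, not the paper's procedure:

```python
import math
from collections import Counter

def information_content(results, n_classes=5):
    """Shannon entropy H = -sum(p_i * log2 p_i), in bits, of analysis
    results binned into concentration classes: the average information
    delivered per analysis."""
    lo, hi = min(results), max(results)
    width = (hi - lo) / n_classes or 1.0          # degenerate case: all values equal
    bins = Counter(min(int((v - lo) / width), n_classes - 1) for v in results)
    n = len(results)
    return -sum((c / n) * math.log2(c / n) for c in bins.values())

# A perfectly steady material flow delivers 0 bits per analysis; results spread
# evenly over the classes deliver up to log2(n_classes) bits per analysis.
```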

7.
Quality assurance in analytical measurement
The peculiarities of analytical measurement require checking the characteristics of the error (and of its components) in the obtained analysis results in order to assure the quality of the measurements. This article deals with the various quality assurance procedures and algorithms used to check the quality indices, i.e. the accuracy, reproducibility, certainty and repeatability of analytical measurements. These procedures include rapid laboratory control; intra-laboratory statistical control (statistical selection control by alternative attribute, statistical selection control by the quantity method, and periodic checks of the analysis procedure for conformity to the specified requirements); and external control of the individual laboratory (inter-laboratory control checks, inter-laboratory comparison tests, and intra-laboratory control algorithms carried out by the appropriate supervisory body). The respective algorithms, control plans and control requirements, specified according to the different control aims and assurance tasks, enable the quality and certainty of the analytical information obtained in laboratories in Russia to be assured. Received: 9 November 1998 / Accepted: 24 November 1998

8.
The types and characteristic properties of organized media, methods of controlling their efficiency, the key terms used in this area, and analytical applications of organized media are briefly considered. It is shown that each type of organized system constituting organized media is a specific microphase with no macroscopic analogs. This microphase can be considered a nanoreactor for various analytical reactions and processes.

9.
This study explored the performance of experienced laboratories in the analysis of total selenium in water by a variety of analytical methods. The goal was to examine intra- and interlaboratory variability. Replicates (n = 7) of 7 sample types, including a reference material of known Se concentration, natural waters, and treated wastewaters, were submitted to 7 laboratories with prequalified Se analytical experience. The results indicated wide ranges between minimum and maximum values, distinct differences in laboratory precision, and routine reporting of numerical results below statistical limits of quantitation. Hydride generation as a sample-introduction technique demonstrated superior performance. In general, the study supports a caution advisory about using low-level Se data, especially results lower than about 10 micrograms Se/L, without quantifying the statistical uncertainty of the data. Because this study used data from samples submitted in bulk to participating laboratories prequalified for Se analytical expertise and experience, it can be considered a best-case demonstration of performance.

10.
The sources of errors in the results of chemical analysis are classified. Methods for the on-line control of the precision and accuracy of analysis are briefly discussed and compared in order to reveal the types of error sources addressed by each method. Algorithms are proposed for estimating the quality of an analytical laboratory's work, based on statistical analysis of the summarized control results obtained over a certain period of time.

11.
In this paper, European food safety legislation is presented, with special attention devoted to the monitoring of residues of veterinary drugs in foodstuffs of animal origin. After a short review of the state of the art of analytical methodology for antibiotic residue analysis, the paper focuses on the validation of analytical methods, with Decision 2002/657/EC as the reference document. Finally, the main issues in the quality control of analytical data, i.e. the analysis of reference materials and participation in proficiency tests, are briefly addressed.

12.
Near-infrared spectroscopic multivariate statistical control charts based on the net analyte signal (NAS) were applied to the polymorphic characterization of piroxicam samples. Three different polymorphic forms (I, II and III) were studied, using X-ray powder diffraction (XRPD) and scanning electron microscopy as reference techniques. Samples containing form I were considered within the quality specifications, and forms II and III were impurities. Three control charts were developed: the NAS chart, which corresponds to the analyte of interest (polymorphic form I); the interference chart, which corresponds to the contribution of the other compounds in the sample; and the residual chart, which corresponds to nonsystematic variation. From the limits estimated for each chart using samples within the quality specifications, it was possible to identify samples that did not contain polymorphic form I. The use of multivariate control charts provides a rapid evaluation of the purity and polymorphic composition of pharmaceutical formulations based on piroxicam.

13.
The main sources of air pollution by inorganic metal compounds, the sampling of aerosol particles, and sample preparation for analysis are considered. Nondestructive and destructive methods of analysis are compared, and their advantages and disadvantages are specified. The development of synthetic reference samples, which are used to determine a calibration function and to verify the accuracy of analytical results for both types of methods, is considered.

14.
The robustness of Shewhart control charts for subgroup means and subgroup ranges was tested by the Monte Carlo method, using training data sets comprising various numbers of points, with two repetitions in each subgroup (as in routine laboratory practice). The following control chart designs were tested: a conventional design based on the arithmetic mean and standard deviation; robust designs based on the median and/or the trimmed mean and the Winsorized standard deviation; and a two-stage design. The methods were applied to a system in a state of statistical control (outliers excluded) and to a system out of statistical control (outliers included). Satisfactory results in both cases were obtained only with the two-stage control charts. The conventional charts underestimated the effect of outliers in the system out of statistical control, whereas the robust control charts overestimated the effect of outliers (false alarms) in the system under statistical control. The tests also gave evidence that the training set should include at least 20 points. Received: 13 January 1997 Accepted: 12 February 1997
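The contrast between the conventional and the robust chart designs compared above can be sketched as follows. The trimming fraction, the training-set size, and the use of quantile-based Winsorization without a consistency correction are illustrative assumptions, not the study's exact recipe:

```python
import numpy as np

def conventional_limits(train, k=3.0):
    """Shewhart limits from the arithmetic mean and standard deviation."""
    m, s = train.mean(), train.std(ddof=1)
    return m - k * s, m + k * s

def robust_limits(train, k=3.0, trim=0.1):
    """Shewhart limits from the median and a Winsorized standard deviation,
    so that training-set outliers barely move the limits."""
    center = np.median(train)
    q_lo, q_hi = np.quantile(train, [trim, 1.0 - trim])
    s_w = np.clip(train, q_lo, q_hi).std(ddof=1)   # no consistency correction applied
    return center - k * s_w, center + k * s_w

rng = np.random.default_rng(0)
train = rng.normal(100.0, 2.0, size=40)
train[5] = 130.0                                   # one gross outlier in the training data
lcl_c, ucl_c = conventional_limits(train)
lcl_r, ucl_r = robust_limits(train)
# The outlier inflates the conventional limits; the robust limits stay tight,
# which is exactly the trade-off (missed outliers vs. false alarms) the study measured.
```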

15.
One of the major problems in the comparison of chromatographic signals is the variability of response caused by instrumental drift and other instabilities. Measures of quality control and evaluation of conformity are inherently sensitive to such shifts. It is essential to be able to compare test samples with reference samples in an evolving analytical environment by offsetting the inevitable drift. Therefore, prior to any multivariate analysis, the alignment of analytical signals is a compulsory preprocessing step. In recent years, many researchers have taken a greater interest in the study of alignment. The present paper is an updated review of the alignment algorithms, methods, and improvements used in chromatography. The study is dedicated to one-dimensional signals. Several of the methods presented share common theoretical bases and differ mainly in their optimization methods. The main issue for the operator is to choose the appropriate method according to the type of signals to be processed.
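The simplest member of the alignment family reviewed here, a rigid whole-signal shift found by cross-correlation, can be sketched as follows. Practical chromatographic alignment methods (e.g. correlation optimized warping) warp the time axis piecewise; this rigid shift is only an illustrative baseline, and the simulated peaks are invented data:

```python
import numpy as np

def align_by_shift(reference, signal):
    """Shift `signal` by the lag that maximizes its cross-correlation
    with `reference`: a rigid-shift correction for retention-time drift."""
    r = reference - reference.mean()
    s = signal - signal.mean()
    lags = np.arange(-len(s) + 1, len(r))          # lag axis of the 'full' correlation
    lag = lags[np.argmax(np.correlate(r, s, mode="full"))]
    return np.roll(signal, lag), lag

t = np.linspace(0.0, 10.0, 500)
peak = np.exp(-0.5 * ((t - 4.0) / 0.2) ** 2)       # reference chromatogram peak
drifted = np.exp(-0.5 * ((t - 4.6) / 0.2) ** 2)    # same peak, drifted retention time
aligned, lag = align_by_shift(peak, drifted)       # peak shifted back onto the reference
```

Only after such a preprocessing step do multivariate comparisons between test and reference chromatograms become meaningful.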

16.
Method validation is a key element both in the elaboration of reference methods and in the assessment of a laboratory's competence in producing reliable analytical data. Hence, the scope of the term method validation is wide, especially if one bears in mind that there is, or at least should be, a close relation between validation, calibration and quality assurance/quality control (QA/QC). Moreover, validation should include more than the instrumental step only, since the whole cycle from sampling to the final analytical result matters in assessing the validity of an analytical result. In this article, validation is put in the context of the process of producing chemical information. Two cases are presented in more detail: the development of a European standard for chlorophenols and its validation by a full-scale collaborative trial, and the intralaboratory validation of a method for ethylenethiourea using alternative analytical techniques.

17.
The main concern of producers of certified reference materials (CRMs) is the preparation of high-quality products with demonstrated homogeneity and stability, combined with a well-established set of certified characteristics. CRM producers must, furthermore, comply with the constraints imposed by ISO Guide 34: production processes, production control, and certification analyses should be performed by expert laboratories, using validated protocols documented in their respective quality assurance manuals; laboratory mean values and the corresponding expanded uncertainties must be used for the determination of the certified values, as recommended by the ISO Guide to the Expression of Uncertainty in Measurement (GUM); and, when possible, traceability of the certified value to the SI units must be ensured, using appropriately validated and/or primary methods. k0-NAA, i.e. neutron activation analysis with k0 standardization, is one of the analytical techniques implemented at the Reference Material Unit of IRMM; it meets the first two requirements.

18.
Two of the most suitable analytical techniques in the field of cultural heritage are NIR (near-infrared) and Raman spectroscopy. FT-Raman spectroscopy coupled with multivariate control charts is applied here to develop a new method for monitoring the conservation state of pigmented and wooden surfaces. These materials were exposed to different accelerated ageing processes in order to evaluate the effect of the applied treatments on their surfaces. In this work, a new approach to the monitoring of cultural heritage, based on the principles of statistical process control (SPC), has been developed: owing to the complexity of the spectroscopic data collected, the conservation state of samples simulating works of art has been treated like an industrial process and monitored with multivariate control charts. The Raman spectra were analysed by principal component analysis (PCA), and the relevant principal components (PCs) were used to construct multivariate Shewhart and cumulative sum (CUSUM) control charts. These tools were successfully applied to identify relevant modifications occurring on the surfaces; CUSUM charts, however, proved more effective in identifying the exact beginning of the applied treatment. In the case of the wooden boards, where a sufficient number of PCs were available, simultaneous scores monitoring and residuals tracking (SMART) charts were also investigated. Exposure to a basic attack and to high temperatures produced deep changes in the wooden samples, clearly identified by the multivariate Shewhart, CUSUM and SMART charts. A change on the pigment surface was detected after exposure to an acidic solution and to UV light, while no effect was identified on the painted surface after exposure to natural atmospheric events.
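The chart-building step described above can be sketched as PCA by SVD followed by a tabular CUSUM on the first-component scores. The simulated "spectra", the baseline length, and the CUSUM parameters (k = 0.5, h = 5) are assumptions for the illustration, not the paper's settings:

```python
import numpy as np

def pc1_scores(spectra):
    """First principal-component scores of mean-centred spectra (PCA via SVD)."""
    X = spectra - spectra.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[0]

def cusum_alarm(scores, baseline, k=0.5, h=5.0):
    """Two-sided tabular CUSUM on scores standardized against an
    in-control baseline; returns the index of the first alarm, or None."""
    z = (scores - baseline.mean()) / baseline.std(ddof=1)
    cp = cm = 0.0
    for i, zi in enumerate(z):
        cp = max(0.0, cp + zi - k)   # accumulates upward drift
        cm = max(0.0, cm - zi - k)   # accumulates downward drift
        if cp > h or cm > h:
            return i
    return None

rng = np.random.default_rng(1)
axis = np.linspace(0, 3, 100)
base = np.sin(axis)                                  # pristine surface "spectrum"
spectra = base + 0.01 * rng.normal(size=(40, 100))   # 40 monitoring spectra over time
spectra[25:] += 0.05 * np.cos(axis)                  # surface alteration from sample 25 on
s = pc1_scores(spectra)
alarm = cusum_alarm(s, baseline=s[:20])              # first 20 spectra define "in control"
```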

19.
Control charts are increasingly adopted by laboratories for the effective monitoring of analytical processes. Analytical methods are mostly subject to two types of measurement error: (i) additive and (ii) multiplicative, or proportional, error. These errors have been combined in a single model, namely the two-component error model (TCME) proposed in [1]. In this study we present a comparison of the performance of three widely used location control charts, i.e. Shewhart, CUSUM and EWMA charts, in the presence of the TCME. This study will help quality practitioners to choose an efficient chart for the monitoring of analytical measurements.
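A sketch of the two error components and of one of the compared charts (EWMA with steady-state limits). The exponential form of the proportional term is one common two-component formulation (x = μ·e^η + ε); whether it matches the exact model of [1], and all error magnitudes and the shift size, are assumptions for the illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

def tcme_measure(mu, s_add=0.5, s_prop=0.03, n=1):
    """Measurements under a two-component error model:
    x = mu * exp(eta) + eps, eta ~ N(0, s_prop^2), eps ~ N(0, s_add^2),
    so error is roughly additive near zero and proportional at high levels."""
    return mu * np.exp(rng.normal(0.0, s_prop, n)) + rng.normal(0.0, s_add, n)

def ewma_alarms(x, target, sigma, lam=0.2, L=3.0):
    """EWMA statistic z_i = lam*x_i + (1-lam)*z_{i-1} checked against its
    steady-state control limits; returns the indices of alarm points."""
    z = np.empty(len(x))
    prev = target
    for i, xi in enumerate(x):
        prev = lam * xi + (1.0 - lam) * prev
        z[i] = prev
    half_width = L * sigma * np.sqrt(lam / (2.0 - lam))
    return np.flatnonzero(np.abs(z - target) > half_width)

# 30 in-control results at level 20, then a sustained shift to 21.5
x = np.concatenate([tcme_measure(20.0, n=30), tcme_measure(21.5, n=20)])
sigma = np.sqrt(0.5**2 + (20.0 * 0.03)**2)   # combined SD at the target level
alarms = ewma_alarms(x, target=20.0, sigma=sigma)
```

The EWMA's memory (lam = 0.2) is what makes it sensitive to small sustained shifts of this kind, the regime in which such comparisons typically favour EWMA and CUSUM over the Shewhart chart.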

20.
Three different techniques, energy-dispersive X-ray fluorescence, total-reflection X-ray fluorescence and particle-induced X-ray emission, were used to initiate an evaluation programme of quality assurance (QA) procedures applied to X-ray spectrometry for chemical analysis. The use of standard methodologies to assure the statistical control of measurement data is the main objective of this work. Certified reference materials were used, and up to 15 certified elements were analyzed to carry out the QA procedures. For internal quality control, z-scores were calculated and control charts were produced. The plotted elemental data show statistically controlled methodologies for the majority of the determinations. Even in the cases where the control charts exhibit values outside the control limits, the z-scores are below 3 in absolute value, indicating satisfactory results. Concerning external quality control, the statistical methods applied showed that the results obtained with the three techniques are comparable, although some significant differences occur, mainly due to sample preparation. The techniques are therefore traceable to certified reference materials, and the data gathered so far make it possible to initiate a database for QA procedures.
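The internal-control z-scores mentioned above follow the standard proficiency-testing convention; a minimal sketch, where the laboratory results, certified value, and target standard deviation are invented for illustration:

```python
import numpy as np

def z_scores(measured, certified, sigma_target):
    """Internal-control z-scores: z = (x - x_cert) / sigma_target.
    Common convention (e.g. ISO 13528): |z| <= 2 satisfactory,
    2 < |z| < 3 questionable, |z| >= 3 unsatisfactory."""
    return (np.asarray(measured, dtype=float) - certified) / sigma_target

lab_results = [10.2, 9.7, 10.9, 9.4]            # replicate determinations (made-up data)
z = z_scores(lab_results, certified=10.0, sigma_target=0.5)
all_satisfactory = bool(np.all(np.abs(z) < 3))  # the |z| < 3 acceptance rule used above
```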


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号