Similar Documents
20 similar records found
1.
A method validation procedure requires a strategy for collecting the validation data that are best adapted to the analytical technique used. A flexible and general approach based on Object Linking and Embedding technology is proposed. It allows a traceable validation strategy using modular objects which encapsulate documentation, analytical data and processing logic. The contents of these objects are accessible through a standard user interface. This paper demonstrates how this can reduce experiment time, simplify evaluation efforts, and increase the ease of use of validation figures of merit. An illustration using Microsoft Visual Basic for Applications is presented, and some specific aspects are described. It consists of the evaluation of a time-domain NMR technique, involving a multivariate calibration step, for determining the moisture content of foods. This study also illustrates how guidelines such as Good Validation Practices could be defined to present all validation documents in a standardised manner. Received: 3 September 1996; Accepted: 14 October 1996

2.
Multivariate screening methods are increasingly being implemented, but there is no worldwide harmonized criterion for their validation. This study contributes to establishing protocols for validating these methodologies. We propose the following strategy: (1) establish the multivariate classification model and use receiver operating characteristic (ROC) curves to optimize the significance level (α) for setting the model’s boundaries; (2) evaluate the performance parameters from the contingency table results and performance characteristic curves (PCC curves). The adulteration of hazelnut paste with almond paste and chickpea flour has been used as a case study. Samples were analyzed by infrared (IR) spectroscopy, and the multivariate classification technique used was soft independent modeling of class analogies (SIMCA). The ROC study showed that the optimal α value for setting the SIMCA boundaries was 0.03 in both cases. The sensitivity value was 93%, specificity 100% for almond and 98% for chickpea, and efficiency 97% for almond and 93% for chickpea.
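The contingency-table figures quoted above follow directly from the standard definitions of sensitivity, specificity and efficiency; a minimal sketch (with hypothetical counts, not the paper's data) might look like:

```python
def screening_performance(tp, fn, tn, fp):
    """Sensitivity, specificity and efficiency from a contingency table."""
    sensitivity = tp / (tp + fn)                   # fraction of adulterated samples flagged
    specificity = tn / (tn + fp)                   # fraction of genuine samples passed
    efficiency = (tp + tn) / (tp + fn + tn + fp)   # overall correct-classification rate
    return sensitivity, specificity, efficiency

# Hypothetical counts for illustration only
sens, spec, eff = screening_performance(tp=28, fn=2, tn=30, fp=0)
```

With these illustrative counts the sensitivity is about 93% and the specificity 100%, the same kind of summary the study reports.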

3.
The European research project DIFFERENCE is focussed on the development, optimisation and validation of screening methods for dioxin analysis, including bio-analytical and chemical techniques (CALUX, GC-LRMS/MS, GC×GC-ECD), and on the optimisation and validation of new extraction and clean-up procedures. The performance of these techniques is assessed in an international validation study and the results are compared with the reference technique, GC-HRMS. This study is set up in three rounds and is in accordance with the International Harmonized Protocol for Proficiency Studies and the ISO 5725 standard. The results of the first two rounds are very promising, in particular for GC-LRMS/MS: results obtained with this technique were as accurate as those reported by the laboratories using GC-HRMS. The initial results reported for GC×GC-ECD overestimate the dioxin concentration in the samples, while the results reported by the laboratories using the CALUX technique underestimate the total TEQ concentrations compared to the GC-HRMS reference method. The repeatability of CALUX is significantly higher than that of the other screening techniques. It was shown that accelerated solvent extraction (ASE) is a valid alternative extraction and clean-up procedure for fish oil and vegetable oil: the results obtained with CALUX and GC-HRMS after ASE are equivalent to those obtained with the classical extraction and purification procedures.

4.
The determination of the contents of therapeutic drugs, metabolites and other important biomedical analytes in biological samples is usually performed by using high-performance liquid chromatography (HPLC). Modern multivariate calibration methods constitute an attractive alternative, even when they are applied to intrinsically unselective spectroscopic or electrochemical signals. First-order (i.e., vectorized) data are conveniently analyzed with classical chemometric tools such as partial least-squares (PLS). Certain analytical problems require more sophisticated models, such as artificial neural networks (ANNs), which are especially able to cope with non-linearities in the data structure. Finally, models based on the acquisition and processing of second- or higher-order data (i.e., matrices or higher dimensional data arrays) present the phenomenon known as “second-order advantage”, which permits quantitation of calibrated analytes in the presence of interferents. The latter models show immense potentialities in the field of biomedical analysis. Pertinent literature examples are reviewed.
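As a sketch of the first-order calibration mentioned here, a minimal PLS1 (NIPALS) regression fits in a few lines of NumPy; this is a generic illustration of the technique, not the reviewed authors' code:

```python
import numpy as np

def pls1(X, y, n_comp):
    """Minimal PLS1 via NIPALS; returns the regression vector for centered data."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    Xr = X - X.mean(axis=0)          # deflated predictor block
    yr = y - y.mean()                # deflated response
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xr.T @ yr                # weight vector: covariance direction
        w /= np.linalg.norm(w)
        t = Xr @ w                   # scores
        tt = t @ t
        p = Xr.T @ t / tt            # X loadings
        qa = (yr @ t) / tt           # y loading
        Xr = Xr - np.outer(t, p)     # deflate X
        yr = yr - qa * t             # deflate y
        W.append(w); P.append(p); q.append(qa)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.inv(P.T @ W) @ q
```

Prediction is then `(X_new - x_mean) @ B + y_mean` using the calibration means; with as many components as predictors, PLS1 reproduces the least-squares fit.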

5.
Multivariate self-modeling curve resolution is applied to the quantitation of coeluted organophosphorus pesticides: fenitrothion, azinphos-ethyl, diazinon, fenthion and parathion-ethyl. Analysis of these pesticides at levels of 0.1 to 1 μg/l in the presence of natural interferences is achieved using automated on-line liquid–solid extraction (Prospekt) coupled to liquid chromatography with diode array detection, followed by a recently developed multivariate self-modeling curve resolution method. The proposed approach uses only 100 ml of natural water sample, improves resolution of the coeluted organophosphorus pesticides, and enables their quantitation at trace levels. The results have been compared with those obtained by different laboratories participating in the Aquacheck interlaboratory exercise (WRC, Medmenham, UK), where more conventional analytical techniques are used.

6.
Generalized analytical sensitivity (γ) is proposed as a new figure of merit which can be estimated from a multivariate calibration data set. It can be confidently applied to compare different calibration methodologies, and helps to resolve inconsistencies in the literature on the relationship between classical sensitivity and prediction error. In contrast to the classical plain sensitivity, γ incorporates the noise properties in its definition, and its inverse is well correlated with root mean square errors of prediction in the presence of general noise structures. The proposal is supported by studying simulated and experimental first-order multivariate calibration systems with various models, namely multiple linear regression, principal component regression (PCR) and maximum likelihood PCR (MLPCR). The simulations included instrumental noise of different types: independently and identically distributed (iid), correlated (pink) and proportional noise, while the experimental data carried noise which is clearly non-iid.
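One way to make the description above concrete is to assume the noise-weighted definition γ = (bᵀΣb)^(−1/2), where b is the calibration regression vector and Σ the measurement-error covariance; this formula is an assumption inferred from the general description, not taken verbatim from the paper:

```python
import numpy as np

def generalized_sensitivity(b, Sigma):
    """Assumed noise-weighted sensitivity gamma = (b^T Sigma b)^(-1/2).

    For iid noise (Sigma = sigma**2 * I) this reduces to 1 / (sigma * ||b||),
    i.e. the classical sensitivity scaled by the noise level.
    """
    b = np.asarray(b, float)
    return 1.0 / np.sqrt(b @ Sigma @ b)
```

Because Σ enters explicitly, correlated (pink) or proportional noise changes γ even when the regression vector b, and hence the classical sensitivity, stays the same.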

7.
In this paper we describe the characteristics and applications of two multivariate methods for spectroscopic and chromatographic techniques, independent component analysis (ICA) and two-dimensional correlation spectroscopy (2DCOS), with a focus on their use in environmental studies. In our opinion, these methods are important because they make it possible to characterize environmental samples with aims and scopes different from those generally achievable with more common multivariate methods such as principal component analysis (PCA) and partial least squares (PLS). The new insights these methods have provided in recent environmental studies are reviewed and discussed.

8.
A method validation approach based on a quadratic regression model, in which two types of error are incorporated, is presented and applied to an experimental data set. The validation approach enables the determination of the analytical performance characteristics referred to in Commission Decision 2002/657/EC (i.e., repeatability, within-laboratory reproducibility, decision limit and detection capability).
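For a substance with a permitted limit, Decision 2002/657/EC defines the decision limit (CCα) and detection capability (CCβ); the sketch below uses a simplified normal-theory calculation from the within-laboratory reproducibility, whereas the paper's quadratic regression model handles the two error types more rigorously:

```python
from statistics import NormalDist

def decision_limits(permitted_limit, s_within_lab, alpha=0.05, beta=0.05):
    """Simplified CCalpha / CCbeta around a permitted limit (normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha)
    z_b = NormalDist().inv_cdf(1 - beta)
    cc_alpha = permitted_limit + z_a * s_within_lab   # decision limit
    cc_beta = cc_alpha + z_b * s_within_lab           # detection capability
    return cc_alpha, cc_beta
```

For banned substances (no permitted limit) the Decision prescribes α = 0.01 and the calculation starts from the blank distribution instead.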

9.
In this work, a multiclass screening method for organic contaminants in natural and wastewater has been developed and validated for qualitative purposes, i.e. to ensure the reliable and sensitive identification of compounds detected in samples at a certain concentration level. The screening is based on the use of GC-TOF MS, and the sample preparation involves solid phase extraction with C(18) cartridges. Around 150 organic contaminants from different chemical families were investigated, including PAHs, octyl/nonyl phenols, PCBs, PBDEs and a notable number of pesticides, such as insecticides (organochlorines, organophosphorus, carbamates and pyrethroids), herbicides (triazines and chloroacetanilides), fungicides and several relevant metabolites. Surface water, ground water and effluent wastewater were spiked with all target analytes at three concentration levels (0.02, 0.1 and 1 μg/L). Influent wastewater and raw leachate from a municipal solid waste treatment plant were spiked at two levels (0.1 and 1 μg/L). Up to five m/z ions were evaluated for every compound. The identification criterion was the presence of at least two m/z ions at the expected retention time, measured at their accurate mass, together with agreement of the Q/q(i) intensity ratio within specified tolerances. The vast majority of compounds investigated were correctly identified in the samples spiked at 1 μg/L. When the analyte concentration was lowered to 0.1 μg/L, identification was more problematic, especially in complex-matrix samples like influent wastewater. Conversely, many contaminants could be properly identified at the lowest level (0.02 μg/L) in cleaner matrices. The procedure was applied to the screening of water samples of different origin and matrix composition and allowed the detection of several target contaminants.
Highly reliable identification could be carried out thanks to the sensitive full-spectrum acquisition at accurate mass, the high selectivity achieved with narrow mass-window extracted ion chromatograms, the low mass errors observed in the positive detections, and the compliance of the Q/q ratios.
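The identification criterion described above (at least two accurate-mass ions at the expected retention time plus an ion-intensity ratio within tolerance) can be sketched as a simple boolean check; the tolerance values below are illustrative placeholders, not the validated ones:

```python
def identified(rt_obs, rt_exp, ions_found, ratio_obs, ratio_exp,
               rt_tol=0.1, min_ions=2, ratio_tol=0.30):
    """True if a detected peak meets the qualitative identification criterion."""
    rt_ok = abs(rt_obs - rt_exp) <= rt_tol                          # retention time match
    ions_ok = ions_found >= min_ions                                # enough m/z ions found
    ratio_ok = abs(ratio_obs - ratio_exp) <= ratio_tol * ratio_exp  # Q/q within tolerance
    return rt_ok and ions_ok and ratio_ok
```

Failing any one of the three conditions (retention time, ion count, or Q/q ratio) rejects the identification, which is what makes the criterion robust in complex matrices.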

10.
Saffaj T, Ihssane B. Talanta 2011, 85(3):1535–1542
This article aims to present a new global strategy for the validation of analytical methods and the estimation of measurement uncertainty. Our purpose is to give researchers in the field of analytical chemistry access to a powerful tool for the evaluation of quantitative analytical procedures. Indeed, the proposed strategy facilitates analytical validation by providing a decision tool based on the uncertainty profile and the β-content tolerance interval. Equally important, this approach allows a good estimate of measurement uncertainty from the validation data alone, without recourse to additional experiments. In the example below, we confirmed the applicability of this new strategy to the validation of a chromatographic bioanalytical method and obtained a good estimate of the measurement uncertainty without any extra effort or additional experiments. A comparative study with the SFSTP approach [1] showed that both strategies selected the same calibration functions. The holistic character of the measurement uncertainty, compared to the total error, was influenced by our choice of uncertainty profile. Nevertheless, we think that adopting the uncertainty at the validation stage controls the risk of using the analytical method in the routine phase.
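A β-content tolerance interval claims to contain at least a proportion β of future results. The sketch below uses a simplified large-sample normal approximation for the coverage factor k; a rigorous uncertainty-profile implementation, as in the article, would use exact k factors with chi-square corrections:

```python
from statistics import NormalDist, mean, stdev

def beta_content_interval(results, beta=0.90):
    """Approximate beta-content tolerance interval (large-n normal approximation).

    The simple factor below ignores the sampling uncertainty of the standard
    deviation, so it understates the true interval for small n.
    """
    m, s, n = mean(results), stdev(results), len(results)
    k = NormalDist().inv_cdf((1 + beta) / 2) * (1 + 1 / n) ** 0.5
    return m - k * s, m + k * s
```

In an uncertainty-profile decision rule, such intervals computed at each validation level are compared against the acceptance limits to decide whether the method is fit for routine use.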

11.
Evaluation of analytical method capability can be a useful methodology for assessing the fitness for purpose of these methods for their future routine application. However, care must be taken in how the capability indices are computed. Indeed, the commonly used formulas for capability indices such as Cpk will greatly overestimate the true capability of the methods. During method validation or transfer in particular, only a few experiments are performed, and using the commonly applied capability indices in these situations to declare a method valid, or transferable to a receiving laboratory, will lead to inadequate decisions.
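The standard index is Cpk = min(USL − μ, μ − LSL)/(3s). With the few replicates typical of a validation or transfer study, s is an unstable estimate, so the plug-in Cpk is strongly biased, which is the overestimation the authors warn about. A sketch of the plain (naive) estimator:

```python
from statistics import mean, stdev

def cpk(data, lsl, usl):
    """Plug-in process capability index; unreliable for small samples."""
    m, s = mean(data), stdev(data)
    return min(usl - m, m - lsl) / (3 * s)
```

For example, `cpk([-1, 0, 1], -6, 6)` returns 2.0, but with n = 3 the sampling error of s makes such a point estimate nearly meaningless; a lower confidence bound on Cpk would be the safer decision quantity.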

12.
Probabilistic algorithms to evaluate result reliability in qualitative chromatographic analysis are discussed in the paper. The elementary uncertainty (P0), concerned with a single test (comparison of sample and reference peak positions), is treated as the sum of misidentification and omission probabilities. Both constituents are calculated separately using a simplified model and Laplace functions. In the model, the main sources of elementary uncertainty are random, normally distributed deviations in the measurement of the retention characteristic. Algorithms to calculate both constituents of P0 have to take into account the real measurement precision, the supposed composition of the sample, the content of the database, the chosen coincidence criterion and other factors. At a high selectivity of retention, the 3σ value is recommended as the most convenient coincidence criterion; it leads to more reliable and unambiguous attribution of peaks in the chromatogram. For more complicated cases, probabilistic algorithms based on the Bernoulli theorem are proposed to calculate the summary uncertainty of identification, concerned with multiple tests. They take into account the P0 value, the number of repeated single tests (n) under similar or different conditions, and the chosen identification criterion K (minimal number of coincidences). The above-mentioned algorithms allow a priori optimisation of the mode of operation of any identification software system associated with the chromatograph. They can be useful during the metrological validation of corresponding qualitative analysis methods. Presented at the Second International Conference on Metrology—Trends and Applications in Calibration and Testing Laboratories, November 4–6, 2003, Eilat, Israel. The opinions reflected in this paper are those of the author only. AQUAL does not necessarily endorse them.
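Under the stated model (normally distributed retention-time deviations), both constituents of P0 reduce to normal (Laplace) integrals for a k·σ coincidence window; a sketch, with the interferent separation expressed in σ units as an illustrative parameterization:

```python
from statistics import NormalDist

_nd = NormalDist()

def omission_prob(k=3.0):
    """P(the true component falls outside its own +-k*sigma coincidence window)."""
    return 2.0 * (1.0 - _nd.cdf(k))

def misidentification_prob(k, delta):
    """P(an interferent whose retention lies delta*sigma away falls inside the window)."""
    return _nd.cdf(k - delta) - _nd.cdf(-k - delta)
```

With the recommended k = 3, the omission term is about 0.27%, and the misidentification term becomes negligible once interfering peaks are many σ away, which is why high retention selectivity favors the 3σ criterion.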

13.
One limitation due to lack of resolution for a given pair of analytes in TLC or HPTLC is the need to optimize the system. In practice this requires time, rerunning of the sample in different developing solvents, and a great deal of expertise on the part of the analyst. In our experience, application of first- and second-derivative recording techniques to HPTLC facilitates and speeds up the whole process, permitting qualitative and quantitative assay of most unresolved spots. Consequently, we have now extended our instrumental capabilities to fourth-derivative measurements. For this purpose, we have added a homemade electronic unit in series with the one previously used for first- and second-order derivatives. Thus, we have been able to evaluate the potential advantages of higher-order derivatives for HPTLC analysis of unresolved components in various pharmaceutical products. A comparison of second- and fourth-order derivative measurements of seriously overlapping HPTLC components in a sample of preservatives used in the pharmaceutical industry suggests that the lower-order derivatives might be a better choice in view of the higher accuracy and precision of the corresponding data. This is supported by the results of other applications, such as the assay of a commercial colorant and a syrup formulation. The observed lack of precision of fourth-order measurements stems from the fact that although the second- and higher-order derivatives produce narrower bandwidths, thus contributing to improved resolution, the signal-to-noise ratio decreases and satellite-peak interactions increase, rendering correct discrimination of the fine structural detail of overlapping components more difficult.
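The signal-to-noise penalty described above is easy to demonstrate numerically: for iid noise, the standard deviation of the n-th finite difference grows as sqrt(binom(2n, n)), so a fourth derivative is far noisier than a second. A sketch on synthetic noise (not densitometer data):

```python
import numpy as np

rng = np.random.default_rng(0)
noise = rng.normal(0.0, 1.0, size=10000)   # iid unit-variance noise trace

# n-th order finite differences play the role of n-th derivatives (up to step size)
d2 = np.diff(noise, n=2)   # theory: std = sqrt(6)  ~ 2.45
d4 = np.diff(noise, n=4)   # theory: std = sqrt(70) ~ 8.37

amplification = d4.std() / d2.std()
```

The ratio is about sqrt(70/6) ≈ 3.4, i.e. each extra derivative order trades bandwidth for noise, which is exactly the accuracy/precision trade-off the authors observed between second- and fourth-order measurements.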

14.
15.
An optimized multivariate classification model for the monitoring of eighteen spring waters in the area of Serra St. Bruno, Calabria, Italy, has been developed. Thirty analytical parameters for each water source were investigated and reduced to eight by means of principal component analysis (PCA). The water springs were grouped into five distinct classes by cluster analysis (CA), and a model for their classification was built by a partial least squares–discriminant analysis (PLS–DA) procedure. The model was optimized, validated and then applied to new data matrices containing the analytical parameters determined on the same sources during successive years. This model proved able to detect deviations in the global analytical characteristics by revealing, over time, a different distribution of the samples within the classes. The variation in nitrate concentration was shown to be primarily responsible for the observed class shifts. The shifting sources were localized in areas used as arable land, and the high variability of nitrate content was ascribed to the practice of crop rotation, involving a varying use of nitrogenous chemical fertilizers.

16.
Testing the safety of foodstuffs of plant origin involves the analysis of hundreds of pesticide residues. This control is only cost-effective through the use of methods validated for the analysis of many thousands of analyte/matrix combinations. Several documents propose representative matrices for groups of matrices, from which the validity of the analytical method can be extrapolated to the represented matrices after a summarised experimental check of within-group method performance homogeneity. Those groups are based on an evolved expert consensus drawing on empirical knowledge of current analytical procedures; they are not exhaustive, they are not objectively defined, and they propose a large list of representative matrices, which makes their application difficult. This work proposes grouping 240 matrices based on the nutritional composition pattern equivalence of the analytical portion right after hydration and before solvent extraction, aiming at defining groups that exhibit method performance homogeneity. This grouping was based on the combined outcome of three multivariate tools, namely principal component analysis, hierarchical cluster analysis and K-means cluster analysis. These tools allowed the selection of eight groups, for which representative matrices with average characteristics and objective criteria to test the inclusion of new matrices were established. The proposed matrix groups are homogeneous with respect to nutritional data not considered in their definition but correlated with the studied multivariate nutritional pattern. The developed grouping, which must be checked experimentally before use, was tested against small deviations in food composition and for the integration of new matrices.

17.
This paper describes the determination and evaluation of the mineral composition (calcium, magnesium, iron, manganese and zinc) of kale (Brassica oleracea L. var. acephala DC.) grown in soils within four cities in Bahia State, Brazil. The sampling process was performed during the summer and winter. Samples were digested with concentrated nitric acid and a digestion pump. Analyses were performed with inductively coupled plasma optical emission spectrometry (ICP OES) and the accuracy was confirmed with a certified reference material of apple leaves furnished by the National Institute of Standard and Technology. Principal component analysis (PCA) and hierarchical cluster analysis (HCA) revealed different mineral compositions of the samples collected in the summer and winter. Samples collected in the winter have a higher concentration of micronutrients (iron, zinc and manganese) and macronutrients (calcium and magnesium). The average contents (wet weight and mg per 100 g) for the winter and summer were 551 and 535 for calcium; 117 and 106 for magnesium; 2.13 and 1.48 for iron; 2.63 and 1.95 for zinc and 2.05 and 1.34 for manganese, respectively. These results are in agreement with values previously reported in the literature.

18.
The interactions of actinomycin D (ACTD) with the oligonucleotides 5′-CAAAGCTTTG-3′, 5′-CATGGCCATG-3′ and 5′-TATGGCCATA-3′ were investigated by means of acid–base titrations and mole-ratio and melting experiments monitored by molecular absorption and circular dichroism (CD) spectroscopies. For each experiment, CD and molecular absorption spectra were recorded at each point and later analyzed via appropriate multivariate data analysis methods. The study of the interactions between these oligonucleotides and ACTD at 25 °C showed the formation of an interaction complex with a stoichiometry of 1:1 (ACTD:duplex) and values for the log(formation constant) of 5.1 ± 0.3, 6.4 ± 0.2, and 5.6 ± 0.2, respectively. An additional interaction complex at higher temperatures was also detected, which might be related to the single-stranded forms of the oligonucleotides. Electronic supplementary material is available in the online version of this article and is accessible to authorized users.
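For a 1:1 complex D + O ⇌ DO with formation constant K, as determined above, the equilibrium complex concentration follows in closed form from the quadratic mass balance; the symbols below are generic, not the paper's notation:

```python
import math

def complex_conc(d_total, o_total, log_k):
    """[DO] at equilibrium for 1:1 binding, from the quadratic mass-balance root."""
    k = 10.0 ** log_k
    b = d_total + o_total + 1.0 / k
    # smaller root of [DO]^2 - b*[DO] + d_total*o_total = 0 is the physical one
    return (b - math.sqrt(b * b - 4.0 * d_total * o_total)) / 2.0
```

With log K in the 5-6 range found here and micromolar totals, binding is partial; as K grows, [DO] approaches the limiting-reagent concentration, which is the saturation behavior seen in mole-ratio experiments.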

19.
Modern computerized spectroscopic instrumentation can produce high volumes of spectroscopic data. Such accurate measurements raise special computational challenges for multivariate curve resolution techniques, since pure-component factorizations are often obtained by solving constrained minimization problems. The computational costs of these calculations grow rapidly with increased time or frequency resolution of the spectral measurements.
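The constrained factorizations referred to are commonly solved by alternating least squares with non-negativity constraints. A toy sketch on synthetic rank-two data follows; projection by clipping is used here for brevity, whereas production curve-resolution codes use proper NNLS solvers:

```python
import numpy as np

rng = np.random.default_rng(3)
wl = np.linspace(0.0, 1.0, 50)
# two synthetic non-negative pure spectra (Gaussian bands)
S_true = np.vstack([np.exp(-(wl - 0.3) ** 2 / 0.01),
                    np.exp(-(wl - 0.7) ** 2 / 0.01)])
C_true = rng.uniform(0.1, 1.0, size=(10, 2))
C_true[0] = (1.0, 0.05)                  # make first and last mixtures nearly pure,
C_true[-1] = (0.05, 1.0)                 # giving a reasonable initial spectra guess
D = C_true @ S_true                      # data matrix: 10 mixtures x 50 channels

S = D[[0, -1], :].copy()                 # crude initial spectra estimates
for _ in range(100):                     # alternating least squares
    C = np.linalg.lstsq(S.T, D.T, rcond=None)[0].T
    C = np.clip(C, 0.0, None)            # non-negativity on concentrations
    S = np.linalg.lstsq(C, D, rcond=None)[0]
    S = np.clip(S, 0.0, None)            # non-negativity on spectra

rel_error = np.linalg.norm(D - C @ S) / np.linalg.norm(D)
```

Each iteration requires two least-squares solves over the full data matrix, which is why the cost escalates with finer time or frequency resolution, the scaling problem this abstract highlights.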

20.
Analytica Chimica Acta 2004, 513(1):41–47
A method for the determination of ochratoxin A (OTA) in wine grapes is described, using extraction with a hydrogen carbonate and polyethylene glycol (PEG) solution (5% NaHCO3 and 1% PEG 8000), followed by immunoaffinity clean-up and liquid chromatography with fluorescence detection. Validation was performed with samples spiked at levels of 0.05 and 1 μg kg−1, with an average recovery of 76% and relative standard deviations under repeatability and intermediate precision conditions of 8 and 12%, respectively. The limit of detection and limit of quantification in grapes were established at 0.004 and 0.007 μg kg−1, respectively. To further evaluate the accuracy and efficiency of this method, naturally contaminated grapes, at levels ranging from 0.05 to 37 μg kg−1, were also analysed by another method involving extraction with acidified methanol, and the results were compared. A good correlation (r=0.9996) was found, with better precision for the new method. A survey of wine grapes from 11 Portuguese vineyards was conducted during the 2002 harvest using the proposed method. OTA was detected in three of the 11 samples, at levels ranging from 0.035 to 0.061 μg kg−1. The new method meets all the criteria of European Commission Directive 2002/26/EC, which lays down the sampling and analysis methods for the official control of OTA levels in foodstuffs. It is reliable at low levels of contamination (ng kg−1) and avoids the use of organic solvents in the extraction step.
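The validation figures reported (recovery, RSD, LOD/LOQ) follow standard formulas; the sketch below uses the common 3.3·s/slope and 10·s/slope conventions for LOD and LOQ, which may differ from the paper's exact estimation procedure:

```python
from statistics import mean, stdev

def recovery_pct(measured, spiked):
    """Mean recovery of spiked replicates, in percent of the spiked level."""
    return 100.0 * mean(measured) / spiked

def rsd_pct(values):
    """Relative standard deviation, in percent (repeatability / precision)."""
    return 100.0 * stdev(values) / mean(values)

def lod_loq(s_blank, slope):
    """Detection and quantification limits from blank noise and calibration slope."""
    return 3.3 * s_blank / slope, 10.0 * s_blank / slope
```

Recoveries around 76% with RSDs of 8-12%, as reported here, would be summarized by exactly these quantities across the spiked replicates.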
