Similar Documents
20 similar documents retrieved (search time: 483 ms)
1.
This work presents a validation study of the performance of comprehensive two-dimensional gas chromatography (GC×GC)–time-of-flight mass spectrometry in the analysis of the key World Anti-Doping Agency (WADA) anabolic agents in doping control. The relative abundance ratio, retention time, identification and other method performance criteria have been tested in the GC×GC format to confirm that they comply with those set by WADA. Furthermore, tens of other components were identified with an average similarity of >920 (on the 0–999 scale), including 10 other endogenous sterols, and full mass spectra of 5,000+ compounds were retained. The testosterone/epitestosterone ratio was obtained from the same run. A new dimension in doping analysis has been implemented by addressing separation improvement. Instead of increasing the method sensitivity, which is accompanied by making the detector increasingly “blind” to the matrix (as represented by selected ion monitoring mode, high-resolution mass spectrometry (MS) and tandem MS), the method capabilities have been improved by adding a new “separation” dimension while retaining full mass spectral scan information. Apart from the mass-spectral-domain requirement that a minimum of three diagnostic ions with a relative abundance of 5% or higher be present in the MS spectra, all other WADA criteria are satisfied by GC×GC operation. The minimum of three diagnostic ions arises from the need to add some degree of specificity to the acquired mass spectrometry data; however, under the proposed full MS scan method, the high MS similarity to the reference compounds offers more than the required three diagnostic ions for an unambiguous identification. This should be viewed as an extension of the present criteria to a full-scan MS method.
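The diagnostic-ion criterion described above can be sketched as a simple check: at least three diagnostic ions with a relative abundance of 5% or higher of the base peak. The ion masses and intensities below are illustrative placeholders, not real steroid spectra.

```python
def diagnostic_ions(spectrum, min_rel_abundance=5.0, min_ions=3):
    """Return the m/z values whose relative abundance (as % of the base
    peak) meets the threshold, plus whether the minimum-ion criterion
    is met."""
    base = max(spectrum.values())
    ions = sorted(mz for mz, intensity in spectrum.items()
                  if 100.0 * intensity / base >= min_rel_abundance)
    return ions, len(ions) >= min_ions

# Hypothetical full-scan spectrum: {m/z: intensity}; values are made up.
spec = {432: 1000, 417: 300, 343: 120, 301: 40, 209: 15}
ions, ok = diagnostic_ions(spec)
```

With a full-scan method, the same check can be run against every ion in the library spectrum rather than three preselected ones.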

2.
Since the advent of the Guide to the Expression of Uncertainty in Measurement (GUM) in 1995, which laid down the principles of uncertainty evaluation, numerous projects have been carried out to develop alternative practical methods that are easier to implement, notably when it is impossible to model the measurement process for technical or economic reasons. In this paper, the author presents the recent evolution of measurement uncertainty evaluation methods. The evaluation of measurement uncertainty can be presented along two axes based on intralaboratory and interlaboratory approaches. The intralaboratory approach includes “the modelling approach” (application of the procedure described in section 8 of the GUM, known as the GUM uncertainty framework) and “the single laboratory validation approach”. The interlaboratory approaches are based on collaborative studies and are named, respectively, “the interlaboratory validation approach” and “the proficiency testing approach”.
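The “modelling approach” (GUM uncertainty framework) can be sketched numerically: combine the standard uncertainties of the input quantities through the law of propagation, with sensitivity coefficients estimated by finite differences. The measurement model and the numbers below are illustrative only.

```python
import math

def combined_uncertainty(f, x, u, h=1e-6):
    """Combined standard uncertainty of y = f(x) for uncorrelated
    inputs x with standard uncertainties u, using central finite
    differences for the sensitivity coefficients."""
    total = 0.0
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        ci = (f(xp) - f(xm)) / (2 * h)  # sensitivity coefficient df/dx_i
        total += (ci * u[i]) ** 2
    return math.sqrt(total)

# Toy measurement model: concentration c = m / V
f = lambda v: v[0] / v[1]              # m = 10.0 mg, V = 2.0 L
uc = combined_uncertainty(f, [10.0, 2.0], [0.1, 0.02])
```

This is exactly the intralaboratory route that becomes impractical when no analytic model f of the measurement process is available, which motivates the validation- and interlaboratory-based alternatives.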

3.
Van Eenoo and Delbeke in Accred Qual Assur (2009) have criticized Faber (in Accred Qual Assur, 2009) for not taking “all factors under consideration when making his claims”. Here, it is detailed that their criticism is based on a misunderstanding of examples that were merely intended to be illustrative. Motivated by this criticism, further discussion is provided that may help in the pursuit of fairer and more effective doping tests, here exemplified by chromatography with mass spectrometric detection. Surely, any doping test can only be improved or even optimized if the risks of false positives and false negatives are well defined. This requirement is consistent with a basic principle concerning mathematical approximations (Parlett in “The symmetric eigenvalue problem”, Prentice-Hall, Englewood Cliffs, 1980): apart from just being good, they should be known to be good. Author’s reply to the response on “Regulations in the field of residue and doping analysis...”

4.
In this article, we propose a new method to measure DNA similarity based on a normalized Lempel-Ziv complexity scheme. The new method weakens the effect of sequence length on the complexity measurement and saves computation time. First, a DNA sequence is transformed into three (0,1)-sequences by a scheme that considers the “A”/“non-A”, “G”/“non-G” and “C”/“non-C” bases, respectively. Then, the normalized Lempel-Ziv complexities of the three (0,1)-sequences constitute a 3D vector. Finally, with the 3D vector, one may characterize DNA sequences and compute a similarity matrix for them. The examination of the similarities of two sets of DNA sequences illustrates the utility of the method in local and global similarity analysis.
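The scheme above can be sketched as follows: each DNA sequence is mapped to three binary sequences (A vs non-A, G vs non-G, C vs non-C), the normalized Lempel-Ziv complexity of each gives one component of a 3D vector, and sequences are compared by distances between vectors. The LZ76 parsing and the normalization by log2(n)/n follow common definitions; the paper's exact normalization is assumed, not quoted.

```python
import math

def lz_complexity(s):
    """Number of phrases in the LZ76 exhaustive parsing of string s."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # extend the phrase while it already occurs in the preceding text
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

def normalized_lz(s):
    """Length-normalized complexity: c(s) * log2(n) / n."""
    n = len(s)
    return lz_complexity(s) * math.log2(n) / n

def dna_vector(seq):
    """3D vector of normalized LZ complexities of the three projections."""
    return tuple(normalized_lz("".join("1" if b == base else "0" for b in seq))
                 for base in "AGC")

def distance(u, v):
    """Euclidean distance between two 3D vectors (one similarity choice)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
```

For example, `lz_complexity("0001101001000101")` gives 6, the classic parsing 0·001·10·100·1000·101; pairwise distances between `dna_vector` outputs fill the similarity matrix.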

5.
The ability of multivariate analysis methods such as hierarchical cluster analysis, principal component analysis and partial least squares-discriminant analysis (PLS-DA) to classify olive oils by olive fruit variety from their triacylglycerol profiles has been investigated. The variations in the raw chromatographic data sets of 56 olive oil samples were studied by high-temperature gas chromatography with (ion trap) mass spectrometry detection. The olive oil samples were of four different categories (“extra-virgin olive oil”, “virgin olive oil”, “olive oil” and “olive-pomace” oil); the “extra-virgin” category comprised six well-identified olive oil varieties (“hojiblanca”, “manzanilla”, “picual”, “cornicabra”, “arbequina” and “frantoio”) and some blends of unidentified varieties. Moreover, chemometric pre-processing methods (to linearise the response of the variables) such as peak-shifting, baseline correction (weighted least squares) and mean centering made it possible to improve the model and the grouping between different varieties of olive oils. Using the first three principal components, it was possible to account for 79.50% of the information in the original data. The fitted PLS-DA model succeeded in classifying the samples. Correct classification rates were assessed by cross-validation.
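The PCA step described above can be sketched with numpy: mean-centre the data matrix, decompose it by SVD, and report the cumulative variance explained by the first components (the study reports 79.50% for three components on its 56-sample data; the matrix below is synthetic, so the fraction will differ).

```python
import numpy as np

def explained_variance(X, k):
    """Fraction of total variance captured by the first k principal
    components of the mean-centred matrix X (samples x variables)."""
    Xc = X - X.mean(axis=0)                  # mean centering
    s = np.linalg.svd(Xc, compute_uv=False)  # singular values
    var = s ** 2                             # variance per component
    return var[:k].sum() / var.sum()

rng = np.random.default_rng(0)
X = rng.normal(size=(56, 10))  # synthetic stand-in for the 56 chromatograms
frac = explained_variance(X, 3)
```

Peak-shifting and baseline correction would be applied to the chromatograms before this step; they are omitted here for brevity.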

6.
The context of validation for mass spectrometry (MS)-based methods is critically analysed. The focus is on fitness for purpose depending on the task of the method. Information is given on commonly accepted procedures for the implementation and acceptance of analytical methods as ‘confirmatory methods’ according to EU criteria, and on strategies for measurement. Attention is paid to the problem of matrix effects in the case of liquid chromatography-mass spectrometry-based procedures, since according to recent guidelines for bioanalytical method validation, there is a need to evaluate matrix effects during development and validation of LC-MS methods “to ensure that precision, selectivity and sensitivity will not be compromised”. Beneficial aspects of the qualification process to ensure the suitability of the MS analytical system are also evaluated and discussed.

7.
Multivariate statistical analysis of sediment data (information matrix 123 × 16) from the Gulf of Mexico, USA shows that the data structure is defined by four latent factors conditionally called “inorganic natural”, “inorganic anthropogenic”, “bioorganic” and “organic anthropogenic”, explaining 39.24%, 23.17%, 10.77% and 10.67% of the total variance of the data system, respectively. The receptor model obtained by applying the PCR approach makes it possible to apportion the contribution of each chemical component to the latent factor formation. A separation of the contribution of each chemical parameter is achieved within the frames of the “natural” and “anthropogenic” origin of the respective heavy metal or organic matter in the sediment formation process. This is a new approach compared with the traditional “one-dimensional” search with a limited number of preliminarily selected tracer components. The model suggested separates natural from anthropogenic influences and thus allows each participant in the sediment formation process to be used as a marker of either natural or anthropogenic effects. Received: 20 March 1999 / Revised: 1 June 1999 / Accepted: 3 June 1999

9.
The aim of the described method was the characterization of an “atrazine-mercaptopropionic acid” humic acid conjugate, which was required for the calibration of immunoassays to determine bound residues. In order to estimate the amount of bound triazine, an oxidative nucleophilic substitution reaction converting the covalently linked “atrazine-mercaptopropionic acid” to a new triazine derivative, “atrazine-methoxyethanol”, was performed. This cleavage product was quantified by gas chromatography. The conditions for this cleavage procedure were optimized and applied to the “atrazine-mercaptopropionic acid” humic acid conjugate and to a humic acid blank. The amount of bound “atrazine-mercaptopropionic acid” was calculated to be 16.6 ± 2.5 μmol triazine per gram humic acid, which is equivalent to 0.39 ± 0.07% atrazine. Received: 7 August 1997 / Accepted: 12 September 1997

10.
The need for “quality” in near patient testing (NPT) has been acknowledged since the mid 1980s. The commonest biochemical NPT device is the dry reagent strip or “dipstick” for urinalysis. Dipsticks may be read in three ways: against the color chart printed along the side of the bottle, using a benchreader (the color chart printed on a flat card) or using an electronic reader. This report uses the results of a urinalysis quality assurance (QA) program over 1998 to evaluate the “error” rates which occur using the three different reading methods. The QA samples are buffered aqueous solutions which are “spiked” to give concentrations midway between two color blocks for each analyte. Results are scored as ±1 for results one color block from the target value, ±2 for results two color blocks from it (defined as “error”) and ±3 for results three color blocks from it (defined as “gross error”). Analysis of the results shows that the error rates are similar when reading visually by either method, but greatly reduced when reading electronically. Some persisting errors when using the electronic reader are explained by observation studies. The study highlights the value of a urinalysis QA program for NPT urinalysis in understanding the error rates of this simple but ubiquitous test. Received: 10 July 2000 / Accepted: 10 July 2000
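The scoring rule above can be sketched directly: a result is scored by how many colour blocks it lies from the target, with two blocks counted as an “error” and three as a “gross error”. Block indices here are hypothetical.

```python
def score_reading(reported_block, target_block):
    """Signed score and category for one QA result, counted in colour
    blocks from the target value."""
    diff = reported_block - target_block
    if abs(diff) >= 3:
        return diff, "gross error"
    if abs(diff) == 2:
        return diff, "error"
    return diff, "acceptable"

def error_rate(readings, target):
    """Fraction of readings scored as error or gross error, i.e. the
    quantity compared across the three reading methods."""
    bad = sum(1 for r in readings
              if score_reading(r, target)[1] != "acceptable")
    return bad / len(readings)
```

Running `error_rate` separately over bottle-chart, benchreader and electronic-reader results reproduces the comparison reported in the study.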

11.
The development of an analytical procedure for speciation analysis of methylmercury in fish products is presented. The method is based on high-performance liquid chromatography hyphenated to inductively coupled plasma-mass spectrometry. The metrological approach is stressed in this paper, in order to provide reliable and comparable results. A complete uncertainty budget has been evaluated and the method has been validated by the use of a certified reference material. Moreover, the detection could rely on isotope dilution mass spectrometry, a powerful strategy capable of producing highly accurate results traceable to the “Système International d’Unités” and recognised by the “Comité Consultatif pour la Quantité de Matière” as a primary method of measurement. Presented at MEFNM 2008, September 2008, Budapest, Hungary.

12.
In 1931 the eminent chemist Fritz Paneth maintained that the modern notion of “element” is closely related to (and as “metaphysical” as) the concept of element used by the ancients (e.g., Aristotle). On that basis, the element chlorine (properly so-called) is not the elementary substance dichlorine, but rather chlorine as it is in carbon tetrachloride. The fact that pure chemicals are called “substances” in English (and closely related words are so used in other European languages) derives from philosophical compromises made by grammarians in the late Roman Empire (particularly Priscian [fl. ~520 CE]). When the main features of the constitution of isotopes became clear in the first half of the twentieth century, the formal (IUPAC) definition of a “chemical element” was changed. The features that are “essential” to being an element had previously been “transcendental” (“beyond the sphere of consciousness”) but, by the mid-twentieth century, the defining characteristics of elements, as such, had come to be understood in detail. This amounts to a shift in a “horizon of invisibility” brought about by progress in chemistry and related sciences. Similarly, chemical insight is relevant to currently open philosophical problems, such as the status of “the bundle theory” of the coherence of properties in concrete individuals.
Joseph E. Earley Sr. URL: http://www.georgetown.edu/faculty/earleyj/main.htm

13.
In this overview I discuss recent advances as well as outstanding issues in reduced dimensionality quantum approaches to reactive scattering. “Reduced dimensionality” in the present context signifies treating a subset of all degrees of freedom (the most strongly coupled ones) by rigorous quantum methods and treating the remaining (weakly coupled) degrees of freedom by a variety of approximate methods, ranging from simple, so-called energy shifts to more elaborate adiabatic treatments. The most widely used example of this approach is termed “J-shifting”, and this overview will concentrate on this method and discuss its application and generalization to both “direct” and “complex” reactions, exemplified by O(3P) + HCl and O(1D) + HCl, respectively. In addition, for O(3P) + HCl, resonances in the tunneling region, due to van der Waals wells, are discussed and their challenge to reduced dimensionality methods is stressed. Another new aspect of the reduced dimensionality treatment of polyatomic reactions is the need to describe anharmonicity in a consistent fashion. This is exemplified by the H + CH4 reaction. Received: 3 February 2002 / Accepted: 8 April 2002 / Published online: 19 August 2002
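The energy-shift idea behind J-shifting can be written compactly (a sketch of the standard formulation; the notation, with B‡ the rotational constant of the transition state, is assumed rather than taken from this abstract): the reaction probability for total angular momentum J is obtained from the rigorously computed J = 0 result by shifting the energy by the rigid-rotor energy of the transition state.

```latex
P^{J}(E) \;\approx\; P^{J=0}\!\left(E - E_{\mathrm{rot}}^{\ddagger}(J)\right),
\qquad
E_{\mathrm{rot}}^{\ddagger}(J) = B^{\ddagger}\, J(J+1),
% so that the thermal rate constant is assembled from shifted J = 0
% contributions weighted by degeneracy and a Boltzmann factor:
k(T) \;\propto\; \sum_{J} (2J+1)\, e^{-B^{\ddagger} J(J+1)/k_{B}T}\; k^{J=0}(T).
```

The approximation is expected to work best for “direct” reactions with a well-defined barrier; its generalization to “complex” reactions such as O(1D) + HCl is one of the issues the overview addresses.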

14.
A model of dissociative chemisorption on a surface with a square lattice was studied using the Monte Carlo method. The model is based on two chemisorption pathways: “direct” (nucleation of adsorption islands) and “indirect” (their growth). The development of the surface distribution picture of chemisorbed particles was found to depend significantly on the relative contribution of these two pathways (S_indir/S_dir).
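A toy Monte Carlo sketch of such a two-pathway model (all details here are assumptions, not the paper's algorithm): on a periodic square lattice, a “direct” event nucleates a new island by dissociatively adsorbing a particle pair onto two adjacent empty sites, while an “indirect” event grows an existing island by adsorbing onto an empty site next to an occupied one. The branching probability stands in for the contribution ratio S_indir/S_dir.

```python
import random

def simulate(n=20, steps=2000, p_indirect=0.5, seed=1):
    """Run the toy model and return the final surface coverage."""
    rng = random.Random(seed)
    occ = [[False] * n for _ in range(n)]

    def neighbours(i, j):
        # four nearest neighbours on a periodic square lattice
        return [((i + di) % n, (j + dj) % n)
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]

    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if occ[i][j]:
            continue
        if rng.random() < p_indirect:
            # indirect pathway: island growth next to an occupied site
            if any(occ[a][b] for a, b in neighbours(i, j)):
                occ[i][j] = True
        else:
            # direct pathway: nucleation of an adsorbed pair on two
            # adjacent empty sites
            a, b = rng.choice(neighbours(i, j))
            if not occ[a][b]:
                occ[i][j] = occ[a][b] = True
    return sum(sum(row) for row in occ) / n ** 2
```

Varying `p_indirect` changes whether the surface fills through many small islands (direct-dominated) or a few large ones (indirect-dominated); note that with `p_indirect=1.0` nothing ever adsorbs, since growth needs an existing island.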

15.
Stilbenes and zeranol are nonsteroidal estrogenic growth promoters which are banned in the European Union (EU) for use in food-producing animals by Council Directive 96/22/EC. A liquid chromatography–tandem mass spectrometry (LC-MS/MS) method was developed for the screening and confirmation of stilbenes (diethylstilbestrol, dienestrol, hexestrol) and resorcylic acid lactones (zeranol and its metabolites taleranol and zearalanone as well as the mycotoxins α-zearalenol, β-zearalenol and zearalenone) in bovine urine. The method permits the confirmation and quantification of stilbenes and resorcylic acid lactones at levels below 1 μg L−1 and 1.5 μg L−1, respectively. The validation was carried out according to Commission Decision 2002/657/EC, Chap. 3.1.3 “alternative validation”, by a matrix-comprehensive in-house validation concept. Decision limit CCα, detection capability CCβ, recovery, repeatability, within-laboratory reproducibility and the uncertainty of measurement were calculated. Furthermore, a factorial effect analysis was carried out to identify factors that have a significant influence on the method. Factors considered to be relevant for the method in routine analysis (e.g. operator, matrix condition, storage duration of the extracts before measurement, different cartridge lots, hydrolysis conditions) were systematically varied on two levels. The factorial analysis showed that different cartridge lots, storage durations and matrix conditions can exert a relevant influence on the method.
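The two-level factorial effect analysis mentioned above can be sketched as follows: each validation factor (operator, cartridge lot, storage duration, and so on) is coded as low (-1) or high (+1), and the main effect of a factor is the mean response at its high level minus the mean at its low level. The design and the recovery values below are synthetic.

```python
def main_effects(design, response):
    """Main effect of each factor in a two-level (+/-1) factorial design."""
    n_factors = len(design[0])
    effects = []
    for f in range(n_factors):
        hi = [y for row, y in zip(design, response) if row[f] == 1]
        lo = [y for row, y in zip(design, response) if row[f] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Full 2^2 design; hypothetical factors: (cartridge lot, storage duration)
design = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
response = [98.0, 97.0, 93.0, 92.0]   # e.g. recovery in %
effects = main_effects(design, response)
```

A large absolute effect (here the storage-duration factor) flags an influence that must be controlled in routine analysis, which is how the study identified cartridge lot, storage duration and matrix condition as relevant.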

16.
1 Introduction: Recently, the concepts and models of “pump” have been used frequently in studying the global cycles and varieties of elements in marine chemistry and marine biogeochemistry. For example, there are many “pump” concepts, including the concepts of s…

17.
The Pitzer method was used to calculate the pH values on the conventional and “true” scales for the TRIS–TRIS·HCl–NaCl–H2O buffer system in the 0–40 °C temperature region and the 0–4 NaCl molality interval. This buffer can be used as a standard for pH measurements in a wide range of ionic strengths. The conventional scale is used in cells without a salt bridge. The “true” scale is recommended for pH measurements using cells with a salt bridge. At the same concentrations of the buffer solution, the “true” scale is essentially transformed into the scale of the National Bureau of Standards (NBS) of the USA. Translated from Izvestiya Akademii Nauk. Seriya Khimicheskaya, No. 4, pp. 676–680, April, 2000.

18.
Implementation of the sustainable development principle is tested by the application of chemometrics in the field of environmental pollution. A data set consisting of the Cd, Pb, Cr, Zn, Cu, Mn, Ni, and Fe content of bottom sediment samples collected in the Odra River (Germany/Poland) is treated using cluster analysis (CA), principal component analysis (PCA), and source apportionment techniques. Cluster analysis clearly shows that pollution on the German bank is higher than on the Polish bank. Two latent factors extracted by PCA explain over 88 % of the total variance of the system, allowing identification of the dominant “semi-natural” and “anthropogenic” pollution sources in the river ecosystem. The complexity of the system is proved by MLR analysis of the absolute principal component scores (APCS). The apportioning clearly shows that Cd, Pb, Cr, Zn and Cu participate in an “anthropogenic” source profile, whereas Fe and Mn are “semi-natural”. Multiple regression analysis indicates that the fraction not described by the model varies for particular elements from 4.2 % (Mn) to 13.1 % (Cr). The element Ni participates to some extent in each source and, in this way, is neither purely “semi-natural” nor purely “anthropogenic”. Apportioning indicates that the whole heavy metal pollution in the investigated river reach is 12510.45 mg·kg−1. The contribution of pollutants originating from “anthropogenic” sources is 9.04 % and from “semi-natural” sources is 86.53 %.
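The APCS-MLR apportionment used above can be sketched compactly (a minimal numpy version of the generic technique, not the paper's exact procedure): standardise the data, extract principal-component scores, convert them to absolute principal component scores by subtracting the scores of an artificial zero-concentration sample, then regress each element on the APCS; the regression coefficients apportion each element between the latent sources. The data below are synthetic.

```python
import numpy as np

def apcs_mlr(X, n_factors):
    """Return intercepts plus per-source contributions for each column of X."""
    mean, std = X.mean(axis=0), X.std(axis=0)
    Z = (X - mean) / std                       # standardised data
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = Z @ Vt[:n_factors].T              # PC scores of the samples
    z0 = (0.0 - mean) / std                    # artificial zero-concentration sample
    apcs = scores - z0 @ Vt[:n_factors].T      # absolute PC scores
    A = np.column_stack([np.ones(len(X)), apcs])
    coef, *_ = np.linalg.lstsq(A, X, rcond=None)
    return coef                                # row 0: intercepts; rows 1..: sources

rng = np.random.default_rng(0)
X = np.abs(rng.normal(10.0, 2.0, size=(30, 5)))  # 30 samples, 5 metals
coef = apcs_mlr(X, n_factors=2)
```

For each element (column), comparing the magnitudes of the source rows of `coef` is what assigns it to an “anthropogenic” or “semi-natural” profile, and the intercept row captures the fraction not described by the model.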

19.
The paper contains a reply to ‘Validation of specificity in doping control: problems and prospects’ by N. M. Faber. Dr. Faber accuses anti-doping scientists of using inappropriate methods. The allegations refer to the procedure of substance identification, which according to Dr. Faber is based on subjective criteria (“visual inspection”). We demonstrate that, by contrast, it represents an objective, logically sound, and clearly defined procedure which strictly follows the logic of scientific reasoning.
