Similar Articles
20 similar articles found (search time: 125 ms)
1.
2.
Pharmaceutical compounds have been detected in the environment; they potentially arise from the discharge of excreted and improperly disposed medication from sewage treatment facilities. One potential technique for removing pharmaceuticals from water, and thereby minimizing environmental exposure to pharmaceutical residues, is an advanced oxidation process (AOP) involving titanium dioxide (TiO2) photocatalysis. To evaluate the extent to which UV/TiO2 processes have been studied for pharmaceutical degradation, a literature search using the keywords ‘titanium dioxide’, ‘photocatalysis’, ‘advanced oxidation processes’, ‘pharmaceuticals’ and ‘degradation’ was performed in the ISI Web of Knowledge, Scopus and ScienceDirect databases, covering articles published up to and including 23 November 2011. The degradation rates of pharmaceuticals under UV/TiO2 treatment depended on the type and amount of TiO2 loading, the pharmaceutical concentration, the presence of electron acceptors and the pH. Complete mineralization under particular experimental conditions was reported for some pharmaceuticals; however, some experiments reported the evolution of toxic intermediates during the photocatalytic process. It is concluded that the UV/TiO2 system is potentially a feasible wastewater treatment process, but careful consideration of the treatment time, the loading and the type of TiO2 (doped vs. undoped) used for a particular pharmaceutical is necessary for a successful application.
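As general background (not specific to any single study in this review), degradation rates in UV/TiO2 photocatalysis are commonly fitted to Langmuir–Hinshelwood kinetics,

$$ r = -\frac{dC}{dt} = \frac{k_r K C}{1 + K C}, $$

which at low pharmaceutical concentrations ($KC \ll 1$) reduces to pseudo-first-order behaviour, $\ln(C_0/C) = k_{\text{app}} t$ with $k_{\text{app}} = k_r K$.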

3.
In small-molecule drug discovery projects, the receptor structure is not always available. In such cases it is enormously useful to be able to align known ligands in the way they bind in the receptor. Here we present an algorithm for the alignment of multiple small-molecule ligands. The algorithm takes pre-generated conformers as input and proposes aligned assemblies of the ligands. It consists of two stages: the first performs alignments for each pair of ligands; the second uses the results of the first to build up multiple-ligand alignment assemblies through a novel iterative procedure. The scoring functions are improved versions of the one described in our previous work. We have compared our results with some recent publications; while an exact comparison is impossible, it is clear that our algorithm is fast and produces very competitive results.
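A minimal sketch of the two-stage strategy (stage 1 pairwise, stage 2 iterative assembly) is given below. The pairwise scorer is left as a stub and all names are hypothetical, since the abstract does not specify the scoring functions:

```python
def align_pair(conformers_a, conformers_b):
    """Stage 1 (stub): best pairwise alignment of two conformer sets.

    Returns (score, pose); a real implementation would score overlap of
    pharmacophoric features over all conformer pairs.
    """
    raise NotImplementedError

def build_assembly(ligands, pair_result):
    """Stage 2: grow a multiple-ligand alignment from pairwise results.

    pair_result maps frozenset({i, j}) -> (score, pose) from stage 1.
    """
    # seed with the ligand whose summed pairwise scores are highest
    totals = {i: sum(pair_result[frozenset((i, j))][0]
                     for j in ligands if j != i) for i in ligands}
    assembly = [max(totals, key=totals.get)]
    remaining = set(ligands) - set(assembly)
    while remaining:
        # iteratively add the ligand that aligns best to any placed member
        best = max(remaining,
                   key=lambda j: max(pair_result[frozenset((i, j))][0]
                                     for i in assembly))
        assembly.append(best)
        remaining.remove(best)
    return assembly  # order in which ligands join the aligned assembly
```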

4.
After the International Laboratory Accreditation Cooperation (ILAC) resolved in 2004 to conduct accreditation of reference material producers according to ISO Guide 34, ‘General requirements for the competence of reference material producers’, in combination with ISO/IEC 17025, ‘General requirements for the competence of testing and calibration laboratories’, ISO/REMCO, the ISO Committee on Reference Materials, decided in 2005 to revise ISO Guide 34 to align it more closely with ISO/IEC 17025 and to clarify certain issues for accreditors and for producers seeking accreditation, without adding new requirements. Moreover, the publication in 2007 of ISO/IEC Guide 99, ‘International vocabulary of metrology—Basic and general concepts and associated terms (VIM)’, triggered additional adaptations of the guide.

5.
The interpretation of the results of proficiency tests by the use of mixture models is described. The data are interpreted as a sample from a mixture of several normal populations. The calculation of the statistics (the means, variances and proportions of each component) is accomplished by means of the ‘EM’ algorithm. The method has several advantages over those previously advanced, principally that the algorithm is fast and easy to execute. Examples from proficiency testing are discussed.
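As general background, a minimal EM fit of a k-component mixture of normals (a textbook implementation, not the authors' code; initialisation and safeguards are deliberately crude) might look like:

```python
import numpy as np

def em_normal_mixture(x, k, n_iter=200, tol=1e-8, seed=0):
    """Fit a k-component mixture of normals to 1-D data x by EM."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = x.size
    # crude initialisation: random means from the data, pooled variance
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    ll_old = -np.inf
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | x_i)
        dens = (pi / np.sqrt(2 * np.pi * var)
                * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update proportions, means and variances
        nj = r.sum(axis=0)
        pi = nj / n
        mu = (r * x[:, None]).sum(axis=0) / nj
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nj
        ll = np.log(dens.sum(axis=1)).sum()  # log-likelihood (monotone in EM)
        if ll - ll_old < tol:
            break
        ll_old = ll
    return pi, mu, var

# e.g. proportions, means, variances = em_normal_mixture(results, k=2)
```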

6.
It is proposed that molecular phenomena may only be described within the framework of the Complementarity Principle (‘CP’), and that scientific controversy may originate in the essential incompatibility of complementary representations. Complementarity based on the temporal Uncertainty Principle leads to new insights into transition state theory, microscopic reversibility and the Curtin-Hammett Principle. An empirical application of the ‘CP’ to the structural theory leads to a revision of present concepts of ‘reaction dynamics’, with the Principle of Least Nuclear Motion (‘PLNM’) emerging as a general alternative to electronic theories of reactivity. In fact, it is argued that the ‘PLNM’ is a better basis for the Woodward-Hoffmann rules than is orbital symmetry. A more flexible approach to organic reaction mechanisms is thus indicated. Also, as the basis of the structural theory is fundamentally uncertain, and the present theory of X-ray diffraction apparently incompatible with the ‘UP’, a reinterpretation of the Bragg equation has been attempted.

7.
In the evaluation of measurement uncertainty, the uncertainty budget is usually used to identify the dominant terms contributing to the uncertainty of the output estimate. Although a feature of the GUF (GUM uncertainty framework) method, it is also recommended as a qualitative tool in the Monte Carlo method (MCM), using ‘nonlinear’ equivalents of the uncertainty contributions and sensitivity coefficients. In this paper, the use of ‘linear’ and ‘nonlinear’ parameters is discussed. It is shown that only when the standard uncertainty of the output estimate is nearly equal to the square root of the sum of the squares of the individual uncertainty contributions will the latter be a reliable tool for detecting the degree to which each input quantity contributes to the uncertainty of the measurand.
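In GUM notation, the uncertainty budget rests on linear propagation,

$$ u^2(y) = \sum_{i=1}^{N} c_i^2\, u^2(x_i), \qquad c_i = \frac{\partial f}{\partial x_i}, $$

and the condition stated above amounts to requiring

$$ u(y) \approx \sqrt{\textstyle\sum_i u_i^2(y)}, \qquad u_i(y) = |c_i|\, u(x_i), $$

i.e. the individual contributions $u_i(y)$ are trustworthy indicators only when nonlinearity and correlation effects are negligible.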

8.
The well-known divergence between the present ‘state of the art’ of thermogravimetry and industrial requirements is discussed. Sources of error are analyzed, and the optimization of measuring conditions is discussed with regard to the problems associated with static and dynamic (flow) atmospheres and with interactions between materials and gases or vapors. Recommendations for gas-flow control systems and vapor sources are given. Thermal stability and the kinetics of gas-evolving, reversible thermal decompositions of solids are discussed, and the scope of TG-derived kinetics for practical use is examined. Some new characteristic points of TG curves are proposed and defined, e.g. the ‘procedure-independent decomposition temperature’ and the ‘augmented decomposition temperature’ (obtained under pseudo-equilibrium conditions).
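For background, TG-derived kinetics of the kind examined here conventionally rest on the single-step rate law

$$ \frac{d\alpha}{dt} = A \exp\!\left(-\frac{E_a}{RT}\right) f(\alpha), \qquad \frac{d\alpha}{dT} = \frac{A}{\beta}\exp\!\left(-\frac{E_a}{RT}\right) f(\alpha), $$

where $\alpha$ is the conversion, $f(\alpha)$ the kinetic model and $\beta = dT/dt$ the heating rate; the interest in ‘procedure-independent’ characteristic points stems from the strong dependence of such fits on the measuring conditions.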

9.
Desimoni and Brunetti raise some questions about the use of the Eurachem/CITAC guide, because it does not discuss an ISO recommendation that, before a test is performed, it should be decided whether it is to be a test for conformity or a test for non-conformity. In response, it is pointed out that although this recommendation is not discussed explicitly, it is of necessity covered by the decision rule, which describes how the measurement uncertainty will be taken into consideration when accepting or rejecting a product according to its specification and the result of a measurement. In addition, they propose the introduction of an ‘inconclusive’ zone. We do not think that this is necessary, since the Eurachem/CITAC guide takes the view that action on rejection should be covered by the ‘decision rule’, and this can make equivalent provision for confirmation or interpretation.
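As an illustration (not a quotation from the guide): for a measured result $x$ with expanded uncertainty $U$ and an upper specification limit $L$, a typical decision rule reads

$$ x + U \le L \;\Rightarrow\; \text{conformity demonstrated}, \qquad x - U > L \;\Rightarrow\; \text{non-conformity demonstrated}, $$

and the intermediate region, which Desimoni and Brunetti would label ‘inconclusive’, is precisely what the decision rule itself must dispose of, e.g. by confirmation measurements or interpretation.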

10.
Interferences (overlaps) occurring when lines of other elements affect either peak or background measurements can cause errors in quantitative WD analysis, but may be minimised by suitable choices of analysis conditions such as spectrometer crystal, background offsets, and pulse-height analyser settings. Computer spectrum simulation is much more effective than reference to wavelength tables for investigating interferences. The ‘Virtual WDS’ simulation program developed by the present authors, hitherto applied only to ‘ordinary’ elements (Z ≥ 11), has been extended to light elements, for which evaporated multilayers are used in place of true crystals. ‘Virtual WDS’ utilises experimentally recorded light-element K spectra and the L and M spectra of heavier elements in the same wavelength range. It is impractical to record all high-order peaks, so computed line profiles are used, with widths and intensities interpolated from a limited set of observations. The relative positions of first- and higher-order peaks are affected significantly by the refractive index of the multilayer, requiring modification of the Bragg equation. Suppression of high orders by pulse-height analysis is less effective than for ‘normal’ wavelengths, owing to the breadth of the pulse-height distribution for low X-ray energies. Simulation using a Gaussian expression aids optimisation of threshold and window-width settings.
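The refraction correction mentioned here takes the standard form for multilayer dispersion elements (general X-ray optics background; $\bar{\delta}$ is the unit decrement of the mean refractive index of the multilayer):

$$ n\lambda = 2d\sin\theta\left(1 - \frac{\bar{\delta}}{\sin^2\theta}\right) \approx 2d\sin\theta\left(1 - \frac{4 d^2 \bar{\delta}}{n^2 \lambda^2}\right), $$

so the positions of first- and higher-order peaks are no longer in exact integer proportion.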

11.
A popular first step in the problem of structure-based, de novo molecule design is to identify regions where specific functional groups or chemical entities would be expected to interact strongly. When the three-dimensional structure of the receptor is not available, it may be possible to derive a pharmacophore giving the three-dimensional relationships between such chemical groups. The task then is to design synthetically feasible molecules which not only contain the required groups, but which can also position them in the desired relative orientation. One way to do this is to first link the groups using an acyclic chain. We have investigated the application of the tweak algorithm [Shenkin, P.S. et al., Biopolymers, 26 (1987) 2053] for generating families of acyclic linkers. These linking structures can subsequently be braced using a ring-joining algorithm [Leach, A.R. and Lewis, R.A., J. Comput. Chem., 15 (1994) 233], giving rise to an even wider variety of molecular skeletons for further studies.

12.
In the slow evolution of the International vocabulary of metrology (VIM), the first concept of ‘quantity’ has now been divided generically into ‘ordinal quantity’ and a coordinate primitive without definition and term. An analysis of the concepts by their characteristics is made, and the nature of inheritance is discussed in response to a recent communication in this journal. A completion of the initiated generic division of ‘quantity’ is suggested, and a neoterm for the sister of ‘ordinal quantity’ is offered on the basis of two proposals.

13.
The use of simple linear mathematical models to estimate chemical properties is not a new idea. Albert Einstein used very simple ‘gravity-like’ forces to explain the capillarity of different liquids in 1900–1901. Today such models are used in more complicated situations, and a great many have been developed to analyse interactions between proteins and their ligands. This is not surprising, since proteins are too complicated to model accurately without lengthy numerical analysis, and simple models often do at least as good a job in predicting binding constants as much more computationally expensive methods. One hundred years after Einstein’s ‘miraculous year’ in which he transformed physics, it is instructive to recall some of his even earlier work. As approximations, ‘scoring functions’ are excellent, but it is dangerous to read too much into them. A few cautionary tales are presented for the beginner to the field of ligand affinity prediction by linear models.
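A generic linear scoring function of the kind discussed has the form (the descriptors shown are illustrative only):

$$ \Delta G_{\text{bind}} \approx \beta_0 + \beta_1 N_{\text{hbond}} + \beta_2 A_{\text{lipo}} + \beta_3 N_{\text{rot}} + \cdots, $$

with the coefficients $\beta_i$ fitted by least squares to measured affinities. The point of the cautionary tales is that such coefficients interpolate well but need not carry the physical meaning their names suggest.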

14.
Although auxins were the first type of plant hormone to be identified, little is known about the molecular mechanism of this important class of plant hormones. We present a classification of a set of about 50 compounds with measured auxin activities according to their interaction properties. Four classes of compounds were defined: strongly active, weakly active with weak antiauxin behaviour, inactive, and inhibitory. All compounds were modeled in two low-energy conformations, P and T, chosen to give the best match to the planar and tilted conformations, respectively, of indole-3-acetic acid. Each set of conformers was superimposed separately using several different alignment schemes. Molecular interaction energy fields were computed for each molecule with five different chemical probes and then compared by computing similarity indices. The similarity analysis showed that the classes are on average distinguishable, with better differentiation achieved for the T conformers than for the P conformers, indicating that the T conformation might be the active one. Furthermore, a screening procedure was developed that could distinguish compounds with auxin activity from inactive compounds and from most antiauxins using the T conformers. The classifications rationalize ambiguities in activity data found in the literature and should be of value in predicting the activities of new plant growth substances and herbicides.
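Similarity between two molecular interaction fields $E_A$ and $E_B$ sampled on a common grid is often quantified with the Hodgkin index (given here as general background; the abstract does not name the specific index used):

$$ S_{AB} = \frac{2\sum_k E_A(k)\, E_B(k)}{\sum_k E_A(k)^2 + \sum_k E_B(k)^2}, \qquad -1 \le S_{AB} \le 1. $$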

15.
Capillary electrophoresis–mass spectrometry (CE–MS) is a powerful technique for the analysis of small soluble compounds in biological fluids. A major drawback of CE is the poor migration time reproducibility, which makes it difficult to combine data from different experiments and correctly assign compounds. A number of alignment algorithms have been developed but not all of them can cope with large and irregular time shifts between CE–MS runs. Here we present a genetic algorithm designed for alignment of CE–MS data using accurate mass information. The utility of the algorithm was demonstrated on real data, and the results were compared with one of the existing packages. The new algorithm showed a significant reduction of elution time variation in the aligned datasets. The importance of mass accuracy for the performance of the algorithm was also demonstrated by comparing alignments of datasets from a standard time-of-flight (TOF) instrument with those from the new ultrahigh resolution TOF maXis (Bruker Daltonics).
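A toy sketch of the idea follows, under the simplifying assumption of a linear time warp t' = a·t + b (the published algorithm handles larger, irregular shifts and is considerably more elaborate; all names and parameter values are illustrative):

```python
import numpy as np

def fitness(params, ref, run, mz_tol=0.005, t_tol=0.2):
    """Count reference features matched after warping run times t' = a*t + b.

    ref, run: (n, 2) arrays with columns (m/z, migration time).
    """
    a, b = params
    t_warp = a * run[:, 1] + b
    hits = 0
    for mz, t in ref:
        ok = (np.abs(run[:, 0] - mz) < mz_tol) & (np.abs(t_warp - t) < t_tol)
        hits += ok.any()
    return hits

def ga_align(ref, run, pop_size=40, gens=60, seed=0):
    """Evolve (scale, offset) of a linear time warp maximising matches."""
    rng = np.random.default_rng(seed)
    # initial population of (scale, offset) candidates around the identity warp
    pop = np.column_stack([rng.normal(1.0, 0.1, pop_size),
                           rng.normal(0.0, 1.0, pop_size)])
    for _ in range(gens):
        scores = np.array([fitness(p, ref, run) for p in pop])
        # tournament selection between random pairs
        idx = rng.integers(0, pop_size, (pop_size, 2))
        winners = np.where(scores[idx[:, 0]] >= scores[idx[:, 1]],
                           idx[:, 0], idx[:, 1])
        parents = pop[winners]
        # arithmetic crossover with the reversed parent list + Gaussian mutation
        w = rng.random((pop_size, 1))
        children = w * parents + (1 - w) * parents[::-1]
        children += rng.normal(0.0, [0.01, 0.05], (pop_size, 2))
        # elitism: carry over the best individual from this generation
        children[0] = pop[scores.argmax()]
        pop = children
    scores = np.array([fitness(p, ref, run) for p in pop])
    return pop[scores.argmax()]  # best (scale, offset)

# e.g. best_scale, best_offset = ga_align(ref_features, run_features)
```

The fitness rewards warps that bring accurate-mass-matched features into time agreement, which is where the mass accuracy of the TOF instrument pays off.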

16.
In both European legislation relating to the testing of food and the recommendations of the Codex Alimentarius Commission, there is a movement away from specifying particular analytical methods towards specifying performance criteria to which any methods used must adhere. This ‘criteria approach’ has hitherto been based on the features traditionally used to describe analytical performance. This paper proposes replacing the traditional features, namely accuracy, applicability, detection limit and limit of determination, linearity, precision, recovery, selectivity and sensitivity, with a single specification, the uncertainty function, which tells us how the uncertainty varies with concentration. The uncertainty function can be used in two ways, either as a ‘fitness function’, which describes the uncertainty that is fit for purpose, or as a ‘characteristic function’ that describes the performance of a defined method applied to a defined range of test materials. Analytical chemists reporting the outcome of method validations are encouraged to do so in future in terms of the uncertainty function. When no uncertainty function is available, existing traditional information can be used to define one that is suitable for ‘off-the-shelf’ method selection. Some illustrative examples of the use of these functions in methods selection are appended.
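One convenient parametric form for such an uncertainty function, offered here as an illustration rather than as the paper's own choice, is

$$ u(c) = \sqrt{\alpha^2 + (\beta c)^2}, $$

where $\alpha$ captures the constant absolute uncertainty near zero concentration (detection-limit behaviour) and $\beta$ the asymptotic relative uncertainty at high concentration; a single fitted curve of this kind can stand in for detection limit, precision and recovery taken separately.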

17.
Most base units in the SI relate to specific sensory qualities our body is able to observe: space, heat, brightness, etc. The base unit ‘mole’ incorporates intellectual insight: the atomistic perception of the world. This perception is a quintessence of over 300 years of scientific research. The quintessence, from Boyle’s ‘The Sceptical Chymist’ to Perrin’s Nobel Prize in 1926 and Pauling’s ‘The Nature of the Chemical Bond’ in 1939, results in the conclusion that the base unit of the SI quantity ‘amount of substance’ is not the mole but the dimensionless entity.

18.
We present a system, FLOG (Flexible Ligands Oriented on Grid), that searches a database of 3D coordinates to find molecules complementary to a macromolecular receptor of known 3D structure. The philosophy of FLOG is similar to that reported for DOCK [Shoichet, B.K. et al., J. Comput. Chem., 13 (1992) 380]. In common with that system, we use a match-center representation of the volume of the binding cavity and a clique-finding algorithm to generate trial orientations of each candidate ligand in the binding site. We also use a grid representation of the receptor to assess the fit of each orientation. We have introduced a number of novel features within this paradigm. First, we address ligand flexibility by including up to 25 explicit conformations of each structure in our databases. Nonhydrogen atoms in each database entry are assigned one of seven atom types (anion, cation, donor, acceptor, polar, hydrophobic and other) based on their local bonded chemical environments. Second, we have devised a new grid-based scoring function compatible with this heavy-atom representation of the ligands. This includes several potentials (electrostatic, hydrogen bonding, hydrophobic and van der Waals) calculated from the locations of the receptor atoms. Third, we have improved the fitting stage of the search. Initial dockings are generated with a more efficient clique-finding algorithm. This new algorithm includes the concept of essential points, match centers that must be paired with a ligand atom. We also introduce the use of a rapid simplex-based rigid-body optimizer to refine the orientations. We demonstrate, using dihydrofolate reductase as a sample receptor, that the FLOG system can select known inhibitors from a large database of drug-like compounds.
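The paper defines its own grid-based potentials; the following minimal sketch shows only the generic grid-lookup step common to such programs, namely trilinear interpolation of a precomputed receptor potential at ligand atom positions (all names are illustrative):

```python
import numpy as np

def trilinear_score(grid, origin, spacing, coords):
    """Sum a precomputed receptor potential over ligand atom positions.

    grid    : (nx, ny, nz) array, potential for one ligand atom type
    origin  : (3,) Cartesian coordinates of grid[0, 0, 0]
    spacing : grid step (same units as coords)
    coords  : (n_atoms, 3) ligand atom positions (must lie inside the grid)
    """
    g = (np.asarray(coords, dtype=float) - origin) / spacing
    i0 = np.floor(g).astype(int)   # lower corner of the enclosing cell
    f = g - i0                     # fractional position within the cell
    total = 0.0
    for c, w in zip(i0, f):
        s = 0.0
        for dx in (0, 1):
            for dy in (0, 1):
                for dz in (0, 1):
                    weight = ((w[0] if dx else 1 - w[0]) *
                              (w[1] if dy else 1 - w[1]) *
                              (w[2] if dz else 1 - w[2]))
                    s += weight * grid[c[0] + dx, c[1] + dy, c[2] + dz]
        total += s
    return total  # grid score of this ligand orientation
```

Precomputing the potentials on a grid is what makes scoring thousands of trial orientations per candidate ligand affordable.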

19.
We have investigated the effect of ‘Graham’s salt’, a phosphorus-containing flame retardant, applied to cotton fabric. The optimum loading of this salt to impart flame retardancy was determined to be about 36.78–41.31 g of salt per 100 g of cotton woven fabric (plain weave, 144 g m−2). Thermogravimetry of pure cotton, of the treated cotton fabric and of the pure salt was carried out, and the curves were compared and discussed. They reveal that the salt, acting as a dehydrating agent, sensitized the thermal decomposition of the treated substrate. The results support both the ‘chemical theory’ and the ‘coating theory’, as evidenced by the formation of a carbonaceous residue on the cellulosic substrate during combustion.

20.
In this paper I expand Eric Scerri’s notion of Popper’s naturalised approach to reduction in chemistry and investigate what its consequences might be. I argue that Popper’s naturalised approach has a number of interesting consequences when applied to the reduction of chemistry to physics. One of them is that it prompts us to look at a ‘bootstrap’ approach to quantum chemistry, based on specific quantum-theoretical theorems and practical considerations that turn quantum ‘theory’ into quantum ‘chemistry’ proper. This approach allows us to investigate some of the principles that drive theory formation in quantum chemistry. These ‘enabling theorems’ place certain limits on the explanatory latitude enjoyed by quantum chemists, and form a first step toward establishing the relationship between chemistry and physics in more detail.
