Similar Literature
20 similar documents found
1.
Molecular simulation is a mature and promising computer-based experimental technique. Modern computers provide fast, stable graphics processing, so results can be presented as both data and graphics, and database technology makes it convenient to store and retrieve molecular models and polymer fragments. Visualized three-dimensional graphics help students grasp the abstract concepts of polymer science more easily. Molecular dynamics calculations are based on Newtonian mechanics: a model with periodic boundary conditions is built under a given force field, and through geometry optimization and dynamics runs students gain an intuitive picture of the short-range structure and the condensed-state (aggregated) structure of polymers. The calculations yield microscopic phase structures that cannot be obtained in ordinary polymer experiments, enriching the experimental content and opening a new area for polymer science laboratory courses.
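To make the setup concrete, the following toy script is a minimal sketch (not the authors' teaching material) of a velocity-Verlet molecular dynamics run for a Lennard-Jones system under periodic boundary conditions; every parameter is an illustrative value in reduced units.

```python
# Minimal MD sketch in reduced Lennard-Jones units: velocity-Verlet integration
# with periodic boundary conditions via the minimum-image convention.
import numpy as np

n, box, dt, steps = 64, 8.0, 0.005, 200            # particles, box length, time step, steps
rng = np.random.default_rng(0)

grid = (np.arange(4) + 0.5) * (box / 4)            # 4 x 4 x 4 cubic lattice start
pos = np.array(np.meshgrid(grid, grid, grid)).T.reshape(-1, 3)
vel = rng.normal(0.0, 1.0, size=(n, 3))
vel -= vel.mean(axis=0)                            # remove net drift

def forces(pos):
    """Pairwise Lennard-Jones forces with the minimum-image convention."""
    f = np.zeros_like(pos)
    for i in range(n - 1):
        d = pos[i] - pos[i + 1:]
        d -= box * np.round(d / box)               # periodic boundary conditions
        r2 = (d * d).sum(axis=1)
        inv6 = 1.0 / r2 ** 3
        fij = ((48.0 * inv6 * inv6 - 24.0 * inv6) / r2)[:, None] * d
        f[i] += fij.sum(axis=0)
        f[i + 1:] -= fij
    return f

f = forces(pos)
for _ in range(steps):                             # velocity-Verlet loop
    vel += 0.5 * dt * f
    pos = (pos + dt * vel) % box
    f = forces(pos)
    vel += 0.5 * dt * f

print("kinetic energy per particle:", 0.5 * (vel ** 2).sum() / n)
```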

2.
We present results of the theoretical study and numerical calculation of the dynamics of molecular liquids based on the combination of the memory equation formalism and the reference interaction site model (RISM). Memory equations for the site-site intermediate scattering functions are studied in the mode-coupling approximation for the first-order memory kernels, while equilibrium properties such as site-site static structure factors are deduced from RISM. The results include the temperature-density (pressure) dependence of translational diffusion coefficients D and orientational relaxation times tau for acetonitrile in water, methanol in water, and methanol in acetonitrile, all in the limit of infinite dilution. Calculations are performed over the range of temperatures and densities employing the extended simple point charge model for water and optimized site-site potentials for acetonitrile and methanol. The theory is able to reproduce qualitatively all main features of temperature and density dependences of D and tau observed in real and computer experiments. In particular, anomalous behavior, i.e., the increase in mobility with density, is observed for D and tau of methanol in water, while acetonitrile in water and methanol in acetonitrile do not show deviations from the ordinary behavior. The variety exhibited by the different solute-solvent systems in the density dependence of the mobility is interpreted in terms of the two competing origins of friction, which interplay with each other as density increases: the collisional and dielectric frictions which, respectively, increase and decrease with increasing density.

3.
When an atom is incorporated into a molecule or a crystal, its X-ray spectrum undergoes characteristic changes, the study of which leads to important information on the nature of the chemical bonding and on the electronic structure in a substance. A number of examples are given to illustrate the possibilities of the X-ray spectroscopy of bonded atoms. Special attention is given to the displacement of the Kα lines, from which conclusions can be drawn regarding the charge on a bonded atom, as well as to the investigation of the emission bands resulting from valence electron transitions, which yields information on the energy band structure of the solid. X-ray spectroscopic studies on free molecules and theoretical work on the calculation of the molecular orbitals of simple molecules are finally reported.

4.
The calculation of the rotational diffusion tensor and NMR relaxation times of quasi-rigid macromolecular structures with atomic detail can be made by means of bead/shell models. Calculating these properties for a single structure with rigorous hydrodynamic methods requires a moderate amount of computing time. In applications where such a calculation must be repeated for many structures, a faster method would be welcome. We have studied the effect of introducing a simplifying approximation in the treatment of hydrodynamic interaction, comparing the rigorous and approximate results for a set of 30 globular proteins. When the NMR relaxation times are combined in some relative form, the rigorous and approximate results are practically coincident. For absolute quantities, such as the correlation time, we find that the bias introduced by the approximation can be closely predicted and corrected. The differences between the results of the approximate-corrected and rigorous procedures are not greater than usual experimental errors, and the typical computing time is reduced from 5 min to 1 s.
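As a point of reference for the correlation time mentioned above: the overall tumbling time that enters NMR relaxation is commonly summarized from the rotational diffusion tensor as tau_c = 1/(6 D_iso). The snippet below is only an illustration with made-up tensor values, not the authors' bead/shell code.

```python
# Illustrative only: isotropic correlation time from a (hypothetical) rotational
# diffusion tensor, tau_c = 1 / (6 * D_iso) with D_iso = trace(Dr) / 3.
import numpy as np

Dr = np.diag([1.8e7, 2.1e7, 2.4e7])      # hypothetical principal values in s^-1
D_iso = np.trace(Dr) / 3.0
tau_c = 1.0 / (6.0 * D_iso)
print(f"tau_c = {tau_c * 1e9:.2f} ns")   # roughly 8 ns for these made-up values
```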

5.
A program was written in the Basic language to handle scintillation counting data by a time-sharing computer. Explanation is given for the use of the program to facilitate the most common types of calculations, as well as the automatic data handling between the spectrometer and the remote computer. The merits of the system are discussed and a sample of the operation and the output is given.

6.
Using the method of target factor analysis (TFA) described by Malinowski and Howery a computer program has been developed to study different sets of gas chromatographic retention data. Physico-chemical, topological and uniqueness parameters have been found to be basic factors to describe solute behaviour problems. Factor analytical solutions have been used to reproduce the data matrices and to make predictions based on best sets of basic factors. The mean absolute error in the reproduction step is between 1.72 retention index units (i.u.) for a relatively simple matrix consisting of retention indices of alcohols and 7.36 i.u. for a combined data matrix of alcohol, aldehyde and ketone retention indices. TFA has also been used to classify solutes based on their retention behaviour. Alkanes have been classified from cycloalkanes, alkanes from alkenes, and alcohols from aldehydes and ketones using only their retention data and a special kind of uniqueness vector.
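The reproduction step mentioned above can be illustrated with plain abstract factor analysis; the sketch below (not Malinowski's TFA program, and with a made-up retention-index matrix) rebuilds the data from its leading factors and reports the mean absolute reproduction error.

```python
# Sketch of factor-analytic data reproduction: keep k leading abstract factors
# from the SVD and compare the rebuilt matrix with the original.
import numpy as np

R = np.array([[1050.0, 1132.0, 1210.0],   # hypothetical retention indices
              [ 962.0, 1041.0, 1118.0],   # rows: solutes, columns: stationary phases
              [ 877.0,  953.0, 1029.0],
              [1123.0, 1208.0, 1287.0]])

k = 2                                     # number of basic factors retained
U, s, Vt = np.linalg.svd(R, full_matrices=False)
R_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
print("mean absolute reproduction error (i.u.):", float(np.abs(R - R_hat).mean()))
```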

7.
We developed a novel parallel algorithm for large-scale Fock matrix calculation with small locally distributed memory architectures, and named it the "RT parallel algorithm." The RT parallel algorithm actively involves the concept of integral screening, which is indispensable for reduction of computing times with large-scale biological molecules. The primary characteristic of this algorithm is parallel efficiency, which is achieved by well-balanced reduction of both communicating and computing volume. Only the density matrix data necessary for Fock matrix calculations are communicated, and the data once communicated are reutilized for calculations as many times as possible. The RT parallel algorithm is a scalable method because required memory volume does not depend on the number of basis functions. This algorithm automatically includes a partial summing technique that is indispensable for maintaining computing accuracy, and can also include some conventional methods to reduce calculation times. In our analysis, the RT parallel algorithm had better performance than other methods for massively parallel processors. The RT parallel algorithm is most suitable for massively parallel and distributed Fock matrix calculations for large-scale biological molecules with thousands of basis functions or more.
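For orientation, the sketch below shows the general idea of density-weighted integral screening in a Coulomb-matrix build. It is not the RT parallel algorithm (there is no parallelism or communication scheme here), and the "integrals" are random stand-ins, so the Schwarz-type bound is schematic only.

```python
# Toy screening sketch: a contribution P[k,l] * (ij|kl) to the Coulomb matrix is
# skipped when the Schwarz-type estimate sqrt((ij|ij)) * sqrt((kl|kl)) * |P[k,l]|
# falls below a threshold. The "integrals" here are random numbers, not real ERIs.
import numpy as np

nbf, thresh = 8, 1e-3                             # basis size, illustrative threshold
rng = np.random.default_rng(0)
eri = rng.random((nbf, nbf, nbf, nbf)) ** 8       # stand-in (ij|kl), many tiny values
P = rng.random((nbf, nbf))
P = 0.5 * (P + P.T)                               # stand-in (symmetric) density matrix

Q = np.sqrt(np.einsum('ijij->ij', eri))           # Schwarz-type factors sqrt((ij|ij))
J = np.zeros((nbf, nbf))
skipped = 0
for i in range(nbf):
    for j in range(nbf):
        for k in range(nbf):
            for l in range(nbf):
                if Q[i, j] * Q[k, l] * abs(P[k, l]) < thresh:
                    skipped += 1                  # screened out: never computed or communicated
                    continue
                J[i, j] += P[k, l] * eri[i, j, k, l]

print(f"screened out {skipped} of {nbf**4} contributions")
```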

8.
Automation has become an increasingly popular tool for synthetic chemists over the past decade. Recent advances in robotics and computer science have led to the emergence of automated systems that execute common laboratory procedures including parallel synthesis, reaction discovery, reaction optimization, time course studies, and crystallization development. While such systems offer many potential benefits, their implementation is rarely automatic due to the highly specialized nature of synthetic procedures. Each reaction category requires careful execution of a particular sequence of steps, the specifics of which change with different conditions and chemical systems. Careful assessment of these critical procedural requirements and identification of the tools suitable for effective experimental execution are key to developing effective automation workflows. Even then, it is often difficult to get all the components of an automated system integrated and operational. Data flows and specialized equipment present yet another level of challenge. Unfortunately, the pain points and process of implementing automated systems are often not shared or remain buried deep in the SI. This perspective provides an overview of the current state of automation of synthetic chemistry at the benchtop scale with a particular emphasis on core considerations and the ensuing challenges of deploying a system. Importantly, we aim to reframe automation as decidedly not automatic but rather an iterative process that involves a series of careful decisions (both human and computational) and constant adjustment.

The process of automating chemistry involves a wide variety of considerations that are often overlooked.

9.
The availability of cryogenically cooled probes permits routine acquisition of data from low-sensitivity pulse sequences such as INADEQUATE and 1,1-ADEQUATE. We demonstrate that the use of cryoprobe-generated 1,1-ADEQUATE data in conjunction with HMBC dramatically improves computer-assisted structure elucidation (CASE) in terms of both the speed and the accuracy of structure generation. In this study data were obtained on two dissimilar natural products and subjected to CASE analysis with and without the incorporation of two-bond specific data. Dramatic improvements in both structure calculation times and the number of candidate structures were observed with the inclusion of the two-bond specific data. Copyright © 2010 John Wiley & Sons, Ltd.

10.
Even though NMR has found countless applications in the field of small molecule characterization, there is no standard file format available for the NMR data relevant to structure characterization of small molecules. A new format is therefore introduced to associate the NMR parameters extracted from 1D and 2D spectra of organic compounds to the proposed chemical structure. These NMR parameters, which we shall call NMReDATA (for nuclear magnetic resonance extracted data), include chemical shift values, signal integrals, intensities, multiplicities, scalar coupling constants, lists of 2D correlations, relaxation times, and diffusion rates. The file format is an extension of the existing Structure Data Format, which is compatible with the commonly used MOL format. The association of an NMReDATA file with the raw and spectral data from which it originates constitutes an NMR record. This format is easily readable by humans and computers and provides a simple and efficient way of disseminating results of structural chemistry investigations, allowing automatic verification of published results, and assisting the constitution of highly needed open-source structural databases.

11.
12.
A method is presented for the automatic identification of the elements present in a sample and the calculation of the corresponding concentrations from the energies and peak areas determined by a spectrum analysis computer program. A preliminary interpretation list is produced in which the possible isotopes are given for each peak in the spectrum. This list is based only on the gamma-ray energies and half-lives of the isotopes. A careful analysis of this list yields groups of identified elements at four different significance levels. The determination of the corresponding concentrations is based on the single-comparator method. The procedure is included in an automatic activation analysis system but can also be used separately.
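As a hedged illustration of the first step described above (the preliminary interpretation list), peak energies can be matched against an isotope library within an energy tolerance; the tiny library below uses common textbook gamma-ray lines and is not the paper's library.

```python
# Build a preliminary interpretation list: for each measured peak energy, list
# the isotopes whose library lines fall within a tolerance window (half-life
# filtering, significance levels and concentrations are omitted in this sketch).
GAMMA_LIBRARY = {                 # isotope: gamma-ray energies in keV (textbook values)
    "Na-24": [1368.6, 2754.0],
    "Co-60": [1173.2, 1332.5],
    "Cs-137": [661.7],
    "K-40": [1460.8],
}

def interpret(peaks_kev, tol=2.0):
    """Return {peak energy: [candidate isotopes]} for matches within tol keV."""
    return {e: [iso for iso, lines in GAMMA_LIBRARY.items()
                if any(abs(e - line) <= tol for line in lines)]
            for e in peaks_kev}

print(interpret([661.5, 1172.9, 1333.0, 1461.2]))
```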

13.
Our previous work postulated a transition concept among different isotopic mass states (i.e., isotopic species) of a molecule, and developed a hierarchical algorithm for accurately calculating their masses and abundances. A theoretical mass spectrum can be generated by convoluting a peak shape function with these discrete mass states. This approach runs into memory limitations if a level in the hierarchical structure has too many mass states. Here we present a memory-efficient divide-and-recursively-combine algorithm to do the calculation, which also improves the truncation method used in the previous hierarchical algorithm. Instead of treating all of the elements in a molecule as a whole, the new algorithm first 'strips' each element one by one. For the mass states of each element, a hierarchical structure is established and kept in memory. This process reduces the memory usage by orders of magnitude (e.g., for bovine insulin, memory can be reduced from gigabytes to kilobytes). Next, a recursive algorithm is applied to combine mass states of elements into mass states of the whole molecule. The algorithm described above has been implemented as a computer program called Isotope Calculator, which was written in C++. It is freely available under the GNU Lesser General Public License from http://www.cs.brandeis.edu/~hong/software.html or http://people.brandeis.edu/~agar. Copyright © 2010 John Wiley & Sons, Ltd.
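A much simpler, non-hierarchical version of the combine idea can be sketched as follows: fold per-element isotopic distributions into the molecular distribution one element at a time by convolution over discrete (nominal) mass states. The masses and abundances are rounded textbook values, and none of the paper's memory optimizations are reproduced.

```python
# Element-by-element convolution of isotopic distributions over nominal masses.
from collections import defaultdict

ISOTOPES = {  # element: [(nominal mass, abundance), ...] (rounded textbook values)
    "C": [(12, 0.9893), (13, 0.0107)],
    "H": [(1, 0.99988), (2, 0.00012)],
    "O": [(16, 0.9976), (17, 0.0004), (18, 0.0020)],
}

def fold(dist, element, count):
    """Fold `count` atoms of `element` into an existing {mass: abundance} map."""
    for _ in range(count):
        new = defaultdict(float)
        for m, p in dist.items():
            for mi, pi in ISOTOPES[element]:
                new[m + mi] += p * pi
        dist = dict(new)
    return dist

def isotope_pattern(formula):
    dist = {0: 1.0}
    for element, count in formula.items():   # "strip" the elements one by one
        dist = fold(dist, element, count)
    return dist

pattern = isotope_pattern({"C": 6, "H": 12, "O": 6})   # e.g. glucose
for mass in sorted(pattern)[:4]:
    print(mass, round(pattern[mass], 5))
```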

14.
Extraction, purification, and identification of lily saponins
An orthogonal experimental design was used to study five factors in the extraction of total lily saponins: temperature, ethanol concentration, solid-to-liquid ratio in the extraction system, reflux time, and number of extractions, in order to select a simple, reliable extraction process suitable for industrial production. The optimal extraction conditions were: temperature 70 °C, ethanol concentration 80%, solid-to-liquid ratio 1:6, extraction time 3 h, and three extractions. Pure lily saponins were obtained by separation on AB-8 macroporous adsorption resin followed by fractional precipitation with diethyl ether-acetone. The structure was preliminarily identified by thin-layer chromatography (TLC) and infrared spectroscopy.
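For readers unfamiliar with orthogonal-design optimization, the sketch below shows the usual range-analysis step on a toy two-level L4(2^3) array with made-up yields; the actual study used five factors at more levels, and none of its data appear here.

```python
# Range analysis of a toy orthogonal design: average the response over the runs
# at each level of each factor, then pick the level with the best mean.
import numpy as np

# standard L4(2^3) array; columns: temperature, ethanol concentration, solid:liquid ratio
design = np.array([[0, 0, 0],
                   [0, 1, 1],
                   [1, 0, 1],
                   [1, 1, 0]])
yields = np.array([2.1, 2.6, 3.0, 3.4])        # hypothetical saponin yields (%)

for f, name in enumerate(["temperature", "ethanol concentration", "solid:liquid ratio"]):
    means = [float(yields[design[:, f] == lvl].mean()) for lvl in (0, 1)]
    best = int(np.argmax(means))
    print(f"{name}: level means = {means[0]:.2f}, {means[1]:.2f}; "
          f"best level = {best}; range = {max(means) - min(means):.2f}")
```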

15.
The thicknesses of intermediate oxides at the interface between ultrathin SiO2 and Si substrates have been measured via XPS elemental quantitative analysis for some SiO2/Si(100) and SiO2/Si(111) samples with silicon oxide thicknesses of less than 2 nm. The measurements involve XPS determination of the Si relative atomic ratio, calculation of Si atomic densities for the intermediate oxide, and so on; the intermediate oxide thicknesses and the number of monolayers are then obtained by referencing the thickness data from two international comparisons for these samples. The results show that the thickness of the intermediate oxides is in the range 0.14–0.16 nm with an average value of 0.15 nm. The number of monolayers for the intermediate oxides at the interface is less than one, with an average value of 0.60. The present work makes a series of approximations that remove many parameters used in the conventional calculation method, including L and R0, giving a simpler equation that is valid when the thicknesses of the SiO2 overlayer and the intermediate oxides are very small. This, therefore, appears to be a simple and quick method to obtain approximate oxide thicknesses of modest accuracy. The present work does not in any way replace or improve on Eqns (2)-(6) cited in the text. Copyright © 2010 John Wiley & Sons, Ltd.
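For context only (this is not taken from the paper and does not reproduce its Eqns (2)-(6)): the conventional single-overlayer XPS thickness expression that uses the parameters L and R0 mentioned above is commonly written as

```latex
d_{\mathrm{ox}} \approx L \cos\theta \,\ln\!\left(1 + \frac{I_{\mathrm{ox}}/I_{\mathrm{Si}}}{R_0}\right)
```

where L is the attenuation length in the oxide, theta the photoelectron emission angle measured from the surface normal, I_ox/I_Si the measured oxide-to-substrate intensity ratio, and R0 the corresponding ratio for the pure bulk materials.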

16.
An algorithm for computing equilibrium concentrations by the “equilibrium constant” method is described. The main features of this algorithm are: (1) a damping procedure in conjunction with the Newton-Raphson technique that avoids divergence in dealing with very complicated (simultaneous presence of simple, mixed, protonated, polynuclear and hydroxypolynuclear species) and/or very large systems; (2) the use of devices to decrease core (memory) requirements and calculation time and to avoid ill-conditioned problems; and (3) the calculation of errors in free and species concentrations from the uncertainties in analytical concentrations and in formation constants. Four systems are used for testing computer programs on calculation of equilibrium concentrations.
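A minimal sketch in the spirit of the damped Newton-Raphson procedure described above, for a toy two-component system H + A forming HA (the formation constant, totals and damping rule are illustrative and not taken from the paper):

```python
# Damped Newton-Raphson solution of the mass-balance equations for a toy
# H + A <-> HA system: the step is shortened whenever a full step would drive a
# free concentration non-positive, which is a common cause of divergence.
import numpy as np

beta = 10 ** 4.76              # illustrative formation constant of HA
T = np.array([1e-3, 1e-3])     # analytical (total) concentrations of H and A

def residual(c):
    h, a = c
    ha = beta * h * a
    return np.array([h + ha - T[0], a + ha - T[1]])

def jacobian(c):
    h, a = c
    return np.array([[1.0 + beta * a, beta * h],
                     [beta * a, 1.0 + beta * h]])

c = T.copy()                                   # initial guess: free = total
for it in range(100):
    f = residual(c)
    if np.max(np.abs(f)) < 1e-12:
        break
    step = np.linalg.solve(jacobian(c), -f)
    lam = 1.0
    while np.any(c + lam * step <= 0.0):       # damping: keep concentrations positive
        lam *= 0.5
    c += lam * step

print("free [H], [A] =", c, "after", it, "iterations")
```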

17.
Around 100 papers published from 2003 to the present are reviewed concerning analytical methods for the direct light measurement of a solid phase in which a target colored or fluorescent analyte is concentrated. Recent attention has been paid to the development of flow injection-solid phase spectrometry as a simple and inexpensive tool for routine analysis of organic compounds or pharmaceuticals. Due to improvements in flow injection analysis, such as sequential injection and lab-on-a-valve, it is possible not only to reduce reagent consumption but also to devise fully automatic and miniaturized systems with minimal maintenance needs. This approach thus has the potential to become one of the green analytical methods. Flow injection-solid phase spectrometry is expected to be applied to the speciation of trace chemical components (e.g., the specific determination of trace metal ions in their different oxidation states) in real environmental samples.

18.
19.
Although the structure of particulate filled polymers is usually thought to be very simple, often structure related phenomena determine their properties. Segregation occurs only when long flow paths and large particles are used in production. The occurrence and extent of aggregation depend on the relative magnitude of attractive and separating forces, which prevail during the homogenization of the composite; the balance of adhesive and shear forces determines structure. Fillers of small particle size always aggregate, usually leading to decreased strength and especially low impact resistance. Anisotropic particles (talc, mica, short fibers) are orientated during processing. ESR is a relatively simple technique for the estimation of orientation and orientation distribution, which are determined by processing conditions, i.e. flow pattern, shear conditions, mold filling rates, cooling conditions, etc. The orientation of the particles strongly affects composite stiffness and strength. In practice, several factors often simultaneously influence the properties of products prepared from particulate filled polymers. Separation of the effects of the influencing factors is difficult, although such knowledge would help to control composite properties. The structure and properties of injection and compression moulded PP composites containing CaCO3 or talc differ considerably from each other. The aggregation of CaCO3, the nucleating effect and the orientation of talc affect product properties. The latter are also influenced by the skin-core structure developing during injection molding as well as by the orientation of the polymer. An example is discussed in this paper, which facilitates the identification of the effect of these factors with the help of a simple model and indicates a way in which product properties can be controlled.

20.
Mixtures of closely related mono- and disaccharides may be efficiently separated by high-performance anion-exchange chromatography (HPAEC) only when relatively dilute alkaline eluents are employed (i.e., < 20 mM NaOH). The main drawbacks of these eluent solutions are (i) column regeneration between runs, (ii) poor reproducibility of the retention times, and (iii) the need for post-column base addition for enhancing sensitivity. Here, we describe some examples of isocratic separations of carbohydrates by HPAEC coupled with pulsed amperometric detection (PAD) accomplished by carbonate-free alkaline eluents (i.e., 5-20 mM NaOH) obtained upon addition of Ba(OAc)2 (1-2 mM). These separations include aldohexoses (i.e., galactose, glucose, and mannose), aminohexoses (i.e., glucosamine and galactosamine) and their N-acylated derivatives (i.e., N-acetylglucosamine and N-acetylgalactosamine) along with some isomeric disaccharides (i.e., lactose, lactulose and epilactose). The separation of closely related isomers of trehalose, alpha,alpha, alpha,beta, and beta,beta, is also presented. It is recommended to add Ba(OAc)2 to NaOH solutions several hours before using the alkaline eluent (i.e., 12-24 h) to ensure complete barium carbonate precipitation in the eluent reservoir. Adopting such a simple strategy can be especially useful for performing carbohydrate separations under isocratic conditions in which no regeneration or re-equilibration of the column between runs is required. Excellent repeatability of retention data throughout a three-day working session was observed, with relative standard deviations ranging from 2.0 to 3.7%, and from 0.5 to 2.0%, as day-to-day and within-day values, respectively. In addition, there was no need for post-column addition of strong bases to the eluent, and successful applications of the present approach confirmed its validity and practicability with detection limits of simple carbohydrates in the picomole range.
