Similar Literature: 20 results found
1.
Isothermal titration calorimetry (ITC) is a traditional and powerful method for studying the linkage of ligand binding to proton uptake or release. The theoretical framework has been developed for more than two decades and numerous applications have appeared. In the current work, we explored strategic aspects of experimental design. To this end, we simulated families of ITC data sets that embed different strategies with regard to the number of experiments, range of experimental pH, buffer ionization enthalpy, and temperature. We then re-analyzed the families of data sets in the context of global analysis, employing a proton linkage binding model implemented in the global data analysis platform SEDPHAT, and examined the information content of all data sets by a detailed statistical error analysis of the parameter estimates. In particular, we studied the impact of different assumptions about the knowledge of the exact concentrations of the components, which in practice presents an experimental limitation for many systems. For example, the uncertainty in concentration may reflect imperfectly known extinction coefficients and stock concentrations or may account for different extents of partial inactivation when working with proteins at different pH values. Our results show that the global analysis can yield reliable estimates of the thermodynamic parameters for intrinsic binding and protonation, and that in the context of the global analysis the exact molecular component concentrations may not be required. Additionally, a comparison of data from different experimental strategies illustrates the benefit of conducting experiments at a range of temperatures.
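As a reminder, the quantity that links experiments carried out in different buffers is the buffer dependence of the observed binding enthalpy; a minimal statement of the standard proton-linkage relation usually assumed in this kind of analysis (generic symbols, not taken from the abstract above) is

$$\Delta H_{\mathrm{obs}} = \Delta H_{\mathrm{bind}} + \Delta n_{\mathrm{H^+}}\,\Delta H_{\mathrm{ion,buffer}}$$

where $\Delta n_{\mathrm{H^+}}$ is the number of protons taken up from (or, if negative, released to) the buffer upon binding. Repeating the titration in buffers of different ionization enthalpy and at different pH is what allows the intrinsic binding enthalpy to be separated from the protonation contribution in a global fit.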

2.
Dipeptidyl peptidase III (DPP3) is a ubiquitously expressed Zn-dependent protease, which plays an important role in regulating endogenous peptide hormones, such as enkephalins or angiotensins. Previous biophysical studies showed that substrate binding is driven by a large entropic contribution due to the release of water molecules from the closing binding cleft. Here, the design, synthesis, and biophysical characterization of peptidomimetic inhibitors are reported, using for the first time a hydroxyethylene transition-state mimetic for a metalloprotease. Efficient routes for the synthesis of both stereoisomers of the pseudopeptide core were developed, which allowed the synthesis of peptidomimetic inhibitors mimicking the VVYPW motif of tynorphin. The best compounds inhibit DPP3 in the low μM range. Biophysical characterization by means of ITC measurements and X-ray crystallography confirms the unusual entropy-driven mode of binding. Stability assays demonstrated the desired stability of these inhibitors, which efficiently inhibited DPP3 in mouse brain homogenate.

3.
This paper uses the Monte Carlo method to simulate the chain-growth process of a binary copolymerization and presents a probabilistic treatment for estimating the reactivity ratios. From a statistical point of view, this is a Bayesian procedure in which the posterior probability density is computed at every grid point in parameter space; a smoothing function is applied to the irregular posterior probability density surface (PPDS), and the copolymerization reactivity ratios (r_1, r_2) are then evaluated within the 95% confidence region. When the method is applied to the styrene-butyl acrylate copolymerization, the calculated reactivity ratios (r_St, r_BA) agree well with values reported in the literature.
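A minimal sketch of the grid-based posterior evaluation described above, assuming the terminal (Mayo-Lewis) copolymerization model as the likelihood and illustrative composition data with an assumed Gaussian measurement error; in the paper the likelihood comes from Monte Carlo chain-growth simulation rather than the analytical composition equation used here, and none of the numbers below are from the study:

```python
import numpy as np

def copolymer_F1(f1, r1, r2):
    """Terminal-model (Mayo-Lewis) instantaneous copolymer composition."""
    f2 = 1.0 - f1
    num = r1 * f1**2 + f1 * f2
    den = r1 * f1**2 + 2 * f1 * f2 + r2 * f2**2
    return num / den

# Hypothetical monomer feed compositions and measured copolymer compositions
f1_obs = np.array([0.2, 0.35, 0.5, 0.65, 0.8])
F1_obs = np.array([0.28, 0.42, 0.55, 0.66, 0.79])   # illustrative values only
sigma = 0.02                                         # assumed measurement std. dev.

# Evaluate the (unnormalised) posterior on a grid of (r1, r2), flat prior
r1_grid = np.linspace(0.05, 2.0, 200)
r2_grid = np.linspace(0.05, 2.0, 200)
R1, R2 = np.meshgrid(r1_grid, r2_grid, indexing="ij")
post = np.zeros_like(R1)
for i in range(R1.shape[0]):
    for j in range(R1.shape[1]):
        resid = F1_obs - copolymer_F1(f1_obs, R1[i, j], R2[i, j])
        post[i, j] = np.exp(-0.5 * np.sum((resid / sigma) ** 2))
post /= post.sum()                                   # normalise over the grid

# 95% highest-posterior-density region: smallest set of cells holding 95% mass
order = np.argsort(post.ravel())[::-1]
cum = np.cumsum(post.ravel()[order])
in_region = np.zeros(post.size, dtype=bool)
in_region[order[cum <= 0.95]] = True
print("grid cells inside the 95% region:", in_region.sum())
```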

4.
The international standard ISO 11843 specifies basic methods to design experiments for estimation of critical values referring to the capability of detection. The detection capability depends on the experimental design, the calibration model used, and the errors of the measurement process. This study reports how the specification of the calibration points within the calibration range can be used as a priori information for evaluating calibration uncertainty without any consideration of the response variables of the calibration. As a result of the investigation of the experimental designs, calibration points within the calibration range can be omitted without significant change in the calibration uncertainty. The approach is demonstrated with a practical example, the determination of arsenic in surface water samples taken from a river in Germany.
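A minimal sketch of the design-dependent part of straight-line calibration uncertainty, i.e. the leverage factor sqrt(1/n + (x0 - x_bar)^2 / Sxx), which depends only on where the calibration points are placed and not on the measured responses; the two designs and all numbers are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def design_factor(x_cal, x0):
    """Design-dependent leverage factor of the prediction uncertainty from an
    ordinary least-squares straight-line calibration, evaluated at x0."""
    x_cal = np.asarray(x_cal, dtype=float)
    n = x_cal.size
    x_bar = x_cal.mean()
    sxx = np.sum((x_cal - x_bar) ** 2)
    return np.sqrt(1.0 / n + (x0 - x_bar) ** 2 / sxx)

full_design = np.arange(1, 11)                 # ten equally spaced calibration points
reduced = np.array([1, 2, 3, 8, 9, 10])        # same range, interior points omitted
for x0 in (2.0, 5.5, 9.0):
    print(x0, design_factor(full_design, x0), design_factor(reduced, x0))
```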

5.
The present study summarizes the measurement uncertainty estimations carried out at the Nestlé Research Center since 2002. These estimations cover a wide range of analyses of commercial and regulatory interest. In a first part, this study shows that method validation data (repeatability, trueness, and intermediate reproducibility) can be used to provide a good estimation of measurement uncertainty. In a second part, measurement uncertainty is compared to collaborative trial data. These data can be used for measurement uncertainty estimation provided that the in-house validation performance is comparable to the method performance obtained in the collaborative trial. Based on these two main observations, the aim of this study is to provide a simple estimate of measurement uncertainty from validation data.

6.
An improved nucleic acid parameter set for the GROMOS force field
Over the past decades, the GROMOS force field for biomolecular simulation has primarily been developed for performing molecular dynamics (MD) simulations of polypeptides and, to a lesser extent, sugars. When applied to DNA, the 43A1 and 45A3 parameter sets of the years 1996 and 2001 produced rather flexible double-helical structures, in which the Watson-Crick hydrogen-bonding content was more limited than expected. To improve on the currently available parameter sets, the nucleotide backbone torsional-angle parameters and the charge distribution of the nucleotide bases are reconsidered based on quantum-chemical data. The new 45A4 parameter set resulting from this refinement appears to perform well in terms of reproducing solution NMR data and canonical hydrogen bonding. The deviation between simulated and experimental observables is now of the same order of magnitude as the uncertainty in the experimental values themselves.

7.
The UV–Vis spectra of the f-elements, especially the lanthanides, are known for their rather sharp absorption bands showing systematic and reproducible changes upon coordination with suitable ligands. These absorption bands have been used as indicators allowing interpretation of the processes in solution. Modern absorption spectrometers in combination with digital recording of the spectra allow precise collection of a large number of absorption bands for many samples. The number of absorbance values thus collected easily amounts to several thousands of individual data points. Chemometric techniques, especially factor analysis, are occasionally used to extract information from these data sets on the basis of the Bouguer–Lambert–Beer law. Recent studies indicate the sensitivity of parameter values obtained from uncritical use of chemometric techniques to various influence and nuisance factors. On the basis of selected examples, the effects of parameter correlations, residual correlations, and measurement uncertainty introduced by volume operations are demonstrated. Using the ISO Guide to the Expression of Uncertainty in Measurement as a convention for assessing measurement uncertainty, formation constants lg K derived from UV–Vis spectra of f-elements should be associated with a measurement uncertainty u of at least u = 0.15 (k = 1).

8.
Using molecular mechanics force field partial atomic charges, we show the nonuniqueness of the parametrization of continuum electrostatics models with respect to solute atomic radii and interior dielectric constant based on hydration (vacuum-to-water transfer) free energy data available for small molecules. Moreover, parameter sets that are optimal and equivalent for hydration free energy calculations lead to large variations of calculated absolute and relative electrostatic binding free energies. Hence, parametrization of solvation effects based on hydration data, although a necessary condition, is not sufficient to guarantee its transferability to the calculation of binding free energies in solution.

9.
Summary: This work investigates a fault diagnosis problem in the copolymerization process of styrene and methyl methacrylate (STY/MMA). Two topics are discussed in this paper: system observability and optimal experimental design (OED) to reduce fault misclassification. Lack of observability has been found to be one of the major causes of misclassification in fault diagnosis, and it cannot be remedied by any means other than including the measurements necessary for observability. In this work, the system observability has been studied through simulation analysis. Then, two new experimental design methods are proposed to train the projection pursuit regression (PPR) algorithm for fault diagnosis purposes. The new design methods, referred to as Gaussian probability design and Fuzzy boundary design, are compared to a conventional factorial design to evaluate their performance for the problem under study. The Gaussian probability design is based on calculating the probability that an experimental data point near a class boundary belongs to a specific class. The Fuzzy boundary design is based on a bootstrapping technique used in part for the learning process in developing neural network models; it addresses the insufficiency of training data based on the identification of class boundaries by a group of models, such as PPR models. Both the Gaussian probability design and the Fuzzy boundary design automatically locate sparse regions of the training data and provide guidelines for including pairs of training data on the two sides of a class boundary in the areas where the data density is lowest. The proposed design methods outperform a conventional factorial design by reducing fault misclassification more effectively with the same amount of additional training data.

[Figure: Testing data in the process measurement space of temperature vs. conversion.]


10.
Quantitative determinations of many radioactive analytes in environmental samples are based on a process in which several independent measurements of different properties are taken. The final results that are calculated from these data have to be evaluated for accuracy and precision. The estimate of the standard deviation, s, also called the combined standard uncertainty (CSU) associated with the result of this combined measurement, can be used to evaluate the precision of the result. The CSU can be calculated by applying the law of propagation of uncertainty, which is based on the Taylor series expansion of the equation used to calculate the analytical result. The estimate of s can also be obtained from a Monte Carlo simulation. The data used in this simulation include the values resulting from the individual measurements, the estimate of the variance of each value, including the type of distribution, and the equation used to calculate the analytical result. A comparison is made between these two methods of estimating the uncertainty of the calculated result.
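A minimal sketch of the two approaches being compared, using a hypothetical counting-type measurement equation A = (Rg - Rb) / (eff * mass); the measurement model, the input values, and their uncertainties are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def activity(rg, rb, eff, mass):
    # Hypothetical result equation: gross minus background count rate,
    # divided by detection efficiency and sample mass.
    return (rg - rb) / (eff * mass)

# Best estimates and standard uncertainties of the input quantities
x = dict(rg=12.40, rb=0.65, eff=0.32, mass=0.500)
u = dict(rg=0.15,  rb=0.04, eff=0.01, mass=0.002)

# 1) Law of propagation of uncertainty (first-order Taylor series),
#    with numerically evaluated sensitivity coefficients
y0 = activity(**x)
u2 = 0.0
for k in x:
    h = 1e-6 * abs(x[k])
    xp = dict(x)
    xp[k] += h
    ci = (activity(**xp) - y0) / h          # sensitivity coefficient dA/dx_k
    u2 += (ci * u[k]) ** 2
print("Taylor:      A =", y0, "u(A) =", np.sqrt(u2))

# 2) Monte Carlo: draw the inputs from their assumed (normal) distributions
n = 200_000
samples = activity(rg=rng.normal(x["rg"], u["rg"], n),
                   rb=rng.normal(x["rb"], u["rb"], n),
                   eff=rng.normal(x["eff"], u["eff"], n),
                   mass=rng.normal(x["mass"], u["mass"], n))
print("Monte Carlo: A =", samples.mean(), "u(A) =", samples.std(ddof=1))
```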

11.
This paper deals with some important (but often neglected) details of the uncertainty of retention measurement in thin-layer chromatography, the propagation of that uncertainty when simple and more complex quantities are computed from the retention data, and, finally, the influence of the retention uncertainty on the regression estimates used in extrapolation and lipophilicity estimation. The theoretical considerations are tested on data from a previous study. It can be concluded that when TLC spots are broad and the retention uncertainty exceeds about 0.02 in RF, this uncertainty should be taken into account in further computations.
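As one concrete instance of such propagation (the RM lipophilicity descriptor is a common choice in TLC work, though the abstract does not name the specific quantities studied), the first-order rule gives

$$R_M = \log_{10}\!\left(\frac{1}{R_F}-1\right), \qquad u(R_M) \approx \left|\frac{dR_M}{dR_F}\right|\,u(R_F) = \frac{u(R_F)}{\ln 10\; R_F\,(1-R_F)}$$

so an RF uncertainty of 0.02 at RF = 0.5 already translates into about 0.035 in RM, and the amplification grows rapidly toward either end of the RF scale.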

12.
Estimation of measurement uncertainty has become a more regularly performed part of the whole analytical process. However, there is still much on-going discussion in the scientific community about ways of building up the uncertainty budget. This study describes two approaches for estimation of measurement uncertainty in organic analysis: one which can be used for single sets of measurements and the other based on validation studies. In both cases the main contributions to the uncertainty are presented and discussed for the analysis of PCBs in mussel tissue, but the approaches can be extended to other organic pollutants in environmental/food samples. The main contributions to the uncertainty budget arise from calibration, sample preparation, and GC–MS measurements. A comparison of the relevant sources and their contributions to the expanded uncertainty is presented.

13.
14.
Results from the measurement of the heat of reaction of hydrothermal carbonization by power-compensated differential scanning calorimetry exhibited a comparatively high experimental standard deviation of around 10–20%. The reasons for this standard deviation have been investigated and are presented in this article. The zeroline deviation and its repeatability showed a decisive influence on the measurements due to the length of the thermal effects (several hours) and the experimental setup (high thermal capacity due to pressure capsules and hydrothermal conditions, type of calorimeter). It was quantified by reference runs and compensated mathematically. In addition, conceptual issues arising from the propagation of uncertainty through sum operations are derived. There is an optimum peak length after which the uncertainty rises due to this uncertainty propagation; this optimum lies at a signal level within the noise level. However, the contribution of this uncertainty showed little significance compared to the zeroline deviation and thus could be neglected. Results from hydrothermal carbonization of glucose show a mean value of 1060 J/g (dry, ash-free basis) with a standard deviation of 14% for the presented experimental setup. These values include compensation of systematic errors, including the zeroline deviation, baseline correction, leakage, and transient effects, which are discussed in detail.

15.
Machine learning methods have always been promising in the science and engineering fields, and the use of these methods in chemistry and drug design has advanced especially since the 1990s. In this study, molecular electrostatic potential (MEP) surfaces of phencyclidine-like (PCP-like) compounds are modeled and visualized in order to extract features that are useful in predicting binding affinities. In the modeling, the Cartesian coordinates of MEP surface points are mapped onto a spherical self-organizing map (SSOM). The resulting maps are visualized using electrostatic potential (ESP) values; these values also provide features for a prediction system. Support vector machines and partial least-squares regression are used for predicting the binding affinities of the compounds.

16.
In this work, a Combined Standard Uncertainty (CSU) calculation procedure is given that can be applied to spectrophotometric measurements. For the assessment of the computations, different approaches are discussed, such as the contributions of reproducibility, repeatability, total bias, the calibration curve, and the type of measurand to the Combined Standard Uncertainty. Results of inter-laboratory measurements confirmed the assumptions. To minimize error propagation, a controlled experimental procedure called "errors propagation break-up" (ERBs) was applied by this laboratory. The uncertainty of the sample concentration derived from a reference curve dominates the Combined Standard Uncertainty; the contribution of the method and laboratory bias (total bias) to the CSU is insignificant under controlled measurement conditions. This work develops a simple methodology that can be used to evaluate uncertainty and control errors in routine methods used by both academic researchers and the industrial sector.

17.
del Río FJ, Riu J, Rius FX. The Analyst, 2001, 126(7): 1113-1117.
We developed a robust regression technique that is a generalization of the least median of squares (LMS) technique to the case in which the errors in both the predictor and the response variables are taken into account. This simple generalization is limited in the sense that the resulting straight line is determined by only two points from the initial data set, so a simulation step using the Monte Carlo method is added to generate the best robust regression line. We call this new technique 'bivariate least median of squares' (BLMS), following the notation of the LMS method. We checked the robustness of the new regression technique by calculating its breakdown point, which was 50%; this confirms the robustness of the BLMS regression line. In order to show its applicability to the chemical field, we tested it on simulated data sets and real data sets with outliers. The BLMS robust regression line was not affected by many types of outlying points in the data sets.
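A minimal sketch of the pair-sampling search that this family of estimators is built on, under the simplification that residuals are measured only in y; the published BLMS method also accounts for the uncertainty in x, and that weighting is omitted here, with purely synthetic data:

```python
import numpy as np

def lms_line(x, y, n_trials=5000, rng=None):
    """Least-median-of-squares straight line found by Monte Carlo sampling of
    point pairs: each pair defines a candidate line, and the line whose squared
    residuals have the smallest median is kept."""
    rng = rng or np.random.default_rng()
    x, y = np.asarray(x, float), np.asarray(y, float)
    best = (np.inf, 0.0, 0.0)
    for _ in range(n_trials):
        i, j = rng.choice(len(x), size=2, replace=False)
        if x[i] == x[j]:
            continue
        b = (y[j] - y[i]) / (x[j] - x[i])        # slope from the sampled pair
        a = y[i] - b * x[i]                      # intercept
        med = np.median((y - (a + b * x)) ** 2)  # LMS criterion
        if med < best[0]:
            best = (med, a, b)
    return best[1], best[2]

# Example: a clean linear trend with two gross outliers
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 20)
y = 1.0 + 2.0 * x + rng.normal(0, 0.1, x.size)
y[3], y[15] = 40.0, -10.0                        # outliers
print(lms_line(x, y, rng=rng))                   # close to intercept 1, slope 2
```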

18.
One of the objectives of the French Alternative Energies and Atomic Energy Commission at the Marcoule Centre is to accurately quantify the composition of spent nuclear fuel, i.e. to determine the concentration of each isotope with a suitable measurement uncertainty. These analysis results are essential for the validation of calculation codes used to simulate fuel behaviour in nuclear reactors and for nuclear material accountancy. The experimental steps are, first, reception of a piece of spent fuel rod at the laboratory of dissolution studies and, second, dissolution of a sample of that rod in a hot cell; several steps are necessary to obtain complete dissolution. Taking these process steps into account in the evaluation of measurement uncertainties, and not only the analysis steps, is new and is described in this paper. The uncertainty estimation incorporating the process has been developed following the approach proposed by the Guide to the Expression of Uncertainty in Measurement (GUM). The mathematical model of the measurement was established by examining the dissolution process step by step, and the law of propagation of uncertainty was applied to this model. A point-by-point examination of each step of the process permitted identification of all the uncertainty sources considered in this propagation for each input variable. The measurement process presented thus involves both the process and the analysis. The contents of this document show the importance of taking the process into account in order to give a more reliable uncertainty assessment for a concentration result or for the ratio of two isotopes in spent fuel.
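For reference, the law of propagation of uncertainty applied to the measurement model is the standard GUM expression, written here with generic symbols for a model $y = f(x_1, \ldots, x_N)$ (the actual model in the paper is specific to the dissolution process):

$$u_c^2(y) \;=\; \sum_{i=1}^{N}\left(\frac{\partial f}{\partial x_i}\right)^{2} u^{2}(x_i) \;+\; 2\sum_{i=1}^{N-1}\sum_{j=i+1}^{N}\frac{\partial f}{\partial x_i}\,\frac{\partial f}{\partial x_j}\,u(x_i,x_j)$$

where $u(x_i, x_j)$ are the covariances between correlated input quantities.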

19.
Densities, vapor pressures, and the critical point were measured for dimethyl ether, thus, filling several gaps in the thermodynamic data for this compound. Densities were measured with a computer-controlled high temperature, high-pressure vibrating-tube densimeter system in the sub- and supercritical states. The densities were measured at temperatures from 273 to 523 K and pressures up to 40 MPa (417 data points), for which densities between 62 and 745 kg/m3 were covered. The uncertainty (where the uncertainties can be considered as estimates of a combined expanded uncertainty with a coverage factor of 2) in density measurement was estimated to be no greater than 0.1% in the liquid and compressed supercritical states. Near the critical temperature and pressure, the uncertainty increases to 1%. Using a variable volume apparatus with a sapphire tube, vapor pressures and critical data were determined. Vapor pressures were measured between 264 and 194 kPa up to near the critical point with an uncertainty of 0.1 kPa. The critical point was determined visually with an uncertainty of 1% for the critical volume, 0.1 K for the critical temperature, and 5 kPa for the critical pressure. The new vapor pressures and compressed liquid densities were correlated with the simple TRIDEN model. The new data along with the available literature data were used to develop a first fundamental Helmholtz energy equation of state for dimethyl ether, valid from 131.65 to 525 K and for pressures up to 40 MPa. The uncertainty in the equation of state for density ranges from 0.1% in the liquid to 1% near the critical point. The uncertainty in calculated heat capacities is 2%, and the uncertainty in vapor pressure is 0.25% at temperatures above 200 K. Although the equation presented here is an interim equation, it represents the best currently available.

20.
Chemical oxygen demand (COD) is one of the most relevant chemical parameters for the management of wastewater treatment facilities, including the control of effluent quality. The adequacy of decisions based on COD values relies on the quality of the measurements. Cost-effective management of the minor sources of uncertainty can be applied to the analytical procedure without affecting measurement quality. This work presents a detailed assessment of the determination of COD values in wastewaters according to the ISO 6060:1989 standard, which can support reduction of both measurement uncertainty and the cost of analysis. This assessment includes the definition of the measurement traceability chain and the validation of the measurement procedure supported by sound and objective criteria. Detailed models of the measurement performance, including uncertainty, developed from the Differential Approach, were successfully validated by proficiency tests. The linearity assumption for the measurement function underlying the uncertainty propagation law was tested by comparison with the numerical Kragten method. The gathered information supported the definition of strategies for reducing measurement uncertainty or cost. The developed models are available as electronic supplementary material, in an MS-Excel file, to be updated with the user's own data.
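A minimal sketch of the Kragten numerical scheme mentioned above, applied to an illustrative COD back-titration formula; the measurement function, input values, and uncertainties below are assumptions for illustration only and are not taken from the paper or quoted from the ISO 6060 text:

```python
import numpy as np

def cod(c_fas, v_blank, v_sample, v_test):
    """Illustrative COD measurement function (mg O2 / L) for a dichromate /
    ferrous ammonium sulphate back-titration; the exact form used in the paper
    follows ISO 6060 and may differ in detail."""
    return 8000.0 * c_fas * (v_blank - v_sample) / v_test

def kragten(f, x, u):
    """Kragten's scheme: shift each input by its standard uncertainty, take the
    change in the result as the contribution (c_i * u_i), and combine the
    contributions in quadrature. Equivalent to the GUM first-order law of
    propagation when the measurement function is close to linear."""
    y0 = f(**x)
    contrib = {}
    for k in x:
        shifted = dict(x)
        shifted[k] += u[k]
        contrib[k] = f(**shifted) - y0
    uc = np.sqrt(sum(d ** 2 for d in contrib.values()))
    return y0, uc, contrib

# Illustrative input estimates and standard uncertainties (mol/L, mL, mL, mL)
x = dict(c_fas=0.120, v_blank=10.50, v_sample=6.20, v_test=10.0)
u = dict(c_fas=0.0006, v_blank=0.03, v_sample=0.03, v_test=0.02)

y, uc, contrib = kragten(cod, x, u)
print(f"COD = {y:.1f} mg/L, u_c = {uc:.1f} mg/L")
print({k: round(v, 2) for k, v in contrib.items()})  # individual contributions
```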
