Similar Articles
1.
A Computational Model for the Treatment of Chemical Reactions
A method is introduced for generalizing classes of analogous reactions into synthetic-reaction knowledge and for implementing retrosynthetic analysis on a computer. Retrosynthetic analysis is the most critical step in synthesis design. This work adopts a synthesis-design method based on the search for strategic bonds; its logic makes it well suited to computer implementation. To realize this method, we propose, for the first time, a computational model capable of describing synthetic reactions pertinently: a classification model of reaction knowledge. The model is defined by three rules: rule A, the reaction type; rule B, the external conditions under which the reaction occurs; and rule C, the situations in which the reaction should not be used. This computational model can distill the most important and fundamental information from massive amounts of reaction data and convert it into knowledge a computer can process. It also encodes the scope of applicability of each reaction, which improves the extrapolation capability of the analysis process.

2.
Thurston TJ, Brereton RG. The Analyst, 2002, 127(5): 659-668
Several methods are described for determining rate constants for second order reactions of the form U + V --> W using chemometrics and hard modelling to analyse UV absorption spectroscopic data, where all species absorb with comparable concentrations and extinctions. An interesting feature of this type of reaction is that the number of steps in the reaction is less than the number of absorbing species, resulting in a rank-deficient response matrix. This can cause problems when using some of the methods described in the literature. The approaches discussed in the paper depend, in part, on what knowledge is available about the system, including the spectra of the reactants and product, the initial concentrations and the exact kinetics. Sometimes some of this information may not be available or may be hard to estimate. Five groups of methods are discussed, namely use of multiple linear regression to obtain concentration profiles and fit kinetics information, rank augmentation using multiple batch runs, difference spectra based approaches, mixed spectral approaches which treat the reaction as two independent pseudospecies, and principal components regression. Two datasets are simulated, one where the spectra are quite different and the other where the spectrum of one reactant and the product share a high degree of overlap. Three sources of error are considered, namely sampling error, instrumental noise and errors in initial concentrations. The relative merits of each method are discussed.
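The hard-modelling idea in this abstract — fitting a rate constant so that simulated concentration profiles reproduce a measured absorbance trace — can be sketched in a few lines. The sketch below is illustrative only: the rate constant, initial concentrations and extinction coefficients are invented, the integration is a simple Euler step, and the fit is a crude grid search rather than the authors' actual procedure.

```python
def simulate(k, u0, v0, dt=0.01, n=500):
    """Euler integration of dU/dt = dV/dt = -k*U*V for U + V -> W."""
    u, v, w = u0, v0, 0.0
    states = []
    for _ in range(n):
        states.append((u, v, w))
        r = k * u * v * dt
        u, v, w = u - r, v - r, w + r
    return states

# Synthetic "measured" absorbance A(t) = eU*U + eV*V + eW*W: all three
# species absorb, as in the rank-deficient case discussed in the abstract.
eps = (1.0, 0.6, 1.4)   # invented extinction coefficients
true_k = 2.0            # invented rate constant
obs = [sum(e * c for e, c in zip(eps, s)) for s in simulate(true_k, 1.0, 0.8)]

def sse(k):
    """Sum of squared residuals between modelled and 'measured' absorbance."""
    sim = [sum(e * c for e, c in zip(eps, s)) for s in simulate(k, 1.0, 0.8)]
    return sum((a - b) ** 2 for a, b in zip(obs, sim))

# Crude grid-search fit of the rate constant (hard modelling)
best_k = min((round(0.1 * i, 1) for i in range(1, 51)), key=sse)
print(best_k)
```

On this noise-free data the grid search recovers the generating rate constant; with the error sources the paper considers (sampling error, instrumental noise, uncertain initial concentrations), a continuous optimizer and a proper error analysis would replace the grid.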

3.
Recent developments in hyperspectral imaging equipment have made it possible to use this analytical technique for fast scanning of sample surfaces. The technique has turned out to be especially useful in pharmacy, where information about the distribution of the components on the surface of a tablet can be obtained. One particular application of hyperspectral chemical imaging is the search for singularities inside pharmaceutical tablets, e.g. coating defects. Nevertheless, one problem has to be faced: how to analyze a sample without any previous knowledge about it, or with only minimal information about the tablet. In this work a new methodology, based on correlation coefficients, is introduced to extract valuable information from a hyperspectral image (detection of defects, localized contaminants, etc.) without any previous knowledge. The methodology combines principal component analysis (PCA), the correlation coefficient between one specific pixel of the image and the rest of the image, and a new enhanced-contrast function to obtain more selective chemical and spatial information about the image. To illustrate the applicability of the proposed methodology, real ibuprofen tablets were studied. The proposed methodology is presented as a control technique to detect batch variability, defects in final tablets and localized contaminants, making it a potential supplementary tool for quality control. In addition, the usefulness of the proposed methodology is not exclusive to NIR-CI devices, but extends to any hyperspectral or multivariate imaging system.
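The core of the correlation-coefficient step described above can be illustrated with a toy example. Everything here is synthetic: a tiny 2x3 "image" whose pixels hold four-point spectra, a reference pixel assumed to represent good product, and an arbitrary 0.95 threshold.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length spectra."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

# Toy 2x3 "image": each pixel holds a 4-point spectrum; one pixel deviates.
image = [
    [[1.0, 2.0, 3.0, 2.0], [1.1, 2.1, 3.1, 2.1], [0.9, 1.9, 2.9, 1.9]],
    [[1.0, 2.0, 3.0, 2.1], [3.0, 1.0, 0.5, 4.0], [1.0, 2.2, 3.2, 2.0]],
]
ref = image[0][0]  # pixel assumed to represent defect-free product

# Correlation map: low correlation with the reference flags an anomaly
corr_map = [[pearson(ref, px) for px in row] for row in image]
flags = [[c < 0.95 for c in row] for row in corr_map]
print(flags)
```

The real methodology adds PCA and an enhanced-contrast function on top of such a map; this sketch only shows why a pixel-to-pixel correlation already separates a deviating spectrum from the bulk.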

4.
Emulsion and suspension polymerizations are important industrial processes for polymer production. The end-use properties of polymers depend strongly on how the polymerization reactions proceed in time (i.e. batch or semicontinuous operation, reagent feed rates, etc.). In other words, these reactions are process dependent, which makes successful process control a key point in ensuring high-quality products. Several process control strategies require on-line monitoring of reaction performance. Due to the multiphase nature of emulsion and suspension processes, there is a lack of sensors capable of successful on-line monitoring. Near infrared and Raman spectroscopies have been pointed out as useful approaches for monitoring emulsion and suspension polymerizations, and several applications have been described. In this context, the chemometric approach of relating near infrared and Raman spectra to polymer properties is widely used and has proven useful. Nevertheless, the multiphase nature of emulsion and suspension polymerizations also represents a challenge for chemometric approaches based on multivariate calibration models and demands the development of new methods. In this work, a set of novel results is presented from the monitoring of 15 batch emulsion reactions, illustrating the chemometric challenge to be faced in developing new methods for the successful monitoring of processes carried out in dispersed media. To discuss these results, several chemometric approaches were reviewed. It is shown that Raman and NIR spectroscopic techniques are suitable for on-line monitoring of monomer concentration and polymer content during the polymerizations, as well as of medium heterogeneity properties such as average particle size. It is also shown that the Hotelling and Q statistics, widely used in chemometrics, may fail in monitoring these reactions, while an approach based on principal curves is able to overcome this limitation.

5.
The implementation of an expert system for the automated qualitative interpretation of energy-dispersive x-ray spectra is discussed. The first step in the interpretation process is the extraction of the relevant data from the spectrum, which is done by a preprocessor program, written in FORTRAN. The expert system itself consists of three parts. The knowledge base contains specific information on energy-dispersive x-ray fluorescence spectrometry presented in the form of IF/THEN rules. The data base contains the reduced spectral data and an array of certainty factors associated with each element; the certainty factor for an element represents the probability of its being present in the sample from which the spectrum was taken. Finally, the inference engine performs manipulation of the knowledge. For a particular state of the data base, the certainty factors for all the elements are iteratively modified until convergence is reached by using the rules from the knowledge base. During each cycle, the inference engine selects one rule from the knowledge base and executes it. Rules are selected on the basis of the chemical elements contained in their IF part and according to their previously assigned focus levels. Execution of the THEN part of the selected rule modifies the certainty factors of a number of elements. At the end of the interpretation session, the system lists the elements which have a high probability of being present in the sample. Optionally, the user can be provided with explanations of the reasoning steps taken during the interpretation. Application of the expert system to a particular spectrum shows that it is useful for the reliable interpretation of spectral data obtained from electron microprobe analysis of industrial aerosol particles.

6.
In chemical and biochemical processes, steady-state models are widely used for process assessment, control and optimisation. In these models, parameter adjustment requires data collected under nearly steady-state conditions. Several approaches have been developed for steady-state identification (SSID) in continuous processes, but no attempt has been made to adapt them to the singularities of batch processes. The main aim of this paper is to propose an automated method based on batch-wise unfolding of the three-way batch process data followed by a principal component analysis (Unfold-PCA), in combination with the methodology of Brown and Rhinehart for SSID. A second goal of this paper is to illustrate how, by using Unfold-PCA, process understanding can be gained from the analysis of batch-to-batch start-up and transition data. The potential of the proposed methodology is illustrated using historical data from a laboratory-scale sequencing batch reactor (SBR) operated for enhanced biological phosphorus removal (EBPR). The results demonstrate that the proposed approach can be efficiently used to detect when the batches reach the steady-state condition, to interpret the overall batch-to-batch process evolution, and also to isolate the causes of changes between batches using contribution plots. Copyright © 2007 John Wiley & Sons, Ltd.
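A minimal sketch of batch-wise unfolding followed by PCA, the core of the Unfold-PCA step described above. The three-way data, the noise level and the use of power iteration for the first principal component are all illustrative assumptions, not the paper's implementation.

```python
import math
import random

random.seed(0)
I, J, K = 6, 3, 10  # batches x variables x time points (invented sizes)

# Synthetic three-way batch data: a batch-specific offset, a variable
# effect, a slow drift over time, and a little noise
data = [[[0.5 * b + j + 0.01 * t + random.gauss(0, 0.05)
          for t in range(K)] for j in range(J)] for b in range(I)]

# Batch-wise unfolding: each batch becomes one row of length J*K
X = [[data[b][j][t] for j in range(J) for t in range(K)] for b in range(I)]

# Mean-center each column, the usual preprocessing before PCA
means = [sum(row[c] for row in X) / I for c in range(J * K)]
Xc = [[x - m for x, m in zip(row, means)] for row in X]

# First principal component via power iteration on X^T X
v = [1.0] * (J * K)
for _ in range(100):
    t_scores = [sum(a * b for a, b in zip(row, v)) for row in Xc]
    s = [sum(row[c] * t for row, t in zip(Xc, t_scores)) for c in range(J * K)]
    norm = math.sqrt(sum(x * x for x in s))
    v = [x / norm for x in s]

# One score per batch: the coordinate used to track batch-to-batch evolution
scores = [sum(a * b for a, b in zip(row, v)) for row in Xc]
print(len(scores))
```

In the paper's setting, plotting such batch scores (and contribution plots on the loadings) is what lets one see when successive batches settle to a steady state.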

7.
Process analytical technology is an essential step forward in the pharmaceutical industry. Real-time analyzers will provide timely data on quality properties. This information, combined with process data (temperatures, flow rates, pressure readings) collected in real time, can become a powerful tool for this industry: for process understanding, process and quality monitoring, abnormal situation detection, and for improving product quality and process reliability. A very important tool for this achievement is multivariate analysis. Dr. Theodora Kourti is Research Manager in the McMaster Advanced Control Consortium (MACC) and Adjunct Professor in the Chemical Engineering Department at McMaster University. She is the co-recipient of the 2003 University – Industry Synergy Award for Innovation, given by the Natural Science & Engineering Research Council of Canada. Dr. Kourti has been working on Multivariate Statistical Methods for Process and Product Improvement and Abnormal Situation Detection in Process Industries since 1992 and has been involved in more than 80 major industrial applications in North America and Europe. These are either off-line or real-time applications for batch and continuous processes, in diverse industries such as Chemicals, Pharmaceuticals, Semiconductor, Mining, Pulp and Paper, Petrochemicals, Photographic and Steel Industry. She has published extensively in this area and has provided training for numerous industrial practitioners.

8.
Many high-quality products are produced in a batchwise manner. One of the characteristics of a batch process is its recipe-driven nature: by repeating the recipe in an identical manner, the desired end-product is obtained. However, in spite of repeating the recipe identically, process differences occur. These differences can be caused by a change of feedstock supplier or by impurities in the process. As a result, differences in end-product quality may occur, or unsafe process situations may arise. Therefore, the need to monitor an industrial batch process exists. An industrial process is usually monitored via process measurements such as pressures and temperatures. Nowadays, owing to technical developments, spectroscopy is increasingly used for process monitoring. Spectroscopic measurements have the advantage of giving direct chemical insight into the process. Multivariate statistical process control (MSPC) is a statistical way of monitoring the behaviour of a process. Combining spectroscopic measurements with MSPC makes it possible to detect process perturbations or deviations from normal operating conditions in a very simple manner. In the following, an application to batch process monitoring is given. It is shown how a calibration model is developed and used according to the principles of MSPC. Statistical control charts are developed and used to detect batches with a process upset.
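The control-chart idea can be sketched with a classical univariate 3-sigma chart; MSPC generalizes this to multivariate statistics such as Hotelling's T² and Q computed from a calibration model. The training data, the limits and the "new batches" below are synthetic.

```python
import random
import statistics

random.seed(2)
# Training data: one monitored quality variable from 20 "good" batches
good = [random.gauss(10.0, 0.5) for _ in range(20)]
mu = statistics.mean(good)
sigma = statistics.stdev(good)

# Classical 3-sigma control limits built from normal operating conditions
ucl, lcl = mu + 3 * sigma, mu - 3 * sigma

# New batches: the third one carries a process upset
new_batches = [10.2, 9.8, 13.5, 10.1]
upsets = [not (lcl <= x <= ucl) for x in new_batches]
print(upsets)
```

In the spectroscopic MSPC setting the monitored quantity would be a model-derived statistic per batch rather than a single raw measurement, but the charting logic is the same.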

9.
The decomposition and volatilization reactions in the glass batch are important factors in glass manufacturing, for the optimization of melting conditions and material properties as well as for the minimization of environmental impacts and the complete utilization of the raw materials. The use of modern analytical techniques to study such processes in the laboratory is a first step towards industrial process control. One of the most successful techniques in evolved gas analysis has been mass spectrometry. The different types of mass filters and their advantages and limits are described for use in batch reactions and glass melting.

10.
A new standardized lipolysis approach is presented where the focus is on the initial rate of lipolysis. An advantage is that data obtained in this way reflect degradation before growing amounts of lipolysis products retard the process. The method can be used to rank different lipase substrates. In particular, the method can be used to obtain information about the susceptibility to degradation of various emulsions and dispersions that are used in technical applications. We present how the method is standardized to facilitate comparison of various substrates. This involves (i) lipase substrate in excess, i.e., the amount of lipase is rate limiting, and (ii) expressing rate of degradation relative to that of a reference substrate, tributyrin. Under such conditions, with the amount of lipase substrate held constant, an increase in enzymatic activity will generate a proportional increase in the lipolysis rate. This enables comparison of results obtained from different enzyme batches and corrects for day-to-day variability. Examples illustrating the potential of the method to discriminate and rank different lipase substrates with regard to enzymatic degradation are presented.

11.
A semi-continuous process of polymer-enhanced ultrafiltration for the removal of lead and cadmium has been developed. This operation mode allows better coupling between industrial and laboratory-scale processes. Basically, it includes two stages: (1) metal retention, in which a permeate stream free of heavy metals is obtained; and (2) polymer regeneration, in which the polymer is regenerated so that it can be reused in the metal retention stage. To operate in this way, a control system for the permeate and feed stream flows has been installed in a batch laboratory-scale plant. First, the most suitable hydrodynamic operating parameters were obtained through ultrafiltration experiments. The influence of pH was studied in order to fix the pH for the metal retention and polymer regeneration experiments, and the operative polymer binding capacity was determined to establish the amount of metal that can be treated. A mathematical model taking into account both the conservation equations and the competitive reactions occurring in the medium has been established. This mathematical model, which is in good agreement with the experimental data, enables the estimation of design parameters for dimensioning pilot- and industrial-scale installations based on this process.

12.
Thermal processes are part of many industrial treatments; therefore, it is of great interest to gain more insight into these processes. Evolved gas analysis (EGA) is the most straightforward way to make the chemical reactions in thermal processes accessible to on-line investigation. The sample matrix of evolved off-gas, e.g. from coffee roasting, is a constantly changing and complex mixture of a multitude of substances that have to be analyzed simultaneously for true on-line investigation without any sample trapping or separation device. Therefore, a measurement system such as an ion trap mass spectrometer (ITMS) with soft ionization is required, whose tandem mass spectrometry capability provides distinct substance identification unperturbed by the remaining matrix. The presented novel system setup is based on a thermogravimetric device (TG), to simulate the thermal treatment as in industrial processes, combined with an ITMS with soft single photon ionization (SPI) to obtain the required substance information. Hence it is possible to gain single mass spectrometric information on expected substances for process control. More comprehensive, though, are the two-dimensional MS data, which are required for research and process development purposes. The analyses conducted show that this novel setup is able to provide distinct substance identification in the evolved gas of roast and ground coffee powder. To our knowledge, this is the first TG–SPI–ITMS setup successfully applied to verifying the identity of different mass traces within a single run.

13.
Multivariate curve resolution (MCR), and especially the orthogonal projection approach (OPA), can be applied to spectroscopic data and has proved suitable for process monitoring. To improve the quality of on-line monitoring of batch processes, it is desirable to acquire as many spectra as possible in a given period of time. Nevertheless, hardware limitations may make it impossible to acquire more than a certain number of spectra in that time. Wavelength selection is a good way to mitigate this problem, since it decreases the size, and consequently the acquisition time, of each recorded spectrum. This paper details an industrial application of genetic algorithms (GA) coupled with a curve resolution method (OPA) for this purpose.

14.
In technical chemistry, systems biology and biotechnology, the construction of predictive models has become an essential step in process design and product optimization. Accurate modelling of the reactions requires detailed knowledge about the processes involved. However, when developing new products and production techniques, for example, this knowledge is often not available due to the lack of experimental data. Thus, when one has to work with a selection of proposed models, one of the main tasks of early development is to discriminate between these models. In this article, a new statistical approach to model discrimination is described that ranks models with respect to the probability with which they reproduce the given data. The article introduces the new approach, discusses its statistical background, presents numerical techniques for its implementation and illustrates its application with examples from biokinetics.
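One common generic device for ranking candidate models by how plausibly they reproduce the data is the Akaike weight; the sketch below applies it to synthetic first-order decay data with two invented candidate models. It illustrates the ranking idea only, not the article's specific statistical approach.

```python
import math

# Synthetic concentration data generated by first-order decay with k = 0.7
ts = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
data = [math.exp(-0.7 * t) for t in ts]

def first_order(t, k):
    return math.exp(-k * t)

def zero_order(t, k):
    return max(0.0, 1.0 - k * t)

def sse(model, k):
    return sum((model(t, k) - y) ** 2 for t, y in zip(ts, data))

def fit(model):
    """Crude 1-D grid search over the rate parameter."""
    grid = [0.01 * i for i in range(1, 301)]
    best = min(grid, key=lambda k: sse(model, k))
    return best, sse(model, best)

# AIC per model (each has one fitted parameter), then Akaike weights:
# the weight approximates the probability that a model best reproduces the data
n = len(ts)
aics = [n * math.log((fit(m)[1] + 1e-12) / n) + 2
        for m in (first_order, zero_order)]
m0 = min(aics)
raw = [math.exp(-(a - m0) / 2) for a in aics]
weights = [r / sum(raw) for r in raw]
print(weights[0] > weights[1])
```

Here the generating model receives essentially all the weight; with noisy data and closer candidates the weights would spread out, which is precisely when formal discrimination criteria earn their keep.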

15.
Wavelet transforms are presented as a useful tool to analyse electrochemical noise data. Various concepts developed in the framework of wavelet transforms have been adapted to study electrochemical noise measurements. The most relevant feature of this method of analysis is its capability of decomposing electrochemical noise records into different sets of wavelet coefficients, which contain information about corrosion events occurring at a determined time-scale. Thus, this mathematical approach could become an alternative tool which solves the limitations of other more established procedures for the analysis of electrochemical noise data, such as statistical or Fourier transform-based methods.
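The decomposition into sets of wavelet coefficients per time-scale can be illustrated with the simplest case, the Haar transform. The "noise record" below is synthetic, and a real analysis would typically use a dedicated wavelet library and a smoother wavelet.

```python
import math
import random

def haar_step(signal):
    """One level of the orthonormal Haar transform."""
    approx = [(a + b) / math.sqrt(2) for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) / math.sqrt(2) for a, b in zip(signal[0::2], signal[1::2])]
    return approx, detail

def haar_decompose(signal, levels):
    """Multilevel decomposition: one set of detail coefficients per time-scale."""
    details, approx = [], list(signal)
    for _ in range(levels):
        approx, d = haar_step(approx)
        details.append(d)
    return approx, details

random.seed(1)
# Toy "electrochemical noise" record: a slow drift plus fast fluctuations
record = [0.05 * i + random.gauss(0, 1.0) for i in range(64)]

approx, details = haar_decompose(record, levels=3)

# Energy per scale: corrosion events occurring at a particular time-scale
# would concentrate energy in the corresponding coefficient set
energies = [sum(c * c for c in d) for d in details]
print(len(approx), [len(d) for d in details])
```

Because the transform is orthonormal, the total signal energy is exactly partitioned across the approximation and the detail sets, which is what makes per-scale energy a meaningful diagnostic.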

16.

It is very rare that a one-step extraction process leads to a pure compound with the high degree of purity specified by an industrial application. The various stages of a synthesis process and possible secondary reactions may lead to more or less complex and highly diluted solutions. In this work, the rationale and strategy for the extraction and purification of a high-added-value compound are discussed. The entire approach is based on the knowledge and exploitation of phase diagrams, and is then developed for the different unit operations of the process. The most significant research tools are the experimental data and the modelling of phase equilibria to estimate the yield of each extraction step. The significant example chosen involves all the basic methods of phase separation, starting with the liquid-vapour equilibrium: stripping of high-volatility components and then more or less complex distillation are classically employed. The theoretical plate number can be deduced from the equilibrium curves. The second step is based on the study of the liquid-liquid equilibrium and is an intermediate step for enrichment of the solution when distillation is not possible. A final step based on the solid-liquid equilibrium consists of the selective crystallization of the pure product at low temperature, in order to satisfy the requirements of purity and safety imposed by industrial use. The conclusion brings together all the isolation operations in the form of a general extraction and purification scheme.


17.
We investigate the preparation of nearly monodisperse gold nanoparticles by heat treatment under different conditions. The effects of various solvents, heating temperature and heating time on the monodispersity of the gold nanoparticles were studied systematically, and a general route to gold nanoparticles of uniform size was determined. The first step was to prepare gold nanoparticles smaller than 3 nm; the following operation was to heat these nanoparticles in the presence of thiolated solvents, whereby monodisperse gold nanoparticles could be obtained easily. Our approach enriches the synthesis of monodisperse gold nanoparticles and may provide valuable experimental data on how the heating process affects the size evolution of gold nanoparticles.

18.
Rank annihilation factor analysis (RAFA) is applied to resolve two-way kinetic-spectral data measured from spectroscopic reactions and to acquire the rate constants and the absorption spectrum of each component. A two-step first-order consecutive reaction is studied in this paper. When the rate constant of the first step is treated as the optimization variable and combined with the pure spectrum of the reactant, the rank of the original data matrix can be reduced by one by annihilating the information of the reactant from the original data matrix. The residual standard deviation (R.S.D.) of the residual matrix after bilinearization of the background matrix is used as the evaluation function. Owing to the correlation between the kinetic functions of the species in the reaction, two optimal resolutions, corresponding to the rate constants of the first and second steps, respectively, can both be obtained in one computing process. Given the kinetic parameters, the absorption spectrum of each component, including the intermediate, can be obtained through least-squares regression. This approach can also be applied to reaction systems where the intermediate or the final product does not absorb. The performance of the method has been evaluated using synthetic data. The electrodegradation of phenol solution and the alkaline hydrolysis of dimethyl phthalate were also studied by the present method.
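The annihilation step can be demonstrated on synthetic data for A → B → C: build a bilinear data matrix from the analytic concentration profiles and invented pure spectra, then subtract the reactant's contribution computed with the correct k1 so that only the intermediate and product remain. The rate constants and spectra below are arbitrary.

```python
import math

k1, k2 = 1.0, 0.4  # invented rate constants for A -> B -> C
ts = [0.1 * i for i in range(30)]

# Analytic first-order consecutive profiles with [A]0 = 1
cA = [math.exp(-k1 * t) for t in ts]
cB = [k1 / (k2 - k1) * (math.exp(-k1 * t) - math.exp(-k2 * t)) for t in ts]
cC = [1 - a - b for a, b in zip(cA, cB)]

# Hypothetical pure spectra over 5 wavelengths
sA, sB, sC = [1, 2, 3, 2, 1], [0, 1, 2, 3, 1], [2, 1, 0, 1, 2]

def outer(c, s):
    """Rank-1 bilinear contribution of one species: c * s^T."""
    return [[ci * sj for sj in s] for ci in c]

def add(M, N):
    return [[a + b for a, b in zip(r1, r2)] for r1, r2 in zip(M, N)]

# Two-way kinetic-spectral data matrix (time x wavelength)
D = add(add(outer(cA, sA), outer(cB, sB)), outer(cC, sC))

# Annihilation: with the correct k1 profile and the reactant's pure spectrum,
# subtracting A's bilinear contribution leaves only B and C (rank drops by 1)
residual = [[d - a for d, a in zip(rd, ra)] for rd, ra in zip(D, outer(cA, sA))]
target = add(outer(cB, sB), outer(cC, sC))
err = max(abs(r - t) for rr, tt in zip(residual, target)
          for r, t in zip(rr, tt))
print(err < 1e-12)
```

In RAFA proper, a wrong trial k1 leaves a residue of the reactant in the matrix, so the residual standard deviation as a function of the trial rate constant dips at the true value, which is what makes it usable as the evaluation function.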

19.
Process development, optimization and robustness analysis for chromatographic separation are often based entirely on experimental work and generic knowledge. This paper describes a model-based approach that can be used to gain process knowledge and assist in the robustness analysis of an ion-exchange chromatography step. A kinetic dispersive model, in which the steric mass action model accounts for the adsorption, is used to describe column performance. Model calibration is based solely on gradient elution experiments at different gradients, flow rates, pH and column loads. The position and shape of the peaks provide enough information to calibrate the model, so single-component experiments can be avoided. The model is calibrated to the experiments, and the confidence intervals for the estimated parameters are used to account for the model error throughout the analysis. The model is used to predict the result of a robustness analysis conducted as a factorial experiment and to design a robust pooling approach. The confidence intervals are used in a "worst case" approach in which the parameters for the components are set at the edge of their confidence intervals to create a worst case for the removal of impurities at each point in the factorial experiment. The pooling limit was changed to ensure product quality at every point in the factorial analysis. The predicted purities and yields were compared to the experimental results to ensure that the prediction intervals cover the experimental results.

20.
When studying principal component analysis (PCA) or partial least squares (PLS) modelling of batch process data, one realizes that there is a wide range of approaches. In many cases, new modelling approaches are presented simply because they work properly for a particular application, for example, on-line monitoring of a given set of processes. A clear understanding of why these approaches perform successfully, and of their advantages and disadvantages relative to the others, is seldom supplied. Why does modelling after batch-wise unfolding capture changing dynamics? What are the consequences of variable-wise unfolding? Is there a best unfolding method? When should several models for a single process be used? In this paper, it is shown how these and other related questions can be answered by properly analyzing the dynamic covariance structures of the various approaches. Copyright © 2008 John Wiley & Sons, Ltd.
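The difference between the two unfolding directions discussed above comes down to which mode of the three-way array becomes the row index; a small sketch with invented dimensions:

```python
# Three-way batch data: I batches x J variables x K time points (invented)
I, J, K = 4, 3, 5
data = [[[b + 10 * j + 100 * t for t in range(K)] for j in range(J)]
        for b in range(I)]

# Batch-wise unfolding: I x (J*K) -- one row per batch, so a PCA model
# sees the full variable-by-time covariance and can capture dynamics
# that change over the course of the batch
batch_wise = [[data[b][j][t] for j in range(J) for t in range(K)]
              for b in range(I)]

# Variable-wise unfolding: (I*K) x J -- one row per (batch, time) pair,
# so time-varying dynamics are pooled into a single J x J covariance
variable_wise = [[data[b][j][t] for j in range(J)]
                 for b in range(I) for t in range(K)]

print(len(batch_wise), len(batch_wise[0]))
print(len(variable_wise), len(variable_wise[0]))
```

The comment contrasts are the intuition the paper formalizes: which covariance structure each unfolding exposes determines what the resulting PCA/PLS model can and cannot monitor.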
