Similar Documents
20 similar documents found.
1.
Existing software and computer systems in laboratories require retrospective evaluation and validation if their initial validation was not formally documented. The key steps in this process are similar to those for the validation of new software and systems: user requirements and system specification, formal qualification, and procedures to ensure ongoing performance during routine operation. The main difference is that qualification of an existing system is frequently based primarily on reliable operation and proof of performance in the past rather than on qualification during development and installation. Received: 30 April 1998 · Accepted: 2 June 1998

2.
Installation and operational qualification are important steps in the overall validation and qualification process for software and computer systems. This article guides users of such systems step by step through the installation and operational qualification procedures. It provides guidelines on what should be tested and documented during installation prior to routine use. The author also presents procedures for the qualification of software using chromatographic data systems and a network server for central archiving as examples. Received: 31 October 1997 · Accepted: 25 November 1997

3.
It is only possible to obtain analytical results that are suitable for their intended purpose if the equipment used is capable of producing measurements of the required quality. To ensure that this requirement is met, analysts should define the performance criteria required of the instruments, ensure that only suitable instruments are selected for analytical measurements, and confirm that these instruments continue to meet these criteria for their entire operational life. This process should be conducted on a formal, documented basis, known as equipment qualification. In addition to describing the key elements of equipment qualification for all analytical instruments, this paper gives specific guidance, not previously available in the literature, on its application to conductivity systems. The benefits of performing equipment qualification are highlighted, and guidance is given on the selection of control standards and on why having the equipment vendor perform stages of equipment qualification can benefit the user. The relationship between equipment qualification and method validation is discussed, including how these activities play a major role in determining the quality control measures that should be applied to routine analysis.

4.
Capillary electrophoresis (CE) is increasingly being used in regulated and testing environments which demand validation. The design, development and production of CE instrumentation should be governed by qualifications which ensure the quality of the finished product. The vendor should therefore provide guidelines and procedures which assist the user in ensuring the adequate operation of the instrumentation, and especially in designing installation qualification (IQ) and operational qualification/performance verification (OQ/PV) procedures. OQ/PV should test those functions of an instrument which directly affect the CE analysis, i.e. voltage, temperature, injection precision and detector function. When validating CE methods, care should be taken to appreciate those aspects that directly affect the precision of peak parameters. The relationship between CE instrumentation, chemistry and validation parameters is discussed, and guidelines are presented for defining a CE method for submission to regulatory authorities.
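A minimal sketch of how such OQ/PV checks might be scripted, assuming the instrument readings can be captured programmatically; the nominal values and tolerance limits below are illustrative assumptions, not specifications from the paper:

```python
# Hypothetical OQ/PV tolerance checks for a CE instrument.
# Nominal values and tolerances are illustrative assumptions only.

def check(name, measured, nominal, tolerance):
    """Return True if a measured value lies within nominal +/- tolerance."""
    ok = abs(measured - nominal) <= tolerance
    print(f"{name}: measured={measured}, nominal={nominal}, "
          f"tolerance=±{tolerance} -> {'PASS' if ok else 'FAIL'}")
    return ok

# Example readings taken during operational qualification (assumed numbers).
results = [
    check("voltage (kV)", 29.97, 30.0, 0.1),              # high-voltage accuracy
    check("capillary temperature (°C)", 25.2, 25.0, 0.5),
    check("injection precision (%RSD)", 0.8, 0.0, 1.0),   # RSD below limit
    check("detector noise (mAU)", 0.04, 0.0, 0.05),
]
print("OQ/PV overall:", "PASS" if all(results) else "FAIL")
```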

5.
Analytical laboratories are increasingly required to meet official regulatory requirements, as described in the FDA and EPA good laboratory practice, good automated laboratory practice and good manufacturing practice regulations, or to formally establish quality systems such as those specified in the ISO 9000 series of quality standards, in ISO Guide 25 or in the EN 45001 guidelines. The impact on analytical instrumentation is a requirement for stringent validation of analytical equipment and methods, which increases overall analysis costs. An overview is presented of the validation requirements for, e.g., gas chromatography, high performance liquid chromatography, capillary electrophoresis and UV-visible spectroscopy, and of a strategy for meeting such needs at minimal extra cost with the help of an instrument vendor. It is recommended to use instrument hardware that has built-in tools for self-verification and that is validated at the vendor's site. Performance testing in the user's laboratory is done using standard operating procedures supplied with the instrument. If resources in the user's laboratory are limited, the performance verification is done by the vendor. Software and the entire computer system are validated prior to shipment at the vendor's site. Acceptance testing is done in the user's environment following the vendor's recommendations. Analytical methods are validated automatically at the end of method development using dedicated software. The software can be customized so that it can also be used for daily automated system suitability testing. Security and integrity of analytical data are ensured by saving the raw data together with instrument conditions and instrument log-books in checksum-protected binary register files for long-term archiving.
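The checksum-protected archiving idea can be illustrated with a short sketch. The abstract does not name a checksum algorithm or a file format, so the use of SHA-256 and a JSON record here are assumptions for illustration only:

```python
# Illustrative sketch of checksum-protected archiving of raw data.
# SHA-256 and the JSON layout are assumptions; the abstract only says
# raw data are stored with instrument conditions in protected files.
import hashlib
import json

def archive(path, raw_data: bytes, conditions: dict):
    """Write raw data, instrument conditions and a checksum to one record."""
    record = {
        "conditions": conditions,            # instrument conditions / log-book
        "raw_data": raw_data.hex(),
        "checksum": hashlib.sha256(raw_data).hexdigest(),
    }
    with open(path, "w") as f:
        json.dump(record, f)

def verify(path) -> bool:
    """Recompute the checksum to confirm the archived data are intact."""
    with open(path) as f:
        record = json.load(f)
    data = bytes.fromhex(record["raw_data"])
    return hashlib.sha256(data).hexdigest() == record["checksum"]

archive("run001.json", b"\x00\x01\x02", {"column": "C18", "flow_mL_min": 1.0})
print("integrity OK:", verify("run001.json"))
```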

6.
Some problems of validation of computerised instruments are reviewed briefly, taking essential standards and guides into account. The significant role of certified standard reference materials is underlined. The attitude of suppliers towards the validation of instruments is presented, and producers' responsibilities and obligations are discussed. The "black-box" concept is recommended as a preliminary step for the validation of computerised instruments. Two examples from gel permeation chromatography are given that illustrate bad manufacturer's practice (BMP) and good manufacturer's practice (GMP). In the case of BMP, a need is expressed for a guide and for regulations that should be implemented in the quality assurance system. It is proposed that the EURACHEM/VAM draft guidance for the qualification/validation of instruments be amended by incorporating the "black-box" approach as a preliminary procedure for the validation of computerised instruments, a retrospective validation procedure for cases where the need for current validation was not foreseen or not specified, and a procedure (or selection rules) for qualification of the supplier. Moreover, mechanisms of inspection to control the observance of the standardised rules and commonly recognised recommendations should also be considered by international quality organisations. Received: 19 November 1996 · Accepted: 20 March 1997
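A minimal sketch of the "black-box" concept, under the assumption that the computerised instrument can be exercised programmatically: certified reference materials go in, and only the outputs are compared with the certified values within their uncertainties. Every name and number below is hypothetical:

```python
# "Black-box" validation sketch: the computerised instrument is treated
# as an opaque function; only inputs (certified reference materials) and
# outputs are examined. All names and values below are hypothetical.

def black_box_check(measure, reference_materials, k=2.0):
    """measure(sample_id) -> measured value; compare with certified values."""
    all_ok = True
    for sample_id, certified, uncertainty in reference_materials:
        measured = measure(sample_id)
        ok = abs(measured - certified) <= k * uncertainty   # coverage factor k
        all_ok &= ok
        print(f"{sample_id}: {measured:.3f} vs {certified:.3f} "
              f"(±{k * uncertainty:.3f}) -> {'PASS' if ok else 'FAIL'}")
    return all_ok

# Hypothetical instrument call and certified reference materials.
fake_instrument = {"CRM-A": 10.02, "CRM-B": 4.97}.__getitem__
crms = [("CRM-A", 10.00, 0.02), ("CRM-B", 5.00, 0.03)]
print("black-box validation:",
      "PASS" if black_box_check(fake_instrument, crms) else "FAIL")
```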

7.
The validation of a molecular organic structure on the basis of 1D and 2D HSQC, COSY and HMBC NMR spectra is proposed as an alternative to methods that are based mainly on chemical shift prediction. The CCASA software was written for this purpose. It provides an updated and improved implementation of the preceding computer-assisted spectral assignment software. CCASA can be downloaded freely from http://www.univ‐reims.fr/LSD/JmnSoft/CASA . Two bioactive natural products, a triterpene and a benzophenone, were selected from literature data as examples. The tentative matching between the structure and the NMR data interpretation of the triterpene unexpectedly led to the hypothesis of an incorrect structure. The LSD software was used to find an alternative structure that improved the 2D NMR data interpretation and the carbon-13 chemical shift matching between experimental values and those produced by the nmrshiftdb2 prediction tool. The benzophenone example showed that signal assignment by means of chemical shift prediction can be replaced by elementary user-supplied chemical shift and multiplicity constraints. Copyright © 2012 John Wiley & Sons, Ltd.
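The carbon-13 chemical shift matching step can be sketched as scoring candidate structures by the deviation between experimental and predicted shifts; the shift values below are invented, and nmrshiftdb2 is not actually called:

```python
# Sketch of carbon-13 chemical shift matching between experimental values
# and predicted values (e.g. from a tool such as nmrshiftdb2). The shift
# lists below are invented; a real workflow would import predictions.

def mean_abs_deviation(experimental, predicted):
    """Score a candidate assignment: lower means a better structure match."""
    assert len(experimental) == len(predicted)
    return sum(abs(e - p) for e, p in zip(experimental, predicted)) / len(experimental)

exp_shifts   = [178.2, 122.5, 143.1, 39.8, 28.4]   # ppm, assumed experimental
pred_orig    = [176.9, 131.0, 140.2, 41.5, 27.9]   # prediction, original structure
pred_revised = [177.8, 123.0, 142.5, 40.1, 28.6]   # prediction, revised structure

print("original :", round(mean_abs_deviation(exp_shifts, pred_orig), 2), "ppm")
print("revised  :", round(mean_abs_deviation(exp_shifts, pred_revised), 2), "ppm")
# A markedly lower deviation for the revised structure supports the revision.
```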

8.
The use of Quantrad Sensor's Scout™ in field-type applications is described. The portability of the Scout™ enables the user to obtain more accurate information in the field than a survey meter. Isotopic identification is possible when ancillary information is combined with built-in software libraries. Data are presented from remediation work at the Stanford Linear Accelerator Center (SLAC), NORM (Naturally Occurring Radioactive Material) measurements in California's Central Valley oil fields, medical isotope identification at a nuclear pharmaceutical company, and emergency response applications. Additionally, custom software enabled the use of the Scout™ for the identification, qualification and detection of Special Nuclear Materials (SNM) in illicit trafficking and portal monitoring applications.
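Library-based isotopic identification of the kind described can be sketched as matching detected gamma-ray peak energies against characteristic lines; the tiny library, the energies and the tolerance below are illustrative assumptions:

```python
# Sketch of isotope identification by matching detected gamma-ray peak
# energies against a built-in library. Energies and the tolerance window
# are illustrative; real libraries are far larger and detector-specific.

LIBRARY = {                 # isotope -> characteristic gamma energies (keV)
    "Cs-137": [661.7],
    "Co-60":  [1173.2, 1332.5],
    "Ra-226": [186.2, 609.3, 1120.3],   # including progeny lines
}

def identify(peaks_kev, tolerance_kev=2.0):
    """Return isotopes whose library lines are all found among the peaks."""
    matches = []
    for isotope, lines in LIBRARY.items():
        if all(any(abs(p - line) <= tolerance_kev for p in peaks_kev)
               for line in lines):
            matches.append(isotope)
    return matches

detected = [662.1, 1173.0, 1332.9]                 # hypothetical spectrum peaks
print("candidate isotopes:", identify(detected))   # ['Cs-137', 'Co-60']
```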

9.
Specific programming of automated HPLC systems allows total on-line qualification, validation and stability monitoring using the concept of deferred standards. Setting up such a process for routine analyses in an automated HPLC system requires specific autosampler programming as well as specific monitoring software. With an autosampler, a double injection procedure is programmed: the first injection introduces the sample and the second, deferred by a few minutes, introduces the deferred control standard. Two additional compounds are therefore added to the sample before and during the chromatographic process: the internal standard for sample quantification and the deferred standard for system control. Specific methodologies are described for obtaining classical quantitative analysis information as well as system qualification, validation and stability information. Experiments were performed to develop methodologies for monitoring the quality of quantitative analysis during the life of the column, using the deferred standard concept to probe the effects of column ageing on separation characteristics.
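A minimal sketch of the monitoring side of the deferred standard concept, assuming the peak area of the deferred standard is tracked from run to run; the reference area and the acceptance window are invented:

```python
# Sketch of system monitoring with a deferred standard: after each sample
# injection, a control standard is injected a few minutes later, and its
# peak parameters are tracked over the life of the column. Numbers are
# invented; acceptance limits would come from the method validation.

REFERENCE_AREA = 1000.0   # deferred-standard peak area at qualification
LIMIT_PERCENT = 5.0       # assumed acceptance window, +/- 5 %

def check_deferred_standard(run_id, peak_area):
    """Flag runs where the deferred standard drifts outside the window."""
    drift = 100.0 * (peak_area - REFERENCE_AREA) / REFERENCE_AREA
    status = "OK" if abs(drift) <= LIMIT_PERCENT else "SYSTEM DRIFT - INVESTIGATE"
    print(f"run {run_id}: area={peak_area:.1f}, drift={drift:+.1f}% -> {status}")

# Peak areas of the deferred standard across successive runs (assumed).
for run_id, area in enumerate([998.0, 1003.5, 981.0, 942.0], start=1):
    check_deferred_standard(run_id, area)
```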

10.
The approach to validation of a computer program for an analytical instrument as a component of the analytical method (using this instrument with the program) is discussed. This approach was used to validate a new program for atomic absorption analysis. The validation plan derived from this approach was based on minimising the influence of all steps of the analytical procedure on the analytical results obtained by the method; in this way, significant changes in the results can be caused only by replacement of the previous program with the new one. The positive validation conclusion was based on comparing the results of the analysis of suitable reference materials obtained with the new program and with its precursor under the same conditions, and on comparing their deviations from the accepted reference values for these materials with the corresponding uncertainties. Received: 25 January 1997 · Accepted: 14 March 1997
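The acceptance criterion implied by this comparison can be sketched as follows, assuming an expanded uncertainty with coverage factor k = 2; all values are invented for illustration:

```python
# Sketch of the acceptance criterion implied above: results from the new
# program and its precursor are compared with the accepted reference value,
# and a deviation is significant only if it exceeds the combined expanded
# uncertainty. All numbers are invented for illustration.
import math

def consistent(result, u_result, reference, u_reference, k=2.0):
    """True if |result - reference| <= k * combined standard uncertainty."""
    return abs(result - reference) <= k * math.hypot(u_result, u_reference)

reference, u_ref = 50.0, 0.3   # certified value of the reference material
old_prog, u_old = 50.2, 0.4    # result obtained with the precursor program
new_prog, u_new = 50.1, 0.4    # result obtained with the new program

print("precursor consistent:", consistent(old_prog, u_old, reference, u_ref))
print("new program consistent:", consistent(new_prog, u_new, reference, u_ref))
```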

11.
12.
Increasing importation of food and the diversity of potential contaminants have necessitated more analytical testing of these foods. Historically, mass spectrometric methods for testing foods were confined to monitoring selected ions (SIM or MRM), achieving sensitivity by focusing on targeted ion signals. A limiting factor in this approach is that any contaminants not included on the target list are not typically identified and retrospective data mining is limited. A potential solution is to utilize high‐resolution MS to acquire accurate mass full‐scan data. Based on the instrumental resolution, these data can be correlated to the actual mass of a contaminant, which would allow for identification of both target compounds and compounds that are not on a target list (nontargets). The focus of this research was to develop software algorithms to provide rapid and accurate data processing of LC/MS data to identify both targeted and nontargeted analytes. Software from a commercial vendor was developed to process LC/MS data and the results were compared to an alternate, vendor‐supplied solution. The commercial software performed well and demonstrated the potential for a fully automated processing solution.
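A minimal sketch of targeted accurate-mass matching under an assumed ppm tolerance; the target masses and detected features are illustrative, and a real workflow would also use retention time and isotope patterns:

```python
# Sketch of accurate-mass matching of full-scan LC/MS features against a
# target list. The ppm tolerance and the masses are illustrative only.

TARGETS = {                      # name -> monoisotopic mass (Da), assumed
    "carbendazim": 191.0695,
    "thiabendazole": 201.0361,
}

def match_features(feature_masses, ppm_tol=5.0):
    """Return (mass, target name, ppm error) for features within tolerance."""
    hits = []
    for mass in feature_masses:
        for name, target in TARGETS.items():
            ppm_error = 1e6 * (mass - target) / target
            if abs(ppm_error) <= ppm_tol:
                hits.append((mass, name, round(ppm_error, 2)))
    return hits

features = [191.0698, 305.1621, 201.0352]   # hypothetical detected masses
for mass, name, ppm in match_features(features):
    print(f"{mass:.4f} Da -> {name} ({ppm:+.2f} ppm)")
# Unmatched masses (e.g. 305.1621) remain available as nontargets for
# retrospective data mining.
```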

13.
Analytical chemists in process development in the pharmaceutical industry have to solve the difficult problem of producing high-quality methods for purity determination and assay within a short time, without a clear definition of the substance to be analyzed. This makes quality management very difficult. The ideal situation would be for every method to be validated before use, but this is not possible because it would delay the development process. A process-type quality development approach with a fast, estimation-based validation (measurement uncertainty) is therefore suggested. The quality management process consists of the estimation of measurement uncertainty early in the project; statistical process control (SPC) is started directly after the measurement uncertainty estimation, and a classical validation is performed at the end of the project. This approach defines a process that supports development quickly and cost-efficiently, delivers the appropriate quality at the end of the process, and provides the transparency needed in the development process. The procedure presented addresses the problem of the parallelism between the two development processes (chemical and analytical development) by speeding up the analytical development process initially. Received: 25 March 1997 · Accepted: 17 May 1997
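The SPC step can be sketched as a simple control chart whose limits derive from the estimated measurement uncertainty; the Shewhart-style k = 3 limits and the data are assumptions for illustration:

```python
# Sketch of the statistical process control (SPC) step: once a measurement
# uncertainty estimate exists, control-chart limits can be set and routine
# results monitored against them. Data and limits are invented.

def control_limits(target, u_standard, k=3.0):
    """Shewhart-style limits: target +/- k standard uncertainties (assumed)."""
    return target - k * u_standard, target + k * u_standard

target, u = 99.5, 0.25   # assay target (%) and estimated standard uncertainty
low, high = control_limits(target, u)

for day, value in enumerate([99.4, 99.7, 99.2, 100.4], start=1):
    in_control = low <= value <= high
    print(f"day {day}: {value:.1f}% "
          f"-> {'in control' if in_control else 'OUT OF CONTROL'}")
```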

14.
Computer-assisted analysis of high-resolution NMR spectra has been known for many years. Hardware and software development now permits the implementation of such programs on personal computers. The structural information hidden in complex proton NMR spectra becomes easily accessible through graphical user interfaces and direct data exchange between programs. A new mode has been implemented in 1D WIN-NMR to support the analysis of multiplet patterns with first-order rules. Structure display, direct export mechanisms to the simulation program WIN-DAISY, and an archiving facility complete the state-of-the-art data analysis. Some practical examples are given. Received: 25 October 1996 · Revised: 6 March 1997 · Accepted: 10 March 1997
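First-order multiplet analysis of the kind this mode supports can be sketched by successive doubling of lines for each coupling constant; the chemical shift and J values below are invented:

```python
# Sketch of first-order multiplet analysis: a spin coupled to several
# partners with couplings J produces a pattern obtained by successive
# doubling of lines. Chemical shift and J values are invented.
from itertools import product

def first_order_multiplet(shift_hz, couplings_hz):
    """Return sorted line positions (Hz) of a first-order multiplet."""
    lines = []
    for signs in product((+0.5, -0.5), repeat=len(couplings_hz)):
        lines.append(shift_hz + sum(s * j for s, j in zip(signs, couplings_hz)))
    return sorted(lines)

# Proton at 7.25 ppm on a 400 MHz spectrometer, dd with J = 8.0 and 2.0 Hz.
center = 7.25 * 400.0
for line in first_order_multiplet(center, [8.0, 2.0]):
    print(f"{line:.1f} Hz ({line / 400.0:.3f} ppm)")
```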

15.
A method validation procedure requires a strategy for collecting the validation data best adapted to the analytical technique used. A flexible and general approach based on Object Linking and Embedding technology is proposed. It allows a traceable validation strategy using modular objects which encapsulate documentation, analytical data and processing logic. The contents of these objects are accessible through a standard user interface. This paper demonstrates how this can reduce experiment time, simplify evaluation efforts, and make validation figures of merit easier to use. An illustration using Microsoft Visual Basic for Applications is presented, and some specific aspects are described. It consists of the evaluation of a time-domain NMR technique, involving a multivariate calibration step, for determining the moisture content of foods. This study also illustrates how guidelines such as Good Validation Practices could be defined to present all validation documents in a standardised manner. Received: 3 September 1996 · Accepted: 14 October 1996

16.
We have developed a graphical-user-interface-based dendrimer builder toolkit (DBT) which can be used to generate dendrimer configurations of the desired generation for various dendrimer architectures. The structures generated by this tool were validated by studying the structural properties of two well-known classes of dendrimers: the ethylenediamine-cored poly(amidoamine) (PAMAM) dendrimer and the diaminobutyl-cored poly(propylene imine) (PPI) dendrimer. Using fully atomistic molecular dynamics (MD) simulation, we calculated the radius of gyration, shape tensor and monomer density distribution for the PAMAM and PPI dendrimers at neutral and high pH. Good agreement was observed between the calculated radius of gyration and available simulation and experimental (small-angle X-ray and neutron scattering; SAXS, SANS) results. With this validation, we used DBT to build another, new class of nitrogen-cored poly(propyl ether imine) dendrimer and studied its structural features using all-atomistic MD simulation. DBT is a versatile tool and can easily be used to generate other dendrimer structures with different chemistry and topology. The use of the general AMBER force field to describe the intra-molecular interactions allows us to integrate this tool easily with the widely used molecular dynamics software AMBER. This makes our tool a very useful utility that can help facilitate the study of dendrimer interactions with nucleic acids, proteins and lipid bilayers for various biological applications. © 2012 Wiley Periodicals, Inc.
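One of the structural properties mentioned, the radius of gyration, can be sketched directly from coordinates and masses; the three-atom "molecule" below is invented, and a real analysis would average over an MD trajectory:

```python
# Sketch of the radius of gyration computed from atomic coordinates and
# masses. Coordinates are invented; a real analysis would read a trajectory.
import math

def radius_of_gyration(coords, masses):
    """Mass-weighted radius of gyration of a set of atoms."""
    total = sum(masses)
    com = [sum(m * c[i] for m, c in zip(masses, coords)) / total
           for i in range(3)]
    s = sum(m * sum((c[i] - com[i]) ** 2 for i in range(3))
            for m, c in zip(masses, coords))
    return math.sqrt(s / total)

atoms = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (1.5, 1.5, 0.0)]  # Å, assumed
masses = [12.011, 12.011, 14.007]                            # C, C, N
print(f"Rg = {radius_of_gyration(atoms, masses):.3f} Å")
```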

17.
To help users of laboratory instrumentation obtain laboratory accreditation and International Organization for Standardization certification, instrument manufacturers should develop a comprehensive compliance programme for their products that includes product features, documentation and services for equipment validation and qualification. Received: 5 October 1998 · Accepted: 20 October 1998

18.
The problem of validation criteria for developing ion-selective membrane electrodes for the analysis of pharmaceuticals arises from the connection between the reliability of the construction of ion-selective membrane electrodes and the reliability of the analytical information. Liquid-membrane selective electrodes are more suitable for validation than the solid variety. The influence of the stability of the ion-pair complexes in the membrane on various parameters (e.g. response, limit of detection, and selectivity) is discussed. Validation criteria are proposed. Received: 18 September 1997 · Accepted: 17 November 1997
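One response parameter commonly checked for such electrodes is the calibration slope against the Nernstian value (about 59.2 mV per decade for a monovalent ion at 25 °C); the sketch below illustrates that standard check under assumed data and an assumed acceptance window, not criteria quoted from the paper:

```python
# Sketch of an electrode response check: fit the calibration slope
# (mV per decade of activity) and compare it with the Nernstian value for
# a monovalent ion (~59.2 mV/decade at 25 °C). Data and the acceptance
# window are assumptions for illustration.
import math

def calibration_slope(activities, potentials_mv):
    """Least-squares slope of E versus log10(activity)."""
    xs = [math.log10(a) for a in activities]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(potentials_mv) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, potentials_mv))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

activities = [1e-4, 1e-3, 1e-2, 1e-1]       # mol/L, assumed standards
potentials = [112.0, 170.5, 229.8, 288.6]   # mV, invented readings
slope = calibration_slope(activities, potentials)
print(f"slope = {slope:.1f} mV/decade "
      f"({'acceptable' if 55.0 <= slope <= 62.0 else 'out of range'})")
```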

19.
In this article, we propose a new molecular orbital program, based on density functional theory, for all-electron calculations of proteins. To carry this out in a fully analytical way, we adopted the (pure-)analytical Xα method and modified it to save a large amount of memory in large-scale calculations. Recent software technology from information science is applied to make calculations on large molecular systems feasible. The program is coded in the object-oriented language C++, its output is shown graphically, and most of the procedures in the program are controlled through an efficient graphical user interface that we developed ourselves. Such technology supports the safe construction of very large software, the tidy representation of enormous amounts of data, and the ready control of complex calculations. Test calculations with glycine polypeptides of various sizes indicate that the computation time is proportional to the 1.7th power of the number of residues. This result suggests that all-electron calculations of proteins consisting of over 1000 atoms could be performed with distributed and/or massively parallel computers. © 1997 John Wiley & Sons, Inc. Int J Quant Chem 63: 245–256, 1997
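What the reported 1.7-power scaling means in practice can be sketched with a quick extrapolation; only the exponent comes from the abstract, while the baseline timing is an invented assumption:

```python
# Sketch of the reported scaling: if computation time grows as the 1.7th
# power of the number of residues, relative cost can be extrapolated from
# one measured run. The baseline timing is an invented number; only the
# 1.7 exponent comes from the abstract.

EXPONENT = 1.7
base_residues, base_hours = 10, 1.0   # assumed: 10 residues take 1 hour

def estimated_hours(n_residues):
    """Extrapolate run time from the assumed baseline."""
    return base_hours * (n_residues / base_residues) ** EXPONENT

for n in (10, 50, 100, 500):
    print(f"{n:4d} residues: ~{estimated_hours(n):8.1f} h")
# Subquadratic growth (exponent < 2) is what makes all-electron runs on
# large proteins look feasible on distributed or parallel machines.
```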

20.
Because of its high sensitivity and specificity, hyphenated mass spectrometry has become the predominant method for detecting and quantifying metabolites present in bio-samples relevant to all sorts of life science studies. In contrast to targeted methods that are dedicated to specific features, global profiling acquisition methods allow new, unspecific metabolites to be analyzed. The challenge with these so-called untargeted methods is the proper and automated extraction and integration of features that could be of relevance. We propose a new algorithm that enables untargeted integration of samples measured with high-resolution liquid chromatography–mass spectrometry (LC–MS). In contrast to other approaches, limited user interaction is needed, which also allows less experienced users to integrate their data. The large number of single features found within a sample is combined into a smaller list of compound-related, grouped feature-sets representative of that sample. These feature-sets allow for easier interpretation and identification and, just as important, easier matching across samples. We show that the automatically obtained integration results for a set of known target metabolites match those generated with vendor software, but that at least 10 times more feature-sets are extracted as well. We demonstrate our approach using high-resolution LC–MS data acquired for 128 samples on a lipidomics platform. The data were also processed in a targeted manner (with a combination of automatic and manual integration) using vendor software for a set of 174 targets. As our untargeted extraction procedure is run per sample and per mass trace, its implementation is scalable. Because of the generic approach, we envision that this data extraction method will be used in targeted as well as untargeted analysis of many different kinds of TOF-MS data, and even of CE–MS, GC–MS or MRM data. The Matlab package is available for download on request, and efforts are directed toward a user-friendly Windows executable.
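The grouping of single features into compound-related feature-sets can be sketched with a simple retention-time window; the window, the (m/z, RT) list and the single-pass logic below are illustrative assumptions, not the published algorithm:

```python
# Sketch of the grouping step described above: single LC-MS features
# (m/z, retention time) that co-elute are merged into compound-related
# feature-sets. The retention-time window and the feature list are
# invented; the real algorithm works on high-resolution mass traces.

def group_features(features, rt_window=0.05):
    """Group (mz, rt) features whose retention times fall within a window."""
    groups = []
    for mz, rt in sorted(features, key=lambda f: f[1]):
        if groups and rt - groups[-1][-1][1] <= rt_window:
            groups[-1].append((mz, rt))   # co-eluting -> same feature-set
        else:
            groups.append([(mz, rt)])     # start a new feature-set
    return groups

features = [(760.585, 12.31), (761.588, 12.31), (782.567, 12.32),  # one lipid
            (496.340, 8.10), (497.343, 8.11)]                      # another
for i, g in enumerate(group_features(features), start=1):
    print(f"feature-set {i}: {g}")
```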
