Since the original work of Dantzig and Wolfe in 1960, the idea of decomposition has persisted as an attractive approach to large-scale linear programming. However, empirical experience reported in the literature over the years has not been encouraging enough to stimulate practical application. Recent experiments indicate that much improvement is possible through advanced implementations and careful selection of computational strategies. This paper describes such an effort based on state-of-the-art, modular linear programming software (IBM's MPSX/370).
High nuclearity paramagnetic, spin-coupled transition metal clusters and grids fascinate chemists and physicists partly because of their structural beauty and the challenge of creating them, but also because of their novel physical properties. Magnetic interactions between the spin centers are a primary focus. This review examines a selection of Mn(II) polynuclear grids and clusters with nuclearities in the range Mn4 to Mn9. Theoretical treatments of the magnetic properties are discussed, along with approaches to solving the exchange problem for ‘large’ spin systems, where computational difficulties arise. A freely available software package (MAGMUN4.1) is presented as a simple means of dealing with spin-coupled clusters in general, and symmetry reduction methods are discussed briefly as a means of dealing with ‘large’ spin systems.
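For reference, theoretical treatments of such spin-coupled clusters typically start from an isotropic Heisenberg exchange Hamiltonian (this is the standard formulation, not a quote from the review):

```latex
\hat{H} = -2 \sum_{i<j} J_{ij}\, \hat{S}_i \cdot \hat{S}_j
```

Here J_ij is the exchange coupling between spin centers i and j (J > 0 ferromagnetic, J < 0 antiferromagnetic in this sign convention). The computational difficulty the review alludes to follows directly from the dimension of the spin Hamiltonian matrix, which grows as the product of (2S_i + 1) over all centers: for nine Mn(II) ions with S = 5/2, the dimension is 6^9 ≈ 10^7.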
An important part of quality assurance in any analytical laboratory is the production of comprehensive results that incorporate uncertainty estimates. Many testing laboratories face the problem that the expenditure required to evaluate small uncertainties (high precision and high accuracy) is often uneconomic. In most cases an uncertainty of high reliability has to be calculated from only a few data points (one calibration, few replications, etc.). This problem can be solved by an expert system. To achieve this, the analytical procedure has to be structured into a dialogue and divided into parts. The uncertainty has to be calculated for each part of the procedure. Combining the individual uncertainties yields the combined and expanded uncertainty. During the dialogue the system should advise the analyst on how to perform an efficient and effective calculation of uncertainty. All calculations and mathematical and statistical procedures must remain transparent, but running the system should not be too time-consuming, for economic reasons. Within the scope of the EURECA project initiated by the Eidgenössische Materialprüfungs- und Forschungsanstalt (EMPA), St. Gallen, Switzerland, expert system software is being developed in cooperation with other research institutes and manufacturers of analytical instruments. Using this software it will be possible to calculate the uncertainty for analytical procedures such as titration, atomic emission spectrometry (ICP-OES), atomic absorption spectrometry (AAS), and gas and liquid chromatography (GC, HPLC).
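The combination step described above can be sketched in a few lines (a minimal illustration with hypothetical function names and numbers; the combination follows the usual GUM root-sum-of-squares rule for independent components, with coverage factor k = 2 for the expanded uncertainty):

```python
import math

def combined_uncertainty(components):
    """Root-sum-of-squares combination of independent standard uncertainties."""
    return math.sqrt(sum(u * u for u in components))

def expanded_uncertainty(components, k=2.0):
    """Expanded uncertainty U = k * u_c (k = 2 gives roughly 95% coverage)."""
    return k * combined_uncertainty(components)

# Example: standard uncertainties from calibration, replication, and volume reading
parts = [0.03, 0.04, 0.12]
u_c = combined_uncertainty(parts)  # 0.13
U = expanded_uncertainty(parts)    # 0.26
```

Note how the largest component (here 0.12) dominates the combined value, which is why an expert system can safely steer the analyst toward evaluating only the dominant contributions carefully.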
This paper discusses how gamma irradiation plants are putting the latest advances in computer and information technology to use for better process control, cost savings, and strategic advantages.
Some irradiator operations are gaining significant benefits by integrating computer technology and robotics with real-time information processing, multi-user databases, and communication networks. The paper reports on several irradiation facilities that are making good use of client/server LANs, user-friendly graphics interfaces, supervisory control and data acquisition (SCADA) systems, distributed I/O with real-time sensor devices, trending analysis, real-time product tracking, dynamic product scheduling, and automated dosimetry reading. These plants are lowering costs through fast and reliable reconciliation of dosimetry data, easier validation to GMP requirements, optimized production flow, and faster release of sterilized products to market.
There is a trend in the manufacturing sector towards total automation using “predictive process control”. Real-time verification of process parameters “on-the-run” allows control parameters to be adjusted appropriately, before the process strays out of limits. Applying this technology to the gamma radiation process, control will be based on monitoring key parameters such as time, and making adjustments during the process to optimize quality and throughput. Dosimetry results will be used as a quality control measurement rather than as a final monitor for the release of the product. Results are correlated with the irradiation process data to quickly and confidently reconcile variations. Ultimately, a parametric process control system utilizing responsive control, feedback and verification will not only increase productivity and process efficiency, but can also result in operating within tighter dose control set points.
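The time-adjustment idea can be sketched as follows (a hypothetical illustration, not the control logic of any plant described in the paper): when the monitored dose rate drifts, for example through source decay, the cycle time is recomputed so the delivered dose stays on target rather than being discovered out of limits after the run.

```python
def adjusted_exposure_time(target_dose_kGy, measured_rate_kGy_per_h):
    """Recompute the cycle time from the currently monitored dose rate,
    so the target dose is still delivered despite rate drift."""
    if measured_rate_kGy_per_h <= 0:
        raise ValueError("dose rate must be positive")
    return target_dose_kGy / measured_rate_kGy_per_h

# Monitored rate dropped from a nominal 5.0 to 4.8 kGy/h, so the cycle is extended
t = adjusted_exposure_time(25.0, 4.8)  # about 5.21 h instead of the nominal 5.0 h
```

This is the sense in which dosimetry becomes a quality-control measurement: the adjustment happens during the process, and the dosimeter reading afterwards confirms rather than gates the release.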
Abstract An interactive program, dotplot, has been developed for browsing millions of lines of text and source code, using an approach borrowed from biology for studying homology (self-similarity) in DNA sequences. With conventional browsing tools such as a screen editor, it is difficult to identify structures that are too big to fit on the screen. In contrast, with dotplots we find that many of these structures show up as diagonals, squares, textures, and other visually recognizable features, as will be illustrated in examples selected from biology and two new application domains, text (AP news, Canadian Hansards) and source code (5ESS®).  In an attempt to isolate the mechanisms that produce these features, we have synthesized similar features in dotplots of artificial sequences. We also introduce an approximation that makes the calculation of dotplots practical for use in an interactive browser.
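The core of a dotplot can be sketched in a few lines (a naive O(n·m) version for illustration; the paper's interactive browser relies on an approximation to make this practical for millions of lines):

```python
def dotplot(tokens_a, tokens_b):
    """Boolean match matrix: cell (i, j) is True when token i of A equals
    token j of B. Self-similarity is the special case tokens_a == tokens_b,
    where repeated passages appear as off-diagonal diagonal runs."""
    return [[a == b for b in tokens_b] for a in tokens_a]

def render(matrix, on="#", off="."):
    """ASCII rendering: one row per token of A."""
    return "\n".join("".join(on if cell else off for cell in row) for row in matrix)

# Plotting a sequence against itself: the main diagonal is always filled,
# and the repeated "AB" produces short parallel off-diagonals.
m = dotplot(list("ABCABD"), list("ABCABD"))
```

For source code or prose, the tokens would be lines or words rather than characters; the diagonals, squares, and textures the paper describes are patterns in exactly this kind of matrix.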
Abstract Statistical software systems include modules for manipulating data sets, model fitting, and graphics. Because plots display data, and models are fit to data, both the model-fitting and graphics modules depend on the data. Today's statistical environments allow the analyst to choose or even build a suitable data structure for storing the data and to implement new kinds of plots. The multiplicity problem caused by many plot varieties and many data representations is avoided by constructing a plot-data interface. The interface is a convention by which plots communicate with data sets, allowing plots to be independent of the actual data representation. This article describes the components of such a plot-data interface. The same strategy may be used to deal with the dependence of model-fitting procedures on data.
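The interface idea can be sketched as follows (method and class names here are hypothetical, chosen only to illustrate the convention, not taken from the article): plots are written against a small set of query methods, and any data representation that answers those queries can back any plot.

```python
from typing import Protocol, Sequence

class PlotData(Protocol):
    """A plot-data interface: the convention through which a plot queries
    a data set without knowing its concrete storage layout."""
    def variable_names(self) -> Sequence[str]: ...
    def values(self, name: str) -> Sequence[float]: ...

class ColumnTable:
    """One possible backing representation: a dict of columns.
    A row-oriented store or a database cursor could implement the
    same two methods and work with the same plots."""
    def __init__(self, columns):
        self._columns = columns
    def variable_names(self):
        return list(self._columns)
    def values(self, name):
        return self._columns[name]

def scatter(data: PlotData, x: str, y: str):
    """A 'plot' that sees only the interface, never the storage layout."""
    return list(zip(data.values(x), data.values(y)))

pts = scatter(ColumnTable({"x": [1, 2], "y": [3, 4]}), "x", "y")  # [(1, 3), (2, 4)]
```

This is how the multiplicity problem is avoided: with P plot types and D data representations, one writes P + D pieces of code against the interface instead of P × D pairings.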