The use of simulation modeling in computational analysis of organizations is becoming a prominent approach in social science
research. However, relying on simulations to gain intuition about social phenomena has significant implications. While simulations
may give rise to interesting macro-level phenomena, and sometimes even mimic empirical data, the underlying micro- and macro-level
processes may be far from realistic. Yet such realism may be important for inferring results that are relevant to existing
theories of social systems and to policy making. Therefore, it is important to assess not only predictive capability but also
explanation accuracy of formal models in terms of the degree of realism reflected by the embedded processes. This paper presents
a process-centric perspective for the validation and verification (V&V) of agent-based computational organization models.
Following an overview of the role of V&V within the life cycle of a simulation study, emergent issues in agent-based organization
model V&V are outlined. The notion of a social contract, which facilitates capturing micro-level processes among agents, is introduced
to enable reasoning about the integrity and consistency of agent-based organization designs. Social contracts are shown to
enable modular compositional verification of interaction dynamics among peer agents. Two types of consistency are introduced:
horizontal and vertical consistency. It is argued that such local consistency analysis is necessary, but insufficient to validate
emergent macro processes within multi-agent organizations. As such, new formal validation metrics are introduced to substantiate
the operational validity of emergent macro-level behavior.
Levent Yilmaz is Assistant Professor of Computer Science and Engineering in the College of Engineering at Auburn University and co-founder
of the Auburn Modeling and Simulation Laboratory of the M&SNet. Dr. Yilmaz received his Ph.D. and M.S. degrees from Virginia
Polytechnic Institute and State University (Virginia Tech). His research interests are on advancing the theory and methodology
of simulation modeling, agent-directed simulation (to explore dynamics of socio-technical systems, organizations, and human/team
behavior), and education in simulation modeling. Dr. Yilmaz is a member of ACM, IEEE Computer Society, Society for Computer
Simulation International, and Upsilon Pi Epsilon. URL: http://www.eng.auburn.edu/~yilmaz
Fractions of Cu and Zn species in legume samples (common white bean, pea, chick pea and lentil seeds and defatted soybean flour) were analysed by on-line hyphenation of size-exclusion chromatography and inductively coupled plasma-mass spectrometry. Samples were extracted with 0.02 mol l−1 Tris–HCl buffer solution, pH 7.5. The extraction efficiency lay in the range 60–90% for Cu and 60–80% for Zn. Quantification of the elements in the individual chromatographic fractions was carried out by isotope dilution (ID) and external calibration (EC) techniques. For ID analysis the chromatographic effluent was mixed with a flow of 65Cu- and 68Zn-enriched solution and the isotope ratios 63Cu/65Cu and (64Zn+66Zn)/68Zn were measured. For the EC technique, calibration solutions of the elements were injected into the flow of mobile phase by a second injector. Prior to entering the detector, the effluent was mixed with a flow of internal standard solution (In, 50 μg l−1). Both methods had similar precision; however, the behaviour of the two studied elements was not the same. The chromatographic analysis itself was the main source of variability in the case of Cu. For Zn species analysis, the extraction process and the manipulation of the extract also played a significant role, probably because of the lower stability of the zinc chelates present. The total amount of Zn found in all chromatographic fractions represented 85–95% of the Zn in the sampled extract, whereas that of Cu approached 100%. In the case of small peaks the results of ID and EC differed, with the EC results lower than the ID results; precision accounted for a great deal of the uncertainty of these results.
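As a sketch of how the on-line ID quantification described above works, the standard isotope-dilution relation for a blend of natural-composition analyte and an enriched spike can be inverted for the analyte amount. The spike abundances, spike amount, and measured ratio below are hypothetical illustration values, not data from the study; the natural 63Cu/65Cu abundances (0.6917/0.3083) are standard reference values.

```python
def id_amount(n_spike, r_meas, a_ref, a_spk, s_ref, s_spk):
    """Moles of natural-composition analyte recovered by isotope dilution.

    n_spike      : moles of enriched spike mixed into the effluent
    r_meas       : measured blend ratio (reference isotope / spike isotope)
    a_ref, a_spk : natural abundances of the reference and spike isotopes
    s_ref, s_spk : abundances of the same isotopes in the enriched spike
    """
    # Solve R = (n_x*a_ref + n_sp*s_ref) / (n_x*a_spk + n_sp*s_spk) for n_x.
    return n_spike * (r_meas * s_spk - s_ref) / (a_ref - r_meas * a_spk)

# Natural Cu abundances (63Cu, 65Cu) and a hypothetical 65Cu-enriched spike.
a63, a65 = 0.6917, 0.3083
s63, s65 = 0.01, 0.99

n_spike = 1.0e-9   # mol of spike in the blend (illustrative)
r_blend = 0.8      # measured 63Cu/65Cu ratio in the blend (illustrative)
n_cu = id_amount(n_spike, r_blend, a63, a65, s63, s65)
```

Because the result depends only on a ratio measurement, ID quantification is largely insensitive to signal drift and partial analyte loss after spiking, which is why it pairs well with a chromatographic front end.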
Capillary electrophoresis-mass spectrometry (CE-MS) with an electrospray ionization interface was applied to the quantitative analysis of the pesticide imazamox in well water, potable water, and pond water. The detector response for imazamox was linear over the concentration range 1–50 ng/ml. The limits of quantitation and detection of the method were 200 and 20 ng/l, respectively, for imazamox in each type of water sample. The total sample preparation and CE-MS analysis time was under 2 h.
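Detection and quantitation limits like those reported above are commonly estimated from the calibration slope and baseline noise (ICH-style LOD = 3.3σ/S, LOQ = 10σ/S); the noise and slope values in this sketch are hypothetical and not taken from the study.

```python
def lod_loq(noise_sd, slope):
    """ICH-style estimates: LOD = 3.3*sigma/S, LOQ = 10*sigma/S.

    noise_sd : standard deviation of the baseline/blank signal (counts)
    slope    : calibration slope (counts per unit concentration)
    """
    return 3.3 * noise_sd / slope, 10.0 * noise_sd / slope

# Illustrative values only: noise in counts, slope in counts per (ng/l).
lod, loq = lod_loq(noise_sd=30.0, slope=5.0)
```

Other estimation routes (signal-to-noise of 3 and 10, or the standard deviation of the calibration intercept) give comparable figures; which one a method uses should be stated in its validation report.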
A simple, rapid, and sensitive high-performance liquid chromatographic method for estimation of efavirenz in human plasma
has been developed and validated. Chromatography was performed with a C18 analytical column and 50:50 acetonitrile–phosphate buffer (pH 3.5) as the mobile phase. Compounds were monitored by UV detection
at 247 nm. The retention time for efavirenz was 6.45 min and that for the internal standard, nelfinavir, was 2.042 min. Response
was linear over the concentration range 0.1–10 μg mL−1 in human plasma. The method was simple, specific, precise, and accurate, and is useful for bioequivalence and pharmacokinetic
studies of efavirenz.
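Quantification against an internal standard such as nelfinavir typically back-calculates concentration from the analyte/IS peak-area ratio via a linear calibration. The calibration points and measured ratio below are hypothetical, and the least-squares fit is a minimal sketch of the general approach, not the validated method itself.

```python
# Hypothetical calibration: concentrations (ug/mL) vs analyte/IS peak-area ratios.
conc  = [0.1, 0.5, 1.0, 2.5, 5.0, 10.0]
ratio = [0.021, 0.104, 0.209, 0.524, 1.041, 2.083]

# Ordinary least-squares fit of ratio = slope*conc + intercept.
n = len(conc)
mx, my = sum(conc) / n, sum(ratio) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(conc, ratio))
         / sum((x - mx) ** 2 for x in conc))
intercept = my - slope * mx

def quantify(area_ratio):
    """Back-calculate plasma concentration from a measured area ratio."""
    return (area_ratio - intercept) / slope

c_unknown = quantify(0.42)  # concentration for a hypothetical unknown sample
```

In practice, bioanalytical calibrations are often weighted (e.g. 1/x or 1/x²) so that the low end of the range, where the LLOQ sits, is not dominated by the high calibrators.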
A method for sensitive determination of the anti-cancer agent oxaliplatin in human plasma and human plasma ultrafiltrate (pUF) is presented. The method is based on the quantification of platinum by graphite-furnace atomic-absorption spectrometry, with Zeeman correction and an atomisation temperature of 2,700°C. Sample pretreatment involves dilution of the samples with a solution containing 0.15 mol L–1 NaCl and 0.20 mol L–1 HCl in water. Validation was performed in accordance with the most recent FDA guidelines for bioanalytical method validation. All results were within requirements. The validated ranges of quantification were 0.10–400 mol L–1 for human pUF and 0.50–400 mol L–1 for plasma. The assay is now successfully used to support pharmacokinetic studies of cancer patients treated with oxaliplatin.
This article discusses problems of validating classification models especially in datasets where sample sizes are small and the number of variables is large. It describes the use of percentage correctly classified (%CC) as an indicator for success of a classification model. For small datasets, %CC should not be used uncritically and its interpretation depends on sample size. It illustrates the use of a common classification method, discriminant partial least squares (D-PLS) on a randomly generated dataset of 200 samples and 200 variables.
An aim of the classifier is to determine whether the null hypothesis (that there is no distinction between the two classes) can be rejected. Autoprediction gives a %CC of 84.5. It is shown that, if variable selection is used, it must be performed on the training set alone to obtain a %CC close to 50% on the test set; otherwise, over-optimistic and false conclusions can be reached about the ability to classify samples into groups.
Finally, two aims of determining the quality of a model are frequently confused, namely optimisation (often used to determine the most appropriate number of components in a model) and independent validation; to overcome this, the data should be split into three groups.
Model building is often difficult when validation and optimisation have been performed on different groups of samples, especially with iterative methods in which each group is modelled using different properties, such as a different number of components or different variables.
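The leakage effect described above can be demonstrated with a small sketch. The setup mirrors the article's null-hypothesis experiment (random data, two classes), but the classifier here is a simple nearest-centroid rule rather than D-PLS, and all names and sizes are illustrative. Selecting the most discriminating variables on the full dataset before splitting inflates test-set %CC well above the 50% expected on pure noise, whereas selection on the training set alone stays near 50%.

```python
import random

random.seed(0)
N, P, K = 200, 200, 10      # samples, variables, variables retained

# Random data with no real class structure: the null hypothesis is true.
X = [[random.gauss(0.0, 1.0) for _ in range(P)] for _ in range(N)]
y = [i % 2 for i in range(N)]                    # two balanced classes
train, test = list(range(100)), list(range(100, N))

def top_k_vars(idx, k):
    """Rank variables by absolute class-mean difference over the samples in idx."""
    scores = []
    for j in range(P):
        g0 = [X[i][j] for i in idx if y[i] == 0]
        g1 = [X[i][j] for i in idx if y[i] == 1]
        scores.append((abs(sum(g0) / len(g0) - sum(g1) / len(g1)), j))
    return [j for _, j in sorted(scores, reverse=True)[:k]]

def nearest_centroid_acc(cols):
    """Fit class centroids on the training set only; score on the test set."""
    cents = {}
    for c in (0, 1):
        rows = [i for i in train if y[i] == c]
        cents[c] = [sum(X[i][j] for i in rows) / len(rows) for j in cols]
    hits = 0
    for i in test:
        d = [sum((X[i][j] - cents[c][p]) ** 2 for p, j in enumerate(cols))
             for c in (0, 1)]
        hits += (0 if d[0] <= d[1] else 1) == y[i]
    return hits / len(test)

leaky_acc = nearest_centroid_acc(top_k_vars(range(N), K))   # selection saw test labels
proper_acc = nearest_centroid_acc(top_k_vars(train, K))     # selection on training only
```

The leaky pipeline looks genuinely predictive on data that contains no signal at all, which is exactly the over-optimism the article warns against; a third, held-out group would be needed on top of this split if the number of retained variables were itself being optimised.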
The utility and validity of toxicity tests for monitoring wastewater treatment have been assessed. The acute toxicity tests evaluated were the Vibrio fischeri, Selenastrum capricornutum, and Daphnia magna tests. The validation studies indicated that these acute toxicity tests can be considered highly sensitive analytical tools, detecting common environmental pollutants at concentration levels as low as ng l−1. The toxicity tests were shown to have the discriminatory ability to distinguish between different degrees of toxicity and the toxic specificity of the compounds towards target organisms. Synergistic, additive, and antagonistic effects were evaluated, indicating the capacity of the toxicity tests to assess the combined effects of chemicals in wastewaters. The reproducibility of these tests, calculated as relative standard deviation, is acceptable, in the range 5–22.3%. The application of multivariate data analysis proved that toxicity and chemical measurements are complementary analytical tools for monitoring wastewater quality. The toxicity tests are useful analytical tools for screening ahead of chemical analysis and as an early-warning system for monitoring the treatment performance of WWTPs. The use of a single toxicity test or a battery of tests is the best approach to evaluate risk because such tests are reliable indices of the toxic impact of effluents on the aquatic environment. The toxicity tests were applied in the quality control of different European WWTPs.
Polyhalogenated aromatic hydrocarbons, such as polychlorinated dibenzo-p-dioxins, are a large and diverse group of environmental pollutants. Their tendency to accumulate in the food chain and their toxicity make monitoring necessary. The reference analysis method is laborious and very expensive; cheap and rapid bioassays have therefore been developed. The chemically activated luciferase expression (CALUX) bioassay uses a recombinant cell line that responds to dioxins and dioxin-like molecules with Ah receptor (AhR)-dependent induction of firefly luciferase in a dose-related manner. The CALUX assay was tested for use in the screening of feed. Aliquots of 20 g of enriched feed were extracted with a toluene:methanol mixture (20:4 v/v), and the extracts were defatted on 33% H2SO4 silica columns and purified on carbon columns. Only the dioxin and furan fraction was analysed; the PCB fraction was discarded. The precision of the method is acceptable and in compliance with the R.S.D. <30% suggested for cell-based bioassays in Commission Directive 2002/70/EC of July 2002. The results show good agreement between TEQ values obtained by CALUX and by GC–HRMS. The method is now in routine use for a feed-screening programme designed by the Federal Agency for the Safety of the Food Chain. Approximately 25 samples are analysed weekly, of which approximately 10% are confirmed by GC–HRMS. The false-positive ratio is 1% and no false negatives were found, making the use of the CALUX technology advantageous.
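The TEQ values compared above are computed, on the GC–HRMS side, as a toxic-equivalency-factor (TEF) weighted sum of individual congener concentrations. The sketch below shows that calculation; the congener concentrations are hypothetical, and the TEF entries are illustrative WHO-style values rather than a complete official table.

```python
# Illustrative WHO-style toxic equivalency factors (partial, for sketch only).
TEF = {
    "2,3,7,8-TCDD": 1.0,
    "1,2,3,7,8-PeCDD": 1.0,
    "2,3,7,8-TCDF": 0.1,
    "OCDD": 0.0003,
}

def teq(concentrations_pg_g):
    """TEF-weighted sum of congener concentrations (pg/g) -> pg TEQ/g."""
    return sum(TEF[cong] * conc for cong, conc in concentrations_pg_g.items())

# Hypothetical congener profile of one feed sample (pg/g).
sample = {"2,3,7,8-TCDD": 0.2, "1,2,3,7,8-PeCDD": 0.1,
          "2,3,7,8-TCDF": 1.5, "OCDD": 40.0}
sample_teq = teq(sample)
```

The CALUX bioassay, by contrast, reports a single bioanalytical TEQ directly from the luciferase response, which is why agreement between the two TEQ scales must be demonstrated empirically, as the study does.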
An analytical procedure enabling routine analysis of four environmental estrogens at concentrations below 1 ng L−1 in estuarine water samples has been developed and validated. The method includes extraction of water samples using solid-phase extraction discs and detection by gas chromatography (GC) with tandem mass spectrometry (MS–MS) in electron-impact (EI) mode. The targeted estrogens were 17α- and 17β-estradiol (αE2, βE2), estrone (E1), and 17α-ethinylestradiol (EE2), all known environmental endocrine disruptors. Method performance characteristics, for example trueness, recovery, calibration, precision, accuracy, limit of quantification (LOQ), and the stability of the compounds, are presented for each of the selected estrogens. Application of the procedure to water samples from the Scheldt estuary (Belgium – The Netherlands), a polluted estuary with reported incidences of environmental endocrine disruption, revealed that E1 was detected most frequently, at concentrations up to 7 ng L−1. αE2 was detected only once, and concentrations of βE2 and EE2 were below the LOQ. Presented at the 9th FECS Conference on Chemistry and the Environment, Bordeaux, France, 29 August–1 September 2004.