Similar Documents
20 similar documents found.
1.

The world of science has undergone a major transformation by virtue of technological innovations in computing and information processing. Sociology is one site in which this change is being played out. Our basic aim is to set out a revised image of any modern science, within which we can conceptualize and discuss the role of a newly emergent subfield we term computational sociology. Specifically, we expand the familiar two-component model of a science, featuring a theoretical and an empirical side, to include a computational component. We show how the three components interrelate in a triangular system in which empirical data analysis, theoretical explanation and computer simulation link the three components. We close our paper with a brief discussion of how one new development in computation relates to concepts of sociology, an instance of the hybrid character of computational sociology.

2.
Computer simulation is finding a role in an increasing number of scientific disciplines, concomitant with the rise in available computing power. Marshalling this power facilitates new, more effective and different research than has hitherto been possible. Realizing this inevitably requires access to computational power beyond the desktop, making use of clusters, supercomputers, data repositories, networks and distributed aggregations of these resources. The use of diverse e-infrastructure brings with it the ability to perform distributed multiscale simulations. Accessing one such resource entails a number of usability and security problems; when multiple geographically distributed resources are involved, the difficulty is compounded. In this paper we present a solution, the Application Hosting Environment (AHE) 3, which provides a Software as a Service layer on top of distributed e-infrastructure resources. We describe the performance and usability enhancements present in AHE version 3, and show how these have led to a high-performance, easy-to-use gateway for computational scientists working in diverse application domains, from computational physics and chemistry through materials science to biology and biomedicine.

3.
This article reports on "MaterialSim", a set of constructionist activities for undergraduate-level computational materials science which we have developed and tested in classrooms. We investigate: (a) the cognition of students engaging in scientific inquiry through interacting with simulations; (b) the effects of students programming simulations, as opposed to only interacting with ready-made simulations; (c) the characteristics, advantages, and trajectories of scientific content knowledge that is articulated in epistemic forms and representational infrastructures unique to computational materials science; and (d) the principles which govern the design of computational agent-based learning environments in general, and for materials science in particular. Data sources for the evaluation of these studies include classroom observations, interviews with students, videotaped sessions of model-building, questionnaires, and analysis of artifacts. Results suggest that by becoming 'model builders', students develop a deeper understanding of core concepts in materials science, and learn how to better identify unifying principles and behaviors within the content matter.

4.
Optical computing

5.
Decompositions of the plane into disjoint components separated by curves occur frequently. We describe a package of subroutines which provides facilities for defining, building, and modifying such decompositions and for efficiently solving various point and area location problems. Beyond the point that the specification of this package may be useful to others, we reach the broader conclusion that well-designed data structures and support routines allow the use of more conceptual or non-numerical portions of mathematics in the computational process, thereby extending greatly the potential scope of the use of computers in scientific problem solving. Ideas from conceptual mathematics, symbolic computation, and computer science can be utilized within the framework of scientific computing and have an important role to play in that area.
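As a hedged illustration of the kind of point-location query such a package answers, here is a minimal Python sketch using a ray-casting point-in-polygon test; the function names and the dictionary-based decomposition are illustrative inventions, not the paper's actual subroutine interface:

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: count crossings of a horizontal ray from `point`
    with the polygon's edges; an odd count means the point is inside."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the horizontal line through y?
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses that line.
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def locate(point, components):
    """Return the first component whose region contains `point`
    (hypothetical interface; a real package would use a search structure)."""
    for name, boundary in components.items():
        if point_in_polygon(point, boundary):
            return name
    return "outer face"

# A toy decomposition of the plane into two square components.
components = {
    "A": [(0, 0), (1, 0), (1, 1), (0, 1)],
    "B": [(1, 0), (2, 0), (2, 1), (1, 1)],
}
print(locate((0.5, 0.5), components))  # -> "A"
print(locate((1.5, 0.5), components))  # -> "B"
print(locate((5.0, 5.0), components))  # -> "outer face"
```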

6.
Computers with multiple processor cores using shared memory are now ubiquitous. In this paper, we present several parallel geometric algorithms that specifically target this environment, with the goal of exploiting the additional computing power. The algorithms we describe are (a) 2-/3-dimensional spatial sorting of points, as is typically used for preprocessing before using incremental algorithms, (b) d-dimensional axis-aligned box intersection computation, and finally (c) 3D bulk insertion of points into Delaunay triangulations, which can be used for mesh generation algorithms, or simply for constructing 3D Delaunay triangulations. For the latter, we introduce as a foundational element the design of a container data structure that both provides concurrent addition and removal operations and is compact in memory. This makes it especially well-suited for storing large dynamic graphs such as Delaunay triangulations. We show experimental results for these algorithms, using our implementations based on the Computational Geometry Algorithms Library (CGAL). This work is a step towards what we hope will become a parallel mode for CGAL, where algorithms automatically use the available parallel resources without requiring significant user intervention.
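As a hedged sketch of the box-intersection primitive in (b): two axis-aligned boxes intersect iff their intervals overlap in every dimension. CGAL's actual implementation is a streamed segment-tree algorithm in C++; the brute-force Python sketch below only shows the overlap predicate and a trivially parallel split of the candidate pairs across processes:

```python
from itertools import combinations
from multiprocessing import Pool

def boxes_overlap(a, b):
    """Each box is (lows, highs), e.g. ((0, 0), (2, 2)) in 2D.
    Overlap requires interval overlap in every dimension."""
    return all(lo_a <= hi_b and lo_b <= hi_a
               for lo_a, hi_a, lo_b, hi_b
               in zip(a[0], a[1], b[0], b[1]))

def check_chunk(pairs):
    """Worker: report the indices of intersecting pairs in one chunk."""
    return [(i, j) for (i, a), (j, b) in pairs if boxes_overlap(a, b)]

if __name__ == "__main__":
    boxes = [((0, 0), (2, 2)), ((1, 1), (3, 3)), ((5, 5), (6, 6))]
    pairs = list(combinations(enumerate(boxes), 2))
    # Split candidate pairs across four worker processes.
    chunks = [pairs[k::4] for k in range(4)]
    with Pool(4) as pool:
        results = pool.map(check_chunk, chunks)
    print(sorted(p for chunk in results for p in chunk))  # -> [(0, 1)]
```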

7.
It is now standard practice in computational science for large-scale simulations to be implemented and investigated in a problem solving environment (PSE) such as MATLAB or MAPLE. In such an environment, a scientist or engineer will formulate a mathematical model, approximate its solution using an appropriate numerical method, visualize the approximate solution and verify (or validate) the quality of the approximate solution. Traditionally, we have been most concerned with the development of effective numerical software for generating the approximate solution, and several efficient and reliable numerical libraries are now available for use within the most widely used PSEs. On the other hand, the visualization and verification tasks have received little attention, even though each often requires as much computational effort as is involved in generating the approximate solution. In this paper, we will investigate the effectiveness of a suite of tools that we have recently introduced in the MATLAB PSE to verify approximate solutions of ordinary differential equations. We will use the notion of 'effectivity index', widely used by researchers in the adaptive mesh PDE community, to quantify the credibility of our verification tools. Numerical examples will be presented to illustrate the effectiveness of these tools when applied to a standard numerical method on two model test problems.
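The effectivity index is the ratio of the estimated error to the true error; an index close to 1 means the error estimate is credible. A minimal sketch of the idea (not the authors' MATLAB tools), using forward Euler on y' = -y with the error estimated by step doubling:

```python
import math

def euler(f, y0, t_end, n):
    """Forward Euler with n steps on [0, t_end]."""
    h, y = t_end / n, y0
    for k in range(n):
        y += h * f(k * h, y)
    return y

f = lambda t, y: -y          # y' = -y, exact solution y(t) = exp(-t)
exact = math.exp(-1.0)

y_h = euler(f, 1.0, 1.0, 100)    # step size h
y_h2 = euler(f, 1.0, 1.0, 200)   # step size h/2
# For a first-order method, Richardson extrapolation estimates the
# error of the fine solution as the difference of the two solutions.
est_error = abs(y_h2 - y_h)
true_error = abs(y_h2 - exact)
print("effectivity index:", est_error / true_error)  # ~= 1 if credible
```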

8.
This article argues that the agent‐based computational model permits a distinctive approach to social science for which the term "generative" is suitable. In defending this terminology, features distinguishing the approach from both "inductive" and "deductive" science are given. Then, the following specific contributions to social science are discussed: The agent‐based computational model is a new tool for empirical research. It offers a natural environment for the study of connectionist phenomena in social science. Agent‐based modeling provides a powerful way to address certain enduring—and especially interdisciplinary—questions. It allows one to subject certain core theories—such as neoclassical microeconomics—to important types of stress (e.g., the effect of evolving preferences). It permits one to study how rules of individual behavior give rise—or "map up"—to macroscopic regularities and organizations. In turn, one can employ laboratory behavioral research findings to select among competing agent‐based ("bottom up") models. The agent‐based approach may well have the important effect of decoupling individual rationality from macroscopic equilibrium and of separating decision science from social science more generally. Agent‐based modeling offers powerful new forms of hybrid theoretical‐computational work; these are particularly relevant to the study of non‐equilibrium systems. The agent‐based approach invites the interpretation of society as a distributed computational device, and in turn the interpretation of social dynamics as a type of computation. This interpretation raises important foundational issues in social science—some related to intractability, and some to undecidability proper. Finally, since "emergence" figures prominently in this literature, I take up the connection between agent‐based modeling and classical emergentism, criticizing the latter and arguing that the two are incompatible.
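The "map up" idea can be conveyed by a generic sketch (this is not a model from the article): in a one-dimensional Schelling-style model, a mild individual preference rule generates a macroscopic regularity, segregation.

```python
import random

random.seed(0)
# A ring of agents of two types; 0 marks a vacant site.
N = 100
agents = [random.choice([1, 2, 0]) for _ in range(N)]

def unhappy(i):
    """Individual rule: unhappy if fewer than half of the occupied
    neighboring sites (within distance 2) share the agent's type."""
    if agents[i] == 0:
        return False
    neigh = [agents[(i + d) % N] for d in (-2, -1, 1, 2)
             if agents[(i + d) % N] != 0]
    if not neigh:
        return False
    return sum(1 for a in neigh if a == agents[i]) / len(neigh) < 0.5

def segregation():
    """Macroscopic regularity: fraction of occupied adjacent pairs of equal type."""
    pairs = [(agents[i], agents[(i + 1) % N]) for i in range(N)]
    occ = [(a, b) for a, b in pairs if a and b]
    return sum(1 for a, b in occ if a == b) / len(occ)

print("before:", round(segregation(), 2))
for _ in range(10_000):          # unhappy agents move to random vacant sites
    i = random.randrange(N)
    if unhappy(i):
        vacants = [j for j in range(N) if agents[j] == 0]
        if vacants:
            j = random.choice(vacants)
            agents[j], agents[i] = agents[i], 0
print("after: ", round(segregation(), 2))  # typically noticeably higher
```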

9.
This article describes advances in statistical computation for large-scale data analysis in structured Bayesian mixture models via graphics processing unit (GPU) programming. The developments are partly motivated by computational challenges arising in fitting models of increasing heterogeneity to increasingly large datasets. An example context concerns common biological studies using high-throughput technologies generating many, very large datasets and requiring increasingly high-dimensional mixture models with large numbers of mixture components. We outline important strategies and processes for GPU computation in Bayesian simulation and optimization approaches, give examples of the benefits of GPU implementations in terms of processing speed and scale-up in ability to analyze large datasets, and provide a detailed, tutorial-style exposition that will benefit readers interested in developing GPU-based approaches in other statistical models. Novel, GPU-oriented approaches to modifying existing algorithms and software design can lead to vast speed-ups and, critically, enable statistical analyses that are presently not performed due to compute-time limitations in traditional computational environments. Supplemental materials are provided with all source code, example data, and details that will enable readers to implement and explore the GPU approach in this mixture modeling context.
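The computational pattern that benefits from GPUs here is the per-observation, per-component likelihood evaluation, which is embarrassingly parallel. A hedged sketch of that kernel as a vectorized E-step for a univariate Gaussian mixture (NumPy on the CPU stands in for the GPU; the paper's actual CUDA implementation and models are far richer):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic data from two components.
x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 1, 500)])

# Current mixture parameters: weights, means, standard deviations.
w = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
sd = np.array([1.0, 1.0])

def responsibilities(x, w, mu, sd):
    """E-step: posterior probability of each component for each point.
    The (n_points x n_components) grid of density evaluations is the
    embarrassingly parallel workload that maps well onto GPU threads."""
    z = (x[:, None] - mu[None, :]) / sd[None, :]
    log_dens = -0.5 * z**2 - np.log(sd)[None, :] + np.log(w)[None, :]
    log_dens -= log_dens.max(axis=1, keepdims=True)      # stabilize
    dens = np.exp(log_dens)
    return dens / dens.sum(axis=1, keepdims=True)

r = responsibilities(x, w, mu, sd)
# M-step updates, also expressible as parallel reductions.
w = r.mean(axis=0)
mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
print(np.round(mu, 2))  # means move toward the true values -2 and 3
```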

10.
DNA computing is a novel method for solving a class of intractable computational problems, one in which the available computing power can grow exponentially with problem size. Up to now, many accomplishments have been achieved to improve its performance and increase its reliability. The Hamiltonian path problem has been solved by means of molecular biology techniques. A small graph was encoded in molecules of DNA, and the "operations" of the computation were performed with standard protocols and enzymes. This work represents further evidence for the ability of DNA computing to solve NP-complete search problems.
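For scale, the problem being encoded can be stated classically in a few lines. A brute-force sketch of the Hamiltonian path search on a toy directed graph (the DNA computation explores candidate paths in parallel as molecules, whereas this enumeration takes exponential time):

```python
from itertools import permutations

def hamiltonian_path(vertices, edges):
    """Return a directed path visiting every vertex exactly once, or None."""
    for order in permutations(vertices):
        if all((a, b) in edges for a, b in zip(order, order[1:])):
            return order
    return None

# Adleman's original experiment used a 7-vertex directed graph;
# this smaller graph is illustrative only.
vertices = [0, 1, 2, 3]
edges = {(0, 1), (1, 2), (2, 3), (0, 2)}
print(hamiltonian_path(vertices, edges))  # -> (0, 1, 2, 3)
```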

11.
Workflows support the automation of scientific processes, providing mechanisms that underpin modern computational science. They facilitate access to remote instruments, databases and parallel and distributed computers. Importantly, they allow software pipelines that perform multiple complex simulations (leveraging distributed platforms), with one simulation driving another. Such an environment is ideal for computational science experiments that require the evaluation of a range of different scenarios "in silico" in an attempt to find ones that optimize a particular outcome. However, in general, existing workflow tools do not incorporate optimization algorithms, and thus whilst users can specify simulation pipelines, they need to invoke the workflow as a stand-alone computation within an external optimization tool. Moreover, many existing workflow engines do not leverage parallel and distributed computers, making them unsuitable for executing computational science simulations. To solve this problem, we have developed a methodology for integrating optimization algorithms directly into workflows. We implement a range of generic actors for an existing workflow system called Kepler, and discuss how they can be combined in flexible ways to support various different design strategies. We illustrate the system by applying it to an existing bio-engineering design problem running on a Grid of distributed clusters.
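The integration pattern described here, with the optimizer living inside the workflow rather than wrapping it, can be sketched generically. The function names below are hypothetical (Kepler actors are Java components; this Python loop only illustrates the control flow of an "optimizer actor" driving a simulation stage):

```python
import random

def run_simulation(params):
    """Stand-in for a workflow stage that launches a remote simulation;
    here, a toy objective with optimum near x = 3, y = -1."""
    x, y = params
    return (x - 3) ** 2 + (y + 1) ** 2

def random_search(objective, bounds, n_iter=200, seed=0):
    """A generic optimizer actor: proposes points, consumes results."""
    rng = random.Random(seed)
    best_p, best_v = None, float("inf")
    for _ in range(n_iter):
        p = [rng.uniform(lo, hi) for lo, hi in bounds]
        v = objective(p)   # in Kepler, this edge would be the simulation pipeline
        if v < best_v:
            best_p, best_v = p, v
    return best_p, best_v

best_p, best_v = random_search(run_simulation, [(-10, 10), (-10, 10)])
print([round(c, 1) for c in best_p], round(best_v, 3))
```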

12.
Quantum Bayesian computation is an emerging field that leverages the computational gains available from quantum computers, which promise an exponential speed-up in Bayesian computation. Our article adds to the literature in three ways. First, we describe how quantum von Neumann measurement provides quantum versions of popular machine learning algorithms such as Markov chain Monte Carlo and deep learning that are fundamental to Bayesian learning. Second, we describe quantum data encoding methods needed to implement quantum machine learning, including the counterparts to traditional feature extraction and kernel embedding methods. Third, we show how quantum algorithms naturally calculate Bayesian quantities of interest such as posterior distributions and marginal likelihoods. Our goal, then, is to show how quantum algorithms solve statistical machine learning problems. On the theoretical side, we provide quantum versions of high-dimensional regression, Gaussian processes and stochastic gradient descent. On the empirical side, we apply a quantum FFT algorithm to Chicago house price data. Finally, we conclude with directions for future research.

13.
The goal of this paper is to promote computational thinking among mathematics, engineering, science and technology students through hands-on computer experiments. These activities have the potential to empower students to learn, create and invent with technology. We present nine computer experiments, and suggest a few more, with applications to calculus, probability and data analysis, all of which engage computational thinking through simulations, visualizations and data analysis. We use the free (open-source) statistical programming language R. Our goal is to give a taste of what R offers rather than to present a comprehensive tutorial on the R language. In our experience, these kinds of interactive computer activities can be easily integrated into a smart classroom. Furthermore, these activities tend to keep students motivated and actively engaged in the process of learning, problem solving and developing a better intuition for understanding complex mathematical concepts.
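A classic experiment of this kind is a Monte Carlo estimate of pi. The paper's activities use R; the hedged sketch below is a Python analogue of such an activity, not one of the paper's nine experiments:

```python
import random

def estimate_pi(n, seed=42):
    """Monte Carlo: the fraction of random points in the unit square that
    fall inside the quarter circle approximates pi / 4."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4 * hits / n

for n in (100, 10_000, 1_000_000):
    print(n, estimate_pi(n))   # estimates tighten around 3.14159...
```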

14.
Multiblock component methods are applied to data sets for which several blocks of variables are measured on the same set of observations, with the goal of analyzing the relationships between these blocks of variables. In this article, we focus on multiblock component methods that integrate the information found in several blocks of explanatory variables in order to describe and explain one set of dependent variables. In the following, multiblock PLS and multiblock redundancy analysis are chosen, as particular cases of multiblock component methods in which one set of variables is explained by a set of predictor variables organized into blocks. Because these multiblock techniques assume that the observations come from a homogeneous population, they will provide suboptimal results when the observations actually come from different populations. A strategy to palliate this problem, presented in this article, is to use a technique such as clusterwise regression to identify homogeneous clusters of observations. This approach creates two new methods that provide clusters, each with its own set of regression coefficients. This combination of clustering and regression improves the overall quality of the prediction and facilitates the interpretation. In addition, the minimization of a well-defined criterion, by means of a sequential algorithm, ensures that the algorithm converges monotonically. Finally, the proposed method is distribution-free and can be used when the explanatory variables outnumber the observations within clusters. The proposed clusterwise multiblock methods are illustrated with a simulation study and a (simulated) example from marketing.
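The clusterwise alternation behind such methods assigns each observation to the cluster whose regression fits it best, then refits each cluster's coefficients, so the total residual sum of squares decreases monotonically. A minimal single-block sketch of that alternation (the article's methods operate on multiple blocks of predictors; this only conveys the clusterwise loop):

```python
import numpy as np

rng = np.random.default_rng(0)
# Two latent populations with different regression coefficients.
X = rng.normal(size=(200, 2))
beta_true = [np.array([3.0, -1.0]), np.array([-2.0, 4.0])]
membership = rng.integers(0, 2, size=200)
y = np.array([X[i] @ beta_true[membership[i]] for i in range(200)])
y += 0.1 * rng.normal(size=200)

K = 2
labels = rng.integers(0, K, size=200)          # random initial clusters
for _ in range(20):
    # Fit step: least-squares regression within each cluster.
    betas = [np.linalg.lstsq(X[labels == k], y[labels == k], rcond=None)[0]
             for k in range(K)]
    # Assignment step: each point joins the cluster that predicts it best.
    resid = np.stack([(y - X @ b) ** 2 for b in betas], axis=1)
    new_labels = resid.argmin(axis=1)
    if np.array_equal(new_labels, labels):
        break
    labels = new_labels

# Recovers the two coefficient sets (possibly in swapped order).
print(np.round(betas[0], 1), np.round(betas[1], 1))
```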

15.
Mao Wenji, Journal of Systems Science and Mathematical Sciences (《系统科学与数学》), 2008, 28(11): 1432-1440
With the development of computer and information technology, research in information science and technology pays increasing attention to its intersection with the social sciences, and social computing has become one of the latest research focuses in computer science and related fields both in China and abroad. One of the core problems in research on social computing and social intelligence is reasoning about social causality and evaluating behavior. Based on theories from cognitive science and psychology, this paper introduces a computational model of social reasoning, MASIM, together with an example of a social computing system built on MASIM, and uses them to discuss several technical aspects of establishing social reasoning mechanisms and social computing systems.

16.
For large-scale simulations in structural mechanics, iterative solution methods are mandatory. The efficiency of such methods can depend crucially on several factors: the choice of material parameters, the quality of the underlying computational mesh and the number of processors in a parallel computing system. We distinguish between three aspects of 'efficiency': processor efficiency (the degree to which the solving algorithm is able to exploit the processor's computational power), parallel efficiency (the ratio between computation and communication times) and numerical efficiency (convergence behaviour). With the new FEM software package Feast we pursue the aim of developing a solver mechanism that achieves high efficiency in all three aspects at the same time, while trying to minimise the dependencies mentioned above.
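Of the three, parallel efficiency has the simplest common quantification: speedup divided by processor count. A hedged sketch of that standard metric (the abstract does not commit to this exact formula):

```python
def parallel_efficiency(t_serial, t_parallel, n_procs):
    """Standard definition: speedup divided by processor count.
    1.0 means perfect scaling; communication overhead pushes it lower."""
    speedup = t_serial / t_parallel
    return speedup / n_procs

# Example: a solve taking 120 s serially and 9 s on 16 processors.
print(parallel_efficiency(120.0, 9.0, 16))  # -> 0.833...
```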

17.
For material modeling of microstructured media, an accurate characterization of the underlying microstructure is indispensable. Mathematically speaking, the overall goal of microstructure characterization is to find simple functionals which describe the geometric shape as well as the composition of the microstructures under consideration and enable distinguishing microstructures with distinct effective material behavior. For this purpose, we propose using Minkowski tensors, in general, and the quadratic normal tensor, in particular, and introduce a computational algorithm applicable to voxel-based microstructure representations. Rooted in the mathematical field of integral geometry, Minkowski tensors associate a tensor to rather general geometric shapes, which makes them suitable for a wide range of microstructured material classes. Furthermore, they satisfy additivity and continuity properties, which makes them suitable and robust for large-scale applications. We present a modular algorithm for computing the quadratic normal tensor of digital microstructures. We demonstrate multigrid convergence for selected numerical examples and apply our approach to a variety of microstructures. Strikingly, the presented algorithm remains unaffected by inaccurate computation of the interface area. The quadratic normal tensor may be used for engineering purposes, such as mean field homogenization or as a target value for generating synthetic microstructures.
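The quadratic normal tensor is, roughly, the surface integral of the outer product of the unit normal with itself, W = ∫ n⊗n dA. A hedged sketch on a voxelized ball, approximating surface normals by the finite-difference gradient of the indicator function (the paper's algorithm is more careful about interface extraction; this only conveys the quantity being computed):

```python
import numpy as np

# Voxelize a ball of radius 12 in a 64^3 grid (indicator: 1 inside).
n = 64
c = (n - 1) / 2
zz, yy, xx = np.meshgrid(*[np.arange(n)] * 3, indexing="ij")
chi = ((xx - c) ** 2 + (yy - c) ** 2 + (zz - c) ** 2 <= 12 ** 2).astype(float)

# Surface normals ~ -grad(chi); the gradient magnitude concentrates the
# surface measure dA on the interface voxels (sign drops out of n outer n).
gz, gy, gx = np.gradient(chi)
g = np.stack([gx, gy, gz], axis=-1)
mag = np.linalg.norm(g, axis=-1)
mask = mag > 1e-8

# Quadratic normal tensor: sum over the interface of (n outer n) * dA.
nrm = g[mask] / mag[mask, None]
W = (nrm[:, :, None] * nrm[:, None, :] * mag[mask, None, None]).sum(axis=0)

# For a sphere the normals are isotropic: W should be ~(A/3) * identity.
print(np.round(W / W.trace(), 2))   # close to diag(1/3, 1/3, 1/3)
```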

18.
We generalize the concept of a dimension tree, and the related results for monomial algebras, to the more general case of algebras Λ defined by relations, by bringing Gröbner bases into play. More precisely, we describe the minimal projective resolution of a left Λ-module M as a rooted 'weighted' digraph, called the minimal resolution graph of M. Algorithms for computing such digraphs, as well as applications, are presented.

19.
In silico research in medicine is thought to reduce the need for expensive clinical trials, under the condition of reliable mathematical models and accurate and efficient numerical methods. In the present work, we tackle the numerical simulation of reaction–diffusion equations modeling human ischemic stroke. This problem induces peculiar difficulties, such as the potentially large stiffness stemming from the broad spectrum of temporal scales in the nonlinear chemical source term, as well as the presence of steep, spatially very localized gradients in the reaction fronts. Furthermore, simulations on realistic 3D geometries are mandatory in order to describe this type of phenomenon correctly. The main goal of this article is to obtain, for the first time, 3D simulations on realistic geometries and to show that the simulation results are consistent with those obtained in experimental studies or observed on MRI images of stroke patients. For this purpose, we introduce a new resolution strategy based mainly on time operator splitting that takes into account complex geometry, coupled with a well-conceived parallelization strategy for shared memory architectures. We consider a high-order implicit time integration for the reaction and an explicit one for the diffusion term, in order to build a time operator splitting scheme that efficiently exploits the special features of each problem. Thus, we aim at solving complete and realistic models, including all time and space scales, with conventional computing resources, that is, on a reasonably powerful workstation. Consequently and as expected, 2D and also fully 3D numerical simulations of ischemic strokes on a realistic brain geometry are conducted for the first time and shown to reproduce the dynamics observed on MRI images of stroke patients. Beyond this major step, in order to improve the accuracy and computational efficiency of the simulations, we indicate how the present numerical strategy can be coupled with spatially adaptive multiresolution schemes. Preliminary results in the framework of simple geometries allow us to assess the proposed strategy for further developments.
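The time operator splitting used here can be illustrated in one dimension with Strang splitting: a half step of reaction, a full step of diffusion, another half step of reaction. A minimal sketch under stated assumptions: a scalar Fisher–KPP logistic source stands in for the stiff ischemia chemistry, and the diffusion step is explicit for brevity, whereas the paper integrates the reaction implicitly with a high-order method:

```python
import numpy as np

nx, L, D, dt, steps = 200, 100.0, 1.0, 0.05, 600
dx = L / nx
u = np.zeros(nx)
u[:10] = 1.0                      # ignited region on the left

def react(u, dt):
    """Reaction substep: logistic source u' = u(1 - u), solved exactly."""
    e = np.exp(dt)
    return u * e / (1.0 - u + u * e)

def diffuse(u, dt):
    """Explicit second-order diffusion step with no-flux boundaries
    (stable here since D * dt / dx**2 = 0.2 <= 0.5)."""
    up = np.pad(u, 1, mode="edge")
    return u + dt * D * (up[2:] - 2 * u + up[:-2]) / dx**2

for _ in range(steps):            # Strang splitting: R(dt/2) D(dt) R(dt/2)
    u = react(u, dt / 2)
    u = diffuse(u, dt)
    u = react(u, dt / 2)

# The reaction front has propagated rightward at a finite speed (~2 for
# Fisher-KPP with D = 1), without the front smearing out.
print("front position:", np.argmax(u < 0.5) * dx)
```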

20.
Computable modeling (可计算建模) refers to building or simplifying a model by drawing together knowledge from the relevant fields, according to the accuracy the problem under study requires of the computation, so as to reduce the amount of computation and improve computational efficiency, making the model computable on currently available computers. Computable modeling is an important aspect of research in scientific and engineering computing. This paper introduces the substance of research on computable modeling mainly through a number of examples.
