Similar Articles
1.
Powder-snow avalanches are violent natural disasters that represent a major risk for infrastructure and populations in mountain regions. In this study we present a novel model for the simulation of avalanches in the aerosol regime. The second aim of this study is to gain more insight into the interaction process between an avalanche and a rigid obstacle. An incompressible model of two miscible fluids can be successfully employed in this type of problem. We allow for mass diffusion between the two phases according to Fick's law. The governing equations are discretized with a contemporary fully implicit finite volume scheme. The solver is able to deal with arbitrary density ratios. Several numerical results are presented. Volume fraction, velocity, and pressure fields are presented and discussed. Finally, we point out how this methodology can be used for practical problems.
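The miscible two-fluid idea can be illustrated in one dimension: the snow volume fraction phi is advected with the flow and diffuses between the phases according to Fick's law, phi_t + (u*phi)_x = (D*phi_x)_x. The sketch below uses explicit upwind finite volumes on a periodic grid; the velocity, diffusivity, grid, and time-step choice are illustrative assumptions, not the article's fully implicit solver.

```python
import numpy as np

n, L = 200, 1.0
dx = L / n
u, D = 1.0, 1e-3                               # illustrative mixture velocity and Fickian diffusivity
dt = 0.4 * min(dx / u, dx**2 / (2 * D))        # respect both advective and diffusive stability limits
x = (np.arange(n) + 0.5) * dx
phi = np.where(np.abs(x - 0.3) < 0.1, 1.0, 0.0)  # initial dense snow cloud
mass0 = phi.sum()                              # total snow "mass" (conserved on a periodic grid)

for _ in range(300):
    flux = u * phi                             # first-order upwind face flux (u > 0: take the left cell)
    adv = (flux - np.roll(flux, 1)) / dx
    diff = D * (np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) / dx**2
    phi = phi - dt * adv + dt * diff
```

With a monotone scheme under these stability limits the volume fraction stays in [0, 1] and the total mass is conserved, mirroring two basic sanity checks for the two-fluid model.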

2.
We study the Abelian sandpile model on graphs. We first give the maximal avalanche sequence for recurrent configurations of the sandpile model, and then characterize several avalanche properties. Based on these properties, we determine the number of topplings of each vertex in the principal avalanche of a unicyclic graph as well as its avalanche polynomial, generalizing a result of R. Cori.
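The toppling dynamics behind these avalanches can be sketched as follows: a vertex holding at least as many grains as its degree topples, sending one grain to each neighbor, until the configuration is stable. The graph (a 4-cycle with a designated sink), the configuration, and all helper names are illustrative, not the article's unicyclic construction.

```python
from collections import deque

def avalanche(graph, config, start, sink):
    """Drop one grain at `start`, topple until stable, and return the final
    configuration plus per-vertex toppling counts. `sink` absorbs grains."""
    config = dict(config)
    config[start] += 1
    topples = {v: 0 for v in graph}
    unstable = deque(v for v in graph if v != sink and config[v] >= len(graph[v]))
    while unstable:
        v = unstable.popleft()
        deg = len(graph[v])
        if v == sink or config[v] < deg:
            continue                      # may have been stabilized already
        config[v] -= deg                  # topple: one grain to each neighbor
        topples[v] += 1
        for w in graph[v]:
            config[w] += 1
            if w != sink and config[w] >= len(graph[w]):
                unstable.append(w)
        if config[v] >= deg:
            unstable.append(v)
    return config, topples

# cycle C4 with vertex 0 acting as the sink; every vertex has degree 2
cycle = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
stable = {0: 0, 1: 1, 2: 1, 3: 1}         # maximal stable configuration off the sink
final, topples = avalanche(cycle, stable, 2, 0)
```

By the Abelian property, the final configuration and toppling counts do not depend on the order in which unstable vertices are processed.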

3.
Probabilistic cellular automata form a very large and general class of stochastic processes. These automata exhibit a wide range of complex behavior and are of interest in a number of fields of study, including mathematical physics, percolation theory, computer science, and neurobiology. Very little has been proved about these models, even in simple cases, so it is common to compare the models to mean field models. It is normally assumed that mean field models are essentially trivial. However, we show here that even the mean field models can exhibit surprising behavior. We prove some rigorous results on mean field models, including the existence of a surrogate for the “energy” in certain non‐reversible models. We also briefly discuss some differences that occur between the mean field and lattice models. © 2006 Wiley Periodicals, Inc. Random Struct. Alg., 2006

4.
Burn-in is a widely used method to improve the quality of products or systems after they have been produced. In this paper, we consider the problem of determining the optimal burn-in time and optimal work size maximizing the long-run average amount of work saved per time unit in computer applications. Assuming that the underlying lifetime distribution of the computer has an initially decreasing and/or eventually increasing failure rate function, an upper bound for the optimal burn-in time is derived for each fixed work size, and a uniform (with respect to the burn-in time) upper bound for the optimal work size is also obtained. Furthermore, it is shown that a non-trivial lower bound for the optimal burn-in time can be derived if the underlying lifetime distribution has a large initial failure rate. Copyright © 2005 John Wiley & Sons, Ltd.

5.
6.
This article introduces a suite of approaches and measures to study the impact of co-authorship teams based on the number of publications and their citations on a local and global scale. In particular, we present a novel weighted graph representation that encodes coupled author-paper networks as a weighted co-authorship graph. This weighted graph representation is applied to a dataset that captures the emergence of a new field of science and comprises 614 articles published by 1036 unique authors between 1974 and 2004. To characterize the properties and evolution of this field, we first use four different measures of centrality to identify the impact of authors. A global statistical analysis is performed to characterize the distribution of paper production and of paper citations, and their correlation with co-authorship team size. The size of co-authorship clusters over time is examined. Finally, a novel local, author-centered measure based on entropy is applied to determine the global evolution of the field and to identify the contribution of a single author across all of his or her co-authorship relations. The visualization of the growth of the weighted co-author network and the results obtained from the statistical analysis indicate a drift toward a more cooperative, global collaboration process as the main driver in the production of scientific knowledge. © 2005 Wiley Periodicals, Inc. Complexity 10: 57–67, 2005
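The step of collapsing a coupled author-paper network into a weighted co-authorship graph can be sketched as follows. The 1/(k-1) pair weighting, the toy paper list, and the node-strength measure are illustrative assumptions for the sketch, not necessarily the article's exact scheme.

```python
from itertools import combinations
from collections import defaultdict

def coauthor_graph(papers):
    """Collapse a list of papers (each a list of author names) into a weighted
    co-author graph. Each k-author paper contributes 1/(k-1) to every author
    pair, so each paper distributes one unit of credit per author."""
    weight = defaultdict(float)
    for authors in papers:
        k = len(authors)
        if k < 2:
            continue                      # single-author papers add no edges
        for a, b in combinations(sorted(authors), 2):
            weight[(a, b)] += 1.0 / (k - 1)
    return dict(weight)

def strength(graph):
    """Node strength (weighted degree): a simple local impact measure."""
    s = defaultdict(float)
    for (a, b), w in graph.items():
        s[a] += w
        s[b] += w
    return dict(s)

papers = [["ana", "bo"], ["ana", "bo", "cy"], ["cy"]]
g = coauthor_graph(papers)
s = strength(g)
```

Here the joint paper gives the pair ("ana", "bo") weight 1 + 1/2 = 1.5, while each pair involving "cy" from the three-author paper gets 1/2.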

7.
In this paper we study the dynamics of fermionic mixed states in the mean‐field regime. We consider initial states that are close to quasi‐free states and prove that, under suitable assumptions on the initial data and on the many‐body interaction, the quantum evolution of such initial data is well approximated by a suitable quasi‐free state. In particular, we prove that the evolution of the reduced one‐particle density matrix converges, as the number of particles goes to infinity, to the solution of the time‐dependent Hartree‐Fock equation. Our result holds for all times and gives effective estimates on the rate of convergence of the many‐body dynamics towards the Hartree‐Fock evolution. © 2015 Wiley Periodicals, Inc.

8.
It is shown that certain general classes of constrained binary optimization tasks can be solved with increasing accuracy by a first order mean field approximation of the Boltzmann distribution of the associated Lagrangian as the instance size grows. The formalism is thoroughly analyzed for the quadratic and multidimensional knapsack models. In these cases analytical expressions for the convergence of the optimality gaps are given, which are experimentally verified.
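A first-order (naive) mean-field approximation of this kind replaces the binary variables with soft assignments m_i = <x_i> and iterates a self-consistency equation on the Lagrangian. The sketch below does this for a quadratic-knapsack-style objective; the functional form, the fixed multiplier lam, and parameters beta and damping are illustrative assumptions, not the article's analyzed formalism.

```python
import numpy as np

def mean_field_qkp(P, w, lam, beta=2.0, iters=200, damping=0.5):
    """Naive mean-field fixed point for the Lagrangian
    L(x) = -(sum_i P_ii x_i + sum_{i != j} P_ij x_i x_j) + lam * w'x,
    x in {0,1}^n. Returns soft assignments m_i = <x_i> under the
    factorized Boltzmann approximation at inverse temperature beta."""
    n = len(w)
    m = np.full(n, 0.5)
    off = P - np.diag(np.diag(P))
    for _ in range(iters):
        # local field: expected gain from setting x_i = 1 given mean neighbors
        h = np.diag(P) + 2 * off @ m - lam * w
        m_new = 1.0 / (1.0 + np.exp(-beta * h))
        m = damping * m + (1 - damping) * m_new   # damping stabilizes the iteration
    return m

rng = np.random.default_rng(0)
n = 12
P = np.abs(rng.normal(size=(n, n)))
P = (P + P.T) / 2                                  # symmetric profit matrix
w = rng.uniform(1, 3, n)                           # item weights
m = mean_field_qkp(P, w, lam=1.5)
```

In practice the capacity constraint is enforced by tuning the multiplier lam until the rounded solution is feasible; the article's point is that the quality of this approximation improves with instance size.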

9.
Different methods are used to determine the scaling exponents associated with a time series describing a complex dynamical process, such as those observed in geophysical systems. Many of these methods are based on the numerical evaluation of the variance of a diffusion process whose step increments are generated by the data. An alternative method focuses on the direct evaluation of the scaling coefficient of the Shannon entropy of the same diffusion distribution. The combined use of these methods can efficiently distinguish between fractal Gaussian and Lévy-walk time series and help to discern between alternative underlying complex dynamics. © 2005 Wiley Periodicals, Inc. Complexity 10: 51–56, 2005
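The entropy-based method can be sketched as follows: build diffusion trajectories by summing increments over windows of length t, estimate the Shannon entropy S(t) of the resulting displacement distribution, and read the scaling exponent delta from S(t) ~ A + delta*ln(t). The window lengths, bin count, and histogram-based entropy estimate are illustrative choices for the sketch.

```python
import numpy as np

def diffusion_entropy(xi, windows, bins=40):
    """Estimate the entropy scaling exponent delta of the diffusion process
    generated by the increments xi, via S(t) ~ A + delta * ln(t)."""
    xi = np.asarray(xi, float)
    S = []
    for t in windows:
        # overlapping window sums form the diffusion ensemble at "time" t
        x = np.convolve(xi, np.ones(t), mode="valid")
        p, edges = np.histogram(x, bins=bins, density=True)
        dx = edges[1] - edges[0]
        p = p[p > 0]
        S.append(-np.sum(p * np.log(p)) * dx)     # discrete differential entropy
    delta = np.polyfit(np.log(windows), S, 1)[0]  # slope of S vs ln(t)
    return delta

rng = np.random.default_rng(1)
# Gaussian white-noise increments: ordinary diffusion, so delta should be near 0.5
delta = diffusion_entropy(rng.normal(size=20000), windows=[4, 8, 16, 32, 64])
```

For a Lévy walk the same procedure yields delta above 0.5, which is how the method separates the two classes of dynamics.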

10.
11.
An ND/D/1 queueing model means that N independent periodic sources are served by a single server and the packets have the same size. These models have received close attention as general queueing models in telecommunications. Both discrete models, where packets may be transmitted only at fixed time instances, and continuous models, where the time of transmission is not restricted, can be applied in the modeling. This paper provides the exact distribution of the cumulative idle time duration in such queueing systems and also proposes accurate approximation formulae for large systems. The results of this paper are of practical significance because existing approximations of the distribution of the cumulative idle time can be replaced by the proposed formulae. AMS subject classification: 68M20, 60K25

12.
The numerical analysis of ductile damage and failure in engineering materials is often based on the micromechanical model of Gurson [1]. Numerical studies in the context of the finite-element method demonstrate that, as with other such types of local damage models, the numerical simulation of the initiation and propagation of damage zones is strongly mesh-dependent and thus unreliable. The numerical problems concern the global load-displacement response as well as the onset, size and orientation of damage zones. From a mathematical point of view, this problem is caused by the loss of ellipticity of the set of partial differential equations determining the (rate of) deformation field. One possible way to overcome these problems and shortcomings of the local modelling is the application of so-called non-local damage models. In particular, these are based on the introduction of a gradient-type evolution equation for the damage variable that accounts for the spatial distribution of damage. In this work, we investigate the (material) stability behaviour of local Gurson-based damage modelling and of a gradient extension of this modelling at large deformation, in order to be able to model the width and other physical aspects of the localization of the damage and failure process in metallic materials.

13.
The finite element method is well established for numerically solving parabolic partial differential equations (PDEs). It is also well known that too large a time step should not be chosen, in order to obtain a stable and accurate numerical solution. In this article, accuracy analysis shows that too small a time step should not be chosen either for some time-stepping schemes; otherwise, the accuracy of the numerical solution cannot be improved and can even be worsened in some cases. Furthermore, so-called minimum time step criteria are established for the Crank-Nicolson scheme, the Galerkin-time scheme, and the backward-difference scheme used in the temporal discretization. For the forward-difference scheme, no minimum time step exists as far as accuracy is concerned. In the accuracy analysis, no specific initial or boundary conditions are invoked, so the established criteria can be applied to parabolic PDEs subject to any initial and boundary conditions. These minimum time step criteria are verified in a series of numerical experiments for a one-dimensional transient field problem with a known analytical solution. The minimum time step criteria developed in this study are useful for choosing appropriate time steps in numerical simulations of practical engineering problems. © 2005 Wiley Periodicals, Inc. Numer Methods Partial Differential Eq, 2006
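The small-time-step pathology can be seen in a minimal demo: linear finite elements for the 1D heat equation have a consistent mass matrix M = (h/6)*tridiag(1,4,1), and a single Crank-Nicolson step from rough nodal data produces spurious negative undershoots when dt is far too small. This is an illustrative demonstration of the phenomenon, not the article's exact minimum-time-step criteria.

```python
import numpy as np

# Linear FEM for u_t = u_xx on (0,1), homogeneous Dirichlet BCs at both ends.
n = 19                          # interior nodes
h = 1.0 / (n + 1)
I = np.eye(n)
T = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
M = (h / 6.0) * (4 * I + T)     # consistent mass matrix
K = (1.0 / h) * (2 * I - T)     # stiffness matrix

u0 = np.zeros(n)
u0[n // 2] = 1.0                # nodal spike: rough initial data

def cn_step(dt):
    """One Crank-Nicolson step: (M + dt/2 K) u1 = (M - dt/2 K) u0."""
    return np.linalg.solve(M + 0.5 * dt * K, (M - 0.5 * dt * K) @ u0)

# A time step far below any sensible minimum: the consistent mass matrix
# makes M^{-1}K non-monotone, and the exact solution's positivity is lost.
u_tiny = cn_step(1e-9)
```

The undershoot does not disappear by shrinking dt further, which is exactly why a lower bound on the time step matters for these schemes.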

14.
A stochastic algorithm and numerical simulation for the scavenging of aerosols by raindrops
The temporal evolution of the aerosol size distribution quantifies the wet deposition of aerosols; mathematically, it is described by the general dynamic equation with wet deposition. This equation is a typical partial integro-differential equation that depends on both the aerosol size distribution and the raindrop size distribution, and the scavenging-coefficient model becomes very complex because wet deposition mechanisms such as Brownian diffusion, interception, and inertial impaction must be taken into account, making the equation difficult to solve with ordinary numerical methods. A new multiple Monte Carlo algorithm is therefore developed to solve the general dynamic equation with wet deposition and is applied to simulate the wet deposition of aerosols in a realistic environment. For lognormal aerosol and raindrop size distributions, numerical simulations with the multiple Monte Carlo algorithm show that the smaller the geometric mean size and the geometric standard deviation of the raindrops, the more favorable the wet removal of small and medium-sized aerosols, while the wet removal of large aerosols becomes slightly less favorable.

15.
H.M. Lübcke, T. Rung, F. Thiele, PAMM 2002, 1(1): 292–293
Due to their high efficiency and robustness, almost all simulations of complex engineering flows rely on turbulence models with explicit stress closures. The focal point of an explicit stress closure is the stress-strain relationship, which couples the Reynolds stresses to the velocity field. Mostly, linear eddy-viscosity models are employed; these are based on the Boussinesq hypothesis and assume the Reynolds stress to be proportional to the strain-rate tensor. They are therefore easy to implement, but they fail to predict complex flow fields beyond two-dimensional shear flows, such as the secondary flow in non-circular ducts. In contrast to these models, explicit algebraic stress models (EASM) represent the Reynolds stress by the integrity basis of the strain-rate and vorticity tensors. This leads to a more general stress-strain relationship; however, it increases the implementation and computational effort. In this paper the properties of the integrity basis will be discussed and the derivation of a minimal integrity basis will be proposed.

16.
Fishing leads to truncation of a population's age and size structure, and large-sized fish are usually more valuable per unit weight than small ones. Nevertheless, these size-related factors have mostly been ignored in bioeconomic modeling. Here, we present a simple extension to the Gordon–Schaefer model that accounts for variations in mean individual catch weight, and derive the feedback rule for optimal harvest in this setting. As the Gordon–Schaefer model has no population structure, size effects have to be accounted for indirectly. Here we assume a simple negative relationship between fishing effort and mean individual weight, and a positive relationship between mean catch weight and price. The aim is to emulate the alteration of size structure in fish populations due to fishing and the influence of size on price per weight unit and, eventually, on net revenues. This demonstrates, on a general level, how such size-dependent effects change the patterns of optimal harvest paths and sustainable revenue for single fish stocks. The model shows clear shifts toward lower levels of optimal effort and yield compared to classical models without size effects. This suggests that ignoring body size could lead to misleading assumptions and policies, potentially causing rent dissipation and suboptimal utilization of renewable resources.
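The qualitative shift can be reproduced with a toy version of the extension: in the Gordon–Schaefer model the sustainable yield is Y(E) = qEK(1 - qE/r), and we let mean catch weight fall with effort and price rise with mean weight. The functional forms and all parameter values below are illustrative assumptions, not the article's calibration.

```python
import numpy as np

# Gordon-Schaefer surplus production with illustrative size effects:
#   mean catch weight  wbar(E) = w0 / (1 + s*E)      (falls with effort)
#   price per weight   p(w)    = p0 + p1*w           (rises with mean weight)
r, K, q, cost = 0.5, 1.0, 1.0, 0.1
w0, s, p0, p1 = 1.0, 1.0, 1.0, 1.0

E = np.linspace(0.0, r / q, 2001)            # efforts with a nonnegative equilibrium stock
Y = q * E * K * (1 - q * E / r)              # sustainable yield at effort E

price_flat = p0 + p1 * w0                    # classical model: size structure ignored
price_size = p0 + p1 * w0 / (1 + s * E)      # size effect: effort erodes mean weight, hence price

rev_flat = price_flat * Y - cost * E         # sustainable revenue, classical
rev_size = price_size * Y - cost * E         # sustainable revenue, with size effect
E_flat = E[np.argmax(rev_flat)]
E_size = E[np.argmax(rev_size)]
```

Because the size-dependent price is everywhere below the classical one and decreasing in effort, the optimal effort and the attainable revenue both shift down, matching the abstract's conclusion.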

17.
While several mental functions are characterized by parallel computation performed by modules in the cortex, consciousness is sustained by a serial global integration: only a single scene takes place at a time. Studies on complex systems show that macroscopic variables, integrating the activities of many components, undergo fluctuations with an intermittent serial structure when the system is in a state called “criticality”, characterized by avalanches with inverse-power-law (scale-free) distribution densities of sizes and inter-event times. Criticality has been established in human brain dynamics during wakefulness. Here we review how the criticality hypothesis is able to explain many recent studies of complex brain dynamics. We focus, in particular, on the global, serial, intermittent behavior that can be assessed via high-density electroencephalograms by studying transitions between metastable states. Established as it is during wakefulness, it remained unresolved whether this global intermittent dynamics correlates with consciousness or with a non-task-driven default mode that is also present in non-conscious states, like deep (NREM) sleep. Here we show that in NREM sleep seriality breaks down and is re-established during REM sleep (dreams), with an unaltered spatial structure in terms of the complex branching of avalanches. We conjecture that this connectivity is exploited in NREM sleep by neural bistability, resetting and “parallelizing” portions of the cortex.

18.
A new risk measure fully based on historical data is proposed, from which we can naturally derive concentrated optimal portfolios rather than imposing cardinality constraints. The new risk measure can be expressed as a quadratic form of the introduced greedy matrix, which takes investors' joint behavior into account. We construct distribution-free portfolio selection models for a simple case and a realistic case, respectively. The latest techniques for describing transaction cost constraints and solving nonconvex quadratic programs are utilized to obtain the optimal portfolio efficiently. In order to show the practicality, efficiency, and robustness of our new risk measure and the corresponding portfolio selection models, a series of empirical studies is carried out with trading data from advanced stock markets and emerging stock markets. Different performance indicators are adopted to comprehensively compare results obtained under our new models with those obtained under the mean-variance, mean-semivariance, and mean-conditional value-at-risk models. Out-of-sample results clearly show that our models outperform the others and provide a simple and practical approach for choosing concentrated, efficient, and robust portfolios. Copyright © 2014 John Wiley & Sons, Ltd.

19.
Applied Mathematical Modelling 2014, 38(17–18): 4277–4290
The inhomogeneous generalized population balance equation, discretized with the direct quadrature method of moments (DQMOM), is solved to predict the bubble size distribution (BSD) in a vertical pipe flow. The proposed model is compared with a more classical approach where bubbles are characterized by a constant mean size. The turbulent two-phase flow field, which is modeled using a Reynolds-Averaged Navier–Stokes equation approach, is assumed to be in local equilibrium, so the relative gas and liquid (slip) velocities can be calculated with the algebraic slip model, thereby accounting for the drag, lift, and lubrication forces. The complex relationship between the bubble size distribution and the resulting forces is described accurately by the DQMOM. Each quadrature node and weight represents a class of bubbles with characteristic size and number density, which change dynamically in time and space to preserve the first moments of the BSD. The predictions obtained are validated against previously published experimental data, thereby demonstrating the advantages of this approach for large-scale systems as well as suggesting future extensions to long piping systems and more complex geometries.

20.
We consider the problem of estimating regression models of two-dimensional random fields. Asymptotic properties of the least squares estimator of the linear regression coefficients are studied for the case where the disturbance is a homogeneous random field with an absolutely continuous spectral distribution and a positive and piecewise continuous spectral density. We obtain necessary and sufficient conditions on the regression sequences such that a linear estimator of the regression coefficients is asymptotically unbiased and mean square consistent. For such regression sequences the asymptotic covariance matrix of the linear least squares estimator of the regression coefficients is derived.
