87 results (Mechanics: 6, Mathematics: 1, Physics: 80)
1.
2.
Punctuated evolution due to delayed carrying capacity (cited 1 time: 0 self-citations, 1 by others)
A new delay equation is introduced to describe the punctuated evolution of complex nonlinear systems. A detailed analytical and numerical investigation provides a classification of all possible types of solutions for the dynamics of a population in the four main regimes, dominated respectively by (i) gain and competition, (ii) gain and cooperation, (iii) loss and competition, and (iv) loss and cooperation. Our delay equation may exhibit bistability in some parameter ranges, as well as a rich set of regimes, including monotonic decay to zero, smooth exponential growth, punctuated unlimited growth, punctuated growth or alternation to a stationary level, oscillatory approach to a stationary level, sustainable oscillations, finite-time singularities, and finite-time death.
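The specific delay equation is not reproduced in the abstract. A minimal sketch of the mechanism, assuming the classic Hutchinson delayed-logistic form dx/dt = x(t)·(1 − x(t−τ)) with illustrative parameter values, shows how a carrying capacity felt with delay τ destabilizes the stationary level and produces sustained oscillations:

```python
import numpy as np

def delayed_logistic(tau=2.0, x0=0.5, dt=0.001, t_max=60.0):
    """Euler-integrate dx/dt = x(t) * (1 - x(t - tau)).

    The carrying capacity is felt with delay tau; for tau > pi/2 the
    fixed point x = 1 loses stability and oscillations appear.
    """
    n_steps = int(t_max / dt)
    lag = int(tau / dt)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        # constant history x(t) = x0 for t < 0
        x_delayed = x0 if i < lag else x[i - lag]
        x[i + 1] = x[i] + dt * x[i] * (1.0 - x_delayed)
    return x

x = delayed_logistic()
late = x[len(x) // 2:]  # second half: past the initial transient
```

With τ = 2.0 (beyond the Hopf threshold π/2) the late-time solution oscillates around the stationary level x = 1 instead of converging to it.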
3.
We tested 45 indices and common stocks in the South African stock market for the possible existence of a bubble over the period from January 2003 to May 2006. A bubble is defined by a faster-than-exponential acceleration with significant log-periodic oscillations. These two traits are analyzed using different methods. Sensitivity tests show that the estimated parameters are robust. With the benefit of 6 additional months of data since the analysis was performed, we observe that many of the stocks on the South African market experienced an abrupt drop in mid-June 2006, which is compatible with the predicted critical time tc for several of the stocks, but not all. This suggests that the mini-crash of mid-June 2006 was only a partial correction, which has since resumed into a renewed bubble-like acceleration bound to end some time in 2007, similarly to what happened in the US market from October 1997 to August 1998.
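Faster-than-exponential acceleration decorated by log-periodic oscillations is commonly parametrized by the log-periodic power law (LPPL). A sketch of that parametrization, with illustrative parameter values (not the fitted South African ones):

```python
import numpy as np

def lppl_log_price(t, tc, A, B, C, m, omega, phi):
    """Log-periodic power law, valid for t < tc:
        ln p(t) = A + B*(tc-t)^m + C*(tc-t)^m * cos(omega*ln(tc-t) - phi)

    B < 0 and 0 < m < 1 give super-exponential acceleration toward the
    critical time tc; omega sets the log-periodic oscillation frequency.
    """
    dt = tc - np.asarray(t, dtype=float)
    return A + B * dt**m + C * dt**m * np.cos(omega * np.log(dt) - phi)

# Illustrative parameters, not fitted to any data
t = np.array([0.0, 50.0, 99.0])
y = lppl_log_price(t, tc=100.0, A=10.0, B=-1.0, C=0.05, m=0.5,
                   omega=10.0, phi=0.0)
```

As t approaches tc the power-law term dominates and the log-price accelerates, while the cosine term superimposes the oscillations whose frequency increases in log-time.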
4.
5.
Several recent works point out that the crowd of small unobservable earthquakes (with magnitudes below the detection threshold md) may play a significant and perhaps dominant role in triggering future seismicity. Using the ETAS branching model of triggered seismicity, we apply the formalism of generating probability functions to investigate how the statistical properties of observable earthquakes differ from the statistics of all events. The ETAS (epidemic-type aftershock sequence) model assumes that each earthquake can trigger other earthquakes (“aftershocks”). An aftershock sequence results in this model from the cascade of aftershocks of each past earthquake. The triggering efficiency of earthquakes is assumed to vanish below a lower magnitude limit m0, in order to ensure the convergence of the theory, and may reflect the physics of rate-and-state frictional rupture. We show that, to a good approximation, the statistical distribution of seismic rates of events with magnitudes above md generated by an ETAS model with branching ratio n is the same as that of events generated by another ETAS model with effective parameter n(md). Our present analysis thus confirms, for the full statistical (time-independent or large time-window approximation) properties, the results obtained previously by one of us and Werner, which were based solely on the average seismic rates (the first-order moment of the statistics). Our analysis also demonstrates that this correspondence is not exact: there are small corrections which can be systematically calculated, in terms of additional contributions that can be mapped onto a different branching model. We also show that this approximate correspondence of the ETAS model onto itself, obtained by changing m0 into md and n into n(md), holds only with respect to its statistical properties and not for all its space-time properties.
6.
Human beings like to believe they are in control of their destiny. This ubiquitous trait seems to increase motivation and persistence, and is probably evolutionarily adaptive [S.E. Taylor, J.D. Brown, Psych. Bull. 103, 193 (1988); A. Bandura, Self-efficacy: the exercise of control (WH Freeman, New York, 1997)]. But how good really is our ability to control? How successful is our actual track record? There is little understanding of when and under what circumstances we may over-estimate [E. Langer, J. Pers. Soc. Psych. 7, 185 (1975)] or even lose our ability to control and optimize outcomes, especially when they are the result of aggregations of individual optimization processes. Here, we demonstrate analytically, using the theory of Markov chains, and by numerical simulations in two classes of games, the Time-Horizon Minority Game [M.L. Hart, P. Jefferies, N.F. Johnson, Phys. A 311, 275 (2002)] and the Parrondo Game [J.M.R. Parrondo, G.P. Harmer, D. Abbott, Phys. Rev. Lett. 85, 5226 (2000); J.M.R. Parrondo, How to cheat a bad mathematician (ISI, Italy, 1996)], that agents who optimize their strategy based on past information may actually perform worse than non-optimizing agents. In other words, low-entropy (more informative) strategies under-perform high-entropy (or random) strategies. This provides a precise definition of the “illusion of control” in certain set-ups a priori defined to emphasize the importance of optimization. An erratum to this article is available.
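The Markov-chain analysis mentioned above can be illustrated on the Parrondo game itself: the capital modulo 3 forms a three-state chain, and the expected gain per step follows from its stationary distribution. The sketch below uses the standard textbook parameters of Parrondo's games, which are an assumption here, not necessarily the exact variant studied in the paper:

```python
import numpy as np

def drift(p_win):
    """Expected gain per step for a random walk on the capital, where
    p_win[i] is the probability of winning +1 when capital % 3 == i."""
    P = np.zeros((3, 3))
    for i in range(3):
        P[i, (i + 1) % 3] = p_win[i]       # win: capital + 1
        P[i, (i - 1) % 3] = 1 - p_win[i]   # lose: capital - 1
    # Stationary distribution = left eigenvector of P for eigenvalue 1
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1))])
    pi /= pi.sum()
    return float(pi @ (2 * np.asarray(p_win) - 1))

eps = 0.005
p_A = 0.5 - eps                            # game A: slightly biased coin
p_B = [0.1 - eps, 0.75 - eps, 0.75 - eps]  # game B: capital-dependent odds
p_mix = [(pb + p_A) / 2 for pb in p_B]     # random alternation of A and B

drift_B = drift(p_B)      # negative: game B loses on its own
drift_mix = drift(p_mix)  # positive: the combination wins (Parrondo effect)
```

Both games are losing in isolation, yet randomly alternating them yields a positive drift: switching reshuffles the stationary distribution over the capital-mod-3 states in favor of the high-odds states of game B.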
7.
In a number of natural and social systems, the response to an exogenous shock relaxes back to the average level according to a long-memory kernel ~1/t^(1+θ) with 0 ≤ θ < 1. In the presence of an epidemic-like process of triggered shocks developing in a cascade of generations at or close to criticality, this “bare” kernel is renormalized into an even slower decaying response function ~1/t^(1−θ). Surprisingly, this means that the shorter the memory of the bare kernel (the larger 1+θ), the longer the memory of the response function (the smaller 1−θ). Here, we present a detailed investigation of this paradoxical behavior based on a generation-by-generation decomposition of the total response function, the use of Laplace transforms and of “anomalous” scaling arguments. The paradox is explained by the fact that the number of triggered generations grows anomalously with time as ~t^θ, so that the contributions of active generations up to time t more than compensate the shorter memory associated with a larger exponent θ. This anomalous scaling results fundamentally from the property that the expected waiting time is infinite for 0 ≤ θ ≤ 1. The techniques developed here are also applied to the case θ > 1, and we find in this case that the total renormalized response is a constant for t < 1/(1−n), followed by a cross-over to ~1/t^(1+θ) for t ≫ 1/(1−n).
8.
We call attention to what seems to be a widely held misconception, according to which large crashes are simply the largest events of distributions of price variations with fat tails. We demonstrate on the Dow Jones Industrial Average that, with high probability, the three largest crashes of this century are outliers. This result supports the suggestion that large crashes result from specific amplification processes that might lead to observable precursory signatures. Received and revised: 30 November 1997 / Accepted: 8 December 1997
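The outlier logic can be illustrated on synthetic data: draw “ordinary” drawdown magnitudes from an exponential law (a common null hypothesis for drawdown distributions), inject a few crash-sized events, and check that they stand far beyond the largest event expected under the null. Everything below is synthetic and illustrative, not the DJIA analysis of the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

# Null hypothesis: ordinary drawdown magnitudes follow an exponential law
N = 1000
drawdowns = rng.exponential(scale=1.0, size=N)

# Inject three "crash" outliers far beyond the expected largest event
crashes = np.array([15.0, 18.0, 21.0])
sample = np.concatenate([drawdowns, crashes])

# Estimate the exponential scale robustly from the bulk (trim the top 1%)
bulk = np.sort(sample)[: int(0.99 * len(sample))]
scale_hat = bulk.mean()

# Under the exponential null, the largest of N draws is ~ scale * ln(N);
# events several scale units beyond that are flagged as outliers
threshold = scale_hat * (np.log(len(sample)) + 5.0)
outliers = sample[sample > threshold]
```

The injected crashes exceed the threshold by a wide margin, whereas the bulk of the distribution does not: this is the sense in which the largest crashes are outliers with respect to the fat-tailed body of ordinary price variations.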
9.
    
We study tensorial elastic wave transport in two-dimensional densely fractured media using numerical simulations and document transitions from propagation to diffusion and to localisation/delocalisation. For large fracture stiffness, waves are propagative at the scale of the system. For small stiffness, multiple scattering prevails, such that waves are diffusive in disconnected fracture networks, and localised in connected ones with a strong multifractality of the intensity field. A re-entrant delocalisation is found in well-connected fracture networks due to energy leakage via evanescent waves and cascades of mode conversion. https://doi.org/10.1209/0295-5075/ac225d
10.
    
We present a simple model of earthquakes on a pre-existing hierarchical fault structure. The system self-organizes at large times into a stationary state with a power-law Gutenberg-Richter distribution of earthquake sizes. The largest fault carries irregular great earthquakes, preceded by precursors developing over long time scales and followed by aftershocks obeying an Omori law. The cumulative energy released by precursors follows a time-to-failure power law with log-periodic structures, qualifying a large event as an effective dynamical (depinning) critical point. Down the hierarchy, smaller earthquakes exhibit the same phenomenology, albeit with increasing irregularities. https://doi.org/10.1209/epl/i1998-00113-x
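The Gutenberg-Richter distribution invoked above can be sketched by inverse-transform sampling, together with Aki's maximum-likelihood estimator of the b-value. The values of b and of the magnitude cutoff m0 below are illustrative, not outputs of the hierarchical fault model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Gutenberg-Richter law: P(M > m) = 10^(-b * (m - m0)) for m >= m0.
# Inverse-transform sampling: m = m0 - log10(U) / b with U ~ Uniform(0, 1).
b_true, m0 = 1.0, 2.0
u = rng.uniform(size=50000)
mags = m0 - np.log10(u) / b_true

# Aki's maximum-likelihood estimator of the b-value
b_hat = np.log10(np.e) / (mags.mean() - m0)
```

Because magnitudes above m0 are exponentially distributed in base 10, the mean magnitude excess determines b directly, which is why the one-line Aki estimator recovers the input b-value.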
Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)