Similar Articles
 20 similar articles found (search time: 875 ms)
1.
Handong Li  Yan Wang 《Physica A》2010,389(16):3254-749
Recent empirical literature documents the presence of long-term memory in return volatility, but the mechanism behind it is still unclear. In this paper, we investigate the origin and properties of long-term memory in nonparametric volatility, using high-frequency time series data of the Chinese Shanghai Composite Stock Price Index. We perform detrended fluctuation analysis (DFA) on three different nonparametric volatility estimators at different sampling frequencies. For the same volatility series, the Hurst exponents decrease as the sampling interval increases, but they remain larger than 1/2, meaning that changing the interval does not remove the long memory. RRV exhibits relatively stable long-term memory and is less influenced by sampling frequency. RV and RBV show some trends that depend on the time interval, indicating that the jump component has no significant impact on the long-term memory property. This suggests that the presence of long-term memory in nonparametric volatility can be attributed to the integrated variance component. Accounting for microstructure noise, RBV and RRV still present long-term memory at various time intervals, so we infer that the long-term memory of realized volatility is not affected by market microstructure noise. Our findings imply that the long-term memory phenomenon is an inherent characteristic of the data-generating process, not a result of microstructure noise or volatility clustering.
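For readers who want to reproduce this kind of analysis, the sketch below shows a minimal order-1 detrended fluctuation analysis (DFA) estimate of the Hurst exponent; the scale grid and the toy absolute-noise series standing in for a realized-volatility series are illustrative assumptions, not the authors' data or code.

```python
import numpy as np

def dfa_hurst(x, scales=None):
    """Estimate the Hurst exponent of a (volatility) series via order-1 DFA."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                 # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(np.log10(8), np.log10(len(x) // 4), 20).astype(int))
    fluct = []
    for s in scales:
        n_win = len(y) // s
        segs = y[: n_win * s].reshape(n_win, s)
        t = np.arange(s)
        rms = []
        for seg in segs:                         # detrend each window with a linear fit
            coef = np.polyfit(t, seg, 1)
            rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(rms)))
    H, _ = np.polyfit(np.log(scales), np.log(fluct), 1)   # slope = Hurst exponent
    return H

# toy check: a short-memory series should give H close to 0.5
rng = np.random.default_rng(1)
print(dfa_hurst(np.abs(rng.standard_normal(4096))))   # stand-in for a volatility series
```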

2.
Over the last two decades, a large number of different methods have been used to study the fractal-like behavior of heart rate variability (HRV). In this paper some of the most widely used techniques are reviewed. In particular, the focus is on methods that characterize the long-memory behavior of time series (periodogram, detrended fluctuation analysis, rescaled range analysis, scaled windowed variance, Higuchi dimension, wavelet-transform modulus maxima, and generalized structure functions). The performance of the different techniques was tested on simulated self-similar noises (fBm and fGn) for values of alpha, the slope of the spectral density at very low frequencies, ranging from -1 to 3 in steps of 0.05. The check was performed using the scaling relationships between the various indices. DFA and the periodogram showed the smallest mean square error from the expected values in the range of interest for HRV. Building on these results, the ability of the different methods to discriminate between populations of patients was assessed on RR series derived from Holter recordings, using the Noltisalis database: a set of thirty 24-h Holter recordings collected from healthy subjects, patients suffering from congestive heart failure, and heart-transplant patients. All the methods, with the possible exception of rescaled range analysis, were almost equivalent in distinguishing between the three groups of patients. Finally, the scaling relationships valid for fBm and fGn also approximately held when empirically applied to HRV series.
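As a companion to the review, here is a minimal rescaled-range (R/S) sketch, one of the long-memory indices compared above; the window sizes and the white-noise test series are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def rs_hurst(x, scales=(16, 32, 64, 128, 256, 512)):
    """Rescaled-range (R/S) estimate of the Hurst exponent of a noise-like (fGn) series."""
    x = np.asarray(x, dtype=float)
    rs = []
    for s in scales:
        n_win = len(x) // s
        vals = []
        for seg in x[: n_win * s].reshape(n_win, s):
            z = np.cumsum(seg - seg.mean())           # cumulative deviation within the window
            r = z.max() - z.min()                     # range
            sd = seg.std(ddof=1)
            if sd > 0:
                vals.append(r / sd)
        rs.append(np.mean(vals))
    H, _ = np.polyfit(np.log(scales), np.log(rs), 1)  # slope of log(R/S) vs log(scale)
    return H

rng = np.random.default_rng(0)
print(rs_hurst(rng.standard_normal(8192)))            # white noise: roughly H ~ 0.5-0.6 (R/S is biased upward)
```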

3.
In the face of the upcoming 30th anniversary of econophysics, we review our contributions and other related work on the modeling of the long-range memory phenomenon in physical, economic, and other social complex systems. Our group has shown that the long-range memory phenomenon can be reproduced using various Markov processes, such as point processes, stochastic differential equations, and agent-based models, well enough to match other statistical properties of the financial markets, such as the return, trading-activity, and first-passage-time distributions. This research has led us to question whether the observed long-range memory is the result of a genuine long-range memory process or just a consequence of the non-linearity of Markov processes. As our most recent result, we discuss the long-range memory of order-flow data in financial markets and other social systems from the perspective of fractional Lévy stable motion. We test widely used long-range memory estimators on discrete fractional Lévy stable motion represented by auto-regressive fractionally integrated moving average (ARFIMA) sample series. Our newly obtained results suggest that new estimators of self-similarity and long-range memory need to be developed for analyzing systems with non-Gaussian distributions.
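The sketch below illustrates, under stated assumptions, how ARFIMA(0, d, 0) sample series with heavy-tailed (alpha-stable) innovations can be generated by truncated fractional integration for testing long-range memory estimators; the parameter values, series length, and use of scipy.stats.levy_stable are illustrative choices, not the authors' code.

```python
import numpy as np
from scipy.stats import levy_stable

def arfima0d0(n, d, alpha=1.8, seed=0):
    """Generate an ARFIMA(0, d, 0) series by fractionally integrating i.i.d. alpha-stable noise."""
    rng = np.random.default_rng(seed)
    eps = levy_stable.rvs(alpha, 0.0, size=2 * n, random_state=rng)   # symmetric stable innovations
    # weights of (1 - B)^(-d): psi_0 = 1, psi_k = psi_{k-1} * (k - 1 + d) / k
    psi = np.ones(2 * n)
    for k in range(1, 2 * n):
        psi[k] = psi[k - 1] * (k - 1 + d) / k
    x = np.convolve(eps, psi)[: 2 * n]     # truncated MA(infinity) representation
    return x[n:]                           # drop the first half as burn-in

series = arfima0d0(3000, d=0.3)            # d = 0.3 corresponds to persistent long-range memory
print(series[:5])
```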

4.
We apply superstatistical techniques to an experimental time series of transient currents measured through a thin aluminium–PMMA–aluminium film. We show that, to a good approximation, the current can be described by local Gaussian processes with fluctuating variance. The marginal density exhibits ‘fat tails’ and is well modelled by a superstatistical model. Our techniques can be applied generally to other short time series as well.
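A minimal sketch of the superstatistical picture, assuming a Gamma-distributed inverse variance (one common superstatistics choice, not necessarily the one fitted in the paper): the signal is locally Gaussian, but mixing over the fluctuating variance produces a fat-tailed marginal.

```python
import numpy as np

rng = np.random.default_rng(42)
n_windows, T = 400, 250            # number of locally Gaussian patches and patch length (assumed)

# beta = 1/variance drawn from a Gamma distribution -> Student-t-like fat-tailed marginal
beta = rng.gamma(shape=2.0, scale=1.0, size=n_windows)
signal = np.concatenate([rng.normal(0.0, 1.0 / np.sqrt(b), size=T) for b in beta])

# excess kurtosis well above 0 signals the 'fat tails' of the mixed (superstatistical) marginal
z = (signal - signal.mean()) / signal.std()
print("excess kurtosis:", np.mean(z ** 4) - 3.0)
```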

5.
Second-order structure functions are widely used to characterize turbulence in the inertial range because they are simple to estimate, particularly in comparison to spectral density functions and wavelet variances. Structure function estimators, however, are highly autocorrelated and, as a result, no suitable theory has been established to provide confidence intervals for turbulence parameters when determined via regression fits in log/log space. Monte Carlo simulations were performed to compare the performance of structure function estimators of turbulence parameters with corresponding multitaper spectral and wavelet variance estimators. The simulations indicate that these latter estimators have smaller variances than estimators based upon the structure function. In contrast to structure function estimators, the statistical properties of the multitaper spectral and wavelet variance estimators allow for the construction of confidence intervals for turbulence parameters. The Monte Carlo simulations also confirm the validity of the statistical theory behind the multitaper spectral and wavelet variance estimators. The strengths and weaknesses of the various estimators are further illustrated by analyzing an atmospheric temperature time series.
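A minimal sketch of the second-order structure function estimator and the log-log regression it is typically paired with; the lag grid and the Brownian test series are illustrative assumptions.

```python
import numpy as np

def structure_function_2(x, lags):
    """D2(tau) = mean of squared increments at lag tau."""
    x = np.asarray(x, dtype=float)
    return np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])

rng = np.random.default_rng(3)
x = np.cumsum(rng.standard_normal(20000))            # Brownian-like test series (D2 ~ tau^1)
lags = np.unique(np.logspace(0.5, 3, 15).astype(int))
d2 = structure_function_2(x, lags)

slope, _ = np.polyfit(np.log(lags), np.log(d2), 1)   # scaling exponent from a log-log fit
print("estimated scaling exponent:", slope)          # ~1 for Brownian motion; ~2/3 in Kolmogorov turbulence
```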

6.
Three scaled windowed variance methods (standard, linear-regression detrended, and bridge detrended) for estimating the Hurst coefficient (H) are evaluated. The Hurst coefficient, with 0 < H < 1, characterizes self-similar decay in the time-series autocorrelation function. The scaled windowed variance methods estimate H for fractional Brownian motion (fBm) signals, which are cumulative sums of fractional Gaussian noise (fGn) signals. For all three methods both the bias and the standard deviation of the estimates are less than 0.05 for series with N ≥ 2^9 points. Estimates for short series (N < 2^8) are unreliable. To have a 0.95 probability of distinguishing between two signals whose true H values differ by 0.1, more than 2^15 points are needed. All three methods proved more reliable (based on the bias and variance of the estimates) than Hurst's rescaled range analysis, periodogram analysis, and autocorrelation analysis, and as reliable as dispersional analysis. The latter methods can only be applied to fGn or differences of fBm, while the scaled windowed variance methods must be applied to fBm or cumulative sums of fGn.
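A minimal sketch of the standard scaled windowed variance estimator (the first of the three variants evaluated above), assuming an fBm-like input; the window sizes and the Brownian test signal are illustrative.

```python
import numpy as np

def swv_hurst(fbm, window_sizes=(16, 32, 64, 128, 256)):
    """Standard scaled windowed variance estimate of H for an fBm-like signal."""
    fbm = np.asarray(fbm, dtype=float)
    sd = []
    for w in window_sizes:
        n_win = len(fbm) // w
        segs = fbm[: n_win * w].reshape(n_win, w)
        sd.append(np.mean(segs.std(axis=1, ddof=1)))          # average within-window standard deviation
    H, _ = np.polyfit(np.log(window_sizes), np.log(sd), 1)    # SD(w) ~ w^H for fBm
    return H

# fBm with H = 0.5 is ordinary Brownian motion: the cumulative sum of white (fGn) noise
rng = np.random.default_rng(7)
print(swv_hurst(np.cumsum(rng.standard_normal(2 ** 13))))     # expect a value near 0.5
```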

7.
Bretthorst's recent generalization of the Lomb-Scargle periodogram shows that a sufficient statistic for frequency estimation from non-uniformly but simultaneously sampled quadrature data is equivalent to the FFT of those data with the missing samples replaced by zeros. We have applied this concept to the rapid analysis of pulsed-field-gradient MRI data that have been non-uniformly sampled in the velocity-encoding wave vector q. For a small number of q samples, it is more computationally efficient to calculate the periodogram directly rather than using the FFT algorithm with a large number of zeros. The algorithm we have implemented for finding the peak of the generalized periodogram is simple and robust; it involves repeated apodization and grid searching of the periodogram until the desired velocity resolution is achieved. The final estimate is refined by quadratic interpolation. We have tested the method for fully developed Poiseuille flow of a Newtonian fluid and have demonstrated substantial improvement in the precision of velocity measurement achievable in a fixed acquisition time with non-uniform sampling. The method is readily extendible to multidimensional data. Analysis of a 256 by 256 pixel image with 8 q samples and an effective velocity resolution of better than 1/680 of the Nyquist range requires approximately 1 min of computation time on a 400 MHz SUN Ultrasparc II processor.
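A hedged sketch of the numerical core described above: evaluating a Lomb-Scargle periodogram of non-uniformly sampled data on a grid and refining the peak by quadratic (parabolic) interpolation. It uses scipy.signal.lombscargle on a real-valued toy signal rather than the quadrature MRI data of the paper, and the sampling scheme, frequency grid, and noise level are made up for illustration.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 10, 80))                 # non-uniform sample positions (illustrative)
f_true = 1.7                                        # frequency to recover
y = np.cos(2 * np.pi * f_true * t) + 0.2 * rng.standard_normal(t.size)

freqs = np.linspace(0.1, 5.0, 2000)                 # search grid
pgram = lombscargle(t, y - y.mean(), 2 * np.pi * freqs)

# refine the grid peak with a quadratic fit through the three points around it
k = np.argmax(pgram)
a, b, c = pgram[k - 1], pgram[k], pgram[k + 1]
delta = 0.5 * (a - c) / (a - 2 * b + c)             # sub-grid offset in grid units
f_hat = freqs[k] + delta * (freqs[1] - freqs[0])
print("estimated frequency:", f_hat)
```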

8.
Reputation-based network selection mechanism using game theory
Current and future wireless environments are based on the coexistence of multiple networks supported by various access technologies deployed by different operators. As wireless network deployments increase, their usage is also experiencing significant growth. In this heterogeneous multi-technology, multi-application, multi-terminal, multi-user environment, users will be able to connect freely to any of the available access technologies. Network selection mechanisms will be required in order to keep mobile users “always best connected” anywhere and anytime. In such a heterogeneous environment, game theory techniques can be adopted in order to understand and model competitive or cooperative scenarios between rational decision makers. In this work we propose a theoretical framework combining reputation-based systems, game theory, and network selection mechanisms. We define a network reputation factor which reflects the network’s previous behaviour in assuring service guarantees to the user. Using the repeated Prisoner’s Dilemma game, we model the user–network interaction as a cooperative game and show that, by defining incentives for cooperation and disincentives against defecting on service guarantees, repeated interaction sustains cooperation.

9.
Load prediction plays a very important role in fault management: by predicting CPU load and memory usage, a system can be monitored in real time, the availability of resources in future time periods can be anticipated, and anomaly alarms can be raised. This paper proposes a weighted, improved autoregressive model: the parameters obtained by least squares are weighted and, combined with time series analysis theory, used to build a load prediction model for forecasting CPU load and memory usage. Experiments show that weighting the parameters of the AR model improves the parameter estimation and reduces the prediction error by 60%-80%.
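A hedged sketch of the general idea: fitting an AR(p) model for load prediction by weighted least squares so that recent samples dominate the parameter estimate, then issuing a one-step forecast. The model order, the exponential weighting scheme, and the toy load series are illustrative assumptions, not the paper's exact weighting.

```python
import numpy as np

def fit_weighted_ar(x, p=3, decay=0.98):
    """Fit AR(p) coefficients by weighted least squares (recent samples weighted more heavily)."""
    x = np.asarray(x, dtype=float)
    # design matrix: row t holds [x[t-1], ..., x[t-p]], target is x[t]
    X = np.column_stack([x[p - k - 1: len(x) - k - 1] for k in range(p)])
    y = x[p:]
    w = decay ** np.arange(len(y) - 1, -1, -1)        # newest observation gets weight 1
    coef, *_ = np.linalg.lstsq(np.sqrt(w)[:, None] * X, np.sqrt(w) * y, rcond=None)
    return coef

def predict_next(x, coef):
    """One-step-ahead forecast of the next load sample."""
    p = len(coef)
    return float(np.dot(coef, x[-1: -p - 1: -1]))

# toy CPU-load-like series: slow oscillation plus noise (illustrative)
t = np.arange(500)
load = 50 + 10 * np.sin(2 * np.pi * t / 60) + np.random.default_rng(1).normal(0, 1, t.size)
coef = fit_weighted_ar(load)
print("next-step forecast:", predict_next(load, coef))
```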

10.
We study the dynamics of correlation and variance in systems under the load of environmental factors. A universal effect in ensembles of similar systems under the load of similar factors is described: in crisis, typically even before obvious symptoms of crisis appear, correlation increases and, at the same time, variance (and volatility) increases too. This effect is supported by many experiments and observations on groups of humans, mice, trees, and grassy plants, and on financial time series. A general approach to explaining the effect through the dynamics of individual adaptation of similar non-interacting individuals to a similar system of external factors is developed. Qualitatively, this approach follows Selye’s idea of adaptation energy.
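A minimal sketch of the two ensemble indicators discussed above, computed in sliding windows over a synthetic group of similar series whose common-factor loading and noise level increase at a ‘crisis’ onset; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
n_series, T = 20, 600
common = rng.standard_normal(T)
# after t = 400 the series become more strongly driven by a common factor and noisier ("crisis")
load_on_common = np.where(np.arange(T) < 400, 0.2, 0.8)
data = load_on_common * common + (1 + load_on_common) * rng.standard_normal((n_series, T))

def window_indicators(data, t0, w=100):
    """Mean pairwise correlation and mean variance inside the window [t0, t0 + w)."""
    seg = data[:, t0: t0 + w]
    c = np.corrcoef(seg)
    mean_corr = (c.sum() - len(c)) / (len(c) * (len(c) - 1))   # off-diagonal average
    return mean_corr, seg.var(axis=1).mean()

print("pre-crisis :", window_indicators(data, 250))
print("crisis     :", window_indicators(data, 450))             # both indicators increase
```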

11.
Yongkui Liu  Zhi Li  Long Wang 《Physica A》2010,389(12):2390-2396
We investigate the evolutionary prisoner’s dilemma with memory-based agents on a square lattice. By introducing memory effects into this game, we assume that individuals’ performance is evaluated in terms of the payoffs accumulated in their memories. It is shown that if individuals behave as their successful neighbors do, then cooperation can be significantly promoted. The mechanism responsible for the promotion of cooperation is discussed in detail. We confirm that the promotion of cooperation induced by memory effects remains effective when a preferential selection rule or an asynchronous updating rule is employed. Our work may shed some new light on the study of evolutionary games in real-world situations where the effects of individuals’ memories play a key role in the evolution of cooperation.
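A hedged sketch of this kind of model: a weak prisoner’s dilemma on a periodic square lattice where each agent accumulates payoffs over a finite memory and then imitates the neighbour with the highest accumulated payoff. The payoff values, lattice size, memory length, and synchronous best-imitation rule are illustrative choices, not necessarily the authors' exact update rules.

```python
import numpy as np

rng = np.random.default_rng(0)
L, M, steps = 20, 5, 200                    # lattice size, memory length, rounds (assumed)
b = 1.2                                     # temptation payoff of the weak PD (assumed)
strategy = rng.integers(0, 2, (L, L))       # 1 = cooperate, 0 = defect
history = np.zeros((M, L, L))               # payoff memories

def round_payoff(s):
    """Payoff of each site against its four neighbours (weak PD: R=1, T=b, S=P=0)."""
    pay = np.zeros_like(s, dtype=float)
    for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nb = np.roll(s, shift, axis=(0, 1))
        pay += np.where(s == 1, nb * 1.0, nb * b)   # C vs C earns 1, D vs C earns b
    return pay

for t in range(steps):
    history[t % M] = round_payoff(strategy)
    score = history.sum(axis=0)                      # payoff accumulated over the memory window
    # each agent imitates the strategy of its highest-scoring neighbour (including itself)
    best_s, best_score = strategy.copy(), score.copy()
    for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nb_score = np.roll(score, shift, axis=(0, 1))
        nb_strat = np.roll(strategy, shift, axis=(0, 1))
        better = nb_score > best_score
        best_score = np.where(better, nb_score, best_score)
        best_s = np.where(better, nb_strat, best_s)
    strategy = best_s

print("final cooperation level:", strategy.mean())
```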

12.
We introduce a two-player model of reinforcement learning with memory. Past actions of an iterated game are stored in a memory and used to determine the player’s next action. To examine the behaviour of the model, some approximate methods are used and confronted with numerical simulations and the exact master equation. When the length of the players’ memory increases to infinity, the model undergoes an absorbing-state phase transition. The performance of the examined strategies is checked in the prisoner’s dilemma game. It turns out that it is advantageous to have a large memory in symmetric games, but it is better to have a short memory in asymmetric ones.

13.
Francesco Serinaldi 《Physica A》2010,389(14):2770-4432
The detection of long range dependence (LRD) is an important task in time series analysis. LRD is often summarized by the well-known Hurst parameter (or exponent) H∈[0,1], which can be estimated by a number of methods. Some of these techniques are designed to be applied to signals behaving as a stationary fractional Gaussian noise (fGn), whereas others imply that the analyzed time series behave as a non-stationary fractional Brownian motion (fBm). Moreover, some estimators do not yield the Hurst parameter but indexes related to H and ranging outside the unit interval. Therefore, the fGn or fBm nature of the studied time series has to be preliminarily analyzed before applying any estimation method, and the relationships between H and the indexes resulting from the analyses have to be taken into account to obtain coherent results. Since fGn-like series represent the increments of fBm-like processes and both signals are characterized by the same H value by definition, estimators designed for fGn-like series can be applied to fBm-like sequences after preventive differentiation, and, conversely, estimators designed for fBm-like processes can be applied to fGn-like series after preventive integration. The signal characterization is particularly important when H is estimated on financial time series because the returns represent the first difference of price time series, which are often assumed to behave like self-affine sequences. The analysis of simulated fGn and fBm time series shows that all the considered methods yield comparable H values when properly applied. The reanalysis of several market price time series already studied in the literature points out that a correct application of the estimators (supported by a preventive signal classification) yields homogeneous H values, allowing for a useful cross-validation of results reported in different works. Moreover, some conclusions reported in the literature about the anti-persistence of some financial series are shown to be incorrect because of the inappropriate application of the estimation methods.

14.
The existence of memory in financial time series has been extensively studied for several stock markets around the world by means of different approaches. However, fixed-income markets, i.e. those where corporate and sovereign bonds are traded, have been much less studied. We believe that, given the relevance of these markets not only from the investors’ but also from the issuers’ point of view (governments and firms), it is necessary to fill this gap in the literature. In this paper, we study the sovereign market efficiency of thirty bond indices of both developed and emerging countries, using a statistical tool that is innovative in the financial literature: the complexity-entropy causality plane. This representation space allows us to establish an efficiency ranking of different markets and to distinguish different bond market dynamics. We conclude that the classification derived from the complexity-entropy causality plane is consistent with the ratings assigned by the major rating agencies to the sovereign instruments. Additionally, we find a correlation between permutation entropy, economic development, and market size that could be of interest for policy makers and investors.
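A minimal sketch of the Bandt-Pompe permutation entropy that provides one coordinate of the complexity-entropy causality plane; the embedding dimension and the test series are illustrative, and the statistical-complexity coordinate is omitted for brevity.

```python
import numpy as np
from itertools import permutations
from math import factorial, log

def permutation_entropy(x, m=4, tau=1):
    """Normalized Bandt-Pompe permutation entropy (close to 1 = random, low = strongly structured)."""
    x = np.asarray(x, dtype=float)
    patterns = {p: 0 for p in permutations(range(m))}
    for i in range(len(x) - (m - 1) * tau):
        window = x[i: i + m * tau: tau]
        patterns[tuple(np.argsort(window))] += 1          # ordinal pattern of the window
    counts = np.array([c for c in patterns.values() if c > 0], dtype=float)
    probs = counts / counts.sum()
    return float(-(probs * np.log(probs)).sum() / log(factorial(m)))

rng = np.random.default_rng(0)
print("white noise :", permutation_entropy(rng.standard_normal(5000)))       # near 1 (efficient-market-like)
print("sine wave   :", permutation_entropy(np.sin(0.05 * np.arange(5000))))  # much lower (strong structure)
```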

15.
We introduce a technique of time series analysis, potential forecasting, which is based on the dynamical propagation of the probability density of a time series. We employ polynomial coefficients of the orthogonal approximation of the empirical probability distribution and extrapolate them in order to forecast the future probability distribution of the data. The method is tested on artificial data, used for hindcasting observed climate data, and then applied to forecast Arctic sea-ice time series. The proposed methodology completes a framework for ‘potential analysis’ of tipping points which together serves to anticipate, detect, and forecast nonlinear changes, including bifurcations, using several independent techniques of time series analysis. Although applied to climatological series in the present paper, the method is very general and can be used to forecast dynamics in time series of any origin.

16.
Peter Grindrod  Mark Parsons 《Physica A》2011,390(21-22):3970-3981
The plethora of digital communication technologies, and their mass take-up, has resulted in a wealth of interest in social network data collection and analysis in recent years. Within many such networks the interactions are transient, and thus those networks evolve over time. In this paper we introduce a class of models for such networks, using evolving graphs with memory-dependent edges, which may appear and disappear according to their recent history. We consider time-discrete and time-continuous variants of the model, and we study the long-term asymptotic behaviour as a function of the parameters controlling the memory dependence. In particular we show that such networks may continue evolving forever, or else may quench and become static (containing immortal and/or extinct edges). This depends on the existence or otherwise of certain infinite products and series involving age-dependent model parameters, and we show how to differentiate between the alternatives based on a finite set of observations. To test these ideas we show how model parameters may be calibrated from limited samples of time-dependent data, and we apply these concepts to three real networks: summary data on mobile phone use from a developing region; online social-business network data from China; and disaggregated mobile phone communications data from a reality-mining experiment in the US. In each case we show that there is evidence for memory-dependent dynamics, such as that embodied within the class of models proposed here.

17.
For Metropolis Monte Carlo simulations in statistical physics, efficient, easy-to-implement, and unbiased statistical estimators of thermodynamic properties are based on the transition dynamics. Using an Ising model example, we demonstrate (problem-specific) variance reductions compared to conventional histogram estimators. A proof of variance reduction in a microstate limit is presented.

18.
19.
The time series analysis of seismic sequences needs proper methodologies that allow us to capture the main features of the time dynamics of earthquakes. Among these features, the identification of periodicities, along with the quantification of their intensity, represents an important task concerning the detection of regular dynamical behaviours, with clear implications for earthquake prediction. In the present study, we applied three different methods to investigate the time dynamics of the seismic activity of the Northern Caucasus–Azerbaijan part of the Greater Caucasus–Kopet Dag region. We analysed the monthly number of earthquakes which occurred between 1996 and 2012 by means of: (i) the robust estimation of the periodogram, (ii) singular spectrum analysis (SSA), and (iii) the Fisher–Shannon method. Two main significant periodicities are detected: 102 months and 20 months. The first actually corresponds to the long-term variation of the monthly seismic activity of the area, while the second represents the more intense cyclic component. Periodicities of 7 and 30 months are also identified, but with a lower intensity than the 20-month periodicity. The Fisher–Shannon method has revealed that the long-term variation of the series is also characterized by higher organization and a lower degree of disorder. The present study shows how the application of methods from statistical mechanics can contribute to unveiling dynamical features in seismicity.

20.
We demonstrate the resonance-like behaviour of the cardiopulmonary system in healthy people occurring at the natural low-frequency oscillations of 0.1 Hz, which are often visible in the continuous pressure waveform. These oscillations represent the spontaneous oscillatory activity of the vasomotor centre and are sometimes called the Mayer waves. These 10-second rhythms probably couple with forced breathing at the same frequency and cause the observed cardiopulmonary resonance phenomenon. We develop a new method to study this phenomenon, namely the averaged Lomb-Scargle periodogram method, which is shown to be very effective in enhancing common frequencies in a group of different time series and suppressing those which vary between datasets. Using this method we show that in cardiopulmonary resonance the cardiopulmonary system behaves in a very similar way to a simple mechanical or electrical oscillator, i.e. it becomes highly regular and its averaged spectrum exhibits a clear dominant peak and harmonics. If the forcing frequency is higher than 0.1 Hz, the total power and the share of power in the dominant peak and harmonics are lower, and the prominence of the dominant peak and its harmonics greatly diminishes. It is shown that the power contributions from different forcing frequencies follow the resonance curve.
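A hedged sketch of the averaging idea: computing a Lomb-Scargle periodogram for each record on a common frequency grid, normalizing each spectrum, and averaging across the group so that shared frequencies are enhanced and subject-specific ones are suppressed. The synthetic RR-like records, the common 0.1 Hz component, and the normalization are illustrative assumptions.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(2)
freqs = np.linspace(0.02, 0.5, 800)                     # Hz, common grid for the whole group
omega = 2 * np.pi * freqs

avg = np.zeros_like(freqs)
n_subjects = 12
for _ in range(n_subjects):
    t = np.cumsum(rng.uniform(0.7, 1.1, 600))           # irregular beat times (RR-like, ~0.7-1.1 s)
    phase = rng.uniform(0, 2 * np.pi)                   # subject-specific phase of the common rhythm
    y = np.sin(2 * np.pi * 0.1 * t + phase) + rng.standard_normal(t.size)
    p = lombscargle(t, y - y.mean(), omega)
    avg += p / p.sum()                                  # normalize before averaging across subjects

avg /= n_subjects
print("dominant frequency (Hz):", freqs[np.argmax(avg)])   # should sit near the common 0.1 Hz rhythm
```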

