Similar Documents
20 similar documents found.
1.
In the present work we investigate the multiscale nature of the correlations for high-frequency data (1 min) in different futures markets over a period of two years, starting on the 1st of January 2003 and ending on the 31st of December 2004. In particular, by using the concept of the local Hurst exponent, we point out how the behaviour of this parameter, usually considered a benchmark for persistency/antipersistency recognition in time series, is largely time-scale dependent in the market context. These findings are a direct consequence of the intrinsic complexity of a system where trading strategies are scale-adaptive. Moreover, our analysis points out different regimes in the dynamical behaviour of the market indices under consideration.
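As a concrete illustration of the local Hurst exponent, here is a minimal sketch that slides a window along a return series and performs rescaled-range (R/S) analysis inside each window; the window length, step, and scales are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def rs_hurst(x, scales):
    """Estimate H from the slope of log(R/S) versus log(scale)."""
    rs = []
    for s in scales:
        chunks = x[: len(x) // s * s].reshape(-1, s)
        z = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        r = z.max(axis=1) - z.min(axis=1)           # range of cumulated deviations
        sd = chunks.std(axis=1)
        rs.append(np.mean(r[sd > 0] / sd[sd > 0]))  # average rescaled range
    return np.polyfit(np.log(scales), np.log(rs), 1)[0]

def local_hurst(returns, window=512, step=64, scales=(8, 16, 32, 64)):
    """Slide a window over the series and estimate H inside each window."""
    return [rs_hurst(returns[i:i + window], np.array(scales))
            for i in range(0, len(returns) - window, step)]

# Toy usage: Gaussian noise should give local H values close to 0.5.
h = local_hurst(np.random.default_rng(0).normal(size=5000))
```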

2.
We analyze the S&P 500 index data for the 13-year period, from January 1, 1984 to December 31, 1996, with one data point every 10 min. For this database, we study the distribution and clustering of volatility return intervals, which are defined as the time intervals between successive volatilities above a certain threshold q. We find that the long memory in the volatility leads to a clustering of above-median as well as below-median return intervals. In addition, it turns out that the short return intervals form larger clusters than the long return intervals. When comparing the empirical results to the ARMA-FIGARCH and fBm models for volatility, we find that the fBm model predicts scaling better than the ARMA-FIGARCH model, which is consistent with the argument that both ARMA-FIGARCH and fBm capture the long-term dependence in return intervals to a certain extent, but only fBm accounts for the scaling. We perform Student's t-test to compare the empirical data with the shuffled records, ARMA-FIGARCH and fBm. We analyze separately the clusters of above-median return intervals and the clusters of below-median return intervals for different thresholds q. We find that the empirical data are statistically different from the shuffled data for all thresholds q. Our results also suggest that the ARMA-FIGARCH model is statistically different from the S&P 500 for intermediate q for both above-median and below-median clusters, while fBm is statistically different from the S&P 500 for small and large q for above-median clusters and for small q for below-median clusters. Neither model can fully explain the entire regime of q studied.
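The mechanics of this analysis are easy to reproduce. The sketch below extracts return intervals above a threshold, collects clusters of consecutive below-median ("short") intervals, and compares them against a shuffled surrogate with Student's t-test; the fat-tailed toy data and the 95% threshold are placeholder assumptions, not the paper's S&P 500 set.

```python
import numpy as np
from scipy import stats

def return_intervals(volatility, q):
    """Times between successive volatilities above threshold q."""
    hits = np.flatnonzero(volatility > q)
    return np.diff(hits)

def run_lengths(mask):
    """Lengths of consecutive True runs in a boolean array (cluster sizes)."""
    sizes, run = [], 0
    for flag in mask:
        if flag:
            run += 1
        elif run:
            sizes.append(run)
            run = 0
    if run:
        sizes.append(run)
    return np.array(sizes)

rng = np.random.default_rng(1)
vol = np.abs(rng.standard_t(df=3, size=20000))        # fat-tailed toy volatility
tau = return_intervals(vol, q=np.quantile(vol, 0.95))
clusters = run_lengths(tau < np.median(tau))          # clusters of short intervals
shuffled = rng.permutation(tau)
clusters_sh = run_lengths(shuffled < np.median(shuffled))
t, p = stats.ttest_ind(clusters, clusters_sh, equal_var=False)
```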

3.
We investigate the planar maximally filtered graphs of the portfolio of the 300 most capitalized stocks traded at the New York Stock Exchange during the time period 2001–2003. Topological properties such as the average length of shortest paths, the betweenness and the degree are computed on different planar maximally filtered graphs generated by sampling the returns at different time horizons ranging from 5 min up to one trading day. This analysis confirms that the selected stocks compose a hierarchical system that progressively structures itself as the sampling time horizon increases. Finally, a cluster formation, associated with economic sectors, is quantitatively investigated.
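A planar maximally filtered graph (PMFG) keeps the strongest correlations that can be drawn without edge crossings, ending with exactly 3(N-2) edges. Below is a hedged sketch using networkx's planarity test, with toy random correlations standing in for the 300-stock NYSE data.

```python
import numpy as np
import networkx as nx

def pmfg(corr):
    """Insert edges by decreasing correlation, keeping the graph planar."""
    n = corr.shape[0]
    edges = sorted(((corr[i, j], i, j) for i in range(n) for j in range(i + 1, n)),
                   reverse=True)
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for w, i, j in edges:
        g.add_edge(i, j, weight=w)
        if not nx.check_planarity(g)[0]:
            g.remove_edge(i, j)                    # keeping it would break planarity
        if g.number_of_edges() == 3 * (n - 2):     # a PMFG has exactly 3(n-2) edges
            break
    return g

rng = np.random.default_rng(0)
r = rng.normal(size=(250, 30))                     # toy returns: 250 days, 30 stocks
g = pmfg(np.corrcoef(r, rowvar=False))
print(nx.average_shortest_path_length(g), nx.degree_histogram(g))
```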

4.
We present a general method to detect and extract from a finite time sample statistically meaningful correlations between input and output variables of large dimensionality. Our central result is derived from the theory of free random matrices, and gives an explicit expression for the interval where singular values are expected in the absence of any true correlations between the variables under study. Our result can be seen as the natural generalization of the Marčenko-Pastur distribution to the case of rectangular correlation matrices. We illustrate the usefulness of our method on a set of macroeconomic time series.
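The object under study is the spectrum of singular values of the rectangular cross-correlation matrix between inputs and outputs. The sketch below computes that spectrum and, instead of the paper's analytic free-random-matrix bound, builds a numerical null band by shuffling the time axis; the sizes and data are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
T, n, m = 500, 20, 30
X = rng.normal(size=(T, n))                        # toy input variables
Y = rng.normal(size=(T, m))                        # toy output variables
X = (X - X.mean(0)) / X.std(0)                     # standardize columns
Y = (Y - Y.mean(0)) / Y.std(0)

G = X.T @ Y / T                                    # rectangular cross-correlation
s = np.linalg.svd(G, compute_uv=False)

# Null: destroy any true input/output correlation by shuffling time rows.
null = np.linalg.svd(X[rng.permutation(T)].T @ Y / T, compute_uv=False)
print(s.max(), null.max())  # singular values above the null band are candidates
```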

5.
We examine the volatility of the Indian stock market in terms of the correlation of stocks and quantify the volatility using the random matrix approach. First we discuss trends observed in the pattern of stock prices in the Bombay Stock Exchange for the three-year period 2000–2002. Random matrix analysis is then applied to study the relationship between the coupling of stocks and volatility. The study uses daily returns of 70 stocks for successive time windows of length 85 days for the year 2001. We compare the properties of the matrix C of correlations between price fluctuations in time regimes characterized by different volatilities. Our analyses reveal that (i) the largest (deviating) eigenvalue of C correlates highly with the volatility of the index, (ii) there is a shift in the distribution of the components of the eigenvector corresponding to the largest eigenvalue across regimes of different volatilities, (iii) the inverse participation ratio for this eigenvector anti-correlates significantly with the market fluctuations, and finally (iv) this eigenvector of C can be used to set up a Correlation Index, CI, whose temporal evolution is significantly correlated with the volatility of the overall market index.
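The two central diagnostics here, the largest eigenvalue of C and the inverse participation ratio (IPR) of its eigenvector, are a few lines of linear algebra. A minimal sketch on toy data with a planted common "market mode" (the 85-day, 70-stock shape mimics the study; everything else is an assumption):

```python
import numpy as np

def rmt_diagnostics(returns):
    c = np.corrcoef(returns, rowvar=False)
    vals, vecs = np.linalg.eigh(c)                 # eigenvalues in ascending order
    u = vecs[:, -1]                                # eigenvector of largest eigenvalue
    ipr = np.sum(u ** 4)                           # IPR: 1/N if fully delocalized
    return vals[-1], ipr, u

rng = np.random.default_rng(2)
market = rng.normal(size=(85, 1))                  # common "market mode"
r = 0.4 * market + rng.normal(size=(85, 70))       # 85 days x 70 coupled stocks
lam, ipr, u = rmt_diagnostics(r)
print(lam, ipr)                                    # lam sits well above the bulk
```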

6.
The world currency network constitutes one of the most complex structures associated with contemporary civilization. As a step towards quantifying its characteristics we study the cross-correlations in changes of the daily foreign exchange rates within a basket of 60 currencies in the period December 1998–May 2005. This dynamics turns out to be dominated by one outstanding eigenvalue of the correlation matrix. The magnitude of this eigenvalue, however, depends crucially on which currency is used as the base currency for the remaining ones. It is most prominent from the perspective of a peripheral currency. This largest eigenvalue is seen to systematically decrease, and the structure of correlations thus to become more heterogeneous, when more significant currencies are used as reference. An extreme case in this latter respect is the USD in the period considered. Besides providing further insight into the subtle nature of complexity, these observations point to a formal procedure that can in general be used for the practical purpose of measuring the relative significance of currencies on various time horizons.
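The base-currency dependence comes from how cross rates are formed: the rate of currency i in base B is (i/USD)/(B/USD). A hedged sketch with synthetic rates in place of the 60-currency data, recomputing the largest correlation eigenvalue for each choice of base:

```python
import numpy as np

def largest_eigenvalue(usd_rates, base):
    """usd_rates: T x N prices of each currency in USD; column `base` is the
    reference. Cross rates follow from ratios: X_i/B = (X_i/USD)/(B/USD)."""
    cross = usd_rates / usd_rates[:, [base]]
    cross = np.delete(cross, base, axis=1)         # drop the trivial unit column
    ret = np.diff(np.log(cross), axis=0)
    return np.linalg.eigvalsh(np.corrcoef(ret, rowvar=False))[-1]

rng = np.random.default_rng(3)
common = rng.normal(0, 0.01, size=(1000, 1))       # shared factor across currencies
rates = np.exp(np.cumsum(common + rng.normal(0, 0.01, size=(1000, 12)), axis=0))
print([round(largest_eigenvalue(rates, b), 2) for b in range(12)])
```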

7.
We discuss recent results concerning statistical regularities in the return intervals of volatility in financial markets. In particular, we show how the analysis of volatility return intervals, defined as the time between two volatilities larger than a given threshold, can help to reach a better understanding of the behavior of financial time series. We find scaling in the distribution of return intervals for thresholds ranging over a factor of 25, from 0.6 to 15 standard deviations, and also for various time windows from one minute up to 390 min (an entire trading day). Moreover, these results are universal for different stocks, commodities, interest rates as well as currencies. We also analyze the memory in the return intervals, which relates to the memory in the volatility, and find two scaling regimes, ℓ < ℓ* with α1 = 0.64 ± 0.02 and ℓ > ℓ* with α2 = 0.92 ± 0.04; these exponent values are similar to the results of Liu et al. for the volatility. As an application, we use the scaling and memory properties of the return intervals to suggest a possibly useful method for estimating risk.
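The scaling claim is that once intervals are rescaled by their mean, their distributions for different thresholds collapse onto a single curve. A minimal sketch of that collapse test (the thresholds, bins, and memoryless toy signal are assumptions):

```python
import numpy as np

def scaled_interval_hist(vol, q, bins):
    """Histogram of return intervals rescaled by their mean, for quantile q."""
    hits = np.flatnonzero(vol > np.quantile(vol, q))
    tau = np.diff(hits)
    return np.histogram(tau / tau.mean(), bins=bins, density=True)[0]

rng = np.random.default_rng(4)
vol = np.abs(rng.standard_t(df=3, size=200000))    # fat-tailed toy volatility
bins = np.linspace(0, 8, 25)
curves = {q: scaled_interval_hist(vol, q, bins) for q in (0.90, 0.95, 0.99)}
# Scaling holds if the three rescaled histograms lie close to one another.
```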

8.
We investigate scaling and memory effects in return intervals between price volatilities above a certain threshold q for the Japanese stock market, using daily and intraday data sets. We find that the distribution of return intervals can be approximated by a scaling function that depends only on the ratio between the return interval τ and its mean 〈τ〉. We also find memory effects such that a large (or small) return interval tends to follow a large (or small) interval, by investigating the conditional distribution and the mean return interval. The results are similar to previous studies of other markets and indicate that similar statistical features appear in different financial markets. We also compare our results for the periods before and after the big crash at the end of 1989. We find that the scaling and memory effects of the return intervals show similar features, although the statistical properties of the returns are different.
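The memory effect can be probed by conditioning the mean interval on the size of the preceding one. A short sketch (the threshold of 2.5 and the i.i.d. toy series are assumptions; with genuine memory, intervals following short intervals are shorter on average):

```python
import numpy as np

def conditional_means(tau):
    """Mean interval conditioned on whether the previous one was below median."""
    prev_short = tau[:-1] < np.median(tau)
    return tau[1:][prev_short].mean(), tau[1:][~prev_short].mean()

rng = np.random.default_rng(5)
hits = np.flatnonzero(np.abs(rng.standard_t(df=3, size=100000)) > 2.5)
tau = np.diff(hits)
after_short, after_long = conditional_means(tau)
# With memory, after_short < after_long; i.i.d. surrogates give equal means.
print(after_short, after_long)
```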

9.
We consider the roughness properties of NYSE (New York Stock Exchange) stock-price fluctuations. The statistical properties of the data are relatively homogeneous within the same day, but the large jumps between different days prevent the extension of the analysis to large times. This leads to intrinsic finite-size effects which alter the apparent Hurst (H) exponent. We show, by analytical methods, that finite-size effects always lead to an enhancement of H. We then consider the effect of fat tails on the analysis of the roughness and show that the finite-size effects are strongly enhanced by the fat tails. The non-stationarity of the stock price dynamics also enhances the finite-size effects which, in principle, can become important even in the asymptotic regime. We then compute the Hurst exponent for a set of stocks of the NYSE and argue that the interpretation of the value of H is highly ambiguous in view of the above results. Finally, we propose an alternative determination of the roughness in terms of the fluctuations from moving averages with variable characteristic times. This permits us to eliminate most of the previous problems and to characterize the roughness in a useful way. In particular, this approach corresponds to the automatic elimination of trends at any scale.
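The proposed measure is closely related to what is now called detrending-moving-average analysis: the standard deviation of the price around a moving average of window n scales as n^H. A minimal sketch under that reading (the window choices are illustrative):

```python
import numpy as np

def dma_hurst(x, windows=(4, 8, 16, 32, 64, 128)):
    """H from the scaling of fluctuations around moving averages of window n."""
    sig = []
    for n in windows:
        ma = np.convolve(x, np.ones(n) / n, mode="valid")  # trailing moving average
        sig.append(np.sqrt(np.mean((x[n - 1:] - ma) ** 2)))
    return np.polyfit(np.log(windows), np.log(sig), 1)[0]

# Toy check: a Brownian path should give H close to 0.5.
walk = np.cumsum(np.random.default_rng(6).normal(size=20000))
print(dma_hurst(walk))
```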

10.
We explore the deviations from efficiency in the returns and volatility returns of Latin-American market indices. Two different approaches are considered. The dynamics of the Hurst exponent is obtained via a wavelet rolling-sample approach, quantifying the degree of long memory exhibited by the stock market indices under analysis. On the other hand, the Tsallis q entropic index is measured in order to take into account the deviations from the Gaussian hypothesis. Different dynamic rankings of inefficiency are obtained, each of which reflects a different source of inefficiency. Comparing with the results obtained for a developed market (US), we confirm a similar degree of long-range dependence for our emerging markets. Moreover, we show that the inefficiency in the Latin-American countries comes principally from the non-Gaussian form of the probability distributions.
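One simple way to extract a Tsallis q index is to fit a q-Gaussian to the histogram of standardized returns, with q → 1 recovering the Gaussian. A hedged sketch of such a fit (the estimator, initial guesses, and bounds are assumptions, not the authors' procedure; a Student-t with 4 degrees of freedom corresponds to q = 1.4):

```python
import numpy as np
from scipy.optimize import curve_fit

def q_gaussian(x, a, beta, q):
    """Unnormalized q-Gaussian density; reduces to a Gaussian as q -> 1."""
    return a * (1.0 + (q - 1.0) * beta * x ** 2) ** (-1.0 / (q - 1.0))

rng = np.random.default_rng(7)
r = rng.standard_t(df=4, size=100000)              # fat-tailed toy returns
r = (r - r.mean()) / r.std()
hist, edges = np.histogram(r, bins=101, range=(-8, 8), density=True)
mid = 0.5 * (edges[1:] + edges[:-1])
(a, beta, q), _ = curve_fit(q_gaussian, mid, hist, p0=(0.4, 1.0, 1.3),
                            bounds=([0, 1e-3, 1.01], [10, 10, 3]))
print(q)                                           # q > 1 signals fat tails
```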

11.
The present paper expands on recent attempts at estimating the parameters of simple interacting-agent models of financial markets [S. Alfarano, T. Lux, F. Wagner, Computational Economics 26, 19 (2005); S. Alfarano, T. Lux, F. Wagner, in Funktionsfähigkeit und Stabilität von Finanzmärkten, edited by W. Franz, H. Ramser, M. Stadler (Mohr Siebeck, Tübingen, 2005), pp. 241–254]. Here we provide additional evidence by (i) investigating a large sample of individual stocks from the Tokyo Stock Exchange, and (ii) comparing results from the baseline noise trader/fundamentalist model of [S. Alfarano, T. Lux, F. Wagner, Computational Economics 26, 19 (2005)] with those obtained from an even simpler version with a preponderance of noise trader behaviour. As it turns out, this somewhat more parsimonious “maximally skewed” variant is often not rejected in favor of the more complex version. We also find that all stocks are dominated by noise trader behaviour irrespective of whether the data prefer the skewed or the baseline version of our model.
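The noise-trader component of such models is typically a Kirman-style herding process: agents switch between optimistic and pessimistic groups at a rate combining an idiosyncratic term a and a herding term b. A heavily hedged toy simulation of that mechanism (all parameters, and the discrete-time approximation, are assumptions; this is not the estimated Alfarano-Lux-Wagner model):

```python
import numpy as np

def simulate_herding(N=100, a=0.002, b=0.01, steps=50000, seed=8):
    """Sentiment path of N noise traders under idiosyncratic + herding switching."""
    rng = np.random.default_rng(seed)
    n = N // 2                                     # current number of optimists
    path = np.empty(steps)
    for t in range(steps):
        p_up = (N - n) * (a + b * n) / N           # pessimist -> optimist
        p_dn = n * (a + b * (N - n)) / N           # optimist -> pessimist
        u = rng.random()
        n += (u < p_up) - (p_up <= u < p_up + p_dn)
        path[t] = (2 * n - N) / N                  # sentiment in [-1, 1]
    return path

x = simulate_herding()
# With a < b the sentiment lingers in extended bullish/bearish phases (bimodality).
```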

12.
Competition has been introduced in electricity markets with the goal of reducing prices and improving efficiency. The basic idea behind this choice is that, in competitive markets, a greater quantity of the good is exchanged at a lower price, leading to higher market efficiency. Electricity markets differ from other commodity markets mainly because of the physical constraints related to the network structure, which may impact market performance. The network structure of the system on which the economic transactions need to be undertaken poses strict physical and operational constraints. Strategic interactions among producers that game the market with the objective of maximizing their producer surplus must be taken into account when modeling competitive electricity markets. The physical constraints specific to electricity markets provide additional opportunities for gaming by the market players. Game theory provides a tool to model such a context. This paper discusses the application of game theory to physically constrained electricity markets, with the goal of providing tools for assessing market performance and pinpointing the critical network constraints that may impact market efficiency. The basic models of game theory specifically designed to represent electricity markets will be presented. The IEEE 30-bus test system of the constrained electricity market will be discussed to show the network impacts on market performance in the presence of strategic bidding behavior of the producers.
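A generic flavor of such models is a Cournot game among producers with a transmission cap standing in for a network constraint. The toy sketch below (linear demand, two identical producers, a capacity limit on one of them; all numbers invented, and far simpler than an IEEE 30-bus study) shows how congestion raises the equilibrium price:

```python
import numpy as np

def cournot(a=100.0, slope=1.0, c=(10.0, 10.0), cap1=np.inf):
    """Best-response iteration for price P = a - slope*(q1+q2), costs c_i."""
    q1 = q2 = 0.0
    for _ in range(200):                           # contraction: converges to Nash
        q1 = min(max((a - c[0] - slope * q2) / (2 * slope), 0.0), cap1)
        q2 = max((a - c[1] - slope * q1) / (2 * slope), 0.0)
    price = a - slope * (q1 + q2)
    return q1, q2, price

print(cournot())            # unconstrained: q1 = q2 = 30, price = 40
print(cournot(cap1=15.0))   # congestion on producer 1 raises the price to 47.5
```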

13.
The values of stocks, indices and other assets are examples of stochastic processes with unpredictable dynamics. In this paper, we discuss asymmetries in short-term price movements that cannot be associated with a long-term positive trend. These empirical asymmetries predict that stock index drops are more common on a relatively short time scale than the corresponding rises. We present several empirical examples of such asymmetries. Furthermore, a simple model featuring occasional short periods of synchronized dropping prices for all stocks constituting the index is introduced with the aim of explaining these facts. The collective negative price movements are imagined to be triggered by external factors in our society, as well as factors internal to the economy, that create fear of the future among investors. This is parameterized by a “fear factor” defining the frequency of synchronized events. It is demonstrated that such a simple fear-factor model can reproduce several empirical facts concerning index asymmetries. It is also pointed out that, in its simplest form, the model has certain shortcomings.
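The mechanism is easy to simulate: on most days stocks move independently, but with a small probability p (the fear factor) all of them drop together. A minimal sketch with invented parameters:

```python
import numpy as np

def fear_factor_index(n_stocks=30, days=10000, p=0.05, drop=-0.02, seed=9):
    rng = np.random.default_rng(seed)
    moves = rng.normal(0.0005, 0.01, size=(days, n_stocks))   # independent days
    fear = rng.random(days) < p                               # synchronized drops
    moves[fear] = drop + rng.normal(0, 0.002, size=(fear.sum(), n_stocks))
    return moves.mean(axis=1)                                 # equal-weighted index

r = fear_factor_index()
# Asymmetry: large negative index moves occur far more often than large gains.
print((r < -0.015).mean(), (r > 0.015).mean())
```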

14.
In this paper, we quantitatively investigate the properties of a statistical ensemble of stock prices. We focus attention on the relative price defined as X(t) = S(t)/S(0), where S(0) is the stock price at the onset time of the bubble. We selected approximately 3200 stocks traded on the Japanese Stock Exchange, and formed a statistical ensemble of daily relative prices for each trading day in the 3-year period from January 4, 1999 to December 28, 2001, corresponding to the period in which the internet bubble formed and crashed in the Japanese stock market. We found that the upper tail of the complementary cumulative distribution function of the ensemble of relative prices is well described by a power-law distribution, P(S > x) ∼ x^(−α), with an exponent α that moves over time. Furthermore, we found that as the power-law exponent α approached two, the bubble burst. It is reasonable to suppose that this indicates the internet bubble was about to burst.
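The tail exponent of such a distribution is commonly tracked with a Hill estimator over the upper order statistics. A short sketch on synthetic Pareto data (the 5% tail fraction is an assumption; the estimator should recover the exponent 3 of the toy sample):

```python
import numpy as np

def hill_alpha(sample, tail_frac=0.05):
    """Hill estimator of the power-law exponent of the upper tail."""
    x = np.sort(sample)
    k = max(int(tail_frac * len(x)), 10)           # number of tail order statistics
    tail = x[-k:]
    return k / np.sum(np.log(tail / tail[0]))      # tail[0] acts as the threshold

rng = np.random.default_rng(10)
X = rng.pareto(3.0, size=5000) + 1.0               # exact power law, alpha = 3
print(hill_alpha(X))                               # should be close to 3
```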

15.
A non-trivial probability structure is evident in the binary data extracted from the up/down price movements of very high frequency data such as tick-by-tick data for USD/JPY. In this paper, we analyze the Sony bank USD/JPY rates, ignoring the small deviations from the market price. We then show there is a similar non-trivial probability structure in the Sony bank rate, even though the Sony bank rate is updated less frequently and deviates more than the tick-by-tick data. However, this probability structure is not found in data sampled from the tick-by-tick data at the same rate as the Sony bank rate. Therefore, the method of generating the Sony bank rate from the market rate has potential for practical use, since the method retains the probability structure as the sampling frequency decreases.
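A standard way to expose such a structure is to estimate the conditional probability of an upward move given the pattern of the last k binary moves. A sketch on a memoryless surrogate (real data with structure would show conditional probabilities departing from 0.5; k = 2 is an assumption):

```python
import numpy as np

def conditional_up_probs(updown, k=2):
    """updown: array of 0/1 moves. Returns P(up | last k moves) per pattern."""
    probs = {}
    for pattern in range(2 ** k):
        bits = [(pattern >> i) & 1 for i in reversed(range(k))]
        mask = np.ones(len(updown) - k, dtype=bool)
        for i, b in enumerate(bits):               # match each bit of the pattern
            mask &= updown[i:len(updown) - k + i] == b
        nxt = updown[k:][mask]                     # moves following the pattern
        probs[tuple(bits)] = nxt.mean() if len(nxt) else np.nan
    return probs

rng = np.random.default_rng(11)
moves = (rng.random(100000) < 0.5).astype(int)     # memoryless surrogate
print(conditional_up_probs(moves))                 # all entries near 0.5
```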

16.
The present study shows how information on 'hidden' market variables affects optimal investment strategies. We take the point of view of two investors, one who has access to the hidden variables and one who only knows the quotes of a given asset. Following Kelly's theory on investment strategies, the Shannon information and the doubling investment rate are quantified for both investors. Thanks to his privileged knowledge, the first investor can follow a better investment strategy. Nevertheless, the second investor can extract some of the hidden information by looking at the past history of the asset variable. Unfortunately, due to the complexity of his strategy, this investor will have computational difficulties when he tries to apply it. He will then follow a simplified strategy, based only on the average sign of the last l quotes of the asset. These results have been tested with Monte Carlo simulations.
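For the even-odds binary setting Kelly studied, the optimal bet fraction and the doubling rate have closed forms: betting a fraction 2p − 1 on the more likely outcome yields a growth rate of 1 − H(p) bits per bet, where H is the binary Shannon entropy. A minimal sketch:

```python
import numpy as np

def binary_entropy(p):
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def kelly_doubling_rate(p):
    """Capital growth rate (bits per bet) of a Kelly bettor at even odds."""
    return 1.0 - binary_entropy(p)

for p in (0.5, 0.55, 0.6):
    print(p, 2 * p - 1, kelly_doubling_rate(p))    # edge and doubling rate
```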

17.
We use a replica approach to deal with portfolio optimization problems. A given risk measure is minimized using empirical estimates of the correlations of asset values. We study the phase transition which happens when the time series is too short with respect to the size of the portfolio. We also study the noise sensitivity of portfolio allocation when this transition is approached. We consider explicitly the cases where the absolute deviation and the conditional value-at-risk are chosen as the risk measure. We show how the replica method can be used to study a wide range of risk measures, and to deal with various types of time series correlations, including realistic ones with volatility clustering.
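The transition is easy to see numerically for the simplest risk measure, the variance: as N/T → 1 the sample covariance becomes nearly singular, in-sample risk collapses, and the true out-of-sample risk of the estimated weights blows up. A sketch (variance replaces the paper's absolute-deviation and CVaR measures; i.i.d. unit-variance data is an assumption):

```python
import numpy as np

def min_variance_risk(T, N, seed=12):
    rng = np.random.default_rng(seed)
    r = rng.normal(size=(T, N))                    # true covariance = identity
    cov = np.cov(r, rowvar=False)
    w = np.linalg.solve(cov, np.ones(N))
    w /= w.sum()                                   # fully invested min-var weights
    in_sample = np.sqrt(w @ cov @ w)
    true = np.sqrt(w @ w)                          # true risk under identity cov
    return in_sample, true

for N in (10, 50, 90):                             # T fixed at 100
    print(N, min_variance_risk(100, N))            # gap widens as N/T -> 1
```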

18.
A new approach is presented to describe the change in the statistics of the log-return distribution of financial data as a function of the timescale. For this purpose a measure is introduced which quantifies the distance of a considered distribution from a reference distribution. The existence of a small-timescale regime is demonstrated, which exhibits different properties compared to the normal timescale regime for timescales larger than one minute. This regime seems to be universal for individual stocks. It is shown that the existence of this small-timescale regime does not depend on the particular choice of the distance measure or the reference distribution. These findings have important implications for risk analysis, in particular for the probability of extreme events.
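One concrete instance of such a measure is the Jensen-Shannon distance between the histogram of standardized returns at a given timescale and a Gaussian reference. A hedged sketch (the specific distance, the Gaussian reference, and the toy fat-tailed "one-minute" returns are assumptions; under aggregation the distance shrinks with the timescale):

```python
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import norm

def distance_to_gaussian(returns, bins):
    """Jensen-Shannon distance of the standardized-return histogram from N(0,1)."""
    r = (returns - returns.mean()) / returns.std()
    hist, edges = np.histogram(r, bins=bins, range=(-6, 6), density=True)
    mid = 0.5 * (edges[1:] + edges[:-1])
    return jensenshannon(hist, norm.pdf(mid))

rng = np.random.default_rng(13)
one_min = rng.standard_t(df=3, size=2 ** 18)       # fat-tailed 1-min toy returns
for scale in (1, 4, 16, 64, 256):
    agg = one_min[: len(one_min) // scale * scale].reshape(-1, scale).sum(1)
    print(scale, distance_to_gaussian(agg, bins=121))
```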

19.
Avalanches, or avalanche-like events, are often observed in the dynamical behaviour of many complex systems, which span from solar flaring to the Earth's crust dynamics and from traffic flows to financial markets. Self-organized criticality (SOC) is one of the most popular theories able to explain this intermittent charge/discharge behaviour. Despite a large amount of theoretical work, empirical tests for SOC are still in their infancy. In the present paper we address the common problem of revealing SOC from a simple time series without having much information about the underlying system. As a working example we use a modified version of the multifractal random walk originally proposed as a model for stock market dynamics. The study reveals, despite the lack of the typical ingredients of SOC, an avalanche-like dynamics similar to that of many physical systems. While, on the one hand, the results confirm the relevance of cascade models in representing turbulent-like phenomena, on the other, they raise the question of how reliably SOC can be inferred from time series analysis alone.
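The usual operational definition of an avalanche in a time series is an excursion of the activity above a threshold, characterized by its size (integrated excess) and its duration. A minimal detection sketch (the toy fat-tailed signal and the 90% threshold are assumptions):

```python
import numpy as np

def avalanches(signal, threshold):
    """Sizes (integrated excess) and durations of excursions above threshold."""
    sizes, durations, s, d = [], [], 0.0, 0
    for x in signal:
        if x > threshold:
            s += x - threshold
            d += 1
        elif d:                                    # excursion just ended
            sizes.append(s)
            durations.append(d)
            s, d = 0.0, 0
    return np.array(sizes), np.array(durations)

rng = np.random.default_rng(14)
activity = np.abs(rng.standard_t(df=3, size=100000))
sizes, durations = avalanches(activity, threshold=np.quantile(activity, 0.9))
# A power-law size distribution would be the SOC-like signature to test for.
```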

20.
The question of the optimal portfolio is addressed. The conventional Markowitz portfolio optimisation is discussed and the shortcomings due to non-Gaussian security returns are outlined. A method is proposed to minimise the likelihood of extreme non-Gaussian drawdowns of the portfolio value. The theory is called leptokurtic because it minimises the effects of the “fat tails” of returns. The leptokurtic portfolio theory provides an optimal portfolio for investors who define their risk-aversion as unwillingness to experience sharp drawdowns in asset prices. Two types of risks in asset returns are defined: a fluctuation risk, which has a Gaussian distribution, and a drawdown risk, which deals with the distribution tails. These risks are quantitatively measured by defining the “noise kernel”, an ellipsoidal cloud of points in the space of asset returns. The size of the ellipse is controlled with the threshold parameter: the larger the threshold parameter, the larger the returns that are accepted as normal fluctuations. The return vectors falling into the kernel are used for the calculation of fluctuation risk. Analogously, the data points falling outside the kernel are used for the calculation of drawdown risk. As a result the portfolio optimisation problem becomes three-dimensional: in addition to the return, there are two types of risks involved. The optimal portfolio for drawdown-averse investors is the portfolio minimising variance outside the noise kernel. The theory has been tested with the MSCI North America, Europe and Pacific total return stock indices.
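The kernel split reduces to a Mahalanobis-distance test: return vectors whose distance from the mean is below the threshold lie inside the ellipsoid and feed the fluctuation risk, while the rest feed the drawdown risk. A heavily hedged sketch (the threshold, the data, and the use of two separate covariances are illustrative assumptions, not the authors' exact construction):

```python
import numpy as np

def split_by_kernel(returns, threshold):
    """Split return vectors into inside/outside an ellipsoidal noise kernel."""
    mu = returns.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(returns, rowvar=False))
    d = np.einsum("ij,jk,ik->i", returns - mu, inv_cov, returns - mu)
    inside = d <= threshold ** 2                   # squared Mahalanobis distance
    return returns[inside], returns[~inside]

rng = np.random.default_rng(15)
r = rng.standard_t(df=3, size=(2000, 3)) * 0.01    # fat-tailed 3-asset returns
normal, extreme = split_by_kernel(r, threshold=2.5)
fluct_cov = np.cov(normal, rowvar=False)           # basis for fluctuation risk
draw_cov = np.cov(extreme, rowvar=False)           # basis for drawdown risk
print(len(normal), len(extreme))
```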
