Similar Articles
20 similar articles found
1.
The present paper expands on recent attempts at estimating the parameters of simple interacting-agent models of financial markets [S. Alfarano, T. Lux, F. Wagner, Computational Economics 26, 19 (2005); S. Alfarano, T. Lux, F. Wagner, in Funktionsfähigkeit und Stabilität von Finanzmärkten, edited by W. Franz, H. Ramser, M. Stadler (Mohr Siebeck, Tübingen, 2005), pp. 241–254]. Here we provide additional evidence by (i) investigating a large sample of individual stocks from the Tokyo Stock Exchange, and (ii) comparing results from the baseline noise trader/fundamentalist model of [S. Alfarano, T. Lux, F. Wagner, Computational Economics 26, 19 (2005)] with those obtained from an even simpler version with a preponderance of noise trader behaviour. As it turns out, this somewhat more parsimonious “maximally skewed” variant is often not rejected in favor of the more complex version. We also find that all stocks are dominated by noise trader behaviour, irrespective of whether the data prefer the skewed or the baseline version of our model.

2.
A non-trivial probability structure is evident in the binary data extracted from the up/down price movements of very high frequency data such as tick-by-tick data for USD/JPY. In this paper, we analyze the Sony bank USD/JPY rates, ignoring small deviations from the market price. We show that a similar non-trivial probability structure exists in the Sony bank rate, even though the Sony bank rate has less frequent and larger deviations than tick-by-tick data. However, this probability structure is not found in data sampled from tick-by-tick data at the same rate as the Sony bank rate. The method of generating the Sony bank rate from the market rate therefore has potential for practical use, since it retains the probability structure as the sampling frequency decreases.

3.
We define and study a rather complex market model, inspired by the Santa Fe artificial market and the Minority Game. Agents have different strategies among which they can choose according to their relative profitability, with the possibility of not participating in the market. The price is updated according to the excess demand, and the wealth of the agents is properly accounted for. Only two parameters play a significant role: one describes the impact of trading on the price, and the other describes the propensity of agents to be trend-following or contrarian. We observe three different regimes, depending on the value of these two parameters: an oscillating phase with bubbles and crashes, an intermittent phase, and a stable `rational' market phase. The statistics of price changes in the intermittent phase resembles that of real price changes, with small linear correlations, fat tails and long-range volatility clustering. We discuss how the time dependence of these two parameters spontaneously drives the system into the intermittent region. We analyze quantitatively the temporal correlation of activity in the intermittent phase, and show that the `random time strategy shift' mechanism that we proposed earlier allows one to understand the observed long-ranged correlations. Other mechanisms leading to long-ranged correlations are also reviewed. We discuss several other issues, such as the formation of bubbles and crashes, the influence of transaction costs, and the distribution of agents' wealth.

4.
The present study shows how information on `hidden' market variables affects optimal investment strategies. We take the point of view of two investors, one who has access to the hidden variables and one who only knows the quotes of a given asset. Following Kelly's theory of investment strategies, the Shannon information and the doubling investment rate are quantified for both investors. Thanks to his privileged knowledge, the first investor can follow a better investment strategy. Nevertheless, the second investor can extract some of the hidden information by looking at the past history of the asset variable. Unfortunately, due to the complexity of his strategy, this investor will have computational difficulties when he tries to apply it. He will then follow a simplified strategy, based only on the average sign of the last l quotes of the asset. These results have been tested with Monte Carlo simulations.
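In Kelly's framework the doubling-rate advantage of the informed investor equals the mutual information between the hidden variable and the price move. A minimal sketch of this relation, assuming a hypothetical two-state hidden variable and even-odds binary bets; the probabilities and variable names below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state hidden variable: in state 0 the price rises with
# probability 0.7, in state 1 with probability 0.4 (illustrative numbers).
p_up = np.array([0.7, 0.4])
T = 100_000
state = rng.integers(0, 2, size=T)      # hidden market variable
up = rng.random(T) < p_up[state]        # binary up/down price moves

def doubling_rate(p):
    # Kelly growth rate of an even-odds binary bet with optimal fraction
    # f* = 2p - 1:  W = 1 + p*log2(p) + (1 - p)*log2(1 - p)
    return 1 + p * np.log2(p) + (1 - p) * np.log2(1 - p)

freq = np.bincount(state) / T
W_informed = np.sum(freq * doubling_rate(p_up))  # investor who sees the state
W_naive = doubling_rate(up.mean())               # investor who sees only quotes
# The gap equals the Shannon mutual information I(state; move), in bits.
print(W_informed - W_naive)
```

The simplified strategy discussed in the paper, conditioning on the average sign of the last l quotes, would recover part of this gap at lower computational cost.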

5.
The values of stocks, indices and other assets are examples of stochastic processes with unpredictable dynamics. In this paper, we discuss asymmetries in short-term price movements that cannot be associated with a long-term positive trend. These empirical asymmetries predict that stock index drops are more common on a relatively short time scale than the corresponding rises. We present several empirical examples of such asymmetries. Furthermore, a simple model featuring occasional short periods of synchronized dropping prices for all stocks constituting the index is introduced with the aim of explaining these facts. The collective negative price movements are imagined to be triggered by factors in our society external to the economy, as well as internal ones, that create fear of the future among investors. This is parameterized by a “fear factor” defining the frequency of synchronized events. It is demonstrated that such a simple fear-factor model can reproduce several empirical facts concerning index asymmetries. It is also pointed out that in its simplest form, the model has certain shortcomings.
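A toy reading of such a fear-factor model can be simulated directly. In the sketch below the number of stocks N, the fear factor p, the shock sizes and the equally weighted index are all illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# N stocks over T days; with probability p (the "fear factor") a day is a
# synchronized event on which every stock receives the same negative shock.
N, T, p = 100, 10_000, 0.05
fear_day = rng.random(T) < p
returns = rng.normal(0.0005, 0.01, size=(T, N))           # ordinary days
returns[fear_day] = rng.normal(-0.02, 0.005,
                               size=(fear_day.sum(), 1))  # synchronized drops

index = returns.mean(axis=1)             # equally weighted index return
d = 0.01                                 # probe the asymmetry at level d
print("P(index < -d) =", (index < -d).mean(),
      " P(index > +d) =", (index > d).mean())
```

Because the synchronized days move all stocks together, large index drops occur more often than equally large rises, which is the qualitative asymmetry the model is meant to capture.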

6.
We use a replica approach to deal with portfolio optimization problems. A given risk measure is minimized using empirical estimates of the correlations between asset values. We study the phase transition which happens when the time series is too short with respect to the size of the portfolio. We also study the noise sensitivity of portfolio allocation when this transition is approached. We consider explicitly the cases where the absolute deviation and the conditional value-at-risk are chosen as the risk measure. We show that the replica method can handle a wide range of risk measures and deal with various types of time-series correlations, including realistic ones with volatility clustering.
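The replica computation itself is analytic, but the empirical problem it describes, minimizing conditional value-at-risk over a historical sample, can be sketched as the standard Rockafellar-Uryasev linear program. The synthetic data and parameter choices below are assumptions for illustration:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
N, T, beta = 10, 200, 0.95             # assets, observations, CVaR level
r = rng.normal(0, 0.01, size=(T, N))   # synthetic return history

# Rockafellar-Uryasev LP: minimize t + (1/((1-beta)T)) * sum(u)
# subject to u_i >= -r_i.w - t, u_i >= 0, sum(w) = 1; variables x = [w, t, u].
c = np.concatenate([np.zeros(N), [1.0], np.full(T, 1.0 / ((1 - beta) * T))])
A_ub = np.hstack([-r, -np.ones((T, 1)), -np.eye(T)])   # -r_i.w - t - u_i <= 0
b_ub = np.zeros(T)
A_eq = np.concatenate([np.ones(N), [0.0], np.zeros(T)]).reshape(1, -1)
b_eq = [1.0]
bounds = [(None, None)] * (N + 1) + [(0, None)] * T    # w, t free; u >= 0

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print("minimal in-sample CVaR:", res.fun)
# When T is too small relative to N this LP becomes unbounded, which is the
# feasibility transition the replica analysis locates.
```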

7.
A new approach is presented to describe the change in the statistics of the log-return distribution of financial data as a function of the timescale. For this purpose a measure is introduced which quantifies the distance of a considered distribution from a reference distribution. The existence of a small-timescale regime is demonstrated, which exhibits different properties compared to the normal-timescale regime for timescales larger than one minute. This regime seems to be universal for individual stocks. It is shown that the existence of this small-timescale regime does not depend on the particular choice of the distance measure or the reference distribution. These findings have important implications for risk analysis, in particular for the probability of extreme events.
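One concrete instance of such a distance (the abstract stresses that the result is robust to the choice) is the Kullback-Leibler divergence of the standardized return distribution from a Gaussian reference. A sketch on synthetic heavy-tailed returns; the Student-t data and the aggregation scheme are assumptions for illustration:

```python
import numpy as np

def kl_to_gaussian(returns, bins=100):
    """Kullback-Leibler distance of the empirical return distribution from a
    Gaussian reference (one possible choice of measure and reference)."""
    r = (returns - returns.mean()) / returns.std()
    hist, edges = np.histogram(r, bins=bins, range=(-10, 10), density=True)
    centers = 0.5 * (edges[1:] + edges[:-1])
    width = edges[1] - edges[0]
    gauss = np.exp(-centers**2 / 2) / np.sqrt(2 * np.pi)
    mask = hist > 0
    return np.sum(hist[mask] * np.log(hist[mask] / gauss[mask]) * width)

rng = np.random.default_rng(3)
logret = rng.standard_t(df=3, size=2**20)   # stand-in for 1-min log-returns
for dt in (1, 4, 16, 64, 256):              # aggregate to longer timescales
    agg = logret[: len(logret) // dt * dt].reshape(-1, dt).sum(axis=1)
    print(dt, round(kl_to_gaussian(agg), 4))
```

The distance shrinks as the timescale grows, and plotting it against dt is how a distinct small-timescale regime would show up.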

8.
Avalanches, or avalanche-like events, are often observed in the dynamical behaviour of many complex systems, spanning from solar flaring to the Earth's crust dynamics and from traffic flows to financial markets. Self-organized criticality (SOC) is one of the most popular theories able to explain this intermittent charge/discharge behaviour. Despite a large amount of theoretical work, empirical tests for SOC are still in their infancy. In the present paper we address the common problem of revealing SOC from a simple time series without having much information about the underlying system. As a working example we use a modified version of the multifractal random walk originally proposed as a model for stock market dynamics. The study reveals, despite the lack of the typical ingredients of SOC, an avalanche-like dynamics similar to that of many physical systems. While, on one hand, the results confirm the relevance of cascade models in representing turbulent-like phenomena, on the other, they also raise questions about the current reliability of SOC inference from time series analysis.
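A common way to extract avalanches from a bare time series is to take contiguous excursions of the activity above a threshold and record their sizes; this is one plausible operationalization, not necessarily the paper's exact definition, and the heavy-tailed placeholder data below stands in for the multifractal random walk:

```python
import numpy as np

def avalanche_sizes(signal, threshold):
    """Sizes of contiguous excursions of |signal| above threshold."""
    active = np.abs(signal) > threshold
    edges = np.diff(active.astype(int))      # +1 at run starts, -1 at run ends
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    if active[0]:
        starts = np.r_[0, starts]
    if active[-1]:
        ends = np.r_[ends, len(active)]
    return np.array([np.abs(signal[a:b]).sum() for a, b in zip(starts, ends)])

rng = np.random.default_rng(4)
x = rng.standard_t(df=3, size=100_000)       # placeholder for MRW increments
sizes = avalanche_sizes(x, threshold=2.0)
print(len(sizes), sizes.mean())              # inspect the size distribution
```

Power-law-like size and duration distributions obtained this way are the kind of evidence usually cited for SOC, which is exactly why the paper warns that cascade models can mimic them.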

9.
We examine the volatility of an Indian stock market in terms of the correlations of stocks and quantify the volatility using the random matrix approach. First we discuss trends observed in the pattern of stock prices on the Bombay Stock Exchange for the three-year period 2000–2002. Random matrix analysis is then applied to study the relationship between the coupling of stocks and volatility. The study uses daily returns of 70 stocks for successive time windows of length 85 days for the year 2001. We compare the properties of the matrix C of correlations between price fluctuations in time regimes characterized by different volatilities. Our analyses reveal that (i) the largest (deviating) eigenvalue of C correlates highly with the volatility of the index, (ii) there is a shift in the distribution of the components of the eigenvector corresponding to the largest eigenvalue across regimes of different volatilities, (iii) the inverse participation ratio for this eigenvector anti-correlates significantly with the market fluctuations, and finally, (iv) this eigenvector of C can be used to set up a Correlation Index, CI, whose temporal evolution is significantly correlated with the volatility of the overall market index.
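The quantities in (i)-(iii) are straightforward to compute from one window of returns. A minimal sketch with the window geometry used in the study (70 stocks, 85 days) but placeholder data; the projection used for the Correlation Index is an assumption in the spirit of the paper:

```python
import numpy as np

rng = np.random.default_rng(5)
T_win, N = 85, 70                        # window length and number of stocks
returns = rng.normal(size=(T_win, N))    # placeholder for daily returns

# Equal-time correlation matrix of normalized returns
R = (returns - returns.mean(0)) / returns.std(0)
C = R.T @ R / T_win

eigvals, eigvecs = np.linalg.eigh(C)     # eigenvalues in ascending order
u_max = eigvecs[:, -1]                   # eigenvector of the largest eigenvalue
ipr = np.sum(u_max**4)                   # inverse participation ratio
print("largest eigenvalue:", eigvals[-1], " IPR:", ipr)

ci = R @ u_max   # one possible Correlation Index time series for this window
```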

10.
11.
In this paper an analysis of the Stirling cycle in thermoeconomic terms is developed using the entropy generation. In the thermoeconomic optimization of an irreversible Stirling heat pump cycle, the function F has been introduced to evaluate the optimum ratio of the higher and lower source temperatures in the cycle: this ratio represents the value which optimizes the cycle itself. The variation of the function F is proportional to the variation of the entropy generation; the maxima and minima of F were evaluated in a previous paper without giving the physical foundation of the method. Here we investigate the groundwork of this approach: studying the upper and lower limits of the function F allows one to determine the cycle stability and the optimization conditions. The optimization consists in obtaining the best COP at the least cost. The principle of maximum variation of the entropy generation thus becomes the analytic foundation of the optimization method in the thermoeconomic analysis of an irreversible Stirling heat pump cycle.

12.
Simple stochastic exchange games are based on the random allocation of finite resources. These games are Markov chains that can be studied either analytically or by Monte Carlo simulation. In particular, the equilibrium distribution can be derived either by direct diagonalization of the transition matrix, or by using the detailed balance equation, or from Monte Carlo estimates. In this paper, these methods are introduced and applied to the Bennati-Dragulescu-Yakovenko (BDY) game. The exact analysis shows that the statistical-mechanical analogies used in the previous literature have to be revised. An erratum to this article is available.
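A Monte Carlo sketch of the BDY game as it is usually described: at each step a randomly chosen agent, provided it is not broke, passes one coin to another randomly chosen agent. The agent count, money supply and number of steps are illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)
N, M, steps = 100, 1000, 500_000     # agents, total money, update steps
wealth = np.full(N, M // N)

for _ in range(steps):
    i, j = rng.integers(N), rng.integers(N)
    if wealth[i] > 0 and i != j:
        wealth[i] -= 1               # giver loses one coin
        wealth[j] += 1               # receiver gains it

# This Monte Carlo equilibrium is what the paper compares against the exact
# results from diagonalization of the transition matrix and detailed balance.
print("mean:", wealth.mean(), " std:", wealth.std())
```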

13.
In the present work we investigate the multiscale nature of the correlations of high-frequency data (1 min) in different futures markets over a period of two years, starting on 1 January 2003 and ending on 31 December 2004. In particular, by using the concept of a local Hurst exponent, we point out how the behaviour of this parameter, usually considered a benchmark for persistency/antipersistency recognition in time series, is largely time-scale dependent in the market context. These findings are a direct consequence of the intrinsic complexity of a system where trading strategies are scale-adaptive. Moreover, our analysis points out different regimes in the dynamical behaviour of the market indices under consideration.
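A local Hurst exponent can be estimated on sliding windows; the crude increment-scaling estimator, window size and step below are illustrative choices, not necessarily the paper's:

```python
import numpy as np

def hurst_exponent(x, lags=range(2, 20)):
    """Crude Hurst estimate from std(x[t+lag] - x[t]) ~ lag**H,
    one of several common estimators."""
    lags = np.asarray(list(lags))
    tau = [np.std(x[lag:] - x[:-lag]) for lag in lags]
    H, _ = np.polyfit(np.log(lags), np.log(tau), 1)
    return H

rng = np.random.default_rng(7)
logp = np.cumsum(rng.normal(size=50_000))    # placeholder log-price walk
window, step = 2000, 500
local_H = [hurst_exponent(logp[s:s + window])
           for s in range(0, len(logp) - window, step)]
print(np.mean(local_H))                      # ~0.5 for a plain random walk
```

Repeating the estimate with different lag ranges probes exactly the time-scale dependence the paper emphasizes.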

14.
Many systems of different nature exhibit scale-free behavior; economic systems with power-law wealth distributions are one example. To better understand the workings behind this complexity, we undertook an experiment recording the interactions between market participants. A Web server was set up to administer the exchange of futures contracts whose liquidation prices were coupled to event outcomes. After free registration, participants started trading to compete for money prizes upon maturity of the futures contracts at the end of the experiment. The evolving `cash' flow network was reconstructed from the transactions between players. We show that the network topology is hierarchical, disassortative and small-world, with a power-law exponent of 1.02±0.09 in the degree distribution after an exponential-decay correction. The small-world property emerged early in the experiment, while the number of participants was still small. We also show power-law-like distributions of the net incomes and inter-transaction time intervals. Big winners and losers are associated with high degree, high betweenness centrality, low clustering coefficient and low degree-correlation. We identify communities in the network as groups of the like-minded. The distribution of community sizes is shown to be power-law distributed with an exponent of 1.19±0.16.
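The network diagnostics named here (degree distribution, assortativity, clustering, betweenness) are standard once the transaction network is rebuilt. A sketch with a hypothetical edge list standing in for the reconstructed cash-flow network:

```python
import networkx as nx
import numpy as np

# Hypothetical (payer, payee) pairs; a skewed sampler mimics hub formation.
rng = np.random.default_rng(8)
edges = [(int(rng.zipf(2.0)) % 50, int(rng.zipf(2.0)) % 50)
         for _ in range(2000)]
G = nx.Graph(edges)
G.remove_edges_from(nx.selfloop_edges(G))

degrees = np.array([d for _, d in G.degree()])
print("max degree:", degrees.max())
print("assortativity:",
      nx.degree_assortativity_coefficient(G))   # negative = disassortative
print("avg clustering:", nx.average_clustering(G))
bc = nx.betweenness_centrality(G)   # winners/losers correlate with high values
```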

15.
We investigate scaling and memory effects in the return intervals between price volatilities above a certain threshold q for the Japanese stock market, using daily and intraday data sets. We find that the distribution of return intervals can be approximated by a scaling function that depends only on the ratio between the return interval τ and its mean 〈τ〉. We also find memory effects, such that a large (or small) return interval tends to follow a large (or small) interval, by investigating the conditional distribution and the mean return interval. The results are similar to previous studies of other markets and indicate that similar statistical features appear in different financial markets. We also compare results for the periods before and after the big crash at the end of 1989. We find that the scaling and memory effects of the return intervals show similar features, although the statistical properties of the returns are different.
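Return intervals and the scaling variable τ/〈τ〉 are simple to compute. A sketch on a synthetic volatility series; the placeholder data and the quantile-based thresholds are assumptions:

```python
import numpy as np

def return_intervals(volatility, q):
    """Waiting times between successive events with volatility above q."""
    events = np.where(volatility > q)[0]
    return np.diff(events)

rng = np.random.default_rng(9)
vol = np.abs(rng.standard_t(df=3, size=100_000))  # placeholder volatility
# The scaling claim: the distribution of tau / <tau> collapses across q.
for p in (0.95, 0.98, 0.99):
    tau = return_intervals(vol, np.quantile(vol, p))
    scaled = tau / tau.mean()
    print(p, scaled.std())
```

The memory effect would be probed by conditioning each interval on the size of the preceding one and comparing the conditional means.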

16.
This paper intends to meet recent calls for more rigorous statistical methodology within the econophysics literature. To this end, we consider an econometric approach to investigating the outcomes of the log-periodic model of price movements, which has been widely used to forecast financial crashes. In order to accomplish reliable statistical inference for the unknown parameters, we incorporate an autoregressive dynamic and a conditional heteroskedasticity structure in the error term of the original model, yielding the log-periodic-AR(1)-GARCH(1,1) model. Both the original and the extended models are fitted to financial indices of the U.S. market, namely the S&P 500 and NASDAQ. Our analysis reveals two main points: (i) the log-periodic-AR(1)-GARCH(1,1) model has residuals with better statistical properties, and (ii) the estimation of the parameter concerning the time of the financial crash is improved.
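The deterministic part of the model is the standard log-periodic price law; the paper's contribution is the AR(1)-GARCH(1,1) error structure around it. A sketch of a plain least-squares fit of that law on synthetic data (the parameter values, bounds and starting point are illustrative, and real fits are known to need careful initialization):

```python
import numpy as np
from scipy.optimize import curve_fit

def log_periodic(t, A, B, C, tc, beta, omega, phi):
    """ln p(t) = A + B*(tc-t)**beta * (1 + C*cos(omega*ln(tc-t) + phi))."""
    dt = tc - t
    return A + B * dt**beta * (1 + C * np.cos(omega * np.log(dt) + phi))

t = np.linspace(0, 9, 500)
true = (10, -1.0, 0.1, 10.0, 0.5, 8.0, 0.0)
y = log_periodic(t, *true) + np.random.default_rng(10).normal(0, 0.01, t.size)

# Bounds keep tc beyond the sample so (tc - t) stays positive during the fit.
popt, _ = curve_fit(log_periodic, t, y,
                    p0=(10, -1, 0.1, 10.5, 0.5, 8, 0),
                    bounds=([0, -5, -1, 9.1, 0.1, 1, -np.pi],
                            [20, 5, 1, 12, 1, 20, np.pi]))
print("estimated crash time tc:", popt[3])
```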

17.
We consider the roughness properties of NYSE (New York Stock Exchange) stock-price fluctuations. The statistical properties of the data are relatively homogeneous within the same day, but the large jumps between different days prevent the extension of the analysis to large times. This leads to intrinsic finite-size effects which alter the apparent Hurst exponent H. We show, by analytical methods, that finite-size effects always lead to an enhancement of H. We then consider the effect of fat tails on the analysis of the roughness and show that the finite-size effects are strongly enhanced by the fat tails. The non-stationarity of the stock-price dynamics also enhances the finite-size effects, which, in principle, can become important even in the asymptotic regime. We then compute the Hurst exponent for a set of NYSE stocks and argue that the interpretation of the value of H is highly ambiguous in view of the above results. Finally, we propose an alternative determination of the roughness in terms of the fluctuations from moving averages with variable characteristic times. This makes it possible to eliminate most of the previous problems and to characterize the roughness in a useful way. In particular, this approach amounts to an automatic elimination of trends at any scale.
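One reading of the moving-average roughness measure: compute the standard deviation of the price's fluctuations around a trailing moving average of characteristic time τ, and read an exponent from how it scales with τ. The alignment convention and placeholder data below are assumptions:

```python
import numpy as np

def ma_roughness(x, tau):
    """Std of fluctuations of x around a trailing moving average of width tau;
    subtracting the local average removes trends at scale tau."""
    kernel = np.ones(tau) / tau
    ma = np.convolve(x, kernel, mode="valid")   # ma[k] = mean of x[k:k+tau]
    fluct = x[tau - 1:] - ma                    # current price vs trailing MA
    return fluct.std()

rng = np.random.default_rng(11)
x = np.cumsum(rng.normal(size=100_000))         # placeholder log-price series
taus = np.array([4, 8, 16, 32, 64, 128, 256])
sigma = np.array([ma_roughness(x, t) for t in taus])
H, _ = np.polyfit(np.log(taus), np.log(sigma), 1)   # sigma(tau) ~ tau**H
print("roughness exponent:", H)                     # ~0.5 for a random walk
```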

18.
Competition has been introduced in electricity markets with the goal of reducing prices and improving efficiency. The basic idea behind this choice is that, in competitive markets, a greater quantity of the good is exchanged at a lower price, leading to higher market efficiency. Electricity markets differ from other commodity markets mainly because of physical constraints, related to the network structure, that may affect market performance. The network on which the economic transactions must be undertaken poses strict physical and operational constraints. Strategic interactions among producers that game the market with the objective of maximizing their producer surplus must be taken into account when modeling competitive electricity markets. The physical constraints specific to electricity markets provide additional opportunities for gaming by the market players. Game theory provides a tool to model such a context. This paper discusses the application of game theory to physically constrained electricity markets, with the goal of providing tools for assessing market performance and pinpointing the critical network constraints that may affect market efficiency. The basic game-theoretic models specifically designed to represent electricity markets are presented. The IEEE 30-bus test system of a constrained electricity market is discussed to show the impact of the network on market performance in the presence of strategic bidding behavior by producers.

19.
The GARCH(p, q) model is a very interesting stochastic process with widespread applications and a central role in empirical finance. The Markovian GARCH(1, 1) model has only three control parameters, and a much-discussed question is how to estimate them from a given financial time series. Besides the maximum-likelihood estimator technique, there is another method which uses the variance, the kurtosis and the autocorrelation time to determine them. We propose here to use the standardized 6th moment. The set of parameters obtained in this way produces a very good probability density function and a much better time autocorrelation function. This is true for both indexes studied: the NYSE Composite and the FTSE 100. The probability of return to the origin is investigated at different time horizons for both Gaussian and Laplacian GARCH models. Although these models show almost identical performance with respect to the final probability density function and the time autocorrelation function, their scaling properties are very different. The Laplacian GARCH model gives a better scaling exponent for the NYSE time series, whereas the Gaussian dynamics fits the FTSE scaling exponent better.
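Moment-matching estimation of this kind rests on being able to compute the standardized moments of a simulated GARCH(1,1) path. A sketch with Gaussian or Laplacian innovations; the parameter values are illustrative (chosen so that the 6th moment exists), not estimates for either index:

```python
import numpy as np

def simulate_garch11(alpha0, alpha1, beta1, T, rng, dist="gaussian"):
    """GARCH(1,1): sigma2_t = alpha0 + alpha1*r_{t-1}**2 + beta1*sigma2_{t-1}."""
    r = np.empty(T)
    sigma2 = alpha0 / (1 - alpha1 - beta1)   # start at unconditional variance
    for t in range(T):
        # unit-variance innovation, Gaussian or Laplacian
        z = rng.laplace(0, 1 / np.sqrt(2)) if dist == "laplace" else rng.normal()
        r[t] = np.sqrt(sigma2) * z
        sigma2 = alpha0 + alpha1 * r[t]**2 + beta1 * sigma2
    return r

rng = np.random.default_rng(12)
r = simulate_garch11(1e-5, 0.09, 0.88, 200_000, rng)
z = (r - r.mean()) / r.std()
print("kurtosis:", np.mean(z**4), " standardized 6th moment:", np.mean(z**6))
```

Matching these sample moments (plus the autocorrelation time) against their market-data counterparts is the estimation idea the abstract describes.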

20.
We study the temporal evolution of three stock markets: the Standard and Poor's 500 index, the Nikkei 225 Stock Average, and the Korea Composite Stock Price Index. We observe that the probability density function of the log-return has a fat tail, but that the tail index has been increasing continuously in recent years. We have also found that the variance of the autocorrelation function, the scaling exponent of the standard deviation, and the statistical complexity decrease, while the entropy density increases, over time. We introduce a modified microscopic spin model and simulate it to confirm these increasing and decreasing tendencies in the statistical quantities. These findings indicate that the three stock markets are becoming more efficient. An erratum to this article is available.
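A tail index of the kind tracked here can be estimated with the Hill estimator, a standard choice (the abstract does not specify the paper's exact method, so this is an assumption):

```python
import numpy as np

def hill_tail_index(returns, k):
    """Hill estimator of the tail exponent from the k largest |returns|."""
    x = np.sort(np.abs(returns))[::-1]          # descending order statistics
    return 1.0 / np.mean(np.log(x[:k] / x[k]))

rng = np.random.default_rng(13)
r = rng.standard_t(df=3, size=100_000)   # synthetic returns, tail index ~3
print(hill_tail_index(r, k=1000))        # should come out close to 3
```

Running the estimator in successive time windows and watching the index drift upward is how the reported increase would be measured.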
