Similar Documents
20 similar documents found.
1.
A non-trivial probability structure is evident in the binary data extracted from the up/down price movements of very high-frequency data such as tick-by-tick USD/JPY quotes. In this paper, we analyze the Sony bank USD/JPY rates, ignoring small deviations from the market price. We show that a similar non-trivial probability structure exists in the Sony bank rate, even though the Sony bank rate is updated less frequently and with larger deviations than tick-by-tick data. However, this probability structure is not found in data sampled from the tick-by-tick series at the same rate as the Sony bank rate. The method of generating the Sony bank rate from the market rate therefore has potential for practical use, since it retains the probability structure as the sampling frequency decreases.
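As a rough illustration of the kind of binary up/down analysis described above, the sketch below (in Python, with a synthetic price series standing in for the Sony bank and tick-by-tick data, which are not reproduced here) extracts the sequence of up/down moves and estimates the conditional probability of an up move given the previous move; deviations from 0.5 would signal a non-trivial probability structure.

```python
import numpy as np

def updown_conditional_probs(prices):
    """Estimate P(up | previous up) and P(up | previous down) from a price series.

    A minimal sketch of a binary up/down analysis; the actual Sony bank and
    tick-by-tick data are not reproduced here, so synthetic prices are used below.
    """
    moves = np.sign(np.diff(prices))
    moves = moves[moves != 0]                      # drop unchanged prices
    prev, curr = moves[:-1], moves[1:]
    p_up_after_up = np.mean(curr[prev > 0] > 0)
    p_up_after_down = np.mean(curr[prev < 0] > 0)
    return p_up_after_up, p_up_after_down

# Illustration with a synthetic random walk (i.i.d. moves should give ~0.5 / ~0.5)
rng = np.random.default_rng(0)
prices = 100 + np.cumsum(rng.normal(size=10_000))
print(updown_conditional_probs(prices))
```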

2.
The present study shows how information on 'hidden' market variables affects optimal investment strategies. We take the point of view of two investors, one who has access to the hidden variables and one who only knows the quotes of a given asset. Following Kelly's theory of investment strategies, the Shannon information and the doubling investment rate are quantified for both investors. Thanks to his privileged knowledge, the first investor can follow a better investment strategy. Nevertheless, the second investor can extract some of the hidden information by looking at the past history of the asset variable. Unfortunately, due to the complexity of his strategy, this investor will run into computational difficulties when he tries to apply it. He will then follow a simplified strategy, based only on the average sign of the last l quotes of the asset. These results have been tested with Monte Carlo simulations.
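For concreteness, the sketch below computes the Kelly doubling rate for a binary, even-odds market, the quantity used above to compare the informed and uninformed investors; the formula is the standard Kelly expression and the probability values are illustrative assumptions, not figures from the paper.

```python
import numpy as np

def doubling_rate(p):
    """Kelly doubling rate for a binary, even-odds market where the investor
    believes the probability of an up move is p. Illustrative only: the paper's
    setting (hidden variables, partial information) is richer than this."""
    q = 1.0 - p
    return p * np.log2(2 * p) + q * np.log2(2 * q)

# An investor who sees a hidden variable giving p = 0.6 vs. one stuck at p = 0.5
print(doubling_rate(0.6))   # > 0: capital grows exponentially
print(doubling_rate(0.5))   # = 0: no informational edge
```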

3.
The values of stocks, indices and other assets are examples of stochastic processes with unpredictable dynamics. In this paper, we discuss asymmetries in short-term price movements that cannot be associated with a long-term positive trend. These empirical asymmetries predict that stock-index drops are more common on relatively short time scales than the corresponding rises. We present several empirical examples of such asymmetries. Furthermore, a simple model featuring occasional short periods of synchronized dropping prices for all stocks constituting the index is introduced with the aim of explaining these facts. The collective negative price movements are imagined to be triggered by factors external to the economy, in society at large, as well as internal to it, which create fear of the future among investors. This is parameterized by a "fear factor" defining the frequency of synchronized events. It is demonstrated that such a simple fear-factor model can reproduce several empirical facts concerning index asymmetries. It is also pointed out that, in its simplest form, the model has certain shortcomings.
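A toy version of such a fear-factor model is easy to simulate. The sketch below (parameter values are illustrative assumptions, not those of the paper) draws independent Gaussian returns for each stock and, with a small probability per day, replaces them with a synchronized drop, which makes the index return distribution negatively skewed.

```python
import numpy as np

def fear_factor_index(n_stocks=100, n_days=5_000, p_fear=0.05,
                      drop=-0.02, sigma=0.01, seed=1):
    """Toy 'fear factor' model: with probability p_fear all stocks drop together
    on a given day, otherwise each stock moves independently. Parameter values
    are illustrative assumptions, not those used in the paper."""
    rng = np.random.default_rng(seed)
    returns = rng.normal(0.0, sigma, size=(n_days, n_stocks))
    fear_days = rng.random(n_days) < p_fear
    returns[fear_days] = drop                      # synchronized negative move
    return returns.mean(axis=1)                    # equal-weight index return

r = fear_factor_index()
skew = ((r - r.mean()) ** 3).mean() / r.std() ** 3
print("index skewness:", skew)                     # negative: drops sharper than rises
```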

4.
The present paper expands on recent attempts at estimating the parameters of simple interacting-agent models of financial markets [S. Alfarano, T. Lux, F. Wagner, Computational Economics 26, 19 (2005); S. Alfarano, T. Lux, F. Wagner, in Funktionsfähigkeit und Stabilität von Finanzmärkten, edited by W. Franz, H. Ramser, M. Stadler (Mohr Siebeck, Tübingen, 2005), pp. 241–254]. Here we provide additional evidence by (i) investigating a large sample of individual stocks from the Tokyo Stock Exchange, and (ii) comparing results from the baseline noise trader/fundamentalist model of [S. Alfarano, T. Lux, F. Wagner, Computational Economics 26, 19 (2005)] with those obtained from an even simpler version with a preponderance of noise-trader behaviour. As it turns out, this somewhat more parsimonious "maximally skewed" variant is often not rejected in favor of the more complex version. We also find that all stocks are dominated by noise-trader behaviour irrespective of whether the data prefer the skewed or the baseline version of our model.

5.
A new approach is presented to describe the change in the statistics of the log-return distribution of financial data as a function of the timescale. For this purpose a measure is introduced which quantifies the distance of a considered distribution from a reference distribution. The existence of a small-timescale regime is demonstrated, which exhibits properties different from those of the normal-timescale regime at timescales larger than one minute. This regime seems to be universal for individual stocks. It is shown that the existence of this small-timescale regime does not depend on the particular choice of the distance measure or the reference distribution. These findings have important implications for risk analysis, in particular for the probability of extreme events.
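One possible concrete choice of such a distance measure is the Kullback-Leibler divergence between the empirical return distribution at a given timescale and a fitted Gaussian. The sketch below implements this choice on synthetic fat-tailed data; the paper's actual measure and reference distribution may differ.

```python
import numpy as np
from scipy.stats import norm, entropy

def distance_to_gaussian(log_prices, tau, bins=51):
    """Kullback-Leibler divergence between the empirical distribution of
    log returns at timescale tau and a standard Gaussian reference.
    One possible distance measure, used here purely as an illustration."""
    r = log_prices[tau:] - log_prices[:-tau]
    r = (r - r.mean()) / r.std()                           # standardise returns
    hist, edges = np.histogram(r, bins=bins, range=(-5, 5), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    width = edges[1] - edges[0]
    p = hist * width + 1e-12                               # empirical probabilities
    q = norm.pdf(centers) * width + 1e-12                  # Gaussian reference
    return entropy(p, q)                                   # D(p || q)

# Distances should shrink as tau grows (aggregational Gaussianity)
rng = np.random.default_rng(2)
x = np.cumsum(rng.standard_t(df=3, size=100_000)) * 1e-4   # fat-tailed toy log-price
print([round(distance_to_gaussian(x, tau), 4) for tau in (1, 10, 100)])
```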

6.
Avalanches, or avalanche-like events, are often observed in the dynamical behaviour of many complex systems, ranging from solar flaring to the Earth's crust dynamics and from traffic flows to financial markets. Self-organized criticality (SOC) is one of the most popular theories able to explain this intermittent charge/discharge behaviour. Despite a large amount of theoretical work, empirical tests for SOC are still in their infancy. In the present paper we address the common problem of revealing SOC from a simple time series without having much information about the underlying system. As a working example we use a modified version of the multifractal random walk originally proposed as a model for stock market dynamics. The study reveals, despite the lack of the typical ingredients of SOC, an avalanche-like dynamics similar to that of many physical systems. While, on one hand, the results confirm the relevance of cascade models in representing turbulent-like phenomena, on the other, they raise questions about how reliably SOC can currently be inferred from time-series analysis.
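A generic way to extract avalanches from a bare time series, in the usual SOC spirit, is to define them as contiguous excursions of the activity above a threshold and record their sizes and durations. The sketch below does this on synthetic fat-tailed "returns"; it is a generic construction, not the paper's exact procedure.

```python
import numpy as np

def avalanches(signal, threshold):
    """Extract avalanche sizes and durations from a 1-d activity signal,
    defined as contiguous excursions of |signal| above a threshold.
    A generic sketch, not the analysis used in the paper."""
    active = np.abs(signal) > threshold
    sizes, durations, size, dur = [], [], 0.0, 0
    for is_active, magnitude in zip(active, np.abs(signal)):
        if is_active:
            size += magnitude
            dur += 1
        elif dur:                                   # excursion just ended
            sizes.append(size)
            durations.append(dur)
            size, dur = 0.0, 0
    if dur:                                         # excursion still open at the end
        sizes.append(size)
        durations.append(dur)
    return np.array(sizes), np.array(durations)

rng = np.random.default_rng(3)
x = rng.standard_t(df=3, size=50_000)               # fat-tailed stand-in for returns
s, d = avalanches(x, threshold=2.0)
print(len(s), s.mean(), d.mean())
```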

7.
In this paper, we quantitatively investigate the properties of a statistical ensemble of stock prices. We focus attention on the relative price defined as X(t) = S(t)/S(0), where S(0) is the stock price at the onset time of the bubble. We selected approximately 3200 stocks traded on the Japanese stock exchange and formed a statistical ensemble of daily relative prices for each trading day in the three-year period from January 4, 1999 to December 28, 2001, corresponding to the period in which the internet bubble formed and crashed in the Japanese stock market. We found that the upper tail of the complementary cumulative distribution function of the ensemble of relative prices in the high-price region is well described by a power-law distribution, P(S > x) ∼ x^{-α}, with an exponent α that moves over time. Furthermore, we found that the bubble burst as the power-law exponent α approached two. It is reasonable to suppose that α approaching two indicates that the internet bubble is about to burst.
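The time-varying tail exponent α can be tracked, for example, with a Hill-type estimator applied to the ensemble of relative prices on each trading day. The sketch below shows such an estimator on a synthetic Pareto sample with α = 2; the estimator choice and tail fraction are assumptions, not taken from the paper.

```python
import numpy as np

def hill_exponent(x, tail_fraction=0.05):
    """Hill estimator of the exponent alpha of an upper power-law tail,
    P(X > x) ~ x^{-alpha}, using the largest tail_fraction of the sample.
    Illustrative only; the paper's fitting procedure may differ."""
    x = np.sort(x)[::-1]                                   # descending order
    k = max(int(len(x) * tail_fraction), 2)
    tail = x[:k]
    return 1.0 / np.mean(np.log(tail[:-1] / tail[-1]))

# Pareto-distributed sample with alpha = 2 (the exponent at which the bubble bursts)
rng = np.random.default_rng(4)
sample = (1.0 - rng.random(100_000)) ** (-1.0 / 2.0)
print(hill_exponent(sample))                               # ~2
```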

8.
9.
In this paper an analysis of the Stirling cycle in thermoeconomic terms is developed using entropy generation. In the thermoeconomic optimization of an irreversible Stirling heat pump cycle, the function F has been introduced to evaluate the optimal ratio of the higher to the lower source temperature in the cycle: this ratio represents the value which optimizes the cycle itself. The variation of the function F is proportional to the variation of the entropy generation; the maxima and minima of F were evaluated in a previous paper without giving the physical foundation of the method. Here we investigate the groundwork of this approach: studying the upper and lower limits of the function F allows one to determine the cycle stability and the optimization conditions. The optimization consists in obtaining the best COP at the least cost. The principle of maximum variation of the entropy generation thus becomes the analytic foundation of the optimization method in the thermoeconomic analysis of an irreversible Stirling heat pump cycle.

10.
Simple stochastic exchange games are based on the random allocation of finite resources. These games are Markov chains that can be studied either analytically or by Monte Carlo simulations. In particular, the equilibrium distribution can be derived by direct diagonalization of the transition matrix, by using the detailed balance equation, or from Monte Carlo estimates. In this paper, these methods are introduced and applied to the Bennati-Dragulescu-Yakovenko (BDY) game. The exact analysis shows that the statistical-mechanical analogies used in the previous literature have to be revised. An erratum to this article is available.
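A minimal Monte Carlo sketch of the BDY game, as commonly formulated (at each step a randomly chosen solvent agent hands one coin to another randomly chosen agent), is given below. The parameter values are illustrative, and the sketch is only meant to show the Monte Carlo route to the equilibrium distribution, not to reproduce the paper's exact analysis.

```python
import numpy as np

def bdy_game(n_agents=500, coins_per_agent=10, n_steps=1_000_000, seed=5):
    """Monte Carlo simulation of a common formulation of the BDY exchange game:
    a randomly chosen agent, if solvent, gives one coin to another randomly
    chosen agent. Returns the final wealth vector (total wealth is conserved)."""
    rng = np.random.default_rng(seed)
    wealth = np.full(n_agents, coins_per_agent)
    givers = rng.integers(n_agents, size=n_steps)
    takers = rng.integers(n_agents, size=n_steps)
    for g, t in zip(givers, takers):
        if wealth[g] > 0 and g != t:
            wealth[g] -= 1
            wealth[t] += 1
    return wealth

w = bdy_game()
# For an exponential-like equilibrium the standard deviation is comparable to the mean.
print("mean:", w.mean(), " std:", w.std())
```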

11.
In the present work we investigate the multiscale nature of the correlations of high-frequency (1 min) data in different futures markets over a period of two years, starting on the 1st of January 2003 and ending on the 31st of December 2004. In particular, using the concept of the local Hurst exponent, we point out how the behaviour of this parameter, usually considered a benchmark for recognizing persistence/antipersistence in time series, is largely time-scale dependent in the market context. These findings are a direct consequence of the intrinsic complexity of a system in which trading strategies are scale-adaptive. Moreover, our analysis points out different regimes in the dynamical behaviour of the market indices under consideration.
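As an illustration of a local, scale-dependent Hurst estimate, the sketch below infers H in sliding windows from the scaling of the increment variance, Var[x(t+τ) − x(t)] ∼ τ^{2H}; this is a simple stand-in for the local Hurst estimator actually used in the paper, applied here to a synthetic random walk.

```python
import numpy as np

def hurst_exponent(series, lags=range(2, 50)):
    """Estimate the Hurst exponent from the scaling of the increment standard
    deviation, std[x(t+tau) - x(t)] ~ tau^H. A simple illustrative estimator."""
    lags = np.asarray(list(lags))
    sig = np.array([np.std(series[lag:] - series[:-lag]) for lag in lags])
    slope, _ = np.polyfit(np.log(lags), np.log(sig), 1)
    return slope                                   # slope of log(sigma) vs log(tau)

def local_hurst(series, window=500, step=100):
    """Hurst exponent in sliding windows, giving a time-dependent estimate."""
    return [hurst_exponent(series[i:i + window])
            for i in range(0, len(series) - window, step)]

rng = np.random.default_rng(6)
x = np.cumsum(rng.normal(size=20_000))             # ordinary random walk: H ~ 0.5
print(np.mean(local_hurst(x)))
```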

12.
Many systems of different nature exhibit scale-free behaviour; economic systems with a power-law distribution of wealth are one example. To better understand the workings behind this complexity, we undertook an experiment recording the interactions between market participants. A web server was set up to administer the exchange of futures contracts whose liquidation prices were coupled to event outcomes. After free registration, participants started trading to compete for money prizes paid upon maturity of the futures contracts at the end of the experiment. The evolving 'cash-flow' network was reconstructed from the transactions between players. We show that the network topology is hierarchical, disassortative and small-world, with a power-law exponent of 1.02±0.09 in the degree distribution after an exponential-decay correction. The small-world property emerged early in the experiment, while the number of participants was still small. We also show power-law-like distributions of the net incomes and inter-transaction time intervals. Big winners and losers are associated with high degree, high betweenness centrality, low clustering coefficient and low degree correlation. We identify communities in the network as groups of the like-minded. The distribution of community sizes is also power-law distributed, with an exponent of 1.19±0.16.
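The network quantities mentioned (degree distribution, assortativity, clustering, betweenness centrality) can be computed with standard tools once the transaction list is available. The sketch below uses networkx on a tiny hypothetical edge list, since the experiment's real cash-flow data are of course not reproduced here.

```python
import networkx as nx
from collections import Counter

# Hypothetical transaction records (payer, payee), standing in for the real data.
transactions = [("alice", "bob"), ("bob", "carol"), ("alice", "carol"),
                ("dave", "alice"), ("carol", "dave"), ("bob", "alice")]

G = nx.Graph()
G.add_edges_from(transactions)

degree_counts = Counter(dict(G.degree()).values())      # raw degree distribution
print("degree distribution:", dict(degree_counts))
print("assortativity:", nx.degree_assortativity_coefficient(G))
print("average clustering:", nx.average_clustering(G))
print("betweenness:", nx.betweenness_centrality(G))
```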

13.
The question of the optimal portfolio is addressed. The conventional Markowitz portfolio optimisation is discussed and the shortcomings due to non-Gaussian security returns are outlined. A method is proposed to minimise the likelihood of extreme non-Gaussian drawdowns of the portfolio value. The theory is called leptokurtic because it minimises the effects of the "fat tails" of returns. The leptokurtic portfolio theory provides an optimal portfolio for investors who define their risk aversion as unwillingness to experience sharp drawdowns in asset prices. Two types of risk in asset returns are defined: a fluctuation risk, which has a Gaussian distribution, and a drawdown risk, which deals with the distribution tails. These risks are quantitatively measured by defining the "noise kernel", an ellipsoidal cloud of points in the space of asset returns. The size of the ellipsoid is controlled by a threshold parameter: the larger the threshold, the larger the returns accepted by investors as normal fluctuations. The return vectors falling inside the kernel are used for the calculation of the fluctuation risk; analogously, the data points falling outside the kernel are used for the calculation of the drawdown risk. As a result, the portfolio optimisation problem becomes three-dimensional: in addition to the return, two types of risk are involved. The optimal portfolio for drawdown-averse investors is the one minimising the variance outside the noise kernel. The theory has been tested with the MSCI North America, Europe and Pacific total-return stock indices.
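A hedged sketch of the noise-kernel construction is given below: return vectors are split by a Mahalanobis-distance threshold into an ellipsoidal core (used for the fluctuation risk) and the points outside it (used for the drawdown risk). The exact kernel definition and risk measures in the paper may differ in detail, and synthetic returns are used in place of the MSCI data.

```python
import numpy as np

def kernel_split(returns, threshold=2.0):
    """Split return vectors into an ellipsoidal 'noise kernel' (points with
    Mahalanobis distance below the threshold) and the tail outside it, then
    measure a covariance-type risk on each part. A sketch under assumptions,
    not the paper's exact construction."""
    mu = returns.mean(axis=0)
    cov = np.cov(returns, rowvar=False)
    inv_cov = np.linalg.inv(cov)
    centred = returns - mu
    d2 = np.einsum("ij,jk,ik->i", centred, inv_cov, centred)   # squared Mahalanobis distance
    inside = d2 <= threshold ** 2
    fluct_risk = np.cov(returns[inside], rowvar=False)          # Gaussian-like core
    drawdown_risk = np.cov(returns[~inside], rowvar=False)      # fat-tail contribution
    return fluct_risk, drawdown_risk

rng = np.random.default_rng(10)
rets = rng.standard_t(df=3, size=(2_000, 3)) * 0.01             # synthetic fat-tailed returns
f, d = kernel_split(rets)
print(np.diag(f), np.diag(d))
```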

14.
We examine the volatility of an Indian stock market in terms of the correlation of stocks and quantify the volatility using the random matrix approach. First we discuss trends observed in the pattern of stock prices on the Bombay Stock Exchange for the three-year period 2000–2002. Random matrix analysis is then applied to study the relationship between the coupling of stocks and volatility. The study uses daily returns of 70 stocks for successive time windows of length 85 days for the year 2001. We compare the properties of the matrix C of correlations between price fluctuations in time regimes characterized by different volatilities. Our analyses reveal that (i) the largest (deviating) eigenvalue of C correlates highly with the volatility of the index, (ii) there is a shift in the distribution of the components of the eigenvector corresponding to the largest eigenvalue across regimes of different volatilities, (iii) the inverse participation ratio for this eigenvector anti-correlates significantly with the market fluctuations, and (iv) this eigenvector of C can be used to set up a Correlation Index, CI, whose temporal evolution is significantly correlated with the volatility of the overall market index.
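The sketch below reproduces the basic random-matrix diagnostics mentioned in points (i)-(iv): the largest eigenvalue of the correlation matrix, its eigenvector, the inverse participation ratio, and a simple correlation-index proxy. It runs on a synthetic one-factor market; the Bombay Stock Exchange data and the exact window layout of the paper are not reproduced.

```python
import numpy as np

def rmt_summary(returns):
    """Correlation-matrix diagnostics for a (T, N) array of returns:
    largest eigenvalue, inverse participation ratio of its eigenvector,
    and a simple correlation-index proxy (projection onto that eigenvector)."""
    z = (returns - returns.mean(axis=0)) / returns.std(axis=0)
    C = z.T @ z / len(z)                          # N x N correlation matrix
    eigvals, eigvecs = np.linalg.eigh(C)          # ascending eigenvalues
    v_max = eigvecs[:, -1]                        # eigenvector of the largest eigenvalue
    ipr = np.sum(v_max ** 4)                      # inverse participation ratio
    corr_index = returns @ v_max                  # crude 'Correlation Index' proxy
    return eigvals[-1], ipr, corr_index

# Synthetic one-factor market: 85 days, 70 stocks, one common mode (assumed sizes
# chosen to match the window used in the abstract, data purely illustrative)
rng = np.random.default_rng(7)
market = rng.normal(size=(85, 1))
returns = 0.5 * market + rng.normal(size=(85, 70))
lam_max, ipr, ci = rmt_summary(returns)
print(lam_max, ipr)                               # lam_max >> 1; IPR ~ 1/N for a delocalised mode
```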

15.
We consider the roughness properties of NYSE (New York Stock Exchange) stock-price fluctuations. The statistical properties of the data are relatively homogeneous within the same day, but the large jumps between different days prevent the extension of the analysis to large times. This leads to intrinsic finite-size effects which alter the apparent Hurst exponent H. We show, by analytical methods, that finite-size effects always lead to an enhancement of H. We then consider the effect of fat tails on the analysis of the roughness and show that the finite-size effects are strongly enhanced by the fat tails. The non-stationarity of the stock-price dynamics also enhances the finite-size effects, which, in principle, can become important even in the asymptotic regime. We then compute the Hurst exponent for a set of NYSE stocks and argue that, in view of the above results, the interpretation of the value of H is highly ambiguous. Finally, we propose an alternative determination of the roughness in terms of the fluctuations from moving averages with variable characteristic times. This permits one to eliminate most of the previous problems and to characterize the roughness in a useful way. In particular, this approach corresponds to the automatic elimination of trends at any scale.
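The roughness measure based on fluctuations from moving averages can be sketched as a detrending-moving-average style estimator: the root-mean-square deviation of the price from moving averages of increasing window length n scales as σ(n) ∼ n^H. The code below is such a sketch on a synthetic random walk, not the authors' exact implementation.

```python
import numpy as np

def dma_roughness(x, windows=(5, 10, 20, 50, 100)):
    """Roughness exponent from fluctuations around backward moving averages
    with variable characteristic times (a detrending-moving-average style
    estimator, offered here as an illustrative stand-in)."""
    windows = np.asarray(windows)
    sigma = []
    for n in windows:
        ma = np.convolve(x, np.ones(n) / n, mode="valid")   # backward moving average
        sigma.append(np.sqrt(np.mean((x[n - 1:] - ma) ** 2)))
    H, _ = np.polyfit(np.log(windows), np.log(sigma), 1)    # sigma(n) ~ n^H
    return H

rng = np.random.default_rng(8)
walk = np.cumsum(rng.normal(size=50_000))
print(dma_roughness(walk))                                  # ~0.5 for an uncorrelated walk
```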

16.
In this work we present an analysis of a spatially non-homogeneous ultimatum game. By considering different underlying topologies as substrates on top of which the game takes place, we obtain non-trivial behaviors for the evolution of the players' strategies. We analyze separately the effect of the size of the neighborhood and of the spatial structure. Whereas the latter effect is the more significant one, we show that even for disordered networks, provided the neighborhood of each site is small, the results can differ significantly from those obtained in the case of fully connected networks.

17.
We define and study a rather complex market model, inspired by the Santa Fe artificial market and the Minority Game. Agents have different strategies among which they can choose according to their relative profitability, with the possibility of not participating in the market. The price is updated according to the excess demand, and the wealth of the agents is properly accounted for. Only two parameters play a significant role: one describes the impact of trading on the price, and the other describes the propensity of agents to be trend-following or contrarian. We observe three different regimes, depending on the value of these two parameters: an oscillating phase with bubbles and crashes, an intermittent phase, and a stable 'rational' market phase. The statistics of price changes in the intermittent phase resembles that of real price changes, with small linear correlations, fat tails and long-range volatility clustering. We discuss how the time dependence of these two parameters spontaneously drives the system into the intermittent region. We analyze quantitatively the temporal correlation of activity in the intermittent phase, and show that the 'random time strategy shift' mechanism that we proposed earlier allows one to understand the observed long-range correlations. Other mechanisms leading to long-range correlations are also reviewed. We discuss several other issues, such as the formation of bubbles and crashes, the influence of transaction costs and the distribution of agents' wealth. Received 5 July 2002 / Received in final form 9 December 2002 / Published online 14 February 2003. e-mail: irene.giardina@roma1.infn.it

18.
In this paper, we solve a general problem of optimizing a portfolio in a futures-market framework, extending the previous work of Galluccio et al. [Physica A 259, 449 (1998)]. We allow for long buying/short selling of a relatively large number of assets, assuming a fixed level of margin requirement. Because of the non-linearity in the constraint, we derive a multiple-equilibrium solution whose size is exponential in the number of assets. This means that we cannot obtain a unique efficient frontier, but many of them, each related to a different level of risk. Such a problem is analogous to that of finding the ground state of a long-ranged Ising spin glass in an external field. In order to obtain the best portfolio (i.e. the one along the best efficient frontier), we have to implement a two-step procedure, performing an exhaustive enumeration of all local minima. We develop a concrete application in which the different parts of the proposed solution are computed. Received 31 December 2001

19.
On the basis of market microstructure theory and the continuous-time stochastic-volatility-style microstructure model, a discrete-time stochastic volatility microstructure model with state observability is proposed for describing the dynamics of financial markets. From the proposed discrete-time microstructure model, estimates of two unobservable state variables, representing the market excess demand and liquidity respectively, may be obtained. A simple trading strategy for dynamic asset allocation, based on the indirectly obtained excess-demand information instead of price prediction, is presented. An approach to the estimation of the discrete-time microstructure model using the extended Kalman filter and the maximum likelihood method is also presented. Case studies on financial-market modeling and on model-based dynamic asset allocation control for the JPY/USD (Japanese Yen/US Dollar) exchange rate and the Japan TOPIX (TOkyo stock Price IndeX) show satisfactory modeling precision and control performance. Received 11 March 2002 / Received in final form 4 November 2002 / Published online 4 February 2003. The author is currently a visiting researcher at the Institute of Statistical Mathematics, 4-6-7 Minami Azabu, Minato-ku, Tokyo 106-8569, Japan. e-mail: peng@ism.ac.jp

20.
A statistical connection is identified between the current spread in a market over a given time period and the drift of the market during previous time periods. It is shown that periods of high spread are likely to be preceded by periods with relatively large market drifts. Several markets, including the UK pound per US dollar, US dollar per yen, UK pound per euro, and the UK FT100 index, have been analysed from 1991 to 2000 over variable periods of weeks, months and quarters. Within each period i, the natural logarithm of the daily end-of-trade market value has been least-squares fitted to a linear regression line, and evaluations made of the regression-line slope μ_i, the direct spread s_i with respect to the mean value, and the regression spread r_i of the deviations from the regression line. Significant correlations have been observed between the current direct spread s_i for each period i and the absolute value of the drifts |μ_{i-j}| evaluated j periods earlier. This correlation coefficient is as high as 0.746 for a period of one quarter (j = 1) and appears to die away after around 9 months for quarterly averages, after around 4 months for monthly averages, and after around 2 months for weekly averages. Received 11 October 2000
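The construction described (per-period regression slope, spread about the fit, and the lagged correlation between the two) is straightforward to reproduce. The sketch below does so on a synthetic log-price path, with the period length and lag as illustrative parameters rather than the paper's.

```python
import numpy as np

def period_stats(log_prices, period_len):
    """Per-period regression slope (drift) and regression spread: within each
    period, fit the log price to a line and record the slope and the RMS
    deviation from the fit, in the spirit of the construction described above."""
    slopes, spreads = [], []
    t = np.arange(period_len)
    for start in range(0, len(log_prices) - period_len + 1, period_len):
        y = log_prices[start:start + period_len]
        slope, intercept = np.polyfit(t, y, 1)
        slopes.append(slope)
        spreads.append(np.std(y - (slope * t + intercept)))
    return np.array(slopes), np.array(spreads)

def lagged_drift_spread_corr(log_prices, period_len=21, j=1):
    """Correlation between the current regression spread and |drift| j periods earlier."""
    mu, r = period_stats(log_prices, period_len)
    return np.corrcoef(r[j:], np.abs(mu[:-j]))[0, 1]

rng = np.random.default_rng(9)
log_prices = np.cumsum(rng.normal(0, 0.01, size=5_000))   # synthetic log-price path
print(lagged_drift_spread_corr(log_prices))
```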
