Similar Literature
20 similar articles found.
1.
Janusz Miśkiewicz 《Physica A》2010,389(8):1677-1687
The idea of entropy was introduced in thermodynamics, but it can also be used in time series analysis. There are various ways to define and measure the entropy of a system. Here the so-called Theil index, often used in economics and finance, is applied as if it were an entropy measure. In this study the time series are remapped through the Theil index. The linear correlation coefficient between the remapped time series is then evaluated as a function of time and time window size, and the corresponding statistical distance is defined. The results are compared with the usual correlation distance measure computed on the time series themselves. As an example, this entropy correlation distance method (ECDM) is applied to several series, such as those of the Consumer Price Index (CPI), in order to test so-called globalisation processes. Distance matrices are calculated in order to construct two network structures, which are then analysed. The role of the two different time scales introduced by the Theil index and the correlation coefficient is also discussed. The evolution of the mean distance between the most developed countries is presented and the globalisation periods of prices are discussed. It is finally shown that the evolution of the mean distance between the most developed countries on several networks follows the process of introducing the European currency, the Euro. This is contrasted with the GDP-based analysis. It is stressed that the entropy correlation distance measure is more suitable for detecting significant changes, such as a globalisation process, than the usual statistical (correlation-based) measure.
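A minimal sketch of the remapping-plus-distance idea under stated assumptions: the Theil index is computed over sliding windows, and the common d = sqrt(2(1 − ρ)) correlation distance is used (the paper's exact distance definition may differ); the series names and window size are hypothetical.

```python
import numpy as np

def theil_index(x):
    """Theil index of a positive-valued sample: T = mean((x/mu) * ln(x/mu))."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    return np.mean((x / mu) * np.log(x / mu))

def theil_remap(series, window):
    """Remap a time series into the series of Theil indices over sliding windows."""
    return np.array([theil_index(series[t:t + window])
                     for t in range(len(series) - window + 1)])

def correlation_distance(a, b):
    """Statistical distance from the linear correlation coefficient."""
    rho = np.corrcoef(a, b)[0, 1]
    return np.sqrt(2.0 * (1.0 - rho))

# Hypothetical usage with two CPI-like positive series:
rng = np.random.default_rng(0)
cpi_a = np.cumsum(rng.uniform(0.1, 1.0, 300)) + 100
cpi_b = np.cumsum(rng.uniform(0.1, 1.0, 300)) + 100
d = correlation_distance(theil_remap(cpi_a, 24), theil_remap(cpi_b, 24))
print(f"entropy correlation distance: {d:.3f}")
```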

2.
The problem of measuring economic globalization is discussed. Four macroeconomic indices of twenty of the "richest" countries are examined. Four types of "distances" are calculated, and two types of networks are then constructed for each distance measure definition. It is shown that the globalization process is best characterised by an entropy measure based on the entropy Manhattan distance. A globalization maximum is observed to have been reached during the interval 1970-2000. More recently, a deglobalization process has been observed.

3.
An instantaneous time series distance is defined through the equal-time correlation coefficient. The idea is applied to the Gross Domestic Product (GDP) yearly increments of 21 rich countries between 1950 and 2005 in order to test the process of economic globalisation. Some data discussion is first presented to decide which (EKS, GK, or derived) GDP series should be studied. Distances are then calculated from the correlation coefficient values between pairs of series. The role of time-averaging the distances over finite-size windows is discussed. Three network structures are next constructed based on the hierarchy of distances. It is shown that the mean distance between the most developed countries on several networks actually decreases in time, which we consider a proof of globalization. An empirical law is found for the evolution after 1990, similar to that found in flux creep. The optimal observation time window size is found to be approximately 15 years.

4.
We study correlations between web-downloaded gross domestic product (GDP) data of rich countries. GDP is used as a signature of a country's economic state. We calculate the yearly fluctuations of the GDP and look for forward and backward correlations between such fluctuations. The correlation measure is based on the Theil index. The system is represented by an evolving weighted network, the nodes being the GDP fluctuations (or countries) at different times. In order to extract structures from the network, we focus on filtering the time-delayed correlations by removing the least correlated links. This percolation-idea-based method reveals the emergence of connections, which are visualized by a branching representation. Note that the network is made of weighted and directed links when a delay time is taken into account. Such a measure of collective habits does not readily fit the usual expectations, unless an economy globalization framework is accepted.
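A minimal sketch of the link-filtering idea, assuming a precomputed time-delayed correlation matrix (the matrix values here are toy data); raising the threshold removes the least correlated links first, and the surviving directed components can then be inspected.

```python
import numpy as np
import networkx as nx

def percolation_filter(corr, threshold):
    """Keep only the directed links whose correlation weight reaches
    `threshold`; progressively raising it removes the weakest links first."""
    n = corr.shape[0]
    g = nx.DiGraph()  # time-delayed correlations give directed links
    g.add_nodes_from(range(n))
    for i in range(n):
        for j in range(n):
            if i != j and corr[i, j] >= threshold:
                g.add_edge(i, j, weight=corr[i, j])
    return g

# Toy time-delayed correlation matrix for 5 "countries" (hypothetical values):
rng = np.random.default_rng(0)
c = rng.uniform(0, 1, (5, 5))
for th in (0.3, 0.6, 0.9):
    g = percolation_filter(c, th)
    print(th, g.number_of_edges(), nx.number_weakly_connected_components(g))
```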

5.
Complexity measures are used in a number of applications, including the extraction of information from data such as ecological time series, the detection of non-random structure in biomedical signals, the testing of random number generators, language recognition, and authorship attribution. Complexity measures proposed in the literature, such as Shannon entropy, relative entropy, Lempel-Ziv, Kolmogorov, and algorithmic complexity, are mostly ineffective for analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure, ETC, defined as the "Effort To Compress" the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence into a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to correlate better with the Lyapunov exponent than Shannon entropy, even for relatively short and noisy time series. The measure also has a greater rate of success in the automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by gzip).
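A compact sketch of ETC via NSRPS as described: each pass replaces the most frequent adjacent pair with a new symbol, and ETC is the number of passes until the sequence is constant (or of length one). Symbols are assumed to be small integers.

```python
from collections import Counter

def nsrps_step(seq, next_symbol):
    """One NSRPS pass: replace the most frequent adjacent pair by a new symbol."""
    pairs = Counter(zip(seq, seq[1:]))
    pair = max(pairs, key=pairs.get)
    out, i = [], 0
    while i < len(seq):
        if i < len(seq) - 1 and (seq[i], seq[i + 1]) == pair:
            out.append(next_symbol)
            i += 2  # consume the pair (non-overlapping replacement)
        else:
            out.append(seq[i])
            i += 1
    return out

def etc(seq):
    """Effort To Compress: number of NSRPS passes until the sequence is constant."""
    seq = list(seq)
    symbol = max(seq) + 1
    steps = 0
    while len(seq) > 1 and len(set(seq)) > 1:
        seq = nsrps_step(seq, symbol)
        symbol += 1
        steps += 1
    return steps

print(etc([0, 1, 0, 1, 0, 1, 1]))  # small binary example -> 4 passes
```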

6.
Many node-similarity algorithms for weighted graph data have been proposed in recent years that employ the degree of nodes. Although these algorithms obtain good results, they still have limitations; in particular, the strength of nodes is ignored. To address this issue, a similarity measure for nodes based on the relative entropy of distance distributions is proposed in this paper. First, the structural weight of each node is obtained by integrating its degree and strength. Next, the distance between any two nodes is calculated from their structural weights with the Euclidean distance formula, to further obtain the distance distribution of each node. After that, the probability distribution of each node is constructed by normalizing its distance distribution. The relative entropy can thus be applied to measure the difference between the probability distributions of the top d most important nodes and all nodes in the graph. Finally, the similarity of two nodes is measured in terms of this difference, calculated by relative entropy. Experimental results demonstrate that the proposed algorithm, by considering node strength in the relative entropy, has clear advantages in most-similar-node mining and link prediction.
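A hedged sketch of the distance-distribution / relative-entropy idea; the mixing weight for degree and strength, the 1D distance on structural weights, and the symmetrization of the divergence are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def relative_entropy(p, q, eps=1e-12):
    """KL divergence D(p || q) between two discrete distributions."""
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def node_similarity(dist_matrix, u, v):
    """Similarity of nodes u, v from the relative entropy between their
    normalized distance distributions (smaller divergence = more similar)."""
    p = dist_matrix[u] / dist_matrix[u].sum()
    q = dist_matrix[v] / dist_matrix[v].sum()
    d = 0.5 * (relative_entropy(p, q) + relative_entropy(q, p))  # symmetrized
    return 1.0 / (1.0 + d)

# Hypothetical structural weights mixing degree and strength (alpha = 0.5):
deg = np.array([3, 2, 4, 1], float)
strength = np.array([1.5, 0.7, 2.2, 0.4])
w = 0.5 * deg + 0.5 * strength
D = np.abs(w[:, None] - w[None, :])  # pairwise distances on structural weights
print(node_similarity(D, 0, 2))
```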

7.
Entropy indicates the irregularity or randomness of a dynamic system. Over the decades, entropy calculated at different scales of the system through subsampling or coarse-graining has been used as a surrogate measure of system complexity. One popular multi-scale entropy analysis is multi-scale sample entropy (MSE), which calculates entropy through the sample entropy (SampEn) formula at each time scale. SampEn is defined from the logarithmic likelihood that a small section of the data (within a window of length m) that matches other sections will still match them when the window length increases by one. A "match" is defined by a threshold of r times the standard deviation of the entire time series. A problem with the current MSE algorithm is that the SampEn calculations at all scales use the same matching threshold, defined from the original time series, while the data standard deviation actually changes with the subsampling scale. Using a fixed threshold therefore introduces a systematic bias into the results. The purpose of this paper is to present this systematic bias mathematically and to provide methods for correcting it. Our work will help the large MSE user community avoid introducing this bias into their multi-scale SampEn calculations.
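A hedged sketch of MSE with the matching threshold recomputed at each scale, which is one natural way to implement the correction described above; the SampEn routine is a compact, non-optimized variant, and the 0.2·SD threshold is the conventional choice, not necessarily the paper's.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Compact SampEn: -ln(A/B), where B counts length-m template matches
    and A length-(m+1) matches under a Chebyshev distance threshold r."""
    x = np.asarray(x, float)
    if r is None:
        r = 0.2 * x.std()
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        c = 0
        for i in range(len(templ) - 1):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            c += int(np.sum(d <= r))
        return c
    B, A = count(m), count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def coarse_grain(x, scale):
    n = len(x) // scale
    return np.asarray(x[:n * scale], float).reshape(n, scale).mean(axis=1)

def mse(x, scales=range(1, 6), m=2, rescale_r=True):
    """Multiscale SampEn; rescale_r=True recomputes the 0.2*SD threshold from
    each coarse-grained series, avoiding the fixed-threshold bias."""
    x = np.asarray(x, float)
    r_fixed = 0.2 * x.std()
    out = {}
    for s in scales:
        y = coarse_grain(x, s)
        out[s] = sample_entropy(y, m, 0.2 * y.std() if rescale_r else r_fixed)
    return out

rng = np.random.default_rng(0)
x = np.sin(0.2 * np.arange(2000)) + 0.5 * rng.standard_normal(2000)
print(mse(x))                    # scale-corrected thresholds
print(mse(x, rescale_r=False))   # original (biased) fixed threshold
```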

8.
In this paper, we quantify the statistical coherence between financial time series by means of the Rényi entropy. With the help of Campbell's coding theorem, we show that the Rényi entropy selectively emphasizes only certain sectors of the underlying empirical distribution while strongly suppressing others. This accentuation is controlled by Rényi's parameter q. To tackle the issue of information flow between time series, we formulate the concept of Rényi transfer entropy as a measure of information that is transferred only between certain parts of the underlying distributions. This is particularly pertinent in financial time series, where knowledge of marginal events such as spikes or sudden jumps is of crucial importance. We apply the Rényian information flow to stock market time series from 11 world stock indices sampled at a daily rate over the period 02.01.1990–31.12.2009. Corresponding heat maps and net information flows are represented graphically. A detailed discussion of the transfer entropy between the DAX and S&P500 indices based on minute tick data gathered over the period 02.04.2008–11.09.2009 is also provided. Our analysis shows that the bivariate information flow between world markets is strongly asymmetric, with a distinct information surplus flowing from the Asia–Pacific region to both the European and US markets. An important yet less dramatic excess of information also flows from Europe to the US. This is seen particularly clearly from a careful analysis of the Rényi information flow between the DAX and S&P500 indices.
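A small illustration of how Rényi's parameter q re-weights different sectors of a heavy-tailed returns histogram; this shows the entropy itself, not the full Rényi transfer-entropy estimator used in the paper, and the toy data are illustrative.

```python
import numpy as np

def renyi_entropy(p, q):
    """Rényi entropy H_q(p) = log(sum p_i^q) / (1 - q); -> Shannon as q -> 1."""
    p = np.asarray(p, float)
    p = p[p > 0] / p.sum()
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** q)) / (1.0 - q)

# q < 1 accentuates the tails (marginal events), q > 1 the centre:
rng = np.random.default_rng(1)
returns = rng.standard_t(df=3, size=5000)   # heavy-tailed toy returns
hist, _ = np.histogram(returns, bins=50)
for q in (0.5, 1.0, 2.0):
    print(q, renyi_entropy(hist, q))
```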

9.
One of the most difficult problems in the field of non-linear time series analysis is the unequivocal characterization of a measured signal. We present a practicable procedure which allows one to decide whether a given time series is pure noise, chaotic but distorted by noise, purely chaotic, or a Markov process. Furthermore, the method gives an estimate of the Kolmogorov-Sinai (KS) entropy and the noise level. The procedure is based on a measure of the sensitive dependence on initial conditions called the ε-information flow. This measure generalizes the concept of the KS entropy and characterizes the underlying dynamics. The ε-information flow is approximated by the calculation of various correlation integrals.

10.
11.
The generalized Kullback-Leibler distance D_q (q is the Tsallis parameter) is shown to be a useful measure for the analysis of functional magnetic resonance imaging (fMRI) data series. This generalized form of entropy is used to evaluate the "distance" between the probability functions p1 and p2 of the signal levels related to periods of stimulus and non-stimulus in event-related fMRI experiments. The probability densities of the mean distance (averaged over the N epochs of the entire experiment) are obtained through numerical simulations for different values of the signal-to-noise ratio (SNR) and are found to be fitted very well by Gamma distributions (χ² < 0.0008) for small values of N (N < 30). These distributions allow us to determine the sensitivity and specificity of the method through the construction of receiver operating characteristic (ROC) curves. The performance of the method is also investigated in terms of the parameters q and L (number of signal levels), and our results indicate that the optimum choice is q = 0.8 and L = 3. The entropic index q is found to control both the sensitivity and the specificity of the method: as q (q > 0) is raised, sensitivity increases but specificity decreases. Finally, the method is applied to the analysis of a real event-related fMRI motor stimulus experiment, and the resulting maps show activation in primary and secondary motor brain areas.
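A minimal sketch of the generalized (Tsallis) Kullback-Leibler distance with the paper's suggested q = 0.8 and L = 3 signal levels; the toy stimulus/non-stimulus histograms are placeholders, and the eps-smoothing is an implementation convenience.

```python
import numpy as np

def d_q(p1, p2, q=0.8, eps=1e-12):
    """Generalized (Tsallis) Kullback-Leibler distance
    D_q = (sum p1^q * p2^(1-q) - 1) / (q - 1); -> ordinary KL as q -> 1."""
    p1 = np.asarray(p1, float) + eps
    p2 = np.asarray(p2, float) + eps
    p1, p2 = p1 / p1.sum(), p2 / p2.sum()
    if np.isclose(q, 1.0):
        return float(np.sum(p1 * np.log(p1 / p2)))
    return float((np.sum(p1**q * p2**(1.0 - q)) - 1.0) / (q - 1.0))

# Toy stimulus / non-stimulus level histograms with L = 3 signal levels:
p_stim = [0.2, 0.5, 0.3]
p_rest = [0.4, 0.4, 0.2]
print(d_q(p_stim, p_rest, q=0.8))
```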

12.
Using the modified sample entropy to detect determinism
A modified sample entropy (mSampEn), based on a nonlinear, continuous, and convex function, has been proposed and shown to be superior to the standard sample entropy (SampEn) in several respects. In this Letter, we empirically investigate the ability of the mSampEn statistic, combined with the surrogate data method, to detect determinism. The effects of data-set length and noise on the proposed method's ability to differentiate between deterministic and stochastic dynamics are tested on several benchmark time series. The noise performance of the mSampEn statistic is also compared with methods based on singular value decomposition (SVD) and the symplectic geometry spectrum (SGS). The results indicate that the mSampEn statistic is a robust index for detecting determinism in short and noisy time series.

13.
Gerard Briscoe  Philippe De Wilde 《Physica A》2011,390(21-22):3732-3741
A measure called physical complexity is established and calculated for a population of sequences, drawing on statistical physics, automata theory, and information theory. It quantifies the amount of information in an organism's genome: based on Shannon entropy, it measures the information in a population evolved in its environment by using entropy to estimate the randomness in the genome. It is calculated as the difference between the maximal entropy of the population and its actual entropy in that environment, estimated by counting the number of fixed loci in the sequences of the population. Until now, physical complexity has only been formulated for populations of sequences of the same length. Here, we investigate an extension to support variable-length populations. We then build upon this to construct a measure of the efficiency of information storage, which we later use in understanding clustering within populations. Finally, we investigate our extended physical complexity through simulations, showing it to be consistent with the original.
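A hedged sketch of the fixed-length form of physical complexity described above, estimating per-locus entropies across a population (the paper's variable-length extension is not reproduced here); the toy population and alphabet are illustrative.

```python
import numpy as np
from collections import Counter

def physical_complexity(population, alphabet_size=4):
    """Fixed-length physical complexity: C = L*log2(D) - sum_i H_i, where H_i
    is the per-locus Shannon entropy across the population; fully fixed loci
    contribute H_i = 0 and so raise the measured complexity."""
    L = len(population[0])
    n = len(population)
    h_sum = 0.0
    for i in range(L):
        counts = Counter(seq[i] for seq in population)
        p = np.array(list(counts.values()), float) / n
        h_sum += -np.sum(p * np.log2(p))
    return L * np.log2(alphabet_size) - h_sum

pop = ["ACGTAC", "ACGTAA", "ACGTGC", "ACGTAC"]  # loci 0-3 are fixed
print(physical_complexity(pop))
```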

14.
霍铖宇  马小飞  宁新宝 《物理学报》2017,66(16):160502-160502
Heart rate data are among the most easily acquired human physiological data, and sleep analysis based on heart rate variability (HRV) has in recent years become an important direction in the development of wearable devices for daily health management. This calls for continued exploration of short-term feature parameters that can be applied within the standard sleep-staging time window (about 30 s). Using the recently reported limited penetrable horizontal visibility graph, and further proposing a weighted limited penetrable horizontal visibility graph, we map short-term HRV series under different sleep states onto networks and then extract network features, namely the mean clustering coefficient, the characteristic path length, the clustering coefficient entropy, the path distribution entropy, the weighted clustering coefficient entropy, and the weighted path distribution entropy, for statistical analysis. The results show that the magnitudes of these network parameters differ significantly among wakefulness, light sleep, deep sleep, and rapid eye movement (REM) sleep, demonstrating the effectiveness of the method for sleep staging based on short-term HRV data. We further study the network parameter values of healthy young and middle-aged-to-elderly subjects under different sleep states and find that, although there is an overall level difference between the two groups, the parameters change across sleep states with the same trend. This indicates that, relative to normal aging, sleep modulation has a more pronounced effect on cardiac dynamics, and that the method can serve as a new auxiliary tool for HRV-based sleep research.
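A minimal sketch of the limited penetrable horizontal visibility mapping described above, followed by one of the extracted features (clustering coefficient entropy); the edge weighting w = 1/|i − j| is a hypothetical stand-in, since the paper's exact weighting scheme is not given here.

```python
import numpy as np
import networkx as nx

def lphvg(y, rho=1, weighted=True):
    """Limited penetrable horizontal visibility graph: nodes i < j are linked
    if at most rho intermediate points y_k satisfy y_k >= min(y_i, y_j).
    The weight w = 1/|i - j| is a hypothetical choice for illustration."""
    g = nx.Graph()
    g.add_nodes_from(range(len(y)))
    for i in range(len(y)):
        for j in range(i + 1, len(y)):
            penetrations = sum(y[k] >= min(y[i], y[j]) for k in range(i + 1, j))
            if penetrations <= rho:
                g.add_edge(i, j, weight=1.0 / (j - i) if weighted else 1.0)
    return g

rr = np.random.default_rng(2).normal(0.8, 0.05, 30)  # toy 30 s RR-interval window
g = lphvg(rr, rho=1)
cc = np.array(list(nx.clustering(g).values()))       # per-node clustering
p = cc[cc > 0]
p = p / p.sum()
cc_entropy = -np.sum(p * np.log(p))                  # clustering coefficient entropy
print(g.number_of_edges(), cc_entropy)
```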

15.
Complexity of two-dimensional patterns
To quantitatively describe the complexity of two-dimensional patterns, we introduce a complexity measure based on the mean information gain. Two types of patterns are studied: geometric ornaments and patterns arising in the random sequential adsorption (RSA) of discs on a plane. For the geometric ornaments, analytical expressions for the entropy and complexity measures are presented, while for the RSA patterns these are calculated numerically. We compare the information-gain complexity measure with some alternative measures and show the advantages of the former as applied to two-dimensional structures. Namely, it does not require knowledge of the "maximal" entropy of the pattern, and at the same time it sensitively accounts for the inherent correlations in the system.

16.
An echo state network (ESN) is an efficient recurrent neural network (RNN) that is widely used in time series prediction tasks due to its simplicity and low training cost. However, the "black-box" nature of reservoirs hinders the development of ESNs. Although a large number of studies have concentrated on reservoir interpretability, reservoir modeling has so far been approached from a relatively narrow perspective, and the relationship between reservoir richness and reservoir projection capacity has not been effectively established. To tackle this problem, a novel reservoir interpretability framework based on permutation entropy (PE) theory is proposed in this paper. Structurally, this framework consists of reservoir state extraction, PE modeling, and PE analysis. Based on these, the instantaneous reservoir states and neuronal time-varying states are extracted, followed by phase space reconstruction, sorting, and entropy calculation. The obtained instantaneous state entropy (ISE) and global state entropy (GSE) measure reservoir richness, interpreted as the basis of good reservoir projection capacity. In addition, a multiscale complexity-entropy analysis of global and neuron-level reservoir states is performed to reveal more detailed dynamics. Finally, the relationships between ESN performance and reservoir dynamics are investigated via Pearson correlation, considering different prediction steps and time scales. Experimental evaluations on several benchmarks and real-world datasets demonstrate the effectiveness and superiority of the proposed reservoir interpretability framework.
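A compact sketch of the permutation-entropy building block of such a framework, assuming standard PE over ordinal patterns; the per-neuron usage below is a GSE-like illustration on random stand-in reservoir states, not the paper's exact definition.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Permutation entropy: Shannon entropy of the ordinal patterns of length
    `order` obtained after delay-embedding and rank-sorting the series."""
    x = np.asarray(x, float)
    n = len(x) - (order - 1) * delay
    patterns = {}
    for i in range(n):
        pat = tuple(np.argsort(x[i:i + (order - 1) * delay + 1:delay]))
        patterns[pat] = patterns.get(pat, 0) + 1
    p = np.array(list(patterns.values()), float) / n
    h = -np.sum(p * np.log2(p))
    return h / np.log2(factorial(order)) if normalize else h

# Stand-in reservoir states of shape (time, neurons); per-neuron PE as a
# GSE-like richness proxy:
states = np.random.default_rng(3).standard_normal((100, 5))
gse = [permutation_entropy(states[:, j]) for j in range(states.shape[1])]
print(np.round(gse, 3))
```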

17.
A series of Ni_{44−x}Co_{x}Mn_{45}Sn_{11} (x = 0, 1, 2) alloys was prepared by arc melting, and magnetization curves at different temperatures were measured with a vibrating sample magnetometer. The bicubic interpolation method was employed to compute the magnetization M(T, H), and the magnetic entropy changes were obtained using the thermodynamic (Maxwell) relation for magnetic materials. The results show that the values calculated with the bicubic interpolation method are more accurate than those from the conventional method, and the resulting curves are relatively smooth.
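A sketch of the described pipeline under stated assumptions: a synthetic M(T, H) grid stands in for the measured magnetization curves, scipy's RectBivariateSpline supplies the bicubic interpolation, and the Maxwell relation ΔS_M(T, H_max) = ∫₀^{H_max} (∂M/∂T)_H dH gives the entropy change; all numbers are toy values.

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline
from scipy.integrate import trapezoid

# Toy M(T, H) grid with a transition near T0 (stand-in for measured curves):
T = np.linspace(250, 350, 21)   # temperature grid, K
H = np.linspace(0, 5, 26)       # field grid, T
T0 = 300.0
M = 50.0 / (1.0 + np.exp((T[:, None] - T0) / 8.0)) * np.tanh(2.0 * H[None, :] + 0.1)

spl = RectBivariateSpline(T, H, M, kx=3, ky=3)  # bicubic interpolation of M(T, H)

def delta_S(Tq, Hmax, nH=200):
    """Maxwell relation: dS(T, Hmax) = integral_0^Hmax (dM/dT)_H dH."""
    h = np.linspace(0.0, Hmax, nH)
    dMdT = spl([Tq], h, dx=1).ravel()  # partial derivative of the spline in T
    return trapezoid(dMdT, h)

for t in (290, 300, 310):
    print(t, delta_S(t, 5.0))
```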

18.
The Parikh–Wilczek tunnelling framework, which treats Hawking radiation as a tunnelling process, is investigated once more in this work. The first-order correction, the log-corrected entropy–area relation, emerges naturally in the tunnelling picture if we consider the emission of a spherical shell. The second-order correction to the emission rate for the Schwarzschild black hole is also calculated. At this level, the entropy of the black hole contains three parts: the usual Bekenstein–Hawking entropy, a logarithmic term, and an inverse-area term. We find that the coefficient of the logarithmic term is −1. Thus, apart from a coefficient, our correction to the black hole entropy is consistent with that calculated in loop quantum gravity.
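Schematically, in natural units (ħ = G = c = k_B = 1) and with A the horizon area, the corrected entropy described above takes the form below, where the logarithmic coefficient is the −1 reported in the abstract and β is the undetermined coefficient of the inverse-area term.

```latex
S = \frac{A}{4} - \ln\frac{A}{4} + \frac{\beta}{A} + \mathrm{const}
```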

19.
We study the time series of the total energy of polypeptides and proteins. These time series were generated by molecular dynamics methods and analyzed by applying detrended fluctuation analysis (DFA) to estimate the long-range power-law correlations, i.e. to measure the scaling exponents α. Such exponents were calculated for all systems, and their values follow the environmental conditions: they are temperature dependent and, in a continuum-medium approach, also vary with the dielectric constant (we simulated ε = 2 and ε = 80). The procedure was applied to investigate polyalanines and other realistic models of proteins (Insect Defensin A and Hemoglobin). The present findings are consistent with previous results obtained by other methodologies.
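A minimal DFA sketch consistent with the procedure described (integrated profile, linear detrending in non-overlapping windows, log-log slope); for the white-noise toy input the exponent should come out near α ≈ 0.5, and the scale grid is an illustrative choice.

```python
import numpy as np

def dfa_alpha(x, scales=None):
    """Detrended fluctuation analysis: slope of log F(n) versus log n,
    with linear detrending in each non-overlapping window of size n."""
    x = np.asarray(x, float)
    y = np.cumsum(x - x.mean())  # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(1, np.log10(len(x) // 4), 12).astype(int))
    f = []
    for n in scales:
        m = len(y) // n
        segs = y[:m * n].reshape(m, n)
        t = np.arange(n)
        rms = []
        for seg in segs:
            a, b = np.polyfit(t, seg, 1)          # local linear trend
            rms.append(np.mean((seg - (a * t + b)) ** 2))
        f.append(np.sqrt(np.mean(rms)))
    return np.polyfit(np.log(scales), np.log(f), 1)[0]

print(dfa_alpha(np.random.default_rng(4).standard_normal(4000)))  # ~0.5
```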

20.
T. Conlon  M. Crane 《Physica A》2008,387(21):5197-5204
The wide acceptance of hedge funds by institutional investors and pension funds has led to an explosive growth in assets under management. These investors are drawn to hedge funds by the seemingly low correlation with traditional investments and the attractive returns. The correlations and market risk (the beta in the Capital Asset Pricing Model) of hedge funds are generally calculated from monthly returns data, which may produce misleading results, as hedge funds often hold illiquid exchange-traded securities or difficult-to-price over-the-counter securities. In this paper, the Maximum Overlap Discrete Wavelet Transform (MODWT) is applied to measure the scaling properties of hedge fund correlation and market risk with respect to the S&P 500. It is found that the level of correlation and market risk varies greatly according to the strategy studied and the time scale examined. Finally, the effects of these scaling properties on the risk profile of a portfolio of hedge funds are studied using correlation matrices calculated over different time horizons.
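A hedged sketch of scale-by-scale correlation using PyWavelets' stationary wavelet transform (SWT) as an undecimated stand-in for the MODWT used in the paper; the series names, wavelet, and level count are illustrative, and the input length must be divisible by 2**level.

```python
import numpy as np
import pywt

def scale_correlations(x, y, wavelet="db4", level=4):
    """Correlation of two series at each wavelet scale, from the detail
    coefficients of an undecimated (SWT) decomposition."""
    cx = pywt.swt(x, wavelet, level=level)  # highest level first
    cy = pywt.swt(y, wavelet, level=level)
    out = {}
    for (_, dx_), (_, dy_), lev in zip(cx, cy, range(level, 0, -1)):
        out[lev] = np.corrcoef(dx_, dy_)[0, 1]
    return out

rng = np.random.default_rng(5)
market = rng.standard_normal(512)                 # toy S&P-like returns
fund = 0.3 * market + rng.standard_normal(512)    # toy hedge-fund returns
print(scale_correlations(fund, market))
```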

