Similar References
20 similar references found (search time: 31 ms)
1.
Modeling and prediction of chaotic time series based on a robust extreme learning machine
沈力华, 陈吉红, 曾志刚, 金健. 《物理学报》(Acta Physica Sinica), 2018, 67(3): 030501
Chaotic time series prediction models are easily affected by outliers, which lowers their prediction accuracy. To address this problem, a robust extreme learning machine is proposed within a Bayesian framework. The proposed model takes a Gaussian mixture distribution, which has heavy-tailed characteristics, as the likelihood function of the model output, yielding a prediction model that is more robust to outliers and noise. However, with the Gaussian mixture likelihood the marginal likelihood of the model output becomes analytically intractable, so a variational method is introduced for approximate inference and estimation of the model parameters. With outliers and noise added, the proposed model is applied to the prediction of the Lorenz series (an atmospheric circulation model), the Rössler chaotic time series, and the sunspot chaotic time series; the prediction results confirm the effectiveness of the proposed model.
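The robust Bayesian variant with a Gaussian-mixture likelihood and variational inference is involved, but the baseline it builds on, a standard extreme learning machine trained on lagged samples of a chaotic series, can be sketched briefly. The following Python sketch is illustrative only (Euler-integrated Lorenz data, in-sample one-step prediction); none of the function names or parameters come from the paper.

```python
import numpy as np

def lorenz_series(n=3000, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system with simple Euler steps and return the x component."""
    x, y, z = 1.0, 1.0, 1.0
    xs = np.empty(n)
    for i in range(n):
        dx, dy, dz = sigma * (y - x), x * (rho - z) - y, x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        xs[i] = x
    return xs

def elm_one_step(series, lags=5, hidden=100, ridge=1e-3, seed=0):
    """Basic (non-robust) extreme learning machine: random hidden layer,
    ridge-regularized least-squares output weights, in-sample one-step prediction."""
    rng = np.random.default_rng(seed)
    X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
    y = series[lags:]
    W = rng.normal(size=(lags, hidden))          # fixed random input weights
    b = rng.normal(size=hidden)                  # fixed random biases
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    beta_out = np.linalg.solve(H.T @ H + ridge * np.eye(hidden), H.T @ y)
    return H @ beta_out, y

pred, target = elm_one_step(lorenz_series())
print("in-sample RMSE:", np.sqrt(np.mean((pred - target) ** 2)))
```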

2.
3.
Recently a new framework has been proposed to explore the dynamics of pseudoperiodic time series by constructing a complex network [J. Zhang, M. Small, Phys. Rev. Lett. 96 (2006) 238701]. Essentially, this is a transformation from the time domain to the network domain, which allows the dynamics of the time series to be studied via the organization of the network. In this paper, we focus on the deterministic chaotic Rössler time series and stochastic noisy periodic data, which yield substantially different network structures. In particular, we test an extensive range of network topology statistics, which have not been discussed in previous work, but which are capable of providing a comprehensive statistical characterization of the dynamics from different angles. Our goal is to find out how they reflect and quantify different aspects of specific dynamics, and how they can be used to distinguish different dynamical regimes. For example, we find that the joint degree distribution appears to fundamentally characterize the spatial organization of cycles in phase space, and this is quantified via an assortativity coefficient. We applied the network statistics to electrocardiograms of a healthy individual and an arrhythmia patient. Such time series are typically pseudoperiodic, but are noisy and nonstationary and degrade traditional phase-space based methods. These time series are, however, better differentiated by our network-based statistics.
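A minimal sketch of the cycle-network idea, assuming NumPy and networkx: segment a pseudoperiodic signal into cycles, link cycles whose correlation exceeds a threshold, and read off the assortativity coefficient. The toy signal and all thresholds below are illustrative, not from the paper.

```python
import numpy as np
import networkx as nx

def cycles_from_series(series, period):
    """Naively cut a pseudoperiodic series into consecutive cycles of fixed length."""
    n_cycles = len(series) // period
    return [series[i * period:(i + 1) * period] for i in range(n_cycles)]

def cycle_network(cycles, threshold=0.9):
    """Link two cycles whenever their Pearson correlation exceeds the threshold."""
    g = nx.Graph()
    g.add_nodes_from(range(len(cycles)))
    for i in range(len(cycles)):
        for j in range(i + 1, len(cycles)):
            if np.corrcoef(cycles[i], cycles[j])[0, 1] > threshold:
                g.add_edge(i, j)
    return g

# Toy pseudoperiodic signal: a sine wave whose noise level grows along the record,
# so cycles end up with heterogeneous degrees in the resulting network.
rng = np.random.default_rng(1)
t = np.arange(0, 200, 0.05)
noise_level = 0.05 + 0.5 * t / t.max()
signal = np.sin(2 * np.pi * t / 5.0) + noise_level * rng.normal(size=t.size)
net = cycle_network(cycles_from_series(signal, period=100))
print("edges:", net.number_of_edges())
print("degree assortativity:", nx.degree_assortativity_coefficient(net))
```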

4.
Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis, which provides a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
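A minimal sketch of the core quantity in event coincidence analysis, assuming NumPy: the fraction of events in one series that are followed, within a tolerance window and after an optional lag, by an event in the other series, compared against homogeneous-Poisson surrogates. Event times, window sizes and function names below are illustrative.

```python
import numpy as np

def coincidence_rate(events_a, events_b, delta_t=3.0, lag=0.0):
    """Fraction of events in series A followed (within delta_t, after a lag)
    by at least one event in series B."""
    events_a = np.asarray(events_a, dtype=float)
    events_b = np.asarray(events_b, dtype=float)
    hits = 0
    for ta in events_a:
        dt = events_b - (ta + lag)
        if np.any((dt >= 0) & (dt <= delta_t)):
            hits += 1
    return hits / len(events_a)

def poisson_surrogate_pvalue(events_a, events_b, t_max, delta_t=3.0, lag=0.0,
                             n_surrogates=2000, seed=0):
    """Null hypothesis: B is a homogeneous Poisson process with the same event count."""
    rng = np.random.default_rng(seed)
    observed = coincidence_rate(events_a, events_b, delta_t, lag)
    exceed = 0
    for _ in range(n_surrogates):
        surrogate_b = np.sort(rng.uniform(0, t_max, size=len(events_b)))
        if coincidence_rate(events_a, surrogate_b, delta_t, lag) >= observed:
            exceed += 1
    return observed, exceed / n_surrogates

# Toy example: floods at given times, epidemic outbreaks shortly after some of them.
floods = np.array([10, 25, 40, 61, 77, 90], dtype=float)
outbreaks = np.array([12, 27, 43, 55, 79], dtype=float)
rate, p = poisson_surrogate_pvalue(floods, outbreaks, t_max=100)
print(f"coincidence rate = {rate:.2f}, surrogate p-value = {p:.3f}")
```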

5.
We introduce a technique of time series analysis, potential forecasting, which is based on dynamical propagation of the probability density of a time series. We employ polynomial coefficients of the orthogonal approximation of the empirical probability distribution and extrapolate them in order to forecast the future probability distribution of the data. The method is tested on artificial data, used for hindcasting observed climate data, and then applied to forecast Arctic sea-ice time series. The proposed methodology completes a framework for ‘potential analysis’ of tipping points, which together serves to anticipate, detect and forecast nonlinear changes, including bifurcations, using several independent techniques of time series analysis. Although applied to climatological series in the present paper, the method is very general and can be used to forecast dynamics in time series of any origin.
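A loose, illustrative sketch of the idea, assuming NumPy: approximate the empirical pdf of successive windows by a low-order polynomial (a simple stand-in for the orthogonal approximation used in the paper), extrapolate each coefficient linearly in time, and evaluate the forecast density. All names and parameters below are assumptions.

```python
import numpy as np

def pdf_poly_coeffs(window, bins=30, degree=4, support=(-5, 5)):
    """Approximate the empirical pdf of one data window by a low-order polynomial."""
    edges = np.linspace(support[0], support[1], bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    density, _ = np.histogram(window, bins=edges, density=True)
    return np.polyfit(centers, density, degree)

def forecast_pdf(series, window_len=200, degree=4, support=(-5, 5)):
    """Fit the pdf polynomial in successive windows, extrapolate each coefficient
    linearly one window ahead, and return the forecast density curve."""
    starts = range(0, len(series) - window_len + 1, window_len)
    coeffs = np.array([pdf_poly_coeffs(series[s:s + window_len], degree=degree, support=support)
                       for s in starts])
    t = np.arange(len(coeffs))
    future = np.array([np.polyval(np.polyfit(t, coeffs[:, k], 1), len(coeffs))
                       for k in range(coeffs.shape[1])])
    x = np.linspace(support[0], support[1], 200)
    return x, np.clip(np.polyval(future, x), 0.0, None)

# Toy data whose variance slowly grows, so the forecast pdf should widen.
rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(0, 0.5 + 0.1 * k, size=200) for k in range(8)])
x, pdf = forecast_pdf(series)
print("total mass of the forecast pdf:", np.sum(pdf) * (x[1] - x[0]))
```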

6.
The presence of outliers often interferes with the qualitative and quantitative analysis of time-series three-dimensional fluorescence spectra. Making full use of the intrinsic characteristics of the time and spectral dimensions, an effective outlier detection method is proposed. In the time dimension, variance is used to extract the wavelength points most likely to contain outliers; based on an analysis of how outliers occur, a cumulative similarity in the spectral dimension is defined from the similarity between any two three-dimensional fluorescence spectra; finally, a correction matrix in the time dimension is used to correct the fluorescence intensity of every wavelength point of all three-dimensional fluorescence spectra and to compute the corresponding cumulative similarity, so that outliers can be identified from the cumulative similarity. The use of the time-dimension correction matrix not only improves the effectiveness of the algorithm, but its choice of characteristic regions also greatly reduces the amount of computation for spectral-dimension similarity. Numerical experiments show that outliers can still be detected effectively when only 50% of the wavelength points in the spectral dimension are selected.

7.
Dror Mirzayof. Physica A, 2010, 389(24): 5573-5580
Many natural time series exhibit long range temporal correlations that may be characterized by power-law scaling exponents. However, in many cases, the time series have uneven time intervals due to, for example, missing data points, noisy data, and outliers. Here we study the effect of randomly missing data points on the power-law scaling exponents of time series that are long range temporally correlated. The Fourier transform and detrended fluctuation analysis (DFA) techniques are used for scaling exponent estimation. We find that even under extreme dilution of more than 50%, the value of the scaling exponent remains almost unaffected. Random dilution is also applied to heart interbeat interval time series. It is found that dilution of 70%-80% of the data points leads to a reduction of only 8% in the scaling exponent; it is also found that it is possible to discriminate between healthy and heart failure subjects even under extreme dilution of more than 90%.
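A minimal sketch of the dilution experiment, assuming NumPy: generate a long-range correlated series by spectral shaping, estimate the DFA exponent, and repeat after randomly discarding half or 80% of the points. Scales, lengths and function names are illustrative.

```python
import numpy as np

def correlated_noise(n, alpha=0.8, seed=1):
    """Long-range correlated noise via spectral shaping, S(f) ~ f^-(2*alpha-1),
    so DFA should return an exponent close to alpha."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                                # avoid 1/0 at zero frequency
    amplitude = freqs ** (-(2 * alpha - 1) / 2.0)
    phases = np.exp(2j * np.pi * rng.random(len(freqs)))
    x = np.fft.irfft(amplitude * phases, n)
    return (x - x.mean()) / x.std()

def dfa_exponent(x):
    """Detrended fluctuation analysis: slope of log F(s) versus log s."""
    profile = np.cumsum(x - np.mean(x))
    scales = np.unique(np.logspace(np.log10(10), np.log10(len(x) // 4), 12).astype(int))
    flucts = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        residuals = [seg - np.polyval(np.polyfit(t, seg, 1), t) for seg in segs]
        flucts.append(np.sqrt(np.mean([np.mean(r ** 2) for r in residuals])))
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

def randomly_dilute(x, fraction, seed=0):
    """Drop a random fraction of the points and simply concatenate what remains."""
    rng = np.random.default_rng(seed)
    return x[rng.random(len(x)) > fraction]

series = correlated_noise(2 ** 15, alpha=0.8)
for frac in (0.0, 0.5, 0.8):
    print(f"dilution {frac:.0%}: DFA exponent = {dfa_exponent(randomly_dilute(series, frac)):.2f}")
```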

8.
The unique scaling behavior of financial time series has attracted the research interest of physicists. Variables such as stock returns, share volume, and number of trades have been found to display distributions that are consistent with a power-law tail. We present an overview of recent research joining practitioners of economic theory and statistical physics in trying to understand better some puzzles regarding economic fluctuations. One of these puzzles is how to describe outliers, i.e. phenomena that lie outside of patterns of statistical regularity. We review recent research which suggests that such outliers may not in fact exist and that the same laws seem to govern outliers as well as day-to-day fluctuations.

9.
We present a treatment of many-body fermionic systems that facilitates an expression of well-known quantities in a series expansion in ħ. The ensuing semiclassical result contains, to leading order of the response function, the classical time correlation function of the observable followed by the Weyl-Wigner series; on top of these terms are the periodic-orbit correction terms. The treatment given here starts from the linear-response assumption of many-body theory and, in its connection with semiclassical theory, it assumes that the one-body quantal system has classically chaotic dynamics. Applications of the framework are also discussed.

10.
We study the dynamics of the linear and non-linear serial dependencies in financial time series in a rolling window framework. In particular, we focus on the detection of episodes of statistically significant two- and three-point correlations in the returns of several leading currency exchange rates that could offer some potential for their predictability. We employ a rolling window approach in order to capture the correlation dynamics for different window lengths and analyze the distributions of periods with statistically significant correlations. We find that for sufficiently large window lengths these distributions fit well to power-law behavior. We also measure the predictability itself by a hit rate, i.e. the rate of consistency between the signs of the actual returns and their predictions, obtained from a simple correlation-based predictor. It is found that during these relatively brief periods the returns are predictable to a certain degree and the predictability depends on the selection of the window length.
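A minimal sketch of the rolling-window predictability test, assuming NumPy: in each window, check whether the lag-1 serial correlation is roughly significant, predict the sign of the next return from it, and count the hit rate. The significance rule, window length and toy returns below are illustrative choices, not the paper's exact procedure.

```python
import numpy as np

def lag1_autocorrelation(returns):
    """Plain lag-1 serial correlation of a return window."""
    r = returns - returns.mean()
    return np.sum(r[:-1] * r[1:]) / np.sum(r * r)

def rolling_hit_rate(returns, window=200, threshold=2.0):
    """In each rolling window, if |rho| * sqrt(window) exceeds a rough z-score threshold,
    predict the sign of the next return as sign(rho * last return) and score the hit."""
    hits, trials = 0, 0
    for t in range(window, len(returns) - 1):
        rho = lag1_autocorrelation(returns[t - window:t])
        if abs(rho) * np.sqrt(window) > threshold:
            prediction = np.sign(rho * returns[t - 1])
            if prediction != 0 and prediction == np.sign(returns[t]):
                hits += 1
            trials += 1
    return (hits / trials if trials else float("nan")), trials

# Toy returns with a mildly autocorrelated episode in the middle of a random record.
rng = np.random.default_rng(2)
returns = rng.normal(0, 1e-3, size=6000)
for t in range(2000, 4000):
    returns[t] += 0.25 * returns[t - 1]        # mild AR(1) episode
rate, trials = rolling_hit_rate(returns, window=200)
print(f"hit rate = {rate:.2f} over {trials} significant windows")
```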

11.
12.
The problem of preserving fidelity in the numerical computation of nonlinear ordinary differential equations is studied in terms of preserving the local differential structure and approximating the global integration structure of the dynamical system. The ordinary differential equations are lifted to the corresponding partial differential equations in the framework of algebraic dynamics, and a new algorithm, the algebraic dynamics algorithm, is proposed based on the exact analytical solutions of the ordinary differential equations obtained by the algebraic dynamics method. In the new algorithm, the time evolution of the ordinary differential system is described locally by the time translation operator and globally by the time evolution operator. The exact analytical piecewise solution of the ordinary differential equations is expressed in terms of a Taylor series with a local convergence radius, and its finite-order truncation leads to the new numerical algorithm with a controllable precision better than the Runge-Kutta algorithm and the symplectic geometric algorithm.

13.
Recently, a framework for analyzing time series by constructing an associated complex network has attracted significant research interest. One of the advantages of the complex network method for studying time series is that complex network theory provides tools to describe either important nodes or structures that exist in the networks, at different topological scales. This can then provide distinct information for time series of different dynamical systems. In this paper, we systematically investigate the recurrence-based phase space network of order k that has previously been used to specify different types of dynamics in terms of the motif ranking, from a different perspective. Globally, we find that the network size scales with different scale exponents and the degree distribution follows a quasi-symmetric bell shape around the value of 2k, with different values of degree variance from periodic to chaotic Rössler systems. Local network properties such as the vertex degree, the clustering coefficients and betweenness centrality are found to be sensitive to the local stability of the orbits and hence contain complementary information.
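A minimal sketch of a k-nearest-neighbour phase-space network, assuming NumPy and networkx; the paper's order-k recurrence network is of this general type, but the construction below is a simplified stand-in. It delay-embeds a scalar series, links each point to its k nearest neighbours, and inspects degree and clustering statistics; the logistic-map series and all parameters are illustrative.

```python
import numpy as np
import networkx as nx

def delay_embed(series, dim=3, delay=1):
    """Time-delay embedding of a scalar series into dim-dimensional phase-space points."""
    n = len(series) - (dim - 1) * delay
    return np.column_stack([series[i * delay:i * delay + n] for i in range(dim)])

def knn_phase_space_network(points, k=4):
    """Link every phase-space point to its k nearest neighbours (Euclidean distance)."""
    g = nx.Graph()
    g.add_nodes_from(range(len(points)))
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)
    for i, row in enumerate(dists):
        for j in np.argsort(row)[:k]:
            g.add_edge(i, int(j))
    return g

# Chaotic-looking scalar series from the logistic map (a stand-in for a Roessler component).
x = np.empty(1500)
x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 3.99 * x[i - 1] * (1.0 - x[i - 1])
net = knn_phase_space_network(delay_embed(x, dim=3, delay=1), k=4)
degrees = np.array([d for _, d in net.degree()])
print("mean degree:", degrees.mean(), " degree std:", degrees.std())
print("average clustering:", nx.average_clustering(net))
```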

14.
The universal character of the dynamics of various extreme phenomena is an outstanding scientific challenge. We show that X-ray flux and Dst time series during powerful solar flares and intense magnetic storms, respectively, obey a nonextensive energy distribution function for earthquake dynamics with similar values of the Tsallis entropic index q. Thus, evidence for universality in solar flares, magnetic storms and earthquakes arises naturally in the framework of Tsallis statistical mechanics. The observed similarity suggests a common approach to the interpretation of these diverse phenomena in terms of driving physical mechanisms that have the same character.
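A minimal sketch of fitting a Tsallis q-exponential to binned data, assuming NumPy and SciPy. The heavy-tailed toy sample stands in for real X-ray flux or Dst increments, and the parameterization below is one common form of the q-exponential, not necessarily the exact one used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def q_exponential(x, q, x0, a):
    """Tsallis q-exponential decay a * [1 - (1 - q) * x / x0]^(1 / (1 - q)),
    which reduces to an ordinary exponential as q -> 1."""
    base = 1.0 - (1.0 - q) * x / x0
    return a * np.power(np.clip(base, 1e-12, None), 1.0 / (1.0 - q))

# Toy "flux increment" data drawn from a heavy-tailed distribution, binned into a histogram.
rng = np.random.default_rng(3)
samples = np.abs(rng.standard_t(df=3, size=50000))     # heavy tails, stand-in for real data
counts, edges = np.histogram(samples, bins=60, range=(0, 10), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
mask = counts > 0
params, _ = curve_fit(q_exponential, centers[mask], counts[mask],
                      p0=(1.3, 1.0, 1.0), maxfev=20000)
print(f"fitted entropic index q = {params[0]:.2f}")
```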

15.
The framework of infinitely divisible scaling was first developed to analyse the statistical intermittency of turbulence in fluid dynamics. It has also proved to be a powerful tool for describing and modelling various situations including Internet traffic, financial time series, and textures. A series of recent works introduced the infinitely divisible cascades in 1 dimension, a family of multifractal processes that can be easily synthesized numerically. This work extends the definition of infinitely divisible cascades from 1 dimension to d dimensions in the scalar case. Thus, a class of models is proposed both for data analysis and for numerical simulation in dimension d≥1. In this article, we give the definitions and main properties of infinitely divisible cascades in d dimensions. Then we focus on the modelling of statistical intermittency in turbulent flows. Several other applications are considered.

16.
The Nasdaq Composite fell again on Friday the 14th of April 2000, signaling the end of a remarkable speculative high-tech bubble starting in spring 1997. The closing of the Nasdaq Composite at 3321 corresponds to a total loss of over 35% since its all-time high of 5133 on the 10th of March 2000. Similarities to the speculative bubble preceding the infamous crash of October 1929 are quite striking: the belief in what was coined a “New Economy”, both in 1929 and presently, made share prices of companies with three-digit price-earnings ratios soar. Furthermore, we show that the largest drawdowns of the Nasdaq are outliers with a confidence level better than 99% and that these two speculative bubbles, as well as others, fit nicely into the quantitative framework proposed by the authors in a series of recent papers. Received 3 May 2000

17.
The behavior displayed by a quantum system when it is perturbed by a series of von Neumann measurements along time is analyzed. Because of the similarity between this general process and giving a deck of playing cards a shuffle, here it is referred to as quantum shuffling, and we show that the quantum Zeno and anti-Zeno effects emerge naturally as two time limits. Within this framework, a connection between the gradual transition from anti-Zeno to Zeno behavior and the appearance of an underlying Markovian dynamics is found. Accordingly, although it might seem counterintuitive a priori, the quantum Zeno effect corresponds to a dynamical regime where any trace of knowledge of how the unperturbed system should evolve initially is wiped out (very rapid shuffling). This would explain why the system apparently does not evolve or decay for a relatively long time, although it eventually undergoes an exponential decay. By means of a simple working model, conditions characterizing the shuffling dynamics have been determined, which can help in understanding and devising quantum control mechanisms in a number of processes in atomic, molecular and optical physics.

18.
The classical and quantum dynamics in a high-frequency field are found to be described by an effective time-independent Hamiltonian. It is calculated in a systematic expansion in the inverse of the frequency ω, to order ω⁻⁴. The work is an extension of the classical result for the Kapitza pendulum, which was calculated in the past to order ω⁻². The analysis makes use of an implementation of the method of separation of time scales and of a quantum gauge transformation in the framework of Floquet theory. The effective time-independent Hamiltonian enables one to explore the dynamics in the presence of rapidly oscillating fields within the framework of theories that were developed for systems with time-independent Hamiltonians. The results are relevant, in particular, for exploring the dynamics of cold atoms.

19.
As an extension of the support vector machine, support vector regression (SVR) plays a significant role in image denoising. However, because it ignores the spatial distribution information of noisy pixels, the conventional SVR denoising model faces the bottleneck of overfitting under serious noise interference, which degrades the denoising effect. For this problem, this paper proposes a significance measurement framework for evaluating sample significance from sample spatial density information. Based on an analysis of the penalty factor in SVR, significance SVR (SSVR) is presented by assigning a significance factor to each sample. The refined penalty factor makes SSVR less susceptible to outliers in the solution process. This overcomes the drawback that SVR imposes the same penalty factor on all samples, which leads the objective function to pay too much attention to outliers, resulting in poorer regression results. As an example of the proposed framework applied to image denoising, a cutoff-distance-based significance factor is instantiated to estimate the samples’ importance in SSVR. Experiments conducted on three image datasets show that SSVR demonstrates excellent performance compared to best-in-class image denoising techniques in terms of a commonly used denoising evaluation index and visual observation.
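Not the paper's exact SSVR formulation, but the core idea of a per-sample penalty can be sketched with scikit-learn, whose SVR accepts per-sample weights that scale the penalty factor C: compute a cutoff-distance density for each sample, turn it into a significance weight, and compare weighted and unweighted fits on data with injected outliers. The cutoff, weighting scheme and toy data below are illustrative.

```python
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.svm import SVR

def density_significance(points, cutoff):
    """Cutoff-distance density: how many other samples lie within `cutoff` of each sample.
    Dense (inlier-like) samples get weights near 1, isolated samples near a small floor."""
    d = cdist(points, points)
    density = (d < cutoff).sum(axis=1) - 1          # exclude the sample itself
    return 0.1 + 0.9 * density / density.max()

# Toy 1-D regression with a few gross outliers injected into a noisy sine wave.
rng = np.random.default_rng(5)
X = np.sort(rng.uniform(0, 6, size=200)).reshape(-1, 1)
y = np.sin(X).ravel() + 0.05 * rng.normal(size=len(X))
outliers = rng.choice(len(X), size=10, replace=False)
y[outliers] += rng.choice([-2.0, 2.0], size=10)

# Per-sample weights scale the penalty C inside sklearn's SVR, so low-significance
# (isolated) samples are penalized less strongly when they violate the epsilon tube.
weights = density_significance(np.column_stack([X.ravel(), y]), cutoff=0.5)
plain = SVR(C=10.0, epsilon=0.05).fit(X, y)
weighted = SVR(C=10.0, epsilon=0.05).fit(X, y, sample_weight=weights)
grid = np.linspace(0, 6, 300).reshape(-1, 1)
truth = np.sin(grid).ravel()
for name, model in (("plain SVR", plain), ("weighted SVR", weighted)):
    rmse = np.sqrt(np.mean((model.predict(grid) - truth) ** 2))
    print(f"{name}: RMSE vs clean signal = {rmse:.3f}")
```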

20.
Time series classification (TSC) is a significant problem in data mining with several applications in different domains. Mining distinguishing features is the primary approach. One promising direction is algorithms based on the morphological structure of time series, which are interpretable and accurate. However, existing structural feature-based algorithms, such as time series forest (TSF) and shapelets, traverse features through many random combinations, which means that a lot of training time and computing resources are required to filter out meaningless features, and important distinguishing information may be ignored. To overcome this problem, in this paper we propose a perceptual-features-based framework for TSC. We are inspired by how humans observe time series and realize that there are usually only a few essential points that need to be remembered for a time series. Although a complex time series has a lot of detail, a small number of data points is enough to describe the shape of the entire sample. First, we use improved perceptually important points (PIPs) to extract key points and use them as the basis for time series segmentation to obtain a combination of interval-level and point-level features. Secondly, we propose a framework to explore the effects of perceptual structural features combined with decision trees (DT), random forests (RF), and gradient boosting decision trees (GBDT) on TSC. The experimental results on the UCR datasets show that our work achieves leading accuracy, which is instructive for follow-up research.
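A minimal sketch of perceptually important point extraction, assuming NumPy: start from the two endpoints and repeatedly add the point with the largest vertical distance from the chord joining its neighbouring PIPs. This uses one common PIP distance measure; the improved variant in the paper may differ.

```python
import numpy as np

def perceptually_important_points(series, n_points=10):
    """Iteratively pick the point with the largest vertical distance from the straight
    line joining its two neighbouring PIPs, starting from the two endpoints."""
    series = np.asarray(series, dtype=float)
    pips = [0, len(series) - 1]
    while len(pips) < n_points:
        best_idx, best_dist = None, -1.0
        ordered = sorted(pips)
        for left, right in zip(ordered[:-1], ordered[1:]):
            if right - left < 2:
                continue
            xs = np.arange(left + 1, right)
            # Vertical distance from the chord between the two existing PIPs.
            chord = series[left] + (series[right] - series[left]) * (xs - left) / (right - left)
            dists = np.abs(series[xs] - chord)
            k = int(np.argmax(dists))
            if dists[k] > best_dist:
                best_dist, best_idx = dists[k], xs[k]
        if best_idx is None:
            break
        pips.append(int(best_idx))
    return np.array(sorted(pips))

# Toy series: two superposed oscillations plus noise; a handful of PIPs captures the shape.
rng = np.random.default_rng(6)
t = np.linspace(0, 4 * np.pi, 400)
series = np.sin(t) + 0.5 * np.sin(3 * t) + 0.05 * rng.normal(size=t.size)
print("PIP indices:", perceptually_important_points(series, n_points=8))
```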
