Similar Documents
20 similar documents found (search time: 31 ms)
1.
We propose two robust data-driven techniques for detecting network structure change points between heavy-tailed multivariate time series when both the placement and number of change points are unknown. The first technique estimates the change points with the graphical lasso method, whereas the second uses the tlasso method. The techniques not only locate the change points but also estimate an undirected graph (or precision matrix) representing the relationship between the time series within each interval created by a pair of adjacent change points. An inference procedure on the edges of the graphs effectively removes false-positive edges, which are caused by the data deviating from normality. The techniques are compared on simulated multivariate t-distributed (heavy-tailed) time series, and the better method is applied to two financial returns data sets of stocks and indices. The results illustrate the method's ability to determine how the dependence structure of the returns changes over time; this information could potentially be used as a tool for portfolio optimization.
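As a rough sketch of this kind of likelihood-based change point search, the code below locates a single change point in a bivariate toy series by scanning split points and scoring each segment with a Gaussian log-likelihood. A ridge-regularized precision estimate stands in for the graphical lasso/tlasso fits, and all names and settings are illustrative, not the authors' implementation:

```python
import numpy as np

def precision_loglik(X, ridge=0.1):
    """Gaussian log-likelihood of a segment under a ridge-regularized
    precision estimate (a simple stand-in for the graphical lasso)."""
    S = np.cov(X, rowvar=False) + ridge * np.eye(X.shape[1])
    Theta = np.linalg.inv(S)
    sign, logdet = np.linalg.slogdet(Theta)
    return X.shape[0] * (logdet - np.trace(S @ Theta))

def single_changepoint(X, min_seg=20):
    """Scan admissible split points; return the one maximizing the
    sum of per-segment log-likelihoods."""
    n = X.shape[0]
    scores = {t: precision_loglik(X[:t]) + precision_loglik(X[t:])
              for t in range(min_seg, n - min_seg)}
    return max(scores, key=scores.get)

rng = np.random.default_rng(0)
# Two regimes with different correlation structure, change at t = 100.
A = rng.multivariate_normal([0, 0], [[1, .8], [.8, 1]], 100)
B = rng.multivariate_normal([0, 0], [[1, -.8], [-.8, 1]], 100)
t_hat = single_changepoint(np.vstack([A, B]))
```

Multiple change points would require a recursive or dynamic-programming extension of the same scoring idea.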

2.
The grey model GM(1,1) has been widely used for short-term prediction of energy production and consumption because of its advantages on data sets with small numbers of samples. However, the existing GM(1,1) modelling method can only forecast the general trend of a time series and fails to identify and predict seasonal fluctuations. In this research, the authors propose a data-grouping-based grey modelling method, DGGM(1,1), to predict quarterly hydropower production in China. First, the method divides an entire quarterly time series into four groups, each containing only the data for one quarter. Then a GM(1,1) model is established on each of the four quarterly series, so that each model captures the seasonal characteristics of its own quarter. Finally, the prediction results of the four GM(1,1) models are combined in chronological order into a complete quarterly time series that reflects the seasonal differences. The mean absolute percentage errors (MAPEs) on the test set 2011Q1–2015Q4 of the DGGM(1,1), traditional GM(1,1), and SARIMA models are 16.2%, 22.1%, and 22.2%, respectively, indicating that DGGM(1,1) has better adaptability and higher prediction accuracy. China's hydropower production from 2016 to 2020 is predicted to maintain its seasonal growth, with the third and first quarters showing the highest and lowest production, respectively.
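The grouping idea is easy to sketch. Below is a minimal, illustrative GM(1,1) and the DGGM(1,1)-style grouping applied to a toy quarterly series; the function names and toy data are assumptions, not the paper's code:

```python
import numpy as np

def gm11_forecast(x0, steps):
    """Classic GM(1,1): fit on the accumulated series, then forecast
    `steps` future values of the original series."""
    x1 = np.cumsum(x0)
    z = 0.5 * (x1[:-1] + x1[1:])                # background values
    B = np.column_stack([-z, np.ones_like(z)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    def x1_hat(k):                              # fitted accumulated value
        return (x0[0] - b / a) * np.exp(-a * k) + b / a
    n = len(x0)
    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)]

def dggm_forecast(quarterly, years_ahead):
    """DGGM(1,1)-style grouping: split the quarterly series into four
    per-quarter subseries, fit GM(1,1) to each, interleave forecasts."""
    groups = [np.asarray(quarterly[q::4], dtype=float) for q in range(4)]
    preds = [gm11_forecast(g, years_ahead) for g in groups]
    return [preds[q][y] for y in range(years_ahead) for q in range(4)]

# Toy seasonal series: mild growth with a strong Q3 peak, 3 years.
series = [10, 12, 20, 11, 11, 13, 22, 12, 12, 14, 24, 13]
fc = dggm_forecast(series, 1)
```

Interleaving the four per-quarter forecasts restores the chronological quarterly order, which is the step that reintroduces the seasonal pattern.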

3.
The autocorrelation function of seasonal time series data is shown to have peaks at the correlation lags equal to integer multiples of the fundamental period present in the series. This property remains valid even if some of the harmonics, including the fundamental, are removed from the data. Using this property, an analytical procedure is presented for estimating the variance of the white noise generating the low-frequency random walk model present in the data. The procedure is then extended to estimate the variance of the white noise generating the autoregressive (AR) and moving average (MA) noise models. The method is validated on several seasonal time series whose components are known a priori.
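The peak property is easy to verify numerically. The sketch below (illustrative code, not the paper's procedure) computes the sample autocorrelation of a noisy period-12 series and checks that the local maxima fall at multiples of the period:

```python
import numpy as np

def acf(x, max_lag):
    """Sample autocorrelation function up to max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    var = np.dot(x, x)
    return np.array([np.dot(x[:-k or None], x[k:]) / var
                     for k in range(max_lag + 1)])

rng = np.random.default_rng(1)
t = np.arange(600)
# Seasonal series with fundamental period 12 plus white noise.
x = np.sin(2 * np.pi * t / 12) + 0.3 * rng.standard_normal(600)
r = acf(x, 40)
# Peaks of the ACF sit at integer multiples of the period.
peaks = [k for k in range(1, 40) if r[k] > r[k - 1] and r[k] > r[k + 1]]
```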

4.
This paper proposes a new method for building mathematical models of seasonal data: the cross-sectional method. The idea is to divide a seasonal time series into a number of subseries equal to the length of one seasonal period, within which the seasonal factor is completely eliminated. Mathematical models of possibly different forms are then built for each subseries separately, and these subseries models are finally combined to obtain a mathematical model of the original series.

5.
Multicommodity flows belong to the class of primal block-angular problems. An efficient interior-point method has already been developed for linear and quadratic network optimization problems; it solves the normal equations using sparse Cholesky factorizations for the diagonal blocks and a preconditioned conjugate gradient for the linking constraints. In this work we extend this procedure, showing that the preconditioner initially developed for multicommodity flows applies to any primal block-angular problem, although its efficiency depends on the particular structure of the linking constraints. We discuss the conditions under which the preconditioner is effective. The procedure is implemented in a user-friendly package in the MATLAB environment. Computational results are reported for four primal block-angular problems: multicommodity flows, nonoriented multicommodity flows, minimum-distance controlled tabular adjustment for statistical data protection, and the minimum congestion problem. The results show that this procedure holds great potential for solving large primal block-angular problems efficiently.

6.
An approximately periodic time series has an approximately periodic trend, i.e., approximate periodicity: the series looks periodic, but the length of each period is not constant; the sunspot-number series is an example. Approximately periodic time series have broad application prospects in modelling social and economic phenomena. For such a series, the key is to characterize its approximately periodic trend, because once that trend has been characterized the series can be treated as an ordinary time series. However, there has been little research on characterizing approximately periodic trends. This paper first establishes some necessary theory; in particular, it introduces the concept of shape-preserving transformations with length compression and obtains a necessary and sufficient condition for a linear shape-preserving transformation with length compression. Based on this theory, a method for estimating the scale transformation is then proposed, which estimates the approximately periodic trend well. Finally, a simulated example is analysed; the results show that the proposed method is highly effective.

7.
The development of sensor networks has enabled detailed tracking of customer behavior in stores. Shopping path data, which records each customer's position and time information, is attracting attention as a new kind of marketing data. However, no marketing models have been proposed that can identify good customers from the huge amounts of time series data on customer movement in the store. This research uses shopping path data obtained by tracking customer behavior in the store, with information on the sequence of product zones visited and the staying time at each zone, to find how these affect purchasing. To discover useful knowledge for store management, the shopping path data was transformed into sequence data containing visit-sequence and staying-time information, and LCMseq was applied to extract frequent sequence patterns. In this paper, we find characteristic in-store behavior patterns of good customers using actual data from a Japanese supermarket.

8.
In trying to distinguish data features within specific time intervals of time series data, time series segmentation is often required. This research divides time series data into segments of varying lengths. A time series segmentation algorithm based on Ant Colony Optimization (ACO) is proposed to exhibit the changeability of the time series data. To verify the effect of the proposed algorithm, we compare it with the Bottom-Up method, which has been reported in the literature to give good results for time series segmentation. Both simulated data and real stock price data are used in the experiments. The results show that time series segmentation by the ACO algorithm not only identifies the number of segments automatically, but also achieves a lower segmentation cost than the Bottom-Up method. More importantly, the ACO algorithm also incurs less data loss than the Bottom-Up method.
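For reference, the Bottom-Up baseline used in the comparison can be sketched as follows, assuming a squared-error cost for a least-squares line fit per segment (a common choice; the paper's exact cost function may differ):

```python
import numpy as np

def fit_cost(x):
    """Squared residual of the best least-squares line through segment x."""
    t = np.arange(len(x))
    coef = np.polyfit(t, x, 1)
    return float(np.sum((np.polyval(coef, t) - x) ** 2))

def bottom_up(x, n_segments):
    """Classic Bottom-Up segmentation: start from tiny segments and
    repeatedly merge the adjacent pair whose merged fit is cheapest."""
    bounds = list(range(0, len(x), 2)) + [len(x)]   # fine initial cut points
    while len(bounds) - 1 > n_segments:
        costs = [fit_cost(x[bounds[i]:bounds[i + 2]])
                 for i in range(len(bounds) - 2)]
        del bounds[int(np.argmin(costs)) + 1]       # merge cheapest pair
    return bounds

# Piecewise-linear toy signal: flat, rising, flat.
x = np.concatenate([np.zeros(30), np.linspace(0, 10, 30), np.full(30, 10.0)])
b = bottom_up(x, 3)
```

The returned cut points bracket the three linear pieces; the ACO approach described above additionally chooses the number of segments itself.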

9.
Suppose that for a given time series the experimenter knows that it has a certain periodic property and wishes to find the length of the period. A nonparametric procedure is proposed for this problem. It consists of a new smoothing technique based on Kendall's Tau and a specific counting method. The procedure is studied under a simple model of periodic time series composed of periodic (deterministic) functions, a linear trend, and exchangeable (stochastic) sequences. The performance of the procedure is illustrated by a simple example.
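A hypothetical rank-based variant of this idea can be sketched as follows: for each candidate period, measure the concordance between the series and its lagged copy with Kendall's Tau, and pick the lag with the highest value. This only illustrates the rank-correlation ingredient, not the paper's smoothing-and-counting procedure:

```python
import numpy as np
from itertools import combinations

def kendall_tau(x, y):
    """Plain O(n^2) Kendall rank correlation."""
    s = sum(np.sign((x[i] - x[j]) * (y[i] - y[j]))
            for i, j in combinations(range(len(x)), 2))
    n = len(x)
    return s / (n * (n - 1) / 2)

def estimate_period(x, max_period):
    """Illustrative period finder: choose the lag whose lagged copy is
    most concordant with the series under Kendall's Tau."""
    taus = {p: kendall_tau(x[:-p], x[p:]) for p in range(2, max_period + 1)}
    return max(taus, key=taus.get)

rng = np.random.default_rng(2)
t = np.arange(120)
x = np.sin(2 * np.pi * t / 10) + 0.2 * rng.standard_normal(120)
p_hat = estimate_period(x, 15)
```

Because it uses only ranks, the statistic is insensitive to monotone distortions of the data, which is the attraction of a Tau-based procedure.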

10.
If isoparametric coordinates are used to deal with curved boundaries in the finite element method, the original boundary is implicitly replaced by a series of parabolic or cubic arcs. The equations of these arcs involve parameters which are the coordinates of points on the curved side, and a simple procedure is outlined for choosing these parameters in such a way that each arc is a parabola which passes through four points of the original curve, thus ensuring a good approximation to it.

11.
The two-dimensional time-harmonic problem of diffraction through a slit is considered for arbitrary Dirichlet data prescribed in the slit; on the screen itself the wavefield satisfies a homogeneous Neumann boundary condition. First, a sequence of special Dirichlet data is constructed; for each of these data the resulting wavefield can be expressed in closed form. The sequence can be constructed in a way that yields an orthonormal basis of Dirichlet data. Arbitrary Dirichlet data can then be expanded into a series of the orthonormal basis functions, which yields a representation of the sought wavefield. The presented method gives deep insight into the mathematical structure of the problem.

12.
A land classification method was designed for the Community of Madrid (CM), which has lands suitable for either agricultural use or natural spaces. The process started from an extensive previous CM study containing sets of land attributes with data for 122 land types, together with a minimum-requirements method providing a land quality classification (SQ) for each land. Borrowing tools from Operations Research (OR) and Decision Science, the SQ has been complemented in two ways: by an additive valuation method that analyses a more restricted set of 13 representative attributes using attribute valuation functions to obtain a quality index, QI; and by an original composite method that uses a fuzzy set procedure to obtain a combined quality index, CQI, containing the relevant information from both the SQ and QI methods.

13.
When the sample points are discrete, a lower confidence limit on reliability can be determined by ordering the sample points. Many orderings exist, such as the L-P ordering, sequential ordering, maximum-likelihood-estimate ordering, and modified L-P ordering. This paper proposes a new ordering method with intuitive plausibility and uses it to compute the classical exact optimal lower confidence limit on the reliability of a series system of components with exponential lifetimes.

14.
The shifting bottleneck (SB) heuristic is among the most successful approximation methods for solving the job shop problem. It is essentially a machine-based decomposition procedure in which a series of one-machine sequencing problems (OMSPs) is solved. However, such a procedure has been reported to be highly ineffective for flow shop problems. In particular, we show that for the 2-machine flow shop problem, the SB heuristic delivers the optimal solution in only a small number of instances. We examine the reason behind the failure of the machine-based decomposition method for the flow shop. An optimal machine-based decomposition procedure is formulated for the 2-machine flow shop, the time complexity of which is worse than that of the celebrated Johnson's rule. The contribution of the present study lies in showing that the same machine-based decomposition procedures which are so successful in solving complex job shops can also be suitably modified to optimally solve the simpler flow shops.
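For context, Johnson's rule, the benchmark mentioned above, solves the 2-machine flow shop to optimality in O(n log n) time. A minimal sketch:

```python
def johnsons_rule(jobs):
    """Johnson's rule: jobs with p1 <= p2 first in ascending p1,
    then the rest in descending p2; this minimizes the makespan
    on two machines."""
    first = sorted((j for j in jobs if j[0] <= j[1]), key=lambda j: j[0])
    last = sorted((j for j in jobs if j[0] > j[1]), key=lambda j: -j[1])
    return first + last

def makespan(seq):
    """Completion time of the last job on machine 2."""
    t1 = t2 = 0
    for p1, p2 in seq:
        t1 += p1                    # machine 1 processes back to back
        t2 = max(t2, t1) + p2       # machine 2 waits for machine 1
    return t2

# Jobs as (machine-1 time, machine-2 time) pairs.
jobs = [(3, 6), (5, 2), (1, 2), (6, 6), (7, 5)]
seq = johnsons_rule(jobs)
```

Here the makespan of the returned sequence meets the lower bound (sum of machine-1 times plus the final machine-2 time), so it is optimal for this instance.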

15.
We consider the problem of estimating the optimal steady effort level from a time series of catch and effort data, taking account of errors in the observation of the “effective effort” as well as randomness in the stock-production function. The “total least squares” method ignores the time series nature of the data, while the “approximate likelihood” method takes it into account. We compare estimation schemes based upon these two methods by applying them to artificial data for which the “correct” parameters are known. We use a similar procedure to compare the effectiveness of a “power model” for stock and production with the “Ricker model.” We apply these estimation methods to some sets of real data, and obtain an interval estimate of the optimal effort.
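The idea of validating estimators on artificial data with known parameters can be sketched with the Ricker model. Here the model is linearized as log(R/S) = a - b*S and fitted by ordinary least squares; this is an illustration only, and the paper's "approximate likelihood" scheme is more involved:

```python
import numpy as np

rng = np.random.default_rng(3)
# Artificial stock (S) and production (R) from a Ricker model
# R = S * exp(a - b*S) with lognormal process error.
a_true, b_true = 1.2, 0.002
S = rng.uniform(100, 900, 60)
R = S * np.exp(a_true - b_true * S + 0.1 * rng.standard_normal(60))

# Linearize: log(R/S) = a - b*S, estimate (a, b) by least squares.
y = np.log(R / S)
X = np.column_stack([np.ones_like(S), -S])
a_hat, b_hat = np.linalg.lstsq(X, y, rcond=None)[0]
```

Because the generating parameters are known, the recovered (a_hat, b_hat) can be compared with the truth, exactly the kind of check the abstract describes.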

16.
Likelihood-Based Inference for Extreme Value Models
Estimation of the extremal behavior of a process is often based on the fitting of asymptotic extreme value models to relatively short series of data. Maximum likelihood has emerged as a flexible and powerful modeling tool in such applications, but its performance with small samples has been shown to be poor relative to an alternative fitting procedure based on probability weighted moments. We argue here that the small-sample superiority of the probability weighted moments estimator is due to the assumption of a restricted parameter space, corresponding to finite population moments. To incorporate similar information in a likelihood-based analysis, we propose a penalized maximum likelihood estimator that retains the modeling flexibility and large-sample optimality of the maximum likelihood estimator, but improves on its small-sample properties. The properties of the penalized likelihood estimator are verified in a simulation study, and in application to sea-level data, which also enables the procedure to be evaluated in the context of structural models for extremes.
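A minimal sketch of a penalized fit, assuming a simple quadratic penalty on the GEV shape parameter in place of the paper's penalty function (function names, the penalty, and all settings are illustrative):

```python
import numpy as np
from scipy.optimize import minimize

def gev_nll(params, x):
    """Negative log-likelihood of the GEV (shape xi != 0 branch)."""
    mu, log_sigma, xi = params
    if abs(xi) < 1e-6:                       # keep clear of the Gumbel limit
        return np.inf
    sigma = np.exp(log_sigma)
    z = 1 + xi * (x - mu) / sigma
    if np.any(z <= 0):                       # outside the support
        return np.inf
    return (len(x) * log_sigma + (1 + 1 / xi) * np.log(z).sum()
            + (z ** (-1 / xi)).sum())

def penalized_fit(x, lam=5.0):
    """Penalized MLE: a quadratic penalty pulls the shape toward zero,
    a stand-in for the restricted-parameter-space information."""
    obj = lambda p: gev_nll(p, x) + lam * p[2] ** 2
    res = minimize(obj, x0=[np.mean(x), np.log(np.std(x)), 0.1],
                   method="Nelder-Mead")
    return res.x

# Simulate GEV(mu=0.5, sigma=1, xi=0.2) by inverse-CDF sampling.
rng = np.random.default_rng(4)
u = rng.uniform(size=300)
x = 0.5 + ((-np.log(u)) ** (-0.2) - 1) / 0.2
mu_hat, log_sigma_hat, xi_hat = penalized_fit(x)
```

The penalty barely moves the estimate when the sample strongly supports a nonzero shape, which mirrors the abstract's point that large-sample behavior of the MLE is retained.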

17.
This paper presents a method for obtaining closed form solutions to serial and nonserial dynamic programming problems with quadratic stage returns and linear transitions. Global parametric optimum solutions can be obtained regardless of the convexity of the stage returns. The closed form solutions are developed for linear, convex, and nonconvex quadratic returns, as well as the procedure for recursively solving each stage of the problem. Dynamic programming is a mathematical optimization technique which is especially powerful for certain types of problems. This paper presents a procedure for obtaining analytical solutions to a class of dynamic programming problems. In addition, the procedure has been programmed on the computer to facilitate the solution of large problems.

18.
A commonly used method of monitoring the condition of rail track is to run an inspection vehicle over the track at intervals of about 3 months. Measurements of several geometric properties of the track are automatically recorded about every 900 mm, resulting in long sequences of data (signals) arising from runs of up to 100 km. Condition monitoring is done by comparing the results of a current run with those of a previously recorded reference run. Before this can be done, the two signals need to be aligned so that corresponding distance measurements in each signal actually refer to the same point on the track. A procedure for matching the two signals is presented, which has at its heart a dynamic programming method. The procedure is demonstrated on data from rail tracks in Australia.
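The alignment step can be sketched with a standard dynamic-programming (DTW-style) matcher; this is a generic illustration, not the authors' procedure:

```python
import numpy as np

def align(ref, cur):
    """Dynamic-programming alignment of a current run against a
    reference run; returns the matched index pairs."""
    n, m = len(ref), len(cur)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (ref[i - 1] - cur[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Backtrack from the end to recover the matched index pairs.
    path, i, j = [], n, m
    while (i, j) != (1, 1):
        path.append((i - 1, j - 1))
        i, j = min([(i - 1, j), (i, j - 1), (i - 1, j - 1)],
                   key=lambda ij: D[ij])
    path.append((0, 0))
    return path[::-1]

ref = np.sin(np.linspace(0, 6, 60))
cur = np.sin(np.linspace(0.3, 6.3, 60))    # same track, offset start
path = align(ref, cur)
```

On real 100 km signals this quadratic-time recursion would be run on windows or with band constraints, but the alignment idea is the same.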

19.
One useful approach for fitting linear models with scalar outcomes and functional predictors involves transforming the functional data to wavelet domain and converting the data-fitting problem to a variable selection problem. Applying the LASSO procedure in this situation has been shown to be efficient and powerful. In this article, we explore two potential directions for improvements to this method: techniques for prescreening and methods for weighting the LASSO-type penalty. We consider several strategies for each of these directions which have never been investigated, either numerically or theoretically, in a functional linear regression context. We compare the finite-sample performance of the proposed methods through both simulations and real-data applications with both 1D signals and 2D image predictors. We also discuss asymptotic aspects. We show that applying these procedures can lead to improved estimation and prediction as well as better stability. Supplementary materials for this article are available online.
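The basic wavelet-plus-LASSO pipeline can be sketched with an orthonormal Haar transform and a plain ISTA solver (illustrative stand-ins; the article's prescreening and penalty-weighting schemes are not reproduced here):

```python
import numpy as np

def haar(x):
    """Full Haar wavelet transform of a length-2^k signal (orthonormal)."""
    x = np.asarray(x, dtype=float).copy()
    out, n = [], len(x)
    while n > 1:
        a = (x[0:n:2] + x[1:n:2]) / np.sqrt(2)   # approximation coefficients
        d = (x[0:n:2] - x[1:n:2]) / np.sqrt(2)   # detail coefficients
        out.append(d)
        x[: n // 2] = a
        n //= 2
    out.append(x[:1].copy())
    return np.concatenate(out[::-1])

def lasso_ista(X, y, lam, n_iter=1000):
    """LASSO via iterative soft-thresholding (ISTA)."""
    L = np.linalg.norm(X, 2) ** 2                # Lipschitz constant
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        g = b - X.T @ (X @ b - y) / L            # gradient step
        b = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)
    return b

rng = np.random.default_rng(5)
n, p = 100, 64
curves = rng.standard_normal((n, p))             # functional predictors
W = np.array([haar(c) for c in curves])          # wavelet-domain design
beta = np.zeros(p)
beta[3], beta[10] = 3.0, -2.5                    # sparse truth in wavelet domain
y = W @ beta + 0.3 * rng.standard_normal(n)
b_hat = lasso_ista(W, y, lam=80.0)
```

Because the Haar transform is orthonormal, sparsity in the wavelet domain translates directly into a variable selection problem, which is the conversion the abstract refers to.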

20.
Recently, a Bayesian network model for inferring non-stationary regulatory processes from gene expression time series has been proposed. The Bayesian Gaussian Mixture (BGM) Bayesian network model divides the data into disjoint compartments (data subsets) by a free allocation model and infers network structures that are kept fixed across all compartments. Fixing the network structure allows some information sharing among compartments, while each compartment is modelled separately and independently with the Gaussian BGe scoring metric for Bayesian networks. The BGM model can be applied equally to static (steady-state) and dynamic (time series) gene expression data. However, it is this flexibility that renders its application to time series data suboptimal. To improve the performance of the BGM model on time series data, we propose a revised approach in which the free allocation of data points is replaced by a changepoint process, so as to take the temporal structure into account. The practical inference follows the Bayesian paradigm and approximately samples the network, the number of compartments, and the changepoint locations from the posterior distribution with Markov chain Monte Carlo (MCMC). Our empirical results show that the proposed modification leads to a more efficient inference tool for analysing gene expression time series.
