Similar Articles

20 similar articles found (search time: 31 ms)
1.
Abstract

Modeling of space-time functions can be done using observations in the form of averages of the function over a set of irregularly shaped regions in space-time. Such observations are most common in applications where data are gathered for administrative, political, geographic, or agricultural regions. The value of such functions can be predicted by first estimating the dependence structure of the underlying stochastic process. Our proposed method for estimating the covariance function from the integrals of a stationary isotropic stochastic process poses the problem as a set of integral equations. To test this proposal we applied it to epidemiological data on the incidence rates of three diseases in the United States between 1980 and 1994. The spatial correlations obtained in this way reasonably described the mechanism by which those diseases spread. We therefore conclude that it is possible to reliably estimate covariance functions from aggregate observations, and that such estimates provide valuable insight into the nature of the space-time process.
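A toy illustration of the idea of posing aggregate (block-average) covariances as integral equations, reduced to one dimension with an assumed exponential covariance and a grid search standing in for the paper's actual solver (all parameter values and block intervals here are made up):

```python
import math

def cov(h, sigma2, phi):
    # assumed stationary isotropic exponential covariance
    return sigma2 * math.exp(-h / phi)

def block_cov(block_a, block_b, sigma2, phi, n=20):
    # covariance between averages of the process over two 1-D intervals,
    # approximated by midpoint quadrature of the double integral
    (a, b), (c, d) = block_a, block_b
    s = [a + (b - a) * (i + 0.5) / n for i in range(n)]
    t = [c + (d - c) * (j + 0.5) / n for j in range(n)]
    return sum(cov(abs(u - v), sigma2, phi) for u in s for v in t) / (n * n)

# synthetic "observed" aggregate covariances generated with a known range phi = 2.0
true_phi, sigma2 = 2.0, 1.0
blocks = [(0.0, 1.0), (1.0, 2.0), (3.0, 4.0), (6.0, 7.0)]
obs = {(i, j): block_cov(blocks[i], blocks[j], sigma2, true_phi)
       for i in range(len(blocks)) for j in range(i, len(blocks))}

# "solve" the integral equations by least squares over a grid of candidate ranges
def misfit(phi):
    return sum((obs[i, j] - block_cov(blocks[i], blocks[j], sigma2, phi)) ** 2
               for (i, j) in obs)

best_phi = min((0.5 + 0.1 * m for m in range(40)), key=misfit)
```

Because the synthetic aggregates were generated from the same parametric family, the grid search recovers the true range parameter, which is the essence of the consistency claim in the abstract.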

2.

We propose a method for obtaining the maximum likelihood estimators of the parameters of the Markov-Modulated Diffusion Risk Model, in which the inter-claim times, the claim sizes, and the volatility diffusion process are influenced by an underlying Markov jump process. We consider two observation scenarios: first, observing only the inter-claim times and the claim sizes over a time interval; and second, observing the number of claims and the underlying Markov jump process at discrete times. In both cases the data can be viewed as incomplete observations of a model with a tractable likelihood function, so we propose stochastic Expectation-Maximization (EM) algorithms for the statistical inference. For the second scenario, we present a simulation study to estimate the ruin probability. Moreover, we apply the Markov-Modulated Diffusion Risk Model to fit a real motor insurance dataset.


3.
We propose a formal test of separability of covariance models based on a likelihood ratio statistic. The test is developed in the context of multivariate repeated measures (for example, several variables measured at multiple times on many subjects), but can also apply to a replicated spatio-temporal process and to problems in meteorology, where horizontal and vertical covariances are often assumed to be separable. Separable models are a common way to model spatio-temporal covariances because of the computational benefits resulting from the joint space-time covariance being factored into the product of a covariance function that depends only on space and a covariance function that depends only on time. We show that when the null hypothesis of separability holds, the distribution of the test statistic does not depend on the type of separable model. Thus, it is possible to develop reference distributions of the test statistic under the null hypothesis. These distributions are used to evaluate the power of the test for certain nonseparable models. The test does not require second-order stationarity, isotropy, or specification of a covariance model. We apply the test to a multivariate repeated measures problem.
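A minimal numeric sketch of such a likelihood ratio statistic for a Kronecker (separable) covariance, using the classical "flip-flop" alternating MLE; the dimensions, sample size, and covariance matrices are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
p, q, N = 2, 3, 200  # p variables, q time points, N replicates (toy sizes)

# simulate matrix-variate normal data whose covariance separates as kron(U, V)
U_true = np.array([[1.0, 0.3], [0.3, 1.0]])
V_true = np.array([[1.0, 0.5, 0.2], [0.5, 1.0, 0.5], [0.2, 0.5, 1.0]])
L_u, L_v = np.linalg.cholesky(U_true), np.linalg.cholesky(V_true)
X = np.array([L_u @ rng.standard_normal((p, q)) @ L_v.T for _ in range(N)])

# unstructured MLE of the pq x pq covariance of the row-major vec of X
vecs = X.reshape(N, p * q)
S = vecs.T @ vecs / N

# flip-flop MLE under the separability null: alternate closed-form updates
U, V = np.eye(p), np.eye(q)
for _ in range(50):
    U = sum(x @ np.linalg.inv(V) @ x.T for x in X) / (N * q)
    V = sum(x.T @ np.linalg.inv(U) @ x for x in X) / (N * p)
Sigma_sep = np.kron(U, V)  # row-major vec pairs with kron(U, V)

# -2 log likelihood ratio of separable vs unstructured covariance
_, logdet_sep = np.linalg.slogdet(Sigma_sep)
_, logdet_uns = np.linalg.slogdet(S)
stat = N * (logdet_sep + np.trace(np.linalg.solve(Sigma_sep, S))
            - logdet_uns - p * q)
```

Since the separable family is a submodel of the unstructured one, the statistic is nonnegative; under the null it should be of the order of its degrees of freedom, which is the kind of reference behavior the abstract exploits.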

4.
Estimating the probability of extreme temperature events is difficult because of limited records across time and the need to extrapolate the distributions of these events, as opposed to just the mean, to locations where observations are not available. A related issue is the need to characterize the uncertainty in the estimated probability of extreme events at different locations. Although the tools for statistical modeling of univariate extremes are well developed, extending these tools to model spatial extreme data is an active area of research. In this paper, in order to make inference about spatial extreme events, we introduce a new nonparametric model for extremes: a Dirichlet-based copula model that is a flexible alternative to parametric copula models such as the normal and t-copula. The proposed modeling approach is fitted within a Bayesian framework that allows us to take into account different sources of uncertainty in the data and models. We apply our methods to annual maximum temperature values in the east-south-central United States.

5.
Stochastic earthquake models are often based on a marked point process approach, as presented for instance in Vere-Jones (Int. J. Forecast., 11:503–538, 1995). This gives a fine resolution in both space and time, making it possible to represent each earthquake. However, it is not obvious that this approach is advantageous when aiming at earthquake predictions. In the present paper we take a coarser point of view, considering grid cells of 0.5 × 0.5°, or about 50 × 50 km, and time periods of 4 months, which seems suitable for predictions. More specifically, we discuss different alternatives of a Bayesian hierarchical space-time model in the spirit of Wikle et al. (Environ. Ecol. Stat., 5:117–154, 1998). For each time period the observations are the magnitudes of the largest observed earthquake within each grid cell. As data we use part of an earthquake catalogue provided by the Northern California Earthquake Data Center, restricted to the area 32–37° N, 115–120° W and the time period January 1981 through December 1999, which contains the Landers and Hector Mine earthquakes of magnitudes 7.3 and 7.1, respectively, on the Richter scale. Based on the space-time model alternatives, one-step earthquake predictions for all grid cells are derived for the time periods containing these two events. The model alternatives are implemented within an MCMC framework in Matlab. The alternative giving the overall best predictions under a standard loss provides new insight into the spatial and temporal dependencies between earthquakes. Under a specially designed loss based on spatial averages of the 90th percentiles of each cell's predictive distribution, it is also clear that the best model predicts the high-risk areas rather well. We believe these percentiles offer a valuable tool for identifying high- and low-risk areas in a region in short-term prediction.
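The data-preparation step described above (largest magnitude per space-time grid cell) can be sketched as follows; this is not the paper's Bayesian model, and the catalog entries, origin, and period length are made up for illustration:

```python
from collections import defaultdict

def grid_max_magnitudes(catalog, lat0, lon0, cell=0.5, period_days=122):
    # catalog: list of (day, lat, lon, magnitude)
    # returns {(period_index, lat_cell, lon_cell): largest magnitude observed
    #          in that space-time cell}, with 0.5-degree cells and
    #          roughly 4-month (122-day) periods
    cells = defaultdict(float)
    for day, lat, lon, mag in catalog:
        key = (day // period_days,
               int((lat - lat0) // cell),
               int((lon - lon0) // cell))
        cells[key] = max(cells[key], mag)
    return dict(cells)

# hypothetical mini-catalog: a large event with an aftershock, then a later event
catalog = [
    (10, 34.2, -116.4, 7.3),   # Landers-like event
    (12, 34.3, -116.3, 5.1),   # aftershock in the same cell and period
    (400, 34.6, -116.3, 7.1),  # Hector Mine-like event, later period
]
cells = grid_max_magnitudes(catalog, lat0=32.0, lon0=-120.0)
```

Only the cell maximum survives, so the aftershock is absorbed into the Landers-like cell, which is exactly the coarsening the abstract argues is suitable for prediction.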

6.
This article proposes a class of conditionally specified models for the analysis of multivariate space-time processes. Such models are useful in situations where there is sparse spatial coverage of one of the processes and much denser coverage of the other process(es). The dependence structure across processes and over space and time is completely specified through a neighborhood structure. These models are applicable to both point and block sources; for example, multiple pollutant monitors (point sources) or several county-level exposures (block sources). We introduce several computational tricks that are integral for model fitting, give some simple sufficient and necessary conditions for the space-time covariance matrix to be positive definite, and implement a Gibbs sampler, using hybrid Monte Carlo steps, to sample from the posterior distribution of the parameters. Model fit is assessed via the DIC. Predictive accuracy, over both time and space, is assessed both relatively and absolutely via mean squared prediction error and coverage probabilities. As an illustration of these models, we fit them to particulate matter and ozone data collected in the Los Angeles, CA, area over a three-month period in 1995. In these data, the spatial coverage of particulate matter was sparse relative to that of ozone.

7.
In this paper we explicitly solve a non-linear filtering problem with mixed observations, modelled by a Brownian motion and a generalized Cox process, whose jump intensity is given in terms of a Lévy measure. Motivated by empirical observations of R. Cont and P. Tankov, we propose a model for financial assets which captures the phenomenon of time inhomogeneity of the jump size density. We apply the explicit formula to obtain the optimal filter for the corresponding filtering problem.

8.
Considering absolute log returns as a proxy for stochastic volatility, the influence of explanatory variables on absolute log returns of ultra high frequency data is analysed. The irregular time structure and time dependency of the data is captured by utilizing a continuous time ARMA(p,q) process. In particular, we propose a mixed effect model class for the absolute log returns. Explanatory variable information is used to model the fixed effects, whereas the error is decomposed into a non-negative Lévy driven continuous time ARMA(p,q) process and a market microstructure noise component. The parameters are estimated in a state space approach. In a small simulation study the performance of the estimators is investigated. We apply our model to IBM trade data and quantify the influence of bid-ask spread and duration on a daily basis. To verify the correlation in irregularly spaced data we use the variogram, known from spatial statistics. Copyright © 2006 John Wiley & Sons, Ltd.
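The variogram tool mentioned at the end of the abstract works for irregularly spaced observations because it bins pairs by their lag rather than assuming a fixed sampling grid. A minimal sketch (the binning scheme and the white-noise example are illustrative, not the paper's data):

```python
import random

def empirical_variogram(times, values, bins):
    # empirical (semi)variogram for irregularly spaced data:
    # gamma(h) = 0.5 * E[(X(t) - X(s))^2] for |t - s| falling in each lag bin
    sums = [0.0] * len(bins)
    counts = [0] * len(bins)
    for i in range(len(times)):
        for j in range(i + 1, len(times)):
            h = abs(times[j] - times[i])
            for k, (lo, hi) in enumerate(bins):
                if lo <= h < hi:
                    sums[k] += 0.5 * (values[j] - values[i]) ** 2
                    counts[k] += 1
                    break
    return [s / c if c else float("nan") for s, c in zip(sums, counts)]

# irregular sampling times with white-noise values: the variogram should be
# flat at the process variance (here 1.0) across all lag bins
random.seed(3)
times = sorted(random.uniform(0.0, 10.0) for _ in range(300))
values = [random.gauss(0.0, 1.0) for _ in times]
gamma = empirical_variogram(times, values, [(0.0, 1.0), (1.0, 2.0), (2.0, 3.0)])
```

A series with genuine serial dependence would instead show a variogram rising with lag toward the sill, which is how correlation in irregularly spaced data can be verified.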

9.
Burn-in tests help manufacturers detect defective items and remove them before they are sold to customers. In a competitive marketplace, cost is a major consideration, and not employing a burn-in test may result in higher and needless expenses. With this in mind, we consider degradation-based burn-in tests in which the degradation path follows a Wiener process and weak items are identified when the process crosses a piecewise linear function. We also study linear functions as a special case of such a piecewise linear barrier. Within this setup, we apply a cost model to determine the optimal burn-in test. Finally, we discuss an illustrative example using GaAs laser degradation data and present an optimal burn-in test for it.
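The screening mechanism above can be sketched by simulating Wiener degradation paths against a piecewise linear barrier; the drifts, volatility, and barrier segments below are invented for illustration, and no cost optimization is attempted:

```python
import random

random.seed(1)

def crosses_barrier(drift, sigma, t_end, barrier, dt=0.01):
    # simulate one Wiener degradation path X(t) = drift*t + sigma*B(t) on a
    # grid and report whether it crosses the barrier during the burn-in [0, t_end]
    x, t = 0.0, 0.0
    while t < t_end:
        t += dt
        x += drift * dt + sigma * dt ** 0.5 * random.gauss(0.0, 1.0)
        if x >= barrier(t):
            return True
    return False

def barrier(t):
    # hypothetical piecewise linear barrier: steeper early, flatter later
    return 1.0 + 2.0 * t if t < 0.5 else 2.0 + 0.5 * (t - 0.5)

# weak items degrade much faster, so burn-in should flag mostly those
weak = sum(crosses_barrier(5.0, 0.5, 1.0, barrier) for _ in range(200))
strong = sum(crosses_barrier(0.5, 0.5, 1.0, barrier) for _ in range(200))
```

In the paper's setting, the barrier shape and burn-in length would be chosen to minimize an expected cost balancing misclassified weak and strong items.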

10.
Regularly varying stochastic processes are able to model extremal dependence between process values at locations in random fields. We investigate the empirical extremogram as an estimator of dependence in the extremes. We provide conditions to ensure asymptotic normality of the empirical extremogram centred by a pre-asymptotic version. The proof relies on a CLT for exceedance variables. For max-stable processes with Fréchet margins we provide conditions such that the empirical extremogram centred by its true version is asymptotically normal. The results of this paper apply to a variety of spatial and space-time processes, and to time series models. We apply our results to max-moving average processes and Brown-Resnick processes.
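For a time series, the empirical extremogram can be read as the conditional probability of an exceedance at lag h given an exceedance now. A minimal sketch, applied to a max-moving-average toy example where extremes cluster at lag 1 (the quantile level and series length are arbitrary choices):

```python
import random

def empirical_extremogram(x, lags, q=0.95):
    # empirical extremogram at the q-quantile threshold:
    # estimate P(X_{t+h} > u | X_t > u) for each lag h
    u = sorted(x)[int(q * len(x))]
    exceed = [v > u for v in x]
    out = {}
    for h in lags:
        pairs = [(exceed[t], exceed[t + h]) for t in range(len(x) - h)]
        base = sum(a for a, _ in pairs)
        out[h] = sum(a and b for a, b in pairs) / base if base else float("nan")
    return out

# max-moving-average of heavy-tailed noise: x_t = max(z_t, z_{t+1}) shares a
# driver with its neighbour, so extremal dependence is strong at lag 1 only
random.seed(0)
z = [random.paretovariate(1.0) for _ in range(5001)]
x = [max(z[t], z[t + 1]) for t in range(5000)]
rho = empirical_extremogram(x, [1, 5], q=0.95)
```

At lag 1 the estimate sits near the theoretical value 1/2 for this construction, while at lag 5 it drops to roughly the unconditional exceedance rate, which is the signature of short-range extremal dependence.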

11.
Schwarz waveform relaxation algorithms (SWR) are naturally parallel solvers for evolution partial differential equations. They are based on a decomposition of the spatial domain into subdomains, and a partition of the time interval of interest into time windows. On each time window, an iteration, during which subproblems are solved in space-time subdomains, is then used to obtain better and better approximations of the overall solution. The information exchange between subdomains in space-time is performed through classical or optimized transmission conditions (TCs). We analyze in this paper the optimization problem when the time windows are short. We use as our model problem the optimized SWR algorithm with Robin TCs applied to the heat equation. After a general convergence analysis using energy estimates, we prove that in one spatial dimension, the optimized Robin parameter scales like the inverse of the length of the time window, which is fundamentally different from the known scaling on general bounded time windows, which is like the inverse of the square root of the time window length. We illustrate our analysis with a numerical experiment.

12.
In this paper we study the numerical approximation of Turing patterns corresponding to steady state solutions of a PDE system of reaction-diffusion equations modeling an electrodeposition process. We apply the Method of Lines (MOL) and describe the semi-discretization by high order finite differences in space given by the Extended Central Difference Formulas (ECDFs) that approximate Neumann boundary conditions (BCs) with the same accuracy. We introduce a test equation to describe the interplay between the diffusion and the reaction time scales. We present a stability analysis of a selection of time-integrators (IMEX 2-SBDF method, Crank-Nicolson (CN), Alternating Direction Implicit (ADI) method) for the test equation as well as for the Schnakenberg model, a prototype of nonlinear reaction-diffusion systems with Turing patterns. Finally, we apply the ADI-ECDF schemes to solve the electrodeposition model until the stationary patterns (spots & worms and only spots) are reached. We validate the model by comparison with experiments on Cu film growth by electrodeposition.
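A minimal sketch of a method-of-lines semi-discretization of the Schnakenberg system, using plain second-order central differences with reflected ghost points for the Neumann boundary conditions (a simpler stand-in for the paper's ECDFs; parameter values are illustrative). The sanity check evaluates the semi-discrete right-hand side at the homogeneous steady state, where it must vanish:

```python
import numpy as np

def laplacian_neumann(u, dx):
    # second-order central differences with zero-flux (Neumann) boundaries,
    # implemented by reflecting the first interior values as ghost points
    padded = np.concatenate(([u[1]], u, [u[-2]]))
    return (padded[:-2] - 2.0 * u + padded[2:]) / dx**2

def schnakenberg_rhs(u, v, a, b, gamma, d, dx):
    # method-of-lines semi-discretization of the Schnakenberg system
    #   u_t = gamma*(a - u + u^2 v) + u_xx,   v_t = gamma*(b - u^2 v) + d*v_xx
    react = u * u * v
    du = gamma * (a - u + react) + laplacian_neumann(u, dx)
    dv = gamma * (b - react) + d * laplacian_neumann(v, dx)
    return du, dv

# at the homogeneous steady state u* = a + b, v* = b/(a+b)^2 both reaction
# terms vanish and the discrete Laplacian of a constant is exactly zero
a, b, gamma, d = 0.1, 0.9, 100.0, 10.0
n, dx = 64, 1.0 / 63
u = np.full(n, a + b)
v = np.full(n, b / (a + b) ** 2)
du, dv = schnakenberg_rhs(u, v, a, b, gamma, d, dx)
```

Turing patterns arise when this steady state is stable for the reactions alone but destabilized by the unequal diffusivities (d large), which is what the paper's time integrators then evolve to a stationary pattern.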

13.
The wavelet variance is a scale-based decomposition of the process variance for a time series and has been used to analyze, for example, time deviations in atomic clocks, variations in soil properties in agricultural plots, accumulation of snow fields in the polar regions and marine atmospheric boundary layer turbulence. We propose two new unbiased estimators of the wavelet variance when the observed time series is 'gappy,' i.e., is sampled at regular intervals, but certain observations are missing. We deduce the large sample properties of these estimators and discuss methods for determining an approximate confidence interval for the wavelet variance. We apply our proposed methodology to series of gappy observations related to atmospheric pressure data and Nile River minima.
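A minimal sketch of a Haar-type wavelet variance estimator that copes with gaps by simply skipping windows containing missing values (this is a simplification of the paper's estimators, and normalization conventions for the wavelet variance differ across references):

```python
import random

def haar_wavelet_variance(x, scale):
    # Haar-type wavelet variance at a given scale for a 'gappy' series:
    # None marks a missing observation; only windows with no missing values
    # contribute, and averaging over contributing windows keeps the
    # estimator unbiased under gaps
    diffs = []
    for t in range(len(x) - 2 * scale + 1):
        window = x[t:t + 2 * scale]
        if any(v is None for v in window):
            continue
        left = sum(window[:scale]) / scale
        right = sum(window[scale:]) / scale
        diffs.append((right - left) ** 2 / 2.0)
    return sum(diffs) / len(diffs) if diffs else float("nan")

# white noise with unit variance: under this normalization the scale-1
# estimate should be near 1.0, with or without gaps
random.seed(2)
full = [random.gauss(0.0, 1.0) for _ in range(4000)]
gappy = [None if t % 7 == 0 else v for t, v in enumerate(full)]
v_full = haar_wavelet_variance(full, scale=1)
v_gappy = haar_wavelet_variance(gappy, scale=1)
```

The gappy estimate tracks the complete-data one because each retained window is an unbiased contribution; the paper's analysis quantifies the variance inflation this skipping incurs.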

14.
The space-time fractional Poisson process (STFPP), defined by Orsingher and Polito (2012), is a generalization of the time fractional Poisson process (TFPP) and the space fractional Poisson process (SFPP). We study the fractional generalization of the non-homogeneous Poisson process, which we call the non-homogeneous space-time fractional Poisson process (NHSTFPP). We compute its pmf and generating function and investigate the associated differential equation. Limit theorems for the NHSTFPP are established. We also study the distributional properties and the asymptotic expansion of the correlation function of the non-homogeneous time fractional Poisson process (NHTFPP), and subsequently investigate the long-range dependence (LRD) property of a special NHTFPP. We further investigate a limit theorem for the fractional non-homogeneous Poisson process (FNHPP) studied by Leonenko et al. (2014). Finally, we present some simulated sample paths of the NHSTFPP.

15.
Abstract

An essential feature of longitudinal data is the autocorrelation among observations from the same unit or subject. Two-stage random-effects linear models are commonly used to analyze longitudinal data, but they are not flexible enough for exploring the underlying data structures and, especially, for describing time trends. Semi-parametric models have been proposed to accommodate general time trends, but they do not provide a convenient way to explore interactions between time and other covariates, although such interactions exist in many applications. Moreover, semi-parametric models require specifying the design matrix of the covariates (time excluded). We propose nonparametric models to resolve these issues. To fit them, we use multivariate adaptive regression splines (MARS) to estimate the mean curve and then apply an EM-like iterative procedure for covariance estimation. After giving a general model-building algorithm, we show how to design a fast version. We use both simulated and published data to illustrate the proposed method.

16.
Using tools from differential geometry, we construct a model of airplane boarding; the model focuses on the space-time structure of a general blocking relation. It is a mechanistic model from which analytic solutions can be obtained, and the boarding process is described in metric units. The blocking relations among boarding passengers induce a natural partial order on space-time, and the total boarding time coincides with the length of the longest chain of this partial order.
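Under one simplified reading of this model, a passenger is blocked by anyone ahead in the queue whose seat row they must walk past, so blocking chains correspond to non-decreasing row subsequences of the queue, and the boarding time is the longest such chain. That chain length can be computed by patience sorting (this discrete sketch ignores the differential-geometric continuum limit of the abstract):

```python
import bisect

def boarding_time(rows):
    # rows[k] = seat row of the k-th passenger in the queue (front first).
    # In this simplified reading, passenger j is blocked by an earlier
    # passenger i whenever row_j >= row_i, so blocking chains are
    # non-decreasing subsequences; boarding time = longest such chain,
    # found in O(n log n) by patience sorting (bisect_right allows ties).
    tails = []
    for r in rows:
        i = bisect.bisect_right(tails, r)
        if i == len(tails):
            tails.append(r)
        else:
            tails[i] = r
    return len(tails)

# back-to-front boarding: no blocking chain, everyone stows in parallel
# front-to-back boarding: one maximal chain, fully sequential
back_to_front = boarding_time([5, 4, 3, 2, 1])
front_to_back = boarding_time([1, 2, 3, 4, 5])
```

The contrast between the two orders illustrates why the longest chain, not the number of passengers, governs the boarding time in the model.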

17.
We consider the problem of the construction of the goodness-of-fit test in the case of continuous time observations of a diffusion process with small noise. The null hypothesis is parametric and we use a minimum distance estimator of the unknown parameter. We propose an asymptotically distribution-free test for this model.

18.
Song Bo, Jiang Yao-Lin, Wang Xiaolong. Numerical Algorithms (2021) 86(4):1685–1703

The Dirichlet-Neumann and Neumann-Neumann waveform relaxation methods are nonoverlapping spatial domain decomposition methods for solving evolution problems, while the parareal algorithm is parallel in time. Combining these space- and time-parallel strategies, we present and analyze two parareal algorithms for the heat equation that use Dirichlet-Neumann and Neumann-Neumann waveform relaxation, respectively, as new kinds of fine propagators in place of the classical fine propagator. Both proposed algorithms can be viewed as space-time parallel algorithms, increasing the parallelism in both space and time. We derive convergence results for both algorithms for the heat equation in one spatial dimension, and illustrate the theoretical results with numerical experiments.
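The parareal skeleton underlying such algorithms can be sketched on a scalar model problem u' = λu, with a one-step backward Euler coarse propagator G and a many-step backward Euler fine propagator F standing in for the waveform-relaxation fine propagators of the paper (all sizes below are illustrative):

```python
import math

def parareal(lam, u0, t_end, n_windows, n_iter, fine_steps=100):
    # parareal for u' = lam * u: coarse propagator G = one backward Euler
    # step per time window, fine propagator F = fine_steps backward Euler
    # steps per window; the correction iteration combines them
    dt = t_end / n_windows

    def G(u):
        return u / (1.0 - lam * dt)

    def F(u):
        h = dt / fine_steps
        for _ in range(fine_steps):
            u = u / (1.0 - lam * h)
        return u

    # initial serial coarse sweep over the time windows
    U = [u0]
    for _ in range(n_windows):
        U.append(G(U[-1]))

    # parareal iterations: fine solves are independent per window (parallel),
    # the coarse correction sweep is serial
    for _ in range(n_iter):
        Fu = [F(U[n]) for n in range(n_windows)]
        new_U = [u0]
        for n in range(n_windows):
            new_U.append(G(new_U[-1]) + Fu[n] - G(U[n]))
        U = new_U
    return U

U = parareal(lam=-1.0, u0=1.0, t_end=1.0, n_windows=10, n_iter=5)
```

After k iterations the parareal solution agrees with the serial fine solution on the first k windows and converges rapidly on the rest; the paper's contribution is replacing F itself with spatially parallel waveform relaxation solvers.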


19.
Space-Time Point-Process Models for Earthquake Occurrences
Several space-time statistical models are constructed based on both classical empirical studies of clustering and some more speculative hypotheses. We then discuss the discrimination between models incorporating contrasting assumptions concerning the form of the space-time clusters. We also examine further practical extensions of the model to situations where the background seismicity is spatially non-homogeneous and the clusters are non-isotropic. The goodness-of-fit of the models, as measured by AIC values, is discussed for two high-quality data sets from different tectonic regions. AIC also allows the details of the clustering structure in space to be clarified. A simulation algorithm for the models is provided and used to confirm the numerical accuracy of the likelihood calculations. The simulated data sets show spatial distributions similar to the real ones, but differ from them in some features of space-time clustering. These differences may provide useful indicators of directions for further study.
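Models of this epidemic type are built around a conditional intensity: a background rate plus magnitude-scaled, time- and space-decaying contributions from past events. A minimal sketch with power-law time decay and a Gaussian spatial kernel (kernel choices and all parameter values here are illustrative, not fitted):

```python
import math

def conditional_intensity(t, x, y, history, mu=0.1, kappa=0.5,
                          alpha=1.0, c=0.01, p=1.3, d=0.02, m0=3.0):
    # epidemic-type conditional intensity at (t, x, y):
    # homogeneous background mu plus, for each past event, a productivity
    # term growing exponentially in magnitude, an Omori-style power-law
    # time decay, and a Gaussian spatial kernel around the epicenter
    lam = mu
    for (ti, xi, yi, mi) in history:
        if ti >= t:
            continue
        productivity = kappa * math.exp(alpha * (mi - m0))
        time_decay = (p - 1.0) * c ** (p - 1.0) / (t - ti + c) ** p
        r2 = (x - xi) ** 2 + (y - yi) ** 2
        space = math.exp(-r2 / (2.0 * d)) / (2.0 * math.pi * d)
        lam += productivity * time_decay * space
    return lam

# shortly after one large event, the intensity is sharply elevated near the
# epicenter and essentially at background far away
history = [(0.0, 0.0, 0.0, 6.5)]
near = conditional_intensity(0.1, 0.0, 0.0, history)
far = conditional_intensity(0.1, 5.0, 5.0, history)
```

Likelihoods (and hence AIC comparisons of contrasting cluster forms, as in the abstract) are computed from this intensity evaluated along the observed catalog.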

20.
We consider a mathematical model which describes the dynamic process of contact between a piezoelectric body and an electrically conductive foundation. We model the material's behavior with a nonlinear electro-viscoelastic constitutive law; the contact is frictionless and is described with the normal compliance condition and a regularized electrical conductivity condition. We derive a variational formulation for the problem and then, under a smallness assumption on the data, we prove the existence of a unique weak solution to the model. We also investigate the behavior of the solution with respect to the electric data on the contact surface and prove a continuous dependence result. Then, we introduce a fully discrete scheme, based on the finite element method to approximate the spatial variable and the backward Euler scheme to discretize the time derivatives. We treat the contact by using a penalized approach and a version of Newton's method. We implement this scheme in a numerical code and, in order to verify its accuracy, we present numerical simulations in the study of two-dimensional test problems. These simulations provide a numerical validation of our continuous dependence result and also illustrate the effects of the conductivity of the foundation.
