Similar Documents
A total of 20 similar documents were found (search time: 656 ms).
1.
The unstable periodic orbits of a chaotic system provide an important skeleton of its dynamics, but they can be difficult to find from an observed time series. We present a global method for finding periodic orbits based on their symbolic dynamics, which is made possible by several recent methods for finding good partitions for symbolic dynamics from observed time series. The symbolic dynamics are approximated by a Markov chain estimated from the sequence using information-theoretic concepts. The chain has a probabilistic graph representation, and the cycles of the graph may be exhaustively enumerated with a classical deterministic algorithm, providing a global, comprehensive list of symbolic names for its periodic orbits. Once the symbolic codes of the periodic orbits are found, the partition is used to localize the orbits back in the original state space. Using the periodic orbits found, we can estimate several quantities of the attractor, such as the Lyapunov exponent and the topological entropy.
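Not the authors' algorithm, but a minimal sketch of the pipeline the abstract describes, under simplifying assumptions: a scalar series, a one-threshold partition standing in for a properly estimated one, a first-order Markov chain of observed transitions, and networkx for exhaustive cycle enumeration.

```python
import numpy as np
import networkx as nx

def symbolic_cycles(x, threshold, max_len=6):
    """List candidate symbolic names of periodic orbits: symbolize the series
    with a one-threshold partition, build the observed symbol-transition
    graph, and exhaustively enumerate its elementary cycles."""
    s = (np.asarray(x) > threshold).astype(int)      # symbols over {0, 1}
    g = nx.DiGraph()
    for a, b in zip(s[:-1], s[1:]):                  # observed one-step transitions
        g.add_edge(int(a), int(b))
    return [c for c in nx.simple_cycles(g) if len(c) <= max_len]

# Logistic-map data; x = 0.5 is known to give a generating partition here.
r, x = 4.0, [0.4]
for _ in range(2000):
    x.append(r * x[-1] * (1.0 - x[-1]))
print(symbolic_cycles(x, threshold=0.5))
```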

2.
We apply Bayesian inference to analyze three-bond scalar coupling constants in an objective and consistent way. The Karplus curve and a Gaussian error law are used to model scalar coupling measurements. By applying Bayes' theorem, we obtain a probability distribution for all unknowns, i.e., the torsion angles, the Karplus parameters, and the standard deviation of the Gaussian. We infer all these unknowns from scalar coupling data using Markov chain Monte Carlo sampling and analytically derive a probability distribution that only involves the torsion angles.
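A hedged sketch of the kind of inference described: a single torsion angle sampled by random-walk Metropolis given observed 3J couplings, with the Karplus parameters and the error standard deviation held fixed at illustrative values (the paper treats them as unknowns as well).

```python
import numpy as np

def karplus(theta, A=9.5, B=-1.6, C=1.8):
    """Karplus curve 3J(theta) = A cos^2(theta) + B cos(theta) + C, in Hz.
    The coefficients here are illustrative, not the paper's inferred values."""
    return A * np.cos(theta) ** 2 + B * np.cos(theta) + C

def sample_torsion(j_obs, sigma=0.5, n_steps=20000, step=0.3, seed=0):
    """Random-walk Metropolis for a single torsion angle given observed
    couplings, a Gaussian error law, and a flat prior on [-pi, pi)."""
    rng = np.random.default_rng(seed)
    log_post = lambda t: -0.5 * np.sum((j_obs - karplus(t)) ** 2) / sigma ** 2
    theta, lp, samples = 0.0, log_post(0.0), []
    for _ in range(n_steps):
        prop = (theta + step * rng.normal() + np.pi) % (2 * np.pi) - np.pi
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:      # Metropolis accept/reject
            theta, lp = prop, lp_prop
        samples.append(theta)
    return np.array(samples[n_steps // 4:])          # discard burn-in

# The Karplus curve is not one-to-one, so the posterior is typically multimodal.
post = sample_torsion(j_obs=np.array([7.9]))
print(np.percentile(post, [5, 50, 95]))
```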

3.
The mutual relationship among Markov partitions is investigated for one-dimensional piecewise monotonic maps. It is shown that if one Markov partition is regarded as a map-refinement of another (a concept newly introduced in this article), one can uniquely translate the set of symbolic sequences generated by one Markov partition into that generated by the other, and vice versa. However, sets of symbolic sequences constructed using different Markov partitions cannot necessarily be translated into each other if no map-refinement relation exists among them. Using a roof map, we demonstrate how the resulting symbolic sequences depend on the choice of Markov partition.

4.
Symbolic nonlinear time series analysis methods have the potential for analyzing nonlinear data efficiently with low sensitivity to noise. In symbolic nonlinear time series analysis a time series for a fixed delay is partitioned into a small number (called the alphabet size) of cells labeled by symbols, creating a symbolic time series. Symbolic methods involve computing the statistics of words made from the symbolic time series. Specifically, the Shannon entropy of the distribution of possible words for a range of word lengths is computed. The rate of increase of the entropy with word length is the metric (Kolmogorov-Sinai) entropy. Methods of computing the metric entropy for flows as well as for maps are shown. A method of computing the information dimension appropriate to symbolic analysis is proposed. In terms of this formulation, the information dimension is determined by the scaling of entropy as alphabet size is modestly increased, using the information obtained from large word length. We discuss the role of sampling time and the issue of using these methods when there may be no generating partition.
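A minimal sketch of the word-statistics computation: block (word) entropies for increasing word length, whose growth rate estimates the metric entropy. The two-symbol threshold partition and the logistic-map data are illustrative choices, not part of the original method.

```python
import numpy as np
from collections import Counter

def block_entropies(symbols, max_word_len=8):
    """Shannon entropy (bits) of the empirical word distribution for
    word lengths L = 1..max_word_len."""
    H = []
    for L in range(1, max_word_len + 1):
        words = Counter(tuple(symbols[i:i + L]) for i in range(len(symbols) - L + 1))
        p = np.array(list(words.values()), dtype=float)
        p /= p.sum()
        H.append(-np.sum(p * np.log2(p)))
    return np.array(H)

# Two-symbol partition of logistic-map data at x = 0.5.
r, x = 4.0, [0.3]
for _ in range(50000):
    x.append(r * x[-1] * (1.0 - x[-1]))
s = (np.array(x) > 0.5).astype(int)

H = block_entropies(s)
# The growth rate of H(L) with word length estimates the metric (KS) entropy;
# for the r = 4 logistic map the true rate is 1 bit per iterate.
print(np.diff(H))
```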

5.
The heterogeneous graphical Granger model (HGGM) for causal inference among processes with distributions from an exponential family is efficient in scenarios when the number of time observations is much greater than the number of time series, normally by several orders of magnitude. However, in the case of “short” time series, the inference in HGGM often suffers from overestimation. To remedy this, we use the minimum message length (MML) principle to determine the causal connections in the HGGM. The minimum message length, as a Bayesian information-theoretic method for statistical model selection, applies Occam's razor in the following way: even when models are equal in their measure of fit-accuracy to the observed data, the one generating the most concise explanation of the data is more likely to be correct. Based on the dispersion coefficient of the target time series and on the initial maximum likelihood estimates of the regression coefficients, we propose a minimum message length criterion to select the subset of time series causally connected with each target time series and derive its form for various exponential distributions. We propose two algorithms, a genetic-type algorithm (HMMLGA) and exHMML, to find the subset. We demonstrated the superiority of both algorithms in synthetic experiments with respect to the comparison methods Lingam, HGGM and the statistical framework Granger causality (SFGC). In the real-data experiments, we used the methods to discriminate between the pregnancy and labor phases using electrohysterogram data of Icelandic mothers from the PhysioNet database. We further analysed Austrian climatological measurements and their temporal interactions in rainy-day and sunny-day scenarios. In both experiments, the results of HMMLGA had the most realistic interpretation with respect to the comparison methods. We provide our code in Matlab. To the best of our knowledge, this is the first work using the MML principle for causal inference in HGGM.

6.
High levels of so-called community noise may produce hazardous effects on the health of a population exposed to them for long periods of time. Hence, studying the behaviour of such noise measurements is very important. In this work we analyse that behaviour in terms of the probability of exceeding a given threshold level a certain number of times in a time interval of interest. Since the datasets considered contain missing measurements, we use a time series model to estimate the missing values and complete the datasets. Once the data are complete, we use a non-homogeneous Poisson model with multiple change-points to estimate the probability of interest. Estimation of the model parameters is carried out using the usual time series methodology as well as from the Bayesian point of view via Markov chain Monte Carlo algorithms. The models are applied to data obtained from two measuring sites in Messina, Italy.
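A small sketch of how the probability of interest follows from a fitted change-point model: with a piecewise-constant intensity (the change points and rates below are purely illustrative), the probability of at least k exceedances in an interval is a Poisson tail probability of the integrated rate. The paper's Bayesian estimation of those rates via MCMC is not reproduced here.

```python
import numpy as np
from scipy.stats import poisson

def integrated_rate(t0, t1, change_points, rates):
    """Integral of a piecewise-constant intensity over [t0, t1];
    `rates` has one entry per segment, i.e. len(change_points) + 1."""
    edges = np.concatenate(([t0], change_points, [t1]))
    return sum(r * max(0.0, b - a) for r, a, b in zip(rates, edges[:-1], edges[1:]))

def prob_exceedances(k, t0, t1, change_points, rates):
    """P(N(t0, t1) >= k) for a non-homogeneous Poisson process."""
    mu = integrated_rate(t0, t1, change_points, rates)
    return poisson.sf(k - 1, mu)

# Illustrative numbers only: two change points and three exceedance rates per day.
print(prob_exceedances(k=5, t0=0, t1=30, change_points=[10, 20],
                       rates=[0.05, 0.30, 0.10]))
```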

7.
A symbolic analysis of observed time series requires a discrete partition of a continuous state space containing the dynamics. A particular kind of partition, called "generating," preserves all deterministic dynamical information in the symbolic representation, but such partitions are not obvious beyond one dimension. Existing methods to find them require significant knowledge of the dynamical evolution operator. We introduce a statistic and algorithm to refine empirical partitions for symbolic state reconstruction. This method optimizes an essential property of a generating partition, avoiding topological degeneracies, by minimizing the number of "symbolic false nearest neighbors." It requires only the observed time series and is sensible even in the presence of noise when no truly generating partition is possible.
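A simplified stand-in (not the authors' exact statistic) for the symbolic false nearest neighbors idea: under a good partition, points that share a long future symbol word should also be close in state space, so flagging pairs that violate this and scanning candidate thresholds should single out a partition near the generating one. The logistic-map data, word length and distance scale are assumptions for illustration.

```python
import numpy as np

def symbolic_fnn_fraction(x, threshold, future_len=8):
    """Fraction of points whose closest same-word partner (a point sharing the
    same future symbol word) is nevertheless far away in state space."""
    x = np.asarray(x)
    s = (x > threshold).astype(int)
    n = len(x) - future_len
    keys = np.array([int("".join(map(str, s[i:i + future_len])), 2) for i in range(n)])
    scale = 0.1 * (x.max() - x.min())          # "far" = more than 10% of the range
    false = 0
    for i in range(n):
        same = np.where(keys == keys[i])[0]
        same = same[same != i]
        if len(same) == 0:
            false += 1                          # no symbolic neighbour at all
        elif np.min(np.abs(x[same] - x[i])) > scale:
            false += 1
    return false / n

# Scan candidate one-threshold partitions of logistic-map data; the fraction
# should be smallest near the generating partition at x = 0.5.
r, x = 4.0, [0.3]
for _ in range(3000):
    x.append(r * x[-1] * (1.0 - x[-1]))
for thr in (0.3, 0.4, 0.5, 0.6, 0.7):
    print(thr, round(symbolic_fnn_fraction(x, thr), 3))
```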

8.
A Bayesian tutorial for data assimilation
Data assimilation is the process by which observational data are fused with scientific information. The Bayesian paradigm provides a coherent probabilistic approach for combining information, and thus is an appropriate framework for data assimilation. Viewing data assimilation as a problem in Bayesian statistics is not new. However, the field of Bayesian statistics is rapidly evolving and new approaches for model construction and sampling have been utilized recently in a wide variety of disciplines to combine information. This article includes a brief introduction to Bayesian methods. Paying particular attention to data assimilation, we review linkages to optimal interpolation, kriging, Kalman filtering, smoothing, and variational analysis. Discussion is provided concerning Monte Carlo methods for implementing Bayesian analysis, including importance sampling, particle filtering, ensemble Kalman filtering, and Markov chain Monte Carlo sampling. Finally, hierarchical Bayesian modeling is reviewed. We indicate how this approach can be used to incorporate significant physically based prior information into statistical models, thereby accounting for uncertainty. The approach is illustrated in a simplified advection–diffusion model.
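As a concrete anchor for the Bayesian view of data assimilation, here is a minimal linear-Gaussian analysis step (the Kalman filter update), in which the forecast plays the role of the prior and the observation likelihood yields the posterior; dimensions and numbers are illustrative.

```python
import numpy as np

def kalman_analysis(x_f, P_f, y, H, R):
    """One linear-Gaussian analysis step: combine forecast N(x_f, P_f)
    with an observation y = H x + e, e ~ N(0, R), via Bayes' rule."""
    x_f, y = np.atleast_1d(x_f).astype(float), np.atleast_1d(y).astype(float)
    P_f, H, R = np.atleast_2d(P_f), np.atleast_2d(H), np.atleast_2d(R)
    S = H @ P_f @ H.T + R                    # innovation covariance
    K = P_f @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_a = x_f + K @ (y - H @ x_f)            # posterior (analysis) mean
    P_a = (np.eye(len(x_f)) - K @ H) @ P_f   # posterior covariance
    return x_a, P_a

# Two-component state, one noisy observation of the first component.
x_a, P_a = kalman_analysis(x_f=[1.0, 0.0],
                           P_f=[[0.5, 0.1], [0.1, 0.4]],
                           y=[1.4], H=[[1.0, 0.0]], R=[[0.2]])
print(x_a)
print(P_a)
```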

9.
This paper develops a general trans-dimensional Bayesian methodology for geoacoustic inversion. Trans-dimensional inverse problems are a generalization of fixed-dimensional inversion that includes the number and type of model parameters as unknowns in the problem. By extending the inversion state space to multiple subspaces of different dimensions, the posterior probability density quantifies the state of knowledge regarding inversion parameters, including effects due to limited knowledge about appropriate parametrization of the environment and error processes. The inversion is implemented here using a reversible-jump Markov chain Monte Carlo algorithm and the seabed is parametrized using a partition model. Unknown data errors are addressed by including a data-error model. Jumps between dimensions are implemented with a birth-death methodology that allows transitions between dimensions by adding or removing interfaces while maintaining detailed balance in the Markov chain. Trans-dimensional inversion results in an inherently parsimonious solution while partition modeling provides a naturally self-regularizing algorithm based on data information content, not on subjective regularization functions. Together, this results in environmental estimates that quantify appropriate seabed structure as supported by the data, allowing sharp discontinuities while approximating smooth transitions where needed. This approach applies generally to geoacoustic inversion and is illustrated here with seabed reflection-coefficient data.

10.
The application of Bayesian methods in cosmology and astrophysics has flourished over the past decade, spurred by data sets of increasing size and complexity. In many respects, Bayesian methods have proven to be vastly superior to more traditional statistical tools, offering the advantage of higher efficiency and of a consistent conceptual basis for dealing with the problem of induction in the presence of uncertainty. This trend is likely to continue in the future, when the way we collect, manipulate and analyse observations and compare them with theoretical models will assume an even more central role in cosmology.

This review is an introduction to Bayesian methods in cosmology and astrophysics and recent results in the field. I first present Bayesian probability theory and its conceptual underpinnings, Bayes' Theorem and the role of priors. I discuss the problem of parameter inference and its general solution, along with numerical techniques such as Markov chain Monte Carlo methods. I then review the theory and application of Bayesian model comparison, discussing the notions of Bayesian evidence and effective model complexity, and how to compute and interpret those quantities. Recent developments in cosmological parameter extraction and Bayesian cosmological model building are summarised, highlighting the challenges that lie ahead.
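A toy sketch of Bayesian evidence and model comparison in one dimension, where the evidence integral can simply be computed by quadrature; the Gaussian models and data below are illustrative, not the cosmological applications of the review.

```python
import numpy as np
from scipy import integrate, stats

# Toy data with an unknown mean (model M1) versus a fixed zero mean (model M0).
rng = np.random.default_rng(1)
data, sigma = rng.normal(0.3, 1.0, size=20), 1.0

def likelihood(mu):
    return np.prod(stats.norm.pdf(data, loc=mu, scale=sigma))

# Bayesian evidence of M1: likelihood marginalized over a N(0, 1) prior on mu.
Z1, _ = integrate.quad(lambda mu: likelihood(mu) * stats.norm.pdf(mu, 0.0, 1.0),
                       -10, 10)
Z0 = likelihood(0.0)   # M0 has no free parameter, so its evidence is the likelihood

print("Bayes factor B10 =", Z1 / Z0)   # values above 1 favour M1 over M0
```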

11.
12.
In this work, a framework to boost the efficiency of Bayesian inference in probabilistic models is introduced by embedding a Markov chain sampler within a variational posterior approximation. We call this framework “refined variational approximation”. Its strengths are its ease of implementation and the automatic tuning of sampler parameters, leading to a faster mixing time through automatic differentiation. Several strategies to approximate evidence lower bound (ELBO) computation are also introduced. Its efficient performance is showcased experimentally using state-space models for time-series data, a variational encoder for density estimation and a conditional variational autoencoder as a deep Bayes classifier.
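For orientation, here is a plain fixed-form Gaussian variational baseline with a Monte Carlo ELBO estimate, of the kind the proposed framework refines by embedding a Markov chain sampler; the one-dimensional toy target and the use of scipy's optimizer instead of automatic differentiation are simplifications of my own.

```python
import numpy as np
from scipy.optimize import minimize

# Assumed one-dimensional unnormalized log-target (toy posterior).
log_p = lambda x: -0.5 * ((x - 1.5) / 0.3) ** 2

eps = np.random.default_rng(0).standard_normal(2000)   # fixed base samples

def neg_elbo(params):
    """Monte Carlo estimate of -ELBO for a Gaussian q with mean m and std exp(log_s),
    using the reparameterization x = m + s * eps."""
    m, log_s = params
    s = np.exp(log_s)
    x = m + s * eps
    log_q = -0.5 * ((x - m) / s) ** 2 - log_s - 0.5 * np.log(2 * np.pi)
    return -np.mean(log_p(x) - log_q)

res = minimize(neg_elbo, x0=[0.0, 0.0])
print("variational mean and std:", res.x[0], np.exp(res.x[1]))   # close to 1.5 and 0.3
```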

13.
We present a new approach to Bayesian inference that entirely avoids Markov chain simulation, by constructing a map that pushes forward the prior measure to the posterior measure. Existence and uniqueness of a suitable measure-preserving map is established by formulating the problem in the context of optimal transport theory. We discuss various means of explicitly parameterizing the map and computing it efficiently through solution of an optimization problem, exploiting gradient information from the forward model when possible. The resulting algorithm overcomes many of the computational bottlenecks associated with Markov chain Monte Carlo. Advantages of a map-based representation of the posterior include analytical expressions for posterior moments and the ability to generate arbitrary numbers of independent posterior samples without additional likelihood evaluations or forward solves. The optimization approach also provides clear convergence criteria for posterior approximation and facilitates model selection through automatic evaluation of the marginal likelihood. We demonstrate the accuracy and efficiency of the approach on nonlinear inverse problems of varying dimension, involving the inference of parameters appearing in ordinary and partial differential equations.
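A one-dimensional toy sketch of the map idea: a monotone affine map is optimized so that it pushes a standard-normal prior toward an (assumed) Gaussian posterior by minimizing a sample estimate of the KL divergence; the paper's map parameterizations and use of forward-model gradients are far richer.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Assumed unnormalized log-posterior for the 1D toy problem: N(2.0, 0.5^2).
log_post = lambda z: -0.5 * ((z - 2.0) / 0.5) ** 2

x_prior = rng.standard_normal(5000)        # samples from the standard-normal prior

def objective(params):
    """Sample estimate of KL(T_# prior || posterior), up to an additive constant,
    for the monotone map T(x) = a + exp(b) * x."""
    a, b = params
    return np.mean(-b - log_post(a + np.exp(b) * x_prior))

a, b = minimize(objective, x0=[0.0, 0.0]).x
print("T(x) = %.3f + %.3f x" % (a, np.exp(b)))     # close to 2.0 + 0.5 x

# Independent posterior samples are the map applied to fresh prior draws,
# with no further likelihood evaluations needed.
post = a + np.exp(b) * rng.standard_normal(1000)
print(post.mean(), post.std())
```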

14.
We conduct a case study in which we empirically illustrate the performance of different classes of Bayesian inference methods to estimate stochastic volatility models. In particular, we consider how different particle filtering methods affect the variance of the estimated likelihood. We review and compare particle Markov Chain Monte Carlo (MCMC), RMHMC, fixed-form variational Bayes, and integrated nested Laplace approximation to estimate the posterior distribution of the parameters. Additionally, we conduct the review from the point of view of whether these methods are (1) easily adaptable to different model specifications; (2) adaptable to higher dimensions of the model in a straightforward way; (3) feasible in the multivariate case. We show that when using the stochastic volatility model for methods comparison, various data-generating processes have to be considered to make a fair assessment of the methods. Finally, we present a challenging specification of the multivariate stochastic volatility model, which is rarely used to illustrate the methods but constitutes an important practical application.
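A sketch of the quantity at the center of the comparison: a bootstrap particle filter's log-likelihood estimate for a basic univariate stochastic volatility model, whose run-to-run variance is what particle MCMC methods are sensitive to. The model form and parameter values are standard textbook choices, not the paper's specifications.

```python
import numpy as np

def sv_loglik_pf(y, mu, phi, sigma_eta, n_particles=500, seed=None):
    """Bootstrap particle-filter estimate of the log-likelihood of a basic
    stochastic volatility model:
        h_t = mu + phi (h_{t-1} - mu) + sigma_eta eta_t,   y_t ~ N(0, exp(h_t))."""
    rng = np.random.default_rng(seed)
    h = mu + sigma_eta / np.sqrt(1 - phi ** 2) * rng.standard_normal(n_particles)
    loglik = 0.0
    for yt in y:
        h = mu + phi * (h - mu) + sigma_eta * rng.standard_normal(n_particles)
        logw = -0.5 * (np.log(2 * np.pi) + h + yt ** 2 * np.exp(-h))
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())                      # log mean weight
        h = rng.choice(h, size=n_particles, p=w / w.sum())  # multinomial resampling
    return loglik

# Simulate data from the model, then look at the run-to-run spread of the estimator.
rng = np.random.default_rng(3)
mu, phi, sig = -1.0, 0.95, 0.25
h, y = mu, []
for _ in range(200):
    h = mu + phi * (h - mu) + sig * rng.standard_normal()
    y.append(np.exp(h / 2) * rng.standard_normal())
print([round(sv_loglik_pf(y, mu, phi, sig, seed=k), 2) for k in range(3)])
```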

15.
Variational algorithms have gained prominence over the past two decades as a scalable computational environment for Bayesian inference. In this article, we explore tools from the dynamical systems literature to study the convergence of coordinate ascent algorithms for mean field variational inference. Focusing on the Ising model defined on two nodes, we fully characterize the dynamics of the sequential coordinate ascent algorithm and its parallel version. We observe that in the regime where the objective function is convex, both the algorithms are stable and exhibit convergence to the unique fixed point. Our analyses reveal interesting discordances between these two versions of the algorithm in the region when the objective function is non-convex. In fact, the parallel version exhibits a periodic oscillatory behavior which is absent in the sequential version. Drawing intuition from the Markov chain Monte Carlo literature, we empirically show that a parameter expansion of the Ising model, popularly called the Edwards–Sokal coupling, leads to an enlargement of the regime of convergence to the global optima.
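A minimal sketch of the two update schemes for a two-node model p(x1, x2) proportional to exp(J x1 x2 + h1 x1 + h2 x2): with a strong coupling and an anti-aligned start (illustrative values of my own), the sequential scheme converges while the parallel scheme settles into a period-two oscillation.

```python
import numpy as np

def cavi_two_node_ising(J, h1, h2, parallel, n_iter=50, m0=(0.9, -0.9)):
    """Mean-field updates m_i = tanh(h_i + J * m_j) for a two-node Ising model,
    run either sequentially (m2 sees the updated m1) or in parallel."""
    m1, m2 = m0
    traj = []
    for _ in range(n_iter):
        if parallel:
            m1, m2 = np.tanh(h1 + J * m2), np.tanh(h2 + J * m1)  # both use old values
        else:
            m1 = np.tanh(h1 + J * m2)
            m2 = np.tanh(h2 + J * m1)                            # uses the new m1
        traj.append((round(m1, 3), round(m2, 3)))
    return traj[-4:]                                             # last few iterates

# Strong coupling, zero fields, anti-aligned start.
print("sequential:", cavi_two_node_ising(2.0, 0.0, 0.0, parallel=False))
print("parallel:  ", cavi_two_node_ising(2.0, 0.0, 0.0, parallel=True))
```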

16.
17.
Partitions provide simple symbolic representations for complex systems. For a deterministic system, a generating partition establishes a one-to-one correspondence between an orbit and the infinite symbolic sequence generated by the partition. For a stochastic system, however, a generating partition does not exist. In this paper, we propose a method to obtain a partition that best specifies the locations of points of a time series generated from a stochastic system, by using the corresponding symbolic sequence under a constraint on the information rate. When the length of the substrings is limited to a finite length, the method coincides with that for estimating a generating partition from a time series generated from a deterministic system. The two real datasets analyzed in Kennel and Buhl, Phys. Rev. Lett. 91, 084102 (2003), are reanalyzed with the proposed method to understand their underlying dynamics intuitively.

18.
The free energy principle (FEP) has been presented as a unified brain theory, as a general principle for the self-organization of biological systems, and most recently as a principle for a theory of every thing. Additionally, active inference has been proposed as the process theory entailed by FEP that is able to model the full range of biological and cognitive events. In this paper, we challenge these two claims. We argue that FEP is not the general principle it is claimed to be, and that active inference is not the all-encompassing process theory it is purported to be either. The core aspects of our argumentation are that (i) FEP is just a way to generalize Bayesian inference to all domains by the use of a Markov blanket formalism, a generalization we call the Markov blanket trick; and that (ii) active inference presupposes successful perception and action instead of explaining them.

19.
Wavelet methods for image regularization offer a data-driven alternative to Gaussian smoothing in functional magnetic resonance imaging (fMRI) analysis. Their impact has been limited by the difficulties of integrating regularization in the wavelet domain with inference in the image domain, which precludes a probabilistic decision on which areas are activated by a task. Here we present an integrated framework for Bayesian estimation and regularization in wavelet space that allows the usual voxelwise hypothesis testing. This framework is flexible, being an adaptation to fMRI time series of a more general wavelet-based functional mixed-effects model. Through testing on a combination of simulated and real fMRI data, we show evidence of improved signal recovery, without compromising test accuracy in image space.

20.
On the control of chaotic systems via symbolic time series analysis
Symbolic analysis of time series is extended to systems with inputs, in order to obtain input/output symbolic models to be used for control policy design. To this end, the notion of a symbolic word is broadened to possibly include past input values. A model is then derived in the form of a controlled Markov chain, i.e., transition probabilities are conditioned on the control value. The quality of alternative models with different word lengths and alphabet sizes is assessed by means of an indicator based on Shannon entropy. A control problem is formulated with the goal of confining the system output to a smaller domain than that of the uncontrolled case. Solving this problem (by means of a suitable numerical method) yields the relevant control policy, as well as an estimate of the probability distribution of the output of the controlled system. Three examples of application (based on the analysis of time series synthetically generated by the logistic map, the Lorenz system, and an epidemiological model) are presented and used to discuss the features and limitations of the method.
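A minimal sketch of the model-building step: estimating a controlled Markov chain P(s' | s, u) by counting symbol transitions conditioned on the input value. The logistic map with a binary control switching its parameter, the one-threshold partition and word length 1 are illustrative stand-ins, not the paper's choices.

```python
import numpy as np
from collections import defaultdict

def controlled_markov_chain(symbols, controls):
    """Estimate a controlled Markov chain P(s' | s, u) by counting transitions
    of the output symbol conditioned on the current symbol and control value."""
    counts = defaultdict(lambda: defaultdict(int))
    for s, u, s_next in zip(symbols[:-1], controls[:-1], symbols[1:]):
        counts[(int(s), int(u))][int(s_next)] += 1
    return {k: {sn: c / sum(nxt.values()) for sn, c in nxt.items()}
            for k, nxt in counts.items()}

# Logistic map with a binary input switching its parameter (illustrative system).
rng = np.random.default_rng(0)
x, xs, us = 0.4, [], []
for _ in range(5000):
    xs.append(x)
    u = int(rng.integers(2))                 # control applied at this step
    us.append(u)
    x = (3.9 if u == 0 else 3.6) * x * (1.0 - x)
s = (np.array(xs) > 0.5).astype(int)         # one-threshold partition into symbols

for key, dist in sorted(controlled_markov_chain(s, us).items()):
    print(key, dist)
```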
