Similar Literature
20 similar documents retrieved (search time: 310 ms)
1.
In this paper, we propose to combine the approach underlying Bandt-Pompe permutation entropy with Lempel-Ziv complexity, to design what we call the Lempel-Ziv permutation complexity. The principle consists of two steps: (i) transformation of a continuous-state series, either intrinsically multivariate or obtained by embedding, into a sequence of permutation vectors, whose components are the positions of the components of the initial vector after re-arrangement; (ii) computation of the Lempel-Ziv complexity of this series of 'symbols', drawn from a discrete finite-size alphabet. On the one hand, the permutation entropy of Bandt-Pompe studies the entropy of such a sequence, i.e., the entropy of patterns in a sequence (e.g., local increases or decreases). On the other hand, the Lempel-Ziv complexity of a discrete-state sequence studies the temporal organization of the symbols (i.e., the rate of compressibility of the sequence). The Lempel-Ziv permutation complexity thus aims to take advantage of both methods. The potential of such a combined approach (a permutation procedure followed by a complexity analysis) is evaluated on both simulated and real data; in both cases, we compare the individual approaches and the combined one.
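As a rough illustration of the two-step procedure described above, the sketch below symbolizes a scalar series into Bandt-Pompe ordinal patterns and then counts Lempel-Ziv (1976) phrases over the resulting symbol stream. This is not the authors' code; the embedding order m = 3, the test signals, and the particular LZ parsing variant are assumptions of the sketch.

```python
import itertools
import numpy as np

def ordinal_symbols(x, m=3):
    """Map a scalar series to Bandt-Pompe ordinal patterns of order m.

    Each length-m window is replaced by the index of the permutation that
    sorts it, giving a symbol from an alphabet of size m!."""
    patterns = {p: i for i, p in enumerate(itertools.permutations(range(m)))}
    return [patterns[tuple(np.argsort(x[i:i + m]).tolist())]
            for i in range(len(x) - m + 1)]

def _appears_in(prefix, phrase):
    k = len(phrase)
    return any(prefix[j:j + k] == phrase for j in range(len(prefix) - k + 1))

def lz_complexity(s):
    """Lempel-Ziv (1976) complexity: number of phrases in the exhaustive
    parsing, where each new phrase is extended while it still occurs in the
    preceding text."""
    phrases, i, n = 0, 0, len(s)
    while i < n:
        l = 1
        while i + l <= n and _appears_in(s[:i + l - 1], s[i:i + l]):
            l += 1
        phrases += 1
        i += l
    return phrases

rng = np.random.default_rng(0)
noise = rng.standard_normal(2000)
logistic = np.empty(2000); logistic[0] = 0.4
for t in range(1999):
    logistic[t + 1] = 4.0 * logistic[t] * (1.0 - logistic[t])

for name, series in [("white noise", noise), ("logistic map", logistic)]:
    print(name, lz_complexity(ordinal_symbols(series, m=3)))
```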

2.
Gerard Briscoe  Philippe De Wilde 《Physica A》2011,390(21-22):3732-3741
A measure called physical complexity is established and calculated for a population of sequences, based on statistical physics, automata theory, and information theory. It measures the quantity of information in an organism's genome: using Shannon entropy to estimate the randomness in the genome, it is calculated as the difference between the maximal entropy of the population and its actual entropy when evolved in its environment, the latter being estimated by counting the number of fixed loci in the sequences of the population. Up until now, physical complexity has only been formulated for populations of sequences of the same length. Here, we investigate an extension that supports variable-length populations. We then build upon this to construct a measure of the efficiency of information storage, which we later use in understanding clustering within populations. Finally, we investigate our extended physical complexity through simulations, showing it to be consistent with the original.
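The fixed-length version of this computation can be sketched as follows: per-site Shannon entropies over an aligned population, with the physical complexity taken as the maximal entropy minus the summed per-site entropies. The toy population, the 4-letter alphabet, and the exact-zero criterion for a 'fixed' locus are illustrative assumptions, not the paper's choices.

```python
import math
from collections import Counter

def per_site_entropy(population):
    """Shannon entropy (bits) at each locus of an equal-length population."""
    length = len(population[0])
    entropies = []
    for j in range(length):
        counts = Counter(seq[j] for seq in population)
        total = sum(counts.values())
        entropies.append(-sum((c / total) * math.log2(c / total)
                              for c in counts.values()))
    return entropies

def physical_complexity(population, alphabet_size=4):
    """Maximal entropy of a length-L sequence, L * log2(A), minus the actual
    population entropy approximated as the sum of per-site entropies; fully
    conserved ('fixed') loci therefore contribute the most complexity."""
    entropies = per_site_entropy(population)
    h_max = len(entropies) * math.log2(alphabet_size)
    return h_max - sum(entropies)

population = ["ACGTACGT", "ACGAACGT", "ACGTACGA", "ACGCACGT"]
print("fixed loci:", sum(h == 0.0 for h in per_site_entropy(population)))
print("physical complexity (bits):", round(physical_complexity(population), 3))
```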

3.
Using Hartree-Fock non-relativistic wave functions in the position and momentum spaces, the statistical measure of complexity C due to López-Ruiz, Mancini, and Calbet is reported for the neutral atoms, as well as their monopositive and mononegative ions, with atomic number Z = 1-54. In C, given by the product of the exponential power Shannon entropy and the average density, the latter is then replaced by the Fisher measure to obtain the Fisher-Shannon plane. Our numerical results suggest that, overall, the Fisher-Shannon plane reproduces the trends given by C, with significantly enhanced sensitivity in the position, momentum, and product spaces for all neutral atoms and ions considered.
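For intuition about the shape of this measure, the toy sketch below evaluates an LMC-like complexity on a discrete probability distribution, with the sum of squared probabilities standing in for the average density; it is not the Hartree-Fock atomic calculation of the paper.

```python
import numpy as np

def lmc_like_complexity(p):
    """C = exp(H[p]) * <p>, with H the Shannon entropy (nats) and
    <p> = sum(p_i**2) playing the role of the average density."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    nz = p[p > 0]
    h = -np.sum(nz * np.log(nz))
    avg_density = np.sum(p ** 2)
    return np.exp(h) * avg_density

print(lmc_like_complexity(np.ones(8) / 8))               # uniform: C = 1
print(lmc_like_complexity([0.7, 0.1, 0.1, 0.05, 0.05]))  # structured: C ~ 1.41
```

For a uniform distribution the exponential entropy equals N and the average density equals 1/N, so C = 1, which is the usual reference point for this family of measures.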

4.
We study how the Shannon entropy of sequences produced by an information source converges to the source's entropy rate. We synthesize several phenomenological approaches to applying information theoretic measures of randomness and memory to stochastic and deterministic processes by using successive derivatives of the Shannon entropy growth curve. This leads, in turn, to natural measures of apparent memory stored in a source and the amounts of information that must be extracted from observations of a source in order for it to be optimally predicted and for an observer to synchronize to it. To measure the difficulty of synchronization, we define the transient information and prove that, for Markov processes, it is related to the total uncertainty experienced while synchronizing to a process. One consequence of ignoring a process's structural properties is that the missed regularities are converted to apparent randomness. We demonstrate that this problem arises particularly for settings where one has access only to short measurement sequences. Numerically and analytically, we determine the Shannon entropy growth curve, and related quantities, for a range of stochastic and deterministic processes. We conclude by looking at the relationships between a process's entropy convergence behavior and its underlying computational structure.
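A bare-bones numerical sketch of this kind of analysis is given below: block entropies H(L), their discrete derivative h(L) = H(L) - H(L-1) as a sequence of entropy-rate estimates, and a truncated sum of the overestimates as a finite-L stand-in for the excess entropy. The sample process (a binary Markov chain in which a 1 is never followed by another 1) and the block lengths are illustrative assumptions.

```python
import math
import random
from collections import Counter

def block_entropy(seq, L):
    """Shannon entropy (bits) of the empirical distribution of length-L blocks."""
    blocks = [tuple(seq[i:i + L]) for i in range(len(seq) - L + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Sample process: a binary Markov chain in which a 1 is never followed by
# another 1; after a 0 the next symbol is 0 or 1 with equal probability.
random.seed(0)
seq, state = [], 0
for _ in range(200_000):
    state = 0 if state == 1 else random.randint(0, 1)
    seq.append(state)

H = [0.0] + [block_entropy(seq, L) for L in range(1, 9)]
h_est = [H[L] - H[L - 1] for L in range(1, 9)]   # entropy-rate estimates h(L)
h_inf = h_est[-1]                                # crude plateau estimate
E_est = sum(h - h_inf for h in h_est)            # truncated excess-entropy estimate

for L in range(1, 9):
    print(f"L={L}  H(L)={H[L]:.3f}  h(L)={h_est[L - 1]:.3f}")
print(f"excess entropy (truncated sum): {E_est:.3f}")
```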

5.
6.
We show that Information Theory quantifiers are suitable tools for detecting and for quantifying noise-induced temporal correlations in stochastic resonance phenomena. We use the Bandt & Pompe (BP) method [Phys. Rev. Lett. 88, 174102 (2002)] to define a probability distribution, P, that fully characterizes temporal correlations. The BP method is based on a comparison of neighboring values, and here it is applied to the temporal sequence of residence-time intervals generated by the paradigmatic model of a Brownian particle in a sinusoidally modulated bistable potential. The probability distribution P generated via the BP method has an associated normalized Shannon entropy, H[P], and a statistical complexity measure, C[P], defined as proposed by Rosso et al. [Phys. Rev. Lett. 99, 154102 (2007)]. The statistical complexity quantifies not only randomness but also the presence of correlational structures, the two extreme circumstances of maximum knowledge ("perfect order") and maximum ignorance ("complete randomness") being regarded as 'trivial' and consequently having complexity C = 0. We show that both H and C display resonant features as a function of the noise intensity, i.e., for an optimal level of noise the entropy displays a minimum and the complexity a maximum. This resonant behavior indicates noise-enhanced temporal correlations in the sequence of residence-time intervals. The methodology proposed here has great potential for the precise detection of subtle signatures of noise-induced temporal correlations in real-world complex signals.
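A minimal sketch of the quantifiers named above, assuming a Bandt-Pompe ordinal-pattern distribution of order m = 4 and white noise as a stand-in for the residence-time series: the normalized Shannon entropy H[P], and a statistical complexity C[P] built from the Jensen-Shannon divergence between P and the uniform distribution, normalized by its value at a delta distribution.

```python
import itertools
import math
import numpy as np

def bp_distribution(x, m=4):
    """Bandt-Pompe ordinal-pattern probability distribution of order m."""
    patterns = {p: 0 for p in itertools.permutations(range(m))}
    for i in range(len(x) - m + 1):
        patterns[tuple(np.argsort(x[i:i + m]).tolist())] += 1
    counts = np.array(list(patterns.values()), dtype=float)
    return counts / counts.sum()

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mpr_quantifiers(p):
    """Normalized entropy H and statistical complexity C = Q_JS * H, where
    Q_JS is the Jensen-Shannon divergence between p and the uniform
    distribution, normalized by its maximum (attained at a delta distribution)."""
    n = len(p)
    uniform = np.full(n, 1.0 / n)
    h_norm = shannon(p) / math.log(n)
    def js(a, b):
        return shannon((a + b) / 2) - shannon(a) / 2 - shannon(b) / 2
    delta = np.zeros(n); delta[0] = 1.0
    q = js(p, uniform) / js(delta, uniform)
    return h_norm, q * h_norm

rng = np.random.default_rng(1)
noise = rng.standard_normal(50_000)
H, C = mpr_quantifiers(bp_distribution(noise, m=4))
print(f"H = {H:.3f}, C = {C:.3f}")   # uncorrelated noise: H near 1, C near 0
```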

7.
We have proven that there exists a quantum state that universally approximates any multi-copy state when the error is measured by the normalized relative entropy. While the qubit case was proven by Krattenthaler and Slater (IEEE Trans. IT 46, 810–819 (2009)), the general case had remained open for more than ten years. For a deeper analysis, we have solved the mini-max problem concerning the 'approximation error' up to the second order. Furthermore, we have applied this result to quantum lossless data compression and constructed a universal quantum lossless data compression protocol.

8.
Lattice complexity and fine-graining of symbolic sequences
柯大观  张宏  童勤业 《物理学报》2005,54(2):534-542
A new complexity measure for finite-length one-dimensional symbolic sequences, called lattice complexity, is proposed; it is built on Lempel-Ziv complexity and on the symbolic dynamics of one-dimensional iterated maps. A fine-graining method for symbolic sequences is also proposed, which can be combined with either lattice complexity or Lempel-Ziv complexity. The new measure agrees closely with Lempel-Ziv complexity when the fine-graining exponent is small, but shows markedly different behavior as the fine-graining exponent increases. Numerical experiments on the logistic map show that lattice complexity is most sensitive to the edge of the chaotic region. Several other important properties of these complexity measures are also discussed. Keywords: chaos, complexity measure, lattice complexity, fine-graining
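The equal-width partition sketch below is one plausible reading of the fine-graining idea (not necessarily the paper's exact construction): an orbit of the logistic map is symbolized into 2**k bins for several values of a fine-graining exponent k, and an LZ78-style phrase count is used as a simple stand-in for the Lempel-Ziv complexity.

```python
import numpy as np

def lz78_phrases(s):
    """Number of phrases in an LZ78-style incremental parsing of s
    (a simple stand-in for the Lempel-Ziv complexity discussed above)."""
    dictionary, phrase, count = set(), (), 0
    for sym in s:
        phrase = phrase + (sym,)
        if phrase not in dictionary:
            dictionary.add(phrase)
            count += 1
            phrase = ()
    return count

def fine_grained_symbols(x, k):
    """Partition [0, 1] into 2**k equal bins and map each sample to its bin
    index; an equal-width reading of the 'fine-graining exponent' k."""
    return np.minimum((np.asarray(x) * 2 ** k).astype(int), 2 ** k - 1)

# Logistic map orbit in the fully chaotic regime (r = 4).
x = np.empty(20_000); x[0] = 0.4
for t in range(len(x) - 1):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

for k in range(1, 6):
    symbols = fine_grained_symbols(x, k)
    print(f"k={k}  alphabet={2 ** k}  LZ78 phrases={lz78_phrases(symbols.tolist())}")
```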

9.
10.
The upper and lower bounds of the linear variance decay (LVD) dimension density are deduced analytically using multivariate series with uncorrelated and perfectly correlated component series. The normalized LVD dimension density, δ^norm_LVD, is then introduced. In order to measure the complexity of a scalar series with δ^norm_LVD, a pseudo-multivariate series is constructed from the scalar time series using time-delay embedding, and δ^norm_LVD is used to characterize the complexity of this pseudo-multivariate series. The results from model systems and from fMRI data of anxiety subjects reveal that this method can be used to analyze short and noisy time series.
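As a loose illustration only: the sketch below embeds a scalar series, diagonalizes the covariance matrix of the resulting pseudo-multivariate series, and reports the fraction of principal components needed to retain 90% of the variance. This simplified construction, the 90% threshold, and the embedding parameters are assumptions; the actual LVD dimension density is defined through the linear decay of the variance spectrum and is not reproduced exactly here.

```python
import numpy as np

def embed(x, dim, delay=1):
    """Time-delay embedding of a scalar series into a dim-dimensional
    pseudo-multivariate series."""
    n = len(x) - (dim - 1) * delay
    return np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])

def variance_dimension_density(data, threshold=0.90):
    """Fraction of principal components needed to retain `threshold` of the
    total variance: a simplified, normalized stand-in for a dimension
    density based on the decay of the covariance eigenvalue spectrum."""
    centered = data - data.mean(axis=0)
    eigvals = np.linalg.eigvalsh(np.cov(centered, rowvar=False))[::-1]
    cumulative = np.cumsum(eigvals) / eigvals.sum()
    k = int(np.searchsorted(cumulative, threshold)) + 1
    return k / data.shape[1]

rng = np.random.default_rng(2)
white = rng.standard_normal(5000)                      # no temporal structure
sine = np.sin(2 * np.pi * 0.01 * np.arange(5000)) + 0.1 * rng.standard_normal(5000)

for name, series in [("white noise", white), ("noisy sine", sine)]:
    print(name, variance_dimension_density(embed(series, dim=10)))
```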

11.
This paper considers the realizability of quantum gates from the perspective of information complexity. Since a gate is a physical device that must be controlled classically, it is subject to random error. We define the complexity of a gate operation in terms of the difference between the entropies of the variables associated with the initial and final states of the computation. We argue that gate operations are irreversible if there is a difference in the accuracy associated with the input and output variables. It is shown that under some conditions the gate operation may be associated with unbounded entropy, implying impossibility of implementation. PACS number: 03.65

12.
In this paper, we study the 1D Anderson model with long-range correlated on-site energies. This diagonal correlated disorder is constructed so that the random sequence of site energies ε_n has a 1/k^α power spectrum, where k is the wave vector of the modulations on the random-sequence landscape. Using the Runge-Kutta method to solve the time-dependent Schrödinger equation, we compute the participation number and the Shannon entropy of an initially localized wave packet. We observe that strong correlations can induce ballistic transport associated with the emergence of low-energy extended states, in agreement with previous work on this model. We further identify an intermediate regime with super-diffusive spreading of the wave packet.
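The sketch below shows a standard Fourier-superposition recipe for generating on-site energies with a 1/k^α spectrum, plus the participation number used as a localization diagnostic; the zero-mean, unit-variance normalization of the disorder and the test states are assumptions, and the Runge-Kutta integration of the time-dependent Schrödinger equation is omitted.

```python
import numpy as np

def correlated_disorder(n, alpha, rng):
    """On-site energies eps_n with an approximately 1/k**alpha power spectrum,
    built by superposing Fourier modes with power-law amplitudes and random
    phases; the zero-mean, unit-variance normalization is an assumption."""
    k = np.arange(1, n // 2 + 1)
    amplitudes = k ** (-alpha / 2.0)              # sqrt of the target spectrum
    phases = rng.uniform(0.0, 2.0 * np.pi, size=k.size)
    sites = np.arange(n)
    eps = sum(a * np.cos(2.0 * np.pi * kk * sites / n + phi)
              for kk, a, phi in zip(k, amplitudes, phases))
    return (eps - eps.mean()) / eps.std()

def participation_number(psi):
    """P = 1 / sum(|psi_n|**4): ~1 for a localized state, ~N for an extended one."""
    p = np.abs(psi) ** 2
    p = p / p.sum()
    return 1.0 / np.sum(p ** 2)

rng = np.random.default_rng(3)
eps = correlated_disorder(1024, alpha=2.5, rng=rng)   # strongly correlated disorder
print("first five site energies:", np.round(eps[:5], 3))

psi_localized = np.zeros(1024); psi_localized[512] = 1.0
psi_extended = np.ones(1024) / np.sqrt(1024.0)
print("P localized:", participation_number(psi_localized))
print("P extended: ", participation_number(psi_extended))
```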

13.
We discuss algorithms for estimating the Shannon entropy h of finite symbol sequences with long-range correlations. In particular, we consider algorithms which estimate h from the code lengths produced by some compression algorithm. Our interest is in describing their convergence with sequence length, assuming no limits on the space and time complexities of the compression algorithms. A scaling law is proposed for extrapolation from finite sample lengths. This is applied to sequences from dynamical systems in non-trivial chaotic regimes, to a 1-D cellular automaton, and to written English texts.
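In the same spirit (though with off-the-shelf compressors rather than the algorithms analyzed in the paper), a compression-based entropy estimate can be sketched as follows; the test sequences are illustrative, and the estimate is an upper bound that converges slowly with sequence length, which is exactly what a scaling-law extrapolation must correct for.

```python
import bz2
import random
import zlib

def entropy_estimate_bits_per_symbol(symbols, compressor=zlib.compress):
    """Estimate the entropy rate of a symbol sequence (values 0-255) from the
    code length produced by a general-purpose compressor."""
    data = bytes(symbols)
    return 8 * len(compressor(data)) / len(symbols)

random.seed(0)
fair_coin = [random.randint(0, 1) for _ in range(100_000)]
biased_coin = [1 if random.random() < 0.9 else 0 for _ in range(100_000)]

for name, seq in [("fair coin (h = 1 bit)", fair_coin),
                  ("biased coin p=0.9 (h ~ 0.47 bit)", biased_coin)]:
    h_zlib = entropy_estimate_bits_per_symbol(seq, zlib.compress)
    h_bz2 = entropy_estimate_bits_per_symbol(seq, bz2.compress)
    print(f"{name}:  zlib {h_zlib:.3f} bits/symbol,  bz2 {h_bz2:.3f} bits/symbol")
```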

14.
One of the most important issues in quantum information theory concerns the transmission of information through noisy quantum channels. We discuss a few channel characteristics expressed by means of generalized entropies. Such characteristics can often be treated in line with the more usual treatment based on von Neumann entropies. For any channel, we show that the q-average output entropy of degree q ≥ 1 is bounded from above by the q-entropy of the input density matrix. The concavity properties of the (q, s)-entropy exchange are considered, and Fano-type quantum bounds on the (q, s)-entropy exchange are derived. We also give upper bounds on the map (q, s)-entropies in terms of the output entropy corresponding to the completely mixed input.

15.
Recently, the scientific community has witnessed a substantial increase in the generation of protein sequence data, triggering emergent challenges of increasing importance, namely efficient storage and improved data analysis. For both applications, data compression is a straightforward solution. However, in the literature, the number of specialized protein sequence compressors is relatively small, and they improve the compression ratio only marginally over the best general-purpose compressors. In this paper, we present AC2, a new lossless data compressor for protein (amino acid) sequences. AC2 uses a neural network to mix experts with a stacked-generalization approach and individual cache-hash memory models at the highest context orders. Compared to its predecessor (AC), we show gains of 2–9% in reference-free mode and 6–7% in reference-based mode. These gains come at the cost of computations that are about three times slower. AC2 also improves memory usage over AC, with requirements roughly seven times lower and independent of the input sequence size. As an analysis application, we use AC2 to measure the similarity between each SARS-CoV-2 protein sequence and each viral protein sequence in the whole UniProt database. The results consistently show the highest similarity to the pangolin coronavirus, followed by the bat and human coronaviruses, contributing critical results to a currently controversial subject. AC2 is available for free download under the GPLv3 license.
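AC2 itself is a dedicated tool, but the underlying idea of compression-based similarity can be sketched with a general-purpose compressor via the normalized compression distance; the bz2 stand-in, the random toy sequences, and the 2% mutation rate below are assumptions of this sketch, not part of AC2.

```python
import bz2
import random

AMINO = "ACDEFGHIKLMNPQRSTVWY"

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance, a generic compression-based
    similarity (smaller means more similar); AC2 plays this role with far
    better sequence models than bz2."""
    cx = len(bz2.compress(x))
    cy = len(bz2.compress(y))
    cxy = len(bz2.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

random.seed(0)
a = "".join(random.choice(AMINO) for _ in range(5000))
# b: a copy of a with ~2% random substitutions; c: an unrelated random sequence.
b = "".join(random.choice(AMINO) if random.random() < 0.02 else ch for ch in a)
c = "".join(random.choice(AMINO) for _ in range(5000))

print("a vs b:", round(ncd(a.encode(), b.encode()), 3))   # noticeably smaller
print("a vs c:", round(ncd(a.encode(), c.encode()), 3))   # near 1 for unrelated sequences
```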

16.
Spectral-entropy complexity analysis of chaotic pseudorandom sequences
孙克辉  贺少波  何毅  尹林子 《物理学报》2013,62(1):10501-010501
To analyze the structural complexity of chaotic pseudorandom sequences accurately, the spectral-entropy algorithm is applied to sequences generated by the Logistic map, the Gaussian map, and the TD-ERCS system. The spectral-entropy algorithm has few parameters and is robust with respect to the sequence length N (its only parameter) and the pseudorandom radix K. A sliding-window method is used to analyze how the complexity of the chaotic pseudorandom sequences evolves, and the complexity of the discrete chaotic systems is computed for different initial values and different system parameters. The results show that the spectral-entropy algorithm can effectively analyze the structural complexity of chaotic pseudorandom sequences; among the three chaotic systems, the TD-ERCS system is a wide-range, high-complexity chaotic system with the best complexity performance; and the complexity under different windows and different initial values fluctuates only within a small range. These results provide a theoretical and experimental basis for the application of chaotic sequences in information security.
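A minimal sketch of a spectral-entropy measure in the spirit described above: the Shannon entropy of the normalized FFT power spectrum, divided by its maximum. The exact normalization, the sequence length, and the use of a raw logistic-map orbit (rather than a quantized pseudorandom sequence with radix K) are assumptions of the sketch.

```python
import numpy as np

def spectral_entropy(x):
    """Normalized spectral entropy of a finite sequence: Shannon entropy of
    the normalized FFT power spectrum (DC bin dropped), divided by its
    maximum ln(N/2); the normalization choice is an assumption."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    power = np.abs(np.fft.rfft(x))[1:] ** 2
    p = power / power.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(len(power))

# Logistic-map pseudorandom sequence vs. a periodic signal, for contrast.
n = 4096
x = np.empty(n); x[0] = 0.3
for t in range(n - 1):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

print("logistic map:", round(spectral_entropy(x), 3))                              # high
print("pure sine:   ", round(spectral_entropy(np.sin(0.05 * np.arange(n))), 3))    # low
```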

17.
《Physica A》2006,365(1):91-95
The game-theoretical approach to non-extensive entropy measures of statistical physics is based on an abstract measure of complexity from which the entropy measure is derived in a natural way. A wide class of possible complexity measures is considered and a property of factorization, apparently related to escorting, is investigated. It is shown that only those complexity measures which are connected with Tsallis entropy have the factorization property.

18.
We provide a stochastic extension of the Baez–Fritz–Leinster characterization of the Shannon information loss associated with a measure-preserving function. This recovers the conditional entropy and a closely related information-theoretic measure that we call conditional information loss. Although not functorial, these information measures are semi-functorial, a concept we introduce that is definable in any Markov category. We also introduce the notion of an entropic Bayes’ rule for information measures, and we provide a characterization of conditional entropy in terms of this rule.

19.
This paper models the translation of base-2 pseudorandom number generators (PRNGs) to mixed-radix uses such as card shuffling. In particular, we explore a shuffler algorithm that relies on a sequence of uniformly distributed random inputs from a mixed-radix domain to implement a Fisher–Yates shuffle that calls for inputs from a base-2 PRNG. Entropy is lost through this mixed-radix conversion, which is assumed to be a surjective mapping from a relatively large domain of size 2^J to a set of arbitrary size n. Previous research evaluated the Shannon entropy loss of a similar mapping process, but the resulting bound ignored the mixed-radix component of the original formulation, focusing only on a fixed value of n. In this paper, we calculate a more precise formula that takes into account a variable target-domain radix n, and we derive a tighter bound on the Shannon entropy loss of the surjective map, while demonstrating that the entropy loss decreases monotonically as the size J of the source domain 2^J increases. Lastly, this formulation is used to specify the optimal parameters for simulating a card-shuffling algorithm with different test PRNGs, validating a concrete use case with quantifiable deviations from maximal entropy and making it suitable for low-power implementation in a casino.
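A toy version of the entropy-loss question can be sketched for the simplest surjective map, reduction of a uniform J-bit integer modulo n; the paper's mapping and bounds are more general, so the exact formula below (based on the over- and under-represented residues of x mod n) illustrates the effect rather than reproducing the paper's result.

```python
import math

def mod_map_entropy_loss(J, n):
    """Shannon entropy loss (bits) of mapping a uniform J-bit integer onto
    {0, ..., n-1} via x mod n.  With x uniform on [0, 2**J), r residues occur
    q+1 times and the remaining n-r occur q times, where 2**J = q*n + r."""
    q, r = divmod(2 ** J, n)
    p_hi = (q + 1) / 2 ** J      # probability of each over-represented residue
    p_lo = q / 2 ** J            # probability of each remaining residue
    h = 0.0
    if r:
        h -= r * p_hi * math.log2(p_hi)
    if q:
        h -= (n - r) * p_lo * math.log2(p_lo)
    return math.log2(n) - h      # loss relative to a perfectly uniform output

n = 52  # e.g. drawing one card position in a Fisher-Yates shuffle
for J in (6, 8, 10, 12, 16):
    print(f"J={J:2d}  entropy loss = {mod_map_entropy_loss(J, n):.3e} bits")
```

The printed losses shrink rapidly as J grows, which is the monotone behavior the abstract refers to.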

20.
The size-dependent complexity of protein sequences in various families in the FSSP database is characterized by sequence entropy, sequence similarity, and sequence identity. As the average length L_f of the sequences in a family increases, the sequence entropy shows an increasing trend while the sequence similarity and sequence identity show decreasing trends. As L_f increases beyond 250, the sequence entropy, sequence similarity, and sequence identity saturate. This saturated behavior of the complexity is attributed to the saturation of the probability P_g of global (long-range) interactions in protein structures when L_f > 250. It is also found that the alphabet size of residue types describing the sequence diversity depends on the value of L_f and becomes saturated at 12.
