Similar Literature (20 results)
1.
Many methods have been used to study decoherence in quantum dots (QDs). The Tsallis, Shannon and Gaussian entropies have each been used to study decoherence separately; in this paper, we compare the results of the Gaussian, Shannon and Tsallis entropies in a 0-D nanosystem. The linear combination operator and a unitary transformation were used to derive the spectrum of a magnetopolaron strongly interacting with LO phonons in the presence of an electric field, in pseudoharmonic and delta quantum dots. For the pseudoharmonic quantum dot, the numerical results revealed that: (i) the amplitude of the Gauss entropy is greater than that of the Tsallis entropy, which in turn is greater than that of the Shannon entropy, so the Tsallis entropy is less significant in the nanosystem than the Shannon and Gauss entropies; (ii) as the zero point increases, the Gauss entropy dominates the Shannon entropy, which in turn dominates the Tsallis entropy, suggesting that in a nanosystem the Gauss entropy is the most suitable measure of the average information in the system. For the delta quantum dot it was observed that (iii) when the Gauss entropy is used, much of the information about the system is missed; a collapse-revival phenomenon in the Shannon entropy appears in RbCl and GaAs delta quantum dots as the delta parameter is increased, whereas for CsI the system evolves coherently with increasing delta parameter, and with the Shannon and Tsallis entropies information in the system is exchanged faster and coherently; (iv) the Shannon entropy is the most significant measure because its amplitude outweighs the others as the delta dimension length increases, while the Tsallis entropy evolves as a wave packet that oscillates periodically, with an oscillation period that grows as the delta dimension length increases.
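A toy comparison of the three measures on a two-level distribution makes the amplitude ordering concrete. This is a minimal sketch, not the authors' magnetopolaron model: the two-level probabilities, the Tsallis index q, and the linear-entropy form used as a stand-in for the "Gauss entropy" are illustrative assumptions.

```python
import numpy as np

def shannon(p):
    """Shannon entropy  S = -sum p_i ln p_i  (natural log)."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def tsallis(p, q):
    """Tsallis entropy  S_q = (1 - sum p_i**q) / (q - 1)."""
    return (1.0 - np.sum(p**q)) / (q - 1.0)

def gauss_like(p):
    """Gaussian-type (linear) entropy 1 - sum p_i**2 -- an illustrative
    stand-in, not necessarily the exact form used in the cited paper."""
    return 1.0 - np.sum(p**2)

# Two-level (qubit-like) distribution swept over the excited-state weight.
for w in (0.1, 0.3, 0.5):
    p = np.array([w, 1.0 - w])
    print(w, shannon(p), tsallis(p, q=1.5), gauss_like(p))
```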

2.
The extremization of the information-theoretic measures (Fisher information, Shannon entropy, Tsallis entropy), which complementarily describe the spreading of the physical states of natural systems, gives rise to fundamental equations of motion and/or conservation laws. At times, the associated extreme entropy distributions are known for some given constraints, usually moments or radial expectation values. In this work, first we give the existence conditions of the maxent probability distributions in a D-dimensional scenario where two moments (not necessarily of consecutive order) are known. Then we find general relations which involve four elements (the extremized entropy, the other two information-theoretic measures and the variance of the extremum density) in scenarios with different dimensionalities and moment constraints.
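A minimal numerical sketch of the maxent construction under two moment constraints is given below, here in one dimension and on a finite grid; the interval, the moment orders and the target values are illustrative assumptions, and the paper's D-dimensional existence conditions and four-element relations are not reproduced.

```python
import numpy as np
from scipy.optimize import fsolve

# Maximum-entropy density on a grid, constrained by two moments <x**a> and
# <x**b>.  The interval, exponents and target values below are illustrative.
x = np.linspace(1e-6, 10.0, 2000)
dx = x[1] - x[0]
a, b = 1, 2                       # orders of the two constrained moments
targets = np.array([2.0, 6.0])    # assumed values of <x> and <x**2>

def density(lams):
    w = np.exp(-lams[0] * x**a - lams[1] * x**b)
    return w / (w.sum() * dx)

def moment_gap(lams):
    p = density(lams)
    return [(x**a * p).sum() * dx - targets[0],
            (x**b * p).sum() * dx - targets[1]]

lams = fsolve(moment_gap, x0=[0.1, 0.1])
p = density(lams)
print("Lagrange multipliers:", lams)
print("moments check:", (x * p).sum() * dx, (x**2 * p).sum() * dx)
```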

3.
Robert Sneddon. Physica A 2007, 386(1): 101-118
Estimating the information contained in natural data, such as electroencephalography data, is unusually difficult because the relationship between the physical data and the information that it encodes is unknown. This unknown relationship is often called the encoding problem. The present work provides a solution to this problem by deriving a method to estimate the Tsallis entropy in natural data. The method is based on two findings. The first finding is that the physical instantiation of any information event, that is, the physical occurrence of a symbol of information, must begin and end at a discontinuity or critical point (maximum, minimum, or saddle point) in the data. The second finding is that, in certain data types such as the electroencephalogram (EEG), the variance within an EEG waveform event is directly proportional to its probability of occurrence. These two findings yield two results. The first is the easy binning of data into separate information events. The second is the ability to estimate probabilities in two ways: by frequency counting and by computing the variance within an EEG waveform. These results are used to derive a linear estimator of the Tsallis entropy functional, allowing it to be estimated without deducing the encoding. This method for estimating the Tsallis entropy is first used to estimate the information in simple signals, and the amount of information estimated is highly accurate. The method is then applied to two problems in electroencephalography: distinguishing normal aging from very early Alzheimer's disease (mild cognitive impairment), and medication monitoring of Alzheimer's disease treatment. The former is done with an accuracy of 92% and the latter with an accuracy of 91%. This detection accuracy is the highest published in the literature, which suggests that this method of Tsallis entropy estimation is both accurate and useful.
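A simplified sketch of the estimation idea: segment a signal at critical points, bin the resulting events, and plug the frequency counts into the Tsallis functional. The event feature (variance), bin count, q value and random-walk test signal are illustrative assumptions, not the paper's EEG pipeline.

```python
import numpy as np

def tsallis_from_counts(counts, q):
    """Plug-in Tsallis entropy from event counts (frequency counting)."""
    p = counts / counts.sum()
    return (1.0 - np.sum(p**q)) / (q - 1.0)

# Toy signal; real use would be an EEG channel.
rng = np.random.default_rng(0)
sig = np.cumsum(rng.standard_normal(5000))

# Split the signal into "events" at local extrema (sign changes of the
# first difference), a simplified reading of the critical-point rule.
d = np.diff(sig)
crit = np.where(np.diff(np.sign(d)) != 0)[0] + 1
events = np.split(sig, crit)

# Bin events by a simple feature (here: event variance) and count.
feats = np.array([np.var(e) for e in events if len(e) > 1])
counts, _ = np.histogram(feats, bins=20)
counts = counts[counts > 0]
print("estimated Tsallis entropy (q=1.5):", tsallis_from_counts(counts, 1.5))
```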

4.
Noise-aided information transmission via stochastic resonance is shown and analyzed in a binary channel by means of information measures based on the Tsallis entropy. The analysis extends the classic reference of binary information transmission based on the Shannon entropy, and also parallels a recent study based on the Rényi entropy. The conditions for a maximally pronounced stochastic resonance identify optimal Tsallis measures. The study establishes a correspondence between Tsallis and Rényi information measures that is especially relevant to the characterization of stochastic resonance, showing that for such effects the Tsallis and Rényi measures share identical properties.
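The effect can be sketched for a subthreshold binary input passed through a hard threshold with Gaussian noise. The amplitude, threshold, q value and the particular Tsallis-type mutual information used below (the Tsallis divergence between the joint input-output distribution and the product of its marginals) are assumptions for illustration, not necessarily the measures optimized in the paper; the quantity vanishes at zero and very large noise and peaks at an intermediate noise level, the stochastic-resonance signature.

```python
import numpy as np
from scipy.stats import norm

def tsallis_divergence(p, r, q):
    """Tsallis relative entropy D_q(p||r) = (1 - sum p**q r**(1-q)) / (1-q)."""
    m = p > 0
    return (1.0 - np.sum(p[m]**q * r[m]**(1.0 - q))) / (1.0 - q)

A, theta, q = 0.8, 1.0, 1.5                    # illustrative values
px = np.array([0.5, 0.5])                      # equiprobable input bits

def tsallis_info(sigma):
    p1 = np.array([norm.sf(theta, loc=0.0, scale=sigma),   # P(y=1 | x=0)
                   norm.sf(theta, loc=A,   scale=sigma)])   # P(y=1 | x=1)
    joint = np.array([[px[0] * (1 - p1[0]), px[0] * p1[0]],
                      [px[1] * (1 - p1[1]), px[1] * p1[1]]])
    prod = np.outer(px, joint.sum(axis=0))     # product of the marginals
    return tsallis_divergence(joint.ravel(), prod.ravel(), q)

for sigma in (0.05, 0.2, 0.5, 0.8, 1.5, 3.0):
    print(sigma, tsallis_info(sigma))          # peaks at intermediate noise
```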

5.
We investigated the dynamics of particulate matter data, recorded in Tito, a small industrial area of southern Italy. The analysis of these signals was performed using the Fisher information measure (FIM), which is a powerful tool for investigating complex and nonstationary signals, and the Shannon entropy, which is a well-known tool for investigating the degree of disorder in dynamical systems. Our results point to an increase of disorder and complexity from fine to coarse particulates.
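Both quantities can be estimated from a histogram of the measured series; a minimal sketch follows, in which the synthetic lognormal signal, the bin count and the discretized FIM formula are illustrative assumptions rather than the paper's processing chain.

```python
import numpy as np

def fisher_information_measure(p):
    """Discrete FIM estimate: sum of (p[i+1]-p[i])**2 / p[i] over the histogram."""
    mask = p[:-1] > 0
    return np.sum(np.diff(p)[mask]**2 / p[:-1][mask])

def shannon_entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Toy stand-in for a particulate-matter time series.
rng = np.random.default_rng(1)
signal = rng.lognormal(mean=3.0, sigma=0.5, size=10_000)

p, _ = np.histogram(signal, bins=50)
p = p / p.sum()
print("FIM:", fisher_information_measure(p))
print("Shannon entropy:", shannon_entropy(p))
```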

6.
The paper suggests the concepts of an upper entropy and a lower entropy. We propose a new axiomatic definition, namely the upper entropy axioms, inspired by the axioms of metric spaces, and we also formulate lower entropy axioms. We further develop weak upper entropy axioms and weak lower entropy axioms. Their conditions are weaker than those of the Shannon–Khinchin axioms and the Tsallis axioms, while stronger than those of the axiomatics based on the first three Shannon–Khinchin axioms. The subadditivity and strong subadditivity of entropy are obtained in the new axiomatics. Tsallis statistics is a special case satisfying our axioms. Moreover, different forms of information measures, such as the Shannon entropy, Daróczy entropy, Tsallis entropy and other entropies, can be unified under the same axiomatics.

7.
Some natural phenomena deviate from standard statistical behavior, and their study has increased interest in obtaining new definitions of information measures. However, the steps for deriving the best definition of the entropy of a given dynamical system remain unknown. In this paper, we introduce some parametric extended divergences combining the Jeffreys divergence and the Tsallis entropy, defined through generalized logarithmic functions, which lead to new inequalities. In addition, we give lower bounds for one-parameter extended Fermi–Dirac and Bose–Einstein divergences. Finally, we establish some inequalities for the Tsallis entropy, the Tsallis relative entropy and some divergences by use of Young's inequality.
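For reference, one common convention for the building blocks named here (the q-logarithm, the Tsallis entropy, the Tsallis relative entropy and the Jeffreys divergence) is

\[
\ln_q x = \frac{x^{1-q}-1}{1-q}, \qquad
S_q(p) = -\sum_i p_i^{q}\,\ln_q p_i = \frac{1-\sum_i p_i^{q}}{q-1},
\]
\[
D_q(p\,\|\,r) = -\sum_i p_i \ln_q\!\frac{r_i}{p_i} = \frac{1-\sum_i p_i^{q} r_i^{1-q}}{1-q}, \qquad
J(p,r) = D_1(p\,\|\,r) + D_1(r\,\|\,p),
\]

all of which reduce to the Shannon entropy, the Kullback–Leibler divergence and the classical Jeffreys divergence as q → 1; the specific parametric extensions and the extended Fermi–Dirac and Bose–Einstein divergences introduced in the paper are not reproduced here.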

8.
Considerable advances in automatic speech recognition have been made in recent decades, thanks especially to the use of hidden Markov models, and different techniques have been developed in the field of speech signal analysis. However, the performance of speech recognizers deteriorates when they are trained with clean signals and tested with noisy ones; this remains an open problem in the field. Continuous multiresolution entropy has been shown to be robust to additive noise in applications to different physiological signals. In previous works we have included the Shannon and Tsallis entropies, and their corresponding divergences, in different speech analysis and recognition systems. In this paper we extend the continuous multiresolution entropy to different divergences and propose them as new dimensions for the pre-processing stage of a speech recognition system. This approach takes into account information about changes in the dynamics of the speech signal at different scales. The proposed methods are tested with speech signals corrupted with babble and white noise, and their performance is compared with classical mel-cepstral parametrization. The results suggest that these continuous multiresolution entropy-related measures provide valuable information to the speech recognition system and could be included as an extra component in the pre-processing stage.
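A rough sketch of multiresolution entropy features for a speech front end is shown below; the Haar-style pyramid, frame length, bin count and q value are illustrative assumptions and stand in for the continuous multiresolution (wavelet-based) entropies and divergences actually proposed in the paper.

```python
import numpy as np

def haar_pyramid(frame, levels=3):
    """Simple Haar-style multiresolution: detail signals at successive scales
    (a stand-in for the continuous wavelet analysis used in the paper)."""
    details, approx = [], frame.astype(float)
    for _ in range(levels):
        even, odd = approx[0::2], approx[1::2]
        n = min(len(even), len(odd))
        details.append((even[:n] - odd[:n]) / np.sqrt(2))
        approx = (even[:n] + odd[:n]) / np.sqrt(2)
    return details

def entropy_features(frame, q=2.0, bins=16):
    feats = []
    for d in haar_pyramid(frame):
        p, _ = np.histogram(d, bins=bins)
        p = p[p > 0] / p.sum()
        feats.append(-np.sum(p * np.log(p)))            # Shannon per scale
        feats.append((1 - np.sum(p**q)) / (q - 1))      # Tsallis per scale
    return np.array(feats)

# Frame a toy "speech" signal and stack entropy features per frame.
rng = np.random.default_rng(2)
signal = rng.standard_normal(16000)                     # 1 s at 16 kHz (toy)
frames = signal[: len(signal) // 400 * 400].reshape(-1, 400)
features = np.vstack([entropy_features(f) for f in frames])
print(features.shape)   # (n_frames, 2 * levels) extra front-end features
```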

9.
We propose a generalized entropy maximization procedure, which takes into account the generalized averaging procedures and information gain definitions underlying the generalized entropies. This novel generalized procedure is then applied to Rényi and Tsallis entropies. The generalized entropy maximization procedure for Rényi entropies results in the exponential stationary distribution asymptotically for q∈(0,1], in contrast to the stationary distribution of the inverse power law obtained through the ordinary entropy maximization procedure. Another result of the generalized entropy maximization procedure is that one can naturally obtain all the possible stationary distributions associated with the Tsallis entropies by employing either ordinary or q-generalized Fourier transforms in the averaging procedure.

10.
An image thresholding method based on the two-dimensional minimum Tsallis cross entropy (cited 15 times: 0 self-citations, 15 by others)
Exploiting the non-extensivity of the Tsallis entropy, a thresholding method based on the two-dimensional minimum Tsallis cross entropy is proposed. The two-dimensional Tsallis cross entropy is first defined and, with its minimization as the criterion, a particle swarm optimization algorithm is used to search for the optimal two-dimensional threshold vector. The method takes into account not only the spatial neighborhood information between pixels but also the relationship between object and background; its segmentation performance is superior to the Shannon-entropy-based cross-entropy thresholding method and to the one-dimensional minimum Tsallis cross-entropy thresholding method, and it is highly robust to noise. Experimental results show that the method achieves fast and accurate segmentation. Keywords: Tsallis cross entropy; two-dimensional histogram; particle swarm optimization; image segmentation
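For illustration, a one-dimensional, exhaustive-search version of minimum-cross-entropy thresholding with the q-logarithm in place of the natural logarithm is sketched below; the criterion, the value of q and the synthetic bimodal image are assumptions, and the paper's actual method works on the two-dimensional histogram and uses particle swarm optimization rather than exhaustive search.

```python
import numpy as np

def ln_q(x, q):
    """q-deformed logarithm; reduces to ln(x) as q -> 1."""
    return np.log(x) if abs(q - 1.0) < 1e-12 else (x**(1.0 - q) - 1.0) / (1.0 - q)

def tsallis_cross_entropy_threshold(image, q=0.8):
    """1-D minimum-cross-entropy threshold with ln_q in place of ln.
    Simplified stand-in for the paper's 2-D criterion; exhaustive search
    replaces the particle swarm optimizer."""
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    g = np.arange(256) + 1e-9               # avoid log(0) for gray level 0
    best_t, best_cost = 1, np.inf
    for t in range(1, 256):
        w0, w1 = hist[:t], hist[t:]
        if w0.sum() == 0 or w1.sum() == 0:
            continue
        mu0 = (g[:t] * w0).sum() / w0.sum()
        mu1 = (g[t:] * w1).sum() / w1.sum()
        cost = (g[:t] * w0 * ln_q(g[:t] / mu0, q)).sum() + \
               (g[t:] * w1 * ln_q(g[t:] / mu1, q)).sum()
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t

# Toy bimodal "image".
rng = np.random.default_rng(3)
img = np.concatenate([rng.normal(70, 10, 20_000), rng.normal(170, 15, 20_000)])
img = np.clip(img, 0, 255)
print("threshold:", tsallis_cross_entropy_threshold(img))
```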

11.
Tsallis introduced a non-logarithmic generalization of the Shannon entropy, namely the Tsallis entropy, which is non-extensive. Sati and Gupta proposed a cumulative residual information measure based on this non-extensive entropy, namely the cumulative residual Tsallis entropy (CRTE), and its dynamic version, the dynamic cumulative residual Tsallis entropy (DCRTE). In the present paper, we propose non-parametric kernel-type estimators for the CRTE and DCRTE when the observations exhibit a ρ-mixing dependence condition. Asymptotic properties of the estimators are established under suitable regularity conditions. A numerical evaluation of the proposed estimators is presented and a Monte Carlo simulation study is carried out.
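A plug-in (non-kernel) sketch of one form of the CRTE found in the literature is given below for i.i.d. data; the specific integral form, the value of α and the use of the raw empirical survival function are assumptions, and they stand in for the kernel-type estimators and ρ-mixing setting treated in the paper.

```python
import numpy as np

def empirical_crte(x, alpha=1.5):
    """Plug-in estimate of a cumulative residual Tsallis entropy of the form
    eta_alpha = (1/(alpha-1)) * integral( Fbar(t) - Fbar(t)**alpha ) dt,
    using the empirical survival function Fbar.  This particular form and the
    plug-in (non-kernel) estimator are illustrative assumptions."""
    xs = np.sort(np.asarray(x, dtype=float))
    n = len(xs)
    surv = 1.0 - np.arange(1, n) / n          # Fbar between order statistics
    widths = np.diff(xs)
    integrand = surv - surv**alpha
    return integrand @ widths / (alpha - 1.0)

rng = np.random.default_rng(4)
sample = rng.exponential(scale=2.0, size=5000)
print("empirical CRTE (alpha=1.5):", empirical_crte(sample))
```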

12.
13.
Using a q-analog of Boltzmann's combinatorial basis of entropy, the non-asymptotic non-degenerate and degenerate combinatorial forms of the Tsallis entropy function are derived. The new measures – supersets of the Tsallis entropy and the non-asymptotic variant of the Shannon entropy – are functions of the probability and degeneracy of each state, the Tsallis parameter q and the number of entities N. The analysis extends the Tsallis entropy concept to systems of small numbers of entities, with implications for the permissible range of q and the role of degeneracy.

14.
The convexity of the Wigner–Yanase–Dyson information, as first proved by Lieb, is a deep and fundamental result because it leads to the strong subadditivity of quantum entropy. The Wigner–Yanase–Dyson information is a particular kind of quantum Fisher information with important applications in quantum estimation theory. But unlike the quantum entropy, which is the unique natural quantum extension of the classical Shannon entropy, there are many different variants of quantum Fisher information, and it is desirable to investigate their convexity. This article is devoted to studying the convexity of a direct generalization of the Wigner–Yanase–Dyson information. Some sufficient conditions are obtained, and some necessary conditions are illustrated. In a particular case, a surprising necessary and sufficient condition is obtained. Our results reveal the intricacy and subtlety of the convexity issue for general quantum Fisher information.
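For context, the Wigner–Yanase–Dyson skew information referred to here is usually written (in one standard convention) as

\[
I_p(\rho, A) \;=\; -\tfrac{1}{2}\,\operatorname{Tr}\!\big([\rho^{p}, A]\,[\rho^{1-p}, A]\big), \qquad 0 < p < 1,
\]

where ρ is a density matrix, A a fixed self-adjoint observable and [X, Y] = XY - YX; Lieb's theorem states that ρ ↦ I_p(ρ, A) is convex. The direct generalization whose convexity is analyzed in the paper is not reproduced here.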

15.
By means of a probabilistic coupling technique, we establish some tight upper bounds on the variations of the Tsallis entropies in terms of the uniform distance. We treat both classical and quantum cases. The results provide some quantitative characterizations of the uniform continuity and stability properties of the Tsallis entropies. As direct consequences, we obtain the corresponding results for the Shannon entropy and the von Neumann entropy, which are stronger than the conventional ones.

16.
Li Heling, Xiong Ying, Li Yaya. Physica A 2011, 390(15): 2769-2775
We derive the statistical distributions, partition functions and thermodynamic formulas for a completely open system on the basis of the Tsallis entropy. These results are derived for two types of constraints, using the method of maximum entropy.

17.
T. Ochiai, J.C. Nacher. Physica A 2009, 388(23): 4887-4892
In this work, we first formulate the Tsallis entropy in the context of complex networks. We then propose a network construction whose topology maximizes the Tsallis entropy. The growing network model has two main ingredients: a copy process and a random attachment mechanism (C-R model). We show that the resulting degree distribution exactly agrees with the degree distribution required to maximize the Tsallis entropy. We also provide another network model using a combination of preferential and random attachment mechanisms (P-R model) and compare it with the distribution of the Tsallis entropy. In this case, we show that by adequately identifying the exponent factor q, the degree distribution can also be written in the q-exponential form. Taken together, our findings suggest that both mechanisms, the copy process and preferential attachment, play a key role in realizing networks with maximum Tsallis entropy. Finally, we discuss the interpretation of the q parameter of the Tsallis entropy in the context of complex networks.
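A toy growth simulation in the spirit of the copy/random-attachment mechanism can be used to inspect the resulting degree distribution; the attachment rule, the copy probability and the network size below are illustrative assumptions rather than the exact C-R model of the paper.

```python
import numpy as np
from collections import Counter

def grow_cr_network(n_nodes=20_000, p_copy=0.7, seed=5):
    """Toy copy/random-attachment growth: each new node links to a random
    neighbor of a random existing node with prob p_copy (copying), else to a
    uniformly random node.  Parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    neighbors = {0: [1], 1: [0]}               # start from a single edge
    for new in range(2, n_nodes):
        anchor = rng.integers(new)
        if rng.random() < p_copy and neighbors[anchor]:
            target = neighbors[anchor][rng.integers(len(neighbors[anchor]))]
        else:
            target = anchor
        neighbors[new] = [target]
        neighbors[target].append(new)
    return neighbors

net = grow_cr_network()
degrees = np.array([len(v) for v in net.values()])
dist = Counter(degrees)
# A heavy, q-exponential-like tail should show up for large degrees.
for k in sorted(dist)[:10]:
    print(k, dist[k] / len(degrees))
```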

18.
We show that starting with either the non-extensive Tsallis entropy in Wang's formalism or the extensive Rényi entropy, it is possible to construct equilibrium non-Gibbs canonical distribution functions which satisfy the fundamental equations of thermodynamics. Statistical mechanics based on the Tsallis entropy does not satisfy the zeroth law of thermodynamics when dynamical and statistical independence are required, whereas the extensive Rényi statistics fulfills all requirements of equilibrium thermodynamics in the microcanonical ensemble. Transformation formulas between Tsallis statistics in the Wang representation and Rényi statistics are presented. The one-particle distribution function in Rényi statistics for a classical ideal gas at finite particle number has a power-law tail at large momenta.

19.
The Fisher-Shannon information (FS) plane, defined by the Fisher information measure and the Shannon entropy power, is used to investigate the complex dynamics of magnetotelluric data from three stations in Taiwan. In the FS plane the electric and magnetic components are clearly separated, characterized by different degrees of order. Further investigation shows that signals measured in areas with a very high level of seismic activity are well discriminated.
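A minimal sketch of how a single FS-plane point can be estimated from a time series is given below; the histogram-based density estimate, the discretized Fisher information and the synthetic "electric"/"magnetic" stand-in signals are assumptions, not the magnetotelluric processing used in the study.

```python
import numpy as np

def fs_plane_point(signal, bins=100):
    """Estimate (Fisher information measure, Shannon entropy power) of a
    signal from a histogram density -- a simple stand-in for the kernel
    estimators often used in FS-plane studies."""
    p, edges = np.histogram(signal, bins=bins, density=True)
    dx = edges[1] - edges[0]
    nz = p > 0
    # Differential Shannon entropy and entropy power N = exp(2H)/(2*pi*e).
    H = -np.sum(p[nz] * np.log(p[nz])) * dx
    N = np.exp(2.0 * H) / (2.0 * np.pi * np.e)
    # Discretized Fisher information measure: integral of (p')**2 / p.
    dp = np.gradient(p, dx)
    FIM = np.sum(dp[nz]**2 / p[nz]) * dx
    return FIM, N

rng = np.random.default_rng(6)
electric_like = rng.standard_normal(50_000)            # toy "electric" channel
magnetic_like = rng.standard_t(df=3, size=50_000)      # toy "magnetic" channel
print("electric:", fs_plane_point(electric_like))
print("magnetic:", fs_plane_point(magnetic_like))
```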

20.
The scaling properties of various composite information-theoretic measures (Shannon and Rényi entropy sums, Fisher and Onicescu information products, Tsallis entropy ratio, Fisher-Shannon product and shape complexity) are studied in position and momentum spaces for non-relativistic hydrogenic atoms in the presence of parallel magnetic and electric fields. Such measures are found to be invariant at fixed values of the scaling parameters. Numerical results which support the validity of the scaling properties are shown for the representative example of the position-space shape complexity. The physical significance of the resulting scaling behavior is discussed.
