Similar Articles
 Found 20 similar articles (search time: 22 ms)
1.
Sample entropy (SampEn), a measure quantifying regularity and complexity, is believed to be an effective method for analyzing diverse settings that include both deterministic chaotic and stochastic processes, and is particularly useful in the analysis of physiological signals that involve relatively small amounts of data. However, its similarity definition for vectors is based on the Heaviside function, whose hard, discontinuous boundary may cause problems in the validity and accuracy of SampEn. The Sigmoid function is a smoothed, continuous version of the Heaviside function. To overcome these problems, a modified SampEn (mSampEn) based on the nonlinear Sigmoid function was proposed. The performance of mSampEn was tested on independent identically distributed (i.i.d.) uniform random numbers, the MIX stochastic model, the Rössler map, and the Hénon map. The results showed that mSampEn was superior to SampEn in several respects: it remains defined for small parameter values, shows better relative consistency, is more robust to noise, and is less dependent on record length when characterizing time series generated by either deterministic or stochastic systems with different regularities.
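The soft-boundary idea above can be sketched by swapping the Heaviside match rule for a sigmoid weight inside an otherwise standard SampEn computation. This is a minimal illustration, not the paper's exact mSampEn: the function names, the slope parameter, and the default tolerance are assumptions.

```python
import math

def sampen(x, m=2, r=0.2, weight=None):
    """Sample entropy with a pluggable similarity function.

    weight(d, r) returns the 'match' contribution for a template distance d.
    The default is the hard Heaviside rule (1 if d <= r, else 0); passing a
    sigmoid weight gives mSampEn-style soft matching."""
    if weight is None:
        weight = lambda d, r: 1.0 if d <= r else 0.0
    n = len(x)
    b = a = 0.0
    for i in range(n - m):
        for j in range(i + 1, n - m):
            # Chebyshev distance between length-m templates at i and j
            d_m = max(abs(x[i + k] - x[j + k]) for k in range(m))
            b += weight(d_m, r)
            # extend both templates by one point for the length-(m+1) count
            d_m1 = max(d_m, abs(x[i + m] - x[j + m]))
            a += weight(d_m1, r)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def sigmoid_weight(d, r, slope=10.0):
    # smooth, continuous replacement for the Heaviside boundary at d = r
    # (slope value is an assumption; the paper's parameterization may differ)
    return 1.0 / (1.0 + math.exp(slope * (d - r)))
```

With the sigmoid weight, near-misses just outside the tolerance still contribute partial matches, which is what keeps the entropy defined when exact matches are scarce.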

2.
Entropy indicates the irregularity or randomness of a dynamic system. Over the decades, entropy calculated at different scales of the system, through subsampling or coarse graining, has been used as a surrogate measure of system complexity. One popular multi-scale entropy analysis is multi-scale sample entropy (MSE), which calculates entropy through the sample entropy (SampEn) formula at each time scale. SampEn is defined by the “logarithmic likelihood” that a small section of the data (within a window of length m) that “matches” other sections will still “match” them when the window length increases by one. A “match” is defined by a threshold of r times the standard deviation of the entire time series. A problem with the current MSE algorithm is that the SampEn calculations at all scales use the same matching threshold, defined from the original time series, even though the data standard deviation actually changes with the subsampling scale. Using a fixed threshold therefore automatically introduces a systematic bias into the calculation results. The purpose of this paper is to present this systematic bias mathematically and to provide methods for correcting it. Our work will help the large MSE user community avoid introducing this bias into their multi-scale SampEn calculations.
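One way to see (and correct) the described bias is to recompute the tolerance from the standard deviation of each coarse-grained series instead of fixing it once from the original series. The sketch below assumes standard non-overlapping coarse-graining and a plain SampEn; it is one possible correction, not necessarily the paper's exact method, and all names (`coarse_grain`, `mse`, `rescale_r`) are illustrative.

```python
import math
import statistics

def sampen(x, m=2, r=0.2):
    """Plain sample entropy with a hard tolerance r (Chebyshev distance)."""
    n = len(x)
    a = b = 0
    for i in range(n - m):
        for j in range(i + 1, n - m):
            d = max(abs(x[i + k] - x[j + k]) for k in range(m))
            if d <= r:
                b += 1
                if max(d, abs(x[i + m] - x[j + m])) <= r:
                    a += 1
    return -math.log(a / b) if a and b else float("inf")

def coarse_grain(x, scale):
    """Non-overlapping averages of length `scale` (standard MSE coarse-graining)."""
    return [sum(x[i:i + scale]) / scale for i in range(0, len(x) - scale + 1, scale)]

def mse(x, scales, m=2, r_factor=0.15, rescale_r=True):
    """MSE curve. With rescale_r=True the tolerance is recomputed from the SD
    of each coarse-grained series, removing the fixed-threshold bias; with
    rescale_r=False it reproduces the biased fixed-threshold behavior."""
    r_fixed = r_factor * statistics.pstdev(x)
    out = []
    for s in scales:
        y = coarse_grain(x, s)
        r = r_factor * statistics.pstdev(y) if rescale_r else r_fixed
        out.append(sampen(y, m, r))
    return out
```

For white noise the coarse-grained SD shrinks roughly as 1/sqrt(scale), so the fixed threshold becomes too loose at large scales, which is exactly the systematic bias the abstract describes.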

3.
Zhu Sheng-Li, Gan Lu. Acta Physica Sinica, 2016, 65(7): 070502.
Because chaotic time series and stochastic processes share many similar properties, it is difficult in practice to distinguish between them. Detecting and identifying chaotic signals is an important topic in chaotic time-series analysis. Chaotic signals are generated by deterministic chaotic maps or chaotic systems and, compared with Gaussian white noise sequences, exhibit richer structure in an incomplete two-dimensional phase space. By studying the distributions of chaotic time series and Gaussian white noise sequences in the incomplete two-dimensional phase space, and exploiting the nonlinear dynamical properties of chaotic signals, this paper proposes a chaotic-signal detection method based on component permutation in the incomplete two-dimensional phase space. The method first constructs the incomplete two-dimensional phase space from the received sequence, permutes and groups the second component according to the ordering of the first component, and then computes an F-test statistic. The local properties of the chaotic system are then used to extract dynamical structure information from the incomplete two-dimensional phase space, achieving effective detection of chaotic sequences. Numerical simulations of detecting various chaotic signals in Gaussian white noise show that, compared with permutation-entropy detection, the proposed algorithm requires less data, is simpler to compute, has lower time complexity, and is more robust to noise.

4.
Electrocardiography (ECG) and electroencephalography (EEG) signals provide clinical information relevant to determining a patient’s health status. Nonlinear analysis of ECG and EEG signals allows for discovering characteristics that cannot be found with traditional methods based on amplitude and frequency. Approximate entropy (ApEn) and sample entropy (SampEn) are nonlinear data analysis algorithms that measure the data’s regularity, and they are used to classify different electrophysiological signals as normal or pathological. Entropy calculation requires setting the parameters r (tolerance threshold), m (embedding dimension), and τ (time delay), the last of which is related to how the time series is downsampled. In this study, we showed the dependence of ApEn and SampEn on different values of τ, for ECG and EEG signals with different sampling frequencies (Fs), extracted from a digital repository. We considered four values of Fs (128, 256, 384, and 512 Hz for the ECG signals, and 160, 320, 480, and 640 Hz for the EEG signals) and five values of τ (from 1 to 5). We performed parametric and nonparametric statistical tests to confirm that the groups of normal and pathological ECG and EEG signals were significantly different (p < 0.05) for each Fs and τ value. The separation between the entropy values of regular and irregular signals was variable, demonstrating the dependence of ApEn and SampEn on Fs and τ. For ECG signals, the separation between the conditions was more robust when using SampEn, the lowest value of Fs, and τ larger than 1. For EEG signals, the separation between the conditions was more robust when using SampEn with large values of Fs and τ larger than 1. Therefore, adjusting τ may be convenient for signals that were acquired with different Fs to ensure a reliable clinical classification. Furthermore, it is useful to set τ to values larger than 1 to reduce the computational cost.

5.
Image processing has played a relevant role in various industries, where the main challenge is to extract specific features from images. Specifically, texture characterizes the occurrence of a pattern along the spatial distribution, taking into account the intensities of the pixels, for which it has been applied in classification and segmentation tasks. Several feature extraction methods have therefore been proposed in recent decades, but few of them rely on entropy, which is a measure of uncertainty. Moreover, entropy algorithms have been little explored for two-dimensional data. Nevertheless, there is growing interest in developing algorithms to overcome current limits, since Shannon entropy does not consider spatial information, and SampEn2D generates unreliable values for small sizes. We introduce a new algorithm, EspEn (Espinosa Entropy), to measure the irregularity present in two-dimensional data, where the calculation requires setting the parameters as follows: m (side length of the square window), r (tolerance threshold), and ρ (percentage of similarity). Three experiments were performed; the first two were on simulated images contaminated with different noise levels, and the last was on grayscale images from the Normalized Brodatz Texture database (NBT). First, we compared the performance of EspEn against Shannon entropy and SampEn2D. Second, we evaluated the dependence of EspEn on variations of the parameters m, r, and ρ. Third, we evaluated the EspEn algorithm on NBT images. The results revealed that EspEn can discriminate images with different sizes and degrees of noise. Finally, EspEn provides an alternative algorithm for quantifying the irregularity in 2D data; the recommended parameters for better performance are m = 3, r = 20, and ρ = 0.7.

6.
Sample entropy, an approximation of the Kolmogorov entropy, was proposed to characterize the complexity of a time series. It is essentially defined as log(B/A), where B denotes the number of matched template pairs of length m and A denotes the number of matched template pairs of length m+1, for a predetermined positive integer m. It has been widely used to analyze physiological signals. As computing sample entropy is time consuming, the box-assisted, bucket-assisted, x-sort, assisted sliding box, and kd-tree-based algorithms were proposed to accelerate its computation. These algorithms require O(N^2) or O(N^(2-1/(m+1))) computational complexity, where N is the length of the time series analyzed. When N is large, the computational costs of these algorithms are high. We propose a superfast algorithm to estimate sample entropy based on Monte Carlo sampling, with computational cost independent of N (the length of the time series); the estimate converges to the exact sample entropy as the number of repeated experiments becomes large. The convergence rate of the algorithm is also established. Numerical experiments are performed on electrocardiogram time series, electroencephalogram time series, cardiac inter-beat time series, mechanical vibration signals (MVS), meteorological data (MD), and 1/f noise. The results show that the proposed algorithm achieves a 100–1000 times speedup over the kd-tree and assisted sliding box algorithms while providing satisfactory approximation accuracy.
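The Monte Carlo idea can be illustrated as follows: rather than enumerating all O(N^2) template pairs, draw pairs uniformly at random and estimate the two match probabilities whose ratio defines SampEn. This is a sketch of the general principle under stated assumptions (uniform pair sampling, a fixed number of draws), not the paper's exact estimator, and `sampen_mc` is an illustrative name.

```python
import math
import random

def sampen_mc(x, m=2, r=0.2, n_pairs=20000, rng=None):
    """Monte Carlo estimate of sample entropy.

    Samples template-pair indices uniformly and estimates the probabilities
    of a length-m match and of a length-(m+1) match; the cost depends only
    on n_pairs, not on len(x)."""
    rng = rng or random.Random(0)
    n = len(x)
    hits_m = hits_m1 = 0
    for _ in range(n_pairs):
        i = rng.randrange(n - m)
        j = rng.randrange(n - m)
        if i == j:
            continue  # self-matches are excluded, as in exact SampEn
        d = max(abs(x[i + k] - x[j + k]) for k in range(m))
        if d <= r:
            hits_m += 1
            if max(d, abs(x[i + m] - x[j + m])) <= r:
                hits_m1 += 1
    if hits_m == 0 or hits_m1 == 0:
        return float("inf")
    return -math.log(hits_m1 / hits_m)
```

Increasing `n_pairs` tightens the estimate around the exact value, which mirrors the convergence-with-repetitions property the abstract claims.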

7.
8.
Low back pain (LBP) not only reduces quality of life but is also the world’s leading cause of years lived with disability. Alterations in motor response and changes in movement patterns are expected in LBP patients compared to healthy people. Such changes in dynamics may be assessed by nonlinear analysis of kinematical time series recorded from a patient’s motion. Since sample entropy (SampEn) has emerged as a relevant index of the complexity of a given time series, we propose the development of a clinical test based on the SampEn of a time series recorded by a wearable inertial measurement unit during repeated bending and returns (b and r) of the trunk. Twenty-three healthy participants were asked to perform, in random order, 50 repetitions of this movement by touching a stool and another 50 repetitions by touching a box on the floor. The angular amplitude of the b and r movement and the sample entropy of the three components of the angular velocity and acceleration were computed. We showed that the repetitive b and r “touch the stool” test could indeed serve as the basis of a clinical test for the evaluation of low-back-pain patients, with an optimal duration of 70 s, acceptable in daily clinical practice.

9.
Complexity measures are used in a number of applications, including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition, and authorship attribution. Different complexity measures proposed in the literature, such as Shannon entropy, relative entropy, Lempel-Ziv, Kolmogorov and algorithmic complexity, are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure, ETC, defined as the “Effort To Compress” the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence into a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to correlate better with the Lyapunov exponent than Shannon entropy, even for relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by Gzip).
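The iterate-until-constant definition can be sketched compactly. This assumes integer symbol sequences and counts pair frequencies over overlapping occurrences, which is a simplification; `nsrps_step` and `etc` are illustrative names, not the authors' reference implementation.

```python
from collections import Counter

def nsrps_step(seq):
    """One NSRPS pass: replace the most frequent adjacent symbol pair
    with a fresh symbol (non-overlapping, scanning left to right)."""
    pairs = Counter(zip(seq, seq[1:]))
    (a, b), _ = pairs.most_common(1)[0]
    new = max(seq) + 1  # fresh symbol; assumes nonnegative integer symbols
    out, i = [], 0
    while i < len(seq):
        if i + 1 < len(seq) and seq[i] == a and seq[i + 1] == b:
            out.append(new)
            i += 2
        else:
            out.append(seq[i])
            i += 1
    return out

def etc(seq):
    """Effort To Compress: NSRPS iterations until the sequence is constant
    (all symbols equal) or has collapsed to a single symbol."""
    seq = list(seq)
    steps = 0
    while len(seq) > 1 and len(set(seq)) > 1:
        seq = nsrps_step(seq)
        steps += 1
    return steps
```

A strictly periodic sequence collapses in very few substitutions, while an irregular one needs many, which is what makes the iteration count usable as a complexity measure.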

10.
Yang Xiao-Jing, Yang Yang, Li Huai-Zhou, Zhong Ning. Acta Physica Sinica, 2016, 65(21): 218701.
A fuzzy approximate entropy method is proposed for quantifying the complexity of functional magnetic resonance imaging (fMRI) signals and is compared with sample entropy. The study involved 22 adult patients with depression, 11 of them male, aged 18–65 years. Following the Goldberger/Lipsitz model, the complexity of resting-state fMRI signals is expected to be higher in healthier, more robust physiology and to decrease with age. Whole-brain mean fuzzy approximate entropy was significantly associated with age (r = −0.512, p < 0.001), whereas sample entropy was not (r = −0.102, p = 0.482). Fuzzy approximate entropy also differed significantly (p < 0.05) across age-related brain regions (frontal lobe, parietal lobe, limbic system, temporal lobe, and cerebellum), while sample entropy did not. These results are consistent with the Goldberger/Lipsitz model, indicating that fuzzy approximate entropy is an effective new method for analyzing the complexity of fMRI data.

11.
The working environment of wind turbine gearboxes is complex, which complicates effective monitoring of their running state. In this paper, a new gearbox fault diagnosis method is proposed based on improved variational mode decomposition (IVMD), combined with time-shift multi-scale sample entropy (TSMSE) and a sparrow search algorithm-based support vector machine (SSA-SVM). Firstly, a novel algorithm, IVMD, is presented to solve the problem that the VMD parameters (K and α) must be selected in advance. It mainly contains two steps: the maximum kurtosis index is employed to preliminarily determine a series of locally optimal decomposition parameters (K and α); then, from these local parameters, the globally optimal parameters are selected based on the minimum energy loss coefficient (ELC). After decomposition by IVMD, the raw signal is divided into K intrinsic mode functions (IMFs), and the optimal IMF(s) with abundant fault information is (are) chosen based on the minimum envelope entropy criterion. Secondly, the time-shift technique is introduced to information entropy, and the time-shift multi-scale sample entropy algorithm is applied to analyze the complexity of the chosen optimal IMF and extract fault feature vectors. Finally, the sparrow search algorithm, which takes the classification error rate of the SVM as the fitness function, is used to adaptively optimize the SVM parameters. The extracted TSMSEs are then input into the SSA-SVM model as feature vectors to identify the gear signal types under different conditions. The simulation and experimental results confirm that the proposed method is feasible and superior for gearbox fault diagnosis when compared with other methods.

12.
How the complexity or irregularity of heart rate variability (HRV) changes across different sleep stages, and the importance of these features in sleep staging, are not fully understood. This study aimed to investigate the complexity or irregularity of the RR interval time series in different sleep stages and explore their value in sleep staging. We performed approximate entropy (ApEn), sample entropy (SampEn), fuzzy entropy (FuzzyEn), distribution entropy (DistEn), conditional entropy (CE), and permutation entropy (PermEn) analyses on RR interval time series extracted from epochs constructed with two epoch lengths: (1) 270 s and (2) 300 s. To test whether adding the entropy measures can improve the accuracy of sleep staging using linear HRV indices, XGBoost was used to examine the ability to differentiate among: (i) 5 classes [Wake (W), non-rapid-eye-movement (NREM), which can be divided into 3 sub-stages: stage N1, stage N2, and stage N3, and rapid-eye-movement (REM)]; (ii) 4 classes [W, light sleep (combined N1 and N2), deep sleep (N3), and REM]; and (iii) 3 classes [W, NREM, and REM]. SampEn, FuzzyEn, and CE significantly increased from W to N3 and decreased in REM. DistEn increased from W to N1, decreased in N2, and further decreased in N3; it increased in REM. The average accuracies of the three tasks using linear and entropy features were 42.1%, 59.1%, and 60.8%, respectively, based on the 270-s epoch length; all were significantly lower than the performance based on the 300-s epoch length (54.3%, 63.1%, and 67.5%, respectively). Adding entropy measures to the XGBoost model of linear parameters did not significantly improve the classification performance. However, entropy measures, especially PermEn, DistEn, and FuzzyEn, demonstrated greater importance than most of the linear parameters in the XGBoost model.

13.
In order to detect incipient faults of rolling bearings and to effectively identify fault characteristics, an enhanced method based on amplitude-aware permutation entropy (AAPE), named hierarchical amplitude-aware permutation entropy (HAAPE), is proposed in this paper for analyzing the dynamic changes of complex time series. Firstly, hierarchical analysis and AAPE are combined to extract multilevel fault information from both the low-frequency and high-frequency components of the abnormal bearing vibration signal. Secondly, experimental analysis shows that HAAPE is sensitive to early failures of rolling bearings, which makes it suitable for evaluating the performance degradation of a bearing over its run-to-failure life cycle. Finally, a fault feature selection strategy based on HAAPE is put forward to select the bearing fault characteristics after applying the least-common-multiple singular value decomposition (LCM-SVD) method to the fault vibration signal. Moreover, several other entropy-based methods are introduced for a comparative analysis of the experimental data, and the results demonstrate that HAAPE extracts fault features more effectively and with higher accuracy.

14.
Jiang Ke-Yu, Cai Zhi-Ming, Lu Zhen-Bo. Acta Physica Sinica, 2008, 57(3): 1471-1476.
Nonlinearity of a time series is a necessary condition for the series to be chaotic. A nonlinearity statistic δNAR, based on the ratio of the normalized multi-step prediction errors of linear and nonlinear AR models, is proposed, and the surrogate data method is used to detect weak nonlinearity in time series. Taking the Lorenz time series as an example, the influence of the relevant parameters on the weak-nonlinearity detection performance when estimating δNAR is analyzed. In nonlinearity detection experiments on four kinds of chaotic time series, for three of them the statistic δNAR outperformed the nonlinearity statistic based on the AIC model-selection criterion …

15.
Huang Xiao-Lin, Huo Cheng-Yu, Si Jun-Feng, Liu Hong-Xing. Acta Physica Sinica, 2014, 63(10): 100503.
Sample entropy (and approximate entropy) characterizes the complexity of a time series as a rate of information growth and can be applied to short series, so it is widely used in physiological signal analysis. However, the traditional sample entropy uses a tolerance linearly related to the standard deviation, which makes the entropy value vulnerable to nonstationary spike artifacts; it is also affected by the probability distribution of the series, so it does not purely reflect the information growth rate. To address these two problems, symbolic dynamics is combined with sample entropy to give an equiprobable symbolized sample entropy method, whose physical meaning, mathematical derivation, and parameter selection are described in detail. Simulations on noise data verify the correctness of the method and its effectiveness in distinguishing temporal correlations of different strengths. Applied to EEG analysis, the method can distinguish focused-attention from relaxed-attention states using only 1.25 s of EEG data, without any artifact removal. This further demonstrates that the method is robust against nonstationary spike interference and can quickly capture the underlying dynamics of short time series, making it valuable for EEG biofeedback applications.
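The equiprobable symbolization step in this entry can be sketched with rank-based quantile bins, which make each symbol occur (approximately) equally often, so that a subsequent entropy calculation no longer depends on the amplitude distribution. This is an assumed reading of the method; the paper's exact binning and tie handling may differ, and the function name is illustrative.

```python
def equiprobable_symbolize(x, n_symbols=4):
    """Map a series to n_symbols integer symbols with (approximately) equal
    occupancy using rank-based quantile bins: the lowest len(x)/n_symbols
    values get symbol 0, the next block symbol 1, and so on. Ties are broken
    by position via Python's stable sort."""
    n = len(x)
    order = sorted(range(n), key=lambda i: x[i])  # indices sorted by value
    symbols = [0] * n
    for rank, i in enumerate(order):
        symbols[i] = rank * n_symbols // n
    return symbols
```

Because the symbol histogram is flat by construction, any entropy computed on the symbol sequence reflects temporal ordering rather than the shape of the amplitude distribution, and a large spike can shift at most a few rank boundaries rather than the global tolerance.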

16.
A determinism test is proposed based on the well-known method of surrogate data. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (e.g., short-term) determinism in pseudoperiodic time series, for which standard methods of surrogate analysis do not apply. The approach presented is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be used. The determinism test is applied to simulated and experimental pseudoperiodic time series, and the results show the applicability of the proposed test.

17.
In this study, the relationship between cardiovascular signal entropy and the risk of seven-year all-cause mortality was explored in a large sample of community-dwelling older adults from The Irish Longitudinal Study on Ageing (TILDA). The hypothesis under investigation was that physiological dysregulation might be quantifiable by the level of sample entropy (SampEn) in continuously, noninvasively measured resting-state systolic (sBP) and diastolic (dBP) blood pressure (BP) data, and that this SampEn measure might be independently predictive of mortality. Participants’ dates of death up to 2017 were identified from official death registration data and linked to their TILDA baseline survey and health assessment data (2010). BP was continuously monitored during supine rest at baseline, and SampEn values were calculated for one-minute and five-minute sections of these data. In total, 4543 participants were included (mean (SD) age: 61.9 (8.4) years; 54.1% female), of whom 214 died. Cox proportional hazards regression models were used to estimate the hazard ratios (HRs) with 95% confidence intervals (CIs) for the associations between BP SampEn and all-cause mortality. Results revealed that higher SampEn in BP signals was significantly predictive of mortality risk, with an increase of one standard deviation in sBP SampEn and dBP SampEn corresponding to HRs of 1.19 and 1.17, respectively, in models comprehensively controlled for potential confounders. The quantification of SampEn in short BP signals could provide a novel and clinically useful predictor of mortality risk in older adults.

18.
The analysis of symbolic dynamics applied to physiological time series is able to retrieve information about dynamical properties of the underlying system that cannot be gained with standard methods such as spectral analysis. Different approaches for transforming the original time series into a symbolic time series have been proposed, yet the differences between the approaches are unknown. In this study three different transformation methods are investigated: (1) symbolization according to the deviation from the average of the time series, (2) symbolization according to several equidistant levels between the minimum and maximum of the time series, (3) binary symbolization of the first derivative of the time series. Furthermore, permutation entropy was used to quantify the symbolic series. Each method was applied to the cardiac interbeat interval series RRi and its difference series ΔRRi of 17 healthy subjects obtained during head-up tilt testing. The symbolic dynamics of each method is analyzed by means of the occurrence of short sequences (“words”) of length 3. The occurrence of words is grouped according to words without variations of the symbols (0V%), words with one variation (1V%), two like variations (2LV%) and two unlike variations (2UV%). Linear regression analysis showed that for method 1, 0V%, 1V%, 2LV% and 2UV% changed with increasing tilt angle. For method 2, 0V%, 2LV% and 2UV% changed with increasing tilt angle, and method 3 showed changes for 0V% and 1V%. Furthermore, the permutation entropy decreased with increasing tilt angle. In conclusion, all methods are capable of reflecting changes of the cardiac autonomic nervous system during head-up tilt. All methods show that even the analysis of very short symbolic sequences can track changes of cardiac autonomic regulation during head-up tilt testing.
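The word-family grouping used in this entry (0V%, 1V%, 2LV%, 2UV%) can be computed directly from the two consecutive variations of each length-3 word. A minimal sketch assuming numeric symbols; the function names are illustrative.

```python
from collections import Counter

def word_type(w):
    """Classify a 3-symbol word by its consecutive variations:
    0V = no variation, 1V = exactly one variation,
    2LV = two like variations (same sign), 2UV = two unlike variations."""
    d1, d2 = w[1] - w[0], w[2] - w[1]
    if d1 == 0 and d2 == 0:
        return "0V"
    if d1 == 0 or d2 == 0:
        return "1V"
    return "2LV" if d1 * d2 > 0 else "2UV"

def word_distribution(symbols):
    """Percentage of each word family over all overlapping length-3 words."""
    words = [symbols[i:i + 3] for i in range(len(symbols) - 2)]
    c = Counter(word_type(w) for w in words)
    n = len(words)
    return {k: 100.0 * c.get(k, 0) / n for k in ("0V", "1V", "2LV", "2UV")}
```

Applied to a symbolized RRi series, a rising 0V% with tilt angle indicates more monotone, regular patterns, while falling 2UV% indicates fewer alternating patterns.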

19.
Evaluating complex fluctuations in geoelectric time series is an important task not only for earthquake prediction but also for understanding complex processes related to earthquake preparation. Previous studies have reported alterations, such as the emergence of correlated dynamics in geoelectric potentials prior to an important earthquake (EQ). However, the presence of correlations and its relation with variability has not been widely explored. In this work we apply the detrended fluctuation analysis and multiscale entropy methods to analyze the fluctuations of geoelectric time series monitored at two sites located in Mexico. We systematically calculate the correlation exponents and the sample entropy (SE) of the geoelectric time series. Important differences in the scaling exponents and entropy profiles at several time scales are observed. In particular, complex behavior, characterized by high entropy across several scales and a crossover in the correlation exponents, is observed in the vicinity of an EQ that occurred on Sept. 14, 1995. Moreover, we compare the changes in the entropy of the original data with those of their shuffled version to see whether the correlations in the original data are related to the variability.

20.
We present a statistical approach for detecting the Markovian character of dynamical systems by analyzing their flow of information. Especially in the presence of noise, which is mostly the case for real-world time series, calculating the information flow of the underlying system via the concept of symbolic dynamics is rather problematic, since one has to use infinitesimal partitions. We circumvent this difficulty by measuring the information flow indirectly. More precisely, we calculate a measure based on higher-order cumulants which quantifies the statistical dependencies between the past values of the time series and the point r steps ahead. As an extension of Theiler's method of surrogate data (Theiler et al., 1992), this cumulant-based information flow (a function of the look-ahead r) is used as the discriminating statistic in testing the observed dynamics against a hierarchy of null hypotheses corresponding to nonlinear Markov processes of increasing order. This procedure is iterative in the sense that whenever a null hypothesis is rejected, new data sets can be generated corresponding to better approximations of the original process in terms of information flow. Since we use higher-order cumulants for calculating the discriminating statistic, our method is also applicable to small data sets. Numerical results on artificial and real-world examples, including non-chaotic nonlinear processes, autoregressive models and noisy chaos, show the effectiveness of our approach.

