Similar Articles
 Found 20 similar articles (search time: 234 ms)
1.
Weak fault signals, high coupling data, and unknown faults commonly exist in fault diagnosis systems, causing low detection and identification performance in fault diagnosis methods based on T² statistics or cross entropy. This paper proposes a new fault diagnosis method based on optimal-bandwidth kernel density estimation (KDE) and Jensen–Shannon (JS) divergence for improved fault detection performance. KDE addresses weak-signal and coupled-fault detection, while JS divergence addresses unknown-fault detection. Firstly, the formula and algorithm for the optimal bandwidth of multidimensional KDE are presented, and the convergence of the algorithm is proved. Secondly, the JS divergence between data distributions is obtained from the optimal KDE and used for fault detection. Finally, a fault diagnosis experiment is conducted on bearing data from the Case Western Reserve University Bearing Data Center. The results show that, for known faults, the proposed method has a 10% and 2% higher detection rate than T² statistics and the cross-entropy method, respectively. For unknown faults, T² statistics cannot effectively detect faults, and the proposed method has an approximately 15% higher detection rate than the cross-entropy method. Thus, the proposed method effectively improves the fault detection rate.
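The detection idea in this abstract, estimating densities with KDE and comparing them via JS divergence, can be sketched in one dimension as follows. This is a minimal illustration on synthetic data: SciPy's `gaussian_kde` uses Scott's rule-of-thumb bandwidth, not the optimal-bandwidth algorithm the paper derives.

```python
import numpy as np
from scipy.stats import gaussian_kde

def js_divergence(x, y, grid_points=512):
    """Jensen-Shannon divergence (bits, in [0, 1]) between two 1-D samples,
    computed from Gaussian KDEs evaluated on a common grid."""
    grid = np.linspace(min(x.min(), y.min()), max(x.max(), y.max()), grid_points)
    p = gaussian_kde(x)(grid)
    q = gaussian_kde(y)(grid)
    p /= p.sum()                      # normalize to discrete distributions
    q /= q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):                     # KL divergence with 0*log(0) := 0
        mask = a > 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, 1000)   # "healthy" reference signal
fault = rng.normal(1.5, 1.0, 1000)    # shifted distribution mimicking a fault
print(js_divergence(normal, normal[::-1]))  # same sample: essentially 0
print(js_divergence(normal, fault))         # clearly larger: flag a fault
```

A threshold on the resulting divergence then separates healthy from faulty conditions; unlike T² statistics, this requires no assumption that the reference data are Gaussian.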

2.
We consider whether the new horizon-first law works in higher-dimensional f(R) theory. We first obtain general formulas for the entropy and the energy of a general spherically symmetric black hole in D-dimensional f(R) theory. As applications, we compute the entropies and energies of some black holes in several interesting higher-dimensional f(R) theories.

3.
Aims: Bubble entropy (bEn) is an entropy metric with limited dependence on parameters. bEn does not directly quantify the conditional entropy of the series; rather, it assesses the change in the entropy of the ordering of portions of its samples of length m when an extra element is added. The analytical formulation of bEn for autoregressive (AR) processes shows that, for this class of processes, the relation between the first autocorrelation coefficient and bEn differs for odd and even values of m. While this is not an issue per se, it triggered ideas for further investigation. Methods: Using theoretical considerations on the expected values for AR processes, we examined a two-steps-ahead estimator of bEn, which considers the cost of ordering two additional samples. We first compared it with the original bEn estimator on simulated series. Then, we tested it on real heart rate variability (HRV) data. Results: The experiments showed that both examined alternatives have comparable discriminating power. However, for values of 10 < m < 20, where the statistical significance of the method improved as m increased, the two-steps-ahead estimator presented slightly higher statistical significance and more regular behavior, even though the dependence on the parameter m remained minimal. We also investigated a new normalization factor for bEn, which ensures that bEn = 1 when white Gaussian noise (WGN) is given as the input. Conclusions: The research improved our understanding of bubble entropy, particularly in the context of HRV analysis, and clarified interesting details regarding the definition of the estimator.
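The quantity discussed above can be sketched as follows: count the bubble-sort swaps needed to order each embedding vector, take the Rényi-2 entropy of the swap-count distribution, and measure its growth from m to m+1. The normalization by log((m+1)/(m−1)) follows the original bubble-entropy definition by Manis et al. as I understand it; treat this sketch as illustrative rather than the authors' reference implementation.

```python
import numpy as np

def swap_counts(x, m):
    """Bubble-sort swap count for every length-m embedding vector of series x."""
    n = len(x) - m + 1
    counts = np.empty(n, dtype=int)
    for i in range(n):
        v = list(x[i:i + m])
        swaps = 0
        for a in range(m):                    # plain bubble sort, counting swaps
            for b in range(m - a - 1):
                if v[b] > v[b + 1]:
                    v[b], v[b + 1] = v[b + 1], v[b]
                    swaps += 1
        counts[i] = swaps
    return counts

def renyi2(counts):
    """Renyi entropy of order 2 of the empirical swap-count distribution."""
    _, freq = np.unique(counts, return_counts=True)
    p = freq / freq.sum()
    return -np.log(np.sum(p ** 2))

def bubble_entropy(x, m=10):
    """Entropy gain when the embedding grows from m to m+1 samples (normalized)."""
    return (renyi2(swap_counts(x, m + 1)) - renyi2(swap_counts(x, m))) \
        / np.log((m + 1) / (m - 1))

rng = np.random.default_rng(0)
wgn = rng.normal(size=1000)
print(bubble_entropy(wgn, m=10))
```

Note that a monotone series needs zero swaps at every window, so its swap-count entropy is zero at any m; randomness shows up as a spread of swap counts.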

4.
In this paper, we aim to reveal the connection between the predictability and the prediction accuracy of stock closing-price changes at different data frequencies. To find out whether data frequency affects predictability, a new information-theoretic estimator P_lz, derived from the Lempel–Ziv entropy, is proposed here to quantify the predictability of five-minute and daily price changes of the SSE 50 index from the Chinese stock market. Furthermore, the prediction method EEMD-FFH that we proposed previously was applied to evaluate whether financial data with a higher sampling frequency lead to higher prediction accuracy. It turns out that intraday five-minute data are more predictable and also have higher prediction accuracy than daily data, suggesting that the data frequency of stock returns affects both its predictability and its prediction accuracy, with higher-frequency data showing higher predictability and higher prediction accuracy. We also performed linear regression for the two frequency data sets; the results show that predictability and prediction accuracy are positively related.
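An estimator in the spirit of the abstract's P_lz can be sketched with a standard Lempel–Ziv match-length construction (this is a generic sketch of that family of estimators, not the authors' exact P_lz): symbolize the price changes, then estimate the entropy rate from the lengths of the shortest substrings not seen before.

```python
import numpy as np

def lz_entropy_rate(s):
    """Lempel-Ziv style entropy-rate estimate (bits/symbol) for a symbol string:
    h ~= n*log2(n) / sum_i L_i, where L_i is the length of the shortest
    substring starting at position i that never appeared before position i."""
    n = len(s)
    total = 0
    for i in range(n):
        k = 1
        while i + k <= n and s[i:i + k] in s[:i]:
            k += 1
        total += k
    return n * np.log2(n) / total

def symbolize(prices):
    """Map a price series to 'u'/'d'/'f' symbols (up / down / flat changes)."""
    diff = np.diff(np.asarray(prices, dtype=float))
    return ''.join('u' if d > 0 else 'd' if d < 0 else 'f' for d in diff)

rng = np.random.default_rng(0)
random_walk = np.cumsum(rng.choice([-1.0, 1.0], size=400))
trend = np.arange(400.0)                         # perfectly predictable series
print(lz_entropy_rate(symbolize(random_walk)))   # high: near-unpredictable
print(lz_entropy_rate(symbolize(trend)))         # much lower: predictable
```

Lower estimated entropy means higher predictability, which is the comparison the paper makes between five-minute and daily changes.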

5.
The stability of endoreversible heat engines has been extensively studied in the literature. In this paper, an alternative system of dynamic equations is obtained by using restitution forces that bring the system back to the stationary state. The starting point is the assumption that the system has a stationary fixed point, along with a first-order Taylor expansion of the input/output heat fluxes, without further specification of the properties of the working fluid or of the heat device. Specific cases of the Newton and the phenomenological heat transfer laws in a Carnot-like heat engine model were analyzed. It was shown that the evolution of the trajectories toward the stationary state has relevant consequences for the performance of the system. A major role was played by the symmetries/asymmetries of the conductance ratio σ_hc of the heat transfer law associated with the input/output heat exchanges. Accordingly, three main behaviors were observed: (1) for small σ_hc values, the thermodynamic trajectories evolved near the endoreversible limit, improving the efficiency and power output with a decrease in entropy generation; (2) for large σ_hc values, the thermodynamic trajectories evolved either near the Pareto front or near the endoreversible limit, and in both cases they improved the efficiency and power values with a decrease in entropy generation; (3) for the symmetric case (σ_hc = 1), the trajectories evolved either with increasing entropy generation, tending toward the Pareto front, or with decreasing entropy generation, tending toward the endoreversible limit. Moreover, it was shown that the total entropy generation defines a time scale for both the operation cycle time and the characteristic relaxation time.

6.
In this paper, we study the entropy functions on extreme rays of the polymatroidal region which contain a matroid, i.e., matroidal entropy functions. We introduce variable-strength orthogonal arrays indexed by a connected matroid M and a positive integer v, which can be regarded as expanding the classic combinatorial structure of orthogonal arrays. Interestingly, they are equivalent to the partition-representations of the matroid M with degree v and to the (M, v) almost affine codes. Thus, a synergy among four fields, i.e., information theory, matroid theory, combinatorial design, and coding theory, is developed, which may lead to potential applications in information problems such as network coding and secret sharing. Leveraging the construction of variable-strength orthogonal arrays, we characterize all matroidal entropy functions of order n ≤ 5 with the exception of log 10 · U_{2,5} and log v · U_{3,5} for some v.

7.
This paper is our attempt, on the basis of physical theory, to bring more clarity to the question "What is life?" formulated in the well-known book of Schrödinger in 1944. According to Schrödinger, the main distinguishing feature of a biosystem's functioning is the ability to preserve its ordered structure or, in mathematical terms, to prevent an increase in entropy. However, Schrödinger's analysis shows that classical theory is not able to adequately describe order-stability in a biosystem. Schrödinger also appealed to the ambiguous notion of negative entropy. We apply quantum theory. As is well known, the behaviour of the quantum von Neumann entropy differs crucially from the behaviour of classical entropy. We consider a complex biosystem S composed of many subsystems, say proteins, cells, or neural networks in the brain, that is, S = (S_i). We study the following problem: whether the compound system S can maintain "global order" in a situation of increasing local disorder, and whether S can preserve low entropy while the subsystems S_i increase their entropies (perhaps essentially). We show that the entropy of a system as a whole can remain constant while the entropies of its parts are rising. For classical systems, this is impossible, because the entropy of S cannot be less than the entropy of its subsystem S_i; and if a subsystem's entropy increases, then the system's entropy should also increase, by at least the same amount. However, within quantum information theory, the answer is positive. A significant role is played by the entanglement of the subsystems' states. In the absence of entanglement, increasing local disorder implies increasing disorder in the compound system S (as in the classical regime). In this note, we proceed within a quantum-like approach to mathematical modeling of information processing by biosystems: respecting the quantum laws need not be based on genuine quantum physical processes in biosystems.
Recently, such modeling has found numerous applications in molecular biology, genetics, evolution theory, cognition, psychology, and decision making. The quantum-like model of order stability can be applied not only in biology, but also in social science and artificial intelligence.
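The central quantum fact used here, that a compound system can have zero entropy while each entangled part has maximal entropy, can be checked directly for a Bell pair. The sketch below computes the von Neumann entropy of the pair and of one subsystem obtained by partial trace.

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho), in bits."""
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]              # drop numerically zero eigenvalues
    return float(-np.sum(vals * np.log2(vals)))

# Bell state (|00> + |11>)/sqrt(2): the whole is pure, the parts are maximally mixed
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi)                                     # 4x4 density matrix
rho_a = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)  # partial trace over B
print(von_neumann_entropy(rho))    # whole system: ~0 bits
print(von_neumann_entropy(rho_a))  # one subsystem: ~1 bit
```

Classically the whole can never be less disordered than a part; the entangled state above violates exactly that intuition, which is the mechanism the paper invokes.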

8.
We explore the quadratic form of the f(R) = R + bR² gravitational theory to derive rotating N-dimensional black hole solutions with rotation parameters a_i, i ≥ 1. Here, R is the Ricci scalar and b is the dimensional parameter. We assumed that the N-dimensional spacetime is static and has flat horizons with a zero-curvature boundary. We investigated the physics of these black holes by calculating relations between physical quantities such as the horizon radius and mass. We also demonstrate that, in the four-dimensional case, the higher-order curvature does not contribute to the black hole, i.e., the black hole does not depend on the dimensional parameter b, whereas, in the case of N > 4, it does depend on b, owing to the contribution of the R² correction term. We analyze the conserved quantities, energy, and angular momentum of the black hole solutions by applying the relocalization method. Additionally, we calculate thermodynamic quantities such as temperature and entropy, examine the stability of the black hole solutions locally, and show that they are thermodynamically stable. Moreover, the entropy calculation constrains the parameter b to b < 1/(16Λ) for the entropy to be positive.

9.
Quantifying the urbanization level is an essential yet challenging task in urban studies because of the high complexity of this phenomenon. The urbanization degree has been estimated using a variety of social, economic, and spatial measures. Among the spatial characteristics, the Shannon entropy of the landscape pattern has recently been intensively explored as one of the most effective urbanization indexes. Here, we introduce a new measure of the spatial entropy of land that characterizes its parcel mosaic, the structure resulting from the division of land into cadastral parcels. We calculate the entropies of the distribution function of parcel areas in different portions of the urban systems. We have established that the Shannon entropy and the Rényi entropies R_0 and R_{1/2} are most effective at differentiating the degree of spatial organization of the land. Our studies are based on 30 urban systems located in the USA, Australia, and Poland, and three desert areas from Australia. In all the cities, the entropies behave the same way as functions of the distance from the center. They attain the lowest values in the city core and reach substantially higher values in suburban areas. Thus, the parcel-mosaic entropies provide a spatial characterization of land that measures its urbanization level effectively.
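The Shannon and Rényi entropies used to profile parcel-area distributions can be computed as below. This is a sketch on synthetic area weights; the study's actual binning and normalization of cadastral data are not reproduced here.

```python
import numpy as np

def renyi_entropy(weights, q):
    """Renyi entropy of order q of a discrete distribution (natural log).
    q = 1 gives the Shannon entropy; q = 0 gives log of the support size."""
    p = np.asarray(weights, dtype=float)
    p = p[p > 0]
    p = p / p.sum()
    if q == 1.0:
        return float(-np.sum(p * np.log(p)))      # Shannon limit
    return float(np.log(np.sum(p ** q)) / (1.0 - q))

uniform = np.ones(100)                    # equal parcel areas: maximal entropy
skewed = np.exp(np.linspace(0, 8, 100))   # a few large parcels dominate
for q in (0.0, 0.5, 1.0):                 # R_0, R_1/2, and Shannon
    print(q, renyi_entropy(uniform, q), renyi_entropy(skewed, q))
```

A dense, regular city core (many similar small parcels) behaves like the uniform case, while a heterogeneous suburban mosaic spreads over very unequal areas, which is what lets these entropies track urbanization along the distance from the center.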

10.
We develop Categorical Exploratory Data Analysis (CEDA) with mimicking to explore and exhibit the complexity of the information content contained within any data matrix: categorical, discrete, or continuous. Such complexity is shown through visible and explainable serial multiscale structural dependency with heterogeneity. CEDA is built upon all features' categorical nature via histograms and is guided by all features' associative patterns (order-2 dependence) in a mutual conditional entropy matrix. Higher-order structural dependency of k (≥ 3) features is exhibited through block patterns within heatmaps constructed by permuting contingency-kD-lattices of counts. By growing k, the resultant heatmap series contains the global and large-scale structural dependency that constitutes the data matrix's information content. When continuous features are involved, principal component analysis (PCA) extracts fine-scale information content from each block in the final heatmap. Our mimicking protocol coherently simulates this heatmap series by preserving global-to-fine-scale structural dependency. At every step of the mimicking process, each accepted simulated heatmap is subject to constraints with respect to all of the reliably observed categorical patterns. For reliability and robustness in the sciences, CEDA with mimicking enhances data visualization by revealing deterministic and stochastic structures within each scale-specific structural dependency. For inferences in Machine Learning (ML) and Statistics, it clarifies at which scales which covariate feature-groups have major vs. minor predictive power on response features. For the social justice of Artificial Intelligence (AI) products, it checks whether a data matrix incompletely prescribes the targeted system.

11.
The decomposition effect of variational mode decomposition (VMD) mainly depends on the choice of the decomposition number K and the penalty factor α. For the selection of these two parameters, empirical methods and single-objective optimization methods are usually used, but they often have limitations and cannot achieve optimal results. Therefore, a multi-objective multi-island genetic algorithm (MIGA) is proposed to optimize the parameters of VMD and is applied to feature extraction for bearing faults. First, since the envelope entropy (Ee) reflects the sparsity of the signal and the Rényi entropy (Re) reflects the energy-aggregation degree of the signal's time-frequency distribution, Ee and Re are selected as fitness functions, and the optimal VMD parameters are obtained by the MIGA algorithm. Second, the improved VMD algorithm is used to decompose the bearing fault signal, and the two intrinsic mode functions (IMFs) with the most fault information are selected by improved kurtosis and the Hölder coefficient for reconstruction. Finally, the envelope spectrum of the reconstructed signal is analyzed. Comparative experiments show that this feature extraction method extracts bearing fault features more accurately, and the fault diagnosis model based on it has higher accuracy.
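The envelope entropy Ee used as a fitness function above can be sketched as the Shannon entropy of the normalized Hilbert envelope. This is an illustrative implementation; the paper's exact normalization may differ. The key property is that sparse, impulse-like signals, typical of bearing faults, give a lower Ee than signals with a flat envelope.

```python
import numpy as np
from scipy.signal import hilbert

def envelope_entropy(x):
    """Envelope entropy Ee: Shannon entropy of the normalized Hilbert envelope.
    Lower values indicate a sparser (more impulsive) signal."""
    env = np.abs(hilbert(np.asarray(x, dtype=float)))
    p = env / env.sum()
    return float(-np.sum(p * np.log(p + 1e-12)))

t = np.linspace(0, 1, 2048, endpoint=False)
steady = np.sin(2 * np.pi * 50 * t)                 # constant envelope: high Ee
burst = np.exp(-((t - 0.5) ** 2) / 0.001) \
    * np.sin(2 * np.pi * 50 * t)                    # localized burst: low Ee
print(envelope_entropy(steady), envelope_entropy(burst))
```

In the optimization loop, the genetic algorithm would evaluate Ee (and Re) on the IMFs produced by each candidate (K, α) pair and keep the Pareto-best settings.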

12.
Proteins are essential molecules that must correctly perform their roles for the good health of living organisms. The majority of proteins operate in complexes, and the way they interact has a pivotal influence on the proper functioning of such organisms. In this study we address the problem of protein–protein interaction, and we propose and investigate a method based on an ensemble of autoencoders. Our approach, entitled AutoPPI, adopts a strategy based on two autoencoders, one for each type of interaction (positive and negative), and we advance three types of neural network architectures for the autoencoders. Experiments were performed on several data sets comprising proteins from four different species. The results indicate good performance of our proposed model, with accuracy and AUC values of over 0.97 in all cases. The best-performing model relies on a Siamese architecture in both the encoder and the decoder, which advantageously captures common features in protein pairs. Comparisons with other machine learning techniques applied to the same problem show that AutoPPI outperforms most of its contenders on the considered data sets.

13.
In this work, we start from a phenomenological Hamiltonian built from two known systems: the Hamiltonian of a pumped optomechanical system and the Jaynes–Cummings Hamiltonian. Using algebraic techniques, we construct an approximate time evolution operator Û(t) for the forced optomechanical system (as a product of exponentials) and take the Jaynes–Cummings Hamiltonian as an interaction. We transform the latter with Û(t) to obtain a generalized interaction-picture Hamiltonian, which can be linearized and whose time evolution operator is written in product form. The analytic results are compared with purely numerical calculations using the full Hamiltonian, and the agreement between them is remarkable.

14.
The effects of using a partly curved porous layer on the thermal management and entropy generation features are studied in a ventilated cavity filled with hybrid nanofluid under an inclined magnetic field, using the finite volume method. The study covers the following ranges of the pertinent parameters: Reynolds number (100 ≤ Re ≤ 1000), magnetic field strength (0 ≤ Ha ≤ 80), permeability of the porous region (10⁻⁴ ≤ Da ≤ 5×10⁻²), porous layer height (0.15H ≤ tp ≤ 0.45H), porous layer position (0.25H ≤ yp ≤ 0.45H), and curvature size (0 ≤ b ≤ 0.3H). The magnetic field reduces the vortex size, while the average Nusselt number of the hot walls increases for Ha above 20, with the highest enhancement of 47% for the left vertical wall. The variation in the average Nu with the permeability of the layer is about 12.5% and 21% for the left and right vertical walls, respectively, while these amounts are 12.5% and 32.5% when the location of the porous layer changes. The entropy generation increases with Hartmann number above 20, with a 22% increase in entropy generation at the highest magnetic field. The porous layer height reduces the entropy generation in the domain above it, which gives the highest contribution to the overall entropy generation. When the location of the curved porous layer is varied, the highest variation of entropy generation is attained in the domain below it, while the lowest value is obtained at yp = 0.3H. When the size of the elliptic curvature is varied, the overall entropy generation decreases from b = 0 to b = 0.2H by about 10% and then increases by 5% from b = 0.2H to b = 0.3H.

15.
In the nervous system, information is conveyed by sequences of action potentials, called spike trains. As MacKay and McCulloch suggested, spike trains can be represented as bit sequences coming from Information Sources (IS). Previously, we studied relations between spikes' Information Transmission Rates (ITR) and their correlations and frequencies. Here, I concentrate on the problem of how spike fluctuations affect the ITR. The IS are typically modeled as stationary stochastic processes, which I consider here as two-state Markov processes. As a spike-train fluctuation measure, I take the standard deviation σ, which measures the average fluctuation of spikes around the average spike frequency. I found that the character of the relation between ITR and signal fluctuations strongly depends on the parameter s, defined as the sum of the transition probabilities from the no-spike state to the spike state. The Information Transmission Rate was estimated by expressions depending on the signal fluctuations and the parameter s. It turned out that for s < 1, the quotient ITR/σ has a maximum and can tend to zero depending on the transition probabilities, while for s > 1, ITR/σ is separated from 0. Additionally, it was shown that the quotient of ITR by the variance behaves in a completely different way. Similar behavior was observed when the classical Shannon entropy terms in the Markov entropy formula were replaced by their polynomial approximations. My results suggest that in a noisier environment (s > 1), to achieve appropriate reliability and efficiency of transmission, IS with a higher tendency to transition from the no-spike to the spike state should be applied. Such selection of appropriate parameters plays an important role in designing learning mechanisms to obtain networks with higher performance.
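For a two-state (no-spike/spike) Markov source, the ingredients of the quotient ITR/σ studied above can be computed in closed form: the entropy rate of the chain (the ITR under ideal coding) and the standard deviation of the stationary spike indicator. This toy parameterization by the two transition probabilities p01 and p10 is a sketch and does not reproduce the paper's exact parameter s.

```python
import numpy as np

def binary_entropy(p):
    """Shannon entropy (bits) of a Bernoulli(p) variable."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

def markov_entropy_rate(p01, p10):
    """Entropy rate (bits/step) of a two-state Markov chain with transition
    probabilities p01 (no-spike -> spike) and p10 (spike -> no-spike)."""
    pi0 = p10 / (p01 + p10)        # stationary probability of the no-spike state
    return pi0 * binary_entropy(p01) + (1 - pi0) * binary_entropy(p10)

def spike_std(p01, p10):
    """Standard deviation of the stationary 0/1 spike indicator."""
    pi1 = p01 / (p01 + p10)        # stationary probability of the spike state
    return float(np.sqrt(pi1 * (1 - pi1)))

# The quotient of interest, entropy rate over spike fluctuation:
print(markov_entropy_rate(0.2, 0.6) / spike_std(0.2, 0.6))
```

Sweeping p01 and p10 and plotting this quotient reproduces the kind of maximum-versus-bounded-away-from-zero behavior the abstract describes for the two regimes of s.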

16.
17.
In order to study the spread of an epidemic over a region as a function of time, we introduce an entropy ratio U describing the uniformity of infections over various states and their districts, and an entropy concentration coefficient C = 1 − U. The latter is a multiplicative version of the Kullback–Leibler distance, with values between 0 and 1. For product measures and self-similar phenomena, it does not depend on the measurement level. Hence, C is an alternative to Gini's concentration coefficient for measures with variation on different levels. Simple examples concern population density and gross domestic product. Application to time-series patterns is indicated with a Markov chain. For the COVID-19 pandemic, entropy ratios indicate a homogeneous distribution of infections and the potential of local action when compared with measures for a whole region.
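A coefficient of this kind can be sketched as follows. Reading "a multiplicative version of the Kullback–Leibler distance" as U = exp(−KL) between observed shares (e.g. infections per district) and reference shares (e.g. population per district) is an assumption based only on this abstract, not on the paper's text, so treat the sketch as illustrative.

```python
import numpy as np

def concentration_coefficient(q, p):
    """Entropy concentration C = 1 - U with U = exp(-KL(q || p)), where q are
    observed shares (e.g. infections) and p are reference shares (e.g. population).
    ASSUMPTION: this exponentiated-KL reading of U is inferred from the abstract."""
    q = np.asarray(q, dtype=float); q = q / q.sum()
    p = np.asarray(p, dtype=float); p = p / p.sum()
    mask = q > 0
    kl = float(np.sum(q[mask] * np.log(q[mask] / p[mask])))
    return 1.0 - np.exp(-kl)

population = [10, 20, 30, 40]
uniform_infections = [1, 2, 3, 4]   # proportional to population: C = 0
clustered = [0, 0, 0, 100]          # all cases in one district: larger C
print(concentration_coefficient(uniform_infections, population))
print(concentration_coefficient(clustered, population))
```

C = 0 means infections are spread exactly in proportion to the reference measure; values approaching 1 indicate strong spatial concentration, the case where local rather than region-wide action is warranted.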

18.
This paper discusses the estimation of the stress-strength reliability parameter R = P(Y < X) based on complete samples, when the stress and strength are two independent Poisson half-logistic random variables (PHLD). We address the estimation of R in the general case and when the scale parameter is common. Classical and Bayesian estimation (BE) techniques for R are studied. The maximum likelihood estimator (MLE) and its asymptotic distribution are obtained, and an approximate asymptotic confidence interval for R is computed using the asymptotic distribution. The non-parametric percentile bootstrap and Student's bootstrap confidence intervals for R are discussed. The Bayes estimators of R are computed using a gamma prior and discussed under various loss functions, such as the squared error loss (SEL), absolute error loss (AEL), linear exponential loss (LINEX), generalized entropy loss (GEL), and maximum a posteriori (MAP). The Metropolis–Hastings algorithm is used to estimate the posterior distributions of the estimators of R. The highest posterior density (HPD) credible interval is constructed based on the SEL. Monte Carlo simulations are used to numerically analyze the performance of the MLE and Bayes estimators; the results were quite satisfactory in terms of mean square error (MSE) and confidence intervals. Finally, we use two real data studies to demonstrate the performance of the proposed estimation techniques in practice and to illustrate how the PHLD is a good candidate for reliability studies.
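The quantity R = P(Y < X) also has a simple nonparametric estimate that serves as a baseline for the parametric estimators above: the fraction of sample pairs in which the stress is below the strength. The sketch below uses generic exponential samples, not PHLD-distributed data, purely to check the estimator against a known closed form.

```python
import numpy as np

def reliability_estimate(x, y):
    """Nonparametric estimate of R = P(Y < X): the fraction of (x_i, y_j)
    pairs with y_j < x_i, for independent strength (x) and stress (y) samples."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return float(np.mean(y[None, :] < x[:, None]))

rng = np.random.default_rng(0)
strength = rng.exponential(2.0, size=2000)  # synthetic strength sample, rate 1/2
stress = rng.exponential(1.0, size=2000)    # synthetic stress sample, rate 1
# For independent exponentials, P(Y < X) = rate_y / (rate_x + rate_y) = 2/3 here
print(reliability_estimate(strength, stress))
```

This two-sample proportion is the Mann-Whitney statistic divided by the number of pairs, so its large-sample behavior is well understood and it needs no distributional assumptions.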

19.
This article deals with the compression of binary sequences with a given number of ones, which can also be considered as a list of indexes of a given length. The first part of the article shows that the entropy H of a random n-element binary sequence with exactly k elements equal to one satisfies the inequalities k·log2(0.48·n/k) < H < k·log2(2.72·n/k). Based on this result, we propose a simple coding using fixed-length words. Its main application is the compression of random binary sequences with a large disproportion between the number of zeros and the number of ones. Importantly, the proposed solution allows much faster decompression compared with Golomb–Rice coding, with a relatively small decrease in compression efficiency. The proposed algorithm can be particularly useful for database applications, for which the speed of decompression is much more important than the degree of index-list compression.
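The stated bounds are easy to check numerically: a uniformly random n-bit sequence with exactly k ones has exact entropy H = log2 C(n, k), which the abstract brackets between k·log2(0.48·n/k) and k·log2(2.72·n/k). A quick check for a sparse case (note 2.72 ≈ e, matching the classical bound log2 C(n, k) ≤ k·log2(e·n/k)):

```python
import math

def sparse_sequence_entropy(n, k):
    """Exact entropy (bits) of a uniformly random binary n-sequence with k ones."""
    return math.log2(math.comb(n, k))

n, k = 10_000, 100
H = sparse_sequence_entropy(n, k)
lower = k * math.log2(0.48 * n / k)
upper = k * math.log2(2.72 * n / k)
print(lower < H < upper)  # the bounds hold for this large-disproportion case
```

The gap between the two bounds is a fixed k·log2(2.72/0.48) ≈ 2.5·k bits, which is what makes a simple fixed-length-word code competitive for very sparse sequences.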

20.
We discuss a covariant relativistic Boltzmann equation which describes the evolution of a system of particles in spacetime evolving with a universal invariant parameter τ. The observed time t of Einstein and Maxwell, in the presence of interaction, is not necessarily a monotonic function of τ. If t(τ) increases with τ, the worldline may be associated with a normal particle, but if it decreases with τ, it is observed in the laboratory as an antiparticle. This paper discusses the implications for entropy evolution in this relativistic framework. It is shown that if an ensemble of particles and antiparticles converges in a region of pair annihilation, the entropy of the antiparticle beam may decrease in time.

