1.
Shannon’s entropy is one of the building blocks of information theory and an essential aspect of Machine Learning (ML) methods (e.g., Random Forests). Yet, it is only finitely defined for distributions with fast-decaying tails on a countable alphabet. The unboundedness of Shannon’s entropy over the general class of all distributions on an alphabet prevents its potential utility from being fully realized. To fill this void in the foundation of information theory, Zhang (2020) proposed generalized Shannon’s entropy, which is finitely defined everywhere. The plug-in estimator, adopted in almost all entropy-based ML method packages, is one of the most popular approaches to estimating Shannon’s entropy. The asymptotic distribution of the plug-in estimator of Shannon’s entropy has been well studied in the existing literature. This paper studies the asymptotic properties of the plug-in estimator of generalized Shannon’s entropy on countable alphabets. The developed asymptotic properties require no assumptions on the original distribution and allow for interval estimation and statistical tests with generalized Shannon’s entropy.
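For concreteness, the plug-in estimator named above simply substitutes empirical letter frequencies into the entropy formula. A minimal sketch of that classical estimator (not the authors' code; the function name and the choice of natural logarithm are illustrative):

```python
import numpy as np
from collections import Counter

def plug_in_entropy(samples):
    """Plug-in (maximum likelihood) estimator of Shannon's entropy in nats:
    substitute the empirical frequencies p_hat into H = -sum(p * log p).
    Only observed letters contribute, since 0 * log 0 := 0."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p_hat = counts / counts.sum()          # empirical distribution
    return -np.sum(p_hat * np.log(p_hat))

# Example: entropy of the empirical letter distribution of a string.
print(plug_in_entropy("abracadabra"))
```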
2.
Mohamed A. Abd Elgawad, Haroon M. Barakat, Shengwu Xiong, Salem A. Alyami, Entropy (Basel, Switzerland), 2021, 23(3)
In this paper, we study the concomitants of dual generalized order statistics (and consequently generalized order statistics) from the Huang–Kotz Farlie–Gumbel–Morgenstern bivariate distribution when the parameters are assumed to be pairwise different. Some useful recurrence relations between single and product moments of concomitants are obtained. Moreover, Shannon’s entropy and the Fisher information number are derived. Finally, these measures are studied in detail for some well-known distributions, such as the exponential, Pareto and power distributions; the entropy of the exponential case is recalled below. The main motivation for studying the concomitants of generalized order statistics (an important practical means of ordering bivariate data) under this general framework is to enable researchers in different fields of statistics to use the important models contained in generalized order statistics within a single framework. These models are frequently used in reliability theory, such as the progressive Type-II censored order statistics.
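For X ~ Exp(λ) with density f(x) = λe^{−λx}, the differential entropy has a well-known closed form (a standard result, stated here only as a reference point for the exponential case, not a derivation from the paper):

```latex
% Differential (Shannon) entropy of the exponential distribution:
H(X) \;=\; -\int_{0}^{\infty} f(x)\,\ln f(x)\,\mathrm{d}x \;=\; 1 - \ln\lambda .
```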
3.
We introduce a quantum key distribution protocol based on the mean multi-kings’ problem. Using this protocol, a sender can share a bit sequence as a secret key with multiple receivers. We consider the relation between the information gained by an eavesdropper and the disturbance introduced into the legitimate users’ information; in the BB84 protocol, this relation is known as the information–disturbance theorem. We focus on a setting in which the sender and two receivers try to share bit sequences while the eavesdropper tries to extract information by letting an ancilla system interact with the legitimate users’ systems. We derive trade-off inequalities between the eavesdropper’s distinguishability of the quantum states corresponding to the bit sequence and the error probability of the bit sequence shared by the legitimate users. Our inequalities show that the eavesdropper’s extraction of information about the secret key inevitably disturbs the states and increases the error probability.
4.
This paper shows whether and how the predictability and complexity of stock market data have changed over the last half-century and what influence the M1 money supply has. We use three different machine learning algorithms, i.e., a stochastic gradient descent linear regression, a lasso regression, and an XGBoost tree regression, to test the predictability of two stock market indices, the Dow Jones Industrial Average and the NASDAQ (National Association of Securities Dealers Automated Quotations) Composite. In addition, all data under study are characterized using a variety of measures of signal complexity. The results of this complexity analysis are then linked with the machine learning results to discover trends and correlations between predictability and complexity. Our results show a decrease in predictability and an increase in complexity in more recent years. We find a correlation between approximate entropy, sample entropy, and the predictability of the employed machine learning algorithms on the data under study. This link between the predictability of machine learning algorithms and these entropy measures has not been shown before, and it should be considered when analyzing and predicting complex time series data such as stock market data, for example, to identify regions of increased predictability.
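Sample entropy, one of the complexity measures the paper correlates with predictability, is the negative log ratio of template-match counts at embedding lengths m + 1 and m. A minimal sketch under common parameter conventions (m = 2, r = 0.2·std), which are assumptions, not values taken from the paper:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy: -ln(A/B), where B counts pairs of length-m templates
    within tolerance r (Chebyshev distance) and A counts the same for
    length m + 1. Lower values indicate a more regular, more predictable
    series. O(n^2) memory; fine for short illustrative series."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()  # common heuristic tolerance

    def count_matches(k):
        # All length-k templates, pairwise Chebyshev distances, close pairs i < j.
        t = np.array([x[i:i + k] for i in range(len(x) - k + 1)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        return np.sum(d[np.triu_indices(len(t), k=1)] <= r)

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```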
5.
There are various distributions of image histograms, where regions form symmetrically or asymmetrically depending on the frequency of the intensity levels in the image. In image processing, optimal thresholding aims to accurately separate the regions of the image histogram to obtain the segmented image. Otsu’s method is the most widely used thresholding technique in image segmentation. The Otsu algorithm performs automatic image thresholding and returns the optimal threshold by maximizing the between-class variance, modeling the intensity levels in the histogram as a sum of Gaussian distributions. However, various types of images have right-skewed intensity histograms that do not fit the between-class variance of the original Otsu algorithm. In this paper, we propose an improved between-class variance based on the lognormal distribution, using the mean and the variance of the lognormal. The proposed model aims to handle the drawbacks of asymmetric distributions, especially for images with right-skewed intensity levels. Several images, including simulated images and Magnetic Resonance Imaging (MRI) scans of brain tumors, were segmented with the proposed model in parallel with the original Otsu method and the related improvement. Two types of evaluation measures, based on unsupervised and supervised metrics, were used in this work. The proposed model showed superior results, and the segmented images indicated better threshold estimation than the original Otsu method and the related improvement.
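For reference, the baseline that the paper modifies is the classical Otsu criterion: scan all candidate thresholds and keep the one maximizing the between-class variance of the histogram. A minimal sketch of that baseline only (not the proposed lognormal variant; the bin count is an illustrative choice):

```python
import numpy as np

def otsu_threshold(image, bins=256):
    """Classic Otsu thresholding: return the gray level that maximizes the
    between-class variance sigma_b^2(t) = [mu_T*w0(t) - mu(t)]^2 / [w0(t)*w1(t)]
    of the intensity histogram."""
    hist, edges = np.histogram(image.ravel(), bins=bins)
    p = hist / hist.sum()                   # intensity probabilities
    levels = (edges[:-1] + edges[1:]) / 2   # bin centers
    w0 = np.cumsum(p)                       # class-0 weight per candidate
    w1 = 1.0 - w0                           # class-1 weight
    mu = np.cumsum(p * levels)              # cumulative mean
    mu_t = mu[-1]                           # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * w1)
    sigma_b = np.nan_to_num(sigma_b)        # empty classes contribute nothing
    return levels[np.argmax(sigma_b)]
```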
6.
Igal Sason, Entropy (Basel, Switzerland), 2021, 23(3)
This paper studies the problem of upper bounding the number of independent sets in a graph, expressed in terms of its degree distribution. For bipartite regular graphs, Kahn (2001) established a tight upper bound using an information-theoretic approach, and he also conjectured an upper bound for general graphs. His conjectured bound was recently proved by Sah et al. (2019), using different techniques not involving information theory. The main contribution of this work is the extension of Kahn’s information-theoretic proof technique to handle irregular bipartite graphs. In particular, when the bipartite graph is regular on one side, but may be irregular on the other, the extended entropy-based proof technique yields the same bound as was conjectured by Kahn (2001) and proved by Sah et al. (2019).
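For reference, the two bounds discussed are restated below from the cited works (notation adapted: i(G) is the number of independent sets of G, d_u the degree of vertex u):

```latex
% Kahn (2001): for a d-regular bipartite graph G on n vertices,
i(G) \;\le\; \bigl(2^{d+1} - 1\bigr)^{\frac{n}{2d}} .
% Kahn's conjecture for general graphs without isolated vertices,
% proved by Sah et al. (2019):
i(G) \;\le\; \prod_{uv \in E(G)} \bigl(2^{d_u} + 2^{d_v} - 1\bigr)^{\frac{1}{d_u d_v}} .
```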
7.
Entropy measures the uncertainty associated with a random variable. It has important applications in cybernetics, probability theory, astrophysics, the life sciences and other fields. Recently, many authors have focused on the estimation of entropy for different life distributions. However, the estimation of entropy for the generalized Bilal (GB) distribution has not yet been addressed. In this paper, we consider the estimation of the entropy and the parameters of the GB distribution based on adaptive Type-II progressive hybrid censored data. Maximum likelihood estimates of the entropy and the parameters are obtained using the Newton–Raphson iteration method. Bayesian estimates under different loss functions are provided with the help of Lindley’s approximation. The approximate confidence intervals and the Bayesian credible intervals of the parameters and entropy are obtained using the delta method and Markov chain Monte Carlo (MCMC) methods, respectively. Monte Carlo simulation studies are carried out to assess the performance of the different point and interval estimates. Finally, a real data set is analyzed for illustrative purposes.
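The Newton–Raphson iteration used for such maximum likelihood fits is generic: repeatedly solve the linearized score equation. A minimal sketch with the GB-specific score and Hessian left as caller-supplied callables; this API is hypothetical, and the paper's closed-form expressions are not reproduced here:

```python
import numpy as np

def newton_raphson_mle(score, hessian, theta0, tol=1e-8, max_iter=100):
    """Generic Newton-Raphson iteration for a maximum likelihood estimate:
    at each step solve hessian(theta) @ step = -score(theta), where `score`
    and `hessian` are the gradient and Hessian of the log-likelihood.
    The distribution-specific expressions would be supplied by the caller."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(max_iter):
        step = np.linalg.solve(hessian(theta), -score(theta))
        theta = theta + step
        if np.max(np.abs(step)) < tol:  # converged when the update is tiny
            break
    return theta
```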
8.
In this paper, advanced wall-modeled large eddy simulation (LES) techniques are used to predict conjugate heat transfer processes in turbulent channel flow, in which the thermal energy transfer involves an interaction between conduction within a solid body and convection from the solid surface by fluid motion. The approaches comprise a two-layer RANS–LES approach (zonal LES), a hybrid RANS–LES representative, the so-called improved delayed detached eddy simulation method (IDDES), and a non-equilibrium wall function model (WFLES). The results obtained are evaluated against direct numerical simulation (DNS) data and wall-resolved LES, including thermal cases at large Reynolds numbers for which DNS data are not available in the literature. It turns out that zonal LES, IDDES and WFLES are able to predict heat and fluid flow statistics, along with wall shear stresses and Nusselt numbers, accurately and in a physically consistent manner. Furthermore, IDDES, WFLES and zonal LES exhibit significantly lower computational costs than wall-resolved LES. Since IDDES and especially zonal LES require considerable extra work to generate numerical grids, this study indicates in particular that WFLES offers a promising near-wall modeling strategy for LES of conjugate heat transfer problems. Finally, an entropy generation analysis using the various models showed that the viscous entropy production is zero inside the solid region, peaks at the solid–fluid interface and decreases rapidly with increasing wall distance within the fluid region. A similar behavior is observed for the entropy generation by heat transfer, except inside the solid region, where steep temperature gradients lead to high thermal entropy generation rates.
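For orientation, the textbook form of the local volumetric entropy generation rate separates exactly the two contributions reported (a standard expression, not quoted from the paper; λ is the thermal conductivity, μ the dynamic viscosity, Φ the viscous dissipation function):

```latex
% Local entropy generation rate per unit volume:
% heat-transfer contribution + viscous contribution.
S'''_{\mathrm{gen}} \;=\; \frac{\lambda}{T^{2}}\,\nabla T \cdot \nabla T
\;+\; \frac{\mu}{T}\,\Phi .
```

Inside the solid, the viscous term vanishes identically, consistent with the reported zero viscous entropy production there.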
9.
Marcos Revilla-Vallejo, Jesús Poza, Javier Gomez-Pilar, Roberto Hornero, Miguel Ángel Tola-Arribas, Mónica Cano, Carlos Gómez, Entropy (Basel, Switzerland), 2021, 23(5)
Alzheimer’s disease (AD) is a neurodegenerative disorder that has become an outstanding social problem. The main objective of this study was to evaluate the alterations that dementia due to AD elicits in the distribution of functional network weights. Functional connectivity networks were obtained using the orthogonalized Amplitude Envelope Correlation (AEC), computed from source-reconstructed resting-state electroencephalographic (EEG) data in a population of 45 cognitively healthy elderly controls, 69 patients with mild cognitive impairment (MCI) and 81 AD patients. Our results indicate that AD induces a progressive alteration of the network weight distribution; specifically, the Shannon entropy (SE) of the weight distribution showed statistically significant between-group differences (p < 0.05, Kruskal–Wallis test, False Discovery Rate corrected). Furthermore, an in-depth analysis of the network weight distributions was performed in the delta, alpha, and beta-1 frequency bands to identify the weight ranges showing statistical differences in SE. Our results showed that the lower and higher weights were more affected by the disease, whereas mid-range connections remained unchanged. These findings support the importance of performing detailed analyses of the network weight distribution to further understand the impact of AD progression on functional brain activity.
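A minimal sketch of the kind of weight-distribution entropy analysis described: histogram the off-diagonal connectivity weights and compute their Shannon entropy. The function name, bin count and logarithm base are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def weights_shannon_entropy(conn_matrix, bins=32):
    """Shannon entropy of a functional connectivity network's weight
    distribution: histogram the unique off-diagonal weights of a symmetric
    connectivity matrix and compute H = -sum(p * log2(p))."""
    w = conn_matrix[np.triu_indices_from(conn_matrix, k=1)]  # unique pairs
    hist, _ = np.histogram(w, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                       # drop empty bins (0 * log 0 := 0)
    return -np.sum(p * np.log2(p))
```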
10.
Sayed Abdel-Khalek, Kamal Berrada, Eied M. Khalil, Abdel-Shafy F. Obada, Esraa Reda, Hichem Eleuch, Entropy (Basel, Switzerland), 2021, 23(5)
In this work, we use the standard Tavis–Cummings model to describe a two-qubit system interacting with a single-mode field associated with power-law (PL) potentials. We explore the effects of a time-dependent interaction and of a Kerr-like medium. We solve the Schrödinger equation to obtain the density operator, which allows us to investigate the dynamical behaviour of several quantumness measures, such as the von Neumann entropy, negativity and Mandel’s parameter. We show how these measures depend on the system parameters, which paves the way towards better control of entanglement generation in two-qubit systems. We find that the enhancement and preservation of the atoms–field entanglement and the atom–atom entanglement can be achieved by a proper choice of the initial parameters of the field, both in the absence and in the presence of the time-dependent interaction and the Kerr medium. We also examine the photon distribution of the field and determine the situations in which the field exhibits a super-Poissonian, Poissonian or sub-Poissonian distribution.
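Mandel's parameter, the photon-statistics measure mentioned above, has the standard definition (restated here for reference, with n̂ the photon-number operator):

```latex
% Mandel's Q parameter:
Q \;=\; \frac{\langle \hat{n}^{2}\rangle - \langle \hat{n}\rangle^{2}}{\langle \hat{n}\rangle} \;-\; 1 .
```

Q < 0 indicates a sub-Poissonian (nonclassical) distribution, Q = 0 a Poissonian one, and Q > 0 a super-Poissonian one.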
11.
Vladimir V. Aristov, Anatoly S. Buchelnikov, Yury D. Nechipurenko, Entropy (Basel, Switzerland), 2022, 24(2)
Some problems of describing biological systems with the use of entropy as a measure of their complexity are considered. Entropy is studied both for the organism as a whole and for its parts, down to the molecular level. Correlation of the actions of various parts of the whole organism, intercellular interactions and control, as well as cooperativity at the microlevel, lead to a more complex structure and lower statistical entropy. For a multicellular organism, entropy is much lower than the entropy of the same mass of a colony of unicellular organisms. Cooperativity always reduces the entropy of the system; a simple example of ligand binding to a macromolecule carrying two reaction centers shows how entropy relates to the ambiguity of the outcome in the Bernoulli trial scheme. Particular attention is paid to the qualitative and quantitative relationship between the entropy of the system and the cooperativity of ligand binding to macromolecules. A kinetic model of metabolism, corresponding to Schrödinger’s concept of the maintenance of biosystems by “negentropy feeding”, is proposed. This model allows the nonequilibrium local entropy to be calculated and compared with the local equilibrium entropy inherent in non-living matter.