13 similar documents were found (search time: 15 ms).
1.
Task-nuisance decomposition describes why the information bottleneck loss is a suitable objective for supervised learning. The true category y is predicted for input x using latent variables z. When n is a nuisance independent of y, the nuisance information I(z;n) can be decreased by reducing I(z;x), since the latter upper bounds the former. We extend this framework by demonstrating that the conditional mutual information I(z;x|y) provides an alternative upper bound for I(z;n). This bound is applicable even if z is not a sufficient representation of x, that is, even if I(z;y) < I(x;y). We used mutual information neural estimation (MINE) to estimate I(z;x|y). Experiments demonstrated that I(z;x|y) is smaller than I(z;x) for layers closer to the input, matching the claim that the former is a tighter bound than the latter. Because of this difference, the information plane differs when I(z;x|y) is used instead of I(z;x).
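For readers unfamiliar with MINE, the following is a minimal, illustrative sketch of the Donsker–Varadhan lower bound it optimises, not the authors' implementation; the network architecture, sample sizes, and training settings are assumptions, and the conditional estimate I(z;x|y) from the abstract would additionally require, for example, averaging such per-class estimates over y.

```python
# Hypothetical minimal sketch of the MINE lower bound (Belghazi et al., 2018),
# not the authors' code. It estimates the unconditional I(X; Z).
import math
import torch
import torch.nn as nn

class StatisticsNetwork(nn.Module):
    """Small critic T(x, z) whose optimum yields the Donsker-Varadhan bound."""
    def __init__(self, x_dim, z_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + z_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, z):
        return self.net(torch.cat([x, z], dim=1))

def mine_lower_bound(T, x, z):
    # Joint term: samples (x_i, z_i) drawn from the joint p(x, z).
    joint = T(x, z).mean()
    # Marginal term: shuffling z approximates samples from p(x)p(z).
    z_marginal = z[torch.randperm(z.size(0))]
    scores = T(x, z_marginal).squeeze(1)
    marginal = torch.logsumexp(scores, dim=0) - math.log(scores.numel())
    return joint - marginal

# Toy usage: correlated Gaussians with known MI = 0.5 * ln(5) ~ 0.80 nats.
torch.manual_seed(0)
x = torch.randn(512, 1)
z = x + 0.5 * torch.randn(512, 1)
T = StatisticsNetwork(1, 1)
opt = torch.optim.Adam(T.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    (-mine_lower_bound(T, x, z)).backward()  # maximise the bound
    opt.step()
print("estimated I(X;Z) in nats:", mine_lower_bound(T, x, z).item())
```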
2.
In this paper, we introduce and investigate mutual information and relative entropy on sequential effect algebras, and we compare them with their classical counterparts using Venn diagrams. Finally, an example shows that the entropies of a sequential effect algebra depend strongly on the order of its sequential product.
3.
In this paper, we introduce and investigate mutual information and relative entropy on sequential effect algebras, and we compare them with their classical counterparts using Venn diagrams. Finally, an example shows that the entropies of a sequential effect algebra depend strongly on the order of its sequential product.
4.
Achieving dynamical speedup of evolution in an open quantum system plays a key role in many technological applications; however, how to detect quantum speedup is unclear. In this work, a method to witness quantum speedup through the mutual information is presented. It is shown that the speed of evolution of a quantum system can be witnessed by calculating the variation of the mutual information, whose increase is a clear signature of dynamical speedup. The result is illustrated by the time evolution of two qubits under a one-sided noisy channel, where the mechanism behind the quantum speedup is found to be closely associated with the total exchange of information between the system and its environment, as expressed by the variation of the mutual information. Quantitatively, the average speed of evolution is shown to be proportional to the average variation of the mutual information over an interval of time. This conclusion not only explains why quantum entanglement (or quantum coherence) and classical correlation are neither necessary nor sufficient to speed up quantum evolution, but also provides a practical way of detecting quantum speedup in realistic environments.
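As an illustrative aside (my own sketch, not code from the paper), the quantum mutual information I(A:B) = S(ρ_A) + S(ρ_B) − S(ρ_AB), whose variation the abstract tracks, can be computed directly for a two-qubit state:

```python
# Illustrative sketch: quantum mutual information of a two-qubit state,
# computed from von Neumann entropies of the state and its marginals.
import numpy as np

def von_neumann_entropy(rho):
    # S(rho) = -Tr[rho log2 rho], computed from the eigenvalues.
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-(evals * np.log2(evals)).sum())

def partial_trace(rho, keep):
    # Partial trace of a two-qubit density matrix; keep = 0 keeps qubit A,
    # keep = 1 keeps qubit B.
    rho = rho.reshape(2, 2, 2, 2)
    return np.trace(rho, axis1=1, axis2=3) if keep == 0 else np.trace(rho, axis1=0, axis2=2)

def mutual_information(rho_ab):
    rho_a = partial_trace(rho_ab, keep=0)
    rho_b = partial_trace(rho_ab, keep=1)
    return von_neumann_entropy(rho_a) + von_neumann_entropy(rho_b) - von_neumann_entropy(rho_ab)

# Werner-like family: a Bell state mixed with white noise, p controls mixing.
bell = np.zeros((4, 4))
bell[0, 0] = bell[0, 3] = bell[3, 0] = bell[3, 3] = 0.5
for p in (1.0, 0.5, 0.0):
    rho = p * bell + (1 - p) * np.eye(4) / 4
    print(f"p = {p:.1f}: I(A:B) = {mutual_information(rho):.3f} bits")
```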
5.
To perform multi-level image segmentation automatically, this paper starts from the relationship between the original image and the segmented image, takes maximum mutual information as the optimization objective for segmentation, and uses the difference of mutual information entropy as a new criterion for the number of classes. On the basis of an improved traditional pulse-coupled neural network (PCNN) model, a multi-level image segmentation algorithm based on maximum mutual information and an improved PCNN is proposed. Theoretical analysis and experimental results show that the method can automatically determine the optimal number of segmentation iterations and the optimal number of grey-level classes, and that it partitions image features well. When the number of classes is small, it preserves image detail, texture, and edge information well; it segments different images with high accuracy and has strong applicability.
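As a rough illustration of the optimisation target (a sketch under my own assumptions, not the paper's improved PCNN algorithm), the mutual information between the original grey levels and a candidate segmentation can be computed from their joint histogram:

```python
# Sketch: mutual information between an original grey-level image and a
# multi-level segmentation, estimated from the joint histogram.
import numpy as np

def mutual_information(original, segmented, bins=256):
    # Joint histogram of (grey level, segment label), normalised to a pmf.
    joint, _, _ = np.histogram2d(original.ravel(), segmented.ravel(), bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask])).sum())

# Toy usage: quantising into more grey-level classes retains more information.
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(128, 128)).astype(float)
for n_classes in (2, 4, 8):
    labels = np.floor(image / 256 * n_classes)
    print(n_classes, "classes:", round(mutual_information(image, labels), 3), "bits")
```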
6.
Belavkin–Staszewski relative entropy naturally characterizes the effects of the possible noncommutativity of quantum states. In this paper, two new conditional entropy terms and four new mutual information terms are first defined by replacing the quantum relative entropy with the Belavkin–Staszewski relative entropy. Next, their basic properties are investigated, especially in classical-quantum settings. In particular, we show the weak concavity of the Belavkin–Staszewski conditional entropy and obtain a chain rule for the Belavkin–Staszewski mutual information. Finally, the subadditivity of the Belavkin–Staszewski relative entropy is established: the Belavkin–Staszewski relative entropy of a joint system is less than the sum of those of its subsystems, up to certain multiplicative and additive factors. We also provide a corresponding subadditivity result for the geometric Rényi relative entropy.
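For concreteness (an illustrative numeric sketch under my own assumptions, not material from the paper), the Belavkin–Staszewski relative entropy D_BS(ρ‖σ) = Tr[ρ log(ρ^{1/2} σ^{-1} ρ^{1/2})] can be compared numerically with the standard (Umegaki) quantum relative entropy, which it upper-bounds for noncommuting states:

```python
# Sketch: Belavkin-Staszewski vs. Umegaki relative entropy on random states.
import numpy as np
from scipy.linalg import logm, sqrtm, inv

def random_density_matrix(d, rng):
    a = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = a @ a.conj().T                    # positive semidefinite
    return rho / np.trace(rho).real         # unit trace

def umegaki(rho, sigma):
    # D(rho||sigma) = Tr[rho (log rho - log sigma)]
    return np.trace(rho @ (logm(rho) - logm(sigma))).real

def belavkin_staszewski(rho, sigma):
    # D_BS(rho||sigma) = Tr[rho log(rho^{1/2} sigma^{-1} rho^{1/2})]
    s = sqrtm(rho)
    return np.trace(rho @ logm(s @ inv(sigma) @ s)).real

rng = np.random.default_rng(1)
rho, sigma = random_density_matrix(3, rng), random_density_matrix(3, rng)
print("D(rho||sigma)    =", round(umegaki(rho, sigma), 4))
print("D_BS(rho||sigma) =", round(belavkin_staszewski(rho, sigma), 4))  # >= Umegaki
```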
7.
This paper describes a new model for portfolio optimization (PO) that uses entropy and mutual information instead of variance and covariance as measurements of risk. We also compare the in-sample and out-of-sample performance of the original Markowitz model against the proposed model and against other state-of-the-art shrinkage methods. We find that ME (mean-entropy) models do not always outperform their MV (mean-variance) and robust counterparts, although they have an edge in terms of portfolio diversity measures, especially portfolio weight entropy. Furthermore, as return constraints on the optimization are tightened, ME models are more stable overall, showing dampened responses in cumulative returns and Sharpe indexes compared with MV and robust methods, but they concentrate their portfolios more rapidly because they are more evenly spread initially. Finally, the results suggest that, depending on the market, increasing the return constraints may have positive or negative impacts on out-of-sample performance.
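As a rough sketch of the ingredients (my own assumptions about discretisation, not the paper's exact ME model), a histogram-based mutual-information matrix can play the role that covariance plays in the Markowitz model, and the portfolio-weight entropy mentioned above measures diversification:

```python
# Sketch: mutual-information matrix between asset returns and
# portfolio-weight entropy as a diversification measure.
import numpy as np

def pairwise_mutual_information(returns, bins=10):
    # returns: (T, n_assets) array of log returns.
    n = returns.shape[1]
    mi = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            joint, _, _ = np.histogram2d(returns[:, i], returns[:, j], bins=bins)
            p_xy = joint / joint.sum()
            p_x = p_xy.sum(axis=1, keepdims=True)
            p_y = p_xy.sum(axis=0, keepdims=True)
            mask = p_xy > 0
            mi[i, j] = (p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask])).sum()
    return mi

def weight_entropy(weights):
    # Shannon entropy of the portfolio weights: higher means more diversified.
    w = np.asarray(weights)
    w = w[w > 0]
    return float(-(w * np.log(w)).sum())

rng = np.random.default_rng(0)
cov = [[1, .6, .1], [.6, 1, .2], [.1, .2, 1]]
returns = rng.multivariate_normal([0, 0, 0], cov, size=1000)
print(np.round(pairwise_mutual_information(returns), 3))
print("weight entropy of equal weights:", round(weight_entropy([1/3, 1/3, 1/3]), 3))
```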
8.
Galen Reeves. Entropy (Basel, Switzerland), 2020, 22(11).
This paper explores some applications of a two-moment inequality for the integral of the rth power of a function, where 0 < r < 1. The first contribution is an upper bound on the Rényi entropy of a random vector in terms of two different moments. When one of the moments is the zeroth moment, these bounds recover previous results based on maximum-entropy distributions under a single moment constraint. More generally, evaluating the bound with two carefully chosen nonzero moments can lead to significant improvements with a modest increase in complexity. The second contribution is a method for upper bounding mutual information in terms of certain integrals with respect to the variance of the conditional density. The bounds have a number of useful properties arising from the connection with variance decompositions.
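For context (a standard definition, not stated in the abstract itself), the differential Rényi entropy of order r ties the integral of the rth power of a density f directly to the quantity being bounded:

```latex
h_r(X) \;=\; \frac{1}{1-r}\,\log \int f(x)^r \,\mathrm{d}x, \qquad 0 < r < 1.
```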
9.
This paper empirically examines long memory and bi-directional information flow between the estimated volatilities of highly volatile time series of five cryptocurrencies. We propose employing the Garman–Klass (GK), Parkinson, Rogers–Satchell (RS), and Garman–Klass–Yang–Zhang (GK-YZ) open-high-low-close (OHLC) volatility estimators to estimate the cryptocurrencies' volatilities. The study applies methods such as mutual information, transfer entropy (TE), effective transfer entropy (ETE), and Rényi transfer entropy (RTE) to quantify the information flow between the estimated volatilities. Additionally, Hurst exponent computations examine the existence of long memory in log returns and OHLC volatilities based on simple R/S, corrected R/S, empirical, corrected empirical, and theoretical methods. Our results confirm the long-run dependence and non-linear behavior of all cryptocurrencies' log returns and volatilities. In our analysis, TE and ETE estimates are statistically significant for all OHLC estimates. We report the highest information flow from BTC to LTC volatility (RS). Similarly, BNB and XRP share the most prominent information flow between volatilities estimated by GK, Parkinson, and GK-YZ. The study presents the practical addition of OHLC volatility estimators for quantifying information flow and provides an additional choice to compare with other volatility estimators, such as stochastic volatility models.
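For reference (textbook forms of two of the estimators named above, not the authors' implementation), the Parkinson and Garman–Klass per-period variance estimates are computed from OHLC prices as follows:

```python
# Sketch: range-based per-period variance estimators from OHLC prices.
import numpy as np

def parkinson_variance(high, low):
    # Parkinson (1980): uses only the high-low range.
    return (np.log(high / low) ** 2) / (4 * np.log(2))

def garman_klass_variance(open_, high, low, close):
    # Garman-Klass (1980): combines the range with the open-close move.
    hl = np.log(high / low)
    co = np.log(close / open_)
    return 0.5 * hl ** 2 - (2 * np.log(2) - 1) * co ** 2

# Toy usage on a single period's OHLC quote (values are made up).
o, h, l, c = 100.0, 104.0, 98.0, 101.0
print("Parkinson variance:   ", round(float(parkinson_variance(h, l)), 6))
print("Garman-Klass variance:", round(float(garman_klass_variance(o, h, l, c)), 6))
```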
10.
The purpose of this study was to compare the non-uniform image quality caused by the anode heel effect between two radiographic systems using a circular step-wedge (CSW) phantom and the normalized mutual information (nMI) metric. Ten repeated radiographic images of the CSW and contrast-detail resolution (CDR) phantoms were acquired from two digital radiographic systems with 16- and 12-degree anode angles, respectively, using various kVp and mAs settings. To compare non-uniform image quality, the CDR phantom was physically rotated to different orientations; the directional nMI metrics were calculated from the CSW images, and the directional visible ratio (VR) metrics were calculated from the CDR images. Analysis of variance (ANOVA) with Bonferroni correction was performed to determine whether the nMI metric changed significantly with kVp, mAs, and orientation. The Mann–Whitney U test was performed to compare the metrics between the two systems. In contrast to the VR metrics, the nMI metrics changed significantly with orientation in both radiographic systems. In addition, the system with the 12-degree anode angle exhibited less uniform image quality than the system with the 16-degree anode angle. A CSW phantom used with the directional nMI metric can be helpful for comparing non-uniform image quality between two digital radiographic systems.
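As an aside (a sketch of one common normalisation, which may differ from the paper's exact directional metric), the nMI between two image regions can be computed from their joint grey-level histogram:

```python
# Sketch: normalised mutual information nMI = (H(A) + H(B)) / H(A, B)
# between two image regions, estimated from histograms.
import numpy as np

def normalized_mutual_information(img_a, img_b, bins=64):
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1)
    p_y = p_xy.sum(axis=0)

    def entropy(p):
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    return float((entropy(p_x) + entropy(p_y)) / entropy(p_xy.ravel()))

# Toy usage: identical regions give the maximum value of 2; unrelated
# noise patches fall closer to 1.
rng = np.random.default_rng(0)
region = rng.integers(0, 256, size=(64, 64)).astype(float)
other = rng.integers(0, 256, size=(64, 64)).astype(float)
print("self: ", round(normalized_mutual_information(region, region), 3))
print("noise:", round(normalized_mutual_information(region, other), 3))
```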
11.
12.
Mohamed A. Abd Elgawad, Haroon M. Barakat, Shengwu Xiong, Salem A. Alyami. Entropy (Basel, Switzerland), 2021, 23(3).
In this paper, we study the concomitants of dual generalized order statistics (and consequently of generalized order statistics) when the parameters are assumed to be pairwise different, based on the Huang–Kotz Farlie–Gumbel–Morgenstern bivariate distribution. Some useful recurrence relations between single and product moments of concomitants are obtained. Moreover, Shannon's entropy and the Fisher information number measures are derived. Finally, these measures are studied extensively for some well-known distributions, such as the exponential, Pareto, and power distributions. The main motivation for studying the concomitants of generalized order statistics (an important practical way of ordering bivariate data) under this general framework is to enable researchers in different fields of statistics to use some of the important models contained in generalized order statistics within a single unified framework. These extended models are frequently used in reliability theory, for example the progressive type-II censored order statistics.
13.
Arieh Ben-Naim. Entropy (Basel, Switzerland), 2022, 24(11).
In 2015, I wrote a book with the same title as this article. The book’s subtitle is: “What we know and what we do not know.” On the book’s dedication page, I wrote: “This book is dedicated to readers of popular science books who are baffled, perplexed, puzzled, astonished, confused, and discombobulated by reading about Information, Entropy, Life and the Universe.” In the first part of this article, I will present the definitions of two central concepts: the “Shannon measure of information” (SMI), in Information Theory, and “Entropy”, in Thermodynamics. Following these definitions, I will discuss the framework of their applicability. In the second part of the article, I will examine the question of whether living systems and the entire universe are, or are not within the framework of applicability of the concepts of SMI and Entropy. I will show that much of the confusion that exists in the literature arises because of people’s ignorance about the framework of applicability of these concepts.