Similar documents
20 similar documents found (search time: 15 ms)
1.
NIFTy, “Numerical Information Field Theory,” is a software framework designed to ease the development and implementation of field inference algorithms. Field equations are formulated independently of the underlying spatial geometry, allowing the user to focus on the algorithmic design. Under the hood, NIFTy ensures that the discretization of the implemented equations is consistent. This enables the user to prototype an algorithm rapidly in 1D and then apply it to high-dimensional real-world problems. This paper introduces NIFTy 3, a major upgrade to the original NIFTy framework. NIFTy 3 allows the user to run inference algorithms on massively parallel high performance computing clusters without changing the implementation of the field equations. It supports n-dimensional Cartesian spaces, spherical spaces, power spaces, and product spaces, as well as transforms to their harmonic counterparts. Furthermore, NIFTy 3 is able to handle non-scalar fields, such as vector or tensor fields. The functionality and performance of the software package are demonstrated with example code, which implements a mock inference inspired by a real-world algorithm from the realm of information field theory. NIFTy 3 is open-source software available under the GNU General Public License v3 (GPL-3) at https://gitlab.mpcdf.mpg.de/ift/NIFTy/tree/NIFTy_3.
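The paper's example code uses NIFTy itself; as a rough illustration of the kind of field inference such a framework wraps, the following plain-NumPy sketch reconstructs a 1D field with a Wiener filter. The grid size, prior correlation length, mask, and noise level are arbitrary assumptions, and nothing below is NIFTy API.

# Hedged sketch: 1-D Wiener-filter field reconstruction in plain NumPy.
# Not NIFTy code; all numbers below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 256                                        # pixels of the 1-D field
x = np.arange(n)

# Gaussian signal prior with a smooth (squared-exponential) covariance S.
ell, sig = 12.0, 1.0
S = sig**2 * np.exp(-0.5 * ((x[:, None] - x[None, :]) / ell) ** 2)
S += 1e-8 * np.eye(n)                          # jitter for numerical stability
signal = rng.multivariate_normal(np.zeros(n), S)

# Measurement: observe a random half of the pixels (response R) plus white noise.
mask = rng.random(n) < 0.5
R = np.eye(n)[mask]
noise_var = 0.1
data = R @ signal + np.sqrt(noise_var) * rng.normal(size=mask.sum())

# Wiener filter, written as m = S R^T (R S R^T + N)^{-1} d, which is algebraically
# the posterior mean m = D j with D = (S^{-1} + R^T N^{-1} R)^{-1}, j = R^T N^{-1} d.
m = S @ R.T @ np.linalg.solve(R @ S @ R.T + noise_var * np.eye(mask.sum()), data)
print("posterior-mean rms error:", np.sqrt(np.mean((m - signal) ** 2)))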

2.
Information field theory (IFT), the information theory for fields, is a mathematical framework for signal reconstruction and non-parametric inverse problems. Artificial intelligence (AI) and machine learning (ML) aim at generating intelligent systems, including ones for perception, cognition, and learning. This overlaps with IFT, which is designed to address perception, reasoning, and inference tasks. Here, the relation between concepts and tools in IFT and those in AI and ML research is discussed. In the context of IFT, fields denote physical quantities that change continuously as a function of space (and time), and information theory refers to Bayesian probabilistic logic equipped with the associated entropic information measures. Reconstructing a signal with IFT is a computational problem similar to training a generative neural network (GNN) in ML. In this paper, the process of inference in IFT is reformulated in terms of GNN training. In contrast to classical neural networks, IFT-based GNNs can operate without pre-training thanks to incorporating expert knowledge into their architecture. Furthermore, the cross-fertilization of variational inference methods used in IFT and ML is discussed. These discussions suggest that IFT is well suited to address many problems in AI and ML research and application.
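To make the GNN analogy concrete, here is a minimal sketch in which a correlated field is generated from white latent excitations and the latents are then recovered by gradient descent on the posterior, i.e., reconstruction posed as "training". The generator matrix, noise level, and optimiser are illustrative assumptions, not the paper's construction.

# Hedged sketch: IFT-style reconstruction as "training" the latent input of a
# fixed generative model. All choices here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
n = 128
dist = np.abs(np.arange(n)[:, None] - np.arange(n)[None, :])
A = np.exp(-0.5 * (dist / 8.0) ** 2)
A /= np.linalg.norm(A, axis=1, keepdims=True)   # generator: correlated field s = A @ xi

xi_true = rng.normal(size=n)                    # white latent excitations
noise_std = 0.5
data = A @ xi_true + noise_std * rng.normal(size=n)
noise_var = noise_std ** 2

# MAP estimate of xi: minimise ||d - A xi||^2 / (2 noise_var) + ||xi||^2 / 2
# by gradient descent -- the analogue of optimising a generative network's latents.
step = 1.0 / (np.linalg.norm(A, 2) ** 2 / noise_var + 1.0)   # 1 / Lipschitz bound
xi = np.zeros(n)
for _ in range(3000):
    grad = A.T @ (A @ xi - data) / noise_var + xi
    xi -= step * grad

print("rms error of reconstructed field:",
      np.sqrt(np.mean((A @ xi - A @ xi_true) ** 2)))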

3.
Modeling a causal association as arising from a communication process between cause and effect simplifies the discovery of causal skeletons. The communication channels enabling these communication processes are fully characterized by stochastic tensors and therefore allow us to use linear algebra. This tensor-based approach reduces the dimensionality of the data needed to test for conditional independence; e.g., for systems comprising three variables, pair-wise determined tensors suffice to infer the causal skeleton. The only thing needed is a minor extension to information theory, namely the concept of path information.

4.
This note addresses the problem of localization in quantum field theory; more specifically, we contribute to the ongoing discussion about the most appropriate concept of localization in relativistic quantum field theory: through localized test functions, or through the fields directly without localized test functions. In standard quantum field theory, i.e., relativistic quantum field theory in terms of tempered distributions according to Gårding and Wightman, this is done through localized test functions. In hyperfunction quantum field theory (HFQFT), i.e., relativistic quantum field theory in terms of Fourier hyperfunctions, this is done through the fields themselves. In support of the second approach we show here that it has a much wider range of applicability. It can even be applied to relativistic quantum field theories which do not admit compactly supported test functions at all. In our construction of explicit models we rely on basic results from the theory of quasi-analytic functions.

5.
Full waveform inversion is an advantageous technique for obtaining high-resolution subsurface information. In the petroleum industry, mainly in reservoir characterisation, it is common to use information from wells as prior information to decrease the ambiguity of the obtained results. For this, we propose adding a relative entropy term to the formalism of full waveform inversion. In this context, entropy is just a nomenclature for regularisation, and its role is to help convergence to the global minimum. The application of entropy in inverse problems usually involves formulating the problem so that it is possible to use statistical concepts. To avoid this step, we propose a deterministic application to full waveform inversion. We discuss some aspects of relative entropy and show three different ways of using it to add prior information through entropy in the inverse problem. We use a dynamic weighting scheme to add prior information through entropy; the idea is that the prior information can help to find the path to the global minimum at the beginning of the inversion process. In all cases, the prior information can be incorporated very quickly into full waveform inversion and lead the inversion to the desired solution. When we include the logarithmic weighting that constitutes entropy in the inverse problem, we suppress the low-intensity ripples and sharpen the point events. Thus, adding relative entropy to full waveform inversion can provide a result with better resolution. In regions where salt is present in the BP 2004 model, we obtained a significant improvement by adding prior information through the relative entropy for synthetic data. We show that the prior information added through entropy to the full waveform inversion formalism proves to be a way to avoid local minima.
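One plausible way to write such an entropy-regularised objective with the dynamic weighting mentioned above is sketched below; the exact functional and weighting schedule are assumptions for illustration, not necessarily the authors' formulation.

% A possible entropy-regularised FWI objective (assumed form, for illustration):
% least-squares data misfit plus a relative-entropy (Kullback--Leibler) distance
% between the current model m and a prior model m^prior built from well information,
% with a weight lambda_k that decays over iterations k so the prior mainly guides
% the early stage of the inversion.
J_k(m) \;=\; \tfrac{1}{2}\,\bigl\| d^{\mathrm{obs}} - F(m) \bigr\|_2^2
\;+\; \lambda_k \sum_{i} m_i \,\ln\!\frac{m_i}{m_i^{\mathrm{prior}}},
\qquad \lambda_k = \lambda_0\,\gamma^{\,k}, \quad 0 < \gamma < 1,
% where F denotes the forward-modelling (wave-propagation) operator.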

6.
For the AKNS operator on L^2([0,1], C^2) it is well known that the data of two spectra uniquely determine the corresponding potential a.e. on [0,1] (a Borg-type theorem). We prove that, in the case where the potential is a priori known on [a,1], only a part (depending on a) of the two spectra determines the potential on [0,1]. Our results include generalizations, for Dirac systems, of classical results obtained by Hochstadt and Lieberman for the Sturm–Liouville case, where they showed that half of the potential and one spectrum determine the whole potential. An important ingredient in our strategy is the link between the rate of growth of an entire function and the distribution of its zeros.
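For reference, the classical Hochstadt–Lieberman result alluded to here can be stated as follows (in one common normalisation; boundary-condition conventions vary between statements).

% Hochstadt--Lieberman (Sturm--Liouville case), one common formulation:
-y'' + q(x)\,y = \lambda y \quad\text{on } [0,1], \qquad
y'(0) - h\,y(0) = 0, \qquad y'(1) + H\,y(1) = 0,
% with real-valued q \in L^1(0,1). If q is known a.e. on [1/2, 1] (together with
% h and H), then a single spectrum of the problem determines q a.e. on [0, 1/2],
% and hence on all of [0,1].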

7.
We study the scattering theory of fermion systems subject to a smooth local perturbation with a non-vanishing odd part. We introduce modified free fermion fields which have appropriate commutation relations with the free Fock fermion fields. We construct the wave operators using the modified fields and prove asymptotic completeness. Our work extends former results on Hilbert space asymptotic completeness.

8.
Neuroscience makes extensive use of information theory to describe neural communication, among other things to calculate the amount of information transferred in neural communication and to attempt to crack its coding. There are fierce debates on how information is represented in the brain and during transmission inside the brain. Neural information theory attempts to adopt the assumptions of electronic communication, despite the experimental evidence that neural spikes carry information about non-discrete states, that their communication speed is low, and that the spikes' timing precision matters. Furthermore, in biology the communication channel is active, which enforces an additional power-bandwidth limitation on neural information transfer. The paper revisits the notions needed to describe information transfer in technical and biological communication systems. It argues that biology uses Shannon's idea outside its range of validity and introduces an adequate interpretation of information. In addition, the presented time-aware approach to information theory reveals evidence for the role of processes (as opposed to states) in neural operations. The generalized information theory describes both kinds of communication, and the classic theory is a particular case of the generalized theory.

9.
In this Letter we show how the scattering amplitudes of nonrelativistic one-particle Schrödinger operators with a scalar (not necessarily rotation-invariant) potential may be obtained from the scattering cross-sections for the system where a scalar potential is added and whose scattering amplitudes are known explicitly.

10.
In solving challenging pattern recognition problems, deep neural networks have shown excellent performance by forming powerful mappings between inputs and targets, learning representations (features) and making subsequent predictions. A recent tool to help understand how representations are formed is based on observing the dynamics of learning on an information plane using mutual information, linking the input to the representation (I(X;T)) and the representation to the target (I(T;Y)). In this paper, we use an information-theoretic approach to understand how Cascade Learning (CL), a method to train deep neural networks layer-by-layer, learns representations, as CL has shown comparable results while saving computation and memory costs. We observe that performance is not linked to information compression, which differs from observations on End-to-End (E2E) learning. Additionally, CL can inherit information about targets and gradually specialise extracted features layer-by-layer. We evaluate this effect by proposing an information transition ratio, I(T;Y)/I(X;T), and show that it can serve as a useful heuristic in setting the depth of a neural network that achieves satisfactory classification accuracy.
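A minimal sketch of how such a ratio could be estimated is given below, using crude plug-in (histogram) mutual information on discretised activations; the binning, the stand-in hidden layer, and the toy data are assumptions, not the paper's estimator.

# Hedged sketch: estimating the information transition ratio I(T;Y) / I(X;T)
# with plug-in mutual information on discretised activations. Illustrative only;
# plug-in estimates saturate when most binned patterns are unique, so practical
# analyses use coarser binning or dedicated MI estimators.
import numpy as np
from sklearn.metrics import mutual_info_score

def discretise(a, bins=8):
    # Bin each entry, then treat each distinct row pattern as one discrete symbol.
    edges = np.linspace(a.min(), a.max(), bins + 1)
    digitised = np.digitize(a, edges)
    _, symbols = np.unique(digitised, axis=0, return_inverse=True)
    return symbols.ravel()

def transition_ratio(x, t, y, bins=8):
    xs, ts = discretise(x, bins), discretise(t, bins)
    i_xt = mutual_info_score(xs, ts)      # input -> representation, I(X;T)
    i_ty = mutual_info_score(ts, y)       # representation -> target, I(T;Y)
    return i_ty / max(i_xt, 1e-12)

# Toy usage: random inputs, a random projection as the "layer", noisy binary labels.
rng = np.random.default_rng(0)
x = rng.normal(size=(2000, 10))
t = np.tanh(x @ rng.normal(size=(10, 5)))             # stand-in hidden layer
y = (x[:, 0] + 0.1 * rng.normal(size=2000) > 0).astype(int)
print(transition_ratio(x, t, y))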

11.
It is shown here that a strengthening of Wallach's Unentangled Gleason Theorem can be obtained by applying results of the present authors on generalised Gleason theorems for quantum multi-measures arising from investigations of quantum decoherence functionals.

12.
A perturbation theory for the Landau-Lifschitz equation for an isotropic chain with correction, based on the inverse scattering transform (IST), is developed to treat the Landau-Lifschitz equation for a spin chain with axial asymmetry. The time-evolution equations of the parameters and a formula for the first-order correction are given by treating the equation with axial symmetry as a perturbation of the isotropic equation. PACS numbers: 05.45.Yv, 42.65.-k, 42.50.Md. Supported by the National Science Foundation of China under Grants No. 10474076 and No. 10375041.

13.
14.
In this paper, we present a derivation of the black hole area entropy from the relationship between entropy and information. The curved space of a black hole allows objects to be imaged in the same way as a camera lens does. The maximal information that a black hole can gain is limited by both the Compton wavelength of the object and the diameter of the black hole. When an object falls into a black hole, its information disappears due to the no-hair theorem, and the entropy of the black hole increases correspondingly. The area entropy of a black hole can thus be obtained, which indicates that the Bekenstein–Hawking entropy is information entropy rather than thermodynamic entropy. The quantum corrections to black hole entropy are also obtained from the limit on the Compton wavelength of the captured particles, which makes the mass of a black hole naturally quantized. Our work provides an information-theoretic perspective for understanding the nature of black hole entropy.
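The standard Bekenstein-style heuristic behind such area-entropy arguments is sketched below; it is an assumed reconstruction for orientation, not the authors' exact derivation.

% Heuristic sketch (assumed): the smallest mass increment a black hole of mass M
% can register is a quantum whose Compton wavelength is of the order of the
% horizon scale,
\lambda_C = \frac{\hbar}{\delta M\,c} \;\sim\; R_s = \frac{2GM}{c^2}
\quad\Longrightarrow\quad
\delta M \sim \frac{\hbar c}{2GM}.
% If each such capture erases of order one bit of information,
\frac{dS}{dM} \sim \frac{k_B \ln 2}{\delta M} \sim \frac{2GM\,k_B \ln 2}{\hbar c}
\quad\Longrightarrow\quad
S \sim \frac{GM^2 k_B \ln 2}{\hbar c} \;\propto\; \frac{k_B c^3}{\hbar G}\,A,
% since the horizon area is A = 4\pi R_s^2 = 16\pi G^2 M^2/c^4. This reproduces the
% Bekenstein--Hawking scaling S \propto A; the exact coefficient k_B c^3 A/(4\hbar G)
% requires a more careful treatment.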

15.
Chromodynamic fluctuations in the collisionless quark-gluon plasma are found as a solution of the linearized initial-value problem. Both stable and unstable plasmas are discussed.

16.
We consider the problem of the existence of soliton-like self-gravitating cylindrically symmetric configurations of a classical spinor field with a nonlinearity F(S), where F is an arbitrary function. Soliton-like configurations should have, by definition, a regular axis of symmetry and a flat or string-like geometry far from the axis (i.e., an asymptotically Minkowskian metric with a possible angular defect). It is shown that these conditions can be fulfilled if F(S) is finite as S → ∞ and decreases faster than S^2 as S → 0. The set of field equations is entirely integrated, and some explicit examples are considered. A regularizing role of gravity is discussed.

17.
This review looks at some of the central relationships between artificial intelligence, psychology, and economics through the lens of information theory, focusing specifically on formal models of decision theory. In doing so we look at a particular approach that each field has adopted and how information theory has informed the development of the ideas of each field. A key theme is expected utility theory, its connection to information theory, the Bayesian approach to decision-making, and forms of (bounded) rationality. What emerges from this review is a broadly unified formal perspective derived from three very different starting points that reflect the unique principles of each field. Each of the three approaches reviewed can, in principle at least, be implemented in a computational model in such a way that, with sufficient computational power, they could be compared with human abilities in complex tasks. However, a central critique that can be applied to all three approaches was first put forward by Savage in The Foundations of Statistics and recently brought to the fore by the economist Binmore: Bayesian approaches to decision-making work in what Savage called ‘small worlds’ but cannot work in ‘large worlds’. This point, in various different guises, is central to some of the current debates about the power of artificial intelligence and its relationship to human-like learning and decision-making. Recent work on artificial intelligence has gone some way to bridging this gap, but significant questions remain to be answered in all three fields in order to make progress in producing realistic models of human decision-making in the real world in which we live.

18.
A nonstandard approach to axiomatic quantum field theory is given. Nonstandard axioms for a Hermitian scalar field are proposed, where the field operators act on a hyperfinite-dimensional Hilbert space. The axioms are shown to be equivalent to the Gårding–Wightman axioms. An example of a model of the nonstandard axioms is examined.

19.
In this paper, a general theory on the unification of non-Abelian SU(N) gauge interactions and gravitational interactions is discussed. SU(N) gauge interactions and gravitational interactions are formulated on a similar basis and are unified in a semi-direct product group GSU(N). Based on this model, we can discuss the unification of the fundamental interactions of Nature.

20.
In recent years, there has been an exponential growth in sequencing projects due to accelerated technological advances, leading to a significant increase in the amount of data and resulting in new challenges for biological sequence analysis. Consequently, the use of techniques capable of analyzing large amounts of data has been explored, such as machine learning (ML) algorithms. ML algorithms are being used to analyze and classify biological sequences, despite the intrinsic difficulty of extracting and finding representative methods for biological sequences that are suitable for them. Extracting numerical features to represent sequences therefore makes it statistically feasible to use universal concepts from information theory, such as Tsallis and Shannon entropy. In this study, we propose a novel Tsallis entropy-based feature extractor to provide useful information for classifying biological sequences. To assess its relevance, we prepared five case studies: (1) an analysis of the entropic index q; (2) performance testing of the best entropic indices on new datasets; (3) a comparison with Shannon entropy and (4) with generalized entropies; (5) an investigation of Tsallis entropy in the context of dimensionality reduction. As a result, our proposal proved to be effective, being superior to Shannon entropy, robust in terms of generalization, and potentially representative for collecting information in fewer dimensions compared with methods such as Singular Value Decomposition and Uniform Manifold Approximation and Projection.
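A minimal sketch of a Tsallis-entropy feature extractor of this kind is given below; the k-mer representation and the particular entropic indices q are illustrative assumptions, not the exact pipeline of the study.

# Hedged sketch: Tsallis-entropy features from k-mer frequencies of a DNA
# sequence. The k-mer sizes and entropic indices q are illustrative assumptions.
from collections import Counter
import math

def kmer_probs(seq, k):
    # Empirical probabilities of overlapping k-mers in the sequence.
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return [c / total for c in counts.values()]

def tsallis_entropy(probs, q):
    # S_q = (1 - sum_i p_i^q) / (q - 1); reduces to Shannon entropy as q -> 1.
    if abs(q - 1.0) < 1e-9:
        return -sum(p * math.log(p) for p in probs)   # Shannon limit
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

def feature_vector(seq, ks=(1, 2, 3), qs=(0.5, 1.0, 2.0, 3.0)):
    # One Tsallis entropy per (k, q) pair -> a fixed-length numeric feature vector.
    return [tsallis_entropy(kmer_probs(seq, k), q) for k in ks for q in qs]

# Toy usage on a short synthetic sequence.
print(feature_vector("ATGCGCGTATATGCGCATTACGGCTAGCTAGGCTA"))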

