Similar Documents
Found 20 similar documents (search time: 359 ms).
1.
Image encryption is a confidentiality strategy that keeps the information in digital images from being leaked. Due to their excellent chaotic dynamic behavior, self-feedbacked Hopfield networks have been used to design image ciphers. However, self-feedbacked Hopfield networks have complex structures, high computational cost, and fixed parameters; these properties limit their application. In this paper, a single neuronal dynamical system within a self-feedbacked Hopfield network is unveiled. The discrete form of the single neuronal dynamical system is derived from a self-feedbacked Hopfield network. Chaotic performance evaluation indicates that the system has good complexity, high sensitivity, and a large chaotic parameter range. The system is also incorporated into a framework to improve its chaotic performance. The results show that the system is well adapted to this type of framework, which means there is considerable room for improvement in the system. To investigate its applications in image encryption, an image encryption scheme is then designed. Simulation results and security analysis indicate that the proposed scheme is highly resistant to various attacks and competitive with some existing schemes.
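The abstract does not reproduce the neuron map itself, so the following is a minimal sketch of the general recipe such chaos-based ciphers follow, with the logistic map standing in for the derived single-neuron system: iterate the chaotic orbit past its transient, quantize it into a keystream, and diffuse the pixels with XOR. All names and parameter values here are illustrative assumptions, not the paper's scheme.

```python
import numpy as np

def chaotic_keystream(x0, r, n, burn_in=1000):
    """Iterate a 1-D chaotic map (logistic map as a stand-in for the
    single-neuron map) and quantize the orbit into a byte keystream."""
    x = x0
    for _ in range(burn_in):              # discard the transient
        x = r * x * (1.0 - x)
    ks = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)
        ks[i] = int(x * 256) % 256        # quantize orbit point to a byte
    return ks

def encrypt(img, x0=0.3141, r=3.9999):
    flat = img.flatten()
    ks = chaotic_keystream(x0, r, flat.size)
    return (flat ^ ks).reshape(img.shape)  # XOR diffusion; decryption is the same op

img = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
cipher = encrypt(img)
assert np.array_equal(encrypt(cipher), img)  # XOR stream cipher is symmetric
```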

2.
Current physics commonly qualifies the Earth system as ‘complex’ because it includes numerous different processes operating over a large range of spatial scales, often modelled as exhibiting non-linear chaotic response dynamics and power scaling laws. This characterization rests on the fundamental assumption that the Earth’s complexity could, in principle, be modelled (surrogated) by a numerical algorithm if enough computing power were granted. Yet similar numerical algorithms also surrogate different systems having the same processes and dynamics, such as Mars or Jupiter, which are qualitatively different from the Earth system. Here, we argue that understanding the Earth as a complex system requires consideration of the Gaia hypothesis: the Earth is a complex system because it instantiates life—and therefore an autopoietic, metabolic-repair (M,R) organization—at a planetary scale. This implies that the Earth’s complexity is formally equivalent to a self-referential system that is inherently non-algorithmic and, therefore, cannot be surrogated and simulated in a Turing machine. We discuss the consequences of this with reference to in-silico climate models, tipping points, planetary boundaries, and planetary feedback loops as units of adaptive evolution and selection.

3.
4.
Assessing where and how information is stored in biological networks (such as neuronal and genetic networks) is a central task both in neuroscience and in molecular genetics, but most available tools focus on the network’s structure as opposed to its function. Here, we introduce a new information-theoretic tool—information fragmentation analysis—that, given full phenotypic data, allows us to localize information in complex networks, determine how fragmented the information is across multiple nodes of the network, and assess its level of encryption. Using information fragmentation matrices, we can also create information flow graphs that illustrate how information propagates through these networks. We illustrate the use of this tool by analyzing how artificial brains that evolved in silico solve particular tasks, and show how information fragmentation analysis provides deeper insights into how these brains process information and “think”. The resulting measures of information fragmentation and encryption also quantify the complexity of information processing in these networks and how this processing complexity differs between primary exposure to sensory data (early in the lifetime) and later routine processing.
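Fragmentation analysis as described rests on comparing how much information about a phenotypic feature is carried by individual nodes versus node subsets. A minimal sketch of that core computation follows, with hypothetical function names and a toy XOR feature (the paper's own formalism may differ); note that here neither node 0 nor node 2 alone predicts the feature, but the pair does:

```python
import numpy as np
from itertools import combinations

def mutual_information(x, y):
    """Discrete mutual information I(X;Y) from paired symbol arrays."""
    mi = 0.0
    for a in np.unique(x):
        for b in np.unique(y):
            pxy = np.mean((x == a) & (y == b))
            if pxy > 0:
                mi += pxy * np.log2(pxy / (np.mean(x == a) * np.mean(y == b)))
    return mi

def joint_symbols(rows):
    """Collapse each row (a tuple of node states) into one integer symbol."""
    codes = {}
    return np.array([codes.setdefault(tuple(r), len(codes)) for r in rows])

def fragmentation(states, feature, max_size=2):
    """I(node subset; feature) for every subset up to max_size.
    `states`: (samples, nodes) array of discretized node activity."""
    return {s: mutual_information(joint_symbols(states[:, s]), feature)
            for k in range(1, max_size + 1)
            for s in combinations(range(states.shape[1]), k)}

rng = np.random.default_rng(0)
states = rng.integers(0, 2, size=(500, 4))
feature = states[:, 0] ^ states[:, 2]   # XOR: the information is fragmented
print(max(fragmentation(states, feature).items(), key=lambda kv: kv[1]))
```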

5.
With the rapid growth of satellite communication demand and the continuous development of high-throughput satellite systems, the satellite resource allocation problem—also called the dynamic resources management (DRM) problem—has become increasingly complex in recent years. Using metaheuristic algorithms to obtain acceptable solutions has become a hot research topic with considerable room for further exploration. In particular, the treatment of invalid solutions is key to algorithm performance. At present, the unused bandwidth allocation (UBA) method is commonly used to address the bandwidth constraint in the DRM problem. However, this method reduces the algorithm’s flexibility in the solution space, diminishes the quality of the optimized solution, and increases the computational complexity. In this paper, we propose a bandwidth constraint handling approach based on the non-dominated beam coding (NDBC) method, which eliminates the bandwidth overlap constraint from the algorithm’s population evolution and achieves complete bandwidth flexibility, increasing the quality of the optimal solution while decreasing the computational complexity. We develop a generic application architecture for metaheuristic algorithms using the NDBC method and successfully apply it to four typical algorithms. The results indicate that NDBC can enhance the quality of the optimized solution by 9–33% while simultaneously reducing computational complexity by 9–21%.
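The abstract does not detail the NDBC encoding itself, so the sketch below only illustrates the bandwidth-overlap constraint that conventional handling (e.g., UBA-style repair of invalid solutions) must check, and that NDBC is said to eliminate by construction; the data layout is a hypothetical assumption:

```python
def bands_overlap(assignments):
    """Check the bandwidth-overlap constraint that makes a solution invalid.
    `assignments`: list of (beam_id, f_start, f_end) tuples, one per beam."""
    bands = sorted(assignments, key=lambda b: b[1])     # sort by start frequency
    for (_, _, end_a), (_, start_b, _) in zip(bands, bands[1:]):
        if start_b < end_a:      # next band starts before the previous ends
            return True
    return False

# a population member whose beam bands collide would be an "invalid solution"
print(bands_overlap([("b1", 0.0, 250.0), ("b2", 200.0, 400.0)]))  # True
```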

6.
The entropy-based parameters determined from the electrodermal activity (EDA) biosignal evaluate the complexity within the activity of the sympathetic cholinergic system. We focused on evaluating complex sympathetic cholinergic regulation by assessing EDA using conventional indices (skin conductance level (SCL), non-specific skin conductance responses, spectral EDA indices) and entropy-based parameters (approximate, sample, fuzzy, permutation, Shannon, and symbolic information entropies) in newborns during the first three days of postnatal life. The studied group consisted of 50 healthy newborns (21 boys, average gestational age: 39.0 ± 0.2 weeks). EDA was recorded continuously from the feet at rest for three periods (the first day—2 h after birth, the second day—24 h after birth, and the third day—72 h after birth). Our results revealed higher SCL, spectral EDA index in the very-low-frequency band, and approximate, sample, fuzzy, and permutation entropies during the first day compared to the second and third days, while Shannon and symbolic information entropies were lower during the first day compared to the other periods. In conclusion, EDA parameters seem to be sensitive in detecting changes in sympathetic regulation in early postnatal life, which can represent an important step towards non-invasive early diagnosis of pathological states linked to autonomic dysmaturation in newborns.
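Of the listed measures, sample entropy is representative and well documented; below is a minimal sketch of its standard computation (template length m, tolerance r as a fraction of the signal's standard deviation), not the authors' exact implementation:

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """Sample entropy: -ln(A/B), where B counts pairs of length-m templates
    and A pairs of length-(m+1) templates within Chebyshev distance r."""
    x = np.asarray(x, dtype=float)
    r = r_frac * np.std(x)
    def match_count(length):
        t = np.array([x[i:i + length] for i in range(len(x) - length)])
        total = 0
        for i in range(len(t) - 1):
            # Chebyshev distance of template i to every later template
            total += np.sum(np.max(np.abs(t[i + 1:] - t[i]), axis=1) < r)
        return total
    B, A = match_count(m), match_count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
print(sample_entropy(rng.normal(size=500)))  # larger for more irregular signals
```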

7.
We compare and contrast three different, but complementary views of “structure” and “pattern” in spatial processes. For definiteness and analytical clarity, we apply all three approaches to the simplest class of spatial processes: one-dimensional Ising spin systems with finite-range interactions. These noncritical systems are well-suited for this study since the change in structure as a function of system parameters is more subtle than that found in critical systems where, at a phase transition, many observables diverge, thereby making the detection of change in structure obvious. This survey demonstrates that the measures of pattern from information theory and computational mechanics differ from known thermodynamic and statistical mechanical functions. Moreover, they capture important structural features that are otherwise missed. In particular, a type of mutual information called the excess entropy—an information theoretic measure of memory—serves to detect ordered, low entropy density patterns. It is superior in several respects to other functions used to probe structure, such as magnetization and structure factors. ϵ-Machines—the main objects of computational mechanics—are seen to be the most direct approach to revealing the (group and semigroup) symmetries possessed by the spatial patterns and to estimating the minimum amount of memory required to reproduce the configuration ensemble, a quantity known as the statistical complexity. Finally, we argue that the information theoretic and computational mechanical analyses of spatial patterns capture the intrinsic computational capabilities embedded in spin systems—how they store, transmit, and manipulate configurational information to produce spatial structure.
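The excess entropy mentioned here has a standard block-entropy formulation, E = lim_{L→∞} [H(L) − h_μ L], where H(L) is the Shannon entropy of length-L blocks and h_μ the entropy rate. A minimal finite-data estimator for a 1-D spin configuration (a sketch under the usual finite-L approximation, not the paper's estimator):

```python
import numpy as np
from collections import Counter

def block_entropy(spins, L):
    """Shannon entropy H(L) of length-L blocks in a 1-D spin configuration."""
    blocks = Counter(tuple(spins[i:i + L]) for i in range(len(spins) - L + 1))
    p = np.array(list(blocks.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log2(p))

def excess_entropy(spins, L_max=8):
    """Estimate E = H(L_max) - L_max * h, with the entropy rate h estimated
    from the last block-entropy increment h(L) = H(L) - H(L-1)."""
    H = [0.0] + [block_entropy(spins, L) for L in range(1, L_max + 1)]
    h = H[L_max] - H[L_max - 1]
    return H[L_max] - L_max * h

spins = np.random.choice([0, 1], size=10_000)   # i.i.d.: h ≈ 1 bit, E ≈ 0
print(excess_entropy(spins))
```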

8.
The search for the chemical origins of life represents a long-standing and continuously debated enigma. Despite its exceptional complexity, in the last decades the field has experienced a revival, owing also to the exponential growth of computing power, which allows the behavior of matter—including its quantum nature—to be efficiently simulated under the disparate conditions found, e.g., on the primordial Earth and on Earth-like planetary systems (i.e., exoplanets). In this minireview, we focus on some advanced computational methods capable of efficiently solving the Schrödinger equation at different levels of approximation (i.e., density functional theory)—such as ab initio molecular dynamics—and of realistically simulating the behavior of matter under the action of energy sources available in prebiotic contexts. In addition, recently developed metadynamics methods coupled with first-principles simulations are reviewed and exploited to answer old enigmas and to propose novel scenarios in the exponentially growing research field embedding the study of the chemical origins of life.

9.
In this treatment of random dynamical systems, we consider the existence—and identification—of conditional independencies at nonequilibrium steady-state. These independencies underwrite a particular partition of states, in which internal states are statistically secluded from external states by blanket states. The existence of such partitions has interesting implications for the information geometry of internal states. In brief, this geometry can be read as a physics of sentience, where internal states look as if they are inferring external states. However, the existence of such partitions—and the functional form of the underlying densities—have yet to be established. Here, using the Lorenz system as the basis of stochastic chaos, we leverage the Helmholtz decomposition—and polynomial expansions—to parameterise the steady-state density in terms of surprisal or self-information. We then show how Markov blankets can be identified—using the accompanying Hessian—to characterise the coupling between internal and external states in terms of a generalised synchrony or synchronisation of chaos. We conclude by suggesting that this kind of synchronisation may provide a mathematical basis for an elemental form of (autonomous or active) sentience in biology.
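As a concrete starting point, the stochastic chaos referred to here can be generated by integrating the Lorenz equations with additive Wiener noise; a minimal Euler-Maruyama sketch (the Lorenz parameters are the conventional ones, the noise amplitude is an illustrative assumption):

```python
import numpy as np

def stochastic_lorenz(T=50.0, dt=1e-3, sigma=10.0, rho=28.0, beta=8/3,
                      noise=2.0, seed=0):
    """Euler-Maruyama integration of the Lorenz system driven by
    additive Wiener noise (a simple model of stochastic chaos)."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    x = np.empty((n, 3))
    x[0] = (1.0, 1.0, 1.0)
    for i in range(n - 1):
        X, Y, Z = x[i]
        drift = np.array([sigma * (Y - X),
                          X * (rho - Z) - Y,
                          X * Y - beta * Z])
        dW = rng.normal(scale=np.sqrt(dt), size=3)
        x[i + 1] = x[i] + drift * dt + noise * dW
    return x

traj = stochastic_lorenz()
# a long trajectory samples the nonequilibrium steady-state density, whose
# negative log (surprisal) the paper parameterises with polynomial expansions
```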

10.
This paper is a review of a particular approach to the method of maximum entropy as a general framework for inference. The discussion emphasizes pragmatic elements in the derivation. An epistemic notion of information is defined in terms of its relation to the Bayesian beliefs of ideally rational agents. The method of updating from a prior to posterior probability distribution is designed through an eliminative induction process. The logarithmic relative entropy is singled out as a unique tool for updating (a) that is of universal applicability, (b) that recognizes the value of prior information, and (c) that recognizes the privileged role played by the notion of independence in science. The resulting framework—the ME method—can handle arbitrary priors and arbitrary constraints. It includes the MaxEnt and Bayes’ rules as special cases and, therefore, unifies entropic and Bayesian methods into a single general inference scheme. The ME method goes beyond the mere selection of a single posterior, and also addresses the question of how much less probable other distributions might be, which provides a direct bridge to the theories of fluctuations and large deviations.
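The updating rule described here has a standard form: the posterior p maximizes the logarithmic relative entropy with respect to the prior q subject to the constraints. For a single expectation constraint this yields the familiar exponential-family update (with a uniform prior it reduces to MaxEnt):

```latex
S[p,q] = -\int p(x)\,\log\frac{p(x)}{q(x)}\,dx ,
\qquad \int p(x)\,dx = 1, \quad \int p(x)\,f(x)\,dx = F
\;\Longrightarrow\;
p(x) = \frac{q(x)\,e^{\lambda f(x)}}{Z(\lambda)},
\qquad Z(\lambda) = \int q(x)\,e^{\lambda f(x)}\,dx .
```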

11.
Traditional machine-learning methods are inefficient in capturing chaos in nonlinear dynamical systems, especially when the time difference Δt between consecutive steps is so large that the extracted time series looks apparently random. Here, we introduce a new long-short-term-memory (LSTM)-based recurrent architecture by tensorizing the cell-state-to-state propagation therein, maintaining the long-term memory feature of LSTM, while simultaneously enhancing the learning of short-term nonlinear complexity. We stress that the global minima of training can be most efficiently reached by our tensor structure where all nonlinear terms, up to some polynomial order, are treated explicitly and weighted equally. The efficiency and generality of our architecture are systematically investigated and tested through theoretical analysis and experimental examinations. In our design, we have explicitly used two different many-body entanglement structures—matrix product states (MPS) and the multiscale entanglement renormalization ansatz (MERA)—as physics-inspired tensor decomposition techniques, from which we find that MERA generally performs better than MPS, hence conjecturing that the learnability of chaos is determined not only by the number of free parameters but also the tensor complexity—recognized as how entanglement entropy scales with varying matricization of the tensor.
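The MPS decomposition used here is, in isolation, straightforward to illustrate: a chain of low-order cores whose contraction reproduces a full high-order tensor from far fewer parameters. A minimal sketch (core shapes and sizes are illustrative assumptions; this is not the paper's LSTM architecture):

```python
import numpy as np

def mps_tensor(cores):
    """Contract a chain of MPS cores A_k[chi_left, d, chi_right] into the
    full order-N tensor they factorize (feasible only for small N)."""
    out = cores[0]                                  # shape (1, d, chi)
    for A in cores[1:]:
        # merge bond index: (..., chi) x (chi, d, chi') -> (..., d, chi')
        out = np.tensordot(out, A, axes=([-1], [0]))
    return out.squeeze(axis=(0, -1))                # drop boundary bonds

d, chi, N = 2, 3, 8
rng = np.random.default_rng(1)
cores = [rng.normal(size=(1 if k == 0 else chi, d,
                          1 if k == N - 1 else chi)) for k in range(N)]
full = mps_tensor(cores)
print(full.shape)   # (2,)*8: 256 tensor entries from only 120 core parameters
```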

12.
In a previous article we presented an argument to obtain (or rather infer) Born’s rule, based on a simple set of axioms named “Contexts, Systems and Modalities” (CSM). In this approach, there is no “emergence”, but the structure of quantum mechanics can be attributed to an interplay between the quantized number of modalities accessible to a quantum system and the continuum of contexts required to define these modalities. The strong link of this derivation with Gleason’s theorem was emphasized, with the argument that CSM provides a physical justification for Gleason’s hypotheses. Here, we extend this result by showing that an essential one among these hypotheses—the need for unitary transforms to relate different contexts—can be removed and is better seen as a necessary consequence of Uhlhorn’s theorem.
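For reference, the rule being derived and the theorem invoked: Born's rule assigns measurement probabilities via the density operator, and Uhlhorn's theorem states that any bijective map of rays (in Hilbert-space dimension at least 3) that preserves orthogonality is implemented by a unitary or antiunitary operator, which is why unitarity need not be postulated separately:

```latex
p(i) = \operatorname{Tr}\!\left(\rho\,\Pi_i\right),
\qquad \Pi_i^2 = \Pi_i, \quad \sum_i \Pi_i = \mathbb{1}.
```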

13.
We study a thermal management scheme in which a three-qubit system assisted by a coherent auxiliary bath (CAB) implements heat management on a target thermal bath (TTB). We consider the CAB/TTB to be ensembles of coherent/thermal two-level atoms (TLAs) and, within the framework of a collision model, investigate the characteristics of the steady heat current (also called the target heat current (THC)) between the system and the TTB. We demonstrate that, with the help of the quantum coherence of the ancillae, the magnitude and direction of the heat current can be controlled simply by adjusting the system-CAB coupling strength. Meanwhile, we also show that the influence of the quantum coherence of the ancillae on the heat current strongly depends on the system-CAB coupling strength, and the THC becomes positively/negatively correlated with the coherence magnitude of the ancillae when the coupling strength is below/above a critical value. Moreover, the system with the CAB can serve as a multifunctional device integrating the thermal functions of heat amplifier, suppressor, switch, and refrigerator, whereas with a thermal auxiliary bath it can only work as a thermal suppressor. Our work provides a new perspective for the design of multifunctional thermal devices utilizing the resource of quantum coherence from the CAB.

14.
The economy is a system of complex interactions. The COVID-19 pandemic strongly influenced economies, particularly through the restrictions it introduced, which created a completely new economic environment. The present work focuses on the changes induced by the COVID-19 epidemic in the correlation network structure. The analysis is performed on a representative set of USA companies—the S&P500 components. Four different network structures are constructed (strongly, weakly, typically, and significantly connected networks), and the evolution of the rank entropy, cycle entropy, averaged clustering coefficient, and transitivity is established and discussed. Based on these structural parameters, four different stages are distinguished during the COVID-19-induced crisis. The proposed network properties and their applicability to the problem of distinguishing crises are discussed. Moreover, the problem of the optimal time window is analysed.
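A minimal sketch of the pipeline the abstract implies: log-returns, a thresholded correlation matrix, then the clustering coefficient and transitivity via networkx. The threshold value and the mapping to the paper's four network variants are assumptions:

```python
import numpy as np
import networkx as nx

def correlation_network(prices, threshold=0.6):
    """Build a thresholded correlation network from a (days, stocks)
    price array and report two of the structural measures discussed."""
    returns = np.diff(np.log(prices), axis=0)     # daily log-returns
    C = np.corrcoef(returns, rowvar=False)        # stock-by-stock correlations
    G = nx.Graph()
    n = C.shape[0]
    G.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if C[i, j] >= threshold:              # a "strongly connected" variant
                G.add_edge(i, j)
    return nx.average_clustering(G), nx.transitivity(G)

prices = np.exp(np.cumsum(np.random.default_rng(2).normal(
    0, 0.01, size=(500, 30)), axis=0))            # synthetic price paths
print(correlation_network(prices))
```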

15.
The emergence of opinion polarization within human communities—the phenomenon that individuals within a society tend to develop conflicting attitudes on a wide diversity of topics—has been a focus of interest for decades, from both theoretical and modelling points of view. Regarding modelling attempts, an entire scientific field—opinion dynamics—has emerged in order to study this and related phenomena. Within this framework, agents’ opinions are usually represented by a scalar value that undergoes modification due to interaction with other agents. Under certain conditions, these models are able to reproduce polarization—a state increasingly familiar to our everyday experience. In the present paper, an alternative explanation is suggested along with its corresponding model. More specifically, we demonstrate that polarization immediately occurs under exposure to news and information once two well-known human characteristics are incorporated into the representation of agents: (1) in the human brain, beliefs are interconnected, and (2) people strive to maintain a coherent belief system. Furthermore, the model accounts for the proliferation of fake news and shows how opinion polarization is related to various cognitive biases.
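As a toy illustration only (not the authors' model), the two stated ingredients can be wired into agents directly: a shared coupling matrix links beliefs, and an agent absorbs only the components of incoming news that cohere with its current belief system, after which attitudes drift toward opposite extremes. Every name and parameter below is a hypothetical assumption:

```python
import numpy as np

rng = np.random.default_rng(3)
n_agents, n_beliefs, steps = 200, 5, 2000
W = rng.normal(size=(n_beliefs, n_beliefs))
W = (W + W.T) / 2                    # symmetric couplings between beliefs
np.fill_diagonal(W, 0)
b = rng.uniform(-0.1, 0.1, size=(n_agents, n_beliefs))  # initial attitudes

for _ in range(steps):
    news = rng.normal(size=n_beliefs)      # a broadcast news item
    support = b @ W                        # coherence of each belief with the rest
    # agents absorb only the news components that cohere with their beliefs
    b = np.clip(b + 0.01 * np.where(news * support > 0, news, 0.0), -1, 1)

print(np.histogram(b[:, 0], bins=5)[0])    # bimodal counts indicate polarization
```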

16.
The research concerns data collected in independent sets—more specifically, in local decision tables. A possible approach to managing these data is to build local classifiers based on each table individually. Many approaches to combining the final prediction results of independent classifiers can be found in the literature, but little effort has been devoted to studying cooperation between tables and the formation of coalitions. Such an approach is expected to matter on two levels. First, the impact on classification quality: the ability to build combined classifiers for coalitions of tables should allow more generalized concepts to be learned, which in turn should affect the quality of classifying new objects. Second, combining tables into coalitions reduces computational complexity, since fewer classifiers are built. The paper proposes a new method for creating coalitions of local tables and generating an aggregated classifier for each coalition. Coalitions are generated by determining certain characteristics of attribute values occurring in local tables and applying the Pawlak conflict analysis model. In this study, classification and regression trees with the Gini index are built based on the aggregated table of each coalition. The system has a hierarchical structure, as in the next stage the decisions generated by the coalition classifiers are aggregated using majority voting. The classification quality of the proposed system was compared with an approach that does not use local data cooperation and coalition creation; in that baseline, the structure of the system is parallel and decision trees are built independently for the local tables. The paper shows that the proposed approach provides a significant improvement in classification quality and execution time. The Wilcoxon test confirmed that the differences in accuracy between the results obtained with the proposed method and those obtained without coalitions are significant, at p = 0.005. The average accuracy rates obtained for the proposed approach and the approach without coalitions are 0.847 and 0.812, respectively, so the difference is quite large. Moreover, the algorithm implementing the proposed approach performed up to 21 times faster than the algorithm implementing the approach without coalitions.
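The final two stages described, one Gini-index tree per coalition's aggregated table followed by majority voting over coalition decisions, can be sketched directly with scikit-learn. The coalition-formation step via the Pawlak conflict analysis model is not shown, and the data layout (label in the last column) is a hypothetical assumption:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fit_coalition_classifiers(coalitions):
    """Train one Gini decision tree per coalition; each coalition is the
    aggregated table (numpy array, label in last column) of its members."""
    models = []
    for table in coalitions:
        X, y = table[:, :-1], table[:, -1]
        models.append(DecisionTreeClassifier(criterion="gini").fit(X, y))
    return models

def majority_vote(models, X):
    """Aggregate the coalition classifiers' decisions by majority voting."""
    votes = np.stack([m.predict(X) for m in models])      # (models, samples)
    return np.apply_along_axis(
        lambda v: np.bincount(v.astype(int)).argmax(), 0, votes)

# hypothetical: two coalition tables over the same feature space
rng = np.random.default_rng(4)
t1 = np.hstack([rng.normal(size=(100, 3)), rng.integers(0, 2, (100, 1))])
t2 = np.hstack([rng.normal(size=(120, 3)), rng.integers(0, 2, (120, 1))])
models = fit_coalition_classifiers([t1, t2])
print(majority_vote(models, rng.normal(size=(5, 3))))
```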

17.
The low complexity domain (LCD) sequence has been defined in terms of entropy using a 12-amino-acid sliding window along a protein sequence in the study of disease-related genes. The amyotrophic lateral sclerosis (ALS)-related TDP-43 protein sequence, with intra-LCD structural information based on cryo-EM data, was published recently. An application of entropy and Higuchi fractal dimension calculations was previously described using the Znf521 and HAR1 sequences. Here, a computational analysis of intra-LCD sequence entropy and Higuchi fractal dimension values was conducted at the amino acid level and at the ATCG nucleotide level, without the sliding-window requirement. The computational results consistently predicted the intermediate entropy/fractal dimension value produced when two subsequences with different entropy/fractal dimension values were combined. The method without a sliding window was then extended to an analysis of the recently reported virulent genes—Orf6, Nsp6, and Orf7a—in SARS-CoV-2. The relationship between virulence functionality and entropy values was found to have correlation coefficients between 0.84 and 0.99, using a 5% uncertainty on the cell-viability data. The analysis found that the most virulent Orf6 gene sequence had the lowest nucleotide entropy and the highest protein fractal dimension, in line with extreme value theory. The Orf6 codon usage bias in relation to vaccine design is discussed.
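The Higuchi fractal dimension used throughout has a standard windowless formulation: average the normalized curve length L(k) over sub-series at each scale k and read the dimension off the slope of log L(k) against log(1/k). A minimal sketch of that standard algorithm (not the authors' code):

```python
import numpy as np

def higuchi_fd(x, k_max=10):
    """Higuchi fractal dimension: slope of log L(k) vs log(1/k), where
    L(k) is the mean normalized curve length at scale k."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    L = []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, N, k)              # sub-series starting at m
            diff = np.abs(np.diff(x[idx])).sum()
            norm = (N - 1) / (len(idx) - 1) / k   # Higuchi normalization
            lengths.append(diff * norm / k)
        L.append(np.mean(lengths))
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, k_max + 1)), np.log(L), 1)
    return slope

rng = np.random.default_rng(5)
print(higuchi_fd(rng.normal(size=2000)))   # ≈ 2 for white noise, 1 for a line
```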

18.
To verify the design rationality and product reliability of large structures and to determine their load-bearing capacity, so as to provide a scientific basis for evaluation and guidance, scientifically effective test methods are needed to apply loads to the test object and to acquire strain, displacement, and other structural information that supplies the data required for mechanical analysis. This paper analyses the main functional requirements of such a test platform and presents a platform scheme composed of a load measurement and control system, a strain measurement system, a displacement observation system, a test management system, and an image monitoring system, with emphasis on the platform's core key technologies. Through concrete application examples and test data, the application domains of the platform and the practical problems it can solve are identified. Finally, the characteristics and significance of the test platform are summarized.

19.
Dozens of countries are executing national nanotechnology plans. No rigorous evaluation scheme for these plans exists, although stakeholders—especially policy makers, top-level agencies and councils, and society at large—are eager to learn the outcomes of these policies. In this article, we recommend an evaluation scheme for national nanotechnology policies that can be used to review the whole of a national nanotechnology plan or any component part of it. In this scheme, a component at any level of aggregation is evaluated. The component may be part of the plan’s overarching policy goal, which for most countries is to create wealth and improve the nation’s quality of life through nanotechnology. Alternatively, the component may be a programme or an activity related to a programme. The evaluation can be executed at different times in the policy’s life cycle, i.e., before the policy is formulated, during its execution, or after its completion. The three criteria for policy evaluation are appropriateness, efficiency, and effectiveness. The evaluator should select appropriate qualitative or quantitative methods to evaluate the various components of national nanotechnology plans.

20.
Integrated opto-mechanical-thermal design based on the integrated analysis method (total citations: 3; self-citations: 0; cited by others: 3)
杨怿, 张伟, 陈时锦. 《光学技术》 2005, 31(3): 394-397
Integrated opto-mechanical-thermal design is an important method for improving the design of space optical instruments and optimizing the overall system design parameters. The core of integrated design is the dynamic evaluation of a design scheme's comprehensive performance by the integrated analysis method: by analysing the thermo-optical performance of the system, it guides the revision of design parameters and enables simultaneous optical/mechanical/thermal design. Taking the thermal control design of the primary mirror system of a space optical instrument as an example, the application of this method in practical engineering is introduced.
