Similar Literature
20 similar documents retrieved (search time: 46 ms)
1.
We consider a recently introduced generalization of the Ising model in which individual spin strength can vary. The model is intended for analysis of ordering in systems comprising agents which, although matching in their binarity (i.e., maintaining the iconic Ising features of ‘+’ or ‘−’, ‘up’ or ‘down’, ‘yes’ or ‘no’), differ in their strength. To investigate the interplay between variable properties of nodes and interactions between them, we study the model on a complex network where both the spin strength and degree distributions are governed by power laws. We show that in the annealed network approximation, thermodynamic functions of the model are self-averaging, and we obtain an exact solution for the partition function. This allows us to derive the leading temperature and field dependencies of the thermodynamic functions, their critical behavior, and logarithmic corrections at the interfaces of different phases. We find that the delicate interplay of the two power laws leads to new universality classes.
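As a purely illustrative sketch (not the authors' exact model or their annealed-network solution), binary spins carrying power-law-distributed strengths can be simulated with a simple Metropolis scheme on a fully connected graph; the exponent `a`, coupling `J`, and all other parameter values below are hypothetical choices for demonstration:

```python
import math
import random

def metropolis_step(spins, strengths, J, T, rng):
    """One Metropolis update of an Ising-like model in which each binary
    spin s_i = +/-1 carries its own strength w_i, coupled on a fully
    connected graph (a toy stand-in for the annealed network)."""
    N = len(spins)
    i = rng.randrange(N)
    # mean-field-like local field from the other spins, strength-weighted
    field = J * sum(strengths[j] * spins[j] for j in range(N) if j != i) / N
    dE = 2.0 * strengths[i] * spins[i] * field  # energy cost of flipping spin i
    if dE <= 0 or rng.random() < math.exp(-dE / T):
        spins[i] = -spins[i]

def weighted_magnetization(N=100, J=1.0, T=0.2, sweeps=80, a=2.5, seed=0):
    """Simulate and return the absolute strength-weighted magnetization."""
    rng = random.Random(seed)
    # power-law (Pareto) spin strengths, pdf ~ w^(-a), via inverse-CDF sampling
    strengths = [(1.0 - rng.random()) ** (-1.0 / (a - 1.0)) for _ in range(N)]
    spins = [rng.choice([-1, 1]) for _ in range(N)]
    for _ in range(sweeps * N):
        metropolis_step(spins, strengths, J, T, rng)
    return abs(sum(w * s for w, s in zip(strengths, spins)) / sum(strengths))
```

At low temperature the system orders (weighted magnetization near 1), while at high temperature it stays disordered; the heavy-tailed strength distribution is what lets a few strong agents dominate the ordering.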

2.
In a previous article we presented an argument to obtain (or rather infer) Born’s rule, based on a simple set of axioms named “Contexts, Systems and Modalities” (CSM). In this approach there is no “emergence”; rather, the structure of quantum mechanics can be attributed to an interplay between the quantized number of modalities accessible to a quantum system and the continuum of contexts required to define these modalities. The strong link of this derivation with Gleason’s theorem was emphasized, with the argument that CSM provides a physical justification for Gleason’s hypotheses. Here we extend this result by showing that an essential one among these hypotheses—the need for unitary transforms to relate different contexts—can be removed and is better seen as a necessary consequence of Uhlhorn’s theorem.

3.
Current physics commonly qualifies the Earth system as ‘complex’ because it includes numerous different processes operating over a large range of spatial scales, often modelled as exhibiting non-linear chaotic response dynamics and power-law scaling. This characterization rests on the fundamental assumption that the Earth’s complexity could, in principle, be modelled (surrogated) by a numerical algorithm if enough computing power were granted. Yet similar numerical algorithms also surrogate different systems having the same processes and dynamics, such as Mars or Jupiter, despite their being qualitatively different from the Earth system. Here we argue that understanding the Earth as a complex system requires consideration of the Gaia hypothesis: the Earth is a complex system because it instantiates life—and therefore an autopoietic, metabolic-repair (M,R) organization—at a planetary scale. This implies that the Earth’s complexity is formally equivalent to a self-referential system that is inherently non-algorithmic and therefore cannot be surrogated and simulated on a Turing machine. We discuss the consequences of this with reference to in-silico climate models, tipping points, planetary boundaries, and planetary feedback loops as units of adaptive evolution and selection.

4.
I will argue that, in an interdisciplinary study of consciousness, epistemic structural realism (ESR) can offer a feasible philosophical background for the study of consciousness and its associated neurophysiological phenomena in neuroscience and cognitive science, while also taking into account the mathematical structures involved in this type of research. Applying ESR principles to the study of the neurophysiological phenomena associated with free will (or rather conscious free choice) and with various alterations of consciousness (AOCs) generated by pathologies such as epilepsy would add explanatory value to the matter. This interdisciplinary approach is in tune with Quine’s well-known idea that philosophy is not simple conceptual analysis but is continuous with science, actually representing an abstract branch of empirical research. ESR could thus resonate with scientific models of consciousness such as the global neuronal workspace model (inspired by the global workspace theory, GWT) and the integrated information theory (IIT) model. While structural realism has already been employed in physics and biology, its application as a meta-theory contextualising and relating various scientific findings on consciousness is new. Of the two variants, ontic structural realism (OSR) and epistemic structural realism (ESR), the latter can be considered more suitable for the study of consciousness and its associated neurophysiological phenomena, because it removes the pressure of the still unanswered ontological question ‘What is consciousness?’ and allows us to concentrate instead on the epistemological question ‘What can we know about consciousness?’.

5.
In previous research, we showed that ‘texts that tell a story’ exhibit a statistical structure that is not Maxwell–Boltzmann but Bose–Einstein. Our explanation is that this is due to the presence of ‘indistinguishability’ in human language, arising because identical words in different parts of the story are indistinguishable from one another, in much the same way that ‘indistinguishability’ occurs in quantum mechanics, where it likewise leads to Bose–Einstein rather than Maxwell–Boltzmann statistics. In the current article, we set out to explain this Bose–Einstein statistical structure in human language. We show that it is the presence of ‘meaning’ in ‘texts that tell a story’ that gives rise to the lack of independence characteristic of Bose–Einstein statistics, and we provide conclusive evidence that ‘words can be considered the quanta of human language’, structurally similar to how ‘photons are the quanta of electromagnetic radiation’. Using several studies on entanglement from our Brussels research group, and introducing the von Neumann entropy for human language, we also show that it is the presence of ‘meaning’ in texts that makes the entropy of a total text smaller relative to the entropy of the words composing it. We explain how the new insights in this article fit into the research domain called ‘quantum cognition’, where quantum probability models and quantum vector spaces are used in human cognition; how they are relevant to the use of quantum structures in information retrieval and natural language processing; and how they introduce ‘quantization’ and ‘Bose–Einstein statistics’ as relevant quantum effects there. Inspired by the conceptuality interpretation of quantum mechanics, and relying on the new insights, we put forward hypotheses about the nature of physical reality. In doing so, we note how this new type of decrease in entropy, and its explanation, may be important for the development of quantum thermodynamics. We likewise note how it can give rise to an original explanatory picture of the nature of physical reality on the surface of planet Earth, in which human culture emerges as a reinforcing continuation of life.
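The entropy comparisons invoked here build on standard frequency-based computations. As a minimal stand-in (this is generic Shannon entropy, not the article's von Neumann entropy analysis), the entropy of a text's empirical word distribution can be computed as:

```python
import math
from collections import Counter

def word_entropy(text):
    """Shannon entropy (in bits) of the empirical word-frequency
    distribution of a text. A text that repeats one word has entropy 0;
    a text of n distinct equiprobable words has entropy log2(n)."""
    counts = Counter(text.lower().split())
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

For example, `word_entropy("a b c d")` gives 2.0 bits, while `word_entropy("a a a a")` gives 0.0; repetition (the text-level analogue of indistinguishability) always lowers this frequency-based entropy.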

6.
The structure formation and dynamics of animate and inanimate matter on the nanoscale constitute a highly interdisciplinary field of rapidly emerging research interest to both experimentalists and theorists. The International Conference on Dynamics of Systems on the Nanoscale (DySoN) is the premier forum for presenting cutting-edge research in this field. It was established in 2010, and the most recent conference was held in Bad Ems, Germany, in October 2016. This Topical Issue presents original research results from some of the participants who attended this conference.

7.
Balance impairment is one of the biggest risk factors for falls, which lead to inactivity and, ultimately, the need for nursing care. Balance ability is therefore crucial for maintaining the independent daily living of older adults. Many tests to assess balance ability have been developed; however, few reports reveal the structure underlying balance performance test results in young versus older adults. Covariance structure analysis is a tool used to test statistically whether a factorial structure fits the data. This study examined aging effects on the factorial structure underlying balance performance tests. Participants comprised 60 healthy young women aged 22 ± 3 years (young group) and 60 community-dwelling older women aged 69 ± 5 years (older group). Six balance tests were employed: postural sway, one-leg standing, functional reach, timed up and go (TUG), gait, and the EquiTest. Exploratory factor analysis revealed three clearly interpretable factors in the young group. The first factor had high loadings on the EquiTest and was interpreted as ‘Reactive’. The second factor had high loadings on the postural sway test and was interpreted as ‘Static’. The third factor had high loadings on TUG and the gait test and was interpreted as ‘Dynamic’. Similarly, three interpretable factors were extracted in the older group. The first factor had high loadings on the postural sway test and the EquiTest and was therefore interpreted as ‘Static and Reactive’. The second factor, which had high loadings on the EquiTest, was interpreted as ‘Reactive’. The third factor, which had high loadings on TUG and the gait test, was interpreted as ‘Dynamic’. A covariance structure model was applied to the test data: the second-order factor was balance ability, and the first-order factors were the static, dynamic, and reactive factors, assumed to be measured by the six balance tests. The goodness-of-fit index (GFI) of the models was acceptable (young group, GFI = 0.931; older group, GFI = 0.923). The static, dynamic, and reactive factors loaded on balance ability at 0.21, 0.24, and 0.76 in the young group and 0.71, 0.28, and 0.43 in the older group, respectively. These results suggest that the factorial structure common to both groups comprises static, dynamic, and reactive balance, and that overall balance ability is characterized chiefly by reactive balance in young people but by static balance in older people.
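As a generic illustration of factor extraction (a bare-bones simplification, not the study's covariance structure model), the leading factor of a correlation matrix can be obtained by power iteration; the example correlation value below is hypothetical:

```python
def leading_factor(R, iters=200):
    """Power iteration for the leading eigenpair of a correlation matrix R:
    returns (eigenvalue, loadings). The loadings on the dominant eigenvector
    play the role of the first factor in a bare-bones exploratory factor
    analysis (no rotation, no uniquenesses -- a simplification of what real
    EFA software does)."""
    n = len(R)
    v = [1.0] * n  # start vector; assumed not orthogonal to the leading eigenvector
    for _ in range(iters):
        w = [sum(R[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    eigval = sum(v[i] * R[i][j] * v[j] for i in range(n) for j in range(n))
    return eigval, v

# Hypothetical correlation of 0.8 between two balance measures:
eig, loadings = leading_factor([[1.0, 0.8], [0.8, 1.0]])
```

For this 2x2 example the leading eigenvalue is 1.8 and both measures load equally on the single factor, mirroring how strongly correlated tests (e.g. postural sway and the EquiTest in the older group) collapse onto one factor.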

8.
In theoretical biology, we are often interested in random dynamical systems—like the brain—that appear to model their environments. This can be formalized by appealing to the existence of a (possibly non-equilibrium) steady state, whose density preserves a conditional independence between a biological entity and its surroundings. From this perspective, the conditioning set, or Markov blanket, induces a form of vicarious synchrony between creature and world—as if one were modelling the other. However, this results in an apparent paradox. If all conditional dependencies between a system and its surroundings depend upon the blanket, how do we account for the mnemonic capacity of living systems? It might appear that any shared dependence upon past blanket states violates the independence condition, as the variables on either side of the blanket now share information not available from the current blanket state. This paper aims to resolve this paradox, and to demonstrate that conditional independence does not preclude memory. Our argument rests upon drawing a distinction between the dependencies implied by a steady state density, and the density dynamics of the system conditioned upon its configuration at a previous time. The interesting question then becomes: What determines the length of time required for a stochastic system to ‘forget’ its initial conditions? We explore this question for an example system, whose steady state density possesses a Markov blanket, through simple numerical analyses. We conclude with a discussion of the relevance for memory in cognitive systems like us.
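The 'forgetting time' question can be sketched numerically with an Ornstein–Uhlenbeck process as a hypothetical stand-in for the paper's example system (this is not the authors' Markov-blanket model): the ensemble mean of paths started at a common initial condition decays toward the steady state at rate theta, so memory of the initial condition is lost on a timescale of roughly 1/theta.

```python
import math
import random

def ou_ensemble_mean(t, x0=2.0, theta=1.0, sigma=1.0, dt=0.01,
                     n_paths=2000, seed=1):
    """Euler-Maruyama simulation of dx = -theta*x dt + sigma dW for an
    ensemble of paths all started at x0. Returns the ensemble mean at
    time t, which should decay like x0 * exp(-theta * t): the rate at
    which the system 'forgets' where it started."""
    rng = random.Random(seed)
    xs = [x0] * n_paths
    for _ in range(int(round(t / dt))):
        xs = [x - theta * x * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
              for x in xs]
    return sum(xs) / n_paths
```

By t = 5/theta the ensemble mean is statistically indistinguishable from the steady-state mean of zero, even though each path remains conditionally coupled to its surroundings at every instant.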

9.
The university town of Camerino in central Italy hosted the Fourteenth International Conference on X-ray Absorption Fine Structure (XAFS-14) in July 2009. This historic town, surrounded by the rolling hills and beautiful countryside of the Marche region, was an ideal location for a stimulating conference, combining good meeting infrastructure with a feeling of remoteness from routine that participants much appreciated. Low registration fees, especially for students, resulted in a high attendance of about 500 participants. The XAFS conferences are organized by the International X-ray Absorption Society (IXAS, http://www.ixasportal.net/ixas); this edition's chairman was A. Di Cicco, with A. Filipponi acting as co-chair.

10.
Frequent lane changes raise serious traffic safety concerns, contributing to fatalities and serious injuries. This behavior is affected by several significant road-safety factors, and detecting and classifying the factors that most influence lane changing could help reduce its risks. The principal objective of this research is to estimate and prioritize the nominated crucial criteria and sub-criteria based on participants' answers to a designated questionnaire survey. To this end, this paper constructs a hierarchical lane-change model based on the analytic hierarchy process (AHP), with two levels of the attributes of greatest concern. The fuzzy analytic hierarchy process (FAHP) procedure was then applied, utilizing a fuzzy scale to evaluate precisely the most influential factors affecting lane changing and thereby decrease uncertainty in the evaluation process. Based on the final measured weights for level 1, the FAHP model estimation results revealed that the most influential variable affecting lane changing is 'traffic characteristics', whereas 'light conditions' was found to be the least critical factor related to driver lane-change maneuvers. For level 2, the FAHP model results showed 'traffic volume' to be the most critical factor influencing lane-change operations, followed by 'speed'. The objectivity of the model was supported by sensitivity analyses that examined a range of weight values and the corresponding rankings of the alternatives. Based on the evaluated results, stakeholders can set strategic policy by placing more emphasis on the highlighted risk factors associated with lane changing to improve road safety. In conclusion, the findings demonstrate the usefulness of the fuzzy analytic hierarchy process for reviewing lane-changing risks in road safety.
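The eigenvector step at the core of (crisp) AHP can be sketched as follows; the pairwise-comparison judgments below are hypothetical, and the paper's FAHP extension replaces such crisp judgments with fuzzy numbers before deriving weights:

```python
def ahp_weights(M, iters=100):
    """Priority weights from a reciprocal pairwise-comparison matrix M,
    via repeated multiplication and normalisation (the power method),
    i.e. the principal-eigenvector method of the analytic hierarchy
    process. M[i][j] expresses how much more important criterion i is
    than criterion j."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]  # normalise so the weights sum to 1
    return w

# Hypothetical judgments: criterion A is 2x as important as B and 4x as C.
weights = ahp_weights([[1, 2, 4], [0.5, 1, 2], [0.25, 0.5, 1]])
```

For this perfectly consistent matrix the weights come out as 4/7, 2/7, and 1/7; with real (inconsistent) survey judgments the same iteration still converges to the principal eigenvector, and a consistency ratio is usually checked alongside it.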

11.
12.
In recent years The Institute of Physics and The Physical Society has given very valuable support to the organization of large international conferences on several branches of physics: e.g. it did so in the case of the International Conference on Magnetism held in Nottingham in September 1964, when nearly 600 participants attended; the papers at this Conference were duly refereed and published as a large bound volume in association with The Proceedings of the Physical Society. Now, it is not always desirable or expedient to hold such large conferences more than about once in every three years. Nevertheless, there are certain specialized fields of research in magnetism—as indeed there are in many other branches of physics—where the rate of advance of knowledge is such that it is well worth while to organize smaller conferences with smaller numbers of participants but with a global distribution of the latter. To this need the I.P.P.S. has responded, and Conference Booklet No. 1 contains the proceedings of the International Conference on High Magnetic Fields and their Applications, organized jointly by its Solid-state Physics Sub-Committee and its Magnetism Group, under the chairmanship of Professor N. Kurti, for 120 participants at Nottingham, from 17th to 19th September 1969, with the valued support of the International Union of Pure and Applied Physics.

13.
The 2nd International Symposium on Recent Advances in Experimental Fluid Mechanics (REAFM2) was held at Koneru Lakshmaiah College of Engineering, Vijayawada, Andhra Pradesh, India, March 3–6, 2008. The conference attracted 120 participants from 12 countries and was organized in three parallel sessions on all four days. Each day, two keynote lectures were scheduled, one in the forenoon and one in the afternoon; following the keynote lectures, the three parallel sessions each grouped topics of a particular area of interest.

14.
Computational textual aesthetics aims at studying observable differences between aesthetic categories of text. We use Approximate Entropy to measure the (un)predictability in two aesthetic text categories, i.e., canonical fiction (‘classics’) and non-canonical fiction (with lower prestige). Approximate Entropy is determined for series derived from sentence-length values and the distribution of part-of-speech tags in windows of text. For comparison, we also include a sample of non-fictional texts. Moreover, we use Shannon Entropy to estimate degrees of (un)predictability due to frequency distributions in the entire text. Our results show that Approximate Entropy values differentiate canonical from non-canonical texts better than Shannon Entropy does, although this advantage does not hold for the classification of fictional vs. expository prose. Canonical and non-canonical texts thus differ in sequential structure, while inter-genre differences are a matter of the overall distribution of local frequencies. We conclude that canonical fictional texts exhibit a higher degree of (sequential) unpredictability compared with non-canonical texts, corresponding to the popular assumption that they are more ‘demanding’ and ‘richer’. In using Approximate Entropy, we propose a new method for text classification in the context of computational textual aesthetics.
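A compact implementation of the Approximate Entropy statistic (Pincus's ApEn, which we take to be the measure named here; the embedding dimension m = 2 and absolute tolerance r below are conventional but illustrative choices, and the paper applies it to sentence-length and POS-tag series rather than raw numbers):

```python
import math

def approx_entropy(series, m=2, r=0.2):
    """Approximate Entropy of a numeric sequence: phi(m) - phi(m+1),
    where phi(m) measures how often length-m templates recur within
    tolerance r (Chebyshev distance). Lower values mean a more regular,
    more predictable series. r is an absolute tolerance here; in
    practice it is often set to 0.2 times the series' std deviation."""
    def phi(m):
        n = len(series) - m + 1
        templates = [series[i:i + m] for i in range(n)]
        total = 0.0
        for t1 in templates:
            matches = sum(
                1 for t2 in templates
                if max(abs(a - b) for a, b in zip(t1, t2)) <= r
            )
            total += math.log(matches / n)  # matches >= 1 (self-match)
        return total / n
    return phi(m) - phi(m + 1)
```

A strictly periodic series scores near zero (fully predictable continuation), while an irregular series scores higher, which is the contrast the abstract exploits between canonical and non-canonical prose.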

15.
A summary is given of recent work on the electronic properties of quasi-two-dimensional systems, especially inversion layers in metal-insulator-semiconductor structures and electron layers on liquid helium, as reflected in papers presented at the International Conference on the Electronic Properties of Quasi-Two-Dimensional Systems held at Brown University in August 1975.

16.
Many complex fluids can be described by continuum hydrodynamic field equations, to which noise must be added in order to capture thermal fluctuations. In almost all cases, the resulting coarse-grained stochastic partial differential equations carry a short-scale cutoff, which is also reflected in numerical discretisation schemes. We draw together our recent findings concerning the construction of such schemes and the interpretation of their continuum limits, focusing, for simplicity, on models with a purely diffusive scalar field, such as ‘Model B’, which describes phase separation in binary fluid mixtures. We address the requirement that the steady-state entropy production rate (EPR) must vanish for any stochastic hydrodynamic model in thermal equilibrium. Only if this is achieved can a given discretisation scheme be relied upon to correctly calculate the nonvanishing EPR for ‘active field theories’, in which new terms that break detailed balance are deliberately added to the fluctuating hydrodynamic equations. To compute the correct probabilities of forward and time-reversed paths (whose ratio determines the EPR), we must treat carefully the so-called ‘spurious drift’ and other closely related terms that depend on the discretisation scheme. We show that such subtleties can arise not only from the temporal discretisation (as is well documented for stochastic ODEs with multiplicative noise) but also from the spatial discretisation, even when the noise is additive, as most active field theories assume. We then review how such noise can become multiplicative via off-diagonal couplings to additional fields that thermodynamically encode the underlying chemical processes responsible for activity. In this case, the spurious drift terms need careful accounting, not just to evaluate the EPR correctly but also to implement the Langevin dynamics itself numerically.
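The conserved structure that any valid spatial discretisation of Model B must respect can be illustrated with a minimal 1D sketch: an explicit Euler–Maruyama update with the conserved noise written as a discrete divergence, so the total order parameter is preserved exactly. All parameter values are hypothetical, and this deliberately ignores the spurious-drift subtleties that are the paper's main point:

```python
import math
import random

def model_b_step(phi, rng, dx=1.0, dt=0.01, D=1.0, noise=0.1):
    """One explicit update of a 1D Model-B-like conserved dynamics on a
    periodic lattice: dphi/dt = Laplacian(mu) + div(noise current), with
    chemical potential mu = phi^3 - phi - Laplacian(phi). Both the
    deterministic and noise terms are discrete divergences, so sum(phi)
    is conserved to floating-point precision."""
    n = len(phi)
    lap = lambda f, i: (f[(i + 1) % n] - 2 * f[i] + f[(i - 1) % n]) / dx**2
    mu = [phi[i] ** 3 - phi[i] - lap(phi, i) for i in range(n)]
    # conserved noise: discrete divergence of a white-noise current j
    j = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return [
        phi[i]
        + dt * D * lap(mu, i)
        + noise * math.sqrt(dt) * (j[i] - j[(i - 1) % n]) / dx
        for i in range(n)
    ]
```

Checking that `sum(phi)` stays constant over many steps is a cheap sanity test of the spatial scheme; verifying that the scheme also yields zero entropy production in equilibrium is the far subtler requirement the paper addresses.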

17.
We study the third-order optical nonlinearity of some interesting π-conjugated systems involving sulfur (S) atoms. The static second hyperpolarizabilities (γ) for 1,6,6a-trithiapentalene and its donor- and acceptor-disubstituted species are calculated by ab initio molecular orbital and density functional methods. Using second hyperpolarizability density analysis, these molecules are found to exhibit remarkable differences in the spatial π-electron contributions of the unusual bonding structure, i.e., the S–S–S bridged structure, to the longitudinal γ. This paper was originally presented at the 5th International Conference on Near Field Optics and Related Technologies (NFO-5), held on December 6–10, 1998 at Coganoi Bay Hotel, Shirahama, Japan, in cooperation with the Japan Society of Applied Physics and the Mombusho Grant-in-Aid for Scientific Research on Priority Areas “Near-field Nano-optics” project, sponsored by the Japan Society for the Promotion of Science.

18.
For a large ensemble of complex systems, a Many-System Problem (MSP) studies how heterogeneity constrains and hides structural mechanisms, and how to uncover hidden major factors from homogeneous parts. All member systems in an MSP share common governing principles of dynamics but differ in idiosyncratic characteristics. A typical dynamic underlies response features with respect to covariate features of quantitative or qualitative data types. Neither an all-systems-as-one-whole functional structure nor individual system-specific functional structures are assumed in such response-vs-covariate (Re–Co) dynamics. We developed a computational protocol for identifying collections of major factors of various orders underlying Re–Co dynamics. We first demonstrate the immanent effects of heterogeneity among member systems, which constrain the compositions of major factors and even hide essential ones. Secondly, we show that fuller collections of major factors are discovered by breaking heterogeneity into many homogeneous parts; this process further realizes Anderson’s “More is Different” phenomenon. We exploit the categorical nature of all features and develop a Categorical Exploratory Data Analysis (CEDA)-based major factor selection protocol. Information-theoretic measurements, namely conditional mutual information and conditional entropy, are heavily used in two selection criteria, C1 (confirmable) and C2 (irreplaceable). All conditional entropies are evaluated through contingency tables, with algorithmically computed reliability against the finite-sample phenomenon. We study one artificially designed MSP and then two real collectives of Major League Baseball (MLB) pitching dynamics, with 62 slider pitchers and 199 fastball pitchers, respectively. Finally, our MSP data-analysis techniques are applied to resolve a scientific issue related to the Rosenberg Self-Esteem Scale.
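The contingency-table-based conditional entropy underlying such selection criteria is a standard textbook quantity (this sketch is generic and not the authors' full CEDA protocol):

```python
import math

def conditional_entropy(table):
    """H(Y|X) in bits from a contingency table of joint counts: rows
    index X categories, columns index Y categories. H(Y|X) = 0 means X
    determines Y perfectly; H(Y|X) = H(Y) means X carries no information
    about Y, so low conditional entropy flags X as a candidate major
    factor for response Y."""
    total = sum(sum(row) for row in table)
    h = 0.0
    for row in table:
        row_total = sum(row)
        if row_total == 0:
            continue
        px = row_total / total
        for c in row:
            if c > 0:
                p_y_given_x = c / row_total
                h -= px * p_y_given_x * math.log2(p_y_given_x)
    return h
```

For instance, a diagonal table like `[[10, 0], [0, 10]]` gives H(Y|X) = 0 (X fully explains Y), while the uniform table `[[5, 5], [5, 5]]` gives 1 bit (X explains nothing about a binary Y).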

19.
VUVX2010, the 37th International Conference on Vacuum Ultraviolet and X-ray Physics, took place from July 11–16, 2010, on the campus of the University of British Columbia (UBC). This meeting was the first of the merged Vacuum Ultraviolet Radiation Physics and X-ray and Inner Shell Processes conference series; the immediately preceding conferences were VUV15 (Berlin, 2007) and X-08 (Paris, 2008). VUVX2010 brought together scientists from countries all over the world working with synchrotron-, laser-, and plasma-based sources of electromagnetic radiation in the vacuum ultraviolet (VUV), soft X-ray, and hard X-ray regions, and developing novel applications of these sources in a variety of fields. Topics presented ranged from basic physics to materials science and technology, from molecular reactions to the characterization of catalysts under working conditions, from biology to medical diagnostics, and from metrology to the development of advanced synchrotron and optical instrumentation. There were over 500 oral and poster presentations, with 480 attendees from 29 different countries. This conference took place on the fiftieth anniversary of the invention of the laser and in the year following the first operation of the Linac Coherent Light Source (LCLS), the world's first accelerator-based X-ray laser. It brought together the global community of VUV and X-ray scientists who use synchrotron-, laser-, and plasma-based sources of vacuum ultraviolet, soft X-ray, and hard X-ray light to explore new phenomena and to develop a better understanding of the physics of the interaction of light and matter.

20.
The 11th International Symposium on Flow Visualization (ISFV) was held at Notre Dame, IN, USA, August 8–12, 2004. The Symposium attracted 236 participants from around the world. The 52 technical sessions and two poster sessions covered a wide range of topics, as indicated in the keywords. Of the 182 submitted papers, 162 were presented, including 8 invited lectures; each morning and afternoon began with an invited lecture by an outstanding, recognized leader in the field. Toshio Kobayashi received the Leonardo da Vinci Award, an engraved plate, and presented the Leonardo da Vinci Memorial Lecture on “High-performance Computing and Visualization of Unsteady Turbulent Flows.” Toshio Kobayashi is very well known for his outstanding contributions in computational science and flow visualization, as well as his leadership in organizing conferences, workshops, and symposia on flow visualization. Ronald J. Adrian discussed “Visualization in Extreme Environments,” Rolf H. Engler described “Pressure-sensitive Paints and Temperature-sensitive Paints in Quantitative Wind Tunnel Studies,” William K. Blake explained “Cavitation as Flow Visualization Seeding,” Giovanni M. Carlomagno discussed “The Use of Colors in Thermo-fluid Dynamic Studies,” Ajit Yoganathan presented “A Gallery of Cardiovascular Fluid Flow Fields: From Heart Valves to Congenital Heart Disease,” Richard B. Miles described “Flow Visualization by Filtered Molecular and Particular Scattering,” and Thomas C. Gruber Jr. displayed a technique for “Visualization of Foreign Gases in Atmospheric Air.” At the end of the last day, Jürgen Kompenhans from DLR discussed the 12th ISFV, to be held in Germany in 2006. After this presentation there was a tour of the Hessert Laboratory for Aerospace Research.

