Similar documents
20 similar documents found (search time: 31 ms)
1.
In previous research, we showed that 'texts that tell a story' exhibit a statistical structure that is not Maxwell–Boltzmann but Bose–Einstein. Our explanation is that this is due to the presence of 'indistinguishability' in human language: identical words in different parts of a story are indistinguishable from one another, in much the same way that indistinguishability occurs in quantum mechanics, where it likewise leads to Bose–Einstein rather than Maxwell–Boltzmann statistics. In the current article, we set out to explain this Bose–Einstein statistics in human language. We show that it is the presence of 'meaning' in 'texts that tell a story' that gives rise to the lack of independence characteristic of Bose–Einstein statistics, and that this provides conclusive evidence that 'words can be considered the quanta of human language', structurally similar to how 'photons are the quanta of electromagnetic radiation'. Drawing on several studies of entanglement from our Brussels research group, and introducing the von Neumann entropy for human language, we also show that it is the presence of 'meaning' in texts that makes the entropy of a total text smaller than the entropy of the words composing it. We explain how these new insights fit into the research domain called 'quantum cognition', where quantum probability models and quantum vector spaces are used to model human cognition; they are also relevant to the use of quantum structures in information retrieval and natural language processing, where they introduce 'quantization' and 'Bose–Einstein statistics' as relevant quantum effects. Inspired by the conceptuality interpretation of quantum mechanics, and relying on these new insights, we put forward hypotheses about the nature of physical reality. In doing so, we note how this new type of decrease in entropy, and its explanation, may be important for the development of quantum thermodynamics. We likewise note how it can give rise to an original explanatory picture of the nature of physical reality on the surface of planet Earth, in which human culture emerges as a reinforcing continuation of life.
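The fitting procedure behind such claims can be illustrated with a toy sketch (hypothetical, not the authors' code): words are ranked by frequency, rank i is treated as 'energy level' E_i, and the rank-frequency profile is compared against Bose–Einstein and Maxwell–Boltzmann occupation laws. All parameter values below are invented for illustration.

```python
import math
from collections import Counter

def ranked_frequencies(text):
    """Word counts sorted in descending order; the rank of a word plays
    the role of an 'energy level' in radiation-like fits to texts."""
    return sorted(Counter(text.lower().split()).values(), reverse=True)

def bose_einstein(E, A, B):
    """Bose-Einstein occupation of level E (chemical potential set to 0)."""
    return A / (math.exp(E / B) - 1.0)

def maxwell_boltzmann(E, C, B):
    """Maxwell-Boltzmann occupation of level E."""
    return C * math.exp(-E / B)

# Toy text; real studies fit whole stories and compare goodness of fit.
text = "the cat saw the dog and the dog saw the cat"
freqs = ranked_frequencies(text)
be_curve = [bose_einstein(i + 1, 10.0, 5.0) for i in range(len(freqs))]
mb_curve = [maxwell_boltzmann(i + 1, 10.0, 5.0) for i in range(len(freqs))]
```

In an actual fit, A, B, and C would be chosen to minimize the deviation between the model curves and the empirical rank-frequency data.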

2.
The dependability of systems and networks has been a target of research for many years. In the 1970s, what is now known as the top conference on dependability—the IEEE/IFIP International Conference on Dependable Systems and Networks (DSN)—emerged, gathering international researchers and sparking the interest of the scientific community. Although it started with niche systems, dependability is nowadays viewed as highly important in most computer systems. The goal of this work is to analyze the research published in the proceedings of well-established dependability conferences (DSN, the International Symposium on Software Reliability Engineering (ISSRE), the International Symposium on Reliable Distributed Systems (SRDS), the European Dependable Computing Conference (EDCC), the Latin-American Symposium on Dependable Computing (LADC), and the Pacific Rim International Symposium on Dependable Computing (PRDC)), using Natural Language Processing (NLP), specifically the Latent Dirichlet Allocation (LDA) algorithm, to identify active, collapsing, ephemeral, and new lines of research in the dependability field. Results show a strong emphasis on terms like 'security', despite the general focus of the conferences on dependability, as well as new trends related to 'machine learning' and 'blockchain'. We used the PRDC conference as a use case; it showed similarity with the overall set of conferences, although we also found specific terms, like 'cyber-physical', that are popular at PRDC but not in the overall dataset.
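As a rough illustration of how term trajectories can be classified as active, collapsing, or new, here is a simplified stand-in for the paper's LDA-based pipeline. The per-year term lists are invented, and the classification rule (based only on first and last appearance) is far cruder than topic modeling.

```python
# Hypothetical per-year terms extracted from paper titles/abstracts.
yearly_terms = {
    2015: ["fault-injection", "security", "raid"],
    2018: ["security", "machine-learning", "security"],
    2021: ["machine-learning", "blockchain", "security"],
}

def term_trend(term, yearly):
    """Classify a term as 'new', 'collapsing', 'active', or 'absent'
    from its first and last appearance across the observed years."""
    years = sorted(y for y, terms in yearly.items() if term in terms)
    if not years:
        return "absent"
    all_years = sorted(yearly)
    if years[-1] < all_years[-1]:
        return "collapsing"      # disappeared before the last year
    if years[0] > all_years[0]:
        return "new"             # arrived after the first year
    return "active"              # present from start to end
```

A real analysis would track topic proportions per year rather than raw term presence, which is what LDA provides.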

3.
We consider a recently introduced generalization of the Ising model in which individual spin strength can vary. The model is intended for the analysis of ordering in systems comprising agents which, although matching in their binarity (i.e., maintaining the iconic Ising features of '+' or '−', 'up' or 'down', 'yes' or 'no'), differ in their strength. To investigate the interplay between the variable properties of nodes and the interactions between them, we study the model on a complex network where both the spin strength and degree distributions are governed by power laws. We show that in the annealed network approximation, the thermodynamic functions of the model are self-averaging, and we obtain an exact solution for the partition function. This allows us to derive the leading temperature and field dependencies of the thermodynamic functions, their critical behavior, and logarithmic corrections at the interfaces of the different phases. We find that the delicate interplay of the two power laws leads to new universality classes.
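For intuition about what "variable spin strength" means, the partition function of such a model can be computed exactly by brute force on a tiny, fully connected system. This is only a sketch of the model's definition, not the paper's annealed-network solution; the coupling J and temperature T below are illustrative.

```python
import itertools
import math

def partition_function(strengths, J, T):
    """Exact partition function of a small, fully connected Ising-type
    model in which spin i takes the two values +s_i or -s_i."""
    n = len(strengths)
    Z = 0.0
    for signs in itertools.product([-1, 1], repeat=n):
        spins = [sg * s for sg, s in zip(signs, strengths)]
        # Pairwise ferromagnetic interaction over all distinct pairs.
        energy = -J * sum(spins[i] * spins[j]
                          for i in range(n) for j in range(i + 1, n))
        Z += math.exp(-energy / T)
    return Z
```

Brute-force enumeration scales as 2^n and is only feasible for a handful of spins; the point of the paper's exact solution is precisely to avoid it.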

4.
Keywords in scientific articles have proven significant for information filtering and classification. In this article, we empirically investigated the statistical characteristics and evolutionary properties of keywords in a prominent journal, the Proceedings of the National Academy of Sciences of the United States of America (PNAS), including their frequency distribution, temporal scaling behavior, and decay factor. The empirical results indicate that keyword frequency in PNAS approximately follows a Zipf's law with exponent 0.86. In addition, there is a power-law correlation between the cumulative number of distinct keywords and the cumulative number of keyword occurrences. Extensive empirical analysis of data from several other journals is also presented, with the decaying trends of the most popular keywords monitored. Interestingly, top journals from various subjects share very similar decaying tendencies, while journals with low impact factors exhibit completely different behavior. These empirical characteristics may shed some light on an in-depth understanding of semantic evolutionary behavior. In addition, the analysis of keyword-based systems is helpful for the design of corresponding recommender systems.
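A Zipf exponent like the reported 0.86 is typically estimated as the slope of a log-log rank-frequency regression. A minimal sketch, using synthetic frequencies rather than PNAS data:

```python
import math

def zipf_exponent(freqs):
    """Zipf exponent estimated as minus the least-squares slope of
    log(frequency) against log(rank), for freqs sorted descending."""
    xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# Synthetic frequencies that follow rank**-0.86 exactly (not PNAS data).
freqs = [1000.0 * rank ** -0.86 for rank in range(1, 101)]
```

On real keyword counts the regression would recover the exponent only approximately, and the low-frequency tail is usually truncated before fitting.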

5.
Computational textual aesthetics aims at studying observable differences between aesthetic categories of text. We use Approximate Entropy to measure the (un)predictability in two aesthetic text categories, i.e., canonical fiction ('classics') and non-canonical fiction (with lower prestige). Approximate Entropy is determined for series derived from sentence-length values and from the distribution of part-of-speech tags in windows of text. For comparison, we also include a sample of non-fictional texts. Moreover, we use Shannon Entropy to estimate degrees of (un)predictability due to frequency distributions in the entire text. Our results show that Approximate Entropy values differentiate canonical from non-canonical texts better than Shannon Entropy, although this does not hold for the classification of fictional vs. expository prose. Canonical and non-canonical texts thus differ in sequential structure, while inter-genre differences are a matter of the overall distribution of local frequencies. We conclude that canonical fictional texts exhibit a higher degree of (sequential) unpredictability compared with non-canonical texts, corresponding to the popular assumption that they are more 'demanding' and 'richer'. In using Approximate Entropy, we propose a new method for text classification in the context of computational textual aesthetics.
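Approximate Entropy can be implemented in a few lines. The sketch below follows Pincus's standard definition (window length m, tolerance r) and is illustrative only, not the authors' implementation; the alternating test sequence is invented.

```python
import math

def approximate_entropy(series, m, r):
    """Approximate Entropy: phi(m) - phi(m+1), where phi measures how
    often length-k templates repeat within tolerance r (self-matches
    included). Lower values mean a more regular, predictable sequence."""
    def phi(k):
        n = len(series) - k + 1
        templates = [series[i:i + k] for i in range(n)]
        logsum = 0.0
        for t1 in templates:
            matches = sum(all(abs(a - b) <= r for a, b in zip(t1, t2))
                          for t2 in templates)
            logsum += math.log(matches / n)
        return logsum / n
    return phi(m) - phi(m + 1)

# A strictly alternating sequence is almost perfectly predictable.
apen_regular = approximate_entropy([1, 2] * 20, m=2, r=0.5)
```

In the study's setting, the series would be sentence lengths or windowed part-of-speech distributions rather than a toy sequence.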

6.
We introduce a quantum key distribution protocol based on the mean multi-kings' problem. Using this protocol, a sender can share a bit sequence as a secret key with receivers. We consider the relation between the information gained by an eavesdropper and the disturbance introduced into the legitimate users' information. In the BB84 protocol, such a relation is known as the information-disturbance theorem. We focus on a setting in which the sender and two receivers try to share bit sequences and the eavesdropper tries to extract information by coupling the legitimate users' systems to an ancilla system. We derive trade-off inequalities between the distinguishability, for the eavesdropper, of the quantum states corresponding to the bit sequence and the error probability of the bit sequence shared by the legitimate users. Our inequalities show that an eavesdropper extracting information about the secret keys inevitably disturbs the states and increases the error probability.
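The information-disturbance trade-off has a familiar toy illustration in BB84 itself: a full intercept-resend attack induces a 25% error rate in the sifted key. The deterministic enumeration below shows that standard figure; it is a sketch of plain BB84, not of the mean multi-kings' protocol introduced in the paper.

```python
from itertools import product

def intercept_resend_error_rate():
    """Expected sifted-key error rate when Eve measures every qubit in a
    random basis and resends, while Alice and Bob use matching bases."""
    total_error = 0.0
    cases = 0
    for bit, eve_matches in product([0, 1], [True, False]):
        # If Eve guessed the basis, she resends the correct state: no error.
        # If not, her resent state is in the wrong basis for Bob, whose
        # measurement then yields a uniformly random bit: error prob 1/2.
        total_error += 0.0 if eve_matches else 0.5
        cases += 1
    return total_error / cases
```

The nonzero error rate is exactly the "disturbance" side of the theorem: Eve cannot learn the bits without leaving a statistical trace.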

7.
In the physical foundation of his radiation formula in his December 1900 talk and subsequent 1901 article, Planck refers to Boltzmann's 1877 combinatorial-probabilistic treatment and obtains his quantum distribution function, which Boltzmann did not. For this reason, Boltzmann's memoirs are usually ascribed to classical statistical mechanics. In agreement with Bach, it is shown that Boltzmann's 1868 and 1877 calculations can lead to a Planckian distribution function, with those of 1868 even closer to Planck than those of 1877. Boltzmann's and Planck's calculations are compared on the basis of Bach's three-level scheme 'configuration–occupation–occupancy'. Special attention is paid to the concepts of interchangeability and the indistinguishability of particles and states. In contrast to Bach, the level of exposition is kept elementary. I hope to make Boltzmann's work better known in English and to remove misunderstandings in the literature.
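The counting at stake in Planck's 1900 derivation, the number of ways ('complexions') to distribute P indistinguishable energy quanta over N resonators, is the binomial coefficient C(N+P-1, P). A minimal worked sketch:

```python
from math import comb

def planck_complexions(n_resonators, p_quanta):
    """Number of ways to distribute p indistinguishable quanta over
    n distinguishable resonators (stars-and-bars counting)."""
    return comb(n_resonators + p_quanta - 1, p_quanta)
```

For example, 3 quanta over 2 resonators admit the four occupancies (0,3), (1,2), (2,1), (3,0); treating the quanta as distinguishable instead would give 2**3 = 8 arrangements, which is the Maxwell–Boltzmann counting.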

8.
Frequent lane changes cause serious traffic safety concerns, including fatalities and serious injuries. This phenomenon is affected by several significant road-safety factors. Detecting and classifying the significant factors affecting lane changing could help reduce the risk of frequent lane changes. The principal objective of this research is to estimate and prioritize the nominated crucial criteria and sub-criteria based on participants' answers to a designated questionnaire survey. To this end, this paper constructs a hierarchical lane-change model based on the analytic hierarchy process (AHP), with two levels of the attributes of greatest concern. The fuzzy analytic hierarchy process (FAHP) procedure was then applied, using a fuzzy scale to evaluate more precisely the most influential factors affecting lane changing and thereby decrease uncertainty in the evaluation process. Based on the final measured weights for level 1, the FAHP model estimation results revealed that the most influential variable affecting lane changing is 'traffic characteristics'. In contrast, compared with the other specified factors, 'light conditions' was found to be the least critical factor related to driver lane-change maneuvers. For level 2, the FAHP model results showed 'traffic volume' as the most critical factor influencing lane-change operations, followed by 'speed'. The objectivity of the model was supported by sensitivity analyses examining a range of weight values and the corresponding alternative rankings. Based on these results, stakeholders can set strategic policy by placing more emphasis on the highlighted risk factors associated with lane changing to improve road safety. In conclusion, the findings demonstrate the usefulness of the fuzzy analytic hierarchy process for reviewing lane-changing risks in road safety.
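The FAHP weighting step builds on the classical AHP priority computation. One common approximation derives the weights from the row geometric means of the pairwise-comparison matrix; the sketch below shows crisp AHP, not the paper's fuzzy variant, and the 2x2 comparison matrix is invented.

```python
import math

def ahp_weights(matrix):
    """AHP priority weights from a pairwise-comparison matrix, via row
    geometric means (the logarithmic least-squares approximation to the
    principal eigenvector method)."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

# Hypothetical comparison: criterion 1 judged twice as important as 2,
# so the matrix holds the ratio 2 and its reciprocal 1/2.
weights = ahp_weights([[1.0, 2.0], [0.5, 1.0]])
```

In FAHP, each matrix entry becomes a fuzzy (e.g., triangular) number, and the geometric means are computed componentwise before defuzzification.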

9.
Balance impairment is one of the biggest risk factors for falls, which lead to inactivity and can result in the need for nursing care. Balance ability is therefore crucial for older adults to maintain independent daily living. Many tests of balance ability have been developed. However, few reports reveal the structure underlying the results of balance performance tests in comparisons of young and older adults. Covariance structure analysis is a tool used to test statistically whether a factorial structure fits the data. This study examined aging effects on the factorial structure underlying balance performance tests. Participants comprised 60 healthy young women aged 22 ± 3 years (young group) and 60 community-dwelling older women aged 69 ± 5 years (older group). Six balance tests were employed: postural sway, one-leg standing, functional reach, timed up and go (TUG), gait, and the EquiTest. Exploratory factor analysis revealed three clearly interpretable factors in the young group. The first factor had high loadings on the EquiTest and was interpreted as 'Reactive'. The second factor had high loadings on the postural sway test and was interpreted as 'Static'. The third factor had high loadings on TUG and the gait test and was interpreted as 'Dynamic'. Similarly, three interpretable factors were extracted in the older group. The first factor had high loadings on the postural sway test and the EquiTest and was therefore interpreted as 'Static and Reactive'. The second factor, which had high loadings on the EquiTest, was interpreted as 'Reactive'. The third factor, which had high loadings on TUG and the gait test, was interpreted as 'Dynamic'. A covariance structure model was applied to the test data: the second-order factor was balance ability, and the first-order factors were static, dynamic, and reactive factors, assumed to be measured by the six balance tests. The goodness-of-fit index (GFI) of each model was acceptable (young group, GFI = 0.931; older group, GFI = 0.923). The loadings of the static, dynamic, and reactive factors on balance ability were 0.21, 0.24, and 0.76 in the young group and 0.71, 0.28, and 0.43 in the older group, respectively. This suggests that the common factorial structure of balance ability comprises static, dynamic, and reactive factors, and that balance ability is characterized chiefly by reactive balance in young people and by static balance in older people.

10.
Bibliometric techniques (i.e., citation analysis) are used to evaluate the impact and standing of Solid State Communications (SSC) among its competitor journals covering the field of condensed matter. In most cases, the analysis covers all issues dating back to the journal's inception in 1963. In some cases, however, it covers only articles published after 1973 because of limited access to earlier data in the available search system. A listing of the most cited articles that have appeared in SSC since its inception is given; several include Nobel laureates among their authors. An analysis of the articles that remained uncited is also presented. Bibliometric data from the Institute for Scientific Information (ISI), such as the Journal Impact Factor (JIF), the Citing Half-Life, and the Cited Half-Life, are compared with those of other journals covering condensed matter and related fields. Furthermore, an analysis of impact according to the countries of origin of the authors is presented. A discussion of the results exhibited in the tables and figures is given.

11.
After a presentation of Max Born's most salient biographical data, we discuss his contributions to science and science policy, with special emphasis on those related to condensed matter physics. Our discussion includes journal articles as well as books. The methodology used is both qualitative and quantitative, including the number of items, the number of formal and informal citations, and other bibliometric indicators such as the recently proposed Hirsch index (h-index). The data are mainly based on the Thomson/ISI Web of Science (WoS), which covers a carefully selected set of the more prestigious journals dating back to 1900. Born's books and articles not published in the journals covered by the WoS can also be evaluated, provided they are cited within WoS journals. Some anecdotal and historical details, which came to the fore in the course of our bibliometric investigations, are included.
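The Hirsch index mentioned here has a simple definition: the largest h such that h of an author's papers each have at least h citations. A minimal sketch:

```python
def h_index(citations):
    """h-index: largest h such that h papers have >= h citations each."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h
```

For example, citation counts [10, 8, 5, 4, 3] give h = 4, since four papers have at least four citations each but not five papers with five.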

12.
Many complex fluids can be described by continuum hydrodynamic field equations, to which noise must be added in order to capture thermal fluctuations. In almost all cases, the resulting coarse-grained stochastic partial differential equations carry a short-scale cutoff, which is also reflected in numerical discretisation schemes. We draw together our recent findings concerning the construction of such schemes and the interpretation of their continuum limits, focusing, for simplicity, on models with a purely diffusive scalar field, such as 'Model B', which describes phase separation in binary fluid mixtures. We address the requirement that the steady-state entropy production rate (EPR) must vanish for any stochastic hydrodynamic model in thermal equilibrium. Only if this is achieved can a given discretisation scheme be relied upon to correctly calculate the nonvanishing EPR for 'active field theories', in which new terms that break detailed balance are deliberately added to the fluctuating hydrodynamic equations. To compute the correct probabilities of forward and time-reversed paths (whose ratio determines the EPR), we must treat carefully the so-called 'spurious drift' and other closely related terms that depend on the discretisation scheme. We show that such subtleties can arise not only in the temporal discretisation (as is well documented for stochastic ODEs with multiplicative noise) but also in the spatial discretisation, even when the noise is additive, as most active field theories assume. We then review how such noise can become multiplicative via off-diagonal couplings to additional fields that thermodynamically encode the underlying chemical processes responsible for activity. In this case, the spurious drift terms need careful accounting, not just to evaluate the EPR correctly but also to implement the Langevin dynamics itself numerically.
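To make the setting concrete, here is a deliberately naive explicit discretisation of 1-D Model B with conserved noise, a sketch only: it ignores precisely the spurious-drift subtleties the article analyses, and all parameter values are invented. Writing both the deterministic and noisy fluxes in divergence form keeps the total order parameter conserved.

```python
import random

def model_b_step(phi, dt, dx, a, b, kappa, temp, rng):
    """One naive explicit Euler-Maruyama step of 1-D Model B on a
    periodic grid: dphi/dt = Lap(mu) + div(noise), with chemical
    potential mu = a*phi + b*phi**3 - kappa*Lap(phi)."""
    n = len(phi)
    def lap(f, i):
        return (f[(i + 1) % n] - 2.0 * f[i] + f[(i - 1) % n]) / dx ** 2
    mu = [a * phi[i] + b * phi[i] ** 3 - kappa * lap(phi, i)
          for i in range(n)]
    # Gaussian flux on each link i -> i+1; adding its discrete divergence
    # conserves sum(phi) exactly, because both sums telescope.
    flux = [rng.gauss(0.0, (2.0 * temp * dt) ** 0.5 / dx) for _ in range(n)]
    return [phi[i] + dt * lap(mu, i) + (flux[i] - flux[(i - 1) % n]) / dx
            for i in range(n)]

phi0 = [0.1, -0.2, 0.05, 0.05]
phi1 = model_b_step(phi0, dt=1e-4, dx=1.0, a=-1.0, b=1.0, kappa=1.0,
                    temp=0.1, rng=random.Random(42))
```

Conservation is the one property this scheme gets right by construction; getting the equilibrium EPR to vanish requires the careful treatment of discretisation-dependent terms that the article describes.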

13.
14.
The term entropy is used with different meanings in different contexts, sometimes in contradictory ways, resulting in misunderstanding and confusion. The root cause of the problem is the close resemblance between the defining mathematical expressions of entropy in statistical thermodynamics and of information in the communications field (also called entropy), which differ only by a constant factor, with units of J/K in thermodynamics and bits in information theory. The thermodynamic property entropy is closely associated with the physical quantities of thermal energy and temperature, while the entropy used in the communications field is a mathematical abstraction based on the probabilities of messages. The terms information and entropy are often used interchangeably in several branches of science. This practice gives rise to the phrase 'conservation of entropy' in the sense of conservation of information, which contradicts the fundamental increase-of-entropy principle of thermodynamics, an expression of the second law. The aim of this paper is to clarify matters and eliminate confusion by putting things into their rightful places within their domains. The notion of conservation of information is also put into proper perspective.
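The 'constant factor' relating the two entropies can be made explicit: for the same probability distribution, the thermodynamic (Gibbs) entropy equals k_B ln 2 times the Shannon entropy in bits. A small sketch:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy_bits(probs):
    """Information entropy H = -sum p*log2(p), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Statistical-thermodynamic entropy S = -k_B * sum p*ln(p), in J/K.
    For the same distribution, S = (k_B * ln 2) * H_bits."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)
```

A fair coin carries H = 1 bit, corresponding to S = k_B ln 2, roughly 9.57e-24 J/K: numerically identical structure, physically very different units.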

15.
I will argue that, in an interdisciplinary study of consciousness, epistemic structural realism (ESR) can offer a feasible philosophical background for the study of consciousness and its associated neurophysiological phenomena in neuroscience and cognitive science, while also taking into account the mathematical structures involved in this type of research. Applying ESR principles to the study of the neurophysiological phenomena associated with free will (or rather conscious free choice) and with various alterations of consciousness (AOCs) generated by pathologies such as epilepsy would add explanatory value to the matter. This interdisciplinary approach is in tune with Quine's well-known idea that philosophy is not simple conceptual analysis but is continuous with science, representing in effect an abstract branch of empirical research. ESR could thus resonate with scientific models of consciousness such as the global neuronal workspace model (inspired by the global workspace theory, GWT) and the integrated information theory (IIT) model. While structural realism has already been employed in physics and biology, its application as a meta-theory contextualising and relating various scientific findings on consciousness is new indeed. Of the two variants, ontic structural realism (OSR) and epistemic structural realism (ESR), the latter can be considered more suitable for the study of consciousness and its associated neurophysiological phenomena, because it removes the pressure of the still unanswered ontological question 'What is consciousness?' and allows us to concentrate instead on the epistemological question 'What can we know about consciousness?'.

16.
The purpose of this review is to provide a concise overview of recent advances in the broadly defined field of Raman spectroscopy, as reflected in part by the many articles published each year in the Journal of Raman Spectroscopy (JRS) as well as by trends across all related journals publishing in this research area. Context for the review is provided by statistical citation data from the Thomson Reuters ISI Web of Science, by year and by subfield of Raman spectroscopy. Additional statistics on the number of papers and posters presented by category at the XXII International Conference on Raman Spectroscopy (ICORS 2010) are also provided. Papers published in JRS in 2009, as reviewed here, reflect trends at the cutting edge of Raman spectroscopy, which is expanding rapidly as a sensitive photonic probe of matter at the molecular level with an ever-widening sphere of novel applications. Copyright © 2010 John Wiley & Sons, Ltd.

17.
Hepatic stellate cells (HSCs) are vitamin A-storing, collagen-producing cells in hepatic lobules. The three-dimensional structure of HSCs has been demonstrated with the Golgi method, the maceration method for scanning electron microscopy, and confocal laser scanning microscopy. Many thorn-like microprojections, or spines, extend from the subendothelial processes and make contact with hepatocytes. One HSC entwines two or more sinusoids and about 20–40 hepatocytes to create a cellular unit, the 'stellate cell unit' or 'stellon'. The space of Disse is defined as the space between the stellate cell-endothelial cell complex and the hepatocytes. The intralobular heterogeneity of HSCs is assessed. HSCs develop from mesenchymal cells in the septum transversum, and their developmental process is partly reproduced in culture. In the lamprey, abundant vitamin A is stored not only in HSCs but also in fibroblast-like cells in various other splanchnic organs. In vertebrates, the existence of both a conventional fibroblast system in somatic tissues and a vitamin A-storing cell system in splanchnic organs is suggested.

18.

Objective

The purpose of this study was to evaluate the levels of evidence in the voice literature.

Study Design

Retrospective literature review.

Methods

Retrospective review of all original articles published between January 2004 and December 2009 in four general otolaryngology journals and one subspecialty voice journal. All abstracts related to voice were evaluated and assigned an evidence-based medicine rating, graded A–D and 1a–5. Articles were also stratified over two consecutive 3-year intervals to assess changes over the study period.

Results

Of the 6052 articles published, 950 (15.6%) were related to voice. Six hundred seventy-three articles (10.2%) were clinical articles, and 277 (4.6%) were basic science. Only 1% of the clinical articles were level A, 17% were level B, 73% were level C, and 9% were level D. No noticeable changes occurred in the levels of evidence between the first and last 3-year intervals of the study, although the share of basic science articles increased from 24.4% to 32.4%.

Conclusion

Despite strong recent interest in improving the quality of the evidence in the literature, the voice literature remains primarily level C and D, with no appreciable change over the past 6 years.

19.
Summary statistical data are presented on the journal Kvantovaya Élektronika (English translation: Soviet Journal of Quantum Electronics) for the first 15 years of its publication, 1974 through 1988. These include the number of articles published, their distribution by subject, statistics on the authors of the papers, and questions concerning the size of the journal and the effective utilization of its pages. Relative estimates are made of the contributions of different journals to the overall flow of information, and the shares of different topics in the sum total of the articles are given. Translated from Preprint No. 166 of the Lebedev Physics Institute, Academy of Sciences of the USSR, Moscow, 1990.

20.
A global event such as the COVID-19 crisis presents new, often unexpected responses that are fascinating to investigate from both scientific and social standpoints. Despite several documented similarities, the coronavirus pandemic is clearly distinct from the 1918 flu pandemic in terms of our exponentially increased, almost instantaneous ability to access and share information, offering an unprecedented opportunity to visualise the rippling effects of global events across space and time. Personal devices provide 'big data' on people's movement, the environment, and economic trends, while the unprecedented flurry of scientific publications and media posts provides a measure of the response of the educated world to the crisis. Most bibliometric (co-authorship, co-citation, or bibliographic coupling) analyses ignore the time dimension, but COVID-19 has made it possible to perform a detailed temporal investigation of the pandemic. Here, we report a comprehensive network analysis based on more than 20,000 published documents on viral epidemics, authored by over 75,000 individuals from 140 nations in the first year of the crisis. Unlike for the 1918 flu pandemic, access to published data over the past two decades enabled a comparison of publishing trends between the ongoing COVID-19 pandemic and the 2003 SARS epidemic, to study changes in thematic foci and in the societal pressures dictating research over the course of a crisis.
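The network construction underlying such co-authorship analyses starts from weighted edges between authors who share papers. A minimal sketch, with invented author lists; a temporal analysis would simply build one such edge set per time slice:

```python
from collections import Counter
from itertools import combinations

def coauthorship_edges(papers):
    """Weighted co-authorship edges from a list of per-paper author lists.
    Each unordered author pair gains weight 1 per shared paper."""
    edges = Counter()
    for authors in papers:
        for a, b in combinations(sorted(set(authors)), 2):
            edges[(a, b)] += 1
    return edges

# Hypothetical corpus of two papers.
edges = coauthorship_edges([["A", "B", "C"], ["A", "B"]])
```

Co-citation and bibliographic coupling networks are built analogously, with papers as nodes and shared references or citing papers as edge weights.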
