Similar documents
20 similar documents found (search time: 46 ms)
1.
This article considers a partly philosophical question: What are the ontological and epistemological reasons for using quantum-like models or theories (models and theories based on the mathematical formalism of quantum theory) vs. classical-like ones (based on the mathematics of classical physics) in considering human thinking and decision making? This question is only partly philosophical because it also concerns the scientific understanding of the phenomena considered by the theories that use mathematical models of either type, just as in physics itself, where this question also arises as a physical question. This is because this question is in effect: What are the physical reasons for using, even if not requiring, these types of theories in considering quantum phenomena, which these theories predict fully in accord with experiment? This is clearly also a physical, rather than only philosophical, question, and so is, accordingly, the question of whether one needs classical-like or quantum-like theories or both (just as in physics we use both classical and quantum theories) in considering human thinking in psychology and related fields, such as decision science. It comes as no surprise that many of these reasons are parallel to those that are responsible for the use of quantum mechanics (QM) and quantum field theory (QFT) in the case of quantum phenomena. Still, the corresponding situations should be understood and justified in terms of the phenomena considered, phenomena defined by human thinking, because there are important differences between these phenomena and quantum phenomena, which this article aims to address. In order to do so, this article will first consider quantum phenomena and quantum theory before turning to human thinking and decision making, in addressing which it will also discuss two recent quantum-like approaches to human thinking, that by M. G. D’Ariano and F. Faggin and that by A. Khrennikov. Both approaches are ontological in the sense of offering representations, different in character in each approach, of human thinking by the formalism of quantum theory. Whether such a representation, as opposed to only predicting the outcomes of relevant experiments, is possible either in quantum theory or in quantum-like theories of human thinking is one of the questions addressed in this article. The philosophical position adopted here is that it may not be possible to make this assumption, which, however, is not the same as saying that it is impossible. I designate this view as the reality-without-realism (RWR) view and, in considering strictly mental processes, as the ideality-without-idealism (IWI) view, in the second case in part following, but also moving beyond, I. Kant’s philosophy.

2.
This paper assesses two different theories for explaining consciousness, a phenomenon that is widely considered amenable to scientific investigation despite its puzzling subjective aspects. I focus on Integrated Information Theory (IIT), which says that consciousness is integrated information (as ϕMax) and that even simple systems with interacting parts possess some consciousness. First, I evaluate IIT on its own merits. Second, I compare it to a more traditionally derived theory called Neurobiological Naturalism (NN), which says consciousness is an evolved, emergent feature of complex brains. Comparing these theories is informative because it reveals the strengths and weaknesses of each, thereby suggesting better ways to study consciousness in the future. IIT’s strengths are the reasonable axioms at its core; its strong logic and mathematical formalism; its creative “experience-first” approach to studying consciousness; the way it avoids the mind-body (“hard”) problem; its consistency with evolutionary theory; and its many scientifically testable predictions. The potential weakness of IIT is that it contains stretches of logic-based reasoning that were not checked against hard evidence when the theory was being constructed, whereas scientific arguments require such supporting evidence to keep the reasoning on course. This is less of a concern for the other theory, NN, because it incorporated evidence much earlier in its construction process. NN is a less mature theory than IIT, less formalized and quantitative, and less well tested. However, it has identified its own neural correlates of consciousness (NCCs) and offers a roadmap through which these NCCs may answer the questions of consciousness using the hypothesize-test-hypothesize-test steps of the scientific method.

3.
I numerically simulate and compare the entanglement of two quanta using the conventional formulation of quantum mechanics and a time-symmetric formulation that has no collapse postulate. The experimental predictions of the two formulations are identical, but the entanglement predictions are significantly different. The time-symmetric formulation reveals an experimentally testable discrepancy in the original quantum analysis of the Hanbury Brown–Twiss experiment, suggests solutions to some parts of the nonlocality and measurement problems, fixes known time asymmetries in the conventional formulation, and answers Bell’s question “How do you convert an ‘and’ into an ‘or’?”

4.
Completely locked-in state (CLIS) patients are unable to speak and have lost all muscle movement. From the outside, the internal brain activity of such patients cannot be easily perceived, but CLIS patients are considered to still be conscious and cognitively active. Detecting the current state of consciousness of CLIS patients is non-trivial, and it is difficult to ascertain whether they are conscious or not. Thus, it is important to find alternative ways to re-establish communication with these patients during periods of awareness, and one such alternative is through a brain–computer interface (BCI). In this study, multiscale-based methods (multiscale sample entropy, multiscale permutation entropy and multiscale Poincaré plots) were applied to analyze electrocorticogram signals from a CLIS patient to detect the underlying consciousness level. Results from these different methods converge to a specific period of awareness of the CLIS patient in question, coinciding with the period during which the patient is recorded to have communicated with an experimenter. The aim of the investigation is to propose a methodology that could be used to establish reliable communication with CLIS patients.
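To make the entropy-based analysis above more concrete, the following is a minimal sketch of multiscale sample entropy, one of the multiscale methods the abstract mentions; the surrogate signal, the embedding dimension m and the tolerance r are illustrative assumptions, not the authors' actual pipeline or parameters.

```python
# Minimal sketch of multiscale sample entropy (illustrative, not the study's code).
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy of a 1-D signal; r is given as a fraction of the signal's SD."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    n = len(x)

    def count_matches(length):
        # Embed the signal into overlapping templates of the given length.
        templates = np.array([x[i:i + length] for i in range(n - length + 1)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance between template i and all later templates.
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= tol)
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_sample_entropy(x, max_scale=5):
    """Coarse-grain the signal at each scale, then compute sample entropy."""
    x = np.asarray(x, dtype=float)
    mse = []
    for tau in range(1, max_scale + 1):
        n_coarse = len(x) // tau
        coarse = x[:n_coarse * tau].reshape(n_coarse, tau).mean(axis=1)
        mse.append(sample_entropy(coarse))
    return mse

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ecog_like = rng.standard_normal(2000)  # stand-in for one ECoG channel
    print(multiscale_sample_entropy(ecog_like, max_scale=4))
```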

5.
This review looks at some of the central relationships between artificial intelligence, psychology, and economics through the lens of information theory, focusing specifically on formal models of decision theory. In doing so, we look at a particular approach that each field has adopted and how information theory has informed the development of the ideas of each field. A key theme is expected utility theory, its connection to information theory, the Bayesian approach to decision-making and forms of (bounded) rationality. What emerges from this review is a broadly unified formal perspective derived from three very different starting points that reflect the unique principles of each field. Each of the three approaches reviewed can, in principle at least, be implemented in a computational model in such a way that, with sufficient computational power, they could be compared with human abilities in complex tasks. However, a central critique that can be applied to all three approaches was first put forward by Savage in The Foundations of Statistics and recently brought to the fore by the economist Binmore: Bayesian approaches to decision-making work in what Savage called ‘small worlds’ but cannot work in ‘large worlds’. This point, in various different guises, is central to some of the current debates about the power of artificial intelligence and its relationship to human-like learning and decision-making. Recent work on artificial intelligence has gone some way to bridging this gap, but significant questions remain to be answered in all three fields in order to make progress in producing realistic models of human decision-making in the real world in which we live.
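As a concrete illustration of the shared formal core the review surveys (Bayesian belief updating followed by expected-utility maximisation in a 'small world'), here is a minimal sketch; the states, observations, utilities and probabilities below are invented for illustration only.

```python
# Toy 'small world' decision problem: Bayesian updating + expected utility.
import numpy as np

states = ["rain", "sun"]
actions = ["umbrella", "no_umbrella"]

# Utility table u[action][state] -- illustrative numbers only.
utility = np.array([[ 1.0, 0.5],    # umbrella: fine in rain, mild nuisance in sun
                    [-2.0, 1.0]])   # no umbrella: bad in rain, best in sun

prior = np.array([0.3, 0.7])                 # P(state)
likelihood = np.array([[0.8, 0.2],           # P(obs="dark clouds" | state)
                       [0.2, 0.8]])          # P(obs="clear sky"   | state)

def posterior(obs_index):
    """Bayes' rule: P(state | obs) is proportional to P(obs | state) * P(state)."""
    unnorm = likelihood[obs_index] * prior
    return unnorm / unnorm.sum()

def best_action(obs_index):
    """Choose the action maximising expected utility under the posterior."""
    p = posterior(obs_index)
    expected = utility @ p               # E[u(a)] for each action
    return actions[int(np.argmax(expected))], expected

print(best_action(0))   # observed dark clouds
print(best_action(1))   # observed clear sky
```

The 'large world' critique discussed in the review is precisely that such a model presupposes a complete, closed list of states, actions and probabilities, which real-world decision problems rarely provide.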

6.
7.
This article reconsiders the concept of physical reality in quantum theory and the concept of quantum measurement, following Bohr, whose analysis of quantum measurement led him to his concept of a (quantum) “phenomenon,” referring to “the observations obtained under the specified circumstances,” in the interaction between quantum objects and measuring instruments. This situation makes the terms “observation” and “measurement,” as conventionally understood, inapplicable. These terms are remnants of classical physics, or of still earlier history from which classical physics inherited them. As defined here, a quantum measurement does not measure any preexisting property of the ultimate constitution of the reality responsible for quantum phenomena. An act of measurement establishes a quantum phenomenon by an interaction between the instrument and the quantum object or, in the present view, the ultimate constitution of the reality responsible for quantum phenomena and, at the time of measurement, also quantum objects. In the view advanced in this article, in contrast to that of Bohr, quantum objects, such as electrons or photons, are assumed to exist only at the time of measurement and not independently, a view that redefines the concept of quantum object as well. This redefinition becomes especially important in high-energy quantum regimes and quantum field theory, and allows this article to define a new concept of quantum field. The article also considers, now following Bohr, quantum measurement as the entanglement between quantum objects and measuring instruments. The argument of the article is grounded in the concept of “reality without realism” (RWR), as underlying quantum measurement thus understood, and the view, the RWR view, of quantum theory defined by this concept. The RWR view places a stratum of physical reality thus designated, here the reality ultimately responsible for quantum phenomena, beyond representation or knowledge, or even conception, and defines the corresponding set of interpretations of quantum mechanics or quantum field theory, such as the one assumed in this article, in which, again, not only quantum phenomena but also quantum objects are (as idealizations) defined by measurement. As such, the article also offers a broadly conceived response to J. Bell’s argument “against ‘measurement’”.

8.
Current physics commonly qualifies the Earth system as ‘complex’ because it includes numerous different processes operating over a large range of spatial scales, often modelled as exhibiting non-linear chaotic response dynamics and power scaling laws. This characterization is based on the fundamental assumption that the Earth’s complexity could, in principle, be modelled by (surrogated by) a numerical algorithm if enough computing power were granted. Yet, similar numerical algorithms also surrogate different systems having the same processes and dynamics, such as Mars or Jupiter, although being qualitatively different from the Earth system. Here, we argue that understanding the Earth as a complex system requires a consideration of the Gaia hypothesis: the Earth is a complex system because it instantiates life—and therefore an autopoietic, metabolic-repair (M,R) organization—at a planetary scale. This implies that the Earth’s complexity has formal equivalence to a self-referential system that inherently is non-algorithmic and, therefore, cannot be surrogated and simulated in a Turing machine. We discuss the consequences of this, with reference to in-silico climate models, tipping points, planetary boundaries, and planetary feedback loops as units of adaptive evolution and selection.

9.
The dependability of systems and networks has been the target of research for many years now. In the 1970s, what is now known as the top conference on dependability—the IEEE/IFIP International Conference on Dependable Systems and Networks (DSN)—emerged, gathering international researchers and sparking the interest of the scientific community. Although it started in niche systems, nowadays dependability is viewed as highly important in most computer systems. The goal of this work is to analyze the research published in the proceedings of well-established dependability conferences (i.e., DSN, the International Symposium on Software Reliability Engineering (ISSRE), the International Symposium on Reliable Distributed Systems (SRDS), the European Dependable Computing Conference (EDCC), the Latin-American Symposium on Dependable Computing (LADC), and the Pacific Rim International Symposium on Dependable Computing (PRDC)), using Natural Language Processing (NLP), namely the Latent Dirichlet Allocation (LDA) algorithm, to identify active, collapsing, ephemeral, and new lines of research in the dependability field. Results show a strong emphasis on terms like ‘security’, despite the general focus of the conferences on dependability, and new trends related to ‘machine learning’ and ‘blockchain’. We used the PRDC conference as a use case, which showed similarity with the overall set of conferences, although we also found specific terms, like ‘cyber-physical’, that are popular at PRDC but not in the overall dataset.
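For readers unfamiliar with LDA, a minimal sketch of this kind of topic extraction is shown below using scikit-learn; the toy documents stand in for the actual conference proceedings, and the number of topics and preprocessing choices are illustrative assumptions rather than the authors' configuration.

```python
# Toy LDA topic extraction, in the spirit of the abstract (not the authors' pipeline).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "fault injection reliability testing of distributed systems",
    "intrusion detection and security monitoring for networks",
    "machine learning models for failure prediction",
    "blockchain consensus protocols and byzantine fault tolerance",
    "software reliability growth models and defect prediction",
]

# Bag-of-words representation of the titles/abstracts.
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

# Fit LDA with a small, illustrative number of topics.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top_terms = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"topic {k}: {top_terms}")
```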

10.
The hard problem of consciousness has been a perennially vexing issue for the study of consciousness, particularly in giving a scientific and naturalized account of phenomenal experience. At the heart of the hard problem is an often-overlooked argument: the structure and dynamics (S&D) argument. In this essay, I will argue that we have good reason to suspect that the S&D argument given by David Chalmers rests on a limited conception of S&D properties, what in this essay I’m calling extrinsic structure and dynamics. I argue that if we take recent insights from the complexity sciences and from recent developments in the Integrated Information Theory (IIT) of Consciousness, we get a more nuanced picture of S&D, specifically a class of properties I’m calling intrinsic structure and dynamics. This, I think, opens the door to a broader class of properties with which we might naturally and scientifically explain phenomenal experience, as well as the relationship between syntactic, semantic, and intrinsic notions of information. I argue that Chalmers’ characterization of structure and dynamics in his S&D argument paints them with too broad a brush and fails to account for important nuances, especially when it comes to accounting for a system’s intrinsic properties. Ultimately, my hope is to vindicate a certain species of explanation from the S&D argument, and by extension dissolve the hard problem of consciousness at its core, by showing that not all structure and dynamics are equal.

11.
Alzheimer’s disease (AD) is characterized by working memory (WM) failures that can be assessed at early stages through clinical tests. Ecological neuroimaging, such as electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS), may be employed during these tests to support AD early diagnosis within clinical settings. Multimodal EEG-fNIRS could measure brain activity along with neurovascular coupling (NC) and detect their modifications associated with AD. Data analysis procedures based on signal complexity are suitable for estimating electrical and hemodynamic brain activity or their mutual information (NC) during non-structured experimental paradigms. In this study, the sample entropy of whole-head EEG and frontal/prefrontal cortex fNIRS was evaluated to assess brain activity in early AD and healthy controls (HC) during WM tasks (i.e., the Rey–Osterrieth complex figure and Raven’s progressive matrices). Moreover, the conditional entropy between EEG and fNIRS was evaluated as indicative of NC. The findings demonstrated the capability of complexity analysis of multimodal EEG-fNIRS to detect WM decline in AD. Furthermore, a multivariate data-driven analysis, performed on these entropy metrics and based on the General Linear Model, allowed AD and HC to be classified with an AUC of up to 0.88. EEG-fNIRS may represent a powerful tool for the clinical evaluation of WM decline in early AD.
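The conditional entropy used above as an NC index can be illustrated with a simple discretised estimator; the binning scheme and the surrogate EEG/fNIRS series below are assumptions made for illustration, not the authors' actual estimator or data.

```python
# Sketch of a histogram-based estimate of H(EEG | fNIRS), in bits.
import numpy as np

def conditional_entropy(x, y, bins=8):
    """H(X | Y) = H(X, Y) - H(Y), estimated from a 2-D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_y = p_xy.sum(axis=0)                      # marginal over the x bins
    h_xy = -np.sum(p_xy[p_xy > 0] * np.log2(p_xy[p_xy > 0]))
    h_y = -np.sum(p_y[p_y > 0] * np.log2(p_y[p_y > 0]))
    return h_xy - h_y

rng = np.random.default_rng(1)
eeg = rng.standard_normal(5000)                      # stand-in EEG feature series
fnirs = 0.6 * eeg + 0.4 * rng.standard_normal(5000)  # coupled stand-in fNIRS series
print(conditional_entropy(eeg, fnirs))
```

Lower conditional entropy here indicates that one signal is more predictable from the other, which is the intuition behind using it as a coupling index.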

12.
Proper peer review and the quality of published articles are often regarded as signs of reliable scientific journals. The aim of this study was to assess whether the quality of statistical reporting and data presentation differs between articles published in ‘predatory dental journals’ and in other dental journals. We evaluated 50 articles published in ‘predatory open access (OA) journals’ and 100 clinical trials published in legitimate dental journals between 2019 and 2020. The quality of statistical reporting and data presentation of each paper was assessed on a scale from 0 (poor) to 10 (high). The mean (SD) quality score of the statistical reporting and data presentation was 2.5 (1.4) for the predatory OA journals, 4.8 (1.8) for the legitimate OA journals, and 5.6 (1.8) for the more visible dental journals. The mean values differed significantly (p < 0.001). The quality of statistical reporting of clinical studies published in predatory journals was thus lower than in legitimate open access and highly cited journals. This difference in quality is a wake-up call to read study results critically. Poor statistical reporting points to a generally lower quality of publications whose authors and journals are less likely to be critiqued through peer review.

13.
Balance impairment is one of the biggest risk factors for falls, which reduce activity and can result in a need for nursing care. Therefore, balance ability is crucial for maintaining the independent daily living activities of older adults. Many tests to assess balance ability have been developed. However, few reports reveal the structure underlying the results of balance performance tests in comparisons of young and older adults. Covariance structure analysis is a tool used to test statistically whether a factorial structure fits the data. This study examined aging effects on the factorial structure underlying balance performance tests. Participants comprised 60 healthy young women aged 22 ± 3 years (young group) and 60 community-dwelling older women aged 69 ± 5 years (older group). Six balance tests were employed: postural sway, one-leg standing, functional reach, timed up and go (TUG), gait, and the EquiTest. Exploratory factor analysis revealed three clearly interpretable factors in the young group. The first factor had high loadings on the EquiTest and was interpreted as ‘Reactive’. The second factor had high loadings on the postural sway test and was interpreted as ‘Static’. The third factor had high loadings on TUG and the gait test and was interpreted as ‘Dynamic’. Similarly, three interpretable factors were extracted in the older group. The first factor had high loadings on the postural sway test and the EquiTest and was therefore interpreted as ‘Static and Reactive’. The second factor, which had high loadings on the EquiTest, was interpreted as ‘Reactive’. The third factor, which had high loadings on TUG and the gait test, was interpreted as ‘Dynamic’. A covariance structure model was applied to the test data: the second-order factor was balance ability, and the first-order factors were static, dynamic and reactive factors, which were assumed to be measured by the six balance tests. The goodness-of-fit indices (GFI) of the models were acceptable (young group, GFI = 0.931; older group, GFI = 0.923). The static, dynamic and reactive factors had loadings on balance ability of 0.21, 0.24, and 0.76 in the young group and 0.71, 0.28, and 0.43 in the older group, respectively. It is suggested that the common factorial structure of balance ability comprises static, dynamic and reactive factors, and that balance ability is characterized mainly by the reactive factor in young people, whereas in older people it is characterized mainly by the static factor.
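As a rough illustration of the exploratory-factor-analysis step described above, here is a minimal sketch using simulated scores for six balance tests; the data, loadings and three-factor structure are invented for illustration and are not the study's data, rotation method or structural model.

```python
# Toy exploratory factor analysis on simulated balance-test scores.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 60                                   # participants per group in the study
latent = rng.standard_normal((n, 3))     # illustrative 'static', 'dynamic', 'reactive'

# Six observed tests, each loading mainly on one illustrative latent factor.
loadings = np.array([
    [0.8, 0.1, 0.1],   # postural sway
    [0.7, 0.2, 0.1],   # one-leg standing
    [0.2, 0.7, 0.1],   # functional reach
    [0.1, 0.8, 0.1],   # timed up and go
    [0.1, 0.7, 0.2],   # gait
    [0.1, 0.1, 0.8],   # EquiTest
])
scores = latent @ loadings.T + 0.3 * rng.standard_normal((n, 6))

fa = FactorAnalysis(n_components=3, random_state=0)
fa.fit(scores)
print(np.round(fa.components_, 2))       # estimated loadings (factors x tests)
```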

14.
In 1976, we reported our first autopsied case of diffuse Lewy body disease (DLBD), a term we proposed in 1984. We also proposed the term “Lewy body disease” (LBD) in 1980. Subsequently, we classified LBD into three types according to the distribution pattern of Lewy bodies: a brain stem type, a transitional type and a diffuse type. Later, we added the cerebral type. As we have proposed since 1980, LBD has recently been used as a generic term to include Parkinson’s disease (PD), Parkinson’s disease with dementia (PDD) and dementia with Lewy bodies (DLB), which was proposed in 1996 on the basis of our reports of DLBD. DLB is now known to be the second most frequent dementia after Alzheimer’s disease (AD). In this paper, we introduce our studies of DLBD and LBD.

15.
Neurofeedback training (NFT) has shown promising results in recent years as a tool to address the effects of age-related cognitive decline in the elderly. Since previous studies have linked reduced complexity of the electroencephalography (EEG) signal to the process of cognitive decline, we propose the use of non-linear methods to characterise changes in EEG complexity induced by NFT. In this study, we analyse the pre- and post-training EEG from 11 elderly subjects who performed an NFT based on motor imagery (MI–NFT). Spectral changes were studied using the relative power (RP) of classical frequency bands (delta, theta, alpha, and beta), whilst multiscale entropy (MSE) was applied to assess changes in EEG complexity. Furthermore, we analysed the subjects’ scores from Luria tests performed before and after MI–NFT. We found that MI–NFT induced a power shift towards faster frequencies, as well as an increase of EEG complexity in all channels except C3. These improvements were most evident in frontal channels. Moreover, results from the cognitive tests showed significant enhancement in intellectual and memory functions. Therefore, our findings suggest the usefulness of MI–NFT for improving cognitive functions in the elderly and encourage future studies to use MSE as a metric to characterise EEG changes induced by MI–NFT.
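To illustrate the relative-power part of the above analysis, here is a minimal sketch of band-wise relative power computed from a Welch periodogram; the sampling rate, band edges and the surrogate signal are assumptions, not the study's recording parameters.

```python
# Relative power per classical EEG band from a Welch power spectral density.
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def relative_power(x, fs=256.0):
    """Fraction of 1-30 Hz power falling in each classical band."""
    freqs, psd = welch(x, fs=fs, nperseg=int(2 * fs))
    in_range = (freqs >= 1) & (freqs <= 30)
    total = psd[in_range].sum()
    rp = {}
    for band, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        rp[band] = psd[mask].sum() / total   # uniform bins, so sums act as integrals
    return rp

rng = np.random.default_rng(2)
eeg_like = rng.standard_normal(256 * 60)     # one minute of surrogate EEG at 256 Hz
print(relative_power(eeg_like))
```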

16.
Alzheimer’s disease (AD) is a neurodegenerative disorder which has become a major social problem. The main objective of this study was to evaluate the alterations that dementia due to AD elicits in the distribution of functional network weights. Functional connectivity networks were obtained using the orthogonalized Amplitude Envelope Correlation (AEC), computed from source-reconstructed resting-state electroencephalographic (EEG) data in a population formed by 45 cognitively healthy elderly controls, 69 patients with mild cognitive impairment (MCI) and 81 AD patients. Our results indicated that AD induces a progressive alteration of the network weight distribution; specifically, the Shannon entropy (SE) of the weight distribution showed statistically significant between-group differences (p < 0.05, Kruskal-Wallis test, False Discovery Rate corrected). Furthermore, an in-depth analysis of the network weight distributions was performed in the delta, alpha, and beta-1 frequency bands to identify the weight ranges showing statistical differences in SE. Our results showed that lower and higher weights were more affected by the disease, whereas mid-range connections remained unchanged. These findings support the importance of performing detailed analyses of the network weight distribution to further understand the impact of AD progression on functional brain activity.
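A minimal sketch of a Shannon-entropy measure on a network's weight distribution is given below, with a random symmetric matrix standing in for the AEC connectivity used in the study; the number of sources, the bin count and the weight range are illustrative assumptions.

```python
# Shannon entropy (bits) of the histogram of a connectivity matrix's weights.
import numpy as np

def weight_distribution_entropy(conn, bins=32):
    """Entropy of the distribution of off-diagonal connection weights."""
    weights = conn[np.triu_indices_from(conn, k=1)]   # upper triangle, no diagonal
    hist, _ = np.histogram(weights, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(3)
n_sources = 68                                    # illustrative number of sources
conn = rng.beta(2, 5, size=(n_sources, n_sources))
conn = (conn + conn.T) / 2                        # symmetric, like an AEC matrix
print(weight_distribution_entropy(conn))
```

A narrower, more peaked weight distribution yields lower entropy, which is the kind of change this measure is meant to pick up across groups.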

17.
This paper is devoted to the foundational problems of dendrogramic holographic theory (DH theory). We used the ontic–epistemic (implicate–explicate order) methodology. The epistemic counterpart is based on the representation of data by dendrograms constructed with hierarchic clustering algorithms. The ontic universe is described as a p-adic tree; it is zero-dimensional, totally disconnected, disordered, and bounded (in p-adic ultrametric spaces). Classical–quantum interrelations lose their sharpness; generally, simple dendrograms are “more quantum” than complex ones. We used the CHSH inequality as a measure of quantum-likeness. We demonstrate that it can be violated by classical experimental data represented by dendrograms. The seed of this violation is neither nonlocality nor a rejection of realism, but the nonergodicity of dendrogramic time series. Generally, the violation of ergodicity is one of the basic features of DH theory. The dendrogramic representation leads to a local realistic model that violates the CHSH inequality. We also considered DH theory for Minkowski geometry and monitored the dependence of CHSH violation and nonergodicity on geometry, as well as on a Lorentz transformation of the data.
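For reference, the CHSH combination used above as a quantum-likeness measure is S = E(a1,b1) + E(a1,b2) + E(a2,b1) - E(a2,b2), with |S| <= 2 for classical local realistic (ergodic) data; below is a minimal sketch that evaluates it on synthetic ±1 outcome series, which are stand-ins rather than dendrogram-derived data.

```python
# Evaluate the CHSH combination S on synthetic +/-1 outcome series.
import numpy as np

def correlation(a, b):
    """E(a, b) for two +/-1-valued outcome series of equal length."""
    return float(np.mean(a * b))

def chsh(a1, a2, b1, b2):
    """S = E(a1,b1) + E(a1,b2) + E(a2,b1) - E(a2,b2); |S| <= 2 for local realistic data."""
    return (correlation(a1, b1) + correlation(a1, b2)
            + correlation(a2, b1) - correlation(a2, b2))

rng = np.random.default_rng(4)
n = 10_000
a1, a2, b1, b2 = (rng.choice([-1, 1], size=n) for _ in range(4))
print(chsh(a1, a2, b1, b2))   # near 0 for independent random outcomes
```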

18.
At the basis of the problem of explaining non-local quantum correlations lies the tension between two factors: on the one hand, the natural interpretation of correlations as the manifestation of a causal relation; on the other, the resistance on the part of the physics underlying said correlations to adjusting to the most essential features of a pre-theoretic notion of causation. In this paper, I argue for the rejection of the first horn of the dilemma, i.e., the assumption that quantum correlations call for a causal explanation. The paper is divided into two parts. The first, destructive, part provides a critical overview of the enterprise of causally interpreting non-local quantum correlations, with the aim of warning against the temptation of an account of causation claiming to cover such correlations ‘for free’. The second, constructive, part introduces the so-called structural explanation (a variety of non-causal explanation that shows how the explanandum is the manifestation of a fundamental structure of the world) and argues that quantum correlations might be explained structurally in the context of an information-theoretic approach to quantum theory.

19.
The conversion of what has been interpreted as “normal brain aging” to Alzheimer’s disease (AD) via transition states, i.e., preclinical AD and mild cognitive impairment, appears to be a continuous process caused primarily by the aging-dependent accumulation of amyloid β peptide (Aβ) in the brain. This notion, however, gives us hope that, by manipulating the Aβ levels in the brain, we may be able not only to prevent and cure the disease but also to partially control some very significant aspects of brain aging. Aβ is constantly produced from its precursor and immediately catabolized under normal conditions, whereas dysmetabolism of Aβ seems to lead to pathological deposition upon aging. We have focused our attention on elucidating the unresolved mechanism of Aβ catabolism in the brain. In this review, I describe a new approach to preventing AD development by reducing Aβ burdens in aging brains through up-regulation of the catabolic mechanism involving neprilysin, which can degrade both monomeric and oligomeric forms of Aβ. The strategy of combining presymptomatic diagnosis with preventive medicine seems to be the most pragmatic in both medical and socioeconomic terms.

20.
Hepatic stellate cells (HSCs) are vitamin A-storing, collagen-producing cells in hepatic lobules. The three-dimensional structure of HSCs has been demonstrated with the Golgi method, the maceration method for scanning electron microscopy, and confocal laser scanning microscopy. Many thorn-like microprojections, or spines, extend from the subendothelial processes and make contacts with hepatocytes. One HSC entwines two or more sinusoids and about 20–40 hepatocytes to create a cellular unit, ‘the stellate cell unit’ or ‘stellon’. The space of Disse is defined as the space between the stellate cell–endothelial cell complex and the hepatocytes. Intralobular heterogeneity of HSCs is assessed. HSCs develop from mesenchymal cells in the septum transversum. The developmental process of HSCs is partly reproduced in culture. In the lamprey, abundant vitamin A is stored not only in HSCs but also in fibroblast-like cells in various other splanchnic organs. In vertebrates, the existence of both a conventional fibroblast system in somatic tissues and a vitamin A-storing cell system in splanchnic organs is suggested.
