Similar Documents
Found 20 similar documents (search time: 15 ms)
1.
I will argue that, in an interdisciplinary study of consciousness, epistemic structural realism (ESR) can offer a feasible philosophical background for the study of consciousness and its associated neurophysiological phenomena in neuroscience and cognitive science, while also taking into account the mathematical structures involved in this type of research. Applying the ESR principles to the study of the neurophysiological phenomena associated with free will (or rather conscious free choice) and with various alterations of consciousness (AOCs) generated by pathologies such as epilepsy would add explanatory value to the matter. This interdisciplinary approach is in tune with Quine’s well-known idea that philosophy is not simple conceptual analysis but is continuous with science and actually represents an abstract branch of empirical research. The ESR could thus resonate with scientific models of consciousness such as the global neuronal workspace model (inspired by the global workspace theory, GWT) and the integrated information theory (IIT) model. While structural realism has already been employed in physics and biology, its application as a meta-theory contextualising and relating various scientific findings on consciousness is new. Of the two variants, ontic structural realism (OSR) and epistemic structural realism (ESR), the latter is more suitable for the study of consciousness and its associated neurophysiological phenomena because it removes the pressure of the still unanswered ontological question ‘What is consciousness?’ and allows us to concentrate instead on the epistemological question ‘What can we know about consciousness?’

2.
IIT includes commitments about the very nature of physical reality, a fact both highly unusual for an empirical theory within neuroscience and surprisingly underappreciated within the literature. These commitments are intimately tied to the theory; they are not incidental. This paper demonstrates as much by raising certain objections in a “naive” way and then exposing how the principled IIT responses rely upon metaphysical positions. Along the way we draw on the IIT literature for support for these interpretations, but also point to a need for elaboration and clarification. Section 1 applies the Placement Argument in a way that leads to a problem involving zombies, treated in Section 2. Section 3 frames the zombie problem as an apparent dilemma and addresses that dilemma by drawing on claims in the IIT literature concerning physical reality. Section 4 raises a related dilemma and treats it in a way that dovetails with the treatment of physical reality in Section 3. All of this underscores not just the breadth of IIT, but the relevance of this breadth to a full consideration of IIT’s merits.

3.
How, if at all, consciousness can be part of the physical universe remains a baffling problem. This article outlines a new, developing philosophical theory of how it could do so, and offers a preliminary mathematical formulation of a physical grounding for key aspects of the theory. Because the philosophical side has radical elements, so does the physical-theory side. The philosophical side is radical, first, in proposing that the productivity or dynamism in the universe that many believe to be responsible for its systematic regularities is actually itself a physical constituent of the universe, along with more familiar entities. Indeed, it proposes that instances of dynamism can themselves take part in physical interactions with other entities, this interaction then being “meta-dynamism” (a type of meta-causation). Secondly, the theory is radical, and unique, in arguing that consciousness is necessarily partly constituted of meta-dynamic auto-sensitivity; in other words, it must react via meta-dynamism to its own dynamism. It also conjectures that some specific form of this sensitivity is sufficient for, and indeed constitutive of, consciousness. The article proposes a way for physical laws to be modified to accommodate meta-dynamism, via the radical step of including elements that explicitly refer to dynamism itself. Additionally, laws become explicitly temporally non-local, referring directly to quantity values that hold at times prior to a given instant of application of the law. The approach therefore implicitly brings in considerations about what information determines states. Because of the temporal non-locality, and also because of the deep connections between dynamism and time-flow, the approach also implicitly connects to the topic of entropy insofar as entropy is related to time.

4.
Entropy is a concept that emerged in the 19th century, where it was associated with the heat harnessed by a thermal machine to perform work during the Industrial Revolution. The 20th century, however, saw an unprecedented scientific revolution driven by one of its most essential innovations, information theory, which also encompasses a concept of entropy. A question is therefore naturally raised: what is the difference, if any, between the concepts of entropy in each field of knowledge? Misconceptions persist, as there have been multiple attempts to reconcile the entropy of thermodynamics with that of information theory. Entropy is most commonly defined as “disorder”, but this is a poor analogy, since “order” is a subjective human concept and “disorder” cannot always be obtained from entropy. This paper therefore presents a historical background on the evolution of the term “entropy”, and provides mathematical evidence and logical arguments regarding its interconnection in various scientific areas, with the objective of providing a theoretical review and reference material for a broad audience.
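The formal parallel between the two notions of entropy can be made concrete in a few lines. The sketch below is my own illustration, not drawn from the paper: it computes the Shannon entropy of a discrete distribution and shows that, for W equiprobable outcomes, it collapses to the same logarithmic form as Boltzmann’s S = k_B ln W.

```python
import math

def shannon_entropy(probs, base=2.0):
    """Shannon entropy H = -sum_i p_i log_b(p_i); bits for base 2."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries exactly one bit of information-theoretic entropy.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# For W equiprobable outcomes, H reduces to log_b(W) -- the same
# logarithmic form as Boltzmann's S = k_B ln W. This is the formal
# (mathematical) link between the two entropies, not a claim that
# they are physically the same quantity.
W = 8
print(shannon_entropy([1.0 / W] * W))  # log2(8) = 3 bits
```

The shared logarithmic form is precisely why conflating the two concepts is tempting, and why the paper insists the identification needs argument rather than analogy.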

5.
The hard problem of consciousness has been a perennially vexing issue for the study of consciousness, particularly in giving a scientific and naturalized account of phenomenal experience. At the heart of the hard problem lies an often-overlooked argument: the structure and dynamics (S&D) argument. In this essay, I argue that we have good reason to suspect that the S&D argument given by David Chalmers rests on a limited conception of S&D properties, what I call extrinsic structure and dynamics. I argue that if we take recent insights from the complexity sciences and from recent developments in the Integrated Information Theory (IIT) of consciousness, we get a more nuanced picture of S&D, specifically a class of properties I call intrinsic structure and dynamics. This opens the door to a broader class of properties with which we might naturally and scientifically explain phenomenal experience, as well as the relationship between syntactic, semantic, and intrinsic notions of information. I argue that Chalmers’ characterization of structure and dynamics in his S&D argument paints them with too broad a brush and fails to account for important nuances, especially when accounting for a system’s intrinsic properties. Ultimately, my hope is to vindicate a certain species of explanation from the S&D argument, and by extension dissolve the hard problem of consciousness at its core, by showing that not all structure and dynamics are equal.

6.
In this paper, I investigate a connection between a common characterisation of freedom and how uncertainty is managed in a Bayesian hierarchical model. To do this, I consider a distributed factorization of a group’s optimization of free energy, in which each agent is attempting to align with the group and with its own model. I show how this can lead to equilibria for groups, defined by the capacity of the model being used, essentially how many different datasets it can handle. In particular, I show that there is a “sweet spot” in the capacity of a normal model in each agent’s decentralized optimization, and that this “sweet spot” corresponds to minimal free energy for the group. At the sweet spot, an agent can predict what the group will do and the group is not surprised by the agent. However, there is an asymmetry. A higher capacity model for an agent makes it harder for the individual to learn, as there are more parameters. Simultaneously, a higher capacity model for the group, implemented as a higher capacity model for each member agent, makes it easier for a group to integrate a new member. To optimize for a group of agents then requires one to make a trade-off in capacity, as each individual agent seeks to decrease capacity, but there is pressure from the group to increase capacity of all members. This pressure exists because as individual agent’s capacities are reduced, so too are their abilities to model other agents, and thereby to establish pro-social behavioural patterns. I then consider a basic two-level (dual process) Bayesian model of social reasoning and a set of three parameters of capacity that are required to implement such a model. Considering these three capacities as dependent elements in a free energy minimization for a group leads to a “sweet surface” in a three-dimensional space defining the triplet of parameters that each agent must use should they hope to minimize free energy as a group. 
Finally, I relate these three parameters to three notions of freedom and equality in human social organization, and postulate a correspondence between freedom and model capacity: models with higher capacity have more freedom, as they can interact with more datasets.

7.
The Integrated Information Theory (IIT) of consciousness starts from essential phenomenological properties, which are then translated into postulates that any physical system must satisfy in order to specify the physical substrate of consciousness. We recently introduced an information measure (Barbosa et al., 2020) that captures three postulates of IIT—existence, intrinsicality, and information—and is unique in doing so. Here we show that the new measure also satisfies the remaining postulates of IIT—integration and exclusion—and we create the framework that identifies maximally irreducible mechanisms. These mechanisms can then form maximally irreducible systems, which in turn specify the physical substrate of conscious experience.

8.
We consider state changes in quantum theory due to “conditional action” and relate these to the discussion of entropy decrease due to interventions of “intelligent beings” and the principles of Szilard and Landauer/Bennett. The mathematical theory of conditional actions is a special case of the theory of “instruments”, which describes changes of state due to general measurements and is therefore briefly outlined in the present paper. As a detailed example, we consider the imperfect erasure of a qubit, which can also be viewed as a conditional action and is realized by coupling a spin to another small spin system in its ground state.
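The entropy bookkeeping behind the Landauer principle invoked here can be illustrated numerically. The sketch below is my own illustration, not the paper’s spin-coupling model: erasing a maximally mixed qubit to its ground state removes ln 2 nats of von Neumann entropy (the amount Landauer’s principle prices at k_B T ln 2 of dissipated heat), while an imperfect erasure, which leaves a residual excited-state population eps, removes less.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), in nats, via the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # 0 * log(0) contributes nothing
    return float(-np.sum(evals * np.log(evals)))

# Maximally mixed qubit: the state before erasure.
rho_mixed = np.eye(2) / 2
# Ground state |0><0|: the target of perfect erasure.
rho_ground = np.array([[1.0, 0.0], [0.0, 0.0]])

# Perfect erasure removes ln 2 nats of entropy.
delta_S = von_neumann_entropy(rho_mixed) - von_neumann_entropy(rho_ground)
print(delta_S)  # ln 2 = 0.693...

# Imperfect erasure leaves a residual population eps in the excited
# state, so strictly less entropy is removed.
eps = 0.05
rho_imperfect = np.array([[1.0 - eps, 0.0], [0.0, eps]])
print(von_neumann_entropy(rho_mixed) - von_neumann_entropy(rho_imperfect))
```

The value of eps here is arbitrary; in the paper the residual error arises from the finite coupling to the auxiliary spin system.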

9.
In this work, a framework to boost the efficiency of Bayesian inference in probabilistic models is introduced by embedding a Markov chain sampler within a variational posterior approximation. We call this framework “refined variational approximation”. Its strengths are its ease of implementation and the automatic tuning of sampler parameters, leading to a faster mixing time through automatic differentiation. Several strategies to approximate the evidence lower bound (ELBO) computation are also introduced. The framework’s efficient performance is showcased experimentally using state-space models for time-series data, a variational autoencoder for density estimation, and a conditional variational autoencoder as a deep Bayes classifier.
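The ELBO estimation at the heart of such frameworks can be illustrated on a toy conjugate model where the exact evidence is available in closed form. This sketch is my own and shows only plain Monte Carlo ELBO estimation, not the paper’s refined (sampler-embedded) approximation; the model, variable names, and sample counts are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: z ~ N(0, 1), x | z ~ N(z, 1), with a single observation x.
# Marginalizing z gives x ~ N(0, 2), so the exact log evidence is known
# and the tightness of the ELBO can be checked directly.
x = 1.5

def log_norm(v, mean, var):
    """Log density of N(mean, var) at v (elementwise)."""
    return -0.5 * (np.log(2 * np.pi * var) + (v - mean) ** 2 / var)

def elbo(q_mean, q_var, n_samples=100_000):
    """Monte Carlo estimate of E_q[log p(x, z) - log q(z)]."""
    z = q_mean + np.sqrt(q_var) * rng.standard_normal(n_samples)
    log_joint = log_norm(x, z, 1.0) + log_norm(z, 0.0, 1.0)
    return float(np.mean(log_joint - log_norm(z, q_mean, q_var)))

log_evidence = float(log_norm(x, 0.0, 2.0))
# The exact posterior is N(x/2, 1/2); with q equal to it, the ELBO is
# tight: every sample contributes exactly the log evidence.
print(log_evidence, elbo(0.75, 0.5))
# Any other q pays a KL(q || posterior) penalty: strictly smaller ELBO.
print(elbo(0.0, 1.0))
```

Embedding an MCMC step inside q, as the paper proposes, is a way of shrinking exactly this KL gap without hand-designing a richer variational family.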

10.
Integrated information theory (IIT) provides a mathematical framework to characterize the cause-effect structure of a physical system and its amount of integrated information (Φ). An accompanying Python software package (“PyPhi”) was recently introduced to implement this framework for the causal analysis of discrete dynamical systems of binary elements. Here, we present an update to PyPhi that extends its applicability to systems constituted of discrete, but multi-valued elements. This allows us to analyze and compare general causal properties of random networks made up of binary, ternary, quaternary, and mixed nodes. Moreover, we apply the developed tools for causal analysis to a simple non-binary regulatory network model (p53-Mdm2) and discuss commonly used binarization methods in light of their capacity to preserve the causal structure of the original system with multi-valued elements.

11.
This article considers a partly philosophical question: What are the ontological and epistemological reasons for using quantum-like models or theories (models and theories based on the mathematical formalism of quantum theory) vs. classical-like ones (based on the mathematics of classical physics) in considering human thinking and decision making? This question is only partly philosophical because it also concerns the scientific understanding of the phenomena considered by the theories that use mathematical models of either type, just as in physics itself, where this question also arises as a physical question. There the question is in effect: What are the physical reasons for using, even if not requiring, these types of theories in considering quantum phenomena, which these theories predict fully in accord with experiment? This is clearly a physical, rather than only philosophical, question, and so, accordingly, is the question of whether one needs classical-like or quantum-like theories, or both (just as in physics we use both classical and quantum theories), in considering human thinking in psychology and related fields, such as decision science. It comes as no surprise that many of these reasons parallel those responsible for the use of QM and QFT in the case of quantum phenomena. Still, the corresponding situations should be understood and justified in terms of the phenomena considered, phenomena defined by human thinking, because there are important differences between these phenomena and quantum phenomena, which this article aims to address. In order to do so, this article first considers quantum phenomena and quantum theory before turning to human thinking and decision making, in addressing which it also discusses two recent quantum-like approaches to human thinking, that of M. G. D’Ariano and F. Faggin and that of A. Khrennikov.
Both approaches are ontological in the sense of offering representations, different in character in each approach, of human thinking by the formalism of quantum theory. Whether such a representation, as opposed to a mere prediction of the outcomes of relevant experiments, is possible either in quantum theory or in quantum-like theories of human thinking is one of the questions addressed in this article. The philosophical position adopted here is that it may not be possible to make this assumption, which, however, is not the same as saying that it is impossible. I designate this view as the reality-without-realism (RWR) view and, in considering strictly mental processes, as the ideality-without-idealism (IWI) view, in the second case in part following, but also moving beyond, I. Kant’s philosophy.

12.
A case is made for the project, initiated and promoted by David Deutsch, of excising confusion and obfuscation from contemporary quantum theory. It is argued that at least some theoretical entities conventionally labelled as “interpretations” of quantum mechanics are in fact full-blooded physical theories in their own right, and as such are falsifiable, at least in principle. The most pertinent case is that of the so-called “Many-Worlds Interpretation” (MWI) of Everett and others. This set of ideas differs from other “interpretations” in that it does not accept the reality of the collapse of Schrödinger’s wavefunction. A survey is given of several important proposals, appearing from time to time in the literature, for discriminating between quantum theories with and without wavefunction collapse, and the possibilities are discussed in the framework of a wider taxonomy.

13.
14.
15.
16.
“No free lunch” results state the impossibility of obtaining meaningful bounds on the error of a learning algorithm without prior assumptions and modelling that are more or less realistic for a given problem. Some models are “expensive” (strong assumptions, such as sub-Gaussian tails), others are “cheap” (simply finite variance). As is well known, the more you pay, the more you get: in other words, the most expensive models yield the most interesting bounds. Recent advances in robust statistics have investigated procedures to obtain tight bounds while keeping the cost of assumptions minimal. The present paper explores and exhibits the limits on obtaining tight probably approximately correct (PAC)-Bayes bounds in a robust setting for cheap models.
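An emblematic procedure from the robust-statistics line of work referred to here is the median-of-means estimator, which needs only the “cheap” finite-variance assumption yet achieves sub-Gaussian-style deviation bounds. The sketch below is illustrative and not taken from the paper; the block count and the heavy-tailed test distribution are my own choices.

```python
import numpy as np

def median_of_means(x, n_blocks=10):
    """Median-of-means: split the sample into blocks, average each
    block, and return the median of the block means. Outlier-heavy
    blocks are voted down by the median, so only finite variance is
    needed for tight deviation bounds."""
    blocks = np.array_split(np.asarray(x), n_blocks)
    return float(np.median([b.mean() for b in blocks]))

rng = np.random.default_rng(0)
# Heavy-tailed sample: Student-t with 2.5 degrees of freedom has
# finite variance but no finite higher moments; its true mean is 0.
sample = rng.standard_t(2.5, size=10_000)
print(abs(float(sample.mean())), abs(median_of_means(sample)))
```

The empirical mean remains consistent here, but its deviation guarantees under such tails are much weaker than those of median-of-means, which is the gap the “cheap models” framing targets.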

17.
Causal Geometry     
Information geometry has offered a way to formally study the efficacy of scientific models by quantifying the impact of model parameters on the predicted effects. However, there has been little formal investigation of causation in this framework, despite causal models being a fundamental part of science and explanation. Here, we introduce causal geometry, which formalizes not only how outcomes are impacted by parameters, but also how the parameters of a model can be intervened upon. To this end, we introduce a geometric version of “effective information”—a known measure of the informativeness of a causal relationship. We show that it is given by the matching between the space of effects and the space of interventions, in the form of their geometric congruence. Therefore, given a fixed intervention capability, an effective causal model is one that is well matched to those interventions. This is a consequence of “causal emergence,” wherein macroscopic causal relationships may carry more information than “fundamental” microscopic ones. We thus argue that a coarse-grained model may, paradoxically, be more informative than the microscopic one, especially when it better matches the scale of accessible interventions—as we illustrate on toy examples.
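The non-geometric ancestor of this measure is easy to state: for a discrete system with a row-stochastic transition probability matrix, effective information is the mutual information between a maximum-entropy (uniform) intervention on the current state and the resulting effect distribution. The sketch below is my own, following that standard definition rather than the paper’s geometric version.

```python
import numpy as np

def kl_bits(p, q):
    """KL divergence D(p || q) in bits; assumes q > 0 wherever p > 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def effective_information(tpm):
    """EI of a row-stochastic transition matrix: the average KL
    between each row (the effect of intervening to set one state)
    and the mean effect of a uniform intervention over all states."""
    p_effect = tpm.mean(axis=0)  # effect distribution of a uniform do()
    return sum(kl_bits(row, p_effect) for row in tpm) / tpm.shape[0]

# A deterministic permutation of 4 states is maximally informative:
# every intervention has a distinct, certain effect, so EI = log2(4).
perm = np.eye(4)[[1, 2, 3, 0]]
print(effective_information(perm))  # 2.0 bits

# A fully noisy system, where every state leads to the same uniform
# distribution, has EI = 0: interventions reveal nothing.
noisy = np.full((4, 4), 0.25)
print(effective_information(noisy))  # 0.0 bits
```

Causal emergence, in these terms, is the observation that a coarse-grained transition matrix can score higher EI than the micro-level matrix it summarizes.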

18.
The use of chaotic systems in electronics, such as in Pseudo-Random Number Generators (PRNGs), is very appealing. Among them, continuous-time systems are used less because, in addition to having strong temporal correlations, they require further computation to obtain the discrete solutions. Here, the selection of the time step and the discretization method are first studied by conducting a detailed analysis of their effect on the systems’ statistical and chaotic behavior. We employ an approach based on interpreting the time step as a parameter of the new “maps”. From our analysis, it follows that two goals must be achieved to use them as PRNGs: (i) preserving the chaotic oscillation, and (ii) destroying the inner and temporal correlations. We then propose a simple methodology to achieve chaos-based PRNGs with good statistical characteristics and high throughput, which can be applied to any continuous-time chaotic system. We analyze the generated sequences by means of quantifiers based on information theory (permutation entropy, permutation complexity, and the causal entropy × complexity plane). We show that the proposed PRNG generates sequences that successfully pass the Marsaglia Diehard and NIST (National Institute of Standards and Technology) tests. Finally, we show that its hardware implementation requires very few resources.
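The first half of this pipeline can be sketched in a few lines: discretize a continuous-time chaotic system (here the Lorenz system under explicit Euler, with the time step treated as a parameter of the resulting map) and score the output with normalized permutation entropy, one of the quantifiers named above. Everything below is an illustrative reconstruction under assumed parameter values (dt, initial condition, series length), not the authors’ implementation, and it stops before the correlation-destroying post-processing the paper adds.

```python
import numpy as np
from math import log, factorial

def permutation_entropy(x, order=3):
    """Normalized Bandt-Pompe permutation entropy: 1 for white noise,
    low for strongly correlated (e.g. smooth) sequences."""
    counts = {}
    for i in range(len(x) - order + 1):
        pattern = tuple(np.argsort(x[i:i + order]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum(c / total * log(c / total) for c in counts.values())
    return h / log(factorial(order))

def lorenz_euler(dt, n, x0=(1.0, 1.0, 1.0),
                 sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Explicit-Euler discretization of the Lorenz system; the time
    step dt acts as a parameter of the resulting discrete map."""
    x, y, z = x0
    xs = np.empty(n)
    for i in range(n):
        x, y, z = (x + dt * sigma * (y - x),
                   y + dt * (x * (rho - z) - y),
                   z + dt * (x * y - beta * z))
        xs[i] = x
    return xs

# The raw chaotic orbit is smooth at this sampling scale, so its
# strong temporal correlations show up as a permutation entropy far
# below that of an i.i.d. reference sequence.
orbit = lorenz_euler(dt=0.01, n=5000)
noise = np.random.default_rng(1).standard_normal(5000)
print(permutation_entropy(orbit), permutation_entropy(noise))
```

This gap between the orbit’s entropy and the i.i.d. ceiling is exactly what the paper’s goal (ii), destroying inner and temporal correlations, must close before the sequence can serve as a PRNG.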

19.
This article reconsiders the concept of physical reality in quantum theory and the concept of quantum measurement, following Bohr, whose analysis of quantum measurement led him to his concept of a (quantum) “phenomenon,” referring to “the observations obtained under the specified circumstances,” in the interaction between quantum objects and measuring instruments. This situation makes the terms “observation” and “measurement,” as conventionally understood, inapplicable. These terms are remnants of classical physics, or of still earlier history from which classical physics inherited them. As defined here, a quantum measurement does not measure any preexisting property of the ultimate constitution of the reality responsible for quantum phenomena. An act of measurement establishes a quantum phenomenon by an interaction between the instrument and the quantum object, or, in the present view, the ultimate constitution of the reality responsible for quantum phenomena and, at the time of measurement, also quantum objects. In the view advanced in this article, in contrast to that of Bohr, quantum objects, such as electrons or photons, are assumed to exist only at the time of measurement and not independently, a view that redefines the concept of quantum object as well. This redefinition becomes especially important in high-energy quantum regimes and quantum field theory, and allows this article to define a new concept of quantum field. The article also considers, now following Bohr, the quantum measurement as the entanglement between quantum objects and measurement instruments. The argument of the article is grounded in the concept “reality without realism” (RWR), as underlying quantum measurement thus understood, and the view, the RWR view, of quantum theory defined by this concept.
The RWR view places a stratum of physical reality thus designated, here the reality ultimately responsible for quantum phenomena, beyond representation or knowledge, or even conception, and defines the corresponding set of interpretations of quantum mechanics or quantum field theory, such as the one assumed in this article, in which, again, not only quantum phenomena but also quantum objects are idealizations defined by measurement. As such, the article also offers a broadly conceived response to J. Bell’s argument “against ‘measurement’”.

20.
Throughout my research life, I have had the experience of discovering the causes of several neurological diseases in Japan.
  1. SMON (subacute myelo-optico-neuropathy). From the early 1960s, a peculiar neurological disease became prevalent throughout Japan. Through chemical analysis of the green urine characteristic of this disease, it was found that the disease was caused by intoxication with the administered clioquinol, an anti-diarrheal drug. This discovery is a major episode in the history of Japanese medicine.
  2. In the early 1970s, I saw many young patients with oedema and polyneuropathy in Kagoshima. It was eventually found that the disease was the long-forgotten beriberi, which had disappeared several decades earlier. We must remain aware of beriberi even now, as long as we eat well-polished rice.
  3. In 1972, we noticed a group of sporadic paraparesis cases in Kagoshima, which 20 years later was confirmed to be induced by human T lymphotropic virus type-I (HTLV-I). We named this disease “HTLV-I associated myelopathy” (HAM). It made a strong impact by showing that the causative virus of adult T cell leukemia (ATL) can induce an entirely different disease, in terms of both the clinical course and the pathological features. It was also proven that HAM is identical with tropical spastic paraparesis (TSP), which had been prevalent in many tropical areas.
These experiences are good examples of our slogan: “keep in mind to send messages of scientific progress from the local area to the international stage”.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号