Similar Articles
20 similar articles found (search took 78 ms)
1.
To take into account the temporal dimension of uncertainty in stock markets, this paper introduces a cross-sectional estimation of stock market volatility based on the intrinsic entropy model. The proposed cross-sectional intrinsic entropy (CSIE) is defined and computed as a daily volatility estimate for the entire market, grounded in the daily traded prices—open, high, low, and close (OHLC)—along with the daily traded volume for all symbols listed on the New York Stock Exchange (NYSE) and the National Association of Securities Dealers Automated Quotations (NASDAQ). We perform a comparative analysis between the time series obtained from the CSIE and the historical volatility provided by the close-to-close, Parkinson, Garman–Klass, Rogers–Satchell, Yang–Zhang, and intrinsic entropy (IE) estimators, computed from historical OHLC daily prices of the Standard & Poor's 500 index (S&P 500), the Dow Jones Industrial Average (DJIA), and the NASDAQ Composite index, respectively, for various time intervals. Our study covers approximately 6000 trading days, from 1 January 2001 to 23 January 2022, for both the NYSE and the NASDAQ. We found that the CSIE market volatility estimator is consistently at least 10 times more sensitive to market changes than the volatility estimates captured through the market indices. Furthermore, beta values confirm a consistently lower volatility risk for the market indices overall, between 50% and 90% lower, than the volatility risk of the entire market across various time intervals and rolling windows.
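For orientation, the range-based estimators named above have simple closed forms over OHLC data. A minimal Python sketch of two of them (Parkinson and Garman–Klass) follows; the function names and the 252-day annualization factor are illustrative assumptions, and the CSIE estimator itself, defined in the paper, is not reproduced here.

```python
import numpy as np

def parkinson_vol(high, low, trading_days=252):
    """Parkinson range-based volatility: uses only daily high/low prices."""
    hl = np.log(np.asarray(high) / np.asarray(low))
    daily_var = np.mean(hl ** 2) / (4.0 * np.log(2.0))
    return np.sqrt(daily_var * trading_days)

def garman_klass_vol(open_, high, low, close, trading_days=252):
    """Garman–Klass volatility: combines the high/low range with the
    open-to-close move of each day."""
    hl = np.log(np.asarray(high) / np.asarray(low))
    co = np.log(np.asarray(close) / np.asarray(open_))
    daily_var = np.mean(0.5 * hl ** 2 - (2.0 * np.log(2.0) - 1.0) * co ** 2)
    return np.sqrt(daily_var * trading_days)
```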

2.
This paper assesses two different theories for explaining consciousness, a phenomenon that is widely considered amenable to scientific investigation despite its puzzling subjective aspects. I focus on Integrated Information Theory (IIT), which says that consciousness is integrated information (as ϕMax) and that even simple systems with interacting parts possess some consciousness. First, I evaluate IIT on its own merits. Second, I compare it to a more traditionally derived theory called Neurobiological Naturalism (NN), which says consciousness is an evolved, emergent feature of complex brains. Comparing these theories is informative because it reveals strengths and weaknesses of each, thereby suggesting better ways to study consciousness in the future. IIT's strengths are the reasonable axioms at its core; its strong logic and mathematical formalism; its creative "experience-first" approach to studying consciousness; the way it avoids the mind-body ("hard") problem; its consistency with evolutionary theory; and its many scientifically testable predictions. The potential weakness of IIT is that it contains stretches of logic-based reasoning that were not checked against hard evidence when the theory was being constructed, whereas scientific arguments require such supporting evidence to keep the reasoning on course. This is less of a concern for the other theory, NN, because it incorporated evidence much earlier in its construction process. NN is a less mature theory than IIT, less formalized and quantitative, and less well tested. However, it has identified its own neural correlates of consciousness (NCC) and offers a roadmap through which these NCCs may answer the questions of consciousness using the hypothesize-test-hypothesize-test steps of the scientific method.

3.
Time is a key element of consciousness, spanning multiple timescales from shorter to longer ones. This is reflected in our experience of various short-term phenomenal contents at discrete points in time as part of an ongoing, more continuous, long-term 'stream of consciousness'. Can Integrated Information Theory (IIT) account for this multitude of timescales of consciousness? According to the theory, the relevant spatiotemporal scale for consciousness is the one at which the system reaches its maximum cause-effect power; IIT currently predicts that experience occurs on short timescales, namely between 100 and 300 ms (the theta and alpha frequency range). This can account well for the integration of single inputs into a particular phenomenal content. However, such short timescales leave open the temporal relation of specific phenomenal contents to others over ongoing time, that is, the stream of consciousness. For that purpose, we converge IIT with the Temporo-spatial Theory of Consciousness (TTC), which, assuming a multitude of different timescales, can bring into view the temporal integration of specific phenomenal contents with other phenomenal contents over time. On the neuronal side, this is detailed by considering the neuronal mechanisms driving the non-additive interaction of pre-stimulus activity with the input that results in stimulus-related activity. Owing to this non-additive interaction, the single input is not only integrated with others on the short timescales of 100–300 ms (alpha and theta frequencies), as predicted by IIT, but at the same time is virtually expanded in its temporal (and spatial) features; this is related to the longer timescales (delta and slower frequencies) that are carried over from pre-stimulus to stimulus-related activity. Such non-additive pre-stimulus-input interaction amounts to temporo-spatial expansion, a key mechanism of TTC for the constitution of phenomenal contents, including their embedding or nesting within the ongoing temporal dynamic, i.e., the stream of consciousness. In conclusion, we propose converging the short-term integration of inputs postulated by IIT (100–300 ms, the alpha and theta frequency range) with the longer timescales (delta and slower frequencies) of temporo-spatial expansion in TTC.

4.
I will argue that, in an interdisciplinary study of consciousness, epistemic structural realism (ESR) can offer a feasible philosophical background for the study of consciousness and its associated neurophysiological phenomena in neuroscience and cognitive science, while also taking into account the mathematical structures involved in this type of research. Applying ESR principles to the study of the neurophysiological phenomena associated with free will (or rather conscious free choice) and with various alterations of consciousness (AOCs) generated by pathologies such as epilepsy would add explanatory value to the matter. This interdisciplinary approach is in tune with Quine's well-known idea that philosophy is not simple conceptual analysis but is continuous with science and actually represents an abstract branch of empirical research. ESR could thus resonate with scientific models of consciousness such as the global neuronal workspace model (inspired by the global workspace theory, GWT) and the integrated information theory (IIT) model. While structural realism has already been employed in physics and biology, its application as a meta-theory contextualising and relating various scientific findings on consciousness is new. Of the two variants, ontic structural realism (OSR) and epistemic structural realism (ESR), the latter is more suitable for the study of consciousness and its associated neurophysiological phenomena because it removes the pressure of the still-unanswered ontological question 'What is consciousness?' and allows us to concentrate instead on the epistemological question 'What can we know about consciousness?'.

5.
The economy is a system of complex interactions. The COVID-19 pandemic strongly influenced economies, particularly through the restrictions it introduced, which created a completely new economic environment. The present work focuses on the changes induced by the COVID-19 pandemic in the correlation network structure. The analysis is performed on a representative set of US companies, the S&P 500 components. Four different network structures are constructed (strongly, weakly, typically, and significantly connected networks), and the evolution of the rank entropy, cycle entropy, average clustering coefficient, and transitivity is established and discussed. Based on these structural parameters, four different stages are distinguished during the COVID-19-induced crisis. The proposed network properties and their applicability to the crisis-distinguishing problem are discussed. Moreover, the optimal time window problem is analysed.
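To illustrate two of the structural parameters mentioned (the average clustering coefficient and transitivity), here is a minimal Python/networkx sketch over a threshold-based correlation network; the threshold value and this particular construction rule are assumptions for the sketch, not the paper's definitions, and the rank- and cycle-entropy measures are omitted.

```python
import numpy as np
import networkx as nx

def correlation_network(returns, threshold=0.1):
    """Link assets whose return correlation exceeds a threshold; a crude
    proxy for the paper's 'strongly connected' network variant."""
    corr = np.corrcoef(returns)            # returns: (n_assets, n_days)
    n = corr.shape[0]
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if corr[i, j] > threshold:
                g.add_edge(i, j, weight=corr[i, j])
    return g

rng = np.random.default_rng(0)
g = correlation_network(rng.standard_normal((20, 250)))
print(nx.average_clustering(g), nx.transitivity(g))
```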

6.
The interaction between the flow of sentiment expressed on blogs and in the media and the dynamics of stock market prices is analyzed through an information-theoretic measure, the transfer entropy, to quantify causal relations. We analyzed daily stock prices and daily social media sentiment for the top 50 companies in the Standard & Poor's (S&P) index during the period from November 2018 to November 2020, along with news mentioning these companies over the same period. We found that there is a causal flux of information linking those companies. The largest fraction of significant causal links is between prices and between sentiments, but there is also significant causal information flowing both ways, from sentiment to prices and from prices to sentiment. We observe that the strongest causal signal between sentiment and prices is associated with the Tech sector.
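The core measure here is transfer entropy, T(Y→X) = Σ p(x_{t+1}, x_t, y_t) log[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ]. A minimal histogram-based Python estimator follows; the history length of 1 and the quantile binning are simplifying assumptions, and the paper's significance testing of causal links is not reproduced.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=3):
    """Histogram estimate of transfer entropy T(Y -> X) in bits, with
    history length 1: how much knowing y_t improves prediction of x_{t+1}
    beyond knowing x_t alone."""
    edges = np.linspace(0, 1, bins + 1)[1:-1]          # interior quantiles
    xd = np.digitize(x, np.quantile(x, edges))
    yd = np.digitize(y, np.quantile(y, edges))
    triples = Counter(zip(xd[1:], xd[:-1], yd[:-1]))   # (x1, x0, y0)
    pairs_xy = Counter(zip(xd[:-1], yd[:-1]))          # (x0, y0)
    pairs_xx = Counter(zip(xd[1:], xd[:-1]))           # (x1, x0)
    singles = Counter(xd[:-1])                         # x0
    n = len(xd) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * np.log2(p_cond_xy / p_cond_x)
    return te
```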

7.
In a previous article we presented an argument to obtain (or rather infer) Born's rule, based on a simple set of axioms named "Contexts, Systems and Modalities" (CSM). In this approach, there is no "emergence", but the structure of quantum mechanics can be attributed to an interplay between the quantized number of modalities accessible to a quantum system and the continuum of contexts required to define these modalities. The strong link of this derivation with Gleason's theorem was emphasized, with the argument that CSM provides a physical justification for Gleason's hypotheses. Here, we extend this result by showing that an essential one among these hypotheses—the need for unitary transforms to relate different contexts—can be removed and is better seen as a necessary consequence of Uhlhorn's theorem.

8.
9.
Using the B-P (breakpoint) test and a VAR–DCC–GARCH model, the relationship between WTI crude oil futures and S&P 500 index futures or CSI 300 index futures was investigated and compared. The results show that breakpoints exist in the mean relationship between the WTI crude oil futures market and the Chinese or US stock index futures market. The mean relationship between WTI crude oil futures prices and S&P 500 or CSI 300 stock index futures is weakening. Meanwhile, the dynamic conditional correlation between the WTI crude oil futures market and the Chinese or US stock index futures market decreases after the breakpoint in the price series. Chinese stock index futures are less affected by short-term fluctuations in crude oil futures returns than US stock index futures.
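The breakpoint step can be illustrated with a generic change-point detector. The sketch below uses the third-party `ruptures` package as a stand-in for the B-P test (a substitution, not the paper's method), applied to a synthetic series with a mean shift; the penalty value is chosen ad hoc for the sketch.

```python
import numpy as np
import ruptures as rpt

rng = np.random.default_rng(1)
# Synthetic series with a mean shift, standing in for a rolling measure of
# the relationship between WTI futures and an equity-index future.
series = np.concatenate([rng.normal(0.5, 0.1, 300), rng.normal(0.2, 0.1, 300)])

algo = rpt.Pelt(model="l2").fit(series)
breakpoints = algo.predict(pen=5)   # indices where the mean level shifts
print(breakpoints)
```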

10.
Modal interpretations of quantum mechanics propose to solve the measurement problem by rejecting the orthodox view that, in entangled states of a system which are nontrivial superpositions of an observable's eigenstates, it is meaningless to speak of that observable as having a value or corresponding to a property of the system. Though denying this is reminiscent of how hidden-variable interpreters have challenged orthodox views about superposition, modal interpreters also argue that their proposals avoid the objectionable features of physical properties that beset hidden-variable interpretations, such as contextualism and nonlocality. Even so, I shall prove that modal interpreters of quantum mechanics are still committed to giving up at least one of the following three conditions characteristic of classical reasoning about physical properties: (1) properties certain to be found on measuring a system should be counted as intrinsic properties of the system; (2) if two propositions stating the possession of two intrinsic properties by the system are regarded as meaningful, then their conjunction should also correspond to a meaningful proposition about the system possessing a certain intrinsic property, and similarly for disjunction and negation; (3) the intrinsic properties of a composite system should at least include (though need not be exhausted by) the intrinsic properties of its parts. Conditions 1–3 are by no means undeniable, but the onus seems to be on modal interpreters to tell us why rejecting one of these is preferable to an ontology of properties incorporating contextualism and nonlocality.

11.
In this paper, I argue that the Shrapnel–Costa no-go theorem undermines what viability remains for the view that the fundamental ontology of quantum mechanics is essentially classical: that is, the view that physical reality is underpinned by objectively real, counterfactually definite, uniquely spatiotemporally defined, local, dynamical entities with determinately valued properties, where typically 'quantum' behaviour emerges as a function of our own in-principle ignorance of such entities. Call this view Einstein–Bell realism. One can show that the causally symmetric local hidden variable approach to interpreting quantum theory is the most natural interpretation that follows from Einstein–Bell realism, where causal symmetry plays a significant role in circumventing the nonclassical consequences of the traditional no-go theorems. However, Shrapnel and Costa argue that exotic causal structures, such as causal symmetry, are incapable of explaining quantum behaviour as arising from noncontextual ontological properties of the world. This is particularly worrying for Einstein–Bell realism and classical ontology. The obvious consequence of the theorem is a straightforward rejection of Einstein–Bell realism. More than this, however, I argue that even where there appears to be a possibility of accounting for contextual ontic variables within a causally symmetric framework, the cost of such an account undermines a key advantage of causal symmetry: that accepting causal symmetry is more economical than rejecting a classical ontology. Either way, it looks like we should give up on classical ontology.

12.
Volatility, which represents the magnitude of fluctuation in asset prices or returns, is used in finance to design optimal asset allocations and to price derivatives. Since volatility is unobservable, it is identified and estimated by latent-variable models known as volatility fluctuation models. Almost all conventional volatility fluctuation models are linear time-series models and thus have difficulty capturing nonlinear and/or non-Gaussian properties of volatility dynamics. In this study, we propose an entropy-based Student's t-process dynamical model (ETPDM) as a volatility fluctuation model combining nonlinear dynamics with non-Gaussian noise. The ETPDM estimates its latent variables and intrinsic parameters by robust particle filtering based on a generalized H-theorem for relative entropy. To test the performance of the ETPDM, we run numerical experiments on financial time series and confirm its robustness for a small number of particles by comparison with conventional particle filtering.
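The estimation pattern the abstract refers to, latent volatility tracked by particle filtering, can be sketched for a standard stochastic-volatility model. This is a plain bootstrap particle filter, not the ETPDM with its Student's t-process dynamics and H-theorem-based robustification, and all parameter values are illustrative.

```python
import numpy as np

def sv_particle_filter(y, n_particles=500, mu=-1.0, phi=0.97, sigma_eta=0.2, seed=0):
    """Bootstrap particle filter for a basic stochastic-volatility model:
        h_t = mu + phi*(h_{t-1} - mu) + sigma_eta*eps_t,   y_t ~ N(0, exp(h_t)).
    Returns the filtered mean of the latent log-variance h_t."""
    rng = np.random.default_rng(seed)
    # Initialize particles from the stationary distribution of h.
    h = rng.normal(mu, sigma_eta / np.sqrt(1 - phi**2), n_particles)
    h_mean = np.empty(len(y))
    for t, yt in enumerate(y):
        h = mu + phi * (h - mu) + sigma_eta * rng.standard_normal(n_particles)
        # Gaussian observation likelihood as importance weight (up to a constant).
        w = np.exp(-0.5 * yt**2 * np.exp(-h) - 0.5 * h)
        w /= w.sum()
        h_mean[t] = w @ h
        h = h[rng.choice(n_particles, n_particles, p=w)]  # multinomial resampling
    return h_mean
```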

13.
This research compares the performance of ARIMA, as a linear model, with that of hybrid ARIMA and GARCH-family models in forecasting S&P 500 log returns, in order to construct algorithmic investment strategies on this index. We used data collected from Yahoo Finance at daily frequency for the period from 1 January 2000 to 31 December 2019. Using a rolling-window approach, we compared ARIMA with the hybrid models to examine whether hybrid ARIMA-SGARCH and ARIMA-EGARCH can truly reflect the specific time-series characteristics and deliver better predictive power than the simple ARIMA model. To assess the precision and quality of these models' forecasts, we compared their equity lines, their forecasting error metrics (MAE, MAPE, RMSE), and their performance metrics (compounded annualized return, annualized standard deviation, maximum drawdown, information ratio, and adjusted information ratio). The main contribution of this research is to show that the hybrid models outperform ARIMA and the benchmark (a buy-and-hold strategy on the S&P 500 index) over the long term. These results are not sensitive to the window size, the type of distribution, or the type of GARCH model.
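The rolling-window evaluation can be sketched for the plain ARIMA baseline. A minimal Python example using statsmodels follows; the window length and the (1, 0, 1) order are illustrative assumptions, and the GARCH stage of the hybrid models is omitted.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def rolling_arima_forecast(returns, window=1000, order=(1, 0, 1)):
    """One-step-ahead rolling-window ARIMA forecasts of log returns,
    with simple error metrics over the out-of-sample period."""
    preds, actual = [], []
    for t in range(window, len(returns)):
        fit = ARIMA(returns[t - window:t], order=order).fit()
        preds.append(fit.forecast(1)[0])
        actual.append(returns[t])
    preds, actual = np.array(preds), np.array(actual)
    metrics = {
        "MAE": np.mean(np.abs(preds - actual)),
        "RMSE": np.sqrt(np.mean((preds - actual) ** 2)),
    }
    return preds, metrics
```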

14.
Predicting the values of a financial time series is mainly a function of its price history, which depends on several factors, internal and external. With this history, it is possible to build an ∊-machine for predicting the financial time series. This work proposes accounting for the influence of one financial series on another through transfer entropy, when the values of the other series are known. A method is proposed that uses transfer entropy to break the ties that occur when calculating the prediction with the ∊-machine. The analysis is carried out using data from six financial series: two American, the S&P 500 and the Nasdaq; two Asian, the Hang Seng and the Nikkei 225; and two European, the CAC 40 and the DAX. This work shows that it is possible to influence the prediction of the closing value of a series if the value of the influencing series is known, and that the series transferring the most information are the American S&P 500 and Nasdaq, followed by the European DAX and CAC 40, and finally the Asian Nikkei 225 and Hang Seng.
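A toy version of the prediction step is sketched below: conditioning on a length-k suffix of a discretized series is a crude proxy for the causal states of an ∊-machine, and the tie-breaking via transfer entropy from an influencing series is only indicated in a comment, not implemented.

```python
from collections import Counter, defaultdict

def next_symbol_forecast(symbols, k=3):
    """Toy stand-in for epsilon-machine prediction: estimate the next-symbol
    distribution conditional on the last k symbols of a discretized series."""
    hist = defaultdict(Counter)
    for t in range(k, len(symbols)):
        hist[tuple(symbols[t - k:t])][symbols[t]] += 1
    counts = hist[tuple(symbols[-k:])]
    if not counts:
        return []                      # unseen history: no forecast
    best = max(counts.values())
    # Ties (several symbols equally likely) are where the paper brings in
    # transfer entropy from an influencing series to pick a single forecast.
    return [s for s, c in counts.items() if c == best]
```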

15.
16.
This article discusses the self-assembly of conjugated thiol molecular wires on Au(111) substrates and their charge transport, studied by scanning tunneling microscopy and spectroscopy. Molecular-resolution imaging of the conjugated thiols shows that differences in their structure and intermolecular interactions result in an ordering on gold that differs from the hexagonal symmetry found in alkanethiols. Tunneling spectroscopy on the molecular wires provides information about their intrinsic electronic properties, such as the origin of the observed conductance gap and the asymmetry in the I–V curves. Furthermore, by concurrent topographic and tunneling spectroscopic studies on a conjugated thiol molecule self-assembled with and without molecular order, we show that packing and order determine the response of the monolayer to various competing interactions and that the presence of molecular order is very important for reproducible transport measurements. Competing forces between the electric field, intermolecular interactions, tip–molecule physisorption, and substrate–molecule chemisorption impact the transport measurements and their reliability. This study points to the fact that molecular electronic devices should be designed to be tolerant of such fluctuations and dynamics. PACS 68.37.Ef; 73.63.-b; 81.16.Dn

17.
Current physics commonly qualifies the Earth system as 'complex' because it includes numerous different processes operating over a large range of spatial scales, often modelled as exhibiting non-linear chaotic response dynamics and power scaling laws. This characterization rests on the fundamental assumption that the Earth's complexity could, in principle, be modelled (surrogated) by a numerical algorithm if enough computing power were available. Yet similar numerical algorithms also surrogate different systems having the same processes and dynamics, such as Mars or Jupiter, which are qualitatively different from the Earth system. Here, we argue that understanding the Earth as a complex system requires a consideration of the Gaia hypothesis: the Earth is a complex system because it instantiates life—and therefore an autopoietic, metabolic-repair (M,R) organization—at a planetary scale. This implies that the Earth's complexity is formally equivalent to a self-referential system that is inherently non-algorithmic and therefore cannot be surrogated and simulated on a Turing machine. We discuss the consequences of this with reference to in silico climate models, tipping points, planetary boundaries, and planetary feedback loops as units of adaptive evolution and selection.

18.
I explore the processes of equilibration exhibited by the Adapted Caldeira–Leggett (ACL) model, a small unitary "toy model" developed for numerical studies of quantum decoherence between a simple harmonic oscillator (SHO) and an environment. I demonstrate how dephasing allows equilibration to occur in a wide variety of situations. While the finite model size and other "unphysical" aspects prevent the notions of temperature and thermalization from being generally applicable, certain primitive aspects of thermalization can be realized for particular parameter values. I link the observed behaviors to intrinsic properties of the global energy eigenstates and argue that the phenomena I observe contain elements that might be key ingredients leading to ergodic behavior in larger, more realistic systems. The motivations for this work range from curiosity about phenomena observed in earlier calculations with the ACL model to much larger questions related to the nature of equilibrium, thermalization, and the emergence of physical laws.

19.
A hybrid ray-tracing method is developed for solving radiative transfer in a plane-parallel participating medium having one specular surface and one diffuse surface. With this method, radiative transfer coefficients (RTCs) for specular–diffuse (S–D) surfaces are deduced. The medium surfaces are considered semitransparent. The effects of the convection–radiation parameter, the conduction–radiation parameter, and the refractive index on transient coupled heat transfer are investigated. Results show that the temperature curves of the medium with S–D surfaces are higher than those of the medium with S–S surfaces (two specular surfaces), and that the total heat flux at steady state for the S–D surfaces is lower than that for the S–S surfaces.

20.
At the basis of the problem of explaining non-local quantum correlations lies the tension between two factors: on the one hand, the natural interpretation of correlations as the manifestation of a causal relation; on the other, the resistance of the physics underlying said correlations to conforming to the most essential features of a pre-theoretic notion of causation. In this paper, I argue for rejecting the first horn of the dilemma, i.e., the assumption that quantum correlations call for a causal explanation. The paper is divided into two parts. The first, destructive, part provides a critical overview of the enterprise of causally interpreting non-local quantum correlations, with the aim of warning against the temptation of an account of causation claiming to cover such correlations 'for free'. The second, constructive, part introduces the so-called structural explanation (a variety of non-causal explanation that shows how the explanandum is the manifestation of a fundamental structure of the world) and argues that quantum correlations might be explained structurally in the context of an information-theoretic approach to quantum theory (QT).

