Similar Documents
20 similar documents found.
1.
The task of reconstructing the system’s state from the measurement results, known as the Pauli problem, usually requires repetition of two successive steps. Preparation in an initial state to be determined is followed by an accurate measurement of one of several chosen operators in order to provide the necessary “Pauli data”. We consider a similar yet more general problem of recovering Feynman’s transition (path) amplitudes from the results of at least three consecutive measurements. The three-step histories of a pre- and post-selected quantum system are subjected to a type of interference not available to their two-step counterparts. We show that this interference can be exploited, and that if the intermediate measurement is “fuzzy”, the path amplitudes can be successfully recovered. The simplest case of a two-level system is analysed in detail. The “weak measurement” limit and the usefulness of the path amplitudes are also discussed.

2.
Since the “high stock dividend” phenomenon of A-share companies in China often leads to a short-term stock price increase, its prediction has attracted wide attention from academia and industry. In this study, a new multi-layer stacking ensemble algorithm is proposed. Unlike the classic stacking ensemble algorithm, which focuses on the differentiation of base models, this paper uses an equal-weight comprehensive feature evaluation method to select features before base-model prediction and a genetic algorithm to match the optimal feature subset to each base model. After the base models output their predictions, a LightGBM (LGB) model is added to the algorithm as a secondary information-extraction layer. Finally, the algorithm feeds the extracted information into a Logistic Regression (LR) model to complete the prediction of the “high stock dividend” phenomenon. Using A-share market data from 2010 to 2019 for simulation and evaluation, the proposed model improves the AUC (Area Under Curve) and F1 score by 0.173 and 0.303, respectively, compared to the baseline model. The prediction results shed light on event-driven investment strategies.
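As an illustration of the layered architecture described in this abstract (base models trained on per-model feature subsets, a LightGBM second layer, and a final logistic regression), the sketch below shows one plausible arrangement. The feature subsets, base learners, and synthetic data are illustrative assumptions rather than the authors’ configuration, and the genetic-algorithm feature matching is not reproduced.

```python
# Hypothetical sketch of a multi-layer stacking pipeline:
# base models -> LightGBM secondary layer -> logistic regression.
# Per-model feature subsets stand in for the paper's GA-selected subsets.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict, train_test_split
from lightgbm import LGBMClassifier

X, y = make_classification(n_samples=2000, n_features=30, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Illustrative per-model feature subsets (the paper matches these with a GA).
subsets = {"rf": list(range(0, 20)), "gb": list(range(10, 30))}
base_models = {"rf": RandomForestClassifier(n_estimators=200, random_state=0),
               "gb": GradientBoostingClassifier(random_state=0)}

# Layer 1: out-of-fold probabilities from each base model on its own feature subset.
oof = np.column_stack([
    cross_val_predict(m, X_tr[:, subsets[k]], y_tr, cv=5, method="predict_proba")[:, 1]
    for k, m in base_models.items()
])
test_meta = np.column_stack([
    m.fit(X_tr[:, subsets[k]], y_tr).predict_proba(X_te[:, subsets[k]])[:, 1]
    for k, m in base_models.items()
])

# Layer 2: LightGBM as a secondary information-extraction layer.
lgb = LGBMClassifier(n_estimators=100, random_state=0).fit(oof, y_tr)
meta_tr = lgb.predict_proba(oof)[:, 1].reshape(-1, 1)
meta_te = lgb.predict_proba(test_meta)[:, 1].reshape(-1, 1)

# Layer 3: logistic regression gives the final "high stock dividend" probability.
lr = LogisticRegression().fit(meta_tr, y_tr)
print("held-out accuracy:", lr.score(meta_te, y_te))
```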

3.
In this paper, a new parametric compound G family of continuous probability distributions, called the Poisson generalized exponential G (PGEG) family, is derived and studied. Relevant mathematical properties are derived. Some new bivariate G families are presented, constructed using the “Farlie-Gumbel-Morgenstern copula”, the “modified Farlie-Gumbel-Morgenstern copula”, the “Clayton copula”, and “Renyi’s entropy copula”. Many special members are derived, and special attention is devoted to the exponential and the one-parameter Pareto type II models. The maximum likelihood method is used to estimate the model parameters. A graphical simulation is performed to assess the finite-sample behavior of the maximum likelihood estimators. Two real-life data applications are proposed to illustrate the importance of the new family.
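For reference, two of the copulas named above have the following standard forms; the paper’s bivariate families arise by plugging baseline distributions into copulas of this kind.

```latex
% Farlie-Gumbel-Morgenstern copula, dependence parameter \lambda \in [-1, 1]
C_{\mathrm{FGM}}(u, v) = u\,v\,\bigl[1 + \lambda (1 - u)(1 - v)\bigr]

% Clayton copula, dependence parameter \theta > 0
C_{\mathrm{Clayton}}(u, v) = \bigl(u^{-\theta} + v^{-\theta} - 1\bigr)^{-1/\theta}
```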

4.
Wigner’s friend scenarios involve an Observer, or Observers, measuring a Friend, or Friends, who themselves make quantum measurements. In recent discussions, it has been suggested that quantum mechanics may not always be able to provide a consistent account of a situation involving two Observers and two Friends. We investigate this problem by invoking the basic rules of quantum mechanics as outlined by Feynman in the well-known “Feynman Lectures on Physics”. We show here that these “Feynman rules” constrain the a priori assumptions which can be made in generalised Wigner’s friend scenarios, because the existence of the probabilities of interest ultimately depends on the availability of physical evidence (material records) of the system’s past. With these constraints obeyed, an unambiguous and consistent account of all measurement outcomes is obtained for all agents taking part in the various Wigner’s friend scenarios.

5.
I numerically simulate and compare the entanglement of two quanta using the conventional formulation of quantum mechanics and a time-symmetric formulation that has no collapse postulate. The experimental predictions of the two formulations are identical, but the entanglement predictions are significantly different. The time-symmetric formulation reveals an experimentally testable discrepancy in the original quantum analysis of the Hanbury Brown–Twiss experiment, suggests solutions to some parts of the nonlocality and measurement problems, fixes known time asymmetries in the conventional formulation, and answers Bell’s question “How do you convert an ‘and’ into an ‘or’?”

6.
The lack of adequate indicators in digital economy research may leave governments short of data support for decision making. To address this problem, we first establish a digital economy indicator evaluation system by dividing the digital economy into four types: “basic type”, “technology type”, “integration type” and “service type”, and selecting 5 indicators for each type. On this basis, the weight of each indicator is calculated with an improved entropy method in order to identify deficiencies in the development of particular digital economy fields. Drawing on the weighting idea of the Analytic Hierarchy Process, the improved entropy method first compares the difference coefficients of the indicators in pairs and maps the comparison results to a 1–9 scale. A judgment matrix is then constructed based on the information entropy, which mitigates, as far as possible, the problem that the differences among indicator weights are too large in the traditional entropy method. The results indicate that the development of the digital economy in Guangdong Province was relatively balanced from 2015 to 2018 and will improve in the future, while the development of rural e-commerce in Guangdong Province is relatively backward, with an obvious digital gap between urban and rural areas. Next, using principal component analysis and factor analysis from multivariate statistical analysis, we extract two new variables to replace the 20 selected indicators; this retains the original information to the greatest extent and provides convenience for further research. Finally, we provide constructive comments on the digital economy of Guangdong Province from 2015 to 2018.
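The improved method builds on the classical entropy weight calculation: normalize the indicator matrix, compute each indicator’s information entropy, and weight by its difference coefficient. The sketch below shows only this classical step on synthetic data; the paper’s refinement (pairwise comparison of difference coefficients mapped to a 1–9 scale and a judgment matrix) is not reproduced.

```python
# Sketch of the classical entropy weight method for an indicator matrix.
import numpy as np

def entropy_weights(X: np.ndarray) -> np.ndarray:
    """Rows = observations (e.g., years), columns = indicators."""
    n, _ = X.shape
    # Min-max normalization so all indicators are comparable and non-negative.
    Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0) + 1e-12)
    # Share of each observation within its indicator column.
    P = Xn / (Xn.sum(axis=0) + 1e-12)
    # Information entropy of each indicator, scaled to [0, 1].
    E = -np.sum(P * np.log(P + 1e-12), axis=0) / np.log(n)
    # Difference coefficient: lower entropy -> more discriminating indicator -> larger weight.
    d = 1.0 - E
    return d / d.sum()

# Illustrative data: 4 years x 20 indicators.
X = np.random.default_rng(0).random((4, 20))
print(entropy_weights(X).round(3))
```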

7.
The consensus regarding quantum measurements rests on two statements: (i) von Neumann’s standard quantum measurement theory leaves undetermined the basis in which observables are measured, and (ii) the environmental decoherence of the measuring device (the “meter”) unambiguously determines the measuring (“pointer”) basis. The latter statement means that the environment monitors (measures) selected observables of the meter and (indirectly) of the system. Equivalently, a measured quantum state must end up in one of the “pointer states” that persist in the presence of the environment. We find that, unless we restrict ourselves to projective measurements, decoherence does not necessarily determine the pointer basis of the meter. Namely, generalized measurements commonly allow the observer to choose from a multitude of alternative pointer bases that provide the same information on the observables, regardless of decoherence. By contrast, the measured observable does not depend on the pointer basis, whether in the presence or in the absence of decoherence. These results grant further support to our notion of Quantum Lamarckism, whereby the observer’s choices play an indispensable role in quantum mechanics.

8.
In a previous article we presented an argument to obtain (or rather infer) Born’s rule, based on a simple set of axioms named “Contexts, Systems and Modalities” (CSM). In this approach, there is no “emergence”, but the structure of quantum mechanics can be attributed to an interplay between the quantized number of modalities that is accessible to a quantum system and the continuum of contexts that are required to define these modalities. The strong link of this derivation with Gleason’s theorem was emphasized, with the argument that CSM provides a physical justification for Gleason’s hypotheses. Here, we extend this result by showing that an essential one among these hypotheses—the need for unitary transforms to relate different contexts—can be removed and is better seen as a necessary consequence of Uhlhorn’s theorem.

9.
This article reconsiders the concept of physical reality in quantum theory and the concept of quantum measurement, following Bohr, whose analysis of quantum measurement led him to his concept of a (quantum) “phenomenon,” referring to “the observations obtained under the specified circumstances,” in the interaction between quantum objects and measuring instruments. This situation makes the terms “observation” and “measurement,” as conventionally understood, inapplicable. These terms are remnants of classical physics, or of the still earlier history from which classical physics inherited them. As defined here, a quantum measurement does not measure any preexisting property of the ultimate constitution of the reality responsible for quantum phenomena. An act of measurement establishes a quantum phenomenon through an interaction between the instrument and the quantum object or, in the present view, the ultimate constitution of the reality responsible for quantum phenomena and, at the time of measurement, also quantum objects. In the view advanced in this article, in contrast to that of Bohr, quantum objects, such as electrons or photons, are assumed to exist only at the time of measurement and not independently, a view that redefines the concept of quantum object as well. This redefinition becomes especially important in high-energy quantum regimes and quantum field theory, and allows this article to define a new concept of quantum field. The article also considers, now following Bohr, quantum measurement as the entanglement between quantum objects and measurement instruments. The argument of the article is grounded in the concept of “reality without realism” (RWR), as underlying quantum measurement thus understood, and in the view, the RWR view, of quantum theory defined by this concept. The RWR view places the stratum of physical reality thus designated, here the reality ultimately responsible for quantum phenomena, beyond representation or knowledge, or even conception, and defines the corresponding set of interpretations of quantum mechanics or quantum field theory, such as the one assumed in this article, in which, again, not only quantum phenomena but also quantum objects are (as idealizations) defined by measurement. As such, the article also offers a broadly conceived response to J. Bell’s argument “against ‘measurement’”.

10.
In this cross-sectional study, the relationship between noninvasively measured neurocardiovascular signal entropy and physical frailty was explored in a sample of community-dwelling older adults from The Irish Longitudinal Study on Ageing (TILDA). The hypothesis under investigation was that dysfunction in the neurovascular and cardiovascular systems, as quantified by short-length signal complexity during a lying-to-stand test (active stand), could provide a marker for frailty. Frailty status (i.e., “non-frail”, “pre-frail”, and “frail”) was based on Fried’s criteria (i.e., exhaustion, unexplained weight loss, weakness, slowness, and low physical activity). Approximate entropy (ApEn) and sample entropy (SampEn) were calculated during resting (lying down), active standing, and recovery phases. Continuously measured blood pressure/heart rate data were available from 2645 individuals (53.0% female), and frontal lobe tissue oxygenation data from 2225 participants (52.3% female); both samples had a mean (SD) age of 64.3 (7.7) years. Results revealed statistically significant associations between neurocardiovascular signal entropy and frailty status. Entropy differences between non-frail and pre-frail/frail participants were greater during the resting state than during the standing and recovery phases. Compared with ApEn, SampEn seemed to have better discriminating power between non-frail and pre-frail/frail individuals. The quantification of entropy in short-length neurocardiovascular signals could provide a clinically useful marker of the multiple physiological dysregulations that underlie physical frailty.
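A minimal sketch of the sample entropy (SampEn) calculation referred to above: count template matches of length m and m + 1 within a tolerance r and take the negative log of their ratio. The parameters and the synthetic signal are illustrative; TILDA’s preprocessing and the exact ApEn/SampEn settings are not reproduced, and the template counting is slightly simplified relative to the canonical definition.

```python
# Minimal sample entropy (SampEn) sketch for a short physiological signal.
import numpy as np

def sample_entropy(x: np.ndarray, m: int = 2, r_factor: float = 0.2) -> float:
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)  # tolerance as a fraction of the signal's SD

    def count_matches(length: int) -> int:
        # Overlapping templates of the given length (simplified: the canonical
        # definition uses the same number of templates for both lengths).
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)  # Chebyshev distance
            count += np.sum(dist <= r) - 1  # exclude the self-match
        return count

    B = count_matches(m)       # matches of length m
    A = count_matches(m + 1)   # matches of length m + 1
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 8 * np.pi, 300)) + 0.1 * rng.standard_normal(300)
print(round(sample_entropy(signal), 3))
```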

11.
The present study investigates the similarity problem associated with the onset of the Mach reflection of Zel’dovich–von Neumann–Döring (ZND) detonations in the near field. The results reveal that the self-similarity in the frozen-limit regime is strictly valid only within a small scale, i.e., of the order of the induction length. The Mach reflection becomes non-self-similar during the transition of the Mach stem from “frozen” to “reactive” by coupling with the reaction zone. The triple-point trajectory first rises from the self-similar result due to compressive waves generated by the “hot spot”, and then decays after establishment of the reactive Mach stem. It is also found, by removing the restriction, that the frozen limit can be extended to a much larger distance than expected. The obtained results elucidate the physical origin of the onset of Mach reflection with chemical reactions, which has previously been observed in both experiments and numerical simulations.

12.
Finding the critical factor and possible “Newton’s laws” in financial markets has been an important issue. However, with the development of information and communication technologies, financial models are becoming more realistic but also more complex, contradicting the maxim that “the greatest truths are the simplest.” Therefore, this paper presents an evolutionary model independent of micro features and attempts to discover the most critical factor. In the model, information is the only critical factor, and the stock price emerges from collective behavior. The statistical properties of the model closely resemble those of the real market. The model also explains the correlations of stocks within an industry, which provides a new idea for studying critical factors and core structures in financial markets.

13.
With the growing availability of position data in sports, spatiotemporal analysis in soccer is a topic of rising interest. The aim of this study is to validate a performance indicator, namely D-Def, measuring passing effectiveness. D-Def calculates the change of the team’s centroid, the centroids of formation lines (e.g., the defensive line), the team’s surface area, and the team’s spread in the three seconds following a pass, and therefore yields a measure of the disruption of the opponents’ defense following a pass. While this measure was introduced earlier, in this study we aim to demonstrate its usefulness for evaluating attacking sequences. In this study, 258 games of the Dutch Eredivisie season 2018/19 were included, resulting in 13,094 attacks. D-Def, pass length, pass velocity, and pass angle of the last four passes of each attack were calculated and compared between successful and unsuccessful attacks. D-Def showed higher values for passes of successful compared to unsuccessful attacks (0.001 < p ≤ 0.029, 0.06 ≤ d ≤ 0.23). This difference showed the highest effect sizes for the penultimate pass (d = 0.23) and the maximal D-Def value of an attack (d = 0.23). Passing length (0.001 < p ≤ 0.236, 0.08 ≤ d ≤ 0.17) and passing velocity (0.001 < p ≤ 0.690, −0.09 ≤ d ≤ 0.12) showed inconsistent results in discriminating between successful and unsuccessful attacks. The results indicate that D-Def is a useful indicator for measuring pass effectiveness in attacking sequences, highlighting that successful attacks are connected to disruptive passing. Within successful attacks, at least one highly disruptive action (a pass with D-Def > 28) needs to be present. In addition, the penultimate pass (“hockey assist”) of an attack seems crucial in characterizing successful attacks.
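A rough sketch of the team-shape quantities that a disruption measure such as D-Def aggregates (centroid, spread, and surface area of the defending team before a pass and three seconds after it). The positions are synthetic, and the combination of these changes into a single D-Def value is the authors’ and is not reproduced here.

```python
# Sketch of team-shape descriptors underlying a disruption measure such as D-Def.
# Positions are illustrative (x, y) coordinates of the defending team's players.
import numpy as np
from scipy.spatial import ConvexHull

def team_shape(positions: np.ndarray) -> dict:
    centroid = positions.mean(axis=0)
    spread = np.linalg.norm(positions - centroid, axis=1).mean()  # mean distance to centroid
    area = ConvexHull(positions).volume  # in 2-D, the hull "volume" is the surface area
    return {"centroid": centroid, "spread": spread, "area": area}

def disruption(before: np.ndarray, after: np.ndarray) -> dict:
    """Change in shape descriptors from just before a pass to 3 s after it."""
    b, a = team_shape(before), team_shape(after)
    return {
        "centroid_shift": float(np.linalg.norm(a["centroid"] - b["centroid"])),
        "spread_change": a["spread"] - b["spread"],
        "area_change": a["area"] - b["area"],
    }

rng = np.random.default_rng(1)
before = rng.uniform([0, 0], [52, 68], size=(10, 2))  # 10 outfield defenders, own half
after = before + rng.normal(0, 2.0, size=(10, 2))     # positions 3 s later
print(disruption(before, after))
```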

14.
We present a new experiment demonstrating destructive interference in customers’ estimates of conditional probabilities of product failure. We take the perspective of a manufacturer of consumer products and consider two situations of cause and effect. Whereas each cause individually produces a similar effect, it is observed that, when combined, the two causes produce the opposite effect. Such negative interference of two or more product features may be exploited for better modeling of the cognitive processes taking place in customers’ minds. Doing so can enhance the likelihood that a manufacturer will be able to design a better product, or a feature within it. Quantum probability has been used to explain some commonly observed “non-classical” effects, such as the disjunction effect, the question order effect, violation of the sure-thing principle, and the Machina and Ellsberg paradoxes. In this work, we present results from a survey on the impact of multiple observed symptoms on the drivability of a vehicle. The symptoms are assumed to be conditionally independent. We demonstrate that the response statistics cannot be directly explained using classical probability, but that the quantum formulation models them easily, as it allows for both positive and negative “interference” between events. Since the quantum formalism also accounts for classical probability’s predictions, it serves as a richer paradigm for modeling decision-making behavior in engineering design and behavioral economics.
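For concreteness, the interference term that quantum probability adds to the classical law of total probability is commonly written as below; the classical case corresponds to the cosine term vanishing, while a negative cosine yields the destructive interference reported in the survey.

```latex
% Quantum-probability generalization of the law of total probability
P(A) = P(A \mid B)\,P(B) + P(A \mid \bar{B})\,P(\bar{B})
     + 2\sqrt{P(A \mid B)\,P(B)\,P(A \mid \bar{B})\,P(\bar{B})}\,\cos\theta
```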

15.
This paper starts from Schrödinger’s famous question “what is life?” and elucidates answers that invoke, in particular, Friston’s free energy principle and its relation to the method of Bayesian inference and to the second foundation of Synergetics, which utilizes Jaynes’ maximum entropy principle. Our presentation reflects the shift in emphasis from physical principles to principles of information theory and Synergetics. In view of the expected general audience of this issue, we have chosen a somewhat tutorial style that does not require special knowledge of physics but familiarizes the reader with concepts rooted in information theory and Synergetics.
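As a reference point for the concepts named above, Friston’s variational free energy can be written in its standard form: an upper bound on surprise (negative log evidence) whose minimization implements approximate Bayesian inference.

```latex
% Variational free energy for approximate posterior q(s) over hidden states s,
% given observations o: KL divergence to the true posterior plus surprise.
F[q] = \mathbb{E}_{q(s)}\!\left[\ln q(s) - \ln p(s, o)\right]
     = D_{\mathrm{KL}}\!\left[q(s)\,\|\,p(s \mid o)\right] - \ln p(o)
     \;\ge\; -\ln p(o)
```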

16.
In this paper, we focus on critical periods in the economy that are characterized by unusual and large fluctuations in macroeconomic indicators, such as those measuring inflation and unemployment. We analyze 70 years of U.S. data, from 1948 until 2018. To capture the essence of these fluctuations, we concentrate on the non-Gaussianity of their distributions. We investigate how the non-Gaussianity of these variables affects their coupling structure. We distinguish “regular” from “rare” events when calculating the correlation coefficient, emphasizing that the two cases might lead to different responses of the economy. Through the “multifractal random walk” model, one can see that the non-Gaussianity depends on the time scale. The non-Gaussianity of unemployment is noticeable only for periods shorter than one year; for longer periods, the fluctuation distribution tends toward Gaussian behavior. In contrast, the non-Gaussianity of inflation fluctuations persists for all time scales. We observe through the “bivariate multifractal random walk” that, despite these features of inflation, the non-Gaussianity of the coupled structure is finite for scales of less than one year, drops for periods larger than one year, and becomes small for scales greater than two years. This means that the footprint of monetary policies intentionally influencing the inflation and unemployment couple is observed only for time horizons smaller than two years. Finally, to improve our understanding of the effect of rare events, we calculate high moments of the variables’ increments for various orders q and various time scales. The results show that the coupling in the high moments sharply increases during crises.
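A minimal sketch of the scale-dependent moment analysis described above: for each time scale, form the increments of a series and compute their absolute moments of order q (structure functions). The series here is a synthetic Gaussian random walk used as a baseline; deviations from its log-log scaling would signal non-Gaussianity and multifractality.

```python
# Sketch: q-th order moments of increments across time scales (structure functions).
import numpy as np

def structure_functions(x: np.ndarray, scales, q_orders) -> np.ndarray:
    """Return M[q, tau] = mean(|x(t + tau) - x(t)|^q) for each scale and order."""
    M = np.zeros((len(q_orders), len(scales)))
    for j, tau in enumerate(scales):
        inc = x[tau:] - x[:-tau]
        for i, q in enumerate(q_orders):
            M[i, j] = np.mean(np.abs(inc) ** q)
    return M

rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(5000))   # Gaussian random walk as a baseline
scales = [1, 4, 16, 64, 256]               # e.g., months to decades for monthly data
q_orders = [1, 2, 4, 6]
M = structure_functions(x, scales, q_orders)
print(np.log(M).round(2))  # for a monofractal Gaussian walk, log M is linear in log tau
```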

17.
In this paper, I investigate a connection between a common characterisation of freedom and how uncertainty is managed in a Bayesian hierarchical model. To do this, I consider a distributed factorization of a group’s optimization of free energy, in which each agent attempts to align with the group and with its own model. I show how this can lead to equilibria for groups, defined by the capacity of the model being used, essentially how many different datasets it can handle. In particular, I show that there is a “sweet spot” in the capacity of a normal model in each agent’s decentralized optimization, and that this “sweet spot” corresponds to minimal free energy for the group. At the sweet spot, an agent can predict what the group will do, and the group is not surprised by the agent. However, there is an asymmetry. A higher-capacity model for an agent makes it harder for the individual to learn, as there are more parameters. Simultaneously, a higher-capacity model for the group, implemented as a higher-capacity model for each member agent, makes it easier for the group to integrate a new member. Optimizing for a group of agents therefore requires a trade-off in capacity, as each individual agent seeks to decrease capacity while there is pressure from the group to increase the capacity of all members. This pressure exists because, as individual agents’ capacities are reduced, so too are their abilities to model other agents, and thereby to establish pro-social behavioural patterns. I then consider a basic two-level (dual-process) Bayesian model of social reasoning and a set of three capacity parameters that are required to implement such a model. Considering these three capacities as dependent elements in a free energy minimization for a group leads to a “sweet surface” in a three-dimensional space, defining the triplet of parameters that each agent must use should they hope to minimize free energy as a group. Finally, I relate these three parameters to three notions of freedom and equality in human social organization, and postulate a correspondence between freedom and model capacity. That is, models with higher capacity have more freedom, as they can interact with more datasets.

18.
With the increasing number of connected devices, complex systems such as smart homes record a multitude of events of various types, magnitudes and characteristics. Current systems struggle to identify which events can be considered more memorable than others. In contrast, humans are able to quickly categorize some events as being more “memorable” than others. They do so without relying on knowledge of the system’s inner workings or on large previous datasets. Having this ability would allow the system to: (i) identify and summarize a situation to the user by presenting only memorable events; (ii) suggest the most memorable events as possible hypotheses in an abductive inference process. Our proposal is to use Algorithmic Information Theory to define a “memorability” score obtained by retrieving events using predicative filters. We use smart-home examples to illustrate how our theoretical approach can be implemented in practice.
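Kolmogorov complexity is uncomputable, so a common practical proxy for an Algorithmic-Information-style score is compressed description length. The sketch below scores an event by how many extra compressed bytes it adds to a log of routine events; this proxy and the toy event format are illustrative assumptions, not the authors’ scoring or filtering scheme.

```python
# Sketch: compression-based surprise as a proxy for an algorithmic "memorability" score.
# An event that adds many bytes to the compressed event log is harder to explain
# from routine activity, and so is scored as more memorable.
import zlib

def compressed_len(text: str) -> int:
    return len(zlib.compress(text.encode("utf-8"), level=9))

def memorability(event: str, routine_log: list[str]) -> int:
    """Extra compressed bytes needed to describe the event on top of the routine log."""
    base = compressed_len("\n".join(routine_log))
    combined = compressed_len("\n".join(routine_log + [event]))
    return combined - base

routine = ["door=closed kitchen_light=off"] * 200 + ["door=open kitchen_light=on"] * 50
print(memorability("door=open kitchen_light=on", routine))    # routine event: small score
print(memorability("smoke_alarm=on window=broken", routine))  # unusual event: larger score
```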

19.
Integrated information has recently been suggested as a possible measure for identifying a necessary condition for a system to display conscious features. We have previously shown that astrocytes contribute to the generation of integrated information through the complex behavior of neuron–astrocyte networks. Still, it remained unclear which of the underlying mechanisms governing the complex behavior of a neuron–astrocyte network are essential to generating positive integrated information. This study presents an analytic consideration of this question based on exact and asymptotic expressions for integrated information in terms of exactly known probability distributions for a reduced mathematical model (a discrete-time, discrete-state stochastic model) reflecting the main features of the “spiking–bursting” dynamics of a neuron–astrocyte network. The analysis was performed in terms of the empirical “whole minus sum” version of integrated information, in comparison with the “decoder based” version. The “whole minus sum” information may change sign, and an interpretation of this transition in terms of “net synergy” is available in the literature. This motivated our particular interest in the sign of the “whole minus sum” information in our analytical considerations. The behaviors of the “whole minus sum” and “decoder based” information measures are found to be quite similar: they converge asymptotically to each other as time-uncorrelated activity increases, and the sign transition of the “whole minus sum” information is associated with a rapid growth in the “decoder based” information. The study aims to create a theoretical framework for using the spiking–bursting model as an analytically tractable reference point for applying integrated information concepts to systems exhibiting similar bursting behavior. The model may also be of interest as a new discrete-state test bench for different formulations of integrated information.
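For orientation, the empirical “whole minus sum” measure referred to above is commonly written as the time-lagged mutual information of the whole system minus the sum over its parts, which is why it can become negative (“net synergy”); the specific partition and the “decoder based” variant used in the paper are not spelled out here.

```latex
% "Whole minus sum" integrated information across a time lag \tau,
% for a partition of the system X into parts M^1, ..., M^k
\Phi_{\mathrm{WMS}}(\tau) = I\!\left(X_{t-\tau}; X_{t}\right)
    - \sum_{i=1}^{k} I\!\left(M^{i}_{t-\tau}; M^{i}_{t}\right)
```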

20.
By assimilating biological systems, both structural and functional, into multifractal objects, their behavior can be described in the framework of scale relativity theory, in either of its forms (the standard form in Nottale’s sense and/or the form of the multifractal theory of motion). Operating in the context of the multifractal theory of motion, based on multifractalization through non-Markovian stochastic processes, the main results of Nottale’s theory can be generalized (specific momentum conservation laws at both differentiable and non-differentiable resolution scales, a specific momentum conservation law associated with the differentiable–non-differentiable scale transition, etc.). In this context, all results are explicated by analyzing biological processes, treating acute arterial occlusions as scale transitions. Thus, we show through a biophysical multifractal model that the blocking of the lumen of a healthy artery can occur as a result of the “stopping effect” associated with the differentiable–non-differentiable scale transition. We consider that blood entities move on continuous but non-differentiable (multifractal) curves. We determine the biophysical parameters that characterize the blood flow as a Bingham-type rheological fluid through a normal arterial structure, assimilated to a horizontal “pipe” with circular symmetry. Our model has been validated against experimental clinical data.
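For reference, the Bingham-type rheology invoked for blood in this model has the standard constitutive form below: no flow until the shear stress exceeds a yield stress, and Newtonian-like behavior beyond it.

```latex
% Bingham plastic constitutive relation (yield stress \tau_0, plastic viscosity \mu_p)
\tau = \tau_0 + \mu_p\,\dot{\gamma} \quad \text{for } |\tau| > \tau_0,
\qquad \dot{\gamma} = 0 \quad \text{for } |\tau| \le \tau_0
```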
