Similar Documents
 20 similar documents found
1.
I numerically simulate and compare the entanglement of two quanta using the conventional formulation of quantum mechanics and a time-symmetric formulation that has no collapse postulate. The experimental predictions of the two formulations are identical, but the entanglement predictions are significantly different. The time-symmetric formulation reveals an experimentally testable discrepancy in the original quantum analysis of the Hanbury Brown–Twiss experiment, suggests solutions to some parts of the nonlocality and measurement problems, fixes known time asymmetries in the conventional formulation, and answers Bell’s question “How do you convert an ‘and’ into an ‘or’?”

2.
Based on an analysis and measurement of the overall situation, import and export structure, and international competitiveness of the various sectors of service trade in the Guangdong–Hong Kong–Macao Greater Bay Area, a synergy degree model was established with the help of MATLAB and Grey System Modeling software to quantitatively analyze the synergy level of service trade in the Greater Bay Area, using the grey correlation analysis method and the entropy weight method. The results show that the overall development trend of service trade in the Guangdong–Hong Kong–Macao Greater Bay Area is good. The service trade industries in different regions are highly complementary and strongly correlated. The potential for the coordinated development of internal service trade is excellent, and service trade in the Greater Bay Area as a whole is transitioning from a moderate level of synergy to a high level of synergy. The Greater Bay Area can achieve industrial synergy by accelerating industrial integration and green transformation, establishing a coordinated development mechanism, sharing market platforms, strengthening personnel security, and further enhancing the international competitiveness of service trade. The established model reflects the current coordination of service trade in the Guangdong–Hong Kong–Macao Greater Bay Area well and has good applicability. In the future, more economic, technological, geographic, and policy data can be used to study the spatial pattern, evolution rules, and mechanisms of coordinated development in the broader area.
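As a minimal sketch of the entropy weight step mentioned above, the snippet below assumes a small synthetic indicator matrix (regions × benefit-type service-trade indicators); the paper's actual indicator system, the grey correlation step, and the synergy degree model itself are not reproduced.

```python
import numpy as np

# Hypothetical indicator matrix: rows = regions, columns = service-trade indicators.
X = np.array([
    [0.82, 0.35, 0.61],
    [0.64, 0.52, 0.58],
    [0.91, 0.28, 0.74],
    [0.55, 0.47, 0.66],
])

# Min-max normalisation (benefit-type indicators assumed), then column-wise proportions.
P = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
P = P / P.sum(axis=0)

# Entropy of each indicator (0 * log 0 treated as 0).
n = X.shape[0]
with np.errstate(divide="ignore", invalid="ignore"):
    e = -np.nansum(P * np.log(P), axis=0) / np.log(n)

# Entropy weights: indicators with lower entropy (more dispersion) receive more weight.
w = (1 - e) / (1 - e).sum()
print("entropy weights:", np.round(w, 3))
```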

3.
The inverted Topp–Leone distribution is a new, appealing model for reliability analysis. In this paper, a new distribution, named the new exponential inverted Topp–Leone (NEITL) distribution, is presented; it adds an extra shape parameter to the inverted Topp–Leone distribution. Graphical representations of its density, survival, and hazard rate functions are provided. The following properties are explored: quantile function, mixture representation, entropies, moments, and stress–strength reliability. We plot the skewness and kurtosis measures of the proposed model based on the quantiles. Three different estimation procedures are suggested to estimate the distribution parameters, reliability, and hazard rate functions, along with their confidence intervals. Additionally, stress–strength reliability estimators for the NEITL model are obtained. To illustrate the findings of the paper, two real datasets from the engineering and medical fields are analyzed.

4.
Entropy estimation faces numerous challenges when applied to various real-world problems. Our interest is in divergence and entropy estimation algorithms which are capable of rapid estimation for natural sequence data such as human and synthetic languages. This typically requires a large amount of data; however, we propose a new approach which is based on a new rank-based analytic Zipf–Mandelbrot–Li probabilistic model. Unlike previous approaches, which do not consider the nature of the probability distribution in relation to language; here, we introduce a novel analytic Zipfian model which includes linguistic constraints. This provides more accurate distributions for natural sequences such as natural or synthetic emergent languages. Results are given which indicates the performance of the proposed ZML model. We derive an entropy estimation method which incorporates the linguistic constraint-based Zipf–Mandelbrot–Li into a new non-equiprobable coincidence counting algorithm which is shown to be effective for tasks such as entropy rate estimation with limited data.  相似文献   
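For intuition, here is a hedged illustration of the kind of rank-based model involved: a plain Zipf–Mandelbrot distribution p(k) ∝ (k + q)^(−s) over a finite vocabulary, its exact entropy, and a naive plug-in estimate from a small sample. The paper's ZML model and its coincidence-counting estimator add linguistic constraints that are not reproduced here; the parameter values are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Zipf-Mandelbrot probabilities over a finite vocabulary: p(k) ~ (k + q)^(-s).
V, s, q = 5000, 1.1, 2.7          # vocabulary size and shape parameters (illustrative)
ranks = np.arange(1, V + 1)
p = (ranks + q) ** (-s)
p /= p.sum()

true_H = -np.sum(p * np.log2(p))  # exact entropy of the model (bits)

# Naive plug-in (maximum-likelihood) estimate from a limited sample,
# which is badly biased when the sample is small relative to V.
sample = rng.choice(V, size=2000, p=p)
counts = np.bincount(sample, minlength=V)
phat = counts[counts > 0] / counts.sum()
plugin_H = -np.sum(phat * np.log2(phat))

print(f"model entropy    : {true_H:.3f} bits")
print(f"plug-in (n=2000) : {plugin_H:.3f} bits  (underestimates with limited data)")
```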

5.
Ordinal patterns, which classify real vectors according to the order relations between their components, are an interesting basic concept for determining the complexity of a measure-preserving dynamical system. In particular, as shown by C. Bandt, G. Keller and B. Pompe, the permutation entropy based on the probability distributions of such patterns is equal to the Kolmogorov–Sinai entropy in simple one-dimensional systems. The general reason is that, roughly speaking, the system of ordinal patterns obtained for a real-valued “measuring arrangement” has a high potential for separating orbits. Starting from a slightly different approach of A. Antoniouk, K. Keller and S. Maksymenko, we discuss generalizations of ordinal patterns that provide enough separation to determine the Kolmogorov–Sinai entropy. The idea behind these generalized ordinal patterns is to substitute another binary relation for the basic binary relation ≤ on the real numbers. Generalizing earlier results of I. Stolz and K. Keller, we establish conditions that the binary relation and the dynamical system have to fulfill so that the obtained generalized ordinal patterns can be used for estimating the Kolmogorov–Sinai entropy.
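For reference, a short sketch of the standard (Bandt–Pompe) permutation entropy built from ordinal patterns of order m; the generalized ordinal patterns discussed in the paper replace the underlying relation ≤, but the pattern-counting step is analogous. The order m = 4 and the test signals are arbitrary choices.

```python
import numpy as np
from collections import Counter
from math import factorial

def permutation_entropy(x, m=3, delay=1):
    """Normalized permutation entropy of a 1-D series using ordinal patterns of order m."""
    x = np.asarray(x)
    patterns = [tuple(np.argsort(x[i:i + m * delay:delay]))
                for i in range(len(x) - (m - 1) * delay)]
    counts = np.array(list(Counter(patterns).values()), dtype=float)
    probs = counts / counts.sum()
    H = -np.sum(probs * np.log(probs))
    return H / np.log(factorial(m))   # normalize to [0, 1]

rng = np.random.default_rng(1)
print(permutation_entropy(rng.normal(size=5000), m=4))           # close to 1 for white noise
print(permutation_entropy(np.sin(0.05 * np.arange(5000)), m=4))  # much lower for a regular signal
```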

6.
Unemployment has risen as the economy has shrunk. The coronavirus crisis has affected many sectors in Romania, with some companies reducing or even ceasing their activity. Forecasting the unemployment rate is of fundamental importance for future social policy strategies. The aim of the paper is to comparatively analyze the forecast performance of different univariate time series methods with the purpose of providing future predictions of the unemployment rate. To do so, several forecasting models (seasonal autoregressive integrated moving average (SARIMA), self-exciting threshold autoregressive (SETAR), Holt–Winters, ETS (error, trend, seasonal), and NNAR (neural network autoregression)) were applied, and their forecast performance was evaluated both on the in-sample data covering January 2000–December 2017, used for model identification and estimation, and on the out-of-sample data covering the last three years, 2018–2020. The unemployment rate is then forecast for the next two years, 2021–2022. Based on the in-sample forecast assessment of the different methods, the forecast measures root mean squared error (RMSE), mean absolute error (MAE), and mean absolute percent error (MAPE) suggested that the multiplicative Holt–Winters model outperforms the other models. For the out-of-sample forecasting performance, the RMSE and MAE values revealed that the NNAR model forecasts better, while according to MAPE, the SARIMA model registers higher forecast accuracy. The empirical results of the Diebold–Mariano test at a forecast horizon of one for the out-of-sample methods revealed differences in forecasting performance between SARIMA and NNAR, with the NNAR model considered the best for modeling and forecasting the unemployment rate.
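A minimal Python sketch of the kind of out-of-sample comparison described above, using statsmodels on a synthetic monthly series with trend and seasonality; the Romanian unemployment data, the model orders actually selected, and the SETAR/ETS/NNAR models are not reproduced here, and the paper's own software stack may differ.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic monthly "unemployment-like" series, Jan 2000 - Dec 2020 (illustrative only).
rng = np.random.default_rng(0)
idx = pd.date_range("2000-01-01", periods=252, freq="MS")
y = pd.Series(6 + 0.005 * np.arange(252)
              + 0.6 * np.sin(2 * np.pi * np.arange(252) / 12)
              + rng.normal(0, 0.15, 252), index=idx)

train, test = y[:-36], y[-36:]          # hold out the last three years

hw = ExponentialSmoothing(train, trend="add", seasonal="mul", seasonal_periods=12).fit()
sarima = SARIMAX(train, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12)).fit(disp=False)

def accuracy(actual, forecast):
    err = actual - forecast
    return {"RMSE": np.sqrt(np.mean(err ** 2)),
            "MAE": np.mean(np.abs(err)),
            "MAPE": 100 * np.mean(np.abs(err / actual))}

print("Holt-Winters:", accuracy(test, hw.forecast(36)))
print("SARIMA      :", accuracy(test, sarima.forecast(36)))
```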

7.
Although the sizes of business firms have been a subject of intensive research, the definition of a “size” of a firm remains unclear. In this study, we empirically characterize in detail the scaling relations between size measures of business firms, analyzing them based on allometric scaling. Using a large dataset of Japanese firms that tracked approximately one million firms annually for two decades (1994–2015), we examined up to trivariate relations between corporate size measures: annual sales, capital stock, total assets, and numbers of employees and trading partners. The data were examined using a multivariate generalization of a previously proposed method for analyzing bivariate scalings. We found that relations between measures other than the capital stock are marked by allometric scaling relations. Power-law exponents for the scalings and distributions of multiple firm size measures were mostly robust throughout the years but had fluctuations that appeared to correlate with national economic conditions. We established theoretical relations between the exponents. We expect these results to allow direct estimation of the effects of using alternative size measures of business firms in regression analyses, to facilitate the modeling of firms, and to enhance the current theoretical understanding of complex systems.
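A hedged sketch of a bivariate allometric scaling fit, i.e., estimating a power-law exponent from a log–log regression on synthetic firm data; the paper's multivariate generalization and the Japanese firm dataset are not reproduced, and the generating exponent below is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic firms: sales roughly scale as employees**beta with log-normal scatter (illustrative).
employees = rng.lognormal(mean=3.0, sigma=1.2, size=10_000)
true_beta = 1.15
sales = 50 * employees ** true_beta * rng.lognormal(0.0, 0.4, size=10_000)

# Allometric (power-law) scaling: log(sales) = log(a) + beta * log(employees).
logx, logy = np.log(employees), np.log(sales)
beta, log_a = np.polyfit(logx, logy, 1)

print(f"estimated scaling exponent beta ~ {beta:.3f} (generating value {true_beta})")
```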

8.
This paper analyses the complexity of electroencephalogram (EEG) signals at different temporal scales for the analysis and classification of focal and non-focal EEG signals. Features from an original multiscale permutation Lempel–Ziv complexity measure (MPLZC) were obtained. The MPLZC measure combines a multiscale structure, ordinal analysis, and permutation Lempel–Ziv complexity to quantify the dynamic changes of an electroencephalogram (EEG). We also show the dependency of MPLZC on several straightforward signal processing concepts that appear in biomedical EEG activity, via a set of synthetic signals. The main material of the study consists of EEG signals obtained from the Bern–Barcelona EEG database. The signals were divided into two groups: focal EEG signals (n = 100) and non-focal EEG signals (n = 100); statistical analysis was performed by means of the non-parametric Mann–Whitney test. The mean values of MPLZC in the non-focal group are significantly higher than those in the focal group for scales above 1 (p < 0.05), indicating that the non-focal EEG signals are more complex. MPLZC feature sets are used with a least squares support vector machine (LS-SVM) classifier to classify focal and non-focal EEG signals. Our experimental results confirmed the usefulness of the MPLZC method for distinguishing focal and non-focal EEG signals, with a classification accuracy of 86%.
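A loose sketch combining the three ingredients named above, coarse-graining, ordinal symbolization, and a Lempel–Ziv phrase count (an LZ78-style parsing is used here for simplicity); this is not the paper's MPLZC definition, and the test signals are synthetic stand-ins rather than EEG.

```python
import numpy as np

def lz_phrase_count(symbols):
    """Number of phrases in an LZ78-style incremental parsing of a symbol sequence."""
    phrases, current = set(), ()
    for s in symbols:
        current = current + (s,)
        if current not in phrases:
            phrases.add(current)
            current = ()
    return len(phrases) + (1 if current else 0)

def ordinal_symbols(x, m=3):
    """Map each length-m window to its ordinal (permutation) pattern."""
    return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

def coarse_grain(x, scale):
    """Non-overlapping averaging used in multiscale analyses."""
    n = len(x) // scale
    return np.asarray(x[:n * scale]).reshape(n, scale).mean(axis=1)

rng = np.random.default_rng(3)
noise = rng.normal(size=4000)
regular = np.sin(0.1 * np.arange(4000))

for scale in (1, 2, 4):
    c_noise = lz_phrase_count(ordinal_symbols(coarse_grain(noise, scale)))
    c_reg = lz_phrase_count(ordinal_symbols(coarse_grain(regular, scale)))
    print(f"scale {scale}: noise = {c_noise}, regular sine = {c_reg}")
```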

9.
With the aim of improving the reconstruction of stochastic evolution equations from empirical time-series data, we derive a full representation of the generator of the Kramers–Moyal operator via a power-series expansion of the exponential operator. This expansion is necessary for deriving the different terms in a stochastic differential equation. With the full representation of this operator, we are able to separate finite-time corrections of the power-series expansion of arbitrary order into terms with and without derivatives of the Kramers–Moyal coefficients. We arrive at a closed-form solution expressed through conditional moments, which can be extracted directly from time-series data with a finite sampling interval. We provide all finite-time correction terms for parametric and non-parametric estimation of the Kramers–Moyal coefficients for discontinuous processes, which can be easily implemented (employing Bell polynomials) in time-series analyses of stochastic processes. With exemplary cases of insufficiently sampled diffusion and jump-diffusion processes, we demonstrate the advantages of our arbitrary-order finite-time corrections and their impact on distinguishing diffusion and jump-diffusion processes strictly from time-series data.
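A bare-bones sketch of the lowest-order conditional-moment estimators behind Kramers–Moyal analysis, applied to a synthetic Ornstein–Uhlenbeck path; the paper's actual contribution, arbitrary-order finite-time corrections via Bell polynomials, is not included, and all parameter values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate an Ornstein-Uhlenbeck process dX = -theta*X dt + sigma dW (Euler-Maruyama).
theta, sigma, dt, n = 1.0, 0.5, 0.01, 200_000
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):
    x[i] = x[i - 1] - theta * x[i - 1] * dt + sigma * np.sqrt(dt) * rng.normal()

# Lowest-order Kramers-Moyal coefficients from conditional moments of the increments:
#   D1(x) ~ <dX | X=x> / dt,   D2(x) ~ <dX^2 | X=x> / (2 dt)   (no finite-time correction).
dx = np.diff(x)
bins = np.linspace(-1.0, 1.0, 21)
centers = 0.5 * (bins[:-1] + bins[1:])
idx = np.digitize(x[:-1], bins) - 1

D1 = np.full(len(centers), np.nan)
D2 = np.full(len(centers), np.nan)
for b in range(len(centers)):
    sel = idx == b
    if sel.sum() > 50:
        D1[b] = dx[sel].mean() / dt
        D2[b] = (dx[sel] ** 2).mean() / (2 * dt)

ok = ~np.isnan(D1)
print("drift slope   ~", np.polyfit(centers[ok], D1[ok], 1)[0], "(expect about", -theta, ")")
print("mean diffusion~", np.nanmean(D2), "(expect about", sigma ** 2 / 2, ")")
```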

10.
Based on elastic mechanics, fluid–structure coupling theory and the finite element method, a high-speed railway wheel-rail rolling-aerodynamic noise model is established to realize the combined simulation and prediction of the vibrations, rolling noise and aerodynamic noise in wheel-rail systems. Field test data from the Beijing–Shenyang line are used to verify the model's reliability. In addition, the directivity of each sound source at different frequencies is analyzed, and based on this analysis, noise reduction measures are proposed. At a low frequency of 300 Hz, the wheel-rail area mainly contributes to the aerodynamic noise, and as the frequency increases, the wheel-rail rolling noise becomes dominant. When the frequency is less than 1000 Hz, the radiated noise fluctuates around the cylindrical surface, and the directivity of the sound is ambiguous. When the frequency is in the middle- and high-frequency bands, exceeding 1000 Hz, both the rolling and total noise exhibit notable directivity in the directions of 20–30° and 70–90°, and thus noise reduction measures can be implemented in these directions.

11.
In this paper, we introduce new divergences, called the Jensen–Sharma–Mittal and Jeffreys–Sharma–Mittal divergences, in relation to convex functions. Theorems giving lower and upper bounds for the two newly introduced divergences are provided. The obtained results imply some new inequalities corresponding to known divergences. Examples showing that these are generalizations of the Rényi, Tsallis, and Kullback–Leibler types of divergences are provided in order to illustrate a few applications of the new divergences.
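As background only, a small numeric sketch of the two-parameter Sharma–Mittal relative entropy that such constructions build on, checking that it approaches the Tsallis, Rényi, and Kullback–Leibler divergences in the appropriate parameter limits; the exact Jensen- and Jeffreys-type definitions introduced in the paper are not reproduced here and may differ.

```python
import numpy as np

def sharma_mittal_divergence(p, q, alpha, beta):
    """Two-parameter Sharma-Mittal relative entropy D_{alpha,beta}(p||q) for discrete p, q."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    s = np.sum(p ** alpha * q ** (1 - alpha))
    return (s ** ((1 - beta) / (1 - alpha)) - 1) / (beta - 1)

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

kl = np.sum(p * np.log(p / q))
renyi = lambda a: np.log(np.sum(p ** a * q ** (1 - a))) / (a - 1)
tsallis = lambda a: (np.sum(p ** a * q ** (1 - a)) - 1) / (a - 1)

print("beta -> alpha recovers Tsallis:", sharma_mittal_divergence(p, q, 1.5, 1.5 + 1e-9), tsallis(1.5))
print("beta -> 1 recovers Renyi      :", sharma_mittal_divergence(p, q, 1.5, 1 + 1e-9), renyi(1.5))
print("alpha, beta -> 1 recovers KL  :", sharma_mittal_divergence(p, q, 1 + 1e-6, 1 + 1e-6), kl)
```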

12.
13.
With the intensification of people's production and living activities, the systemic risks of water, energy and food in the Yangtze River Basin have become increasingly prominent and have become a bottleneck for the social, economic and ecological sustainable development of the basin. Therefore, studying the symbiotic coordination between water, energy and food is of great significance for promoting regional sustainable development. First, from the perspective of water–energy–food symbiosis, with the water–energy–food ecosystem conceptual model as the nexus, a two-step measurement model of the symbiotic index and the symbiotic level index is used to study the water–energy–food symbiosis of the Yangtze River. Then, we use the BP-DEMATEL-GTCW model to identify the key influencing factors that affect the symbiotic security of the water–energy–food ecosystem. We find that the average symbiotic degree of the water–energy–food ecosystems of the 11 provinces or municipalities in the Yangtze River Basin only reaches the risk grade. The identification of key influencing factors also shows that indicators related to the energy microsystem have a greater impact on the symbiotic development of the entire WEF ecosystem. Therefore, special attention needs to be paid to increasing energy sources and reducing expenditure. Relevant departments need to effectively develop primary energy production and expand energy-saving investment through multiple channels to increase energy self-sufficiency and ultimately promote the coordinated and effective development of water, energy and food in the Yangtze River Basin.

14.
This paper presents a difference-type lower bound for the Bayes risk as a difference-type extension of the Borovkov–Sakhanenko bound. The resulting bound asymptotically improves the Bobrovsky–Mayor–Wolf–Zakai bound, which is a difference-type extension of the Van Trees bound. Some examples are also given.

15.
Finding the proper entropy-like Lyapunov functional associated with the inelastic Boltzmann equation for an isolated, freely cooling granular gas is still an unsolved challenge. The hypotheses of the original H-theorem do not apply here, and the H-functional presents some additional measure problems that are solved by using the Kullback–Leibler divergence (KLD) of a reference velocity distribution function from the actual distribution. The right choice of the reference distribution in the KLD is crucial for the latter to qualify or not as a Lyapunov functional, the asymptotic “homogeneous cooling state” (HCS) distribution being a potential candidate. Given the lack of a formal proof far from the quasielastic limit, the aim of this work is to support this conjecture with molecular dynamics simulations of inelastic hard disks and spheres over a wide range of values of the coefficient of restitution (α) and for different initial conditions. Our results reject the Maxwellian distribution as a possible reference, whereas they reinforce the HCS one. Moreover, the KLD is used to measure the amount of information lost when using the former rather than the latter, revealing a non-monotonic dependence on α.
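A simplified sketch of the kind of discretized KLD estimate involved: the divergence of an empirical one-component velocity distribution from a Maxwellian (Gaussian) reference at the same temperature, computed from histogrammed samples. The actual analysis uses the full HCS distribution and molecular dynamics velocity data, which are not reproduced; the slightly heavy-tailed Student-t sample below is only a stand-in.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

# Stand-in "measured" velocity components: slightly non-Gaussian (heavier tails than a Maxwellian).
v = rng.standard_t(df=10, size=100_000)
v /= v.std()                                   # rescale to unit granular temperature

# Discretized KLD of the empirical distribution f from a Maxwellian reference of equal temperature.
bins = np.linspace(-6, 6, 121)
hist, edges = np.histogram(v, bins=bins, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
ref = norm.pdf(centers)                        # Maxwellian (Gaussian) reference
width = np.diff(edges)

mask = hist > 0
kld = np.sum(width[mask] * hist[mask] * np.log(hist[mask] / ref[mask]))
print(f"KLD(f || Maxwellian) ~ {kld:.4f}")
```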

16.
For dipolar-coupled spin systems in thermodynamic equilibrium under a Dzyaloshinskii–Moriya (D–M) interaction along the z-axis, the current study explores the quantum-memory-assisted entropic uncertainty relation (QMA-EUR), the entropy mixedness and the concurrence of the two-spin entanglement. Quantum entanglement is reduced at higher temperatures, whereas the entropic uncertainty and mixedness are enhanced. The considered quantum effects stabilize to their stationary values at high temperatures. The two-spin entanglement is entirely suppressed if the D–M interaction is disregarded, and the entropic uncertainty and entropy mixedness reach their maximum values for equal coupling rates. Rather than the concurrence, the entropy mixedness can be a proper indicator of the nature of the entropic uncertainty. The effect of the model parameters (D–M coupling and dipole–dipole spin coupling) on these quantum effects at thermal environment temperatures is explored. The results reveal that the model parameters cause significant variations in the predicted QMA-EUR.
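As a small illustration of one quantity tracked above, here is a sketch of the Wootters concurrence of a two-qubit density matrix; the thermal state of the dipolar Hamiltonian with D–M interaction is not constructed here, and a Werner-like mixture is used as a stand-in state.

```python
import numpy as np

def concurrence(rho):
    """Wootters concurrence of a two-qubit density matrix rho (4x4, trace 1)."""
    sy = np.array([[0, -1j], [1j, 0]])
    R = rho @ np.kron(sy, sy) @ rho.conj() @ np.kron(sy, sy)
    lam = np.sqrt(np.abs(np.sort(np.linalg.eigvals(R).real)[::-1]))
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# Stand-in state: Werner mixture of the singlet with the maximally mixed state.
psi = np.array([0, 1, -1, 0]) / np.sqrt(2)            # singlet (|01> - |10>)/sqrt(2)
singlet = np.outer(psi, psi)
for p in (0.2, 0.5, 0.8):
    rho = p * singlet + (1 - p) * np.eye(4) / 4
    print(f"p = {p}: concurrence = {concurrence(rho):.3f}")   # entangled only for p > 1/3
```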

17.
This paper presents a thorough exergy analysis of the most important reactions in soil–plant interactions. Soil, which is a prime mover of gases, metals, structural crystals, and electrolytes, constantly resembles an electric field of charge and discharge. The second law of thermodynamics reflects the deterioration of resources through the destruction of exergy. In this study, we developed a new method to assess the exergy of soil and plant formation processes. Depending on the type of soil, one may assess the efficiency and degradation of resources by incorporating or using biomass storage. According to the results of this study, across the different processes from mineralization to nutrient uptake by the plant, about 62.5% of the input exergy is destroyed because of the soil solution reactions. Most of the exergy destruction occurs in the biota–atmosphere subsystem, especially in the photosynthesis reaction, due to its low efficiency (about 15%). The humus and protonation reactions, with 14% and 13% exergy destruction, respectively, are the most exergy-destroying reactions. Respiratory, weathering, and reverse weathering reactions account for the lowest share, less than one percent of the total exergy destruction in the soil system. The total exergy yield of the soil system is estimated at about 37.45%.

18.
Entropy measures the uncertainty associated with a random variable. It has important applications in cybernetics, probability theory, astrophysics, the life sciences and other fields. Recently, many authors have focused on the estimation of entropy for different lifetime distributions. However, the estimation of entropy for the generalized Bilal (GB) distribution has not yet been addressed. In this paper, we consider the estimation of the entropy and the parameters of the GB distribution based on adaptive Type-II progressive hybrid censored data. Maximum likelihood estimates of the entropy and the parameters are obtained using the Newton–Raphson iteration method. Bayesian estimates under different loss functions are provided with the help of Lindley's approximation. The approximate confidence intervals and the Bayesian credible intervals of the parameters and entropy are obtained using the delta and Markov chain Monte Carlo (MCMC) methods, respectively. Monte Carlo simulation studies are carried out to assess the performance of the different point and interval estimates. Finally, a real data set is analyzed for illustrative purposes.

19.
In this paper, an approach for decision rule construction is proposed. It is studied from the point of view of the supervised machine learning task, i.e., classification, and from the point of view of knowledge representation. The generated rules provide classification results comparable to the dynamic programming approach for optimization of decision rules relative to length or support. However, the proposed algorithm is based on a transformation of the decision table into the entity–attribute–value (EAV) format. Additionally, a standard deviation function computed over the average values of attributes within particular decision classes was introduced. It allows selecting, from the whole set of attributes, only those which provide the highest degree of information about the decision. Construction of decision rules is performed based on the idea of partitioning a decision table into corresponding subtables. In contrast to the dynamic programming approach, not all attributes need to be taken into account, but only those with the highest values of standard deviation per decision class. Consequently, the proposed solution is more time efficient because of its lower computational complexity. In the experiments, the support and length of decision rules were computed and compared with the values for optimal rules. The classification error for data sets from the UCI Machine Learning Repository was also obtained and compared with that of the dynamic programming approach. The experiments show that the constructed rules are not far from the optimal ones and that the classification results are comparable to those obtained with the dynamic programming extension.
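A loose sketch of the attribute-selection idea described above, assuming a hypothetical pandas decision table whose last column is the decision: for each conditional attribute, average its values within each decision class and rank attributes by the standard deviation of those class means. The EAV transformation and the rule-construction step over subtables are not shown.

```python
import pandas as pd

# Hypothetical decision table: conditional attributes a1..a3 and a decision column.
table = pd.DataFrame({
    "a1": [1, 2, 1, 3, 2, 3, 1, 2],
    "a2": [5, 5, 4, 4, 5, 4, 5, 4],
    "a3": [0, 1, 0, 1, 1, 0, 0, 1],
    "decision": ["yes", "no", "yes", "no", "no", "no", "yes", "yes"],
})

# For every attribute, compute its mean value in each decision class,
# then score the attribute by the standard deviation of those class means.
class_means = table.groupby("decision").mean(numeric_only=True)
scores = class_means.std(axis=0).sort_values(ascending=False)

print(scores)                 # attributes with the highest spread across classes come first
selected = scores.index[:2]   # keep, e.g., the top two attributes for rule construction
print("selected attributes:", list(selected))
```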

20.
The Kullback–Leibler divergence KL(p,q) is the standard measure of error when a true probability distribution p is approximated by a probability distribution q. Its efficient computation is essential in many tasks, such as approximate computation or measuring the error when learning a probability distribution. For high-dimensional probability distributions, such as those associated with Bayesian networks, direct computation can be infeasible. This paper considers the case of efficiently computing the Kullback–Leibler divergence of two probability distributions, each coming from a different Bayesian network, possibly with different structures. The approach is based on an auxiliary deletion algorithm that computes the necessary marginal distributions, using a cache of operations with potentials in order to reuse past computations whenever necessary. The algorithms are tested with Bayesian networks from the bnlearn repository. Computer code in Python is provided, based on pgmpy, a library for working with probabilistic graphical models.
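For intuition only, a brute-force sketch that enumerates the joint distributions of two tiny discrete Bayesian networks with different structures and computes KL(p,q) directly with numpy; the paper's algorithm instead uses a deletion (variable-elimination) scheme with a cache of potential operations precisely so that this enumeration never has to be done, and its pgmpy-based code is not reproduced here. The networks and probability tables below are made up.

```python
import numpy as np
from itertools import product

# Network P: A -> B (both binary).
pA = np.array([0.6, 0.4])
pB_given_A = np.array([[0.7, 0.3],     # P(B | A=0)
                       [0.2, 0.8]])    # P(B | A=1)

# Network Q: A and B independent (a different structure over the same variables).
qA = np.array([0.5, 0.5])
qB = np.array([0.45, 0.55])

def kl(p_joint, q_joint):
    """KL(p || q) over the full joint, in nats."""
    mask = p_joint > 0
    return float(np.sum(p_joint[mask] * np.log(p_joint[mask] / q_joint[mask])))

# Brute-force enumeration of the joints (only feasible for very small networks).
p_joint = np.array([pA[a] * pB_given_A[a, b] for a, b in product(range(2), repeat=2)])
q_joint = np.array([qA[a] * qB[b] for a, b in product(range(2), repeat=2)])

print(f"KL(P || Q) = {kl(p_joint, q_joint):.4f} nats")
```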
