Similar Documents
20 similar documents found (search time: 390 ms)
1.
In this article, the “truncated-composed” scheme was applied to the Burr X distribution to motivate a new family of univariate continuous-type distributions, called the truncated Burr X generated family. It is mathematically simple and provides more modeling freedom for any parent distribution. Additional functionality is conferred on the probability density and hazard rate functions, improving their peak, asymmetry, tail, and flatness levels. These characteristics are represented analytically and graphically for three special distributions of the family derived from the exponential, Rayleigh, and Lindley distributions. Subsequently, we conducted asymptotic, first-order stochastic dominance, series expansion, Tsallis entropy, and moment studies. Useful risk measures were also investigated. The remainder of the study was devoted to the statistical use of the associated models. In particular, we developed an adapted maximum likelihood methodology to efficiently estimate the model parameters. The special distribution extending the exponential distribution was applied as a statistical model to fit two sets of actuarial and financial data, where it performed better than a wide variety of selected competing non-nested models. Numerical applications for risk measures are also given.
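As an illustration of the “truncated-composed” construction, the sketch below builds the family’s CDF from a one-parameter Burr X CDF right-truncated to (0, 1) and composed with an exponential parent. The truncation scheme, parameterization, and function names (`burr_x_cdf`, `tbx_exp_cdf`) are assumptions for illustration; the paper’s exact scheme may differ.

```python
import numpy as np
from scipy.integrate import quad

def burr_x_cdf(u, theta):
    """One-parameter Burr X CDF: F(u) = (1 - exp(-u**2))**theta."""
    return (1.0 - np.exp(-u**2))**theta

def tbx_exp_cdf(x, theta, alpha):
    """Truncated Burr X-exponential CDF: Burr X right-truncated to (0, 1),
    composed with the exponential parent CDF G(x) = 1 - exp(-alpha*x)."""
    g = 1.0 - np.exp(-alpha * x)
    return burr_x_cdf(g, theta) / burr_x_cdf(1.0, theta)

def tbx_exp_pdf(x, theta, alpha, eps=1e-6):
    """PDF by central differences (the analytic form is omitted for brevity)."""
    return (tbx_exp_cdf(x + eps, theta, alpha)
            - tbx_exp_cdf(x - eps, theta, alpha)) / (2 * eps)

# sanity check: the density should integrate to ~1
total, _ = quad(tbx_exp_pdf, 0.0, 50.0, args=(0.8, 1.5))
print(f"integral of pdf over (0, 50) ~ {total:.4f}")
```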

2.
This paper addresses the problem of frequency stability prediction (FSP) following active power disturbances in power systems by proposing a vision transformer (ViT) method that predicts frequency stability in real time. The core idea of the FSP approach employing the ViT is to use time-series data of power system operations as ViT inputs to perform FSP accurately and quickly, so that operators can decide on frequency control actions, minimizing the losses caused by incidents. Additionally, because of the high-dimensional and redundant input data of the power system and the O(N²) computational complexity of the transformer, feature selection based on copula entropy (CE) is used to construct image-like data with fixed dimensions from power system operation data and to remove redundant information. Moreover, no previous FSP study has taken safety margins into consideration, an omission that may threaten the secure operation of power systems. Therefore, a frequency security index (FSI) is used to form the sample labels, which are categorized as “insecurity”, “relative security”, and “absolute security”. Finally, various case studies are carried out on a modified New England 39-bus system and a modified ACTIVSg500 system for projected nonsynchronous system penetration levels of 0% to 40%. The simulation results demonstrate that the proposed method achieves state-of-the-art (SOTA) performance on normal, noisy, and incomplete datasets in comparison with eight machine-learning methods.
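A minimal sketch of copula-entropy-based feature screening, assuming the standard identity that the copula entropy of (x, y) equals the negative mutual information of their rank (empirical copula) transforms; the MI estimator, the selection threshold, and the assembly of image-like ViT inputs are stand-ins, not the paper’s implementation.

```python
import numpy as np
from scipy.stats import rankdata
from sklearn.feature_selection import mutual_info_regression

def copula_entropy_scores(X, y):
    """Score each feature by copula entropy with the target, using the
    identity CE(x, y) = -MI(x, y) on rank-transformed (copula) data."""
    n, d = X.shape
    U = np.column_stack([rankdata(X[:, j]) / n for j in range(d)])
    v = rankdata(y) / n
    mi = mutual_info_regression(U, v, random_state=0)
    return -mi   # lower (more negative) = stronger dependence

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = 2 * X[:, 0] - X[:, 3] + 0.1 * rng.normal(size=500)

ce = copula_entropy_scores(X, y)
keep = np.argsort(ce)[:2]   # keep the 2 most informative features
print("selected features:", keep, "CE scores:", np.round(ce, 3))
```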

3.
Integrated information has recently been suggested as a possible measure for identifying a necessary condition for a system to display conscious features. Recently, we showed that astrocytes contribute to the generation of integrated information through the complex behavior of neuron–astrocyte networks. Still, it remained unclear which underlying mechanisms governing the complex behavior of a neuron–astrocyte network are essential to generating positive integrated information. This study presents an analytic treatment of this question based on exact and asymptotic expressions for integrated information in terms of exactly known probability distributions for a reduced mathematical model (a discrete-time, discrete-state stochastic model) reflecting the main features of the “spiking–bursting” dynamics of a neuron–astrocyte network. The analysis was performed in terms of the empirical “whole minus sum” version of integrated information in comparison with the “decoder based” version. The “whole minus sum” information may change sign, and an interpretation of this transition in terms of “net synergy” is available in the literature; this motivated our particular interest in the sign of the “whole minus sum” information in our analytical considerations. The behaviors of the “whole minus sum” and “decoder based” information measures turn out to be broadly similar: they converge to each other asymptotically as time-uncorrelated activity increases, and the sign transition of the “whole minus sum” information is associated with a rapid growth in the “decoder based” information. The study aims to create a theoretical framework for using the spiking–bursting model as an analytically tractable reference point for applying integrated information concepts to systems exhibiting similar bursting behavior. The model may also be of interest as a new discrete-state test bench for different formulations of integrated information.
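A minimal sketch of the empirical “whole minus sum” measure on a toy two-unit binary Markov system (not the spiking–bursting model itself): the whole-system past–future mutual information minus the sum of the parts’ mutual informations. The transition matrix below is arbitrary.

```python
import numpy as np

def mi_from_joint(p):
    """Mutual information (bits) from a joint distribution p[x, y]."""
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

# Toy 2-unit binary system: states 0..3 encode (bit1, bit0).
rng = np.random.default_rng(1)
T = rng.random((4, 4)); T /= T.sum(1, keepdims=True)   # transition matrix
p0 = np.full(4, 0.25)                                  # uniform past state
joint = p0[:, None] * T                                # p(x_t, x_{t+1})

def part_joint(joint, bit):
    """Past-future joint distribution of a single bit."""
    q = np.zeros((2, 2))
    for x in range(4):
        for y in range(4):
            q[(x >> bit) & 1, (y >> bit) & 1] += joint[x, y]
    return q

I_whole = mi_from_joint(joint)
I_parts = sum(mi_from_joint(part_joint(joint, b)) for b in (0, 1))
print(f"whole-minus-sum = {I_whole - I_parts:.4f} bits "
      "(a negative value indicates net redundancy)")
```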

4.
Entropic dynamics is a framework in which the laws of dynamics are derived as an application of entropic methods of inference. Its successes include the derivation of quantum mechanics and quantum field theory from probabilistic principles. Here, we develop the entropic dynamics of a system whose state is described by a probability distribution. The dynamics thus unfolds on a statistical manifold that is automatically endowed with a metric structure by information geometry, and the curvature of the manifold has a significant influence. We focus our dynamics on the statistical manifold of Gibbs distributions (also known as canonical distributions, or the exponential family). The model includes an “entropic” notion of time that is tailored to the system under study; the system is its own clock. As one might expect, entropic time is intrinsically directional; there is a natural arrow of time driven by entropic considerations. As illustrative examples, we discuss dynamics on a space of Gaussians and the discrete three-state system.
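A short numerical check of the information-geometric structure mentioned above: the Fisher–Rao metric on the manifold of Gaussians N(mu, sigma²), estimated by Monte Carlo from the score vectors and compared with the known analytic form diag(1/σ², 2/σ²).

```python
import numpy as np

def fisher_metric_gaussian(mu, sigma, n_samples=200_000, seed=0):
    """Monte Carlo estimate of the Fisher information metric on the
    statistical manifold of Gaussians N(mu, sigma^2)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(mu, sigma, n_samples)
    # score vectors: d log p / d mu and d log p / d sigma
    s_mu = (x - mu) / sigma**2
    s_sig = ((x - mu)**2 - sigma**2) / sigma**3
    S = np.stack([s_mu, s_sig])
    return S @ S.T / n_samples

g = fisher_metric_gaussian(0.0, 2.0)
print("estimated metric:\n", np.round(g, 4))
print("analytic metric: diag(1/sigma^2, 2/sigma^2) =", [0.25, 0.5])
```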

5.
To estimate the amount of evapotranspiration in a river basin, the “short period water balance method” was formulated. Then, by introducing the “complementary relationship method”, the amount of evapotranspiration was estimated seasonally, and with reasonable accuracy, for both small and large areas. Moreover, to accurately estimate river discharge in the low-water season, the “weighted statistical unit hydrograph method” was proposed and a procedure for calculating the unit hydrograph was developed. In addition, a new model based on the “equivalent roughness method” was successfully developed for estimating flood runoff from newly reclaimed farmlands. Based on the results of this research, a “composite reservoir model” was formulated to analyze the repeated use of irrigation water over large spatial areas. The application of this model to a number of watershed areas provided useful information on the realities of water demand–supply systems in watersheds predominantly dedicated to paddy fields in Japan.
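A minimal sketch of a per-period water balance, ET = P − Q − ΔS, with hypothetical numbers; the paper’s “short period water balance method” likely carries additional terms (e.g., irrigation inflows) and a specific period definition.

```python
# Simplified short-period water balance: ET = P - Q - dS per period,
# all terms expressed as mm of water depth over the basin (hypothetical data).
periods = ["May", "Jun", "Jul", "Aug"]
P  = [120.0, 180.0, 210.0, 150.0]   # precipitation (mm)
Q  = [60.0,  90.0,  110.0,  80.0]   # river discharge (mm equivalent)
dS = [10.0,  15.0,   5.0,  -20.0]   # change in basin storage (mm)

for t, p, q, ds in zip(periods, P, Q, dS):
    et = p - q - ds
    print(f"{t}: ET = {p} - {q} - ({ds}) = {et:.1f} mm")
```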

6.
Finding the critical factors and possible “Newton’s laws” of financial markets has long been an important issue. However, with the development of information and communication technologies, financial models have become more realistic but also more complex, at odds with the maxim that “the greatest truths are the simplest.” This paper therefore presents an evolutionary model independent of micro features and attempts to identify the most critical factor. In the model, information is the only critical factor, and the stock price emerges from collective behavior. The statistical properties of the model are significantly similar to those of the real market. The model also explains the correlations of stocks within an industry, providing a new approach for studying critical factors and core structures in financial markets.

7.
In this article, a new one-parameter survival model is proposed using the Kavya–Manoharan (KM) transformation family and the inverse length-biased exponential (ILBE) distribution. Statistical properties are obtained: quantiles, moments, incomplete moments, and the moment generating function. Different types of entropy, such as Rényi entropy, Tsallis entropy, Havrda–Charvát entropy, and Arimoto entropy, are computed, as are different measures of extropy, such as extropy, cumulative residual extropy, and negative cumulative residual extropy. When the lifetime of the item under use is assumed to follow the Kavya–Manoharan inverse length-biased exponential (KMILBE) distribution, progressive-stress accelerated life tests are considered. Several estimation approaches, such as maximum likelihood, maximum product of spacings, least squares, and weighted least squares estimation, are considered under progressive type-II censoring. Furthermore, interval estimation is accomplished by determining approximate confidence intervals for the parameters. The performance of the estimation approaches is investigated using Monte Carlo simulation. The relevance and flexibility of the model are demonstrated using two real datasets. The distribution is very flexible, and it outperforms many known distributions such as the inverse length-biased exponential, the inverse Lindley, the Lindley, the inverse exponential, the sine inverse exponential, and the sine inverse Rayleigh models.
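A hedged sketch of the KM construction applied to the ILBE baseline, assuming the commonly used forms F(x) = e/(e−1)·(1 − e^{−G(x)}) for the KM transform and G(x) = (1 + λ/x)·e^{−λ/x} for the ILBE CDF; both parameterizations should be checked against the paper. The example simulates data and recovers λ by maximum likelihood.

```python
import numpy as np
from scipy.optimize import minimize_scalar

E = np.e

def ilbe_cdf(x, lam):
    """ILBE CDF under one common parameterization."""
    return (1.0 + lam / x) * np.exp(-lam / x)

def kmilbe_pdf(x, lam):
    """KM transform: f(x) = e/(e-1) * g(x) * exp(-G(x)), g the ILBE pdf."""
    g = lam**2 * x**-3.0 * np.exp(-lam / x)
    return E / (E - 1.0) * g * np.exp(-ilbe_cdf(x, lam))

def kmilbe_cdf(x, lam):
    return E / (E - 1.0) * (1.0 - np.exp(-ilbe_cdf(x, lam)))

# inverse-transform sampling on a grid, then maximum likelihood estimation
rng = np.random.default_rng(2)
grid = np.linspace(0.01, 200.0, 50_000)
idx = np.searchsorted(kmilbe_cdf(grid, 2.0), rng.random(3_000))
sample = grid[np.clip(idx, 0, len(grid) - 1)]

nll = lambda lam: -np.sum(np.log(kmilbe_pdf(sample, lam)))
lam_hat = minimize_scalar(nll, bounds=(0.5, 10.0), method="bounded").x
print(f"true lambda = 2.0, MLE ~ {lam_hat:.3f}")
```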

8.
Entropy is a concept that emerged in the 19th century. It used to be associated with the heat harnessed by a thermal machine to perform work during the Industrial Revolution. However, the 20th century saw an unprecedented scientific revolution driven by one of its most essential innovations, information theory, which also encompasses a concept of entropy. A question is therefore naturally raised: “what is the difference, if any, between the concepts of entropy in each field of knowledge?” Misconceptions persist, and there have been multiple attempts to reconcile the entropy of thermodynamics with that of information theory. Entropy is most commonly defined as “disorder”, although this is not a good analogy, since “order” is a subjective human concept and “disorder” cannot always be obtained from entropy. Therefore, this paper presents a historical background on the evolution of the term “entropy” and provides mathematical evidence and logical arguments regarding its interconnections across various scientific areas, with the objective of providing a theoretical review and reference material for a broad audience.

9.
This paper starts from Schrödinger’s famous question “what is life?” and elucidates answers that invoke, in particular, Friston’s free energy principle and its relation to the method of Bayesian inference and to the second foundation of Synergetics, which utilizes Jaynes’ maximum entropy principle. Our presentation reflects the shift in emphasis from physical principles to principles of information theory and Synergetics. In view of the expected general audience of this issue, we have chosen a somewhat tutorial style that does not require specialized knowledge of physics but familiarizes the reader with concepts rooted in information theory and Synergetics.
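A worked illustration of Jaynes’ maximum entropy principle referenced above: among all distributions with a fixed mean energy, the entropy maximizer is the Gibbs distribution p_i ∝ exp(−βE_i). The energy levels and target mean below are hypothetical.

```python
import numpy as np
from scipy.optimize import brentq

E = np.array([0.0, 1.0, 2.0, 4.0])   # hypothetical energy levels
target_mean = 1.2                    # mean-energy constraint

def gibbs(beta):
    """Gibbs distribution p_i proportional to exp(-beta * E_i)."""
    w = np.exp(-beta * E)
    return w / w.sum()

# solve for the beta that satisfies the constraint <E> = target_mean
beta = brentq(lambda b: gibbs(b) @ E - target_mean, -10.0, 10.0)
p = gibbs(beta)
print(f"beta = {beta:.4f}, p = {np.round(p, 4)}, mean E = {p @ E:.3f}")
print(f"maximized entropy = {-np.sum(p * np.log(p)):.4f} nats")
```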

10.
This study investigated consumers’ visual-image evaluations of wrist wearables based on Kansei engineering. A total of 8 representative samples were screened from 99 samples using the multidimensional scaling (MDS) method. Five groups of adjectives were identified to allow participants to express their visual impressions of wrist wearable devices through a questionnaire survey and factor analysis. The evaluations of the eight samples against the five groups of adjectives were analyzed using triangle fuzzy theory. The results showed relatively distinct evaluations of the eight samples on the “fashionable and individual” and “rational and decent” dimensions, but little distinction on the “practical and durable”, “modern and smart”, and “convenient and multiple” dimensions. Furthermore, wrist wearables with a shape close to a traditional watch dial (round), with a bezel and mechanical buttons (moderate complexity), and with asymmetric forms received higher evaluations. The acceptance of square- and elliptical-shaped wrist wearables was relatively low. Among square- and rectangular-shaped wrist wearables, the greater the curvature of the chamfer, the higher the acceptance. A clear contrast between the screen color and the casing color was well accepted. The influence of display size on consumer evaluations was relatively small. Similar results were obtained in the evaluations of preference and willingness to purchase. The results of this study objectively and effectively reflect consumers’ evaluations of, and potential demand regarding, the visual images of wrist wearables, and provide a reference for designers and industry professionals.
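A minimal sketch of a triangle-fuzzy evaluation step, with an illustrative (assumed) mapping of a 5-point semantic scale to triangular fuzzy numbers and centroid defuzzification; the paper’s scales and aggregation may differ.

```python
import numpy as np

# Illustrative mapping of a 5-point semantic scale to triangular fuzzy
# numbers (l, m, u); the paper's actual scale may differ.
TFN = {1: (0.0, 0.0, 0.25), 2: (0.0, 0.25, 0.5), 3: (0.25, 0.5, 0.75),
       4: (0.5, 0.75, 1.0), 5: (0.75, 1.0, 1.0)}

def fuzzy_score(responses):
    """Aggregate ordinal responses into one TFN, then defuzzify (centroid)."""
    tfns = np.array([TFN[r] for r in responses])
    l, m, u = tfns.mean(axis=0)
    return (l + m + u) / 3.0

# hypothetical ratings of one sample on the "fashionable and individual" scale
ratings = [4, 5, 3, 4, 4, 5, 2, 4]
print(f"crisp image score: {fuzzy_score(ratings):.3f}")
```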

11.
We consider state changes in quantum theory due to “conditional action” and relate these to the debate on entropy decrease through the interventions of “intelligent beings” and to the principles of Szilard and Landauer/Bennett. The mathematical theory of conditional actions is a special case of the theory of “instruments”, which describes changes of state due to general measurements and is therefore briefly outlined in the present paper. As a detailed example, we consider the imperfect erasure of a qubit, which can also be viewed as a conditional action and is realized here by coupling a spin to another small spin system in its ground state.
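A small numerical illustration of entropy decrease under an erasure-type intervention, modeling imperfect erasure simply as an amplitude-damping channel toward |0⟩; this is a stand-in for the paper’s spin–spin coupling realization, not a reproduction of it.

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def erase(rho, eta):
    """Imperfect erasure toward |0>, modeled as amplitude damping of
    strength eta (eta = 1 would be perfect erasure)."""
    K0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1.0 - eta)]])
    K1 = np.array([[0.0, np.sqrt(eta)], [0.0, 0.0]])
    return K0 @ rho @ K0.T + K1 @ rho @ K1.T

rho = np.eye(2) / 2.0   # maximally mixed qubit: S = 1 bit
for eta in (0.5, 0.9, 0.99):
    print(f"eta = {eta}: S after erasure = "
          f"{von_neumann_entropy(erase(rho, eta)):.4f} bits")
```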

12.
The lack of adequate indicators in digital economy research may leave governments short of data support for decision making. To address this problem, we first establish a digital economy indicator evaluation system by dividing the digital economy into four types, “basic”, “technology”, “integration”, and “service”, with 5 indicators selected for each type. On this basis, the weight of each indicator is calculated by an improved entropy method to reveal deficiencies in the development of particular digital economy fields. Drawing on the weighting idea of the Analytic Hierarchy Process, the improved entropy method first compares the difference coefficients of indicators in pairs and maps the comparison results to the 1–9 scale. A judgment matrix is then constructed from the information entropy, which largely mitigates the problem of excessively divergent indicator weights in the traditional entropy method. The results indicate that the development of the digital economy in Guangdong Province was relatively balanced from 2015 to 2018 and is expected to improve, while rural e-commerce in the province lags behind, revealing an obvious digital gap between urban and rural areas. Next, using principal component analysis and factor analysis, we extract two new variables to replace the 20 selected indicators, retaining the original information to the greatest extent and facilitating further research. Finally, we provide constructive comments on the digital economy of Guangdong Province from 2015 to 2018.
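For reference, a sketch of the traditional entropy weight method that the paper improves upon; the paper’s pairwise mapping of difference coefficients to the 1–9 scale and its entropy-based judgment matrix are not reproduced here, and the data matrix below is hypothetical.

```python
import numpy as np

def entropy_weights(X):
    """Traditional entropy weight method. X is (alternatives x indicators);
    larger-is-better indicators are assumed (normalize beforehand otherwise)."""
    P = X / X.sum(axis=0)                        # share of each alternative
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)
    e = -np.sum(P * logP, axis=0) / np.log(n)    # entropy per indicator
    d = 1.0 - e                                  # difference coefficient
    return d / d.sum()                           # normalized weights

# hypothetical scores of 4 regions on 5 indicators of one digital-economy type
X = np.array([[0.8, 0.3, 0.5, 0.9, 0.4],
              [0.6, 0.7, 0.4, 0.5, 0.6],
              [0.9, 0.2, 0.8, 0.7, 0.5],
              [0.4, 0.6, 0.6, 0.6, 0.7]])
print("indicator weights:", np.round(entropy_weights(X), 3))
```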

13.
The task of reconstructing a system’s state from measurement results, known as the Pauli problem, usually requires the repetition of two successive steps: preparation in an initial state to be determined, followed by an accurate measurement of one of several chosen operators to provide the necessary “Pauli data”. We consider a similar yet more general problem of recovering Feynman’s transition (path) amplitudes from the results of at least three consecutive measurements. The three-step histories of a pre- and post-selected quantum system are subject to a type of interference not available to their two-step counterparts. We show that this interference can be exploited and that, if the intermediate measurement is “fuzzy”, the path amplitudes can be successfully recovered. The simplest case of a two-level system is analysed in detail. The “weak measurement” limit and the usefulness of the path amplitudes are also discussed.
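A minimal sketch of three-step path amplitudes A = ⟨f|m⟩⟨m|i⟩ for a pre- and post-selected qubit (free evolution between steps taken as trivial), showing the interference that a sharp intermediate measurement destroys; the states are illustrative.

```python
import numpy as np

# Pre-selection |i>, post-selection |f>, intermediate basis {|0>, |1>}.
i = np.array([1.0, 1.0]) / np.sqrt(2)    # preparation
f = np.array([1.0, -1.0]) / np.sqrt(2)   # post-selection
basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

# path amplitude through intermediate state m: A = <f|m><m|i>
amps = [np.vdot(f, m) * np.vdot(m, i) for m in basis]
print("path amplitudes:", np.round(amps, 4))

# a sharp intermediate measurement destroys interference between the paths
p_sharp = sum(abs(a)**2 for a in amps)
# with no intermediate measurement, the two path amplitudes interfere
p_none = abs(sum(amps))**2
print(f"P(f | sharp) = {p_sharp:.4f}, P(f | none) = {p_none:.4f}")
```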

14.
Vitamin D was discovered as an anti-rachitic agent, but even at present, there is no direct evidence to support the concept that vitamin D directly stimulates osteoblastic bone formation and mineralization. It appears to be paradoxical, but vitamin D functions in the process of osteoclastic bone resorption. Osteoclasts, the only cells responsible for bone resorption, develop from hematopoietic cells of the monocyte-macrophage lineage. In 1992, we hypothesized that a membrane-bound factor, designated as “osteoclast differentiation factor (ODF)”, is expressed on the plasma membrane of osteoblasts/stromal cells in response to osteotropic factors including the active form of vitamin D3, 1α,25-dihydroxyvitamin D3 [1α,25(OH)2D3]. Recently, four research groups including ours independently identified three key molecules (RANKL, RANK, and OPG) responsible for osteoclastogenesis. A long-sought-after ligand, ODF, was identical to RANKL. RANKL was a member of the membrane-associated TNF ligand family, which induced differentiation of spleen cells (osteoclast progenitors) into osteoclasts in the presence of M-CSF. RANK, a member of the TNF receptor family, was a signaling receptor essential for the RANKL-mediated osteoclastogenesis. OPG, a secreted member of the TNF receptor family, was a decoy receptor for RANKL. The discovery of RANKL, RANK and OPG opens a new era in the study of bone biology and the therapy of several metabolic bone diseases such as osteoporosis, rheumatoid arthritis, and periodontal diseases.

15.
In this paper, we begin by introducing a novel scale mixture of normals whose leptokurtosis and fat-tailedness are only local, with this “locality” separately controlled by two censoring parameters. This new, locally leptokurtic and fat-tailed (LLFT) distribution provides a viable alternative to other, globally leptokurtic, fat-tailed and symmetric distributions typically entertained in financial volatility modelling. We then incorporate the LLFT distribution into a basic stochastic volatility (SV) model to yield a flexible alternative to common heavy-tailed SV models. For the resulting LLFT-SV model, we develop a Bayesian statistical framework and effective MCMC methods to enable posterior sampling of the parameters and latent variables. Empirical results indicate the validity of the LLFT-SV specification for modelling both “non-standard” financial time series with repeated zero returns and more “typical” data on the S&P 500 and DAX indices. For the former, the LLFT-SV model is also shown to markedly outperform a common, globally heavy-tailed, t-SV alternative in terms of density forecasting. Applications of the proposed distribution in more advanced SV models appear readily attainable.
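For orientation, a simulation sketch of the basic SV skeleton into which the LLFT distribution is incorporated; a Student-t innovation is used below as a generic heavy-tailed stand-in, since the LLFT form itself is defined in the paper.

```python
import numpy as np

def simulate_sv(T=2000, mu=-9.0, phi=0.97, sigma_eta=0.2, nu=5, seed=0):
    """Basic SV skeleton:
        h_t = mu + phi*(h_{t-1} - mu) + sigma_eta*eta_t,  eta_t ~ N(0, 1)
        y_t = exp(h_t / 2) * eps_t
    eps_t ~ Student-t(nu) stands in for the paper's LLFT innovation."""
    rng = np.random.default_rng(seed)
    h = np.empty(T)
    h[0] = mu
    for t in range(1, T):
        h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.normal()
    eps = rng.standard_t(nu, size=T)
    return np.exp(h / 2) * eps, h

y, h = simulate_sv()
kurt = np.mean(y**4) / np.mean(y**2)**2
print(f"sample kurtosis of returns ~ {kurt:.2f} (normal: 3)")
```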

16.
The cerebral cortex performs its computations with many six-layered fundamental units, collectively spreading along the cortical sheet. What are the local network structure and operating dynamics of such a fundamental unit? Previous investigations of primary sensory areas revealed a classic “canonical” circuit model, leading to an expectation of similar circuit organization and dynamics throughout the cortex. This review clarifies the different circuit dynamics at play in the higher association cortex of primates, which implements computations for high-level cognition such as memory and attention. Instead of the feedforward processing of response selectivity through Layers 4 to 2/3 that the classic canonical circuit stipulates, memory recall in primates occurs in Layer 5/6 with local backward projection to Layer 2/3, after which the retrieved information is sent back from Layer 6 to lower-level cortical areas for further retrieval of nested associations of target attributes. In this review, a novel “dynamic multimode module (D3M)” in the primate association cortex is proposed as a new “canonical” circuit model performing this operation.

17.
Most complex socioeconomic phenomena are assessed with multicriteria methods using continuous data, most often sourced from official statistics. However, some complex phenomena, such as quality of life and quality of services, are assessed through questionnaire surveys using ordinal measurement scales. In this case, classic multicriteria methods are very difficult to apply, given the way official statistics present this type of data and the data’s permissible transformations and arithmetic operations. Therefore, the main purpose of this study was to present a novel framework for assessing socioeconomic phenomena on the basis of survey data. It was assumed that the object assessments may contain positive or negative opinions and an element of uncertainty expressed in the form of “no”, “difficult to say”, or “no opinion” answers. For this reason, the intuitionistic fuzzy TOPSIS (IF-TOPSIS) method is proposed. To demonstrate the potential of this solution, the results of measuring the subjective quality of life of the inhabitants of 83 cities in EU countries, EFTA countries, the UK, the Western Balkans, and Turkey are presented. For most cities, a high level of subjective quality of life was observed using the proposed approach. The highest level of quality of life was observed in Zurich, whereas the lowest was observed in Palermo.
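A minimal IF-TOPSIS sketch under common textbook conventions (benefit criteria, normalized Euclidean distance including the hesitancy degree π = 1 − μ − ν); the survey aggregates and weights below are hypothetical, not the paper’s data.

```python
import numpy as np

def if_topsis(mu, nu, weights):
    """Minimal IF-TOPSIS. mu, nu: (alternatives x criteria) membership ("yes")
    and non-membership ("no") degrees; pi = 1 - mu - nu encodes hesitation
    ("no opinion"). Returns the closeness coefficient (higher = better)."""
    pi = 1.0 - mu - nu
    # positive/negative ideal solutions per criterion (benefit criteria)
    mu_p, nu_p = mu.max(0), nu.min(0)
    mu_n, nu_n = mu.min(0), nu.max(0)
    pi_p, pi_n = 1 - mu_p - nu_p, 1 - mu_n - nu_n

    def dist(m2, n2, p2):
        return np.sqrt(0.5 * (weights * ((mu - m2)**2 + (nu - n2)**2
                                         + (pi - p2)**2)).sum(axis=1))
    d_pos, d_neg = dist(mu_p, nu_p, pi_p), dist(mu_n, nu_n, pi_n)
    return d_neg / (d_pos + d_neg)

# hypothetical survey aggregates for 3 cities on 2 criteria
mu = np.array([[0.7, 0.6], [0.5, 0.8], [0.4, 0.3]])
nu = np.array([[0.2, 0.3], [0.3, 0.1], [0.4, 0.5]])
print("closeness:", np.round(if_topsis(mu, nu, np.array([0.6, 0.4])), 3))
```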

18.
The heterogeneous graphical Granger model (HGGM) for causal inference among processes with distributions from an exponential family is efficient in scenarios where the number of time observations is much greater than the number of time series, normally by several orders of magnitude. However, in the case of “short” time series, inference in the HGGM often suffers from overestimation. To remedy this, we use the minimum message length (MML) principle to determine the causal connections in the HGGM. Minimum message length, as a Bayesian information-theoretic method for statistical model selection, applies Occam’s razor in the following way: even when models are equal in their measure of fit-accuracy to the observed data, the one generating the most concise explanation of the data is more likely to be correct. Based on the dispersion coefficient of the target time series and on the initial maximum likelihood estimates of the regression coefficients, we propose a minimum message length criterion to select the subset of time series causally connected with each target time series and derive its form for various exponential distributions. We propose two algorithms, a genetic-type algorithm (HMMLGA) and exHMML, to find the subset. We demonstrated the superiority of both algorithms in synthetic experiments with respect to the comparison methods Lingam, HGGM and the statistical framework Granger causality (SFGC). In the real-data experiments, we used the methods to discriminate between the pregnancy and labor phases using electrohysterogram data of Icelandic mothers from the PhysioNet database. We further analysed Austrian climatological time series and their temporal interactions in rainy- and sunny-day scenarios. In both experiments, the results of HMMLGA had the most realistic interpretation with respect to the comparison methods. We provide our code in Matlab. To the best of our knowledge, this is the first work using the MML principle for causal inference in the HGGM.
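A toy sketch of the two-part (MML/MDL-flavored) model selection idea: candidate parent sets of a target are scored by negative log-likelihood under a Poisson GLM plus a code-length penalty. The paper derives a sharper, distribution-specific MML criterion, and lagged time series are replaced here by cross-sectional predictors for brevity.

```python
import numpy as np
from itertools import combinations
from sklearn.linear_model import PoissonRegressor

def two_part_score(X, y, subset):
    """Data code length (Poisson NLL up to a constant) plus a simple
    (log n)/2-per-parameter model code length."""
    n = len(y)
    if subset:
        glm = PoissonRegressor(alpha=0.0, max_iter=300).fit(X[:, subset], y)
        lam = glm.predict(X[:, subset])
    else:
        lam = np.full(n, y.mean())
    nll = np.sum(lam - y * np.log(lam))
    return nll + 0.5 * np.log(n) * (len(subset) + 1)

rng = np.random.default_rng(3)
n, d = 400, 4
X = rng.normal(size=(n, d))
y = rng.poisson(np.exp(0.8 * X[:, 0] - 0.5 * X[:, 2]))   # true parents: {0, 2}

candidates = [list(c) for k in range(d + 1) for c in combinations(range(d), k)]
best = min(candidates, key=lambda s: two_part_score(X, y, s))
print("selected parent set:", best)
```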

19.
The properties of decays that take place during jet formation cannot be easily deduced from the final distribution of particles in a detector. In this work, we first simulate a system of particles with well-defined masses, decay channels, and decay probabilities. This serves as the “true system” whose decay probability distributions we want to reproduce. Assuming we only have the data that this system produces in the detector, we employ an iterative method that uses a neural network as a classifier between events produced in the detector by the “true system” and by some arbitrary “test system”. Finally, we compare the distributions obtained with the iterative method to the “true” distributions.
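A minimal sketch of the underlying classifier idea (the likelihood-ratio trick): a network trained to separate “true” from “test” events estimates the density ratio, which can reweight the test system toward the true one. The single-observable toy data below stand in for detector events; the paper’s iterative update loop is not reproduced.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(4)
true_x = rng.normal(1.0, 1.0, (5000, 1))   # observable from the "true system"
test_x = rng.normal(0.0, 1.5, (5000, 1))   # observable from a "test system"

# train a classifier to separate the two samples
X = np.vstack([true_x, test_x])
z = np.concatenate([np.ones(5000), np.zeros(5000)])
clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500,
                    random_state=0).fit(X, z)

# likelihood-ratio trick: r(x) = p_true(x) / p_test(x) ~ c(x) / (1 - c(x))
c = clf.predict_proba(test_x)[:, 1]
w = c / np.clip(1.0 - c, 1e-6, None)

print(f"reweighted test mean ~ {np.average(test_x.ravel(), weights=w):.3f} "
      f"(true mean = 1.0)")
```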

20.
Causal Geometry     
Information geometry has offered a way to formally study the efficacy of scientific models by quantifying the impact of model parameters on the predicted effects. However, there has been little formal investigation of causation in this framework, despite causal models being a fundamental part of science and explanation. Here, we introduce causal geometry, which formalizes not only how outcomes are impacted by parameters, but also how the parameters of a model can be intervened upon. To this end, we introduce a geometric version of “effective information”, a known measure of the informativeness of a causal relationship. We show that it is given by the matching between the space of effects and the space of interventions, in the form of their geometric congruence. Therefore, given a fixed intervention capability, an effective causal model is one that is well matched to those interventions. This is a consequence of “causal emergence”, wherein macroscopic causal relationships may carry more information than “fundamental” microscopic ones. We thus argue that a coarse-grained model may, paradoxically, be more informative than the microscopic one, especially when it better matches the scale of accessible interventions, as we illustrate on toy examples.
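A short sketch of (non-geometric) effective information for a discrete causal model given by a transition probability matrix: the mutual information between cause and effect when the cause is set by a uniform, maximum-entropy intervention distribution. The transition matrices below are toy examples.

```python
import numpy as np

def effective_information(tpm):
    """Effective information of a discrete causal model: I(X; Y) with the
    cause X set by a uniform (maximum-entropy) intervention distribution."""
    n = tpm.shape[0]
    p_x = np.full(n, 1.0 / n)
    joint = p_x[:, None] * tpm
    p_y = joint.sum(axis=0)
    nz = joint > 0
    indep = (p_x[:, None] * p_y[None, :])[nz]
    return float(np.sum(joint[nz] * np.log2(joint[nz] / indep)))

# deterministic, fully distinguishable dynamics: EI = log2(n) bits
perm = np.eye(4)[[1, 2, 3, 0]]
# noisy dynamics: interventions are less informative
noisy = 0.7 * perm + 0.3 / 4

print(f"EI(permutation) = {effective_information(perm):.3f} bits")
print(f"EI(noisy)       = {effective_information(noisy):.3f} bits")
```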

