Similar Literature
20 similar documents found (search time: 31 ms)
1.
In this article, we propose the exponentiated sine-generated family of distributions. Some important properties are demonstrated, such as the series representation of the probability density function, quantile function, moments, stress-strength reliability, and Rényi entropy. A particular member, called the exponentiated sine Weibull distribution, is highlighted; we analyze its skewness and kurtosis, moments, quantile function, mean residual and reversed mean residual life functions, order statistics, and extreme value distributions. Maximum likelihood estimation and Bayes estimation under the squared error loss function are considered. Simulation studies are used to assess the techniques, and their performance is satisfactory, as judged by the mean squared error, confidence intervals, and coverage probabilities of the estimates. The stress-strength reliability parameter of the exponentiated sine Weibull model is derived and estimated by the maximum likelihood method. Nonparametric bootstrap techniques are also used to approximate the confidence interval of the reliability parameter. A simulation is conducted to examine the mean squared error, standard deviations, confidence intervals, and coverage probabilities of the reliability parameter. Finally, three real applications of the exponentiated sine Weibull model are provided, one of which involves stress-strength data.
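As an illustration of the construction, the sine-generated family composes a baseline CDF G with sin((π/2)·G), and the exponentiated variant raises the result to a power α. A minimal sketch, assuming a Weibull baseline with shape k and scale λ (the exact parameterization used in the article may differ):

```python
import math

def esw_cdf(x, alpha, k, lam):
    """CDF of the (assumed) exponentiated sine Weibull:
    F(x) = [sin((pi/2) * G(x))]^alpha, with G the Weibull(k, lam) CDF."""
    g = 1.0 - math.exp(-((x / lam) ** k))
    return math.sin(0.5 * math.pi * g) ** alpha

def esw_quantile(u, alpha, k, lam):
    """Quantile function obtained by inverting the CDF above, for 0 < u < 1."""
    g = (2.0 / math.pi) * math.asin(u ** (1.0 / alpha))
    return lam * (-math.log(1.0 - g)) ** (1.0 / k)
```

Because the quantile function is closed-form, inverse-transform sampling is straightforward, which is convenient for the kind of simulation study the abstract describes.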

2.
This paper deals with different bootstrap approaches and bootstrap confidence intervals in the fractionally integrated autoregressive moving average (ARFIMA(p, d, q)) process [J. Hosking, Fractional differencing, Biometrika 68(1) (1981) 165–175], using parametric and semi-parametric estimation techniques for the memory parameter d. The bootstrap procedures considered are: the classical bootstrap in the residuals of the fitted model [B. Efron, R. Tibshirani, An Introduction to the Bootstrap, Chapman and Hall, New York, 1993], the bootstrap in the spectral density function [E. Paparoditis, D.N. Politis, The local bootstrap for periodogram statistics, J. Time Ser. Anal. 20(2) (1999) 193–222], the bootstrap in the residuals resulting from the regression equation of the semi-parametric estimators [G.C. Franco, V.A. Reisen, Bootstrap techniques in semiparametric estimation methods for ARFIMA models: a comparison study, Comput. Statist. 19 (2004) 243–259], and the sieve bootstrap [P. Bühlmann, Sieve bootstrap for time series, Bernoulli 3 (1997) 123–148]. The performance of these procedures and of the confidence intervals for d in the stationary and non-stationary ranges is empirically assessed through Monte Carlo experiments. The bootstrap confidence intervals proposed here are accurate alternative procedures for obtaining confidence intervals for d.
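The first of the four procedures, the classical bootstrap in the residuals of the fitted model, can be sketched in its simplest form: an AR(1) fit standing in for a full ARFIMA estimation, with a percentile interval for the autoregressive coefficient. This is an illustrative stand-in, not the paper's semi-parametric procedure for d:

```python
import random

def fit_ar1(x):
    """OLS estimate of phi in x_t = phi * x_{t-1} + e_t (mean-zero series)."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(v * v for v in x[:-1])
    return num / den

def residual_bootstrap_ci(x, n_boot=500, level=0.95, seed=0):
    """Percentile bootstrap CI: resample fitted residuals with replacement,
    rebuild the series recursively, and re-estimate phi on each pseudo-series."""
    rng = random.Random(seed)
    phi = fit_ar1(x)
    resid = [x[t] - phi * x[t - 1] for t in range(1, len(x))]
    stats = []
    for _ in range(n_boot):
        e = [rng.choice(resid) for _ in resid]
        xb = [x[0]]
        for et in e:
            xb.append(phi * xb[-1] + et)
        stats.append(fit_ar1(xb))
    stats.sort()
    lo = stats[int((1 - level) / 2 * n_boot)]
    hi = stats[int((1 + level) / 2 * n_boot)]
    return phi, (lo, hi)
```

The ARFIMA versions studied in the paper follow the same resample-refit pattern, but resample in the spectral domain, in semi-parametric regression residuals, or through a sieve of increasing AR orders.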

3.
Expected Shortfall (ES), the average loss above a high quantile, is the current financial regulatory market risk measure. Its estimation and optimization are highly unstable against sample fluctuations and become impossible above a critical ratio r = N/T, where N is the number of different assets in the portfolio and T is the length of the available time series. The critical ratio depends on the confidence level α, which means we have a line of critical points on the (α, r) plane. The large fluctuations in the estimation of ES can be attenuated by the application of regularizers. In this paper, we calculate ES analytically under an ℓ1 regularizer by the method of replicas borrowed from the statistical physics of random systems. The ban on short selling, i.e., a constraint rendering all the portfolio weights non-negative, is a special case of an asymmetric ℓ1 regularizer. Results are presented for the out-of-sample and the in-sample estimator of the regularized ES, the estimation error, the distribution of the optimal portfolio weights, and the density of the assets eliminated from the portfolio by the regularizer. It is shown that the no-short constraint acts as a high-volatility cutoff, in the sense that it sets the weights of the high-volatility elements to zero with higher probability than those of the low-volatility items. This cutoff renormalizes the aspect ratio r = N/T, thereby extending the range over which optimization is feasible. We find that there is a nontrivial mapping between the regularized and unregularized problems, corresponding to a renormalization of the order parameters.
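For reference, the historical (non-regularized) ES estimator whose sample fluctuations are at issue is just a tail average of the empirical loss distribution; a minimal sketch:

```python
def expected_shortfall(losses, alpha=0.975):
    """Historical ES: average of the losses at or above the empirical
    alpha-quantile. `losses` are positive-for-loss values; alpha is the
    confidence level (0.975 is the regulatory choice under FRTB)."""
    s = sorted(losses)
    k = int(alpha * len(s))          # index of the VaR order statistic
    tail = s[k:]                     # losses beyond VaR
    return sum(tail) / len(tail)
```

With N assets, portfolio optimization minimizes this tail average over the weights; because only the (1 − α)T tail observations enter, the effective sample shrinks as α grows, which is why the feasibility boundary depends on both α and r = N/T.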

4.
We consider whether the new horizon-first law works in higher-dimensional f(R) theory. We first obtain the general formulas for calculating the entropy and the energy of a general spherically symmetric black hole in D-dimensional f(R) theory. As applications, we compute the entropies and the energies of black holes in some interesting higher-dimensional f(R) theories.

5.
6.
This study deals with drift parameter estimation problems in the sub-fractional Vasicek process given by dx_t = θ(μ − x_t) dt + dS_t^H, with θ > 0 and μ ∈ ℝ unknown and t ≥ 0; here, S^H represents a sub-fractional Brownian motion (sfBm). We introduce new estimators θ̂ for θ and μ̂ for μ based on discrete-time observations, using techniques from Nourdin–Peccati analysis. For the proposed estimators θ̂ and μ̂, strong consistency and asymptotic normality are established by employing the properties of S^H. Moreover, we provide numerical simulations for the sfBm and the related Vasicek-type process for different values of the Hurst index H.
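For intuition about the mean-reverting dynamics, the Vasicek drift can be simulated with an Euler–Maruyama scheme. The sketch below uses a standard Brownian driver (i.e., the H = 1/2 case) in place of a true sfBm, whose simulation would require its full covariance structure:

```python
import math, random

def simulate_vasicek(theta, mu, x0, dt, n, seed=0):
    """Euler-Maruyama path of dx_t = theta*(mu - x_t) dt + dB_t.
    Standard Brownian motion stands in for the sub-fractional driver
    to keep the sketch short; increments are Normal(0, sqrt(dt))."""
    rng = random.Random(seed)
    x = [x0]
    for _ in range(n):
        x.append(x[-1] + theta * (mu - x[-1]) * dt
                 + rng.gauss(0.0, math.sqrt(dt)))
    return x
```

Starting far from μ, the path decays toward μ at rate θ and then fluctuates around it, which is the behavior the drift estimators in the paper exploit.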

7.
In the nervous system, information is conveyed by sequences of action potentials, called spike trains. As MacKay and McCulloch suggested, spike trains can be represented as bit sequences coming from information sources (IS). Previously, we studied the relations between spikes' information transmission rates (ITR) and their correlations and frequencies. Here, I concentrate on how spike fluctuations affect the ITR. The IS are typically modeled as stationary stochastic processes, which I consider here as two-state Markov processes. As the spike-train fluctuation measure, I take the standard deviation σ, which measures the average fluctuation of spikes around the average spike frequency. I found that the character of the relation between ITR and signal fluctuations depends strongly on the parameter s, a sum of transition probabilities from the no-spike state to the spike state. The ITR was estimated by expressions depending on the values of the signal fluctuations and the parameter s. It turns out that for s < 1, the quotient ITR/σ has a maximum and can tend to zero, depending on the transition probabilities, while for s > 1, ITR/σ is bounded away from 0. It was also shown that the quotient of ITR by the variance behaves in a completely different way. Similar behavior was observed when the classical Shannon entropy terms in the Markov entropy formula were replaced by polynomial approximations. My results suggest that in a noisier environment (s > 1), to achieve adequate reliability and efficiency of transmission, IS with a higher tendency to transition from the no-spike to the spike state should be applied. Such selection of appropriate parameters plays an important role in designing learning mechanisms to obtain networks with higher performance.
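The two quantities being compared, the entropy rate of the source (the ITR, in bits per time bin) and the fluctuation measure σ, have closed forms for a two-state Markov chain. A sketch, with the transition labels p01 = P(spike | no spike) and p10 = P(no spike | spike) as assumed notation:

```python
import math

def binary_entropy(p):
    """Shannon entropy of a Bernoulli(p) variable, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def markov_spike_stats(p01, p10):
    """Entropy rate (bits per bin) and per-bin spike-count standard
    deviation of a two-state (no-spike/spike) Markov chain."""
    pi1 = p01 / (p01 + p10)          # stationary spike probability
    pi0 = 1.0 - pi1
    h = pi0 * binary_entropy(p01) + pi1 * binary_entropy(p10)
    sigma = math.sqrt(pi1 * pi0)     # std of the stationary 0/1 state
    return h, sigma
```

Scanning (p01, p10) with this helper reproduces the kind of ITR/σ comparison the abstract describes; the memoryless case p01 = p10 = 0.5 gives the maximal 1 bit per bin with σ = 0.5.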

8.
Active optical media leading to interaction Hamiltonians of the form H = λ̃ (a + a†)^ζ represent a crucial resource for quantum optical technology. In this paper, we address the characterization of those nonlinear media using quantum probes, as opposed to semiclassical ones. In particular, we investigate how squeezed probes may improve individual and joint estimation of the nonlinear coupling λ̃ and of the nonlinearity order ζ. Using tools from quantum estimation theory, we show that: (i) the two parameters are compatible, i.e., they may be jointly estimated without additional quantum noise; (ii) the use of squeezed probes improves precision at fixed overall energy of the probe; (iii) for low-energy probes, squeezed vacuum represents the most convenient choice, whereas for increasing energy an optimal squeezing fraction may be determined; (iv) using optimized quantum probes, the scaling of the corresponding precision with energy improves, both for individual and joint estimation of the two parameters, compared to semiclassical coherent probes. We conclude that quantum probes represent a resource to enhance precision in the characterization of nonlinear media, and we foresee potential applications with current technology.

9.
Simple Summary: In the early Universe, both the QCD and EW eras played an essential role in laying the seeds for nucleosynthesis and even in dictating the cosmological large-scale structure. Taking advantage of recent developments in ultrarelativistic nuclear experiments and in non-perturbative and perturbative lattice simulations, various thermodynamic quantities, including pressure, energy density, bulk viscosity, relaxation time, and temperature, have been calculated up to the TeV scale, for which the possible influence of finite bulk viscosity is characterized for the first time and the analytical dependence of the Hubble parameter on the scale factor is also introduced.
Abstract: Based on recent perturbative and non-perturbative lattice calculations with almost all quark flavors, and on the thermal contributions from photons, neutrinos, leptons, electroweak particles, and scalar Higgs bosons, various thermodynamic quantities at vanishing net-baryon densities, such as pressure, energy density, bulk viscosity, relaxation time, and temperature, have been calculated up to the TeV scale, i.e., covering the hadron, QGP, and electroweak (EW) phases in the early Universe. This remarkable progress motivated the present study to determine the possible influence of the bulk viscosity in the early Universe and to understand how this would vary from epoch to epoch. We have taken into consideration the first-order (Eckart) and second-order (Israel–Stewart) theories for the relativistic cosmic fluid and integrated viscous equations of state into the Friedmann equations. Nonlinear nonhomogeneous differential equations are obtained as analytical solutions. For Israel–Stewart, the differential equations are too sophisticated to be solved analytically; they are outlined here as road maps for future studies. For Eckart theory, the only possible solution is the functional form H(a(t)), where H(t) is the Hubble parameter and a(t) is the scale factor, but so far neither can be directly expressed in terms of either proper or cosmic time t. For an Eckart-type viscous background, especially at finite cosmological constant, non-singular H(t) and a(t) are obtained, whereas H(t) diverges for the QCD/EW and asymptotic equations of state (EoS). For a non-viscous background, the dependence of H(a(t)) is monotonic; the same conclusion can be drawn for an ideal EoS. We also conclude that the rate of decrease of H(a(t)) with increasing a(t) varies from epoch to epoch, at vanishing and at finite cosmological constant. These results help improve our understanding of nucleosynthesis and the cosmological large-scale structure.

10.
We study the stability and the solvability of a family of problems (ϕ(x′))′ = g(t, x, x′, u) + f* with Dirichlet boundary conditions, where ϕ, u, and f* are allowed to vary as well. Applications to boundary value problems involving the p-Laplacian operator are highlighted.

11.
In this paper, we study the entropy functions on extreme rays of the polymatroidal region that contain a matroid, i.e., matroidal entropy functions. We introduce variable-strength orthogonal arrays indexed by a connected matroid M and a positive integer v, which can be regarded as an expansion of the classic combinatorial structure of orthogonal arrays. Interestingly, they are equivalent to the partition representations of the matroid M with degree v and to the (M, v) almost affine codes. Thus, a synergy among four fields, i.e., information theory, matroid theory, combinatorial design, and coding theory, is developed, which may lead to potential applications in information problems such as network coding and secret sharing. Leveraging the construction of variable-strength orthogonal arrays, we characterize all matroidal entropy functions of order n ≤ 5, with the exception of log 10 · U_{2,5} and log v · U_{3,5} for some v.

12.
The Multi-Armed Bandit (MAB) problem has been extensively studied in order to address real-world challenges related to sequential decision making. In this setting, an agent selects the best action to be performed at time step t, based on the past rewards received from the environment. This formulation implicitly assumes that the expected payoff for each action is kept stationary by the environment through time. Nevertheless, in many real-world applications this assumption does not hold, and the agent has to face a non-stationary environment, that is, one with a changing reward distribution. Thus, we present a new MAB algorithm, named f-Discounted-Sliding-Window Thompson Sampling (f-dsw TS), for non-stationary environments, that is, when the data stream is affected by concept drift. The f-dsw TS algorithm is based on Thompson Sampling (TS) and exploits a discount factor on the reward history and an arm-related sliding window to counteract concept drift in non-stationary environments. We investigate how to combine these two sources of information, namely the discount factor and the sliding window, by means of an aggregation function f(·). In particular, we propose a pessimistic (f = min), an optimistic (f = max), and an averaged (f = mean) version of the f-dsw TS algorithm. A rich set of numerical experiments is performed to evaluate the f-dsw TS algorithm against both stationary and non-stationary state-of-the-art TS baselines. We exploited synthetic environments (both randomly generated and controlled) to test the MAB algorithms under different types of drift, that is, sudden/abrupt, incremental, gradual, and increasing/decreasing drift. Furthermore, we adapted four real-world active-learning tasks to our framework: a prediction task on crimes in the city of Baltimore, a classification task on insect species, a recommendation task on local web news, and a time-series analysis on microbial organisms in the tropical air ecosystem. The f-dsw TS approach emerges as the best-performing MAB algorithm. At least one of the versions of f-dsw TS performs better than the baselines in synthetic environments, proving the robustness of f-dsw TS under different concept-drift types. Moreover, the pessimistic version (f = min) proves the most effective in all real-world tasks.
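A compact sketch of one selection step of the combined discount-plus-window idea, assuming Bernoulli rewards and Beta posteriors; the authors' exact update rules (e.g., how the discount interacts with the priors) may differ in detail:

```python
import random

def f_dsw_ts_select(history, n_arms, gamma=0.95, window=50, f=min, rng=random):
    """One selection step of a sketched f-dsw Thompson Sampling for
    Bernoulli arms. `history` is a list of (arm, reward) pairs. Two Beta
    posteriors per arm are built -- one from gamma-discounted counts over
    the full history, one from the last `window` pulls -- each is sampled
    once, and the two samples are combined with the aggregation f."""
    disc_a = [1.0] * n_arms; disc_b = [1.0] * n_arms
    for arm, r in history:                      # discounted counts
        for k in range(n_arms):
            disc_a[k] *= gamma; disc_b[k] *= gamma
        disc_a[arm] += r; disc_b[arm] += 1 - r
    win_a = [1.0] * n_arms; win_b = [1.0] * n_arms
    for arm, r in history[-window:]:            # sliding-window counts
        win_a[arm] += r; win_b[arm] += 1 - r
    scores = []
    for k in range(n_arms):
        s1 = rng.betavariate(disc_a[k], disc_b[k])
        s2 = rng.betavariate(win_a[k], win_b[k])
        scores.append(f(s1, s2))
    return max(range(n_arms), key=lambda k: scores[k])
```

Passing f=min gives the pessimistic variant (an arm must look good both recently and over the discounted history), f=max the optimistic one, and a mean lambda the averaged one.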

13.
Wormholes (WHs) are hypothetical topologically non-trivial spacetime structures that can be freely traversed by observers and connect two asymptotic regions or infinities. From current theoretical developments, the prospect of their existence is challenging but cannot be excluded. In this paper, generalized Ellis–Bronnikov (GEB) traversable WH geometries for static and spherically symmetric spacetime in the background of f(R) gravity are explored. First, the Tsujikawa-like f(R) model and the shape function for the GEB model are considered, which depend on a sequence of simple Lorentzian WHs with two parameters: a free even-integer exponent n and the throat radius r_0. We also consider that these WHs are generated by dark matter galactic halos (DMGHs), based on the three most common phenomenological models, viz., Navarro–Frenk–White (NFW), Thomas–Fermi (TF), and pseudo-isothermal (PI). In this regard, we investigate whether the energy conditions (ECs) that depend on the dark matter (DM) model, viz., the dominant energy condition (DEC) and the strong energy condition (SEC), and those that do not, viz., the null energy condition (NEC) and the weak energy condition (WEC), are satisfied at the WH throat and in its neighborhood. Finally, the presence of exotic matter is confirmed by the violation of the NEC in all cases, supporting the physical acceptability of the WHs and making them compatible and traversable in the Tsujikawa-like f(R) model.

14.
Aims: Bubble entropy (bEn) is an entropy metric with a limited dependence on parameters. bEn does not directly quantify the conditional entropy of the series; rather, it assesses the change in entropy of the ordering of portions of its samples of length m when adding an extra element. The analytical formulation of bEn for autoregressive (AR) processes shows that, for this class of processes, the relation between the first autocorrelation coefficient and bEn changes for odd and even values of m. While this is not an issue per se, it triggered ideas for further investigation. Methods: Using theoretical considerations on the expected values for AR processes, we examined a two-steps-ahead estimator of bEn, which considers the cost of ordering two additional samples. We first compared it with the original bEn estimator on a simulated series, and then tested it on real heart rate variability (HRV) data. Results: The experiments showed that both examined alternatives have comparable discriminating power. However, for values of 10 < m < 20, where the statistical significance of the method increased and improved as m increased, the two-steps-ahead estimator presented slightly higher statistical significance and more regular behavior, even if the dependence on the parameter m was still minimal. We also investigated a new normalization factor for bEn, which ensures that bEn = 1 when white Gaussian noise (WGN) is given as the input. Conclusions: The research improved our understanding of bubble entropy, in particular in the context of HRV analysis, and we investigated interesting details regarding the definition of the estimator.
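The core ingredient of bEn, counting the exchanges bubble sort needs on each embedded window of length m and taking a Rényi entropy of order 2 of the resulting swap counts, can be sketched as follows; the normalization in the last function is an assumption, since published versions of the definition fix it in slightly different ways:

```python
import math
from collections import Counter

def bubble_sort_swaps(v):
    """Number of exchanges bubble sort needs to order v (inversion count)."""
    v = list(v); swaps = 0
    for i in range(len(v)):
        for j in range(len(v) - 1 - i):
            if v[j] > v[j + 1]:
                v[j], v[j + 1] = v[j + 1], v[j]
                swaps += 1
    return swaps

def renyi2_of_swaps(x, m):
    """Renyi entropy of order 2 of the swap-count distribution over all
    length-m windows of the series x."""
    counts = Counter(bubble_sort_swaps(x[i:i + m])
                     for i in range(len(x) - m + 1))
    n = sum(counts.values())
    return -math.log(sum((c / n) ** 2 for c in counts.values()))

def bubble_entropy(x, m=10):
    """Sketch of bEn: entropy increment when the window grows from m to
    m + 1, under an assumed normalization."""
    return (renyi2_of_swaps(x, m + 1) - renyi2_of_swaps(x, m)) \
        / math.log((m + 1) / (m - 1))
```

The one-step increment above is what the original estimator computes; the paper's two-steps-ahead variant instead compares windows of length m and m + 2, i.e., the cost of ordering two additional samples.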

15.
16.
Large negative magnetoresistance (MR) (|ΔR/R| = |[R(H) − R_max]/R_max| > 50%) has been observed in broken cold-pressed CrO2 powder samples near room temperature, which is ascribed to magnetic-field-induced variation of the mechanical contacts. This large room-temperature negative MR might easily lead to the wrong conclusion that the spin polarization is still very high at high temperatures. The observation also points to the possibility of developing field sensors based on this mechanical MR.

17.
The feasibility of applying the semi-superjunction (Semi-SJ) concept with a SiGe pillar (SGP) to power MOSFETs is studied in this paper. The electrical performance of the SGP device is compared with that of a conventional power MOSFET through 3D device simulation in terms of specific on-resistance (R_on), breakdown voltage (BV), the effect of varying the Ge mole fraction in the SGP, and thermal stability. The results show that R_on is reduced by 44% while BV is reduced by only 4.8%; both the R_on vs. BV tradeoff and the thermal stability of the SGP device are superior to those of the conventional Semi-SJ, owing to the strain induced in the SGP structure, in low-power device applications.

18.
Through the research presented herein, it is quite clear that there are two thermodynamically distinct types (A and B) of energetic processes naturally occurring on Earth. Type A, such as glycolysis and the tricarboxylic acid cycle, apparently follows the second law well; Type B, as exemplified by the thermotrophic function with transmembrane electrostatically localized protons presented here, does not necessarily have to be constrained by the second law, owing to its special asymmetric function. This study now, for the first time, numerically shows that transmembrane electrostatic proton localization (a Type-B process) represents a negative-entropy event, with a local protonic entropy change (ΔS_L) in a range from −95 to −110 J/(K·mol). This explains the relationship among the local protonic entropy change (ΔS_L), the mitochondrial environmental temperature (T), and the local protonic Gibbs free energy (ΔG_L = T·ΔS_L) in isothermal environmental heat utilization. The energy efficiency for the utilization of the total protonic Gibbs free energy (ΔG_T, including ΔG_L = T·ΔS_L) in driving the synthesis of ATP is estimated to be about 60%, indicating that a significant fraction of the environmental heat energy associated with the thermal motion kinetic energy (k_B T) of transmembrane electrostatically localized protons is locked into the chemical form of energy in ATP molecules. Fundamentally, it is the combination of water as a protonic conductor, and thus the formation of a protonic membrane capacitor, with the asymmetric structures of the mitochondrial membrane and cristae that makes this remarkable thermotrophic feature possible. The discovery of Type-B energy processes has inspired an invention (WO 2019/136037 A1) for energy renewal through isothermal environmental heat utilization with an asymmetric electron-gated function to generate electricity, which has the potential to power electronic devices indefinitely, including mobile phones and laptops. This invention, as an innovative Type-B mimic, may have many possible industrial applications and is likely to be transformative in energy science and technologies for sustainability on Earth.

19.
20.
Using Monte Carlo simulations with the Metropolis algorithm, we have studied the influence of the crystal-field interaction on the critical behavior of a magnetic spin-1 Ising film on a cubic lattice structure. The phase diagrams in the (k_B T_c/J, R = J_s/J) plane are obtained for different values of the crystal-field interaction. We find that the special point R_sp (R_c), at which the critical temperature is independent of the film thickness N, is independent of the crystal-field interaction, and that the system may exhibit tricritical behavior.
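A minimal sketch of the simulation ingredient, with a 2D periodic lattice standing in for the film geometry (the paper's film has N layers and a distinct surface coupling J_s, omitted here) and the crystal-field term entering as D·s²:

```python
import math, random

def metropolis_blume_capel(L=8, T=2.0, D=0.0, J=1.0, sweeps=200, seed=0):
    """Single-spin-flip Metropolis for a spin-1 (Blume-Capel-type) model,
    H = -J * sum_<ij> s_i s_j + D * sum_i s_i^2, spins in {-1, 0, +1},
    on a 2D L x L periodic lattice. Returns the magnetization per site
    after `sweeps` lattice sweeps."""
    rng = random.Random(seed)
    s = [[rng.choice((-1, 0, 1)) for _ in range(L)] for _ in range(L)]
    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            new = rng.choice((-1, 0, 1))
            nb = (s[(i + 1) % L][j] + s[(i - 1) % L][j]
                  + s[i][(j + 1) % L] + s[i][(j - 1) % L])
            dE = -J * (new - s[i][j]) * nb + D * (new**2 - s[i][j]**2)
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                s[i][j] = new
    return sum(map(sum, s)) / (L * L)
```

A large positive crystal field D suppresses the ±1 states and drives the magnetization to zero, which is the competition behind the tricritical behavior mentioned in the abstract.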


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号