Similar Documents
20 similar documents found (search time: 31 ms)
1.
A new approach to the analysis of positron-annihilation long-slit angular distributions, which can provide a more direct reflection of the Fermi surface profile, has been described in a recent publication. The analysis involves the periodic superposition of angular distributions, which has the effect of converting a momentum distribution into a distribution in reduced Bloch wave vectors k. When applied to data for a Cu crystal with the resolved momentum component parallel to the [110] direction, the analysis results in excellent agreement with the computed Fermi surface of Halse and also provides a guide to the form and intensity of the angular distribution for core-electron annihilation. A similar treatment of the results for a [100] orientation at first sight appears less encouraging. However, a more careful appraisal supports the general value of the approach and the validity of the analysis for the [110] crystal orientation, and gives further clues to the form and anisotropy of the core distributions for Cu single crystals.
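A minimal sketch of the periodic-superposition (zone-folding) step described above, assuming a one-dimensional long-slit angular distribution N(p) tabulated on a regular momentum grid; the reciprocal-lattice period G and the example distribution are illustrative placeholders, not the Cu data of the paper.

```python
import numpy as np

def fold_to_reduced_zone(p, N_p, G, n_k=101, n_max=5):
    """Periodically superpose a momentum distribution N(p) to obtain
    a distribution over reduced Bloch wave vectors k in [-G/2, G/2]."""
    k = np.linspace(-G / 2, G / 2, n_k)
    N_k = np.zeros_like(k)
    for n in range(-n_max, n_max + 1):
        # shift each reduced wave vector by a reciprocal-lattice vector n*G
        # and accumulate the interpolated momentum density at those points
        N_k += np.interp(k + n * G, p, N_p, left=0.0, right=0.0)
    return k, N_k

# illustrative example: a parabolic "band" term plus a broad Gaussian "core" term
p = np.linspace(-30, 30, 2001)
N_p = np.clip(1.0 - (p / 5.0) ** 2, 0, None) + 0.3 * np.exp(-(p / 15.0) ** 2)
k, N_k = fold_to_reduced_zone(p, N_p, G=10.0)
```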

2.
We study numerically and by scaling methods the distributions and moments of several structural properties of percolation clusters in two and three dimensions. The clusters are generated at criticality and properties such as the distribution of the mass as a function of linear size or chemical distance are studied. Our results suggest that the hierarchy of moments can be represented by a single gap exponent. Using a scaling approach, we obtain analytical forms for the different distribution functions which agree very well with the numerical data.
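A rough sketch of how such clusters could be generated and their masses collected, assuming site percolation on a square lattice at its critical occupation probability (p_c ≈ 0.5927); the lattice size and nearest-neighbour connectivity are illustrative choices, not necessarily those of the paper.

```python
import numpy as np
from scipy import ndimage

def cluster_masses(L=256, p_c=0.5927, seed=0):
    """Occupy sites with probability p_c, label connected clusters
    (4-connected neighbourhood) and return the mass of each cluster."""
    rng = np.random.default_rng(seed)
    occupied = rng.random((L, L)) < p_c
    labels, n_clusters = ndimage.label(occupied)   # cross-shaped structure by default
    return np.bincount(labels.ravel())[1:]         # drop the background label 0

masses = cluster_masses()
# a log-binned histogram of the masses approximates the cluster-size distribution
hist, edges = np.histogram(masses, bins=np.logspace(0, np.log10(masses.max()), 30))
```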

3.
The problem of the light scattering properties of spheroid particles is studied, and a general approach is presented for calculating the single-particle light scattering of spheroids. In this approach, the extinction efficiency of spheroid particles is calculated by combining spline interpolation of the T-matrix method with ADA (anomalous diffraction approximation) theory. Furthermore, the retrieval of the spheroid particle size distribution is performed in the dependent mode, and a method for selecting the optical extinction data is proposed based on PCA (principal component analysis) of the first derivative of the raw optical extinction. By calculating the contribution rate of the first derivative of the raw optical extinction, the extinction data with the most significant features can be selected as the inversion data. In this way, the selected optical extinction has less information redundancy and a higher capacity to resist noise disturbance. Simulation experiments indicate that the spheroid particle size distributions obtained with the proposed method coincide well with the given distributions, providing a simple, reliable and efficient way to retrieve the spheroid particle size distribution from optical extinction data.
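For orientation, a small sketch of the spherical-particle ADA extinction efficiency (van de Hulst's closed form), the simplest building block of the anomalous diffraction approximation that the spheroid calculation extends; the size parameters and refractive index below are arbitrary examples, not values from the paper.

```python
import numpy as np

def ada_extinction_sphere(x, m):
    """Anomalous diffraction approximation (van de Hulst) for the extinction
    efficiency of a sphere with size parameter x and real relative
    refractive index m close to 1."""
    rho = 2.0 * x * (m - 1.0)                      # phase-shift parameter
    return 2.0 - (4.0 / rho) * np.sin(rho) + (4.0 / rho**2) * (1.0 - np.cos(rho))

x = np.linspace(0.5, 40.0, 200)                    # illustrative size parameters
Q_ext = ada_extinction_sphere(x, m=1.33)           # water-like refractive index
```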

4.
Fast inversion of NMR spin-echo trains by decomposition into fluid components  
This paper describes a fast inversion method that decomposes NMR spin-echo trains into fluid components. The method assumes that a fluid, whether in bulk form or saturating a porous medium, can be characterized by one NMR relaxation line shape or a set of them. For the Laplace inversion of one-dimensional NMR data, these can be one or more predetermined T2 or T1 distributions; for two-dimensional NMR, one or more predetermined (D, T2) or (T1, T2) two-dimensional distributions; and for three-dimensional NMR, one or more predetermined (D, T1, T2) three-dimensional distributions. The predetermined line shapes may be Gaussians, B-splines, or any shape established beforehand from experiment or experience. This approach significantly reduces the computation time of NMR data inversion, in particular for the multidimensional data acquired in petroleum NMR well logging, without sacrificing the smoothness or accuracy of the inverted distributions. Another novel application of the method is as a constrained solver for filtering the noise in data acquired at adjacent depths. Noise in NMR logging signals often produces different T2 distributions at adjacent depths within the same lithological formation, in which case the T2 distribution cannot be used to identify lithology. By means of non-trivial matrix manipulations, the authors successfully applied the constrained solver to the echo trains of adjacent depths, making the T2 distribution a reliable indicator for lithology identification.  
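A simplified one-dimensional illustration of the idea, assuming the T2 distribution is a nonnegative combination of a few predetermined component shapes (here Gaussians in log T2), so that fitting the echo train reduces to a small nonnegative least-squares problem; all grids, shapes and amplitudes are made up for the example, and the real multidimensional well-logging implementation is not reproduced.

```python
import numpy as np
from scipy.optimize import nnls

t = np.arange(1, 2049) * 0.5e-3             # CPMG echo times, s
T2 = np.logspace(-3, 1, 100)                # log-spaced T2 grid, s

# predetermined component line shapes: Gaussians in log10(T2),
# e.g. one per expected fluid component (bound water, free water, oil, ...)
centers, width = [-2.0, -0.7, 0.3], 0.25
F = np.array([np.exp(-((np.log10(T2) - c) / width) ** 2) for c in centers])
F /= F.sum(axis=1, keepdims=True)

K = np.exp(-np.outer(t, 1.0 / T2))          # Laplace kernel, echoes x T2 bins
A = K @ F.T                                 # echoes x number of components

# synthetic echo train from known component amplitudes, plus noise
echo = A @ np.array([0.2, 0.5, 0.3]) + 0.01 * np.random.default_rng(1).normal(size=t.size)

amps, _ = nnls(A, echo)                     # fast: 3 unknowns instead of 100
T2_distribution = amps @ F                  # reconstructed T2 distribution
```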

5.
We study the problem of measurement-induced decoherence using the phase-space approach employing the Gaussian-smoothed Wigner distribution function. Our investigation is based on the notion that measurement-induced decoherence is represented by the transition from the Wigner distribution to the Gaussian-smoothed Wigner distribution with the widths of the smoothing function identified as measurement errors. We also compare the smoothed Wigner distribution with the corresponding distribution resulting from the classical analysis. The distributions we computed are the phase-space distributions for simple one-dimensional dynamical systems such as a particle in a square-well potential and a particle moving under the influence of a step potential, and the time-frequency distributions for high-harmonic radiation emitted from an atom irradiated by short, intense laser pulses.
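A minimal sketch of the smoothing step, assuming a Wigner function already tabulated on an (x, p) grid; the Gaussian widths stand in for the position and momentum measurement errors, and the example Wigner function of a Gaussian wave packet is purely illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def smoothed_wigner(W, dx, dp, sigma_x, sigma_p):
    """Gaussian-smoothed Wigner function: convolve the tabulated W(x, p)
    with a Gaussian whose widths sigma_x, sigma_p play the role of the
    position and momentum measurement errors (converted to grid units)."""
    return gaussian_filter(W, sigma=(sigma_x / dx, sigma_p / dp))

# illustrative Wigner function of a Gaussian wave packet on an (x, p) grid
x = np.linspace(-5, 5, 201)
p = np.linspace(-5, 5, 201)
X, P = np.meshgrid(x, p, indexing="ij")
W = np.exp(-X**2 - P**2) / np.pi
W_smoothed = smoothed_wigner(W, x[1] - x[0], p[1] - p[0], sigma_x=0.5, sigma_p=0.5)
```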

6.
Nuclear magnetic resonance water proton relaxometry is widely used to investigate pore size distributions and pore connectivity in brine-saturated porous rocks and construction materials. In this paper we show that, by replacing water with acetone, a similar method can be used to probe the porous structure of freeze-dried starch gels and therefore the ice crystal size distribution in frozen starch gels. The method relies on the observation that the starch surface acts as a powerful relaxation sink for acetone proton transverse magnetization so that Brownstein-Tarr theory can be used to extract the pore size distribution from the relaxation data. In addition the relaxation time distribution is found to depend on the spectrometer frequency and the Carr-Purcell-Meiboom-Gill pulse spacing, consistent with the existence of large susceptibility-induced field gradients within the pores. The potential of this approach for noninvasively measuring ice crystal size distributions during freezing and pore size distributions during freeze-drying in other food systems is discussed.
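A small sketch of the fast-diffusion Brownstein-Tarr conversion from a T2 distribution to a pore (ice-crystal) size distribution, assuming spherical pores and an illustrative surface relaxivity; the actual relaxivity of the acetone/starch system is not taken from the paper.

```python
import numpy as np

def pore_size_distribution(T2, weights, rho2=5e-6):
    """Fast-diffusion Brownstein-Tarr limit: 1/T2 = rho2 * (S/V).
    For spherical pores S/V = 3/r, hence r = 3 * rho2 * T2.
    rho2 (m/s) is the surface relaxivity -- a placeholder value here."""
    r = 3.0 * rho2 * np.asarray(T2)            # pore radius in metres
    return r, np.asarray(weights)              # weights carry over bin by bin

T2 = np.logspace(-4, 0, 50)                    # s
w = np.exp(-((np.log10(T2) + 2) / 0.4) ** 2)   # an illustrative T2 distribution
r, w_r = pore_size_distribution(T2, w)
```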

7.
In this paper, a new statistical method to model patterns emerging in complex systems is proposed. A framework for shape analysis of two-dimensional landmark data is introduced, in which each landmark is represented by a bivariate Gaussian distribution. From information geometry we know that the Fisher-Rao metric endows the statistical manifold of parameters of a family of probability distributions with a Riemannian metric. This approach therefore allows one to reconstruct the intermediate steps in the evolution between observed shapes by computing the geodesic, with respect to the Fisher-Rao metric, between the corresponding distributions. Furthermore, the geodesic path can be used for shape predictions. As an application, we study the evolution of the rat skull shape. A future application in ophthalmology is introduced.
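A hedged sketch of the geodesic-interpolation idea for the simpler univariate Gaussian case (the paper works with bivariate landmark distributions): in coordinates (μ/√2, σ) the Fisher-Rao geometry of N(μ, σ²) is hyperbolic, so geodesics are vertical lines or semicircles centred on the μ-axis.

```python
import numpy as np

def gaussian_geodesic(mu0, sigma0, mu1, sigma1, n=20):
    """Fisher-Rao geodesic between N(mu0, sigma0^2) and N(mu1, sigma1^2).
    In coordinates (x, y) = (mu/sqrt(2), sigma) the metric is hyperbolic,
    so geodesics are vertical lines or semicircles centred on the x-axis."""
    x0, y0 = mu0 / np.sqrt(2), sigma0
    x1, y1 = mu1 / np.sqrt(2), sigma1
    if np.isclose(x0, x1):                            # vertical geodesic
        y = np.geomspace(y0, y1, n)
        x = np.full(n, x0)
    else:                                             # semicircle centred at (c, 0)
        c = (x1**2 + y1**2 - x0**2 - y0**2) / (2 * (x1 - x0))
        r = np.hypot(x0 - c, y0)
        th0, th1 = np.arctan2(y0, x0 - c), np.arctan2(y1, x1 - c)
        th = np.linspace(th0, th1, n)
        x, y = c + r * np.cos(th), r * np.sin(th)
    return x * np.sqrt(2), y                          # back to (mu, sigma)

mus, sigmas = gaussian_geodesic(0.0, 1.0, 3.0, 0.5)   # intermediate shapes
```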

8.
Steady-state wind-blown sand transport is simulated with a Lagrangian method combined with distributions of the initial motion state of lifting-off sand grains. Based on existing studies of grain impact and lift-off at the sand surface, four typical forms of the initial lift-off distribution are considered. The macroscopic quantities simulated under each distribution are compared with those measured in wind-tunnel experiments, and the most reasonable distribution form is identified by examining how well the simulated characteristic quantities of the macroscopic sand motion agree with the measurements. The analysis shows that in steady sand transport the distributions of both the initial lift-off speed and the lift-off angle should be monotonically decreasing curves; based on the experimental data, a piecewise function composed of an exponential distribution and a normal distribution is constructed to describe this curve, and further experiments confirm that this distribution form is reasonable.  
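A minimal sketch of drawing lift-off speeds from such a monotonically decreasing piecewise density (an exponential segment joined continuously to a Gaussian tail) by inverse-transform sampling on a grid; all parameter values are illustrative, not the fitted values of the paper.

```python
import numpy as np

def sample_liftoff(n, seed=0):
    """Draw lift-off speeds from a monotonically decreasing piecewise density:
    an exponential segment below v_split joined continuously to a Gaussian
    tail above it (parameters are illustrative placeholders)."""
    v = np.linspace(0.0, 5.0, 5001)
    v_split, lam, sigma = 0.8, 2.0, 1.2
    pdf = np.where(v <= v_split,
                   np.exp(-lam * v),
                   np.exp(-lam * v_split)
                   * np.exp(-(v**2 - v_split**2) / (2 * sigma**2)))
    cdf = np.cumsum(pdf)
    cdf /= cdf[-1]
    rng = np.random.default_rng(seed)
    return np.interp(rng.random(n), cdf, v)    # inverse-transform sampling

speeds = sample_liftoff(10000)
```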

9.
As part of a recent analysis of exclusive two-photon production of W+W− pairs at the LHC, the CMS experiment used di-lepton data to obtain an "effective" photon-photon luminosity. We show how the CMS analysis of their 8 TeV data, along with some assumptions about the likelihood for events in which the proton breaks up to pass the selection criteria, can be used to significantly constrain the photon parton distribution functions, such as those from the CTEQ, MRST, and NNPDF collaborations. We compare the data with predictions using these photon distributions, as well as the new LUXqed photon distribution. We study the impact of including these data on the NNPDF2.3 QED, NNPDF3.0 QED and CT14 QEDinc fits. We find that these data provide a useful and complementary cross-check on the photon distribution, which is consistent with the LUXqed prediction while suggesting that the NNPDF photon error band should be significantly reduced. Additionally, we propose a simple model for describing the two-photon production of W+W− at the LHC. Using this model, we constrain the number of inelastic photons that remain after the experimental cuts are applied.

10.
We develop a wave-optics approach for calculating the diffraction distribution of gradient refractive-index lenses and for observing the diffraction pattern of gradient refractive-index lenses in experiments. The results of our calculation are in good agreement with the experimental results obtained. We show that diffraction can be regarded as a method for checking the quality of the refractive-index distributions of gradient refractive-index lenses.

11.
In the large momentum transfer limit, generalized parton distributions can be calculated through a QCD factorization theorem which involves perturbatively calculable hard kernels and light-cone parton distribution amplitudes of hadrons. We illustrate this through the H_q(x, ξ, t) distribution for the pion and the proton, presenting the hard kernels at leading order. As a result, experimental data on generalized parton distributions in this regime can be used to determine the functional form of the parton distribution amplitudes, which has thus far been quite challenging to obtain. Our result can also be used as a constraint in phenomenological generalized parton distribution parametrizations.

12.
Pulse dipolar electron paramagnetic resonance spectroscopy provides a means of distance measurement in the range of ~1.5–10 nm between two spin labels tethered to a biological system. However, the extraction of the distance distribution between spin labels is an ill-posed mathematical problem. The most common approach for obtaining the distance distribution employs the Tikhonov regularization method, in which a regularization parameter characterizing the smoothness of the distribution is introduced. However, in the case of multi-modal distance distributions with peaks of different widths, the use of a single regularization parameter may distort the actual distribution shapes. Recently, a multi-Gaussian Monte Carlo approach was proposed for eliminating this drawback and was verified for model biradicals [1]. In the present work, we test this approach for the first time on complicated biological systems exhibiting multi-modal distance distributions. We apply multi-Gaussian analysis to pulsed electron–electron double resonance data of supramolecular ribosomal complexes, where an 11-mer oligoribonucleotide (MR) bearing two nitroxide labels at its termini is used as a reporter. The calculated distance distributions reveal the same conformations of MR as those obtained by Tikhonov regularization, but feature peaks of different widths, which leads to better resolution in several cases. The advantages, complications, and further perspectives of applying the Monte-Carlo-based multi-Gaussian approach to real biological systems are discussed.
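A greatly simplified stand-in for the multi-Gaussian analysis, fitting a distance distribution directly as a small sum of Gaussians (the actual method samples Gaussian parameters by Monte Carlo against the raw dipolar trace); the synthetic bimodal distribution and initial guesses below are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def multi_gauss(r, *params):
    """Sum of Gaussians; params = (A1, r1, w1, A2, r2, w2, ...)."""
    p = np.asarray(params).reshape(-1, 3)
    return sum(A * np.exp(-((r - r0) / w) ** 2) for A, r0, w in p)

r = np.linspace(1.5, 8.0, 400)                         # nm
# synthetic bimodal distance distribution with peaks of different widths
P = multi_gauss(r, 1.0, 3.0, 0.15, 0.6, 4.5, 0.6)
P_noisy = P + 0.02 * np.random.default_rng(2).normal(size=r.size)

p0 = [1.0, 3.1, 0.2, 0.5, 4.4, 0.5]                    # initial guesses
popt, pcov = curve_fit(multi_gauss, r, P_noisy, p0=p0)
```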

13.
We present applications of polar plots for analyzing fluorescence lifetime data acquired in the frequency domain. This graphical, analytical method is especially useful for rapid FLIM measurements. The usual method for sorting out and determining the underlying lifetime components from a complex fluorescence signal is to carry out the measurement at multiple frequencies. When it is not possible to measure at more than one frequency, as in rapid lifetime imaging, specific features of the polar plot analysis yield valuable information and provide a diagnostic visualization of the participating fluorescent species underlying a complex lifetime distribution. Data are presented where this polar plot presentation is useful for deriving valuable, unique information about the underlying component distributions. We also discuss photolysis artifacts and how this method can be applied to samples in which each fluorescent species shows a continuous distribution of lifetimes. Polar plots of frequency-domain data are commonly used for the analysis of dielectric relaxation experiments (Cole–Cole plots), where they have proved exceptionally useful for decades. We compare this analytical tool, well developed and extensively used in dielectric relaxation and chemical kinetics, with fluorescence measurements.
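A small sketch of the polar (phasor) coordinates used in this kind of analysis: g = m cos φ and s = m sin φ, with a mono-exponential species falling on the universal semicircle; the modulation frequency and lifetime below are arbitrary examples.

```python
import numpy as np

def polar_coordinates(phase, modulation):
    """Map frequency-domain data (phase in rad, modulation depth m)
    onto the polar/phasor plot: g = m*cos(phi), s = m*sin(phi)."""
    return modulation * np.cos(phase), modulation * np.sin(phase)

def single_exponential_point(tau, omega):
    """A mono-exponential lifetime tau at angular frequency omega lies on
    the universal semicircle of radius 1/2 centred at (1/2, 0)."""
    wt = omega * tau
    return 1.0 / (1.0 + wt**2), wt / (1.0 + wt**2)

omega = 2 * np.pi * 80e6                            # 80 MHz modulation, illustrative
g, s = single_exponential_point(2.5e-9, omega)      # a 2.5 ns lifetime
```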

14.
The minimization of Fisher information (MFI) approach of Frieden et al. [Phys. Rev. E 60, 48 (1999)] is applied to the study of size distributions in social groups on the basis of a recently established analogy between scale-invariant systems and classical gases [Phys. A 389, 490 (2010)]. Going beyond the ideal-gas scenario is seen to be tantamount to simulating the interactions taking place, for a competitive cluster growth process, in a scale-free ideal network – a non-correlated network with a connection-degree distribution that mimics the scale-free ideal gas density distribution. We use a scaling rule that allows the final cluster-size distributions to be classified using a single parameter that we call the competitiveness, which can be seen as a measure of the strength of the interactions. We find that both empirical city-size distributions and electoral results can thus be reproduced and classified according to this competitiveness parameter, which also allows us to infer the maximum number of stable social relationships that one person can maintain, known as the Dunbar number, together with its standard deviation. We discuss the importance of this number in connection with the empirical phenomenon known as "six degrees of separation". Finally, we show that scaled city-size distributions of large countries follow, in general, the same universal distribution.

15.
16.
We discuss the equivalence between kinetic wealth-exchange models, in which agents exchange wealth during trades, and mechanical models of particles exchanging energy during collisions. The universality of the underlying dynamics is shown both through a variational approach based on the minimization of the Boltzmann entropy and through a microscopic analysis of the collision dynamics of molecules in a gas. In various relevant cases, the equilibrium distribution is well approximated by a gamma distribution with suitably defined temperature and number of dimensions. This in turn allows one to quantify the inequalities observed in wealth distributions and suggests that their origin should be traced back to very general underlying mechanisms: for instance, the smaller the fraction of the relevant quantity (e.g. wealth) that an agent can exchange during an interaction, the closer the corresponding equilibrium distribution is to a fair distribution.
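A minimal simulation sketch of a kinetic wealth-exchange model of this kind, assuming the Chakraborti–Chakrabarti rule with a global saving propensity lam; population size, number of trades and lam are illustrative choices.

```python
import numpy as np

def wealth_exchange(n_agents=1000, n_steps=200000, lam=0.5, seed=0):
    """Pairwise kinetic exchange with saving propensity lam: each agent keeps
    a fraction lam of its wealth and the remainder of the pair's wealth is
    randomly redistributed between the two trading partners."""
    rng = np.random.default_rng(seed)
    w = np.ones(n_agents)
    for _ in range(n_steps):
        i, j = rng.integers(n_agents, size=2)
        if i == j:
            continue
        eps = rng.random()
        pool = (1 - lam) * (w[i] + w[j])            # total wealth is conserved
        w[i], w[j] = lam * w[i] + eps * pool, lam * w[j] + (1 - eps) * pool
    return w           # the histogram of w is well approximated by a gamma law

wealth = wealth_exchange()
```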

17.
Probabilistic predictions with machine learning are important in many applications. These are commonly made with Bayesian learning algorithms. However, Bayesian learning methods are computationally expensive in comparison with non-Bayesian methods. Furthermore, the data used to train these algorithms are often distributed over a large group of end devices. Federated learning can be applied in this setting in a communication-efficient and privacy-preserving manner, but it does not include predictive uncertainty. To represent predictive uncertainty in federated learning, we suggest introducing uncertainty in the aggregation step of the algorithm by treating the set of local weights as a posterior distribution for the weights of the global model. We compare our approach to state-of-the-art Bayesian and non-Bayesian probabilistic learning algorithms. By applying proper scoring rules to evaluate the predictive distributions, we show that our approach can achieve performance similar to what the benchmark would achieve in a non-distributed setting.
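A toy sketch of the aggregation idea, assuming linear-regression clients: the locally fitted weight vectors are treated as posterior samples for the global model, and the predictive mean and spread come from the resulting ensemble; client data sizes and noise levels are made up for the example.

```python
import numpy as np

def local_fit(X, y):
    """Ordinary least squares on one client's local data."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def predictive(x, local_weights):
    """Treat the set of local weight vectors as posterior samples for the
    global model and return the predictive mean and standard deviation."""
    preds = np.array([x @ w for w in local_weights])
    return preds.mean(axis=0), preds.std(axis=0)

rng = np.random.default_rng(3)
clients = []
for _ in range(10):                                   # 10 end devices
    X = np.column_stack([np.ones(50), rng.normal(size=50)])
    y = X @ np.array([1.0, 2.0]) + 0.5 * rng.normal(size=50)
    clients.append(local_fit(X, y))

x_new = np.array([[1.0, 0.3], [1.0, -1.2]])
mean, std = predictive(x_new, clients)                # predictive distribution summary
```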

18.
In the parameter estimation of limiting extreme value distributions, most commonly employed methods use only part of the available data. With the peaks-over-threshold method for the Generalized Pareto Distribution (GPD), only the observations above a certain threshold are considered, so a large amount of information is discarded. The aim of this work is to make the most of the information provided by the observations in order to improve the accuracy of Bayesian parameter estimation. We present two new Bayesian methods to estimate the parameters of the GPD that take into account the whole data set from the baseline distribution and the existing relations between the baseline and the limiting GPD parameters in order to define highly informative priors. We compare the Bayesian Metropolis–Hastings algorithm applied to the data over the threshold with the new methods when the baseline distribution is a stable distribution, whose properties ensure that we can reduce the problem to the study of standard distributions and also allow us to propose new estimators for the parameters of the tail distribution. Specifically, three cases of stable distributions were considered: the Normal, Lévy and Cauchy distributions, as main examples of the different tail behaviors of a distribution. Nevertheless, the methods are applicable to many other baseline distributions by finding relations between baseline and GPD parameters via simulation studies. To illustrate this, we apply the methods to real air-pollution data from Badajoz (Spain), whose baseline distribution is well fitted by a Gamma distribution, and show that the baseline methods improve estimates compared to the Bayesian Metropolis–Hastings algorithm.
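For comparison, a sketch of the classical peaks-over-threshold fit that the new methods are measured against, assuming a heavy-tailed (Cauchy) baseline sample and a 95th-percentile threshold; scipy's maximum-likelihood GPD fit is used with the location fixed at zero.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
sample = rng.standard_cauchy(size=20000)        # heavy-tailed baseline data

threshold = np.quantile(sample, 0.95)           # keep only the top 5% of the sample
exceedances = sample[sample > threshold] - threshold

# maximum-likelihood GPD fit to the exceedances (location fixed at 0)
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)

# tail probability estimate P(X > x) for a high level x
x = threshold + 10.0
p_exceed = 0.05 * stats.genpareto.sf(x - threshold, shape, loc=0.0, scale=scale)
```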

19.
The reliability of procedures for extracting the distance distribution between spins from the dipolar evolution function is studied, with particular emphasis on broad distributions. A new, numerically stable procedure for fitting distance distributions with polynomial interpolation between sampling points is introduced and compared to Tikhonov regularization in the dipolar frequency and distance domains and to approximate Pake transformation. Distance distributions with only narrow peaks are most reliably extracted by distance-domain Tikhonov regularization, while frequency-domain Tikhonov regularization is favorable for distributions with only broad peaks. For the quantification of distributions by their mean distance and variance, Hermite polynomial interpolation provides the best results. Distributions that contain both broad and narrow peaks are the most difficult to analyze. In this case a high signal-to-noise ratio is strictly required and approximate Pake transformation should be applied. A procedure is given for renormalizing primary experimental data from protein preparations with slightly different degrees of spin labelling, so that they can be compared directly. The performance of all the data analysis procedures is demonstrated on experimental data for a shape-persistent biradical with a label-to-label distance of 5 nm, for a [2]catenane with a broad distance distribution, and for a doubly spin-labelled double mutant of plant light-harvesting complex II.
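A generic sketch of the Tikhonov regularization step, minimizing ||K P − S||² + α||L P||² with a second-derivative roughness operator; the kernel below is a toy smoothing kernel, not the actual dipolar (Pake) kernel, and no non-negativity constraint is imposed.

```python
import numpy as np

def tikhonov(K, S, alpha):
    """Solve min ||K P - S||^2 + alpha ||L P||^2, where L is the
    second-difference operator enforcing smoothness of P."""
    n = K.shape[1]
    L = np.diff(np.eye(n), n=2, axis=0)            # (n-2) x n second differences
    A = K.T @ K + alpha * (L.T @ L)
    return np.linalg.solve(A, K.T @ S)

# toy ill-posed problem: a smoothing kernel stands in for the dipolar kernel
t = np.linspace(0, 2, 200)[:, None]
r = np.linspace(1, 6, 100)[None, :]
K = np.exp(-t * r)                                 # placeholder kernel
P_true = np.exp(-((r.ravel() - 3.0) / 0.4) ** 2)
S = K @ P_true + 0.01 * np.random.default_rng(5).normal(size=t.size)

P_est = tikhonov(K, S, alpha=1e-2)                 # regularized distance distribution
```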

20.
Gamma-type distributions of site adsorption energies, subject to the Langmuir probability of occupancy, are analyzed with the aim of developing criteria for the reliability of estimates of the energy distribution from experimental adsorption data. For broad distributions this model leads to preferential occupation of the sites with the highest adsorption energies, which causes the breakdown of the virial expansion of the adsorption isotherms. The resulting isotherms can be discriminated from simple Langmuir isotherms only for distributions that are not too narrow, and the critical criterion is given. Finally, it is pointed out that experimental data do not allow one to discern between a continuous and a discrete distribution. All the conclusions can be generalized to other types of distributions of adsorption energies, including composite distributions.
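A small numerical sketch of the composite isotherm implied above: the overall coverage is the Langmuir isotherm averaged over a gamma-type distribution of adsorption energies; all constants are illustrative placeholders.

```python
import numpy as np

def overall_isotherm(p, k_shape=3.0, E_scale=2.0, K0=1e-3, RT=2.5):
    """Coverage theta(p) for a gamma-type distribution of adsorption energies,
    each site following a Langmuir isotherm with affinity K0*exp(E/RT).
    Energies in kJ/mol, RT ~ 2.5 kJ/mol at room temperature (illustrative)."""
    E = np.linspace(0.0, 60.0, 2000)
    dE = E[1] - E[0]
    f = E ** (k_shape - 1) * np.exp(-E / E_scale)      # gamma-type density
    f /= f.sum() * dE                                  # normalize
    K = K0 * np.exp(E / RT)                            # site affinity
    theta_local = K[None, :] * p[:, None] / (1 + K[None, :] * p[:, None])
    return (f[None, :] * theta_local).sum(axis=1) * dE # average over energies

pressure = np.logspace(-4, 2, 60)
theta = overall_isotherm(pressure)
```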
