Similar articles (20 results found)
1.
Uncertainty estimation and modeling of geoacoustic parameters and transmission loss
The uncertainty of geoacoustic parameters has an important influence on underwater acoustic propagation. A Bayesian inference model for underwater acoustic environmental uncertainty is established: the likelihood function of the geoacoustic parameters and the posterior probability densities of the geoacoustic parameters and the transmission loss are derived theoretically, and Markov chain Monte Carlo (MCMC) simulation is used to obtain the two-dimensional joint posterior densities and one-dimensional marginal posterior densities of the geoacoustic parameters. On this basis the uncertainty of the transmission loss is estimated, and an 80% credible interval for the transmission loss is obtained. Simulation and experimental results show that the method is suitable for geoacoustic parameter inversion and uncertainty estimation, and can quantify the transmission-loss uncertainty induced by the uncertainty of the geoacoustic parameters.
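As an illustration of the workflow this abstract describes, the following is a minimal Python sketch: Metropolis sampling of a single geoacoustic parameter, followed by an 80% credible interval for the derived transmission loss. The linear `tl_model`, the parameter range, and all numerical values are invented stand-ins, not the paper's acoustic propagation model.

```python
import math, random

random.seed(0)

# Hypothetical linear mapping from a sediment parameter c to transmission
# loss (TL); this stand-in replaces the real propagation model.
def tl_model(c):
    return 60.0 + 0.05 * (c - 1600.0)

true_c, noise = 1620.0, 1.0
data = [tl_model(true_c) + random.gauss(0, noise) for _ in range(20)]

def log_like(c):
    return -0.5 * sum((d - tl_model(c)) ** 2 for d in data) / noise**2

# Metropolis sampler over a uniform prior on [1500, 1700]
c, ll, samples = 1600.0, None, []
ll = log_like(c)
for _ in range(20000):
    prop = c + random.gauss(0, 5.0)
    if 1500.0 <= prop <= 1700.0:
        llp = log_like(prop)
        if math.log(random.random()) < llp - ll:
            c, ll = prop, llp
    samples.append(c)

post = samples[5000:]                       # discard burn-in
tl_samples = sorted(tl_model(s) for s in post)
lo = tl_samples[int(0.10 * len(tl_samples))]
hi = tl_samples[int(0.90 * len(tl_samples))]
print(f"80% credible interval for TL: [{lo:.2f}, {hi:.2f}]")
```

The interval propagates the posterior uncertainty of the parameter into the derived transmission loss, mirroring the paper's idea at toy scale.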

2.
This editorial summarizes the scope of the Special Issue: Approximate Bayesian Inference.

3.
Formal Bayesian comparison of two competing models, based on the posterior odds ratio, amounts to estimation of the Bayes factor, which equals the ratio of the two models' marginal data densities. In models with a large number of parameters and/or latent variables, these marginal densities are high-dimensional integrals that are often computationally infeasible, so other methods of evaluating the Bayes factor are needed. In this paper, a new method of estimating the Bayes factor is proposed. Simulation examples confirm good performance of the proposed estimators. Finally, these new estimators are used to formally compare different hybrid Multivariate Stochastic Volatility–Multivariate Generalized Autoregressive Conditional Heteroskedasticity (MSV-MGARCH) models, which have a large number of latent variables. The empirical results show, among other things, that the validity of reducing the hybrid MSV-MGARCH model to the MGARCH specification depends on the analyzed data set as well as on prior assumptions about the model parameters.
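A toy illustration of Bayes-factor estimation in the spirit of this abstract: for a conjugate normal-mean problem the marginal data densities are available in closed form, so a simple Monte Carlo estimator can be checked against the exact answer. The model and all values are illustrative assumptions, not the MSV-MGARCH setting of the paper.

```python
import math, random

random.seed(1)

# Toy data: 50 draws from N(0.3, 1). The conjugate normal-mean setup is a
# hypothetical stand-in for the high-dimensional latent-variable case.
sigma, tau, n = 1.0, 1.0, 50
y = [random.gauss(0.3, sigma) for _ in range(n)]
ybar = sum(y) / n

# Closed-form log Bayes factor of M0 (mu = 0) vs M1 (mu ~ N(0, tau^2))
A = n / sigma**2 + 1 / tau**2
B = n * ybar / sigma**2
log_bf01 = 0.5 * math.log(A * tau**2) - B**2 / (2 * A)

# Monte Carlo estimate of the same quantity: BF01 is the reciprocal of
# the prior-averaged likelihood ratio p(y | mu) / p(y | mu = 0)
def log_lr(mu):
    return sum(yi**2 - (yi - mu) ** 2 for yi in y) / (2 * sigma**2)

draws = [log_lr(random.gauss(0, tau)) for _ in range(50000)]
m = max(draws)  # log-sum-exp trick for numerical stability
log_mean = m + math.log(sum(math.exp(d - m) for d in draws) / len(draws))
log_bf01_mc = -log_mean

print(log_bf01, log_bf01_mc)  # the two estimates should nearly coincide
```

In the models the paper targets, no closed form exists, which is exactly why specialised estimators of the marginal density are needed; the conjugate case merely lets the simulation be verified.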

4.
From its inception in the 1950s to the modern frontiers of applied statistics, Markov chain Monte Carlo has been one of the most ubiquitous and successful methods in statistical computing. The development of the method in that time has been fueled by not only increasingly difficult problems but also novel techniques adopted from physics. Here, the history of Markov chain Monte Carlo is reviewed from its inception with the Metropolis method to the contemporary state of the art in Hamiltonian Monte Carlo, focusing on the evolving interplay between the statistical and physical perspectives of the method.
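The endpoint of the history sketched above, Hamiltonian Monte Carlo, can be illustrated in a few lines: leapfrog integration of Hamiltonian dynamics followed by a Metropolis accept/reject step on the total energy. This is a minimal sketch for a 1-D standard normal target, not the implementation of any particular package; step size and trajectory length are arbitrary choices.

```python
import math, random

random.seed(2)

def grad_neglogp(q):          # gradient of -log N(q; 0, 1)
    return q

def hmc_step(q, eps=0.2, L=20):
    p = random.gauss(0, 1)                       # resample momentum
    q_new, p_new = q, p
    p_new -= 0.5 * eps * grad_neglogp(q_new)     # leapfrog: half step in p
    for _ in range(L - 1):
        q_new += eps * p_new                     # full step in position
        p_new -= eps * grad_neglogp(q_new)       # full step in momentum
    q_new += eps * p_new
    p_new -= 0.5 * eps * grad_neglogp(q_new)     # final half step in p
    # Metropolis correction on the Hamiltonian (potential + kinetic energy)
    h_old = 0.5 * q**2 + 0.5 * p**2
    h_new = 0.5 * q_new**2 + 0.5 * p_new**2
    return q_new if math.log(random.random()) < h_old - h_new else q

q, samples = 0.0, []
for _ in range(5000):
    q = hmc_step(q)
    samples.append(q)

mean = sum(samples) / len(samples)
var = sum(s * s for s in samples) / len(samples)
print(mean, var)   # should be near 0 and 1 for the standard normal target
```

The physics lineage the review traces is visible directly: the proposal is a simulated Hamiltonian trajectory, and the accept/reject step only corrects for discretization error.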

5.
6.
A Markov chain Monte Carlo (MCMC) algorithm has been reported which is capable of determining the probabilistic orientation of two-fibre populations from high angular resolution diffusion-weighted imaging (HARDI) data. We present and critically discuss the application of this algorithm to in vivo human datasets acquired in clinically realistic times. We show that, with appropriate model selection, areas of multiple fibre populations can be identified that correspond with those predicted from known anatomy. Quantitative maps of fibre orientation probability are derived and shown for one- and two-fibre models of neural architecture. Fibre crossings in the pons, the internal capsule and the corona radiata are shown. In addition, we demonstrate that in areas with multi-fibre populations, the relative proportion of anisotropic signal may be a more appropriate measure of anisotropy than summary measures derived from the tensor model, such as fractional anisotropy.

7.
In this paper, we begin by introducing a novel scale mixture of normal distribution whose leptokurticity and fat-tailedness are only local, with this "locality" separately controlled by two censoring parameters. This new, locally leptokurtic and fat-tailed (LLFT) distribution makes a viable alternative to the globally leptokurtic, fat-tailed and symmetric distributions typically entertained in financial volatility modelling. We then incorporate the LLFT distribution into a basic stochastic volatility (SV) model to yield a flexible alternative to common heavy-tailed SV models. For the resulting LLFT-SV model, we develop a Bayesian statistical framework and effective MCMC methods to enable posterior sampling of the parameters and latent variables. Empirical results indicate the validity of the LLFT-SV specification for modelling both "non-standard" financial time series with repeating zero returns and more "typical" data on the S&P 500 and DAX indices. For the former, the LLFT-SV model is also shown to markedly outperform a common, globally heavy-tailed, t-SV alternative in terms of density forecasting. Applications of the proposed distribution in more advanced SV models appear to be readily attainable.

8.
Ozone concentrations are key indicators of air quality. Modeling them is challenging because they vary both spatially and temporally with complicated structure, and missing data bring further difficulties. One of our interests in this paper is to model ozone concentrations in a region in the presence of missing data. We propose a method, free of assumptions on the correlation structure, that estimates the covariance matrix through a dimension-expansion method for modeling the semivariograms in nonstationary fields, based on estimates from the hierarchical Bayesian spatio-temporal modeling technique of Le and Zidek. Further, we apply an entropy criterion (Jin et al.), based on a predictive model, to decide whether new stations need to be added; this criterion helps to solve the environmental network design problem. For demonstration, we apply the method to ozone concentrations at 25 stations in the Pittsburgh region. A comparison between the proposed method and the original approach, via leave-one-out cross-validation, shows that the proposed method is more general and applicable.

9.
Entropy measures the uncertainty associated with a random variable. It has important applications in cybernetics, probability theory, astrophysics, the life sciences and other fields. Recently, many authors have focused on estimating entropy under different life distributions; however, entropy estimation for the generalized Bilal (GB) distribution has not yet been addressed. In this paper, we consider estimation of the entropy and the parameters of the GB distribution based on adaptive Type-II progressive hybrid censored data. Maximum likelihood estimates of the entropy and the parameters are obtained using the Newton–Raphson iteration method. Bayesian estimates under different loss functions are provided with the help of Lindley's approximation. The approximate confidence interval and the Bayesian credible interval of the parameters and entropy are obtained using the delta method and Markov chain Monte Carlo (MCMC) methods, respectively. Monte Carlo simulation studies are carried out to assess the performance of the different point and interval estimators. Finally, a real data set is analyzed for illustrative purposes.
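The Newton–Raphson MLE plus delta-method interval construction mentioned in this abstract can be sketched on a simpler distribution. Below, an exponential model stands in for the GB distribution (whose density is not reproduced here): Newton–Raphson maximizes the likelihood for the rate, and a delta-method confidence interval follows for the differential entropy h(Exp(lambda)) = 1 - ln(lambda).

```python
import math, random

random.seed(3)

# Simulated uncensored sample; the exponential model is an illustrative
# stand-in, not the generalized Bilal distribution of the paper.
n = 200
lam_true = 2.0
x = [random.expovariate(lam_true) for _ in range(n)]
S = sum(x)

lam = 1.0                          # Newton-Raphson starting value
for _ in range(50):
    score = n / lam - S            # d loglik / d lambda
    hess = -n / lam**2             # d^2 loglik / d lambda^2
    lam -= score / hess            # Newton-Raphson update

entropy = 1.0 - math.log(lam)      # h(Exp(lambda)) = 1 - ln(lambda)
se = 1.0 / math.sqrt(n)            # delta method: Var(h_hat) ~ 1/n here
ci = (entropy - 1.96 * se, entropy + 1.96 * se)
print(lam, entropy, ci)
```

For the exponential the MLE is of course available in closed form (n/S), which makes it easy to check that the iteration has converged; in the censored GB setting of the paper, the iteration is the practical route.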

10.
High levels of so-called community noise may produce hazardous effects on the health of a population exposed to them for long periods of time, so the behaviour of such noise measurements is important to study. In this work we analyse it in terms of the probability of exceeding a given threshold level a certain number of times in a time interval of interest. Since the datasets considered contain missing measurements, we use a time series model to estimate the missing values and complete the datasets. Once the data are complete, we use a non-homogeneous Poisson model with multiple change-points to estimate the probability of interest. The parameters of the models are estimated using standard time series methodology as well as from the Bayesian point of view via Markov chain Monte Carlo algorithms. The models are applied to data obtained from two measuring sites in Messina, Italy.
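The exceedance probability at the heart of this abstract is straightforward once the intensity is integrated: for a non-homogeneous Poisson process, the count on [0, T] is Poisson with mean equal to the integrated intensity. A sketch with an invented piecewise-constant (change-point) intensity:

```python
import math

def mean_count(changepoints, rates, T):
    """Integrate a piecewise-constant intensity over [0, T]."""
    total, t0 = 0.0, 0.0
    for t1, r in zip(changepoints + [T], rates):
        total += r * (min(t1, T) - t0)
        t0 = t1
        if t0 >= T:
            break
    return total

def prob_at_least(k, mu):
    """P(N >= k) for N ~ Poisson(mu)."""
    p_below = sum(math.exp(-mu) * mu**j / math.factorial(j) for j in range(k))
    return 1.0 - p_below

# Illustrative values: two change-points at t = 10 and t = 20,
# three noise-exceedance rates per unit time (all made up).
mu = mean_count(changepoints=[10.0, 20.0], rates=[0.5, 2.0, 0.8], T=30.0)
print(mu, prob_at_least(5, mu))
```

In the paper the change-point locations and rates are themselves uncertain and are sampled via MCMC; the computation above is what each posterior draw would feed into.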

11.
Since the coronavirus disease 2019 (COVID-19) pandemic began, most professional sports events have been held without spectators. It is generally believed that home teams deprived of enthusiastic support from their home fans lose some of the benefit of playing on their home fields and thus become less likely to win. This study attempts to confirm whether this belief holds in four major European football leagues through statistical analysis. It proposes a Bayesian hierarchical Poisson model to estimate parameters reflecting the home advantage and the change in that advantage; these parameters are then used to improve the performance of machine-learning-based prediction models for football matches played after the COVID-19 break. The study describes the statistical analysis of the pandemic's impact on football match results in terms of expected score and goal difference, shows that the estimated parameters from the proposed model reflect the changed home advantage, and verifies that these parameters, when included as additional features, enhance the performance of various football match prediction models. The home advantage in European football matches changed because of the behind-closed-doors policy implemented during the COVID-19 pandemic; using parameters reflecting the pandemic's impact, more precise results can be predicted for spectator-free matches after the break.
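A minimal sketch of the kind of Poisson score model the study builds on hierarchically: expected goals depend log-linearly on attack/defence strengths plus a home-advantage term, and shrinking that term shrinks the expected goal difference. All parameter names and values here are invented for illustration.

```python
import math

def expected_goals(att_home, def_away, att_away, def_home, home_adv):
    """Independent-Poisson match model with a home-advantage term."""
    lam_home = math.exp(att_home - def_away + home_adv)
    lam_away = math.exp(att_away - def_home)
    return lam_home, lam_away

# With spectators (home_adv = 0.25) versus behind closed doors (0.0);
# the strength values are arbitrary illustration numbers.
with_fans = expected_goals(0.2, 0.1, 0.1, 0.15, home_adv=0.25)
no_fans = expected_goals(0.2, 0.1, 0.1, 0.15, home_adv=0.0)
diff_with = with_fans[0] - with_fans[1]
diff_without = no_fans[0] - no_fans[1]
print(diff_with, diff_without)   # the home edge shrinks without fans
```

In the study itself, the strengths and the home-advantage change are given priors and estimated jointly; the point of the sketch is only how the home-advantage parameter enters the expected score and goal difference.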

12.
13.
The application of Bayesian methods in cosmology and astrophysics has flourished over the past decade, spurred by data sets of increasing size and complexity. In many respects, Bayesian methods have proven to be vastly superior to more traditional statistical tools, offering the advantage of higher efficiency and of a consistent conceptual basis for dealing with the problem of induction in the presence of uncertainty. This trend is likely to continue in the future, when the way we collect, manipulate and analyse observations and compare them with theoretical models will assume an even more central role in cosmology.

This review is an introduction to Bayesian methods in cosmology and astrophysics and to recent results in the field. I first present Bayesian probability theory and its conceptual underpinnings, Bayes' Theorem and the role of priors. I discuss the problem of parameter inference and its general solution, along with numerical techniques such as Markov chain Monte Carlo methods. I then review the theory and application of Bayesian model comparison, discussing the notions of Bayesian evidence and effective model complexity, and how to compute and interpret those quantities. Recent developments in cosmological parameter extraction and Bayesian cosmological model building are summarised, highlighting the challenges that lie ahead.

14.
We extend the majority model (introduced by Tsallis in 1982) in the sense that the required majority may differ from the simple majority. We simulate these models for typical cases, including simple and 2/3 majorities. We exhibit the average cluster size as well as the order parameter as functions of p, the concentration of one of the two possible constituents. No crossover exists between the simple- and non-simple-majority models.

15.
Extracting latent nonlinear dynamics from observed time-series data is important for understanding a dynamic system against the background of the observed data. A state space model is a probabilistic graphical model for time-series data which describes the probabilistic dependence between latent variables at subsequent times and between latent variables and observations. Since the values of the parameters in the state space model are often unknown, estimating them from observations is an important task. The particle marginal Metropolis–Hastings (PMMH) method estimates the marginal posterior distribution of the parameters, obtained by marginalizing over the distribution of latent variables in the state space model. Although in principle the marginal posterior could be recovered by iterating this method indefinitely, in practice the estimate after a finite number of iterations depends on the initial values. In this paper, we propose a replica exchange particle marginal Metropolis–Hastings (REPMMH) method that mitigates this problem by combining the PMMH method with the replica exchange method, simultaneously realizing a global search at high temperature and a fine local search at low temperature. We evaluate the proposed method using simulated data obtained from the Izhikevich neuron model and a Lévy-driven stochastic volatility model, and show that REPMMH reduces the initial-value dependence of PMMH and achieves more efficient sampling of parameters in state space models than existing methods.
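The replica-exchange ingredient that REPMMH adds can be shown in isolation with plain Metropolis chains on a bimodal target (the particle-marginal likelihood estimation is omitted). Temperatures, proposal scales, and the target are illustrative choices, not the paper's settings.

```python
import math, random

random.seed(4)

def log_target(x):
    """Log density (unnormalized) of a well-separated two-mode mixture."""
    a = -0.5 * (x - 4.0) ** 2
    b = -0.5 * (x + 4.0) ** 2
    m = max(a, b)   # log-sum-exp for numerical safety far from the modes
    return m + math.log(0.5 * math.exp(a - m) + 0.5 * math.exp(b - m))

temps = [1.0, 3.0, 9.0]           # replica i targets pi(x)^(1/T_i)
chains = [0.0 for _ in temps]
samples = []
for it in range(30000):
    # local Metropolis move within each replica
    for i, T in enumerate(temps):
        prop = chains[i] + random.gauss(0, 1.5)
        if math.log(random.random()) < (log_target(prop) - log_target(chains[i])) / T:
            chains[i] = prop
    # attempt a swap between one randomly chosen adjacent pair
    i = random.randrange(len(temps) - 1)
    a = (1 / temps[i] - 1 / temps[i + 1]) * (
        log_target(chains[i + 1]) - log_target(chains[i]))
    if math.log(random.random()) < a:
        chains[i], chains[i + 1] = chains[i + 1], chains[i]
    samples.append(chains[0])     # record only the cold chain

frac_right = sum(1 for s in samples[5000:] if s > 0) / len(samples[5000:])
print(frac_right)   # both modes visited; fraction near 0.5
```

The hot replicas cross the probability barrier freely and hand well-mixed states down to the cold chain via swaps, which is exactly the initial-value-dependence cure the abstract describes, here divorced from the particle filter.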

16.
17.
Rapid determination of amino acids in Cordyceps by artificial neural networks and near-infrared spectroscopy
A new method is proposed for the rapid determination of amino acid content in fermented Cordyceps sinensis by near-infrared diffuse reflectance spectroscopy. Amino acid contents of the Cordyceps mycelium powder were measured by colorimetry, and a BP neural network was used to build quantitative models relating the near-infrared spectra to the contents of glycine, arginine and total amino acids. By comparing different spectral pretreatments and spectral ranges, the optimal model was obtained: first-derivative spectra in the 7501.7-6097.8 and 5453.7-4246.5 cm-1 regions modeled against amino acid content. The prediction standard deviations for glycine, arginine and total amino acids were 0.08, 0.07 and 0.36, respectively, better than the results of linear methods such as principal component regression (PCR) and partial least squares (PLS) regression. The results show that the approach is an effective and practical nonlinear calibration method, and offers a new route to rapid near-infrared determination of component contents in traditional Chinese medicines.

18.
The design of targets for low-energy medical accelerators conventionally follows the thick-target principle: the target thickness equals or exceeds the electron range, so that no electron contamination emerges behind the target. This paper proposes a new design concept: the target and its associated systems (primary collimator cone and flattening filter) are considered as a whole, the target thickness is reduced to raise the photon dose rate, and the primary collimator cone and flattening filter are relied upon to suppress electron contamination. Following this concept, the target of the BJ-6 6 MV single-photon medical electron linear accelerator was optimized with the Monte Carlo method. Simulated depth-dose curves for 10 cm × 10 cm and 30 cm × 30 cm fields show that, compared with the existing thick target, the treatment dose is increased by 10% while the surface dose and X-ray quality still meet the national standard. Comparative whole-machine experiments with thin and thick targets have verified this conclusion.

19.
刘超  张尚剑  谢亮  祝宁华 《物理学报》2005,54(6):2606-2610
Two-port test-fixture calibration for a vector network analyzer (VNA) normally requires at least three known standards. Based on Triple-Through theory, this paper constructs two virtual symmetric networks and proposes a new two-port test-fixture calibration method that requires only a single standard. After the fixture is calibrated with this method, de-embedded measurement results agree closely with direct measurements taken without the fixture, demonstrating that the method is accurate, simple and practical. Keywords: calibration; test fixture; network analyzer; scattering-parameter measurement

20.
Diffusion approximation and simulation of the fluence-rate distribution of continuous-wave light in biological tissue
The image-source structures of different boundary conditions in the diffusion approximation for a semi-infinite medium are analysed. Using the image-source method, a diffusion-approximation expression for the steady-state fluence-rate distribution under continuous-wave illumination is derived, and the fluence-rate distribution is also simulated with the Monte Carlo method. The characteristics of the fluence-rate distributions of the two models, and the mechanisms that produce them, are analysed, and the Monte Carlo results are used to check the accuracy of the diffusion approximation. The results show that the diffusion approximation with the extrapolated boundary condition (EBC) is both more accurate and simpler to compute, providing a basis for fast and accurate calculation of the fluence-rate distribution.
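As a minimal companion to the diffusion-approximation discussion above: the steady-state fluence rate of an isotropic point source in an infinite homogeneous medium, which is the building block the image-source constructions for the semi-infinite case are assembled from. The optical properties below are illustrative values, not the paper's tissue parameters.

```python
import math

# Illustrative optical properties (mm^-1)
mu_a = 0.01        # absorption coefficient
mu_s_prime = 1.0   # reduced scattering coefficient
D = 1.0 / (3.0 * (mu_a + mu_s_prime))   # diffusion coefficient (mm)
mu_eff = math.sqrt(mu_a / D)            # effective attenuation (mm^-1)

def fluence(r, power=1.0):
    """Infinite-medium diffusion solution: phi(r) = P exp(-mu_eff r) / (4 pi D r)."""
    return power * math.exp(-mu_eff * r) / (4.0 * math.pi * D * r)

print(fluence(1.0), fluence(10.0))   # fluence falls off with distance
```

A semi-infinite solution such as the EBC variant discussed above superposes terms of this form for the real source and its mirror images, which is what the Monte Carlo simulation then validates.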

Copyright © 北京勤云科技发展有限公司  京ICP备09084417号