Similar Articles
20 similar articles found.
1.
To address the implicit performance functions and small-sample problems encountered in reliability analysis of complex structures, a structural non-probabilistic reliability analysis method combining particle swarm optimization (PSO) and the Kriging model is proposed. The uncertain structural parameters are described by a multidimensional ellipsoid, particle swarm optimization is used to solve for the relevant model parameters, and a Kriging model of the implicit performance function is constructed for the reliability analysis. Three numerical examples show that the proposed method is effective and feasible, and that both its accuracy and efficiency are superior to those of the non-probabilistic reliability analysis method based on the Kriging model alone.
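A minimal sketch (not the paper's implementation) of the two ingredients described above: a Kriging surrogate of an implicit performance function built from a small design of experiments, and a plain particle swarm search over a multidimensional ellipsoid for the worst-case response. The performance function `g_true`, the ellipsoid centre and shape matrix, and all PSO settings are hypothetical placeholders.

```python
# Minimal sketch: Kriging surrogate + hand-rolled PSO searching an ellipsoidal
# uncertainty set for the worst-case (minimum) performance.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def g_true(x):                      # hypothetical "implicit" performance function
    return 3.0 - 0.1 * x[..., 0] ** 2 + x[..., 1]

# Ellipsoidal uncertainty set: (x - c)^T W (x - c) <= 1
c = np.array([1.0, -0.5])
W = np.diag([1.0 / 0.6 ** 2, 1.0 / 0.4 ** 2])

# Small design of experiments inside the ellipsoid -> train the Kriging model
theta = rng.uniform(0, 2 * np.pi, 20)
r = np.sqrt(rng.uniform(0, 1, 20))
X_train = c + np.column_stack([0.6 * r * np.cos(theta), 0.4 * r * np.sin(theta)])
y_train = g_true(X_train)
gp = GaussianProcessRegressor(ConstantKernel() * RBF(), normalize_y=True).fit(X_train, y_train)

def penalized(x):                   # surrogate prediction + penalty outside the ellipsoid
    d = x - c
    viol = max(0.0, d @ W @ d - 1.0)
    return gp.predict(x.reshape(1, -1))[0] + 1e3 * viol

# Plain PSO searching for the minimum of the surrogate over the ellipsoid
n_p, dim = 30, 2
pos = c + rng.normal(scale=0.3, size=(n_p, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([penalized(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()
for _ in range(60):
    r1, r2 = rng.random((n_p, 1)), rng.random((n_p, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([penalized(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("worst-case performance over the ellipsoid ~", penalized(gbest))
print("structure judged reliable (g_min > 0):", penalized(gbest) > 0)
```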

2.
The safety analysis of systems with nonlinear performance functions and small probabilities of failure is a challenge in the field of reliability analysis. In this study, an efficient approach is presented for approximating small failure probabilities. To this end, by introducing probability density function (PDF) control variates, the original failure probability integral was reformulated based on the Control Variates Technique (CVT). Accordingly, using the adaptive cooperation of subset simulation (SubSim) and the CVT, a new formulation was offered for the approximation of small failure probabilities. The proposed formulation involves a probability term (resulting from a fast-moving SubSim) and an adaptive weighting term that refines the obtained probability. Several numerical and engineering problems, involving nonlinear performance functions and system-level reliability problems, are solved by the proposed approach and by common reliability methods. The results show that the proposed simulation approach is not only more efficient but also more robust than common reliability methods, and that it has good potential for application in engineering reliability problems.
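The probability term mentioned above comes from subset simulation. The sketch below is a bare-bones subset simulation (without the paper's control-variates refinement) for a small failure probability in standard normal space, using the usual modified Metropolis moves; the performance function, level probability p0 and proposal spread are assumed values.

```python
# Bare-bones subset simulation estimating a small failure probability
# P[g(U) <= 0] in standard normal space.
import numpy as np

rng = np.random.default_rng(1)

def g(u):                                   # hypothetical nonlinear performance function
    return 5.5 - (u[:, 0] + u[:, 1]) / np.sqrt(2) - 0.05 * (u[:, 0] - u[:, 1]) ** 2

def subset_simulation(n=2000, p0=0.1, dim=2, max_levels=20):
    u = rng.standard_normal((n, dim))
    y = g(u)
    prob, n_seed = 1.0, int(n * p0)
    for _ in range(max_levels):
        idx = np.argsort(y)[:n_seed]        # the p0*n smallest responses
        level = y[idx[-1]]                  # intermediate threshold
        if level <= 0.0:                    # failure level reached
            return prob * np.mean(y <= 0.0)
        prob *= p0
        seeds, seed_y = u[idx], y[idx]
        # Modified Metropolis: grow each seed into a chain conditional on y <= level
        chains_u, chains_y = [], []
        per_chain = n // n_seed
        for s, sy in zip(seeds, seed_y):
            cu, cy = s.copy(), sy
            for _ in range(per_chain):
                cand = cu + 0.8 * rng.standard_normal(dim)
                # component-wise accept/reject against the standard normal density
                ratio = np.exp(0.5 * (cu ** 2 - cand ** 2))
                cand = np.where(rng.random(dim) < ratio, cand, cu)
                cy_new = g(cand[None, :])[0]
                if cy_new <= level:
                    cu, cy = cand, cy_new
                chains_u.append(cu.copy())
                chains_y.append(cy)
        u, y = np.array(chains_u), np.array(chains_y)
    return prob * np.mean(y <= 0.0)

print("estimated failure probability ~", subset_simulation())
```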

3.
Support vector machine method for structural reliability analysis
To address the problem that the performance function cannot be expressed explicitly in structural reliability analysis, the support vector machine (SVM) is introduced into structural reliability analysis. The SVM is a classification technique that implements the structural risk minimization principle and offers excellent small-sample learning and generalization performance; on this basis, two SVM-based structural reliability analysis methods are proposed. Compared with the traditional response surface and neural network methods, the distinguishing feature of the SVM-based reliability methods is that they approximate the function with high accuracy from small samples and can avoid the curse of dimensionality. The example results also show that the SVM can closely approximate the true performance function within the sampling range and reduce the number of implicit performance function evaluations (usually finite element analyses), which is of practical value in engineering.
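A minimal sketch of the basic idea under stated assumptions: train an SVM classifier on a small sample of safe/failed states of a (here cheap, in practice expensive) performance function and then run Monte Carlo on the surrogate. scikit-learn's `SVC` stands in for the SVM; the limit-state function and sample sizes are placeholders.

```python
# SVM classifier trained on a small design, then used in place of the implicit
# performance function inside Monte Carlo simulation.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)

def g(x):                                  # hypothetical limit-state function, g <= 0 : failure
    return x[:, 0] ** 3 + x[:, 0] * x[:, 1] + 5.0 - x[:, 1]

# Small training design (this stands in for a handful of finite element runs)
X_train = rng.normal(loc=0.0, scale=2.0, size=(80, 2))
y_train = (g(X_train) <= 0.0).astype(int)

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
svm.fit(X_train, y_train)

# Monte Carlo on the cheap surrogate instead of the implicit function
X_mc = rng.normal(size=(200_000, 2))
pf_svm = svm.predict(X_mc).mean()
pf_ref = (g(X_mc) <= 0.0).mean()           # reference, only possible here because g is cheap
print(f"Pf (SVM surrogate) ~ {pf_svm:.4e},  Pf (direct MCS) ~ {pf_ref:.4e}")
```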

4.
When traditional limit equilibrium methods are used for slope reliability analysis, one inevitably faces the problem that the slope performance function is highly nonlinear and implicit. For an implicit performance function, the conventional solution is to iterate on the performance function repeatedly to obtain the factor of safety; because the function is complicated, this iterative process becomes cumbersome and inefficient. To avoid the tedious and time-consuming factor-of-safety computations in traditional slope reliability analysis, an automatically sampled Kriging surrogate model based on the particle swarm optimization (PSO) algorithm is proposed, which can take the place of the performance function in computing the factor of safety. First, a small set of soil parameter samples is selected by Latin hypercube sampling (LHS), the corresponding factors of safety are obtained by the limit equilibrium method, and these parameter sets and safety factors serve as the initial samples for building the Kriging model. Next, the particle swarm optimization algorithm adds to the sample set the points that are most expected to improve the fitting accuracy of the model, iteratively refining the accuracy of the Kriging model. Finally, classical Monte Carlo simulation (MCS) is combined with the model to obtain the failure probability of the slope. A two-layer soil slope example demonstrates that the method computes the factor of safety accurately and efficiently, and can save substantial computing time when a very large number of factor-of-safety evaluations is required, making it an effective method for slope stability reliability analysis.

5.
An adaptive trivariate dimension-reduction method is proposed in this paper for statistical moment evaluation and reliability analysis. First, the raw moments of the performance function are estimated by means of the trivariate dimension-reduction method, which involves trivariate, bivariate and univariate Gaussian-weighted integrals. Since the trivariate and bivariate integrals control the efficiency and accuracy, the existence of bivariate and trivariate cross terms is first delineated, which can significantly reduce the number of trivariate and bivariate integrals to be evaluated. When the cross terms exist, the trivariate and bivariate integrals are evaluated numerically by the high-order unscented transformation, for which the free parameters involved are provided. When the cross terms do not exist, the trivariate and bivariate integrals can be further decomposed into lower-dimensional integrals, where the high-order unscented transformation is again adopted for numerical integration. The first four central moments can then be computed accordingly, and the probability density function of the performance function can be reconstructed by fitting the shifted generalized lognormal distribution model to the first four central moments. The failure probability is then computed by a one-dimensional integral of the probability density function of the performance function over the failure domain. Three numerical examples, including both explicit and implicit performance functions, are investigated to demonstrate the efficacy of the proposed method for both statistical moment assessment and reliability analysis.
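The trivariate decomposition and the high-order unscented transformation are specific to the paper; the sketch below only illustrates the dimension-reduction idea in its simplest univariate form, estimating the mean and variance of a hypothetical performance function of independent standard normal inputs with Gauss-Hermite quadrature.

```python
# Univariate dimension reduction (a simplified stand-in for the paper's trivariate
# scheme): estimate low-order moments of g(X) for independent standard normal inputs.
import numpy as np

def g(x):                                   # hypothetical performance function
    return np.exp(0.3 * x[0]) + 0.5 * x[1] ** 2 - 0.2 * x[2] - 1.0

dim = 3
mu = np.zeros(dim)

# Gauss-Hermite (probabilists') nodes/weights adapted to the standard normal density
nodes, weights = np.polynomial.hermite_e.hermegauss(7)
weights = weights / np.sqrt(2.0 * np.pi)

g_mu = g(mu)
means, raws2 = np.zeros(dim), np.zeros(dim)
for i in range(dim):
    vals = []
    for z in nodes:
        x = mu.copy()
        x[i] = z
        vals.append(g(x))
    vals = np.array(vals)
    means[i] = weights @ vals               # E[g(mu_1, ..., X_i, ..., mu_n)]
    raws2[i] = weights @ vals ** 2

# Univariate dimension reduction: g(X) ~ sum_i g_i(X_i) - (n - 1) g(mu)
mean_udr = means.sum() - (dim - 1) * g_mu
var_udr = (raws2 - means ** 2).sum()        # the univariate terms are independent

# Crude Monte Carlo reference
X = np.random.default_rng(3).standard_normal((50_000, dim))
y = np.array([g(x) for x in X])
print(f"mean: UDR {mean_udr:.4f} vs MC {y.mean():.4f}")
print(f"var : UDR {var_udr:.4f} vs MC {y.var():.4f}")
```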

6.
A new algorithm based on nonlinear transformation is proposed to improve the classical maximum entropy method and solve practical problems in reliability analysis. There are three steps in the new algorithm. First, the performance function of the reliability analysis is normalized by dividing it by its value at the point where each input equals the mean of the corresponding random variable. Then, the nonlinear transformation of the normalized performance function is carried out using a monotonic nonlinear function with an adjustable parameter. Finally, the probability density function and/or the failure probability are predicted by treating the transformed result as a new form of the performance function in the classical maximum entropy procedure, in which the statistical moments are obtained through the univariate dimension reduction method. In the proposed method, the uncontrollable error of integration over an infinite interval is removed by transforming it into integration over a bounded one. Three typical nonlinear transformation functions are studied and compared in the numerical examples. Compared with results from Monte Carlo simulation, it is found that a proper choice of the adjustable parameter leads to a better prediction of the failure probability. The examples confirm that the result obtained with the arctangent transformation function is better than those obtained with the other transformation functions. The error in the predicted failure probability is controllable if the adjustable parameter is chosen within a given interval, but the suggested value of the adjustable parameter can only be given empirically.

7.
This paper presents an approximation method for performing efficient reliability analysis with complex computer models. The computational cost of industrial-scale models can cause problems when performing sampling-based reliability analysis, because the failure modes of the system typically occupy a small region of the performance space and thus require relatively large sample sizes to estimate their characteristics accurately. The sequential sampling method proposed in this article combines Gaussian process-based optimisation and subset simulation. Gaussian process emulators construct a statistical approximation to the output of the original code, which is both affordable to use and has its own measure of predictive uncertainty. Subset simulation is used as an integral part of the algorithm to efficiently populate those regions of the surrogate which are likely to lead to the performance function exceeding a predefined critical threshold. The emulator itself is used to inform decisions about efficiently using the original code to augment its predictions. The iterative nature of the method ensures that an arbitrarily accurate approximation of the failure region is developed at a reasonable computational cost. The presented method is applied to an industrial model of a biodiesel filter.

8.
The robustness and efficiency of the first-order reliability method (FORM) are important issues in structural reliability analysis. In this paper, a hybrid conjugate search direction with a finite step length is proposed to improve the efficiency and robustness of FORM, namely the hybrid conjugate finite-step length (CFSL-H) algorithm. The conjugate scalar factor in CFSL-H is adaptively updated using two conjugate methods with a dynamic participation factor. The accuracy, efficiency and robustness of CFSL-H are illustrated through nonlinear explicit and structural implicit limit state functions with normal and non-normal random variables. The results illustrate that the proposed CFSL-H algorithm is more robust, efficient and accurate than existing modified FORM algorithms for complex structural problems.
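For context, a baseline HL-RF iteration of FORM in standard normal space is sketched below; it is this classical update whose robustness and step-length behaviour the CFSL-H direction is designed to improve. The limit-state function, starting point and tolerances are assumed.

```python
# Baseline HL-RF iteration of FORM in standard normal space.
import numpy as np
from scipy.stats import norm

def g(u):                                   # hypothetical nonlinear limit-state function
    return 0.1 * (u[0] - u[1]) ** 2 - (u[0] + u[1]) / np.sqrt(2) + 2.5

def grad(f, u, h=1e-6):                     # forward-difference gradient
    g0 = f(u)
    return np.array([(f(u + h * e) - g0) / h for e in np.eye(len(u))])

u = np.zeros(2)                             # start at the mean point
for it in range(100):
    gu, dg = g(u), grad(g, u)
    u_new = ((dg @ u - gu) / (dg @ dg)) * dg    # HL-RF update
    if np.linalg.norm(u_new - u) < 1e-8:
        u = u_new
        break
    u = u_new

beta = np.linalg.norm(u)
print(f"design point {u}, beta = {beta:.4f}, Pf (FORM) ~ {norm.cdf(-beta):.4e}")
```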

9.
This paper proposes a method combining a projection-outline-based active learning strategy with the Kriging metamodel for reliability analysis of structures with mixed random and convex variables. In this method, it is recognized that the approximation accuracy of the projection outlines on the limit-state surface, rather than of the whole limit-state surface, is crucial for the estimation of the failure probability. To efficiently improve the approximation accuracy of the projection outlines, a new projection-outline-based active learning strategy is developed to sequentially obtain update points located around the projection outlines. Taking into account the influence of metamodel uncertainty on the estimation of the failure probability, a quantification function of metamodel uncertainty is developed and introduced into the stopping condition of the Kriging metamodel update. Finally, Monte Carlo simulation is employed to calculate the failure probability based on the refined Kriging metamodel. Four examples, including the Burro Creek Bridge and a piezoelectric energy harvester, are tested to validate the performance of the proposed method. The results indicate that the proposed method is accurate and efficient for reliability analysis of structures with mixed random and convex variables.
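The projection-outline learning strategy and the convex-variable treatment are the paper's own; the sketch below shows only the generic active-learning Kriging loop (AK-MCS style, with the common U learning function) that such strategies refine. The limit-state function, pool size and stopping threshold are assumptions.

```python
# Generic active-learning Kriging loop (U learning function) followed by
# Monte Carlo simulation on the refined metamodel.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(4)

def g(x):                                   # hypothetical limit-state function
    return x[:, 0] ** 2 - 2.0 * x[:, 1] + 4.0

X_pool = rng.standard_normal((20_000, 2))   # Monte Carlo population
train = list(rng.choice(len(X_pool), 12, replace=False))

for iteration in range(60):
    X_tr = X_pool[train]
    y_tr = g(X_tr)                          # "expensive" model calls happen only here
    gp = GaussianProcessRegressor(ConstantKernel() * RBF(), normalize_y=True,
                                  alpha=1e-8).fit(X_tr, y_tr)
    mean, std = gp.predict(X_pool, return_std=True)
    U = np.abs(mean) / np.maximum(std, 1e-12)   # low U = uncertain sign of g
    best = int(np.argmin(U))
    if U[best] >= 2.0:                      # common stopping rule: P(wrong sign) < ~2.3%
        break
    if best not in train:
        train.append(best)

pf = np.mean(mean <= 0.0)
print(f"Pf by MCS on the refined Kriging model ~ {pf:.4e}  ({len(train)} model calls)")
```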

10.
Slope failure mechanisms (e.g., why and where slope failure occurs) are usually unknown prior to slope stability analysis. Several possible failure scenarios (e.g., slope sliding along different slip surfaces) can be assumed, leading to a number of scenario failure events of slope stability. How to account rationally for various scenario failure events in slope stability reliability analysis and how to identify key failure events that have significant contributions to slope failure are critical questions in slope engineering. In this study, these questions are resolved by developing an efficient computer-based simulation method for slope system reliability analysis. The proposed approach decomposes a slope system failure event into a series of scenario failure events representing possible failure scenarios and calculates their occurrence probabilities by a single run of an advanced Monte Carlo simulation (MCS) method, called generalized Subset Simulation (GSS). Using GSS results, representative failure events (RFEs) that are considered relatively independent are identified from scenario failure events using probabilistic network evaluation technique. Their relative contributions are assessed quantitatively, based on which key failure events are determined. The proposed approach is illustrated using a soil slope example and a rock slope example. It is shown that the proposed approach provides proper estimates of occurrence probabilities of slope system failure event and scenario failure events by a single GSS run, which avoids repeatedly performing simulations for each failure event. Compared with direct MCS, the proposed approach significantly improves computational efficiency, particularly for failure events with small failure probabilities. Key failure events of slope stability are determined among scenario failure events in a cost-effective manner. Such information is valuable in making slope design decisions and remedial measures.

11.
This study proposes a new reliability sensitivity analysis approach using an efficient hybrid simulation method that combines subset simulation, importance sampling and control variates techniques. This method contains a probability term (a fast-moving probability obtained by subset simulation) and an adaptive weighting part that improves the calculated probability. The finite difference method is used to obtain the reliability sensitivities, and the related formulation is derived. Five numerical examples (a four-branch model, a beam-cable system, a one-story frame, ring-stiffened cylinder buckling, and a 25-bar steel truss) are presented to describe the applications of the proposed method. The results are compared with those obtained by the available techniques. The results reveal that the proposed method efficiently and accurately solves rare-event, system-level, and real-world engineering problems with explicit and implicit limit state functions.
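A minimal sketch of the finite-difference sensitivity idea, here with plain Monte Carlo and common random numbers instead of the paper's hybrid SubSim/IS/CVT estimator: perturb a distribution parameter, re-evaluate the failure probability on the same underlying random numbers, and take the difference. The limit-state function and the distribution parameters are placeholders.

```python
# Finite-difference reliability sensitivity dPf/d(mu) with common random numbers.
import numpy as np

rng = np.random.default_rng(5)

def g(x):                                   # hypothetical limit-state function
    return x[:, 0] * x[:, 1] - 8.0

def pf(mu1, sigma1, z):
    """Failure probability for X1 ~ N(mu1, sigma1), X2 ~ N(3, 0.5), from fixed z."""
    x = np.column_stack([mu1 + sigma1 * z[:, 0], 3.0 + 0.5 * z[:, 1]])
    return np.mean(g(x) <= 0.0)

z = rng.standard_normal((500_000, 2))       # common random numbers shared by both runs
mu1, d_mu = 3.5, 0.01
pf0 = pf(mu1, 0.4, z)
pf1 = pf(mu1 + d_mu, 0.4, z)
print(f"Pf ~ {pf0:.4e},  dPf/dmu1 ~ {(pf1 - pf0) / d_mu:+.4e}")
```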

12.
Tree-structured network systems are common in pipeline transportation and network communication, and evaluating their reliability is of great significance for system design and optimization. For tree-structured redundant systems under the consecutive-k-out-of-n failure criterion, a reliability evaluation method is studied by means of a modified finite Markov chain imbedding approach. The modeling of tree-structured systems is defined, and a representation based on three parameters, namely the number of layers, the layer-node vector, and the parent-child node matrix, is proposed. A reliability evaluation method for tree-structured systems under the consecutive-k-out-of-n failure criterion using the modified finite Markov chain imbedding approach is developed, three numerical examples are given, and the computational complexity of the algorithm is analyzed. Finally, the method is compared with the probability generating function approach to the reliability of tree-structured systems under the consecutive-k-out-of-n criterion, and it is concluded that the proposed algorithm has a wider range of application and higher computational efficiency for tree-structured redundant systems under this failure criterion.
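The paper generalises finite Markov chain imbedding to tree-structured systems; the sketch below shows the imbedding mechanics for the ordinary linear consecutive-k-out-of-n:F system, where the chain state is the length of the current run of failed components, together with a Monte Carlo check. Component reliabilities and k are example values.

```python
# Finite Markov chain imbedding for the linear consecutive-k-out-of-n:F system.
import numpy as np

def consec_k_out_of_n_reliability(p, k):
    """System works unless k consecutive components fail; p[i] = reliability of component i."""
    n = len(p)
    state = np.zeros(k + 1)                 # states 0..k-1: current failure run; k: absorbed (system failed)
    state[0] = 1.0
    for pi in p:
        new = np.zeros(k + 1)
        for j in range(k):                  # non-absorbing states
            new[0] += state[j] * pi         # component works -> run resets
            new[j + 1] += state[j] * (1.0 - pi)   # component fails -> run grows (absorbed at j+1 == k)
        new[k] += state[k]                  # absorbing state stays absorbed
        state = new
    return state[:k].sum()                  # probability the chain was never absorbed

# Example: 10 components, reliability 0.9 each, system fails if 3 in a row fail
p = [0.9] * 10
print("R(consecutive-3-out-of-10:F) =", consec_k_out_of_n_reliability(p, 3))

# Quick Monte Carlo check
rng = np.random.default_rng(6)
works = rng.random((200_000, 10)) < 0.9
runs = np.zeros(200_000, dtype=int)
failed = np.zeros(200_000, dtype=bool)
for col in works.T:
    runs = np.where(col, 0, runs + 1)
    failed |= runs >= 3
print("R(Monte Carlo)              =", 1.0 - failed.mean())
```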

13.
The polynomial chaos expansion (PCE) model has become a powerful tool for global sensitivity analysis, but it is rarely used as a surrogate model for reliability analysis. Given that the model lacks an error term, which makes it difficult to construct an active learning function for stepwise updating, a simulation method based on the PCE model and bootstrap resampling is proposed within the framework of structural reliability analysis to compute the failure probability. First, a bootstrap resampling step is applied to the experimental design to characterize the prediction error of the PCE model. Second, an active learning function is constructed from this local error, and the model is adaptively updated by continually enriching the experimental design until it accurately approximates the true performance function. Finally, once the PCE model fits and predicts with sufficient accuracy, Monte Carlo simulation is used to compute the failure probability. The proposed parallel enrichment strategy can find the "best" points for improving the fitting ability of the model during the updating process while also accounting for the computational cost of model fitting; moreover, when the failure probability is of a small order of magnitude, combining the PCE-bootstrap step with subset simulation can further accelerate the convergence of the failure probability estimator. The method extends the application of PCE models in probabilistic reliability from sensitivity analysis to reliability analysis, and the example results demonstrate its accuracy and efficiency.
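A minimal sketch of the two ingredients named above, under stated assumptions: a Hermite polynomial chaos expansion fitted by least squares on a small experimental design, and bootstrap resampling of that design to obtain a local prediction spread that could serve as the learning function. The adaptive enrichment loop and the coupling with subset simulation are not reproduced; the performance function and design sizes are placeholders.

```python
# Hermite PCE fitted by least squares, with bootstrap resampling of the
# experimental design giving a local prediction error.
import numpy as np

rng = np.random.default_rng(7)

def g(x):                                   # hypothetical performance function
    return 3.0 + 0.3 * x[:, 0] ** 2 - x[:, 1] + 0.2 * x[:, 0] * x[:, 1]

def hermite_design_matrix(X, degree=3):
    """Probabilists' Hermite PCE basis, total degree <= degree, 2 random inputs."""
    cols = []
    for i in range(degree + 1):
        for j in range(degree + 1 - i):
            ci = np.zeros(i + 1); ci[i] = 1.0
            cj = np.zeros(j + 1); cj[j] = 1.0
            cols.append(np.polynomial.hermite_e.hermeval(X[:, 0], ci) *
                        np.polynomial.hermite_e.hermeval(X[:, 1], cj))
    return np.column_stack(cols)

# Small experimental design and least-squares PCE coefficients
X_ed = rng.standard_normal((30, 2))
y_ed = g(X_ed)
A = hermite_design_matrix(X_ed)
coef, *_ = np.linalg.lstsq(A, y_ed, rcond=None)

# Bootstrap the experimental design to characterise the local prediction error
X_new = rng.standard_normal((10_000, 2))
A_new = hermite_design_matrix(X_new)
boot_preds = []
for _ in range(100):
    idx = rng.integers(0, len(X_ed), len(X_ed))
    cb, *_ = np.linalg.lstsq(A[idx], y_ed[idx], rcond=None)
    boot_preds.append(A_new @ cb)
pred_std = np.array(boot_preds).std(axis=0)

pf = np.mean(A_new @ coef <= 0.0)
print(f"Pf by MCS on the PCE ~ {pf:.4e}")
print("largest bootstrap prediction std (candidate point for enrichment):", pred_std.max())
```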

14.
For accurately and efficiently estimating the time-dependent failure probability (TDFP) of a structure, a novel adaptive multiple-Kriging-surrogate method is proposed. In the proposed method, multiple Kriging models with different regression trends (i.e., constant, linear and quadratic) are constructed simultaneously with the highest accuracy, from which the TDFP can be obtained. The multiple regression trends are adaptively selected based on the size of the sample base, the maximum differences of the multiple models and the global accuracy of the multiple models. After that, the most suitable regression trends are identified. The proposed method avoids the subjective, manual choice of the regression trend in the general Kriging surrogate method. Furthermore, for some engineering applications, better accuracy and efficiency are obtained by the proposed multiple surrogates than by using a fixed regression model. Five examples, including four applications with explicit performance functions and one tone arch bridge under hurricane load with an implicit performance function, are introduced to illustrate the effectiveness of the proposed method for estimating the TDFP.

15.
High-dimensional reliability analysis is still an open challenge in the structural reliability community. To address this problem, a new sampling approach, named good lattice point method based partially stratified sampling, is proposed within the fractional moments-based maximum entropy method. In this approach, the original sample space is first partitioned into several orthogonal low-dimensional sample spaces, say of 2 and 1 dimensions. Then, the samples in each low-dimensional sample space are generated by the good lattice point method; these are deterministic points that possess a large variance-reduction property. Finally, the samples in the original space are obtained by randomly pairing the samples in the low dimensions, which may also significantly reduce the variance in high-dimensional cases. This sampling approach is then applied to evaluate the low-order fractional moments in the maximum entropy method, with a tradeoff between efficiency and accuracy for high-dimensional reliability problems. In this regard, the probability density function of a performance function involving a large number of random inputs can be derived accordingly, and the reliability can be straightforwardly evaluated by a simple integral over the probability density function. Numerical examples are studied to validate the proposed method and indicate that it is accurate and efficient for high-dimensional reliability analysis.

16.
The response surface method (RSM), a simple and effective approximation technique, is widely used for reliability analysis in civil engineering. However, the traditional RSM needs a considerable number of samples and is computationally intensive and time-consuming for practical engineering problems with many variables. To overcome these problems, this study proposes a new approach that samples experimental points based on the difference between the last two trial design points. This new method constructs the response surface using a support vector machine (SVM); the SVM can build complex, nonlinear relations between random variables and approximate the performance function using fewer experimental points. This approach can reduce the number of experimental points and improve the efficiency and accuracy of reliability analysis. The advantages of the proposed method were verified using four examples involving random variables with different distributions and correlation structures. The results show that this approach can obtain the design point and reliability index with fewer experimental points and better accuracy. The proposed method was also employed to assess the reliability of a numerically modeled tunnel. The results indicate that this new method is applicable to practical, complex engineering problems such as rock engineering problems.

17.
The reliability of the Weibull distribution with homogeneous, heavily censored data is analyzed in this study. The universal model of heavily censored data and the existing methods, including maximum likelihood, least squares, E-Bayesian estimation, and hierarchical Bayesian methods, are introduced. An improved method is proposed based on Bayesian inference and the least-squares method. In this method, the focus is on the Bayes estimates of the failure probabilities for all samples. A conjugate prior distribution of the failure probability is set, and an optimization model is developed that maximizes the information entropy of the prior distribution to determine the hyper-parameters. By integrating the likelihood function, the posterior distribution of the failure probability is then derived to yield the Bayes estimate of the failure probability. The estimates of the reliability parameters are obtained by fitting the distribution curve with the least-squares method. The four existing methods are compared with the proposed method in terms of applicability, precision, efficiency, robustness, and simplicity. Specifically, closed-form expressions concerning the E-Bayesian estimation and hierarchical Bayesian methods are derived and used. The comparisons demonstrate that the improved method is superior. Finally, three illustrative examples are presented to show the application of the proposed method.
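A minimal sketch of the core chain described above: Beta-posterior (Bayes) estimates of the failure probability at each inspection time under heavy censoring, followed by a least-squares Weibull fit in the linearised ln t versus ln(-ln(1-F)) coordinates. The censored data and the Beta prior hyper-parameters are assumed values; the paper's entropy-based choice of hyper-parameters is not reproduced.

```python
# Beta-posterior failure-probability estimates + least-squares Weibull fit.
import numpy as np

# Hypothetical heavily censored data: at time t[i], n[i] units were on test and r[i] failed
t = np.array([100.0, 200.0, 400.0, 800.0, 1600.0])
n = np.array([50, 45, 40, 35, 30])
r = np.array([0, 1, 1, 2, 3])

a0, b0 = 0.5, 4.5                           # assumed Beta(a0, b0) prior on F(t_i)
F_bayes = (a0 + r) / (a0 + b0 + n)          # posterior mean of the failure probability

# Weibull CDF: F(t) = 1 - exp(-(t/eta)^beta)  =>  ln(-ln(1-F)) = beta*ln t - beta*ln eta
x = np.log(t)
y = np.log(-np.log(1.0 - F_bayes))
beta, intercept = np.polyfit(x, y, 1)
eta = np.exp(-intercept / beta)

print(f"Weibull shape beta ~ {beta:.3f}, scale eta ~ {eta:.1f}")
print("reliability at t = 1000 h ~", np.exp(-(1000.0 / eta) ** beta))
```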

18.
How to infer the failure probability of a system is an important problem in the field of reliability engineering. Treating repairable systems with dynamic, random failures by static approximations often yields computed reliability indices that differ greatly from reality. In this study, the Monte Carlo method is used to generate random numbers equivalent to the failure rates of the basic components of a marine nuclear power system; these are substituted into the simulation model and, through logical operations, random numbers equivalent to the system failure probability are obtained. Statistical inference on the data from repeated simulations then yields the probability distribution of system failure and the corresponding confidence intervals. The method gives highly accurate results and is of significance for the reliability analysis of marine nuclear power plants.
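A minimal sketch of the simulation idea under stated assumptions: sample exponential failure/repair histories of a small series system of repairable components, estimate the probability that the system is down at the mission time, and attach a binomial confidence interval. The failure and repair rates and the series structure are hypothetical placeholders, not those of the marine nuclear plant.

```python
# Monte Carlo estimate of the unavailability of a small repairable series system.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)

lam = np.array([1e-3, 5e-4, 2e-3])          # failure rates (per hour) of 3 components
mu = np.array([0.1, 0.05, 0.2])             # repair rates (per hour)
T_mission, n_sim = 1000.0, 20_000

def component_down_at(T, lam_i, mu_i):
    """Simulate one alternating up/down history and report the state at time T."""
    t, up = 0.0, True
    while t < T:
        t += rng.exponential(1.0 / (lam_i if up else mu_i))
        if t < T:
            up = not up
    return not up

down = np.zeros(n_sim, dtype=bool)
for s in range(n_sim):
    # series system: the system is down if any component is down at the mission time
    down[s] = any(component_down_at(T_mission, l, m) for l, m in zip(lam, mu))

p = down.mean()
half = norm.ppf(0.975) * np.sqrt(p * (1.0 - p) / n_sim)
print(f"P(system down at t={T_mission:.0f} h) ~ {p:.4f}  (95% CI: {p-half:.4f} .. {p+half:.4f})")
```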

19.
The accurate estimation of rare event probabilities is a crucial problem in engineering for characterizing the reliability of complex systems. Several methods, such as Importance Sampling or Importance Splitting, have been proposed to estimate such events more accurately (i.e., with a lower variance) than the crude Monte Carlo method. However, these methods assume that the probability distributions of the input variables are exactly defined (e.g., the mean and covariance matrix are perfectly known if the input variables follow Gaussian laws) and are not able to determine the impact of a change in the input distribution parameters on the probability of interest. The problem considered in this paper is the propagation of input distribution parameter uncertainty, defined by intervals, to the rare event probability. This problem induces intricate optimization and numerous probability estimations in order to determine the upper and lower bounds of the probability estimate. The calculation of these bounds is often numerically intractable for rare event probabilities (say 10^-5), due to the high computational cost required. A new methodology is proposed to solve this problem with a reduced simulation budget, using adaptive Importance Sampling. To this end, a method for estimating the optimal auxiliary distribution of the Importance Sampling is proposed, based on preceding Importance Sampling estimations. Furthermore, a Kriging-based adaptive Importance Sampling is used in order to minimize the number of evaluations of the computationally expensive simulation code. To determine the bounds of the probability estimate, an evolutionary algorithm is employed. This algorithm has been selected to deal with noisy problems since the Importance Sampling probability estimate is a random variable. The efficiency of the proposed approach, in terms of the accuracy of the results and the computational cost, is assessed on academic and engineering test cases.

20.
The aim of this paper is to model lifetime data for systems with several failure modes by using a finite mixture of Weibull distributions. It involves estimating the unknown parameters, which is an important task in statistics, especially in life testing and reliability analysis. The proposed approach relies on several methods for developing the estimates, such as maximum likelihood estimation (MLE) through the EM algorithm. In addition, Bayesian estimation is investigated, and other extensions such as graphical methods, non-linear median rank regression and Monte Carlo simulation can be used to model the system under consideration. A numerical application illustrates the proposed approach. This paper also presents a comparison of the fitted probability density functions, reliability functions and hazard functions of the 3-parameter Weibull and the Weibull mixture distributions, obtained by the proposed approach and other conventional methods, which characterize the distribution of failure times for the system components. Goodness-of-fit (GOF) measures are used to determine the best distribution for modeling the lifetime data, with priority given to the proposed approach, which yields more accurate parameter estimates.
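A minimal EM sketch for a two-component Weibull mixture on complete (uncensored), simulated lifetime data, under stated assumptions: responsibilities in the E-step, and weighted Weibull maximum-likelihood estimates in the M-step obtained by numerically solving the profile equation for the shape parameter. Censoring and the paper's Bayesian and regression extensions are not covered.

```python
# EM algorithm for a two-component Weibull mixture on complete lifetime data.
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(9)

# Simulated lifetimes from two failure modes (hypothetical, for illustration only)
t = np.concatenate([
    200.0 * rng.weibull(1.2, 300),          # early failures
    900.0 * rng.weibull(3.0, 700),          # wear-out failures
])

def weibull_pdf(t, k, lam):
    return (k / lam) * (t / lam) ** (k - 1) * np.exp(-(t / lam) ** k)

def weighted_weibull_mle(t, w):
    """Weighted Weibull MLE via the profile equation for the shape parameter."""
    s = t / t.mean()                        # rescale time to keep s**k well conditioned
    ls = np.log(s)
    def eq(k):
        sk = s ** k
        return 1.0 / k + (w @ ls) / w.sum() - (w @ (sk * ls)) / (w @ sk)
    k = brentq(eq, 0.05, 20.0)
    lam = ((w @ s ** k) / w.sum()) ** (1.0 / k) * t.mean()
    return k, lam

# EM initialisation
pi1, (k1, l1), (k2, l2) = 0.5, (1.0, np.quantile(t, 0.3)), (2.0, np.quantile(t, 0.7))
for _ in range(200):
    # E-step: responsibilities of component 1
    f1 = pi1 * weibull_pdf(t, k1, l1)
    f2 = (1.0 - pi1) * weibull_pdf(t, k2, l2)
    r1 = f1 / (f1 + f2)
    # M-step: mixing weight and weighted Weibull MLEs
    pi1 = r1.mean()
    k1, l1 = weighted_weibull_mle(t, r1)
    k2, l2 = weighted_weibull_mle(t, 1.0 - r1)

print(f"weight {pi1:.3f}: Weibull(k={k1:.2f}, eta={l1:.1f})")
print(f"weight {1-pi1:.3f}: Weibull(k={k2:.2f}, eta={l2:.1f})")
```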
