Similar Literature
1.
The response surface metamodel is a useful sequential methodology for approximating the relationship between the input variables and the output response in computer simulation. Several strategies have been proposed to increase the accuracy of the metamodel estimate. In the current paper, we introduce an effective pseudo-random number (PRN) assignment strategy with a Box-Behnken design to construct a more accurate second-order polynomial metamodel for estimating the network reliability of a complex system. The simulation results show that the maximum absolute relative error between the response surface approximation and the actual reliability function is reduced by 35.63% after the PRN assignment strategy is applied.
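The core fitting step the abstract describes, a second-order polynomial fitted on a Box-Behnken design, can be sketched as below. This is a minimal illustration, not the paper's procedure: the 3-factor design, the feature layout, and the test function are generic textbook choices, and the simulator output `y` would in practice come from replicated runs sharing PRN streams.

```python
import numpy as np

# 3-factor Box-Behnken design in coded units: 12 edge midpoints + 1 center point
BBD3 = np.array([
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0],
], dtype=float)

def quadratic_features(X):
    """Design matrix for a full second-order polynomial in 3 factors."""
    x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([
        np.ones(len(X)), x1, x2, x3,   # intercept and linear terms
        x1 * x2, x1 * x3, x2 * x3,     # two-factor interactions
        x1 ** 2, x2 ** 2, x3 ** 2,     # pure quadratic terms
    ])

def fit_metamodel(y):
    """Least-squares coefficients of the second-order response surface."""
    beta, *_ = np.linalg.lstsq(quadratic_features(BBD3), y, rcond=None)
    return beta

def predict(beta, X):
    return quadratic_features(np.atleast_2d(X)) @ beta
```

The 13-run design supports all 10 coefficients of the full quadratic model, so when the true response is itself quadratic the surface is recovered exactly.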

2.
Moment-based methods use only the statistical moments of random variables for reliability analysis. The cumulative distribution function (CDF) or probability density function (PDF) of a performance function can be constructed from its first few statistical moments, and the failure probability evaluated accordingly. However, existing moment-based methods may lead to large errors or instability. The present paper therefore combines the high order moment method with the common saddlepoint approximation technique for higher accuracy of reliability estimation, and an improved high order moment-based saddlepoint approximation (SPA) method for reliability analysis is presented. The approximated cumulant generating function (CGF) and the CDF of the performance function are constructed in terms of its first four statistical moments. The developed method can be used for reliability evaluation of uncertain structures whose variables follow any type of distribution. Several numerical examples are given to demonstrate the efficacy and accuracy of the proposed method, and comparisons with several existing high order moment methods are also made on the reliability assessment.
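The saddlepoint step at the heart of such SPA methods can be sketched with the Lugannani-Rice CDF formula. This is a generic sketch given an arbitrary CGF `K` and its derivatives, not the paper's fourth-moment CGF construction; it is validated on a Gaussian CGF, for which the approximation is exact.

```python
import math

def phi(x):  # standard normal PDF
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):  # standard normal CDF
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def saddlepoint_cdf(K, dK, d2K, x, s_lo=-50.0, s_hi=50.0, tol=1e-12):
    """Lugannani-Rice CDF approximation F(x) from the CGF K and its first
    two derivatives. The saddlepoint s solves K'(s) = x; since K is convex,
    K' is increasing and bisection suffices."""
    lo, hi = s_lo, s_hi
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if dK(mid) < x:
            lo = mid
        else:
            hi = mid
    s = 0.5 * (lo + hi)
    if abs(s) < tol:  # near the mean the formula degenerates; Gaussian limit
        return Phi(0.0)
    w = math.copysign(math.sqrt(2.0 * (s * x - K(s))), s)
    u = s * math.sqrt(d2K(s))
    return Phi(w) + phi(w) * (1.0 / w - 1.0 / u)
```

For a N(mu, sigma^2) CGF, K(s) = mu*s + sigma^2*s^2/2, the quantities w and u coincide and the formula reduces to the exact normal CDF.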

3.
This paper presents an efficient third-moment saddlepoint approximation approach for probabilistic uncertainty analysis and reliability evaluation of random structures. By constructing a concise cumulant generating function (CGF) for the state variable according to its first three statistical moments, approximate probability density and cumulative distribution functions of the state variable, which may possess any type of distribution, are obtained analytically using the saddlepoint approximation technique. A convenient generalized procedure for structural reliability analysis is then presented, in which the simplicity of the general moment matching method and the accuracy of the saddlepoint approximation technique are integrated effectively. The main difference of the presented method from existing moment methods is that it may provide more detailed information about the distribution of the state variable. The main difference from existing saddlepoint approximation techniques is that it does not strictly require the existence of the CGFs of the input random variables. With these advantages, the presented method is more convenient and can be used for reliability evaluation of uncertain structures whether the concrete probability distributions of the input random variables are known or unknown. Five representative examples illustrate that the presented method is effective and feasible.

4.
Applied Mathematical Modelling, 2014, 38(15-16): 3834-3847
Due to its weak dependence on the amount of uncertainty information, the non-probabilistic convex model approach can be used for problems without sufficient information. In this paper, by integrating the response surface (RS) technique with the convex model approach, a new structural reliability analysis method is developed for complex engineering problems with black-box limit-state functions. Using a newly developed correlation analysis technique for the non-probabilistic convex model, a multi-dimensional ellipsoid is efficiently constructed to characterize the uncertain parameters. A quadratic polynomial without cross terms is adopted to parameterize the black-box limit-state function, from which the function values as well as the first-order gradients can be calculated explicitly. At each iteration, the created RS is combined with the iHL-RF algorithm to obtain an approximate reliability index. A sequential procedure is subsequently formulated to update the RS and hence improve the precision of the reliability analysis. Four numerical examples and one engineering application are investigated to demonstrate the effectiveness of the presented method.

5.
System reliability analysis involving correlated random variables is challenging because the failure probability cannot be uniquely determined under the given probability information. This paper proposes a system reliability evaluation method based on non-parametric copulas. The approximated joint probability distribution satisfying the constraints specified by the correlations has the maximal relative entropy with respect to the joint probability distribution of independent random variables, so the reliability evaluation is unbiased from the perspective of information theory. Estimates of the non-parametric copula parameters from the Pearson linear correlation, Spearman rank correlation, and Kendall rank correlation are provided, respectively. The approximated maximum entropy distribution is then integrated with the first and second order system reliability methods. Four examples are adopted to illustrate the accuracy and efficiency of the proposed method. It is found that traditional system reliability methods encode excessive dependence information for correlated random variables, and the estimated failure probability can be significantly biased.
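Estimating a copula parameter from rank correlations, as the abstract describes, is often done with closed-form conversions. The sketch below shows the standard Gaussian-copula relations (r = 2 sin(pi*rho_s/6) from Spearman's rho, r = sin(pi*tau/2) from Kendall's tau); these are textbook formulas, not the paper's non-parametric copula construction.

```python
import math

def pearson_from_spearman(rho_s):
    """Gaussian-copula linear correlation implied by Spearman's rho."""
    return 2.0 * math.sin(math.pi * rho_s / 6.0)

def pearson_from_kendall(tau):
    """Gaussian-copula linear correlation implied by Kendall's tau."""
    return math.sin(math.pi * tau / 2.0)
```

Both maps fix 0 and 1, so independence and perfect positive dependence are preserved by the conversion.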

6.
The present study deals with a support vector regression-based metamodeling approach for efficient seismic reliability analysis of structures. Various metamodeling approaches, e.g. the response surface method, Kriging interpolation, and artificial neural networks, are usually adopted to overcome the computational challenge of simulation-based seismic reliability analysis. However, the approximation capability of such empirical-risk-minimization-based metamodeling approaches is strongly affected by the number of training samples. Support vector regression, based on the principle of structural risk minimization, has shown improved response approximation ability with small-sample learning. The approach is explored here for an improved estimate of the seismic reliability of structures within the framework of the Monte Carlo simulation technique. The parameters necessary to construct the metamodel are obtained by a simple, effective search algorithm that solves an optimization sub-problem minimizing the mean square error obtained by cross-validation. The simulation technique is readily applied by random selection of the metamodel to implicitly consider record-to-record variation of earthquakes. Without additional computational burden, the approach avoids a prior distributional assumption about the approximated structural response, unlike the commonly used dual response surface method. The effectiveness of the proposed approach compared with the usual polynomial response surface and neural network based metamodels is demonstrated numerically.
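The overall metamodel-plus-Monte-Carlo pattern can be sketched as follows. Note the hedges: a closed-form kernel ridge regressor is used here as a simple stand-in for support vector regression, and the limit-state function `g`, training grid, and sample size are illustrative assumptions, not the paper's seismic model.

```python
import numpy as np

def rbf(X1, X2, gamma):
    """Gaussian (RBF) kernel matrix between two 1-D sample sets."""
    return np.exp(-gamma * (X1[:, None] - X2[None, :]) ** 2)

def fit_krr(x_train, y_train, gamma=1.0, lam=1e-6):
    """Kernel ridge metamodel (stand-in for SVR): alpha = (K + lam*I)^-1 y."""
    K = rbf(x_train, x_train, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(x_train)), y_train)
    return lambda x: rbf(np.atleast_1d(x), x_train, gamma) @ alpha

def mc_failure_prob(metamodel, n_samples, rng):
    """Monte Carlo estimate of P[g(X) < 0], evaluating the cheap metamodel."""
    x = rng.standard_normal(n_samples)
    return np.mean(metamodel(x) < 0.0)

# hypothetical limit-state function: failure when x > 3, so p_f = Phi(-3)
g = lambda x: 3.0 - x
x_tr = np.linspace(-5.0, 5.0, 101)
model = fit_krr(x_tr, g(x_tr))
rng = np.random.default_rng(0)
p_f = mc_failure_prob(model, 200_000, rng)
```

Once trained, the metamodel replaces the expensive structural analysis inside the Monte Carlo loop, which is the source of the computational saving the abstract describes.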

7.
Conventional methods addressing the robust design optimization problem of structures usually incur high computational cost due to the nesting of uncertainty quantification within the optimization process. To address this problem, this work proposes a methodology, based on Kriging models, to efficiently assess the uncertainty quantification within the optimization process. The Kriging model approximates the structural performance both in the design domain and in the stochastic domain, which makes it possible to decouple the uncertainty quantification process from the optimization process. In addition, an infill criterion based on the variance of the Kriging prediction is included to update the Kriging model towards the global Pareto front. Three numerical examples show the applicability and accuracy of the proposed methodology. The results show that the proposed method solves the robust design optimization problem with reasonable accuracy and a considerably lower number of function calls than conventional methods require.

8.
For structural systems with both epistemic and aleatory uncertainties, the effect of epistemic uncertainty on the failure probability is measured by variance-based sensitivity analysis, which generally requires a time-consuming "triple-loop" crude sampling procedure. Thus, the Kriging method is employed to avoid the complex sampling procedure and improve computational efficiency. By utilizing the Kriging predictor model, the conditional expectation of the failure probability given the epistemic uncertainty can be calculated efficiently. Compared with Sobol's method, the proposed one ensures reasonable accuracy of results at lower computational cost. Three examples are employed to demonstrate the reasonableness and efficiency of the proposed method.

9.
This paper presents an extension of the theory of finite random sets to infinite random sets that is useful for estimating bounds on the probability of events when there is both aleatory and epistemic uncertainty in the representation of the basic variables. In particular, the basic variables can be modelled as CDFs, probability boxes, possibility distributions, or families of intervals provided by experts; these four representations are special cases of an infinite random set. The method introduces a new geometrical representation of the space of basic variables, in which many of the methods for estimating probabilities by Monte Carlo simulation can be employed. It is an appropriate technique for modelling the bounds on the probability of failure of structural systems when there is parameter uncertainty in the representation of the basic variables. A benchmark example is used to demonstrate the advantages and differences of the proposed method compared with the finite approach.

10.
For the time-variant hybrid reliability problem under random and interval uncertainties, the upper bound of the time-variant failure probability, a conservative index quantifying the safety level of the structure, is of primary concern. To estimate it efficiently, two algorithms are proposed: adaptive Kriging combined with design-point-based importance sampling, and adaptive Kriging combined with meta-model-based importance sampling. The first algorithm searches for the design point of the hybrid problem, around which candidate random samples are generated by shifting the sampling center from the mean value to the design point; the Kriging model is then iteratively trained and the hybrid problem is solved with the well-trained Kriging model. The second algorithm utilizes Kriging-based importance sampling to approximate the quasi-optimal importance sampling density and estimate the augmented upper bound of the time-variant failure probability; the Kriging model is then further updated on these importance samples to estimate a correction factor, and the hybrid failure probability is calculated as the product of the augmented upper bound and the correction factor. Meanwhile, an improved learning function is presented to efficiently train an accurate Kriging model. The proposed methods integrate the merits of adaptive Kriging and importance sampling, allowing the hybrid reliability analysis to be conducted at as little computational cost as possible. The presented examples show the feasibility of the proposed methods.

11.
This paper presents a novel approach to simulation metamodeling using dynamic Bayesian networks (DBNs) in the context of discrete event simulation. A DBN is a probabilistic model that represents the joint distribution of a sequence of random variables and enables the efficient calculation of their marginal and conditional distributions. In this paper, the construction of a DBN based on simulation data and its utilization in simulation analyses are presented. The DBN metamodel allows the study of the time evolution of the simulation by tracking the probability distribution of the simulation state over the duration of the simulation, a feature unprecedented among existing simulation metamodels. The DBN metamodel also enables effective what-if analysis that reveals the conditional evolution of the simulation: the simulation state at a given time is fixed, and the probability distributions representing the state at other time instants are updated. Simulation parameters can be included in the DBN metamodel as external random variables, so the DBN offers a way to study the effects of parameter values and their uncertainty on the evolution of the simulation. The accuracy of the analyses allowed by DBNs is studied by constructing appropriate confidence intervals. These analyses could be conducted on raw simulation data, but the use of DBNs reduces the duration of repetitive analyses and is expedited by available Bayesian network software. The construction and analysis capabilities of DBN metamodels are illustrated with two example simulation studies.

12.
In this paper, we address an approximate solution of a probabilistically constrained convex program (PCCP), in which a convex objective function is minimized over solutions satisfying, with a given probability, convex constraints parameterized by random variables. To approach a solution, we set forth a conservative approximation problem by introducing a parameter α that controls the approximation accuracy, and formulate it as a D.C. optimization problem.

13.
We study the estimation of a density and a hazard rate function based on censored data by the kernel smoothing method. Our technique is facilitated by a result of Lo and Singh (1986), which establishes a strong uniform approximation of the Kaplan-Meier estimator by an average of independent random variables. (Note that the approximation is carried out on the original probability space, which should be distinguished from the Hungarian embedding approach.) Pointwise strong consistency and a law of the iterated logarithm are derived, as well as the mean squared error expression and asymptotic normality, which are obtained using a more traditional method, as compared with the Hajek projection employed by Tanner and Wong (1983).
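The Kaplan-Meier product-limit estimator that the abstract builds on can be sketched in a few lines. This is only the basic estimator on right-censored data; the kernel smoothing and the Lo-Singh representation are beyond this sketch.

```python
def kaplan_meier(times, events):
    """Product-limit estimate of the survival function S(t).
    times: observed times; events: 1 = event observed, 0 = right-censored.
    Returns (distinct event times, S(t) just after each event time)."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, out_t, out_s = 1.0, [], []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = 0          # events observed at time t
        removed = 0    # all subjects leaving the risk set at t
        while i < len(data) and data[i][0] == t:
            d += data[i][1]
            removed += 1
            i += 1
        if d > 0:
            s *= 1.0 - d / n_at_risk   # product-limit update
            out_t.append(t)
            out_s.append(s)
        n_at_risk -= removed
    return out_t, out_s
```

Censored observations leave the risk set without contributing a factor, which is exactly what distinguishes the estimator from the empirical survival function.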

14.
Kriging metamodels (also called Gaussian process or spatial correlation models) approximate the Input/Output functions implied by the underlying simulation models. Such metamodels serve sensitivity analysis, especially for computationally expensive simulations. In practice, simulation analysts often know that this Input/Output function is monotonic. To obtain a Kriging metamodel that preserves this characteristic, this article uses distribution-free bootstrapping assuming each input combination is simulated several times to obtain more reliable averaged outputs. Nevertheless, these averages still show sampling variation, so the Kriging metamodel does not need to be an exact interpolator; bootstrapping gives a noninterpolating Kriging metamodel. Bootstrapping may use standard Kriging software. The method is illustrated through the popular M/M/1 model with either the mean or the 90% quantile as output; these outputs are monotonic functions of the traffic rate. The empirical results demonstrate that monotonicity-preserving bootstrapped Kriging gives higher probability of covering the true outputs, without lengthening the confidence interval.
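Two ingredients of this setup, replicated M/M/1 simulation and distribution-free bootstrapping over replicate averages, can be sketched as below. The monotonicity-preserving Kriging layer is not reproduced; the replication counts and the percentile-CI form are illustrative assumptions. For lambda = 0.5 and mu = 1 the steady-state mean waiting time in queue is lambda/(mu*(mu - lambda)) = 1.

```python
import random

def mm1_mean_wait(lam, mu, n_customers, rng):
    """Average waiting time in queue from one replication, via the Lindley
    recursion W_{k+1} = max(0, W_k + S_k - A_{k+1}), starting empty."""
    w, total = 0.0, 0.0
    for _ in range(n_customers):
        total += w
        s = rng.expovariate(mu)    # service time
        a = rng.expovariate(lam)   # next interarrival time
        w = max(0.0, w + s - a)
    return total / n_customers

def bootstrap_ci(values, n_boot, rng, alpha=0.05):
    """Distribution-free percentile CI for the mean of replicate averages."""
    means = sorted(
        sum(rng.choice(values) for _ in values) / len(values)
        for _ in range(n_boot)
    )
    lo = means[int(alpha / 2 * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

Resampling the replicate averages, rather than individual waiting times, is what makes the procedure distribution-free with respect to the within-run dependence of queueing output.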

15.
Computational methods for the failure possibility and fuzzy failure probability of structures
Based on fuzzy possibility theory, reliability computation models for structures containing fuzzy variables are systematically established, aiming to solve the reliability computation problems of fuzzy structures, fuzzy-random structures, and structures under the fuzzy-state assumption. The models yield the failure possibility of fuzzy structures and the possibility distribution of the failure probability of fuzzy-random structures. The study shows that, for hybrid reliability problems containing both fuzzy and random variables, treating the failure probability (or reliability) as a fuzzy variable reflects the safety state of the system more objectively. Numerical examples illustrate the rationality and effectiveness of the proposed methods.

16.
The huge computational overhead is the main challenge in applying population-based optimization methods, such as multi-objective particle swarm optimization and multi-objective genetic algorithms, to multi-objective optimization involving costly simulations. This paper proposes a Kriging metamodel assisted multi-objective particle swarm optimization method to solve such expensive black-box multi-objective optimization problems. On the basis of the crowding distance based multi-objective particle swarm optimization algorithm, the proposed method adaptively constructs a Kriging metamodel for each expensive objective function, and the non-dominated solutions of the metamodels are then utilized to guide the update of the particle population. To reduce the computational cost, the generalized expected improvement of each particle predicted by the metamodels is used to determine which particles need to perform actual function evaluations. The suggested method is tested on 12 benchmark functions and compared with the original crowding distance based multi-objective particle swarm optimization algorithm and the non-dominated sorting genetic algorithm II (NSGA-II). The test results show that the Kriging metamodel improves the search ability and reduces the number of evaluations. Additionally, the method is applied to the optimal design of a cycloid gear pump and achieves desirable results.

17.
A Kriging surrogate model with a coordinate transformation of the design space is proposed to improve the approximation accuracy for objective functions with correlated design variables; Kriging surrogates are frequently employed when applying evolutionary computation to real-world problems. The coordinate transformation is conducted to extract significant trends in the objective function and identify a suitable coordinate system based on one of two criteria: the likelihood function, or the estimated gradients of the objective function with respect to each design variable. Compared with the ordinary Kriging model, the proposed methods show higher accuracy in approximating various test functions. The method based on likelihood shows higher accuracy than that based on gradients when the number of design variables is less than six. The latter method achieves higher accuracy than the ordinary Kriging model even for high-dimensional functions, and is applied to an airfoil design problem with spline curves as an example with correlated design variables. It achieves better performance not only in approximation accuracy but also in the capability to explore the optimal solution.

18.
An efficient approach, called augmented line sampling, is proposed to locally evaluate the failure probability function (FPF) in structural reliability-based design using only one reliability analysis run of line sampling. The novelty of this approach is that it re-uses the information from a single line sampling analysis to construct the FPF estimate, so that repeated evaluations of the failure probability are avoided. It is shown that, when the design parameters are the distribution parameters of the basic random variables, the desired information about the FPF can be extracted from a single implementation of line sampling. Line sampling is a highly efficient and widely used reliability analysis method; the proposed method extends traditional line sampling from failure probability estimation to evaluation of the FPF, which is a challenging task. The required computational effort is neither particularly sensitive to the number of uncertain parameters nor grows with the number of design parameters. Numerical examples are given to show the advantages of the approach.
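Basic (non-augmented) line sampling, the building block the abstract extends, can be sketched as follows for a limit state in standard normal space. The important direction, the root bracket, and the test limit state are illustrative assumptions; for a linear limit state with reliability index beta the estimate equals Phi(-beta) exactly.

```python
import math
import random

def Phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def line_sampling_pf(g, alpha, dim, n_lines, rng, c_max=8.0):
    """Line sampling estimate of P[g(U) <= 0] in standard normal space.
    alpha: unit important direction; each line is u_perp + c * alpha."""
    pf = 0.0
    for _ in range(n_lines):
        u = [rng.gauss(0.0, 1.0) for _ in range(dim)]
        dot = sum(ui * ai for ui, ai in zip(u, alpha))
        u_perp = [ui - dot * ai for ui, ai in zip(u, alpha)]  # project out alpha
        # bisection for the root c* of g(u_perp + c*alpha) = 0 on [0, c_max];
        # assumes g > 0 at c = 0 (safe) and g < 0 at c_max (failed)
        lo, hi = 0.0, c_max
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            point = [pi + mid * ai for pi, ai in zip(u_perp, alpha)]
            if g(point) > 0.0:
                lo = mid
            else:
                hi = mid
        pf += Phi(-0.5 * (lo + hi))  # exact 1-D probability along the line
    return pf / n_lines
```

Each line contributes the exact one-dimensional probability Phi(-c*), which is why the estimator's variance is so much lower than crude Monte Carlo for small failure probabilities.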

19.
A new computational method is proposed to comprehensively evaluate the positional accuracy reliability of industrial robots for single-coordinate, single-point, multipoint, and trajectory accuracy, using the sparse grid numerical integration method and the saddlepoint approximation method. A kinematic error model of the end-effector is constructed in three coordinate directions using the sparse grid numerical integration method, considering uncertain parameters. The first four moments and the covariance matrix for the three coordinates of the end-effector are calculated by extended Gauss–Hermite integration nodes and corresponding weights. An eigen-decomposition is conducted to transform the interdependent coordinates into independent standard normal variables. An equivalent extreme value distribution of the response is applied to assess the reliability of kinematic accuracy. The probability density function and probability of failure for the extreme value distribution are then derived through the saddlepoint approximation method. Four examples are given to demonstrate the effectiveness of the proposed method.

20.
This paper investigates the use of Kriging in random simulation when the simulation output variances are not constant. Kriging gives a response surface or metamodel that can be used for interpolation. Because Ordinary Kriging assumes constant variances, this paper also applies Detrended Kriging to estimate a non-constant signal function, and then standardizes the residual noise through the heterogeneous variances estimated from replicated simulation runs. Numerical examples, however, suggest that Ordinary Kriging is a robust interpolation method.
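A minimal Ordinary Kriging interpolator of the kind compared in this abstract can be sketched as below. This is a bare-bones version under stated assumptions: a Gaussian correlation function with a fixed correlation parameter `theta` (no maximum likelihood estimation) and a tiny nugget for numerical stability, so it exhibits the interpolation property the paper examines.

```python
import numpy as np

def ok_fit(X, y, theta=10.0, nugget=1e-10):
    """Ordinary Kriging with Gaussian correlation R_ij = exp(-theta*||xi-xj||^2).
    Returns a predictor; 'theta' is fixed (no MLE), 'nugget' stabilizes R."""
    X = np.atleast_2d(np.asarray(X, float))
    y = np.asarray(y, float)
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    R = np.exp(-theta * d2) + nugget * np.eye(n)
    Rinv = np.linalg.inv(R)
    ones = np.ones(n)
    mu = (ones @ Rinv @ y) / (ones @ Rinv @ ones)  # GLS constant trend
    w = Rinv @ (y - mu * ones)

    def predict(x):
        r = np.exp(-theta * ((X - np.asarray(x, float)) ** 2).sum(-1))
        return mu + r @ w
    return predict
```

With a (near-)zero nugget the predictor reproduces the training outputs exactly, which is precisely the assumption that becomes questionable when replicated runs show heterogeneous noise.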

