Similar Articles
20 similar articles found.
1.
This paper proposes a method combining a projection-outline-based active learning strategy with a Kriging metamodel for reliability analysis of structures with mixed random and convex variables. The method rests on the observation that what matters for estimating the failure probability is the approximation accuracy of the projection outlines on the limit-state surface, rather than of the whole limit-state surface. To improve the approximation accuracy of the projection outlines efficiently, a new projection-outline-based active learning strategy is developed to sequentially obtain update points located around the projection outlines. To account for the influence of metamodel uncertainty on the estimated failure probability, a quantification function for metamodel uncertainty is developed and introduced into the stopping condition of the Kriging metamodel update. Finally, Monte Carlo simulation is employed to calculate the failure probability based on the refined Kriging metamodel. Four examples, including the Burro Creek Bridge and a piezoelectric energy harvester, are used to validate the performance of the proposed method. The results indicate that the proposed method is accurate and efficient for reliability analysis of structures with mixed random and convex variables.
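The final step described above, Monte Carlo simulation on the refined metamodel, reduces to counting limit-state violations. A minimal sketch follows; the limit-state function g and the input distributions are hypothetical stand-ins, not the paper's examples.

```python
import random

def g(x1, x2):
    # Hypothetical limit-state function standing in for the refined Kriging
    # metamodel: the structure fails when g < 0.
    return 7.0 - x1 ** 2 - x2

def mc_failure_probability(n=100_000, seed=0):
    # Crude Monte Carlo estimate of Pf = P(g(X1, X2) < 0) with X1, X2 ~ N(1, 1).
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n) if g(rng.gauss(1.0, 1.0), rng.gauss(1.0, 1.0)) < 0
    )
    return failures / n

pf = mc_failure_probability()
```

In the paper's setting the expensive part is refining the metamodel, not this counting loop, which is why an accurate, cheap surrogate makes plain Monte Carlo affordable.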

2.
The present study develops a support vector regression-based metamodeling approach for efficient seismic reliability analysis of structures. Metamodeling approaches such as the response surface method, Kriging interpolation, and artificial neural networks are usually adopted to overcome the computational burden of simulation-based seismic reliability analysis. However, the approximation capability of such empirical-risk-minimization-based metamodels is strongly affected by the number of training samples. Support vector regression, based on the principle of structural risk minimization, has shown improved response approximation ability with small training sets. That approach is explored here for improved estimation of the seismic reliability of structures within a Monte Carlo simulation framework. The parameters needed to construct the metamodel are obtained by a simple and effective search algorithm that solves an optimization sub-problem minimizing the mean square error estimated by cross-validation. The simulation technique is readily applied by random selection of the metamodel, implicitly accounting for record-to-record variations of earthquakes. Without additional computational burden, the approach avoids a prior distributional assumption about the approximated structural response, unlike the commonly used dual response surface method. The effectiveness of the proposed approach compared with the usual polynomial response surface and neural network metamodels is demonstrated numerically.
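The parameter search described above, minimizing a cross-validation estimate of the mean square error, can be sketched with a toy metamodel. The Nadaraya-Watson smoother below merely stands in for the support vector regressor (one bandwidth parameter h instead of the SVR hyperparameters), and the data are made up.

```python
import math

def kernel_predict(x, xs, ys, h):
    # Nadaraya-Watson smoother with Gaussian weights; a stand-in for the
    # support vector regressor, exposing a single parameter h.
    w = [math.exp(-((x - xi) / h) ** 2) for xi in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

def loo_cv_mse(xs, ys, h):
    # Leave-one-out cross-validation: the mean-square-error criterion that
    # the parameter search minimizes.
    err = 0.0
    for i in range(len(xs)):
        rest_x = xs[:i] + xs[i + 1:]
        rest_y = ys[:i] + ys[i + 1:]
        err += (ys[i] - kernel_predict(xs[i], rest_x, rest_y, h)) ** 2
    return err / len(xs)

xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
ys = [math.sin(x) for x in xs]  # made-up "simulation" responses
best_h = min([0.1, 0.3, 0.5, 1.0, 2.0], key=lambda h: loo_cv_mse(xs, ys, h))
```

A real implementation would search the SVR's penalty, tube width, and kernel parameters the same way, with the simulation outputs as training data.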

3.
Building on an equipment battlefield-damage simulation system, this paper studies the construction of Bayesian network simulation metamodels. The mapping between the simulation model's input and output parameters is described from a conditional-probability perspective, the feasibility of building Bayesian network simulation metamodels is examined, and their advantages are analyzed. Key issues in the construction process are studied, including determining the metamodel parameters, converting original model parameters into Bayesian network nodes, computing link strengths, and building derived metamodels. For rapid localization of equipment battlefield damage under incomplete information, a construction method for Bayesian network simulation metamodels based on the K2 algorithm is studied, and a battlefield-damage Bayesian network simulation metamodel of a certain type of anti-aircraft gun is constructed.
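The conditional-probability view of the input-output mapping can be illustrated with a toy two-node network; all names and probabilities below are invented, and damage localization is just Bayes' rule over the network (a learned metamodel would obtain such tables from simulation data, e.g. via K2).

```python
# Hypothetical two-node network (Damage -> Symptom) with made-up conditional
# probability tables; a real metamodel would learn these from simulation runs.
p_damage = {"hit": 0.3, "ok": 0.7}
p_symptom = {"hit": {"alarm": 0.9, "quiet": 0.1},
             "ok":  {"alarm": 0.2, "quiet": 0.8}}

def posterior(symptom):
    # Bayes' rule: P(damage | symptom) from the prior and the CPT -- this is
    # how the network localizes damage from an incomplete observation.
    joint = {d: p_damage[d] * p_symptom[d][symptom] for d in p_damage}
    z = sum(joint.values())
    return {d: p / z for d, p in joint.items()}

post = posterior("alarm")
```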

4.
This paper investigates the use of Kriging in random simulation when the simulation output variances are not constant. Kriging gives a response surface or metamodel that can be used for interpolation. Because Ordinary Kriging assumes constant variances, this paper also applies Detrended Kriging to estimate a non-constant signal function, and then standardizes the residual noise through the heterogeneous variances estimated from replicated simulation runs. Numerical examples, however, suggest that Ordinary Kriging is a robust interpolation method.
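A bare-bones 1-D Ordinary Kriging interpolator makes the setup concrete; the Gaussian correlation function, the fixed theta, and the data below are illustrative assumptions, not the paper's experiments.

```python
import math

def solve(a, b):
    # Naive Gaussian elimination with partial pivoting (fine for tiny systems).
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def ordinary_kriging(xs, ys, x0, theta=1.0):
    # Ordinary Kriging predictor in 1-D with Gaussian correlation
    # R_ij = exp(-theta * (x_i - x_j)^2). The unbiasedness constraint
    # sum(w) = 1 is appended via a Lagrange multiplier.
    n = len(xs)
    corr = lambda u, v: math.exp(-theta * (u - v) ** 2)
    A = [[corr(xs[i], xs[j]) for j in range(n)] + [1.0] for i in range(n)]
    A.append([1.0] * n + [0.0])
    rhs = [corr(xi, x0) for xi in xs] + [1.0]
    w = solve(A, rhs)[:n]
    return sum(wi * yi for wi, yi in zip(w, ys))

xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 1.0, 4.0, 9.0]  # made-up noise-free outputs
```

With noise-free outputs the predictor interpolates the data exactly, which is the Ordinary Kriging behavior the paper tests against heterogeneous-variance alternatives.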

5.
The major purpose of this paper is to evaluate the practical use of statistical techniques in both the generalization and analysis of simulation results and the design of simulation experiments. This problem is investigated with the help of a real-life system, namely the ECT container terminal in Rotterdam. The system is modeled by a simulation program. The relationship between the simulation response and its input variables is modeled by a linear regression model: a metamodel or auxiliary model. The paper summarizes regression analysis, including generalized least squares, which may be used for simulation responses with non-constant variances. The validity of the postulated regression metamodel is tested statistically using F- and t-statistics. The selection of the situations to be simulated is done through experimental design methodology, permitting both quantitative and qualitative factors. The statistical techniques apply not only to simulation but also to real-life experiments.
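The regression metamodel with generalized least squares can be sketched in its simplest diagonal-covariance form, weighted least squares, which is the case relevant to independent responses with non-constant variances; the data below are invented.

```python
def fit_line_wls(xs, ys, variances):
    # Weighted least squares for y ~ b0 + b1 * x: the diagonal-covariance
    # special case of generalized least squares, appropriate when replicated
    # runs reveal heterogeneous response variances. With equal variances it
    # reduces to ordinary least squares.
    w = [1.0 / v for v in variances]
    sw = sum(w)
    mx = sum(wi * x for wi, x in zip(w, xs)) / sw
    my = sum(wi * y for wi, y in zip(w, ys)) / sw
    b1 = (sum(wi * (x - mx) * (y - my) for wi, x, y in zip(w, xs, ys))
          / sum(wi * (x - mx) ** 2 for wi, x in zip(w, xs)))
    b0 = my - b1 * mx
    return b0, b1

# Hypothetical simulation responses at four input settings, equal variances.
b0, b1 = fit_line_wls([1.0, 2.0, 3.0, 4.0], [2.1, 3.9, 6.2, 7.8], [1.0] * 4)
```

The F- and t-tests the paper applies would then be computed from the residuals of this fitted metamodel.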

6.
This paper reports on the specification of a tomato processing plant cost-function using the Schruben-Cogliano (S-C) experimental procedure for response surface identification. The method is applied to a simulation model of a tomato processing plant in order to generate a cost-function metamodel that can be used to derive conditional factor-demand equations. The results suggest that the S-C methodology provides a means of deriving conditional factor-demand equations which reduces cost-function specification error.

7.
Simulation experiments are often analyzed through a linear regression model of their input/output data. Such an analysis yields a metamodel or response surface for the underlying simulation model. This metamodel can be validated through various statistics; this article studies (1) the coefficient of determination (R-square) for generalized least squares, and (2) a lack-of-fit F-statistic originally formulated by Rao [Biometrika 46 (1959) 49], who assumed multivariate normality. To derive the distributions of these two validation statistics, this paper shows how to apply bootstrapping—without assuming normality. To illustrate the performance of these bootstrapped validation statistics, the paper uses Monte Carlo experiments with simple models. For these models (i) R-square is a conservative statistic (rejecting a valid metamodel relatively rarely), so its power is low; (ii) Rao’s original statistic may reject a valid metamodel too often; (iii) bootstrapping Rao’s statistic gives only slightly conservative results, so its power is relatively high.
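The bootstrapping idea for a validation statistic can be sketched for R-square; the data are made up, and resampling (x, y) pairs is a simplification of the paper's procedure, shown only to illustrate deriving a statistic's distribution without a normality assumption.

```python
import random

def r_square(xs, ys):
    # Coefficient of determination for a straight-line fit (equivalently,
    # the squared sample correlation).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

def bootstrap_r2(xs, ys, b=1000, seed=1):
    # Distribution-free bootstrap: resample (x, y) pairs with replacement
    # and recompute the validation statistic each time.
    rng = random.Random(seed)
    n = len(xs)
    out = []
    for _ in range(b):
        idx = [rng.randrange(n) for _ in range(n)]
        bx = [xs[i] for i in idx]
        by = [ys[i] for i in idx]
        if len(set(bx)) == 1 or len(set(by)) == 1:
            continue  # degenerate resample, skip
        out.append(r_square(bx, by))
    return sorted(out)

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0]
ys = [0.1, 1.0, 1.4, 2.7, 3.1, 4.2, 4.6, 5.9]  # near-linear toy data
dist = bootstrap_r2(xs, ys)
lower5 = dist[len(dist) // 20]  # a bootstrap 5th-percentile critical value
```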

8.
The use of a mathematical metamodel such as a regression model, constructed from simulation data and used to aid in the analysis of the simulated system, has been studied in recent years. For practitioners, the vast benefits of establishing a functional relationship among the variables in an unfamiliar and complex simulated system may be largely overshadowed by the concern that the metamodel, being a strongly data-based technique, may be valid only for the one particular set of simulation-generated data that went into it, which is to say not valid at all. Based on a study of 30 simulation experiments using three different simulation models, the authors conclude that the simulation metamodel is a reliable and valid technique to use in post-simulation analysis, and is probably just as good as the simulation model on which it is based.

9.
This article explores the use of metamodels as simulation building blocks. The metamodel replaces a part of the simulation model with a mathematical function that mimics the input–output behavior of that part, with respect to some measure of interest to the designer. The integration of metamodels as components of the simulation model simplifies the model and reduces the simulation time. Such use of the metamodels also gives the designer a better understanding of the behavior of those parts of the model, making the simulation model as a whole more intelligible. The metamodel-based simulation model building process is examined, step by step, and the designer options are explored. This process includes the identification of the metamodel candidates and the construction of the metamodels themselves. The assessment of the proposed approach includes the evaluation of the integration effort of the metamodel into the metamodel-based simulation model, and the accuracy of the output data when compared to the original system.

10.
A method of finding the optimum solution for a stochastic discrete-event system is described. A simulation model of the system is first built and then used to train a neural network metamodel. The optimisation process consists of using the metamodel to find an approximate optimum solution. This solution is then used by the simulation as the starting point in a more precise search for an optimum. The approach is demonstrated with an example that finds the optimum number of kanbans needed to control a manufacturing system.
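The two-stage idea, a cheap metamodel search followed by refinement against the simulation, can be sketched as follows. A quadratic fit stands in for the neural network metamodel, and the kanban cost function is invented.

```python
def simulate_cost(k):
    # Stand-in for an expensive stochastic simulation (hypothetical costs:
    # WIP holding grows with the kanban count, shortage penalty shrinks).
    return 2.0 * k + 50.0 / k

def fit_quadratic(pts):
    # Newton divided differences through three sampled (k, cost) points;
    # this quadratic plays the role of the trained neural-network metamodel.
    (x0, y0), (x1, y1), (x2, y2) = pts
    s1 = (y1 - y0) / (x1 - x0)
    s2 = (y2 - y1) / (x2 - x1)
    c = (s2 - s1) / (x2 - x0)
    b = s1 - c * (x0 + x1)
    a = y0 - b * x0 - c * x0 * x0
    return a, b, c

def local_search(k0):
    # Stage two: refine the metamodel's rough optimum with the "real" simulation.
    k = max(1, k0)
    while True:
        cand = min({max(1, k - 1), k, k + 1}, key=simulate_cost)
        if cand == k:
            return k
        k = cand

samples = [(2, simulate_cost(2)), (10, simulate_cost(10)), (25, simulate_cost(25))]
a, b, c = fit_quadratic(samples)
k_start = round(-b / (2 * c)) if c > 0 else min(samples)[0]  # metamodel optimum
k_opt = local_search(int(k_start))
```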

11.
This paper presents a novel approach to simulation metamodeling using dynamic Bayesian networks (DBNs) in the context of discrete event simulation. A DBN is a probabilistic model that represents the joint distribution of a sequence of random variables and enables the efficient calculation of their marginal and conditional distributions. In this paper, the construction of a DBN based on simulation data and its utilization in simulation analyses are presented. The DBN metamodel allows the study of the time evolution of simulation by tracking the probability distribution of the simulation state over the duration of the simulation. This feature is unprecedented among existing simulation metamodels. The DBN metamodel also enables effective what-if analysis which reveals the conditional evolution of the simulation. In such an analysis, the simulation state at a given time is fixed and the probability distributions representing the state at other time instants are updated. Simulation parameters can be included in the DBN metamodel as external random variables. Then, the DBN offers a way to study the effects of parameter values and their uncertainty on the evolution of the simulation. The accuracy of the analyses allowed by DBNs is studied by constructing appropriate confidence intervals. These analyses could be conducted based on raw simulation data but the use of DBNs reduces the duration of repetitive analyses and is expedited by available Bayesian network software. The construction and analysis capabilities of DBN metamodels are illustrated with two example simulation studies.
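The headline feature above, tracking the probability distribution of the simulation state over time, can be illustrated with a two-state Markov chain; the states and transition probabilities are made up.

```python
# Hypothetical two-state machine model (idle/busy) with a made-up one-step
# transition matrix; a learned DBN would encode such distributions from data.
states = ("idle", "busy")
T = {"idle": {"idle": 0.7, "busy": 0.3},
     "busy": {"idle": 0.4, "busy": 0.6}}

def evolve(dist, steps):
    # Exact forward propagation of the state distribution over time -- the
    # kind of query a DBN metamodel answers without re-running the simulation.
    for _ in range(steps):
        dist = {s: sum(dist[r] * T[r][s] for r in states) for s in states}
    return dist

d = evolve({"idle": 1.0, "busy": 0.0}, 50)
```

Fixing the state at some time instant and renormalizing before propagating gives the conditional ("what-if") evolution the abstract describes.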

12.
Simulation Optimization (SO) is a class of mathematical optimization techniques in which the objective function can only be numerically evaluated through simulation. In this paper, a new SO approach called Golden Region (GR) search is developed for continuous problems. GR divides the feasible region into a number of (sub) regions and selects one region in each iteration for further search based on the quality and distribution of simulated points in the feasible region and the result of scanning the response surface through a metamodel. Monte Carlo experiments show that the GR method is efficient compared to three well-established approaches in the literature. We also prove the asymptotic convergence in probability to a global optimum for a large class of random search methods in general and GR in particular.
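The core loop, partition the feasible region and then favor promising regions while still exploring, can be caricatured in a few lines. The objective, the region-scoring rule (best value seen so far), and all constants below are invented for illustration and are far simpler than GR's actual criterion based on point quality, point distribution, and a metamodel scan.

```python
import random

def f(x):
    # Toy continuous objective (hypothetical): global minimum f(7) = 1.
    return (x - 7.0) ** 2 + 1.0

def region_search(lo=0.0, hi=10.0, n_regions=5, iters=40, seed=3):
    # Crude sketch of region-based search: seed every region once, then
    # mostly sample the most promising region, with periodic exploration.
    rng = random.Random(seed)
    edges = [lo + (hi - lo) * i / n_regions for i in range(n_regions + 1)]
    best = {}
    best_x = None

    def sample(r):
        nonlocal best_x
        x = rng.uniform(edges[r], edges[r + 1])
        y = f(x)
        best[r] = min(best.get(r, float("inf")), y)
        if best_x is None or y < f(best_x):
            best_x = x

    for r in range(n_regions):          # seed each region
        sample(r)
    for it in range(iters):             # exploit best region, explore sometimes
        sample(rng.randrange(n_regions) if it % 4 == 0 else min(best, key=best.get))
    return best_x

x_star = region_search()
```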

13.
Kriging metamodels (also called Gaussian process or spatial correlation models) approximate the Input/Output functions implied by the underlying simulation models. Such metamodels serve sensitivity analysis, especially for computationally expensive simulations. In practice, simulation analysts often know that this Input/Output function is monotonic. To obtain a Kriging metamodel that preserves this characteristic, this article uses distribution-free bootstrapping assuming each input combination is simulated several times to obtain more reliable averaged outputs. Nevertheless, these averages still show sampling variation, so the Kriging metamodel does not need to be an exact interpolator; bootstrapping gives a noninterpolating Kriging metamodel. Bootstrapping may use standard Kriging software. The method is illustrated through the popular M/M/1 model with either the mean or the 90% quantile as output; these outputs are monotonic functions of the traffic rate. The empirical results demonstrate that monotonicity-preserving bootstrapped Kriging gives higher probability of covering the true outputs, without lengthening the confidence interval.

14.
In an integrated circuit (IC) packaging plant, the ink-marking machine has a significantly higher throughput than the other processing machines. When periodic demand surges result in backlog orders or in lost customers, there is a need to increase system throughput. To resolve this problem, the purchase of a new machine often results in excess capacity in addition to added operation and acquisition costs. Therefore, the productivity improvement effort has priority over the machine purchase decision. This paper seeks to optimize both throughput and cycle time performance for IC ink-marking machines. While throughput increase is the primary objective, there is an acceptable cycle time limit for a feasible solution. It is a multi-objective problem. The proposed solution methodology constructed a simulation metamodel for the ink-marking operation by using a fractional factorial experimental design and regression analysis. It is then solved by a hybrid response surface method and lexicographical goal programming approach. Solution results illustrated a successful application.

15.
This paper discusses the use of modern heuristic techniques coupled with a simulation model of a Just-in-Time system to find the optimum number of kanbans while minimizing cost. Three simulation search heuristics, based on Genetic Algorithms, Simulated Annealing, and Tabu Search, are developed and compared with respect to both the best results achieved by each algorithm in a limited time span and their speed of convergence to those results. In addition, a Neural Network metamodel is developed and compared with the heuristic procedures on the basis of the best results. The results indicate that Tabu Search outperforms the other heuristics and the Neural Network metamodel in terms of computational effort.
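Of the three heuristics, simulated annealing is the easiest to sketch for the kanban-count problem; the cost function below is hypothetical, and the stochastic simulation is collapsed to a deterministic formula for brevity.

```python
import math
import random

def cost(k):
    # Hypothetical kanban cost: WIP holding grows with k, shortage shrinks
    # (a deterministic stand-in for the JIT simulation response).
    return 3.0 * k + 120.0 / k

def anneal(k0=20, t0=10.0, cooling=0.95, steps=200, seed=7):
    rng = random.Random(seed)
    k, t, best = k0, t0, k0
    for _ in range(steps):
        cand = max(1, k + rng.choice((-1, 1)))  # neighboring kanban count
        delta = cost(cand) - cost(k)
        # Metropolis acceptance: always take improvements, sometimes accept
        # worse moves while the temperature is still high.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            k = cand
        if cost(k) < cost(best):
            best = k
        t *= cooling
    return best

k_best = anneal()
```

A genetic-algorithm or tabu-search variant would differ only in how candidate kanban counts are generated and accepted.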

16.
Based on the reliability of transportation time, a transportation assignment model for a stochastic-flow freight network is designed in this paper. The model is built by means of stochastic chance-constrained programming and solved with a hybrid intelligent algorithm (HIA) that integrates a genetic algorithm (GA), stochastic simulation (SS), and a neural network (NN). The GA reports the optimal solution and the optimal objective function value of the proposed model. SS is used to simulate the value of the uncertain system reliability function. The uncertain function approximated via the NN is embedded into the GA to check feasibility and to compute the fitness of the chromosomes. A numerical test case using the proposed formulation supports the following conclusions: the system reliability, total system cost, and flow on each path each converge; an increase in system reliability causes an increase in the total time cost; and the system reliability and total time cost converge at a possible Nash equilibrium point.

17.
Researchers have long struggled to identify causal effects in nonexperimental settings. Many recently proposed strategies assume ignorability of the treatment assignment mechanism and require fitting two models—one for the assignment mechanism and one for the response surface. This article proposes a strategy that instead focuses on very flexibly modeling just the response surface using a Bayesian nonparametric modeling procedure, Bayesian Additive Regression Trees (BART). BART has several advantages: it is far simpler to use than many recent competitors, requires less guesswork in model fitting, handles a large number of predictors, yields coherent uncertainty intervals, and fluidly handles continuous treatment variables and missing data for the outcome variable. BART also naturally identifies heterogeneous treatment effects. BART produces more accurate estimates of average treatment effects compared to propensity score matching, propensity-weighted estimators, and regression adjustment in the nonlinear simulation situations examined. Further, it is highly competitive in linear settings with the “correct” model, linear regression. Supplemental materials including code and data to replicate simulations and examples from the article as well as methods for population inference are available online.

18.
This paper proposes a novel single-loop procedure for time-variant reliability analysis based on a Kriging model. A new strategy is presented to decouple the double-loop Kriging model for time-variant reliability analysis, in which the extreme value response of the double-loop procedure is replaced by the best value among the currently sampled points, avoiding the inner optimization loop. Consequently, the extreme value response surface for time-variant reliability analysis can be established directly through a single-loop Kriging surrogate model. To further improve the accuracy of the proposed Kriging model, two methods are provided to adaptively choose a new sample point for updating the model: one applies two commonly used learning functions to select the new sample point residing as close to the extreme value response surface as possible, and the other applies a new learning function. Corresponding stopping criteria are provided for each. It is worth noting that the proposed single-loop Kriging model applies to a single time-variant performance function. To verify the proposed method, it is applied to four examples, two of which involve a random process and two of which do not. Other popular methods for time-variant reliability analysis, including the existing single-loop Kriging model, are also used for comparison, and their results testify to the effectiveness of the proposed method.

19.
For many years, metamodels have been used in simulation to provide approximations to the input–output functions provided by a simulation model. In this paper, metamodels based on radial basis functions are applied to approximate test functions known from the literature. These tests were conducted to gain insights into the behavior of these metamodels, their ease of computation and their ability to capture the shape and minima of the test functions. These metamodels are compared against polynomial metamodels by using surface and contour graphs of the error function (difference between metamodel and the given function) and by evaluating the numerical stability of the required computations. Full factorial and Latin hypercube designs were used to fit the metamodels. Graphical and statistical methods were used to analyze the test results. Factorial designs were generally more successful for fitting the test functions as compared to Latin hypercube designs. Radial basis function metamodels using factorial and Latin hypercube designs provided better fit than polynomial metamodels using full factorial designs.
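A minimal Gaussian radial-basis-function interpolator shows the fitting step: solve the square linear system Phi w = y at the design points. The design, basis width, and data below are illustrative assumptions only.

```python
import math

def solve(a, b):
    # Naive Gaussian elimination with partial pivoting (tiny systems only).
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def rbf_fit(xs, ys, c=1.0):
    # Gaussian basis phi(r) = exp(-(r/c)^2): the interpolation weights solve
    # the square system Phi w = y built from pairwise design-point distances.
    phi = lambda r: math.exp(-((r / c) ** 2))
    A = [[phi(abs(xi - xj)) for xj in xs] for xi in xs]
    return solve(A, ys)

def rbf_eval(x, xs, w, c=1.0):
    phi = lambda r: math.exp(-((r / c) ** 2))
    return sum(wi * phi(abs(x - xi)) for wi, xi in zip(w, xs))

xs = [0.0, 1.0, 2.0, 3.0]   # a tiny factorial-style design
ys = [1.0, 2.0, 0.0, 5.0]   # made-up "simulation" outputs
w = rbf_fit(xs, ys)
```

The numerical-stability concern the abstract raises shows up in exactly this system: for dense designs or large basis widths the matrix Phi becomes ill-conditioned.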

20.
Metamodels are used as analysis tools for solving optimization problems. A metamodel is a simplification of the simulation model, representing the system's input–output relationship through a mathematical function with customized parameters. The proposed approach uses confidence intervals as an acceptance procedure, and as a predictive validation procedure when new points are employed. To improve knowledge about the system, the response is depicted by modelling both the mean and variance functions of a normal distribution over the experimental region. Such metamodels are especially useful when the variance of the output varies significantly. They may be used for minimizing product quality loss and improving production robustness. The development of such metamodels is illustrated with two examples.
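The per-design-point statistics that the mean and variance metamodels are fitted to, and the resulting normal prediction interval, can be sketched as follows; the replicated outputs are made up, and a real study would then smooth these statistics over the experimental region.

```python
import math

def point_stats(reps):
    # Per-design-point sample mean and variance from replicated runs: the
    # raw quantities the mean and variance metamodels are fitted to.
    means = [sum(r) / len(r) for r in reps]
    vars_ = [sum((v - m) ** 2 for v in r) / (len(r) - 1)
             for r, m in zip(reps, means)]
    return means, vars_

def normal_interval(mean, var, z=1.96):
    # Approximate 95% interval from the fitted normal response model.
    s = math.sqrt(var)
    return mean - z * s, mean + z * s

means, vars_ = point_stats([[1.0, 2.0, 3.0], [4.0, 6.0, 8.0]])
lo, hi = normal_interval(means[0], vars_[0])
```

Intervals like (lo, hi) are what the acceptance and predictive-validation procedures would check new simulation points against.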
