Similar Documents
20 similar documents found (search time: 46 ms)
1.
This article explores the use of metamodels as simulation building blocks. The metamodel replaces a part of the simulation model with a mathematical function that mimics the input–output behavior of that part, with respect to some measure of interest to the designer. The integration of metamodels as components of the simulation model simplifies the model and reduces the simulation time. Such use of the metamodels also gives the designer a better understanding of the behavior of those parts of the model, making the simulation model as a whole more intelligible. The metamodel-based simulation model building process is examined, step by step, and the designer's options are explored. This process includes the identification of the metamodel candidates and the construction of the metamodels themselves. The assessment of the proposed approach includes the evaluation of the effort of integrating the metamodel into the metamodel-based simulation model, and the accuracy of the output data when compared to the original system.

2.
For many years, metamodels have been used in simulation to provide approximations to the input–output functions provided by a simulation model. In this paper, metamodels based on radial basis functions are applied to approximate test functions known from the literature. These tests were conducted to gain insights into the behavior of these metamodels, their ease of computation and their ability to capture the shape and minima of the test functions. These metamodels are compared against polynomial metamodels by using surface and contour graphs of the error function (difference between metamodel and the given function) and by evaluating the numerical stability of the required computations. Full factorial and Latin hypercube designs were used to fit the metamodels. Graphical and statistical methods were used to analyze the test results. Factorial designs were generally more successful for fitting the test functions than Latin hypercube designs. Radial basis function metamodels using factorial and Latin hypercube designs provided a better fit than polynomial metamodels using full factorial designs.
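The core of such a test can be sketched in a few lines. The test function, grid size, and shape parameter below are illustrative choices, not those of the paper: a Gaussian radial basis function interpolant is fitted on a full factorial design and its error is measured at random test points.

```python
import numpy as np

def fit_rbf(X, y, eps=3.0):
    """Fit a Gaussian RBF interpolant: the weights solve Phi w = y."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    Phi = np.exp(-(eps * d) ** 2)
    return np.linalg.solve(Phi, y)

def predict_rbf(X_train, w, X_new, eps=3.0):
    d = np.linalg.norm(X_new[:, None, :] - X_train[None, :, :], axis=2)
    return np.exp(-(eps * d) ** 2) @ w

# Full factorial design on [0,1]^2: 5 levels per factor, 25 points
g = np.linspace(0.0, 1.0, 5)
X = np.array([(a, b) for a in g for b in g])
f = lambda x: np.sin(2 * np.pi * x[:, 0]) + x[:, 1] ** 2   # illustrative test function
w = fit_rbf(X, f(X))

# error at random off-design points
X_test = np.random.default_rng(0).uniform(size=(100, 2))
rmse = float(np.sqrt(np.mean((predict_rbf(X, w, X_test) - f(X_test)) ** 2)))
```

By construction the RBF metamodel interpolates the design points exactly; its quality is judged by the error between design points, which is what the surface and contour graphs in the paper visualize.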

3.
This paper compares two forms of experimental design methods that may be used for the development of regression and neural network simulation metamodels. The experimental designs considered are full factorial designs and random designs. The paper shows that, for two example problems, neural network metamodels using a randomised experimental design produce more accurate and efficient metamodels than those produced by similarly sized factorial designs with either regression or neural networks. The metamodelling techniques are compared by their ability to predict the results from two manufacturing systems that have different levels of complexity. The results of the comparison suggest that neural network metamodels outperform conventional regression metamodels, especially when data sets based on randomised simulation experimental designs are used to produce the metamodels rather than data sets from similarly sized full factorial experimental designs.
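The design comparison can be sketched as follows. For brevity a linear regression metamodel stands in for the paper's neural networks, and the test function, design sizes, and seed are invented; the regression basis deliberately omits the quadratic term of the test function, so both designs leave some lack-of-fit to compare.

```python
import numpy as np

rng = np.random.default_rng(1)
# toy "simulation": quadratic in x0, so the basis below cannot fit it exactly
f = lambda X: 1.0 + 2.0 * X[:, 0] - 3.0 * X[:, 1] + X[:, 0] * X[:, 1] + X[:, 0] ** 2

def design_matrix(X):
    # intercept, linear, and interaction terms (no quadratic term)
    return np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1], X[:, 0] * X[:, 1]])

def fit(X, y):
    beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
    return beta

def rmse(beta, X, y):
    return float(np.sqrt(np.mean((design_matrix(X) @ beta - y) ** 2)))

# similarly sized designs: a 3x3 full factorial versus 9 random points
g = np.linspace(0.0, 1.0, 3)
X_fact = np.array([(a, b) for a in g for b in g])
X_rand = rng.uniform(size=(9, 2))

X_test = rng.uniform(size=(200, 2))
y_test = f(X_test)
err_fact = rmse(fit(X_fact, f(X_fact)), X_test, y_test)
err_rand = rmse(fit(X_rand, f(X_rand)), X_test, y_test)
```

Comparing `err_fact` and `err_rand` on a held-out test set mirrors the paper's accuracy criterion, just with a much simpler metamodel class.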

4.
5.
This paper proposes a novel method to select an experimental design for interpolation in simulation. Although the paper focuses on Kriging in deterministic simulation, the method also applies to other types of metamodels (besides Kriging), and to stochastic simulation. The paper focuses on simulations that require much computer time, so it is important to select a design with a small number of observations. The proposed method is therefore sequential. The novelty of the method is that it accounts for the specific input/output function of the particular simulation model at hand; that is, the method is application-driven or customized. This customization is achieved through cross-validation and jackknifing. The new method is tested through two academic applications, which demonstrate that the method indeed gives better results than either sequential designs based on an approximate Kriging prediction variance formula or designs with prefixed sample sizes.
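The customization idea can be illustrated in miniature. In this toy one-dimensional sketch a piecewise-linear interpolator and a sine function stand in for Kriging and a real (expensive) simulator, and leave-one-out cross-validation errors decide where the next design point goes; all numerical choices are invented for illustration.

```python
import numpy as np

f = lambda x: np.sin(4.0 * x)            # stand-in for an expensive simulation

def loo_errors(x, y):
    """Leave-one-out errors of a piecewise-linear interpolator (interior points)."""
    errs = []
    for i in range(1, len(x) - 1):       # boundary points are always kept
        xi, yi = np.delete(x, i), np.delete(y, i)
        errs.append(abs(np.interp(x[i], xi, yi) - y[i]))
    return np.array(errs)

x = np.linspace(0.0, 1.0, 5)             # small initial design
y = f(x)
for _ in range(5):                       # sequentially add 5 points
    e = loo_errors(x, y)
    i = 1 + int(np.argmax(e))            # worst-predicted interior point
    # bisect the larger gap adjacent to that point
    nb = x[i + 1] if x[i + 1] - x[i] > x[i] - x[i - 1] else x[i - 1]
    x = np.sort(np.append(x, 0.5 * (x[i] + nb)))
    y = f(x)                             # "run the simulator" at the new design
```

Because the cross-validation errors depend on the observed outputs, the resulting design is tailored to this particular input/output function, which is the paper's point.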

6.
In this study, two manufacturing systems, a kanban-controlled system and a multi-stage, multi-server production line in a diamond tool production system, are optimized utilizing neural network metamodels (tst_NNM) trained via tabu search (TS), an approach developed previously by the authors. The most widely used training algorithm for neural networks has been back propagation, which is based on a gradient technique that requires significant computational effort. To deal with the major shortcomings of back propagation (BP), such as the tendency to converge to a local optimum and a slow convergence rate, the TS metaheuristic is used for the training of artificial neural networks to improve the performance of the metamodelling approach. The metamodels are analysed based on their ability to predict simulation results, versus traditional neural network metamodels trained by the BP algorithm (bp_NNM). Computational results show that tst_NNM is superior to bp_NNM for both manufacturing systems.

7.
Kriging metamodels (also called Gaussian process or spatial correlation models) approximate the input/output functions implied by the underlying simulation models. Such metamodels serve sensitivity analysis, especially for computationally expensive simulations. In practice, simulation analysts often know that this input/output function is monotonic. To obtain a Kriging metamodel that preserves this characteristic, this article uses distribution-free bootstrapping, assuming each input combination is simulated several times to obtain more reliable averaged outputs. Nevertheless, these averages still show sampling variation, so the Kriging metamodel does not need to be an exact interpolator; bootstrapping gives a noninterpolating Kriging metamodel. Bootstrapping may use standard Kriging software. The method is illustrated through the popular M/M/1 model with either the mean or the 90% quantile as output; these outputs are monotonic functions of the traffic rate. The empirical results demonstrate that monotonicity-preserving bootstrapped Kriging gives a higher probability of covering the true outputs, without lengthening the confidence interval.
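The distribution-free bootstrap ingredient can be sketched for a single input combination of the M/M/1 example. The replication count, noise level, and use of the steady-state mean number in system ρ/(1−ρ) as the "true" output are illustrative assumptions, not the paper's setup:

```python
import random
import statistics

random.seed(42)
rho = 0.5                                  # traffic rate = lambda / mu
true_mean = rho / (1.0 - rho)              # M/M/1 steady-state mean number in system

# m replications of a noisy simulation output at one input combination
m = 30
reps = [true_mean + random.gauss(0.0, 0.3) for _ in range(m)]

# distribution-free bootstrap of the averaged output: resample replicates
B = 1000
boot_means = []
for _ in range(B):
    resample = [random.choice(reps) for _ in range(m)]
    boot_means.append(statistics.fmean(resample))
boot_means.sort()
ci = (boot_means[int(0.025 * B)], boot_means[int(0.975 * B)])  # percentile CI
```

Repeating this per input combination yields bootstrapped averages (and their spread) to which a noninterpolating Kriging model can then be fitted with standard software, as the abstract describes.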

8.
Metamodels are used in many disciplines to replace simulation models of complex multivariate systems. To assess a metamodel's quality of fit for simulation, simple average-based statistics such as the root-mean-square error (RMSE) are often used. The sample of points used in determining these averages is restricted in size, especially for simulation models of complex multivariate systems. Decisions based on average values can be misleading when the sample size is not adequate, and the contribution made by each individual data point in such samples needs to be examined. This paper presents methods for assessing metamodel quality of fit graphically by means of two-dimensional plots. Three plot types are presented: the so-called circle plots, marksman plots, and ordinal plots. Such plots facilitate visual inspection of the effect of each individual point in the validation sample on metamodel accuracy. The proposed methods can complement quantitative validation statistics, in particular when there is not enough validation data or the validation data is too expensive to generate.
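The motivating problem, an average hiding a single bad point, is easy to demonstrate. The validation pairs below are hypothetical; the per-point shares computed here are the kind of information the paper's plots display graphically:

```python
import math

# hypothetical validation sample: (metamodel prediction, simulation output)
pairs = [(10.2, 10.0), (9.8, 10.1), (15.0, 12.0), (10.1, 9.9), (9.9, 10.0)]

errors = [p - s for p, s in pairs]
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))

# per-point share of the total squared error: the RMSE alone cannot reveal
# that almost all of the misfit comes from one validation point
total = sum(e * e for e in errors)
shares = [e * e / total for e in errors]
worst = max(range(len(errors)), key=lambda i: shares[i])
```

Here one point contributes about 98% of the squared error, yet the RMSE alone gives no hint of that; plotting the per-point contributions makes it obvious at a glance.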

9.
This paper pertains to a detailed simulation study conducted on a typical Flexible Manufacturing System (FMS). Initially, the configurations of the FMS under study were established. Two types of FMS were considered: one failure-free and the other failure-prone. For each case, simulation models were developed using SLAMSYSTEM. Orders arriving randomly at the FMS are subjected to three levels of scheduling decisions, namely launching of parts into the system, routing of parts through machines, and sequencing of parts on AGVs at a central buffer. The simulation results were used to develop metamodels for the two types of FMS. These metamodels were subjected to statistical analysis to ascertain their adequacy in representing the simulation models, and were found useful for simulating the FMSs under study so as to evaluate various multi-level scheduling decisions.

10.
Dimensional and similarity analyses are used in physics and engineering, especially in fluid mechanics, to reduce the dimension of the input variable space with no loss of information. Here, we apply these techniques to the propagation of uncertainties through computer codes by the Monte Carlo method, in order to reduce the variance of the estimators of the parameters of the output variable distribution. In the physics and engineering literature, dimensional analysis is often formulated intuitively in terms of physical quantities or dimensions such as time, length, or mass; here we use the more rigorous and more abstract generalized dimensional analysis of Moran and Marshek. The reduction of dimensionality is only successful in reducing estimator variance when applying variance-reduction techniques, not when using ordinary random sampling. In this article we use stratified sampling, and the key to the success of the dimensionality reduction in improving the precision of the estimates is a better measurement of the distances between the outputs for given inputs. We illustrate the methodology with an application to a physical problem, a radioactive contaminant transport code. A substantial variance reduction is achieved for the estimators of the mean, variance, and distribution function of the output. Finally, we discuss the conditions necessary for the method to be successful.
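The variance-reduction ingredient can be shown in miniature; the paper's dimensional-analysis step is omitted here, and the toy integrand, sample sizes, and strata count are invented. Stratified sampling removes the between-strata component of the estimator variance that crude Monte Carlo retains:

```python
import random
import statistics

random.seed(0)
f = lambda u: u * u                  # toy "code output"; E[f(U)] = 1/3 for U ~ U(0,1)
n, k = 1000, 10                      # sample size and number of strata

def crude(n):
    """Ordinary random sampling estimator of E[f(U)]."""
    return statistics.fmean(f(random.random()) for _ in range(n))

def stratified(n, k):
    """Equal-allocation stratified estimator over k equal-probability strata."""
    per = n // k
    est = 0.0
    for j in range(k):
        est += statistics.fmean(f((j + random.random()) / k) for _ in range(per)) / k
    return est

# compare estimator variances over repeated runs
runs = 200
var_crude = statistics.pvariance([crude(n) for _ in range(runs)])
var_strat = statistics.pvariance([stratified(n, k) for _ in range(runs)])
```

For this integrand the stratified estimator's variance is far below the crude one at the same budget; the paper's point is that reducing the input dimension first makes such stratification (and the distance measurement behind it) more effective.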

11.
Kriging metamodeling in simulation: A review
This article reviews Kriging (also called spatial correlation modeling). It presents the basic Kriging assumptions and formulas—contrasting Kriging and classic linear regression metamodels. Furthermore, it extends Kriging to random simulation, and discusses bootstrapping to estimate the variance of the Kriging predictor. Besides classic one-shot statistical designs such as Latin Hypercube Sampling, it reviews sequentialized and customized designs for sensitivity analysis and optimization. It ends with topics for future research.
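The basic Kriging predictor contrasted with regression in the review can be sketched as follows. This is a stripped-down version with a plug-in constant mean and a fixed Gaussian correlation parameter (both simplifying assumptions; a full implementation estimates them by maximum likelihood):

```python
import numpy as np

def kriging_fit(X, y, theta=10.0):
    """Kriging with Gaussian correlation R_ij = exp(-theta * ||x_i - x_j||^2).

    Predictor: y(x) = mu + r(x)' R^{-1} (y - mu). Here mu is a crude plug-in
    (sample mean) and theta is fixed, for illustration only.
    """
    mu = float(np.mean(y))
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    R = np.exp(-theta * d2) + 1e-10 * np.eye(len(X))   # tiny nugget for stability
    alpha = np.linalg.solve(R, y - mu)
    return mu, alpha

def kriging_predict(X, mu, alpha, x_new, theta=10.0):
    r = np.exp(-theta * np.sum((X - x_new) ** 2, axis=1))
    return mu + float(r @ alpha)

X = np.linspace(0.0, 1.0, 8).reshape(-1, 1)   # one-shot design in one dimension
y = np.sin(6.0 * X[:, 0])
mu, alpha = kriging_fit(X, y)
```

Unlike a low-order regression metamodel, this predictor (essentially) interpolates the design points, which is the contrast the review draws between Kriging and classic linear regression metamodels.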

12.
This paper addresses the managerial and economic impacts of improving delivery performance in a serial supply chain when delivery performance is evaluated with respect to a delivery window. Building on contemporary management theories that advocate variance reduction as the critical step in improving the overall performance of a system, we model the variance of delivery time to the final customer as a function of the investment to reduce delivery variance and the costs associated with untimely delivery (expected earliness and lateness). A logarithmic investment function is used and the model solution involves the minimization of a convex–concave total cost function. A numerical example is provided to illustrate the model and the solution procedure. The model presented provides guidelines for determining the optimal level of financial investment for reducing delivery variance. The managerial implications as well as the economic aspects of delivery variance reduction in supply chain management are discussed.
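The structure of such a model can be sketched numerically. All parameter values below are invented, a normal delivery time is assumed, and a simple grid search replaces the paper's analytical solution procedure; the trade-off is a logarithmic investment cost for reducing the delivery-time standard deviation against the expected earliness/lateness penalty for a delivery window:

```python
import math

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def norm_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def expected_penalty(mu, sigma, lo, hi, ce, cl):
    """Expected earliness/lateness cost for delivery time T ~ N(mu, sigma^2)
    against a delivery window [lo, hi]."""
    zl, zh = (lo - mu) / sigma, (hi - mu) / sigma
    early = sigma * norm_pdf(zl) + (lo - mu) * norm_cdf(zl)          # E[(lo - T)+]
    late = sigma * norm_pdf(zh) - (hi - mu) * (1.0 - norm_cdf(zh))   # E[(T - hi)+]
    return ce * early + cl * late

def total_cost(sigma, b=1.0, sigma0=4.0, mu=10.0, lo=9.0, hi=11.0, ce=2.0, cl=3.0):
    investment = b * math.log(sigma0 / sigma)   # logarithmic cost of variance reduction
    return investment + expected_penalty(mu, sigma, lo, hi, ce, cl)

# grid search for the cost-minimizing delivery-time standard deviation
grid = [0.05 + 0.01 * i for i in range(400)]
best = min(grid, key=total_cost)
```

With these (hypothetical) parameters the optimum is interior: reducing variance a little pays for itself, but driving sigma toward zero costs more than the remaining penalty it removes, which is the managerial guideline the paper formalizes.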

13.
In this study the variability properties of the output of transfer lines are investigated. The asymptotic variance rate of the output of an N-station synchronous transfer line with no interstation buffers and cycle-dependent failures is analytically determined. Unlike the other studies, the analytical method presented in this study yields a closed-form expression for the asymptotic variance rate of the output. The method is based on a general result derived for irreducible recurrent Markov chains. Namely, the limiting variance of the number of visits to a state of an irreducible recurrent Markov chain is obtained from the n-step transition probability function. Thus, the same method can be used in other applications where the limiting variance of the number of visits to a state of an irreducible recurrent Markov chain is of interest. Numerical results show that the asymptotic variance rate of the output does not monotonically increase as the number of stations in the transfer line increases. The asymptotic variance rate of the output may first increase and then decrease depending on the station parameters. This property of the production rate is investigated through numerical experiments and the results are presented.
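The underlying Markov-chain result can be illustrated on the smallest possible example. For a two-state chain ("down"/"up") with transition probabilities p = P(0→1) and q = P(1→0), the limiting variance rate of the number of visits to state 1 has the known closed form pq(2−p−q)/(p+q)³; the simulation check below (with invented parameter values) compares that formula against the sample variance of the visit counts:

```python
import random

def variance_rate(p, q):
    """Closed-form asymptotic variance rate of the indicator of state 1 for a
    two-state Markov chain with P(0->1) = p, P(1->0) = q."""
    return p * q * (2.0 - p - q) / (p + q) ** 3

def simulate_counts(p, q, n, reps, seed=7):
    """Number of visits to state 1 in n steps, over independent replications."""
    rng = random.Random(seed)
    counts = []
    for _ in range(reps):
        s, c = 0, 0
        for _ in range(n):
            s = (1 if rng.random() < p else 0) if s == 0 else (0 if rng.random() < q else 1)
            c += s
        counts.append(c)
    return counts

p, q, n, reps = 0.3, 0.2, 2000, 300
counts = simulate_counts(p, q, n, reps)
mean = sum(counts) / reps
est = sum((c - mean) ** 2 for c in counts) / (reps - 1) / n   # Var(visits) / n
```

In a transfer-line reading, state 1 is "the line produces a part this cycle", so the visit count is the cumulative output and the variance rate characterizes its variability, as in the paper.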

14.
In this paper, we explore the potential application of fuzzy linear regression to developing simulation metamodels. The basic constructs of simulation metamodels involve uncertainties and ambiguities that may be better addressed through fuzzy linear regression. The solution techniques employed by fuzzy linear regression are familiar, and the generation of fuzzy outputs offers a wider range of solutions to the decision maker, thereby reducing the risk of making an incorrect economic decision. A numerical example is presented to show how a possibility distribution is used to capture the vagueness in a dependent variable for a regression metamodel.
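A heavily simplified sketch of the idea, with invented data: the full possibilistic (Tanaka-style) method solves a linear program for per-coefficient centres and spreads, whereas here the centres come from ordinary least squares and a single symmetric spread c is set just large enough that every observation is possible at level h.

```python
# invented data and possibility level
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
h = 0.5

# centres of the fuzzy coefficients from ordinary least squares
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
b0 = my - b1 * mx

# fuzzy output: triangular number with centre b0 + b1*x and spread c; at
# possibility level h its interval is centre +/- (1-h)*c, so the smallest c
# that makes every observation possible covers the largest residual
resid = [abs(y - (b0 + b1 * x)) for x, y in zip(xs, ys)]
c = max(resid) / (1.0 - h)
```

The resulting fuzzy prediction interval is exactly the "wide range of solution space" the abstract mentions: instead of a single point estimate, the decision maker sees the band of outputs the data leave possible.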

15.
A systematic sensitivity-analysis procedure for a case study in the area of air pollution modeling has been performed. Contemporary mathematical models must include a large set of chemical and photochemical reactions to be established as reliable simulation tools. The Unified Danish Eulerian Model, one of the most advanced large-scale mathematical models that adequately describes all relevant physical and chemical processes, is the focus of our investigation. Variance-based methods are among the most frequently used approaches to sensitivity analysis. To measure the influence of variation in the chemical rate constants on pollutant concentrations, the Sobol' global sensitivity indices are estimated using techniques that remain efficient for small sensitivity indices, so as to avoid a loss of accuracy. Studying the relationships between input parameters and model output, as well as the internal mechanisms, is useful for verifying and improving the model, for developing monitoring and control strategies for harmful emissions, and for reliably predicting scenario outcomes when pollutant concentration levels are exceeded. The proposed procedure can also be applied to other large-scale mathematical models.
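The Sobol' first-order index at the core of such studies can be sketched with a toy additive model whose indices are known analytically (here S1 = 1/5 and S2 = 4/5); the pick-freeze estimator, sample size, and seed are illustrative choices, not the efficient small-index techniques the paper uses:

```python
import random

random.seed(3)
f = lambda x1, x2: x1 + 2.0 * x2     # toy model; analytically S1 = 0.2, S2 = 0.8

n = 50000
A = [(random.random(), random.random()) for _ in range(n)]
B = [(random.random(), random.random()) for _ in range(n)]

yA = [f(*a) for a in A]
f0 = sum(yA) / n
var = sum(y * y for y in yA) / n - f0 * f0

def first_order(i):
    """Pick-freeze estimate of the first-order Sobol' index of input i:
    keep input i from A, redraw all other inputs from B."""
    yAB = [f(*(a[j] if j == i else b[j] for j in range(2))) for a, b in zip(A, B)]
    cov = sum(ya * yab for ya, yab in zip(yA, yAB)) / n - f0 * (sum(yAB) / n)
    return cov / var

S1, S2 = first_order(0), first_order(1)
```

The index S_i = Var(E[Y|X_i]) / Var(Y) is the fraction of output variance attributable to input i alone; in the paper the inputs are chemical rate constants and the outputs are pollutant concentrations.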

16.
Analyzing the sensitivity of model outputs to inputs is important for assessing risk and making decisions in engineering applications. For models with multiple outputs, however, sensitivity indices are difficult to interpret because existing methods often ignore the dimensions of, and the correlation between, the outputs. In this paper, a new sensitivity analysis method for multiple outputs is proposed based on vector projection and dimension normalization. Through dimension normalization, the multiple outputs are unified in a dimensionless space, eliminating the effect of their differing dimensions. After an affine coordinate system is constructed to account for the correlation of the normalized outputs, a total variance vector for the multiple outputs is composed from the individual variance of each output. The new sensitivity index of an input is then defined as the ratio of the projection, onto the total variance vector, of the vector composed of the input's variance contributions to each output, to the norm of the total variance vector; it measures the comprehensive effect of the input on the multiple outputs. We show that the Sobol' indices for a scalar output and the covariance-decomposition-based indices for multiple outputs are special cases of the proposed vector projection based indices. The mathematical properties and geometric interpretation of the proposed method are then discussed. Three numerical examples and a rotating shaft model of an aircraft wing are used to validate the proposed method and show its potential benefits.

17.
The design of computer experiments is an important step in black-box evaluation and optimization processes. When dealing with multiple black-box functions, the need often arises to construct designs for all black boxes jointly rather than individually. These so-called nested designs are particularly useful as training and test sets for fitting and validating metamodels, respectively. Furthermore, nested designs can be used to deal with linking parameters and sequential evaluations. In this paper, we introduce one-dimensional nested maximin designs. We show how to nest two designs optimally and develop a heuristic to nest three and four designs. These nested maximin designs can be downloaded from the website. Furthermore, it is proven that the loss in space-fillingness with respect to traditional maximin designs is at most 14.64% and 19.21% when nesting two and three designs, respectively.

18.
Importance analysis aims to find the contributions of the inputs to the output uncertainty. For structural models involving correlated input variables, this study decomposes the variance contribution of an individual input variable into a correlated contribution and an uncorrelated contribution. Based on point estimation, this work proposes a new algorithm for variance-based importance analysis with correlated input variables. Transforming the input variables from the correlated space to an independent space, and computing the conditional distributions in the process, ensures that the correlation information is inherited correctly. Different point estimate methods can be employed in the proposed algorithm, so the algorithm is adaptable and evolvable. The algorithm is also applicable to uncertainty systems with multiple modes, and it avoids the sampling procedure, which usually carries a heavy computational cost. The results of several examples show that the proposed algorithm is an effective tool for uncertainty analysis involving correlated inputs.

19.
In this paper we consider iterative estimation algorithms for the analysis of variance of data that may be either non-grouped or grouped, with classification intervals that differ across groups. This situation appears, for instance, when data are collected from different sources and the grouping intervals differ from one source to another. The analysis of variance is carried out by means of general linear models, whose error terms may be general. The paper opens with an initial procedure along the lines of the EM algorithm, although not necessarily identical to it, which gives rise to a simplified version that avoids the double iteration implicit in the EM and in the initial procedure. The asymptotic stochastic properties of the resulting estimates are investigated in depth and used to test ANOVA hypotheses.

20.
In off-line quality control, the settings that minimize the variance of a quality characteristic are unknown and must be determined from an estimated dual response model of mean and variance. The present paper proposes a direct measure of the efficiency of any given design-estimation procedure for variance minimization. This not only facilitates the comparison of different design-estimation procedures, but may also provide a guideline for choosing a better solution when the estimated dual response model suggests multiple solutions. Motivated by the analysis of an industrial experiment on spray painting, the paper also applies a class of link functions to model process variances in off-line quality control. For model fitting, a parametric distribution is employed in updating the variance estimates used in an iteratively weighted least squares procedure for mean estimation. In analysing combined array experiments, Engel and Huele (Technometrics, 1996; 39:365) used a log-link to model process variances and considered an iteratively weighted least squares procedure leading to the pseudo-likelihood estimates of variances discussed in Carroll and Ruppert (Transformation and Weighting in Regression, Chapman & Hall: New York). Their method is a special case of the approach considered in this paper. For the spray paint data, the log-link turns out not to be satisfactory, and the class of link functions considered here substantially improves the fit to process variances. This conclusion is reached with a suggested method of comparing 'empirical variances' with the 'theoretical variances' based on the assumed model.
