Similar Literature (20 related records found)
1.
Yves Dallery, Queueing Systems, 1994, 15(1-4): 199-209
Failures of machines have a significant effect on the behavior of manufacturing systems. As a result, it is important to model this phenomenon. Many queueing models of manufacturing systems do incorporate the unreliability of the machines. Most models assume that the times to failure and the times to repair of each machine are exponentially distributed (or geometrically distributed in the case of discrete-time models). However, exponential distributions do not always accurately represent actual distributions encountered in real manufacturing systems. In this paper, we propose to model failure and repair time distributions by generalized exponential (GE) distributions (or generalized geometric distributions in the case of a discrete-time model). The GE distribution can be used to approximate distributions with any coefficient of variation greater than one. The main contribution of the paper is to show that queueing models in which failure and repair times are represented by GE distributions can be analyzed with the same complexity as if these distributions were exponential. Indeed, we show that failures and repair times represented by GE distributions can (under certain assumptions) be equivalently represented by exponential distributions. This work was performed while the author was visiting the Laboratory for Manufacturing and Productivity, Massachusetts Institute of Technology, Cambridge, MA 02139, USA.
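The GE distribution mentioned in this abstract is commonly written as a two-point mixture of an atom at zero and an exponential tail, which is what keeps GE-based queueing models as tractable as exponential ones. The following NumPy sketch only illustrates that common parameterization, matching a target mean and a squared coefficient of variation of at least one; the function name and interface are ours, not the paper's.

```python
import numpy as np

def sample_ge(mean, scv, size, rng=None):
    """Draw samples from a generalized exponential (GE) distribution.

    The GE distribution is a two-point mixture: with probability 1 - tau the
    value is 0, and with probability tau it is exponential with mean
    mean / tau, where tau = 2 / (scv + 1).  This matches the requested mean
    and any squared coefficient of variation scv >= 1.
    """
    if scv < 1:
        raise ValueError("GE distributions require a squared CV of at least 1")
    rng = np.random.default_rng() if rng is None else rng
    tau = 2.0 / (scv + 1.0)
    exponential_part = rng.exponential(scale=mean / tau, size=size)
    is_zero = rng.random(size) >= tau
    return np.where(is_zero, 0.0, exponential_part)

# Quick check: target mean 10 and squared CV 4 (i.e. CV = 2).
x = sample_ge(mean=10.0, scv=4.0, size=200_000)
print(x.mean(), x.var() / x.mean() ** 2)   # ≈ 10 and ≈ 4
```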

2.
A new methodology for performance analysis of flexible manufacturing systems (FMSs) with priority scheduling is presented. The analytic model developed extends the mean value analysis of closed networks of queues with multiple product types, various non-preemptive priority service disciplines, and parallel machine stations. Performance measures derived include the expected throughput per product and per station, utilization of machines and transporters, queuing times and queue length measures for various configurations. Extensive numerical calculations have shown that the algorithm used for solving the problem converges rapidly and retains numerical stability for large models. The paper also illustrates the application of the model to a system with a mixture of FCFS and HOL disciplines, which gives insights into various priority assignment policies in FMSs. Special attention was given to the problem of scheduling the robot carriers (transporters).
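The extension described above builds on classical mean value analysis (MVA). As background only, here is a minimal sketch of the exact single-class MVA recursion for a closed network of FCFS stations; the multi-class, priority, and parallel-machine extensions of the paper are not reproduced, and the function name and example figures are illustrative.

```python
def exact_mva(service_demands, population):
    """Exact mean value analysis for a single-class closed network of
    FCFS queueing stations (delay stations omitted for brevity)."""
    queue_lengths = [0.0] * len(service_demands)
    throughput = 0.0
    for n in range(1, population + 1):
        # Residence time at each station: service demand inflated by the
        # customers already queued there (arrival theorem).
        residence = [d * (1.0 + q) for d, q in zip(service_demands, queue_lengths)]
        throughput = n / sum(residence)
        queue_lengths = [throughput * r for r in residence]
    return throughput, queue_lengths

# Three stations with per-visit demands 0.5, 0.3 and 0.2 time units, 10 circulating jobs.
X, Q = exact_mva([0.5, 0.3, 0.2], population=10)
print(f"throughput ≈ {X:.3f} jobs per unit time, queue lengths ≈ {[round(q, 2) for q in Q]}")
```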

3.
This paper models and analyzes the throughput of a two-stage manufacturing system with multiple independent unreliable machines at each stage and one finite-sized buffer between the stages. The machines follow exponential operation, failure, and repair processes. Most of the literature uses binary random variables to model unreliable machines in transfer lines and other production lines. This paper first illustrates the importance of using more than two states to model parallel unreliable machines because of their independent and asynchronous operations in the parallel system. The system balance equations are then formulated based on a set of new notations for vector manipulation, and are transformed into a matrix form fitting the properties of the Quasi-Birth–Death (QBD) process. The Matrix-Analytic (MA) method for solving generic QBD processes is used to calculate the system state probabilities and throughput. Numerical cases demonstrate that the solution method is fast and accurate in analyzing parallel manufacturing systems, and thus prove the applicability of the new model and the effectiveness of the MA-based method. Such multi-state models and their solution techniques can be used as a building block for analyzing larger, more complex manufacturing systems.
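For context on the matrix-analytic step mentioned above: a level-independent QBD is described by block matrices A0 (one level up), A1 (within a level) and A2 (one level down), and its stationary distribution is driven by the minimal nonnegative solution R of A0 + R*A1 + R^2*A2 = 0. The sketch below shows the simplest functional iteration for R; the paper's specific block structure is not reproduced, and the 1x1 sanity check (an M/M/1 queue, where R must equal the utilization) is our own illustration.

```python
import numpy as np

def qbd_rate_matrix(A0, A1, A2, tol=1e-12, max_iter=10_000):
    """Solve A0 + R*A1 + R^2*A2 = 0 for the minimal nonnegative R by
    functional iteration, the simplest matrix-analytic scheme."""
    R = np.zeros_like(A0, dtype=float)
    A1_inv = np.linalg.inv(A1)
    for _ in range(max_iter):
        R_next = -(A0 + R @ R @ A2) @ A1_inv
        if np.max(np.abs(R_next - R)) < tol:
            return R_next
        R = R_next
    raise RuntimeError("R iteration did not converge")

# Sanity check on a 1x1 QBD (an M/M/1 queue): R must equal rho = lam / mu.
lam, mu = 0.7, 1.0
A0 = np.array([[lam]])           # transitions one level up
A1 = np.array([[-(lam + mu)]])   # transitions within a level
A2 = np.array([[mu]])            # transitions one level down
print(qbd_rate_matrix(A0, A1, A2))   # ≈ [[0.7]]
```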

4.
In this article, we develop an imperfect economic manufacturing quantity (EMQ) model for an unreliable production system subject to process deterioration, machine breakdown and repair, and buffer stock. The basic model is developed under general process shift, machine breakdown and repair time distributions. We suggest a computational algorithm for determining the optimal safety stock and production run time which minimize the expected cost per unit time in the steady state. For a numerical example, we illustrate the outcome of the proposed model and perform a sensitivity analysis with respect to the model parameters that have a direct influence on the optimal decisions.

5.
This paper presents a simulation and an analytical modeling of the machine interference problem in manufacturing cells. Each machine experiences two types of stoppage which are to be served by a robot. Several combinations of the distributions of the stoppages, service distributions, and service disciplines are studied. We determine the optimal number of machines assigned to the robot under different operating conditions such that the expected total cost of the manufacturing cell is minimized. This research is based on Mr. Norton's M.S. thesis, which was conducted at Rutgers.

6.
The von Mises-Fisher distribution is widely used for modelling directional data. In this paper we propose goodness-of-fit methods for a concentrated von Mises-Fisher distribution and we analyse by simulation some questions concerning the application of these tests. We analyse the empirical power of the Kolmogorov-Smirnov test for several dimensions of the sphere, supposing as alternative hypothesis a mixture of two von Mises-Fisher distributions with known parameters. We also compare the empirical power of the Kolmogorov-Smirnov test with Rao's score test for data on the sphere, supposing as alternative hypothesis a mixture of two Fisher distributions with unknown parameters replaced by their maximum likelihood estimates, or a 5-parameter Fisher-Bingham distribution. Finally, we give an example with real spherical data.

7.
In this paper we present a robust conjugate duality theory for convex programming problems in the face of data uncertainty within the framework of robust optimization, extending the powerful conjugate duality technique. We first establish robust strong duality between an uncertain primal parameterized convex programming model problem and its uncertain conjugate dual by proving strong duality between the deterministic robust counterpart of the primal model and the optimistic counterpart of its dual problem under a regularity condition. This regularity condition is not only sufficient for robust duality but also necessary for it whenever robust duality holds for every linear perturbation of the objective function of the primal model problem. More importantly, we show that robust strong duality always holds for partially finite convex programming problems under scenario data uncertainty and that the optimistic counterpart of the dual is a tractable finite dimensional problem. As an application, we also derive a robust conjugate duality theorem for support vector machines which are a class of important convex optimization models for classifying two labelled data sets. The support vector machine has emerged as a powerful modelling tool for machine learning problems of data classification that arise in many areas of application in information and computer sciences.

8.
Two-parameter gamma distributions are widely used in reliability theory, lifetime data analysis, financial statistics, and other areas. Finite mixtures of gamma distributions are their natural extensions, and they are particularly useful when the population is suspected of heterogeneity. These distributions are successfully employed in various applications, but many researchers falsely believe that the maximum likelihood estimator of the mixing distribution is consistent. Similarly to finite mixtures of normal distributions, the likelihood function under finite gamma mixtures is unbounded. Because of this, each observed value leads to a global maximum that is irrelevant to the true distribution. We apply a seemingly negligible penalty to the likelihood according to the shape parameters in the fitted model. We show that this penalty restores the consistency of the likelihood-based estimator of the mixing distribution under finite gamma mixture models. We present simulation results to validate the consistency conclusion, and we give an example to illustrate the key points.

9.
Mixture of Experts (MoE) regression models are widely studied in statistics and machine learning for modeling heterogeneity in data for regression, clustering and classification. The Laplace distribution is one of the most important statistical tools for analyzing heavy-tailed data. Laplace Mixture of Linear Experts (LMoLE) regression models are based on the Laplace distribution and are therefore more robust. Analogously to modelling the variance parameter in a homogeneous population, in this paper we propose and study a novel class of models, heteroscedastic Laplace mixture of experts regression models, to analyze heteroscedastic data coming from a heterogeneous population. The issues of maximum likelihood estimation are addressed. In particular, a Minorization-Maximization (MM) algorithm for estimating the regression parameters is developed. Properties of the estimators of the regression coefficients are evaluated through Monte Carlo simulations. Results from the analysis of two real data sets are presented.

10.
This paper describes a study undertaken to develop a model for the replacement of a particular type of machine. The dominant operating costs are identified, and existing replacement models are reviewed. One of the most important factors is the cost of production stoppages which can sometimes result from the breakdown of these machines. In order to predict the effects of this in terms of the machines' age, a simulation model is developed. The results from the replacement model are investigated in terms of their sensitivity to the variability in the estimates of the parameters required by the model. In particular, some interesting results relating the method used for calculating the resale values and the optimal replacement interval are presented.

11.
A mixture approach to clustering is an important technique in cluster analysis. A mixture of multivariate multinomial distributions is usually used to analyze categorical data with a latent class model. Parameter estimation is an important step for a mixture distribution. Described here are four approaches to estimating the parameters of a mixture of multivariate multinomial distributions. The first approach is an extended maximum likelihood (ML) method. The second approach is based on the well-known expectation maximization (EM) algorithm. The third approach is the classification maximum likelihood (CML) algorithm. In this paper, we propose a new approach using the so-called fuzzy class model and then create the fuzzy classification maximum likelihood (FCML) approach for categorical data. The accuracy, robustness and effectiveness of these four types of algorithms for estimating the parameters of multivariate binomial mixtures are compared using real empirical data and samples drawn from multivariate binomial mixtures of two classes. The results show that the proposed FCML algorithm presents better accuracy, robustness and effectiveness. Overall, the FCML algorithm is superior to the ML, EM and CML algorithms. Thus, we recommend FCML as another good tool for estimating the parameters of multivariate multinomial mixture models.
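As a reference point for the EM approach listed among the four estimators, the following NumPy sketch implements the standard EM updates for a mixture of multivariate multinomial (latent class) distributions over integer-coded categorical variables. The ML, CML and proposed FCML variants are not reproduced, and the function name, smoothing constant and toy data are illustrative assumptions.

```python
import numpy as np

def latent_class_em(X, n_classes, n_categories, n_iter=200, seed=0):
    """Standard EM for a mixture of multivariate multinomial distributions
    (latent class model): within each class the J categorical variables are
    independent.  X is an (n, J) integer array with values in
    {0, ..., n_categories - 1}."""
    rng = np.random.default_rng(seed)
    n, J = X.shape
    K = n_classes
    pi = np.full(K, 1.0 / K)                                    # mixing weights
    theta = rng.dirichlet(np.ones(n_categories), size=(K, J))   # (K, J, C)

    for _ in range(n_iter):
        # E step: posterior class responsibilities for every observation.
        log_resp = np.zeros((n, K)) + np.log(pi)
        for j in range(J):
            log_resp += np.log(theta[:, j, X[:, j]]).T          # (n, K)
        log_resp -= log_resp.max(axis=1, keepdims=True)
        resp = np.exp(log_resp)
        resp /= resp.sum(axis=1, keepdims=True)

        # M step: closed-form updates of weights and category probabilities.
        pi = resp.mean(axis=0)
        for j in range(J):
            counts = np.full((K, n_categories), 1e-9)            # tiny smoothing
            for c in range(n_categories):
                counts[:, c] += resp[X[:, j] == c].sum(axis=0)
            theta[:, j] = counts / counts.sum(axis=1, keepdims=True)
    return pi, theta

# Toy run on data drawn from two latent classes over 5 binary items.
rng = np.random.default_rng(1)
true_theta = np.array([[0.9] * 5, [0.2] * 5])
z = rng.integers(0, 2, size=1000)
X = (rng.random((1000, 5)) < true_theta[z]).astype(int)
pi_hat, theta_hat = latent_class_em(X, n_classes=2, n_categories=2)
print(np.round(pi_hat, 2), np.round(theta_hat[:, :, 1], 2))
```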

12.
This paper investigates a distributionally robust scheduling problem on identical parallel machines, where job processing times are stochastic without any exact distributional form. Based on a distributional set specified by the support and estimated moments information, we present a min-max distributionally robust model, which minimizes the worst-case expected total flow time over all probability distributions in this set. Our model does not require the exact probability distributions that are the basis for many stochastic programming models, and it utilizes more information than interval-based robust optimization models. Although this problem originates from the manufacturing environment, it can be applied to many other fields when the machines and jobs are endowed with different meanings. By optimizing the inner maximization subproblem, the min-max formulation is reduced to an integer second-order cone program. We propose an exact algorithm to solve this problem by exploring all the solutions that satisfy the necessary optimality conditions. Computational experiments demonstrate the high efficiency of this algorithm, since problem instances with 100 jobs are optimized in a few seconds. In addition, simulation results convincingly show that the proposed distributionally robust model can hedge against the bias of estimated moments and enhance the robustness of production systems.

13.
The paper presents a generalized economic manufacturing quantity model for an unreliable production system in which the production facility may shift from an ‘in-control’ state to an ‘out-of-control’ state at any random time (when it starts producing defective items) and may ultimately break down afterwards. If a machine breakdown occurs during a production run, then corrective repair is done; otherwise, preventive repair is performed at the end of the production run to enhance the system reliability. The proposed model is formulated assuming that the time to machine breakdown, corrective and preventive repair times follow arbitrary probability distributions. However, the criteria for the existence and uniqueness of the optimal production time are derived under general breakdown and uniform repair time (corrective and preventive) distributions. The optimal production run time is determined numerically and the joint effect of process deterioration, machine breakdowns and repairs (corrective and preventive) on the optimal decisions is investigated for a numerical example.

14.
The analysis of manufacturing systems with finite capacity and general service time distributions consists of two steps: the distributions first have to be transformed into tractable phase-type distributions, and then the modified system can be analytically modelled. In this paper, we propose a new alternative for building tractable phase-type distributions, and study its effects on the global modelling process. Called “probability masses fitting” (PMF), the approach is quite simple: the probability masses on regular intervals are computed and aggregated on a single value in the corresponding interval, leading to a discrete distribution. PMF shows some interesting properties: it is bounding, monotonic and refinable, it approximates distributions with finite support, and it conserves the shape of the distribution. With the resulting discrete distributions, the evolution of the system is then exactly modelled by a Markov chain. Here, we focus on flow lines and show that the method allows us to compute upper and lower bounds on the throughput as well as good approximations of the cycle time distributions. Finally, the global modelling method is shown, by numerical experiments, to compute accurate estimations of the throughput and of various performance measures, reaching accuracy levels of a few tenths of a percent.
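The PMF step itself is easy to illustrate. The sketch below discretizes a continuous service-time distribution by computing the probability mass of each regular interval and placing it on a single point of that interval. Picking the right or left edge is what gives the upper- or lower-bounding variants described in the abstract, although lumping the tail mass onto the last point makes this particular sketch only approximate; the lognormal example and the interval settings are ours, not the paper's.

```python
import numpy as np
from scipy import stats

def probability_mass_fit(dist, step, n_intervals):
    """Discretize a continuous distribution by 'probability masses fitting':
    the mass of each interval [k*step, (k+1)*step) is placed on a single
    representative point of that interval."""
    edges = np.arange(n_intervals + 1) * step
    masses = np.diff(dist.cdf(edges))
    # Lump the tail mass beyond the last edge onto the final point.
    masses[-1] += dist.sf(edges[-1])
    # Representative points: here the right edge of each interval (the left
    # edge would be the other natural choice, bounding from below).
    points = edges[1:]
    return points, masses

# Example: discretize a lognormal service time with step 0.5 and 40 intervals.
service = stats.lognorm(s=0.8, scale=1.0)
pts, pmf = probability_mass_fit(service, step=0.5, n_intervals=40)
print("discretized mean ≈", float(np.dot(pts, pmf)), " true mean ≈", service.mean())
```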

15.
Fork/join stations are commonly used to model the synchronization constraints in queuing models of computer networks, fabrication/assembly systems and material control strategies for manufacturing systems. This paper presents an exact analysis of a fork/join station in a closed queuing network with inputs from servers with two-phase Coxian service distributions, which models a wide range of variability in the input processes. The underlying queue length and departure processes are analyzed to determine performance measures such as throughput, distributions of the queue length and inter-departure times from the fork/join station. The results show that, for certain parameter settings, variability in the arrival processes has a significant impact on system performance. The model is also used to study the sensitivity of performance measures such as throughput, mean queue lengths, and variability of inter-departure times for a wide range of input parameters and network populations.
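For readers unfamiliar with the input process used here, the sketch below samples from a two-phase Coxian distribution: an exponential phase with rate mu1 followed, with probability a, by a second exponential phase with rate mu2. The parameter values in the check are arbitrary, and the paper's exact analysis of the fork/join station is of course not replaced by this simulation.

```python
import numpy as np

def sample_coxian2(mu1, mu2, a, size, rng=None):
    """Sample service times from a two-phase Coxian distribution: an
    exponential phase of rate mu1, then with probability a a second
    exponential phase of rate mu2."""
    rng = np.random.default_rng() if rng is None else rng
    phase1 = rng.exponential(1.0 / mu1, size)
    phase2 = rng.exponential(1.0 / mu2, size)
    go_second = rng.random(size) < a
    return phase1 + go_second * phase2

# Mean of a Coxian-2 is 1/mu1 + a/mu2; compare with a simulated sample.
s = sample_coxian2(mu1=2.0, mu2=0.5, a=0.3, size=200_000)
print(s.mean(), 1 / 2.0 + 0.3 / 0.5)   # both ≈ 1.1
```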

16.
The paper proposes a decomposition method for evaluating the performance of transfer lines where machines can fail in multiple modes and can be repaired with non-exponential times. Indeed, while times to machine failure can often be modeled using exponential distributions with acceptable accuracy, times to repair are very rarely observed to be exponentially distributed in actual systems. This feature limits the applicability of existing approximate analytical methods to real production lines. In this paper, the discrete acyclic phase-type distribution is used to model the repair process for each failure mode of the machines composing the system. The exact analysis of the two-machine system is used as a building block for the decomposition method, which is proposed to study multi-stage lines. Numerical results show the high accuracy of the developed method in estimating the average throughput and buffer levels.

17.
Both technology and market demands within the high-tech electronics manufacturing industry change rapidly. Accurate and efficient estimation of the cycle-time (CT) distribution remains a critical driver of on-time delivery and associated customer satisfaction metrics in these complex manufacturing systems. Simulation models are often used to emulate these systems in order to estimate parameters of the CT distribution. However, the execution time of such simulation models can be excessively long, limiting the number of simulation runs that can be executed for quantifying the impact of potential future operational changes. One solution is simulation metamodeling, which builds a closed-form mathematical expression to approximate the input–output relationship implied by the simulation model, based on simulation experiments run at selected design points in advance. Metamodels can be easily evaluated in a spreadsheet environment “on demand” to answer what-if questions without needing to run lengthy simulations. The majority of previous simulation metamodeling approaches have focused on estimating mean CT as a function of a single input variable (i.e., throughput). In this paper, we demonstrate the feasibility of a quantile-regression-based metamodeling approach. This method allows estimation of CT quantiles as a function of multiple input variables (e.g., throughput, product mix, and various distributional parameters of time-between-failures, repair time, setup time, loading and unloading times). Empirical results are provided to demonstrate the efficacy of the approach in a realistic simulation model representative of a semiconductor manufacturing system.
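A hedged sketch of the general idea, not of the paper's model: fit quantile-regression metamodels of cycle time against a couple of input variables using synthetic stand-in data in place of real simulation output, then evaluate them at a new operating point without re-running the simulator. The feature basis, the data-generating formula and the quantile levels below are all illustrative assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Stand-in for simulation output: cycle time grows nonlinearly (and gets
# noisier) as throughput approaches capacity; product mix shifts the level.
throughput = rng.uniform(0.5, 0.95, 400)
mix = rng.uniform(0.0, 1.0, 400)
cycle_time = (5.0 / (1.0 - throughput) + 2.0 * mix
              + rng.exponential(1.0 / (1.0 - throughput)))

# Simple metamodel features; a real study would use a richer basis.
X = sm.add_constant(np.column_stack([throughput, 1.0 / (1.0 - throughput), mix]))

# One quantile-regression metamodel per cycle-time quantile of interest.
metamodels = {q: sm.QuantReg(cycle_time, X).fit(q=q) for q in (0.5, 0.9, 0.95)}

# "What-if" evaluation at a new operating point (throughput 0.9, mix 0.5).
x_new = sm.add_constant(np.column_stack([[0.9], [1.0 / (1.0 - 0.9)], [0.5]]),
                        has_constant='add')
for q, res in metamodels.items():
    print(f"{int(q * 100)}th percentile CT ≈ {float(res.predict(x_new)[0]):.2f}")
```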

18.
A GPSS/H model is presented for a hypothetical flexible manufacturing system. The FMS consists of six machines of three machine types, manufactures three types of parts, and uses automatic guided vehicles (AGVs) to transport in-process parts between appropriate machines and wait spaces in the system. Three logical modules have been designed for the model, with copies of these modules then being appropriately distributed and interfaced throughout the model and tailored to achieve an overall representation of the specific FMS. The same technique can be used by others to build analogous or extended GPSS/H models for other specific FMSs in which AGVs are used as transporters. Simulations can then be performed with such models to research FMS design and control alternatives.

19.
To address the limitations of existing supplier classification methods when applied to suppliers in the high-end equipment manufacturing industry, this paper builds a supplier classification indicator system for the industry from the perspective of mutual dependence and proposes a supplier classification model based on an improved support vector machine. Because misclassification costs differ across suppliers, the model designs a cost-sensitive support vector machine classifier, optimizes the classifier's parameters with a particle swarm optimization algorithm, and combines the results of several optimized binary classifiers through a probabilistic output method to achieve multi-class classification. Experimental results show that the model improves the classification performance of existing methods, reduces the overall misclassification cost, and effectively identifies the suppliers that have a major impact on high-end equipment manufacturers, providing a basis for implementing supplier classification management in such enterprises.
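The cost-sensitive idea can be sketched with off-the-shelf tools, although the paper's improved SVM and its particle swarm parameter optimization are not reproduced here. The scikit-learn sketch below uses synthetic stand-in data, expresses unequal misclassification costs through per-class weights, and obtains probabilistic outputs from the internally combined binary classifiers; all class counts, weights and hyperparameters are illustrative assumptions.

```python
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.datasets import make_classification

# Synthetic stand-in for supplier indicator data: three supplier classes with
# imbalanced sizes (class 2 plays the "high-impact" suppliers, costliest to miss).
X, y = make_classification(n_samples=600, n_features=8, n_informative=6,
                           n_classes=3, n_clusters_per_class=1,
                           weights=[0.5, 0.35, 0.15], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Unequal misclassification costs expressed as per-class weights; scikit-learn
# combines the underlying binary SVMs and can return probability estimates.
clf = SVC(kernel='rbf', C=10.0, gamma='scale',
          class_weight={0: 1.0, 1: 2.0, 2: 5.0}, probability=True)
clf.fit(X_train, y_train)

proba = clf.predict_proba(X_test)        # probabilistic output per class
print("test accuracy:", clf.score(X_test, y_test))
```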

20.
Depending on the problem structure and routing strategies, a machine location problem plays an important role in controlling the material flow of work-in-process in a discrete-product manufacturing environment. In this paper we investigate the effect of material flow and workload on the performance of heuristics for solving an important design problem for job routing and material flow in a manufacturing system. In this research we first develop a model for the workload, or traffic intensity, between machines on a shop floor and then identify different structures of the problems, especially of the data. This measure is then used to evaluate the effect of workload on the efficiency of the heuristics used to solve machine location problems. Some concluding remarks are made on the effect of the workload, or the traffic intensity of materials within the machine cell, on the performance of some known heuristics. Conclusions are also drawn on performance measures such as makespan, transporter utilization and machine utilization, depending on the problem and data structures.
