Retrieved 20 similar documents (search time: 46 ms)
1.
In statistical process control (SPC), when dealing with a quality characteristic x that is a variable, it is usually necessary to monitor both the mean and the variability. This article proposes an optimization algorithm (called the holistic algorithm) to design CUSUM charts for this purpose. It facilitates the determination of the charting parameters of the CUSUM charts and considerably increases their overall detection effectiveness. A single CUSUM chart (called the ABS CUSUM chart) has been developed by the holistic algorithm and fully investigated. This chart is able to detect two-sided mean shifts and increasing variance shifts by inspecting the absolute value of the sample mean. Performance studies show that the overall performance of the ABS CUSUM chart is nearly as good as that of an optimal 3-CUSUM scheme (a scheme incorporating three individual CUSUM charts). However, since the ABS CUSUM chart is easier to design and implement, it may be more suitable for many SPC applications in which both the mean and the variance of a variable have to be monitored.
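The ABS CUSUM idea can be sketched as a one-sided CUSUM applied to the absolute standardized sample mean. The sketch below is illustrative only: the reference value k and decision interval h are placeholders, not the values the holistic algorithm would produce.

```python
import math

def abs_cusum(sample_means, sigma, n, k=1.5, h=4.0):
    """Return the index of the first out-of-control signal, or None.

    sample_means: sequence of subgroup means (in-control mean assumed 0)
    sigma: in-control process standard deviation
    n: subgroup size
    k, h: reference value and decision interval (illustrative values)
    """
    c = 0.0
    for t, xbar in enumerate(sample_means):
        z = abs(xbar) * math.sqrt(n) / sigma   # |standardized sample mean|
        c = max(0.0, c + z - k)                # one-sided CUSUM on |z|
        if c > h:
            return t
    return None

# A two-sided mean shift and an increased variance both inflate |z|,
# so a single upward CUSUM on |z| catches either kind of change.
print(abs_cusum([0.0] * 10 + [3.0] * 5, sigma=1.0, n=1))  # prints 12
```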
2.
In recent years, statistical process control (SPC) has been widely used to monitor the performance of clinical practitioners, such as surgeons and general practitioners. In this paper, two risk-adjusted geometric control charts, namely the cumulative sum (CUSUM) and the weighted likelihood ratio test (WLRT) charts, are proposed to monitor surgical performance in Phase II. The performance of the proposed control charts is evaluated and compared via simulation experiments for different shift values in the parameters of a risk-adjusted logistic regression model, using the average run length (ARL) criterion. The results show that both methods work well in the sense that they can effectively detect shifts in the process parameters.
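Risk-adjusted CUSUM charts of this kind typically score each patient with a log-likelihood ratio comparing the in-control odds against an alternative odds ratio R. A minimal sketch following that standard construction (chart constants here are illustrative, not the paper's):

```python
import math

def risk_adjusted_cusum(risks, outcomes, R=2.0, h=4.5):
    """Return the index of the first signal, or None.

    risks: predicted adverse-outcome probabilities p_t from the
           risk-adjusted logistic regression model
    outcomes: observed 0/1 outcomes y_t
    R: odds ratio under the out-of-control hypothesis
    h: decision interval (illustrative)
    """
    s = 0.0
    for t, (p, y) in enumerate(zip(risks, outcomes)):
        # log-likelihood ratio score for one patient
        w = y * math.log(R) - math.log(1 - p + R * p)
        s = max(0.0, s + w)
        if s > h:
            return t
    return None

# Low-risk patients (p = 0.1) who all suffer the adverse outcome push the
# chart up quickly; all-good outcomes keep it at zero.
print(risk_adjusted_cusum([0.1] * 20, [1] * 20))  # prints 7
```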
3.
4.
Membrane algorithms (MAs), which inherit from P systems, constitute a new parallel and distributed framework for approximate computation. In this paper, a membrane algorithm is proposed with the improvement that the involved parameters can be chosen adaptively. In the algorithm, some membranes can evolve dynamically during the computing process to specify the values of the requested parameters. The new algorithm is tested on a well-known combinatorial optimization problem, the travelling salesman problem. The empirical evidence suggests that the proposed approach is efficient and reliable on 11 benchmark instances, obtaining the best known solutions in eight of them. It also outperforms a genetic algorithm, a simulated annealing algorithm, a neural network, and a fine-tuned non-adaptive membrane algorithm. In practice, when designing an airline network that minimizes the total routing cost on the CAB data with twenty-five US cities, the algorithm quickly obtains high-quality solutions.
5.
6.
In this paper, we propose a realistic mathematical model taking into account the mutual interference among the interacting populations. This model describes the control (vaccination) function as a function of the number of infective individuals, which is an improvement over existing susceptible–infective epidemic models. Regarding the growth of the epidemic as a nonlinear phenomenon, we have developed a neural network architecture to estimate the vital parameters associated with this model. This architecture is based on a recently developed class of neural networks known as co-operative and supportive neural networks. Applying this architecture to the present study involves preprocessing of the input data, which renders an efficient estimation of the rate of spread of the epidemic. It is observed that the proposed neural network outperforms a simple feed-forward neural network and polynomial regression.
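The model's core idea, a vaccination rate that responds to the number of infectives, can be sketched with a simple susceptible-infective system. The linear control law v(I) = v0 + alpha*I and all parameter values below are our assumptions for illustration, not necessarily the paper's exact equations.

```python
def simulate_si(beta=0.0005, v0=0.01, alpha=0.002,
                S0=990.0, I0=10.0, dt=0.1, steps=1000):
    """Forward-Euler integration of
    dS/dt = -beta*S*I - v(I)*S,  dI/dt = beta*S*I,  v(I) = v0 + alpha*I."""
    S, I = S0, I0
    for _ in range(steps):
        v = v0 + alpha * I            # vaccination effort grows with infectives
        dS = -beta * S * I - v * S
        dI = beta * S * I
        S, I = S + dS * dt, I + dI * dt
    return S, I

S, I = simulate_si()
print(round(S, 1), round(I, 1))
```

Because vaccinated individuals leave S without entering I, a state-dependent v(I) caps the final epidemic size more aggressively than a constant vaccination rate.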
7.
8.
Sustainable development and sustainability assessment have been of great interest to both academe and practitioners in the past decades. In this study, we review the literature on data envelopment analysis (DEA) applications in sustainability using citation-based approaches. A directional network is constructed based on citation relationships among DEA papers published in journals indexed by the Web of Science database from 1996 to March 2016. We first draw the citation chronological graph to present a complete picture of the literature development trajectory since 1996. Then we identify the local main DEA development paths in sustainability research by assigning an importance index, namely the search path count (SPC), to each link in the citation network. The local main path suggests that the current key route of DEA applications in sustainability focuses on environmental sustainability. Through the Kamada–Kawai layout algorithm, we find four research clusters in the literature: corporate sustainability assessment, regional sustainability assessment, sustainability composite indicator construction, and sustainability performance analysis. For each cluster, we further identify the key articles based on the citation network and local citation scores, demonstrate the developmental trajectory of the literature, and suggest future research directions.
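The search path count (SPC) index mentioned above has a compact definition: the SPC of an edge is the number of source-to-sink paths in the citation DAG that traverse it. A toy computation on an invented five-paper network:

```python
from functools import lru_cache

# Tiny citation DAG; edges follow knowledge flow from early papers
# (sources) to recent ones (sinks). Papers and links are invented.
edges = {"A": ["C", "D"], "B": ["C"], "C": ["E"], "D": ["E"], "E": []}
preds = {n: [u for u in edges if n in edges[u]] for n in edges}

@lru_cache(maxsize=None)
def n_from_sources(v):          # number of source-to-v paths
    return 1 if not preds[v] else sum(n_from_sources(u) for u in preds[v])

@lru_cache(maxsize=None)
def n_to_sinks(v):              # number of v-to-sink paths
    return 1 if not edges[v] else sum(n_to_sinks(w) for w in edges[v])

# SPC(u->w) = (paths reaching u from any source) * (paths from w to any
# sink); the local main path greedily follows the largest-SPC edge.
spc = {(u, w): n_from_sources(u) * n_to_sinks(w)
       for u in edges for w in edges[u]}
print(spc)
```

Here the edge C→E carries both source-to-sink routes (SPC = 2), so a local main-path search would walk through it first.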
9.
10.
Indicator space configuration for early warning of violent political conflicts by genetic algorithms
Recognition of preconflict situations has a powerful potential for early warning of violent political conflicts. This paper focuses on the design and application of artificial neural networks as classifiers of preconflict situations. Achieving a desired level of performance of the neural network relies on the appropriate construction of the recognition space (selection of indicators) and the choice of network architecture. A fast and effective method for the design of reliable neural recognition systems is described. It is based on genetic algorithm techniques and optimizes both the configuration of the input space and the network parameters. The implementation of the methodology increases the performance of the classifier in terms of accuracy, generalization capacity, and computational and data requirements.
This revised version was published online in June 2006 with corrections to the Cover Date.
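The genetic search over indicator subsets can be sketched with a bitmask chromosome per candidate input configuration. The fitness function below is invented (a stand-in for the classifier's validation accuracy minus a complexity penalty), and the "informative" indicator set is assumed ground truth for the toy problem:

```python
import random

N_INDICATORS = 10
INFORMATIVE = {0, 3, 7}   # assumed ground truth for this toy problem

def fitness(mask):
    # accuracy proxy: reward informative indicators, penalize input-space size
    hits = sum(1 for i in INFORMATIVE if mask[i])
    return hits - 0.1 * sum(mask)

def evolve(pop_size=40, generations=60, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(N_INDICATORS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]        # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, N_INDICATORS)
            child = a[:cut] + b[cut:]           # one-point crossover
            if rng.random() < 0.2:              # bit-flip mutation
                j = rng.randrange(N_INDICATORS)
                child[j] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(best)
```

In the paper's full method the same chromosome would also encode network hyperparameters, so selection pressure shapes the architecture and the input space together.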
11.
Stochastic Analysis and Applications, 2013, 31(6): 1025–1057
Variable sampling interval (VSI) control charts vary the sampling rate adaptively as a function of the data coming from the process in order to reduce the detection delay of process changes. Zero-time performance refers to the detection delay of a process change that is present during the onset of the chart at time zero. Steady-state performance refers to the detection delay of a process change that occurs after the chart has been operating for some time. The zero-time performance of a VSI control chart can differ considerably from the chart's steady-state performance, which is generally more important than the zero-time performance. We develop an efficient quadratic-programming algorithm for the construction and investigation of steady-state-optimal sampling policies for various VSI charts. We show that a steady-state-optimal VSI scheme is fundamentally different from the respective zero-time-optimal VSI scheme, and recommend VSI policies based on two sampling intervals for the various types of control charts considered.
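The recommended two-interval policy can be illustrated with a toy rule; the warning limit and interval lengths below are placeholders (the paper's quadratic program chooses them optimally):

```python
def next_interval(z, warning=1.0, d_short=0.25, d_long=1.75):
    """Two-interval VSI rule for a standardized chart statistic z:
    sample again sooner when z falls in the warning region."""
    return d_short if abs(z) > warning else d_long

# Near the centre line the process looks fine, so wait longer; a value in
# the warning region triggers a quick follow-up sample.
print(next_interval(0.3), next_interval(1.4))
```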
12.
Various process models for discrete manufacturing systems (the parts industry) can be treated as bounded discrete-space Markov chains, completely characterized by the original in-control state and a transition matrix for shifts to an out-of-control state. The present work extends these models by using a continuous-state Markov chain incorporating non-random corrective actions. These actions are realized according to the statistical process control (SPC) technique and substantially affect the model. The developed stochastic model yields a Laplace distribution of the process mean. Real-data tests confirm its applicability to the parts industry and show that the distribution parameter is controlled mainly by the SPC sample size.
13.
14.
Control charts with exponentially weighted moving average (EWMA) statistics (mean and variance) are used to jointly monitor the mean and variance of a process. An EWMA cost-minimization model is presented to design the joint control scheme based on purely economic or combined economic-statistical performance criteria. The purely economic model is extended to the economic-statistical design by adding constraints on the in-control and out-of-control average run lengths. The quality-related production costs are calculated using Taguchi's quadratic loss function. The optimal values of the smoothing constants, sampling interval, sample size, and control chart limits are determined by a numerical search method. The average run length of the control scheme is computed using the Markov chain approach. The computational study indicates that optimal sample sizes decrease as the magnitudes of shifts in mean and/or variance increase, and that higher values of the quality loss coefficient lead to shorter sampling intervals. The sensitivity analysis of the effects of various inputs on the chart parameters provides useful guidelines for designing an EWMA-based process control scheme when an assignable cause generates concurrent changes in the process mean and variance.
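The pair of EWMA statistics can be sketched as two parallel smoothers, one on subgroup means and one on the log of subgroup variances. The smoothing constant and the log-variance transform are common illustrative choices here; the paper derives its constants from the cost model.

```python
import math
import statistics

def joint_ewma(subgroups, mu0, sigma0, lam=0.2):
    """Yield (EWMA of subgroup means, EWMA of ln of subgroup variances)."""
    zm = mu0                       # start both smoothers at in-control values
    zv = math.log(sigma0 ** 2)
    for g in subgroups:
        xbar = statistics.mean(g)
        s2 = statistics.variance(g)            # sample variance
        zm = lam * xbar + (1 - lam) * zm
        zv = lam * math.log(s2) + (1 - lam) * zv
        yield zm, zv

for zm, zv in joint_ewma([[0, 1, 2], [5, 6, 7]], mu0=0.0, sigma0=1.0):
    print(round(zm, 3), round(zv, 3))
```

A joint scheme signals when either smoother crosses its control limits; the Markov chain ARL computation discretizes exactly this pair of recursions.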
15.
Taking into account that the BDS test, used as a misspecification test applied to standardized residuals from the GARCH(1,1) model, is characterized by size distortion and departure from normality in finite samples, this paper obtains critical values for the finite-sample distribution of the BDS test. We use bootstrap simulation to avoid the sampling uncertainty of parameter estimation and make use of estimated response surface regressions (RSR) derived from the experimental results. We consider an extensive grid of models to obtain critical values from the bootstrap experiments. The RSR used to estimate them is an artificial neural network (ANN) model instead of the traditional linear regression models. Specifically, we estimate critical values using a bootstrap aggregated neural network (BANN), employing as inputs functions of the sample size and of the parameters used in the experiment, such as the embedding dimension and proximity parameters of the BDS statistic, the GARCH parameters, and the q-quantiles of the BDS distributions. The main results confirm that the sample size and the BDS parameters play a role in size distortion. Finally, an empirical application to three price indexes is performed to highlight the differences between decisions made using the asymptotic critical values and our predicted finite-sample critical values for the BDS test.
16.
To overcome the drawbacks of neural-network-based financial crisis early-warning methods, namely slow convergence, failure to converge, and difficulty in determining the network structure, an improved neural network early-warning method based on the ant colony algorithm is proposed. The structure and parameters of the neural network model are encoded, and the ant colony algorithm is used to determine the structures and parameters of several candidate neural networks; an evaluation function then selects the best network structure, and finally the network is trained with the BP algorithm to obtain the financial crisis early-warning model. Validation results show that the model has a simple structure and high warning accuracy.
17.
If a given dynamical process contains an inherently unpredictable component, it may be modeled as a stochastic process. Typical examples from financial markets are the dynamics of prices (e.g. prices of stocks or commodities) or fundamental rates (exchange rates etc.). The unknown future value of the corresponding stochastic process is usually estimated as the expected value under a suitable measure, which may be determined from the distribution of past (historical) values. The predictive power of this estimation is limited by the simplifying assumptions of common calibration methods. Here we propose a novel method of "intelligent" calibration, using learning (2-layer) neural networks to dynamically adapt the parameters of a stochastic model to the most recent time series of fixed length (the memory depth). The process parameters are determined by the weights of the intermediate layer of the neural network. The final layer combines these parameters in a meaningful manner, yielding the forecast value of the stochastic process. On each finite memory window, the neural network is trained by back-propagation, giving a much more flexible and realistic parameter calibration than an analogous fit of an autoregressive model. In the context of processes related to financial assets, the final combination in the output layer relates to their market price of risk. Back-propagation is limited to the typical memory length of the financial market (for example, the 10 previous business days). We demonstrate the learning efficiency of the new algorithm by tracking next-day forecasts with one typical example each for the asset classes of currencies and stocks.
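The rolling-window calibration idea can be reduced to its simplest form: retrain a predictor by gradient descent on only the most recent observations, then forecast one step ahead. This stripped-down sketch uses a single linear neuron (the paper uses a 2-layer network whose hidden weights carry the model parameters); all settings are illustrative.

```python
def forecast_next(series, memory=10, lr=0.1, epochs=500):
    """Retrain a single linear neuron on the last `memory` points and
    forecast the next value of the series."""
    window = series[-memory:]
    pairs = list(zip(window[:-1], window[1:]))   # (x_t, x_{t+1}) pairs
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in pairs:
            err = (w * x + b) - y
            w -= lr * err * x                    # squared-error gradient step
            b -= lr * err
    return w * window[-1] + b

# On a noiseless ramp the neuron learns x_{t+1} = x_t + 0.1, so the
# forecast from the last point 1.1 should approach 1.2.
print(round(forecast_next([0.1 * i for i in range(12)]), 3))
```

Because the window slides forward each day, the fitted parameters track regime changes instead of averaging over the whole history, which is the point of the memory-depth construction.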
18.
19.
《European Journal of Operational Research》2002,139(1):68-83
This paper builds on recent work on measuring and evaluating environmental performance of a process using statistical process control (SPC) techniques. We propose the CUSUM chart as a tool to monitor emissions data so that abnormal changes can be detected in a timely manner, and we propose using process capability indices to evaluate environmental performance in terms of the risk of non-compliance situations arising. In doing so, the paper fills an important gap in the ISO 14000 and TQEM literatures, which have focused more on environmental management systems and qualitative aspects rather than on quantitative tools. We explore how process capability indices have the potential to be useful as a risk management tool for practitioners and to help regulators execute and prioritize their enforcement efforts. Together, this should help in setting up useful guidelines for evaluating actual environmental performance against the firm's environmental objectives and targets and regulatory requirements, as well as encouraging further development and application of SPC techniques to the field of environmental quality management and data analysis.
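For emissions data the relevant specification limit is usually one-sided (an upper regulatory limit), so the capability index takes the Cpu form. A minimal sketch with made-up readings:

```python
import statistics

def cpu(data, usl):
    """One-sided capability index for an upper (regulatory) limit:
    Cpu = (USL - mean) / (3 * stdev). Values comfortably above 1
    indicate a low risk of non-compliance."""
    return (usl - statistics.mean(data)) / (3 * statistics.stdev(data))

emissions = [4.8, 5.1, 5.0, 4.9, 5.2, 5.0, 4.7, 5.1]   # made-up readings
print(round(cpu(emissions, usl=6.0), 2))
```

A regulator could rank facilities by Cpu to prioritize inspections, which is the enforcement use the abstract suggests.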
20.
A Wavelet-Network-Based Method for Drought Severity Assessment
Based on the basic principle of drought-event identification, and exploiting the strong adaptivity and function-approximation capability of wavelet bases, this paper proposes a new wavelet-network-based method for assessing drought severity. Under the minimum mean-square-energy criterion, a conjugate-gradient learning algorithm is used to solve for the scale and time-shift parameters of the linear combination of wavelet functions, as well as for the weights of the wavelet network. Simulation experiments show that the method greatly improves the accuracy of drought-severity identification and can provide a new avenue for drought research.