Similar Articles
20 similar articles found.
1.
Project dynamics and emergent complexity (cited 1 time: 0 self-citations, 1 by others)
This paper presents a theoretical analysis of project dynamics and emergent complexity in new product development (NPD) projects managed under the concept of concurrent engineering. To provide a comprehensive study, the complexity frameworks, theories and measures developed in organizational theory, systematic engineering design and basic scientific research are reviewed. For the evaluation of emergent complexity in NPD projects, an information-theoretic quantity termed "effective measure complexity" (EMC) is selected from a variety of measures, because it can be derived from first principles and therefore has high construct validity. Furthermore, it can be calculated efficiently from dynamic generative models or purely from historical data, without intervening models. The EMC measures the mutual information between the infinite past and future histories of a stochastic process. According to this principle, it is particularly well suited to evaluating time-dependent complexity in NPD and to uncovering the relevant interactions. To obtain analytical results, a model-driven approach is taken and a vector autoregression (VAR) model of cooperative work is formulated. This VAR model provides the foundation for calculating a closed-form solution of the EMC in the original state space, which can be used to analyze and optimize complexity based on the model's independent parameters. Moreover, a transformation into the spectral basis is carried out to obtain more expressive solutions in matrix form. The matrix form allows identification of the surprisingly few essential parameters and calculation of two lower complexity bounds. The essential parameters include the eigenvalues of the work transformation matrix of the VAR model and the correlations between components of performance fluctuations.
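As a minimal illustration of this kind of closed-form result, the sketch below computes the EMC of a stationary Gaussian VAR(1) process, assuming the standard identity EMC = ½·log₂(det Σ / det Q), where Σ is the stationary state covariance and Q the covariance of the performance fluctuations. The work transformation matrix and fluctuation covariance are hypothetical toy values, not taken from the paper.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

def emc_var1(A, Q):
    """EMC (in bits) of a stationary Gaussian VAR(1) process x_{t+1} = A x_t + e_t,
    e_t ~ N(0, Q): mutual information between infinite past and infinite future."""
    Sigma = solve_discrete_lyapunov(A, Q)        # stationary covariance: Sigma = A Sigma A' + Q
    _, logdet_sigma = np.linalg.slogdet(Sigma)
    _, logdet_q = np.linalg.slogdet(Q)
    return 0.5 * (logdet_sigma - logdet_q) / np.log(2.0)

# hypothetical work transformation matrix (must be stable: spectral radius < 1)
A = np.array([[0.60, 0.20],
              [0.10, 0.70]])
# hypothetical covariance of performance fluctuations, with correlated components
Q = np.array([[0.010, 0.002],
              [0.002, 0.010]])

print(f"EMC = {emc_var1(A, Q):.3f} bits")
```

Pushing the dominant eigenvalue of A toward 1 increases the EMC in this sketch, consistent with the role the abstract assigns to the eigenvalues of the work transformation matrix.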

2.
The article deals with the effect of the human agent on the realization of activities in a project. It does so from the perspective of the Theory of Constraints, where the weakest point is the deadline of a partial activity. Every activity in any project is, to a greater or lesser extent, determined by the behaviour of the human agent. The inefficiency of many projects in practice is largely caused by the unsuccessful realization of partial activities, and the effect of the human agent is fundamental in this respect. The human agent, as a resource allocated to an activity, is subject to a number of unspecified influences and stimuli, and his or her behaviour is therefore highly variable. This variability can be described by the "Student Syndrome" phenomenon and by Parkinson's first law. Both of these qualitative phenomena form the starting point for the theoretical and quantitative research presented here. The findings are based on real work-effort data from full-time students at a university in the Czech Republic. As the authors' own theoretical contribution, the article presents a mathematical model of the "Student Syndrome" phenomenon with practical use in quantitative project management methods. The model was derived analytically from the data analysis, and we expect it to be useful for further theoretical development of quantitative methods in project management. We also derive a theoretical differentiation of the "Student Syndrome" in work effort into three distinct phases occurring under three different types of resource work allocation. This original viewpoint is instructive for human resource management in projects: its contribution lies in delimiting time-targeted stimulation of resources, which may lead to lower project costs as well as higher work efficiency and compliance with time-targeted activity deadlines. The article thus quantifies qualitative features of the human agent in project management.

3.
This paper describes an adaptive learning framework for forecasting end-of-season water allocations using climate forecasts, historic allocation data, and results of other detailed hydrological models. The adaptive learning framework is based on the artificial neural network (ANN) method, which can be trained on past data to predict future water allocations. Using this technique, it was possible to develop forecast models for end-of-irrigation-season water allocations from allocation data available from 1891 to 2005, based on the allocation level at the start of the irrigation season. The forecasting skill of the model was further improved by incorporating a set of correlated clusters of sea surface temperature (SST) and Southern Oscillation Index (SOI) data. A key feature of the model is the inclusion of a risk factor for the end-of-season water allocations based on the start-of-season water allocation. The interactive ANN model works in a risk-management context by providing the probability of water being available for allocation in the prediction month, using historic data and/or the SST/SOI information from previous months. All four ANN models developed (historic data only, SST incorporated, SOI incorporated, and SST-SOI incorporated) demonstrated the capability of ANNs to forecast end-of-season water allocation, provided sufficient historic allocation data are available. The SOI-incorporated ANN model was the most promising forecasting tool, showing good performance during field testing of the model.
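A minimal sketch of the kind of ANN forecast described here, using scikit-learn's MLPRegressor on synthetic data. The feature set (start-of-season allocation plus a lagged SOI value), the risk statement, and all numbers are illustrative assumptions, not the paper's actual model or data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)

# synthetic history: start-of-season allocation (%), lagged SOI, end-of-season allocation (%)
n = 115
start_alloc = rng.uniform(0, 60, n)
soi_lag = rng.normal(0, 10, n)
end_alloc = np.clip(start_alloc + 0.8 * soi_lag + rng.normal(0, 5, n) + 20, 0, 100)

X = np.column_stack([start_alloc, soi_lag])
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0))
model.fit(X[:-15], end_alloc[:-15])               # hold out the last 15 seasons
pred = model.predict(X[-15:])
print("hold-out MAE:", np.round(np.mean(np.abs(pred - end_alloc[-15:])), 2))

# crude risk statement: probability that end-of-season allocation reaches at least 50%,
# estimated from the empirical distribution of training residuals (illustrative only)
resid = end_alloc[:-15] - model.predict(X[:-15])
new_season = np.array([[30.0, 5.0]])              # hypothetical start allocation and SOI value
p_at_least_50 = np.mean(model.predict(new_season)[0] + resid >= 50)
print("P(end-of-season allocation >= 50%):", round(p_at_least_50, 2))
```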

4.
This paper aims to assess the performance of a sample of completed building projects in Oregon by employing the range-adjusted measure, a slack-based data envelopment analysis (DEA) model. In the first stage of analysis, project efficiency ratings (i.e., composite indicators) are derived using selected single performance indicators in a no-output model; in the second stage, censored Tobit regression is employed to model the efficiency ratings. The results indicate that only four of the 50 sample projects are efficient within the DEA context. Moreover, there is little evidence of systematic effects of project size on the DEA efficiency ratings.
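The paper's first stage uses a range-adjusted, slack-based, no-output DEA model; as a simpler stand-in, the sketch below shows the familiar input-oriented CCR envelopment LP solved with scipy, purely to illustrate how DEA efficiency ratings are obtained. Treating two project performance indicators as inputs against a unit dummy output, and the data themselves, are assumptions of this example, not the authors' formulation.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_oriented(X, Y):
    """Input-oriented CCR efficiency for each DMU.
    X: (m_inputs, n_dmus), Y: (s_outputs, n_dmus). Returns an array of theta scores."""
    m, n = X.shape
    s = Y.shape[0]
    scores = np.empty(n)
    for o in range(n):
        # variables z = [theta, lambda_1, ..., lambda_n]; minimize theta
        c = np.r_[1.0, np.zeros(n)]
        # X @ lam - theta * x_o <= 0   and   -Y @ lam <= -y_o
        A_ub = np.vstack([np.hstack([-X[:, [o]], X]),
                          np.hstack([np.zeros((s, 1)), -Y])])
        b_ub = np.r_[np.zeros(m), -Y[:, o]]
        bounds = [(None, None)] + [(0, None)] * n
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        scores[o] = res.fun
    return scores

# hypothetical projects: two "inputs" (cost growth, schedule growth) and a unit dummy output
X = np.array([[1.05, 1.20, 1.00, 1.35, 1.10],
              [1.10, 1.25, 1.02, 1.40, 1.05]])
Y = np.ones((1, 5))
print(np.round(ccr_input_oriented(X, Y), 3))     # efficient projects score 1.0
```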

5.
Tactical forecasting in supply chain management supports planning for inventory, production scheduling, and raw material purchasing, amongst other functions. It typically refers to forecasts up to 12 months ahead. Traditional forecasting models take into account univariate information extrapolated from the past, but cannot anticipate macroeconomic events, such as steep increases or declines in national economic activity. In practice this is countered by managerial expert judgement, which is well known to suffer from various biases and is expensive and not scalable. This paper evaluates multiple approaches to improve tactical sales forecasting using macroeconomic leading indicators. The proposed statistical forecast automatically selects both the type of leading indicator and the order of the lead for each selected indicator. However, as the future values of the leading indicators are unknown, additional uncertainty is introduced. This uncertainty is controlled in our methodology by restricting inputs to an unconditional forecasting setup. We compare this with the conditional setup, where future indicator values are assumed to be known, and assess the theoretical loss of forecast accuracy. We also evaluate purely statistical model building against judgement-aided models, where potential leading indicators are pre-filtered by experts, quantifying the accuracy-cost trade-off. The proposed framework improves forecasting accuracy over established time series benchmarks, while providing useful insights about the key leading indicators. We evaluate the proposed approach on a real case study and find 18.8% accuracy gains over the current forecasting process.
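A toy sketch of the unconditional setup described here: a simple statistical model selects the lead order of a single macroeconomic indicator, restricted to leads of at least the forecast horizon so that no future indicator values are ever needed. The indicator, the true lead of four months, and all numbers are fabricated for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
T, horizon = 124, 3                                  # monthly data, 3-month-ahead forecasts
ind = rng.normal(size=T).cumsum()                    # hypothetical macroeconomic leading indicator
sales = np.r_[np.full(4, np.nan),                    # sales respond to the indicator with a
              100 + 5 * ind[:-4] + rng.normal(scale=2, size=T - 4)]  # true lead of 4 months

best = None
for lead in range(horizon, 13):                      # unconditional setup: lead >= horizon
    X, y = ind[:T - lead].reshape(-1, 1), sales[lead:]
    keep = ~np.isnan(y)
    X, y = X[keep], y[keep]
    fit = LinearRegression().fit(X[:-12], y[:-12])   # hold out the last 12 months
    mae = np.mean(np.abs(fit.predict(X[-12:]) - y[-12:]))
    if best is None or mae < best[1]:
        best = (lead, mae)

print(f"selected lead order: {best[0]} months, hold-out MAE: {best[1]:.2f}")
```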

6.
We consider forecasting in systems whose underlying laws are uncertain, while contextual information suggests that future system properties will differ from the past. We consider linear discrete-time systems, and use a non-probabilistic info-gap model to represent uncertainty in the future transition matrix. The forecaster desires the average forecast of a specific state variable to be within a specified interval around the correct value. Traditionally, forecasting uses a model with optimal fidelity to historical data. However, since structural changes are anticipated, this is a poor strategy. Our first theorem asserts the existence, and indicates the construction, of forecasting models with sub-optimal fidelity to historical data that are more robust to model error than the historically optimal model. Our second theorem identifies conditions in which the probability of forecast success increases with increasing robustness to model error. The proposed methodology identifies reliable forecasting models for systems whose trajectories evolve under Knightian uncertainty about structural change over time. We consider various examples, including forecasting European Central Bank interest rates following 9/11.

7.
In modeling and forecasting mortality, the Lee-Carter approach is the benchmark methodology. In many empirical applications the Lee-Carter approach results in a model that describes the log central death rates by means of linear trends. However, due to the volatility in (past) mortality data, the estimation of these trends, and thus the forecasts based on them, can be rather sensitive to the sample period employed. We allow for time-varying trends, depending on a few underlying factors, to make the estimates of the future trends less sensitive to the sampling period. We formulate our model in a state-space framework and use the Kalman filtering technique to estimate it. We illustrate our model using Dutch mortality data.
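A minimal numpy sketch of the state-space idea: a Lee-Carter-style period index κ_t is filtered with a local linear trend model (level plus time-varying slope) using a standard Kalman filter, so the estimated trend can change over time rather than being fixed by the sample period. The synthetic κ series, the variance settings, and the model structure are assumptions for illustration, not the authors' specification or the Dutch data.

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic period index: downward drift that steepens halfway through the sample
t = np.arange(60)
kappa = np.where(t < 30, -0.5 * t, -15 - 1.0 * (t - 30)) + rng.normal(0, 1.0, 60)

# local linear trend: state = [level, slope]; level_t = level_{t-1} + slope_{t-1} + noise
T = np.array([[1.0, 1.0],
              [0.0, 1.0]])            # state transition
Z = np.array([[1.0, 0.0]])            # observation: kappa_t = level_t + eps_t
Q = np.diag([0.05, 0.01])             # state noise (lets the trend vary over time)
H = np.array([[1.0]])                 # observation noise variance

x = np.array([kappa[0], 0.0])         # initial state
P = np.eye(2) * 10.0                  # diffuse-ish initial state covariance

slopes = []
for y in kappa:
    x, P = T @ x, T @ P @ T.T + Q                    # predict
    S = Z @ P @ Z.T + H
    K = P @ Z.T @ np.linalg.inv(S)                   # Kalman gain
    x = x + (K @ (y - Z @ x)).ravel()                # update
    P = (np.eye(2) - K @ Z) @ P
    slopes.append(x[1])

print("filtered trend (slope) at start, middle, end:",
      np.round([slopes[5], slopes[30], slopes[-1]], 2))
```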

8.
Better management of time uncertainty in major equipment procurement for engineering construction projects can contribute significantly to project performance. A survey shows that time buffers are a commonly used approach to protecting the project schedule from variation and uncertainty in activity durations. The problem is that repetitive time allowances are inserted along the procurement supply chain process and these time buffers are used ineffectively, leading to considerable time wastage. Relevant lessons from supply chain management and critical chain project management are combined and applied to create an enhanced critical supply chain management model for major equipment procurement, achieving better management of time uncertainty. This model does not perceive uncertainty purely as a threat, but also as an opportunity to reduce procurement cycle times.

9.
The purpose of assessing past performance and setting future targets for an organisation such as a bank branch is to find where the branch stands in comparison to its peers within the branch network and how to improve the efficiency of its operations relative to the best-practice branches. However, future performance targets may be set arbitrarily by the head office and thus could be unrealistic and unachievable for a branch. A hybrid minimax reference point-data envelopment analysis (HMRP-DEA) approach is investigated to incorporate the value judgements of both branch managers and head-office directors and to search for the most preferred solution (MPS) along the efficient frontier for each bank branch. The HMRP-DEA approach is composed of three minimax models: the super-ideal point model, the ideal point model and the shortest distance model. These models share the same decision and objective spaces, differ from each other only in their reference points and weighting schemes, and are proven to be equivalent to the output-oriented DEA dual models. They are examined both analytically and graphically in this paper using a case study, which provides unprecedented insight into integrated efficiency and trade-off analyses. The HMRP-DEA approach uses DEA as an ex-post-facto evaluation tool for past performance assessment and the minimax reference point approach as an ex-ante planning tool for future performance forecasting and target setting. Thus, the HMRP-DEA approach provides an alternative means for realistic target setting and better resource allocation. It is examined through a detailed investigation of the performance of fourteen branches of an international bank in the Greater Manchester area.

10.
With the rapid growth of China's economy, investment analysis of derivative financial products has become a popular topic in domestic financial mathematics research. In the stock market, investors always hope to grasp the pulse of the market one step ahead of others in order to obtain the highest rate of return; however, many factors influence fluctuations in the weighted stock price index, so how to carry out trend analysis and forecasting is a subject of considerable interest and research for many scholars. This paper considers fuzzy statistical methods for the trend analysis and forecasting of fuzzy time series. It is expected that fuzzy statistical analysis can provide a more reasonable interpretation than traditional time series analysis, and that the forecasting results can give decision makers more information for making correct decisions. Finally, the Taiwan weighted stock index is used as an example for a detailed empirical investigation.

11.
A method is proposed to quantify uncertainty in statistical forecasts using the formalism of belief functions. The approach is based on two steps. In the estimation step, a belief function on the parameter space is constructed from the normalized likelihood given the observed data. In the prediction step, the variable Y to be forecasted is written as a function of the parameter θ and an auxiliary random variable Z with a known distribution that does not depend on the parameter, a model initially proposed by Dempster for statistical inference. Propagating beliefs about θ and Z through this model yields a predictive belief function on Y. The method is demonstrated on the problem of forecasting innovation diffusion using the Bass model, yielding a belief function on the number of adopters of an innovation in some future time period, based on past adoption data.

12.
An artificial neural network (ANN) model for the economic analysis of risky projects is presented in this paper. Outputs of conventional simulation models are used as inputs for training the neural network. The neural network model is then used to predict the potential returns from an investment project with stochastic parameters. The nondeterministic aspects of the project include the initial investment, the magnitude of the rate of return, and the investment period. The backpropagation method is used in the neural network modeling, with sigmoid and hyperbolic tangent functions used in the learning aspect of the system. Analysis of the outputs of the neural network model indicates that more predictive capability can be achieved by coupling conventional simulation with neural network approaches. The trained network was able to predict simulation output from the input values with very good accuracy for conditions not in its training set. This allowed an analysis of the future performance of the investment project without having to run additional expensive and time-consuming simulation experiments.
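A sketch of the simulation-plus-ANN idea: a crude Monte Carlo model of a risky investment (random initial investment, rate of return and duration) generates training cases, and an MLP with tanh activations is then fitted as a surrogate so new parameter combinations can be evaluated without further simulation runs. The cash-flow model, distributions and network settings are all illustrative assumptions, not the paper's.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)

def simulate_npv(invest, rate, years, n_runs=200):
    """Crude Monte Carlo: noisy annual cash flows discounted at `rate` (illustrative model)."""
    cf = invest * 0.25 * (1 + 0.10 * rng.normal(size=(n_runs, years)))   # random cash flows
    disc = (1 + rate) ** -np.arange(1, years + 1)
    return float(np.mean(cf @ disc - invest))                            # expected NPV

# training set sampled over the stochastic project parameters
params = np.column_stack([rng.uniform(50, 200, 300),        # initial investment
                          rng.uniform(0.05, 0.15, 300),     # rate of return
                          rng.integers(3, 11, 300)])        # investment period (years)
npv = np.array([simulate_npv(i, r, int(n)) for i, r, n in params])

surrogate = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16, 16), activation="tanh",
                                       max_iter=5000, random_state=0)).fit(params, npv)

# evaluate a new scenario without re-running the simulation
print("surrogate NPV estimate:", round(surrogate.predict([[120, 0.08, 6]])[0], 1))
```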

13.
New technology implementation projects are notoriously over time and over budget, resulting in significant financial and strategic consequences for organizations. Some argue that inadequate planning and management, misspecification of requirements, and team capabilities and learning contribute to cost and schedule overruns. In this paper we examine how learning curve theory could inform better management of new technology implementation projects. Our research makes four important contributions: (1) it presents a comparative analysis of learning curves and proposes how they can be used to support ERP implementation planning and management; (2) based on empirical data from four ERP implementation projects, it illustrates how managers can apply the curves in different project situations; (3) it provides a theoretical basis for empirical studies of learning and ERP (and other IT) implementations in different organizational settings; and (4) it provides empirical justification for the development of learning curve theory in IT implementation.
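Since the abstract does not give the curve formulations it compares, the sketch below uses the classical Wright power-law learning curve, effort_n = a·n^b, as one plausible candidate, fitted on log scales to hypothetical per-site ERP roll-out efforts; fitting and comparing several such curves would mirror the paper's comparative analysis.

```python
import numpy as np

# hypothetical effort (person-days) to roll an ERP module out at successive sites
site = np.arange(1, 9)
effort = np.array([120, 95, 88, 80, 76, 73, 70, 69])

# Wright's model: effort_n = a * n**b  ->  log(effort) = log(a) + b * log(n)
b, log_a = np.polyfit(np.log(site), np.log(effort), 1)
a = np.exp(log_a)
learning_rate = 2 ** b                      # effort multiplier for each doubling of sites

print(f"fitted curve: effort_n = {a:.1f} * n^{b:.3f}")
print(f"learning rate: {learning_rate:.2%} (effort per doubling of completed sites)")
print(f"predicted effort for site 12: {a * 12 ** b:.0f} person-days")
```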

14.
Accurate estimates of effort in software development are necessary for project management practice. Project managers or domain experts usually conduct software effort estimation using their experience; hence, subjective or implicit estimates occur frequently. As most software projects have incomplete information and uncertain relations between effort drivers and the required development effort, the grey relational analysis (GRA) method has been applied in building a formal software effort estimation model for this study. GRA, from grey system theory, is a problem-solving method used when dealing with similarity measures of complex relations. This paper examines the potential of the software effort estimation model obtained by integrating a genetic algorithm (GA) with the GRA. The GA is adopted to find the best-fitting weights for each software effort driver in the similarity measures. Experimental results show that software effort estimation using the integration of GRA with the GA method yields more precise estimates than the case-based reasoning (CBR), classification and regression trees (CART), and artificial neural network (ANN) methods.
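A compact sketch of grey relational analysis for effort estimation, with scipy's differential evolution standing in for the genetic algorithm when searching for driver weights (an assumption of this example; the paper uses a GA). The data, the distinguishing coefficient of 0.5, and the top-3 analogy rule are all illustrative choices.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(3)

# hypothetical historical projects: 4 normalized effort drivers plus actual effort
drivers = rng.uniform(0, 1, size=(30, 4))
effort = 100 * drivers @ np.array([0.5, 0.2, 0.2, 0.1]) + rng.normal(0, 5, 30)

def grey_grade(target, others, w, zeta=0.5):
    """Weighted grey relational grade of `target` against each row of `others`."""
    delta = np.abs(others - target)
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff @ w

def loo_mae(w):
    """Leave-one-out MAE when effort is the grade-weighted mean of the 3 closest analogues."""
    w = w / w.sum()
    errs = []
    for i in range(len(effort)):
        mask = np.arange(len(effort)) != i
        grades = grey_grade(drivers[i], drivers[mask], w)
        top = np.argsort(grades)[-3:]
        est = np.average(effort[mask][top], weights=grades[top])
        errs.append(abs(est - effort[i]))
    return np.mean(errs)

res = differential_evolution(loo_mae, bounds=[(0.01, 1.0)] * 4, seed=0, maxiter=30)
print("optimized driver weights:", np.round(res.x / res.x.sum(), 2),
      " LOO MAE:", round(res.fun, 1))
```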

15.
Problems related to the management of construction projects are addressed in many studies. Falling behind schedule and running over budget are examples of poor outcomes caused by the uncertainties and dynamic environment of the construction process. This study proposes a decision model, based on an MCDA (multiple criteria decision analysis) method, for helping project managers to focus on the main tasks of a project network during the life cycle of a project. The model assigns priority classes to activities in project management, taking into account several points of view. It is based on the ELECTRE TRI-C method, which permits activities to be assigned to categories. As the environment is very dynamic, the model was built taking into consideration changes that may occur while a project is being carried out, and it must therefore be reassessed during the project life cycle. Furthermore, the model supports a decision-making environment in which responsibilities are distributed amongst project team members, and it brings the benefit of developing disciplines that lead to the team's greater effectiveness. An application of the model, based on a realistic situation, is presented in the context of a construction project in order to demonstrate its use. The results show that by using the model, managers can improve their performance with regard to controlling project activities.

16.
This paper deals with the problem of forecasting the population of growing pullets in England and Wales. The term growing pullet is used for a hen from birth until it is moved to the laying flock. Quarterly data are available on the number of growing pullets alive, while data are published every month on the number of hens hatched. Previous work in the field of forecasting agricultural populations has been based on Cobweb Theorem considerations and relies on past prices and expectations of future prices. However, given the available data and the non-price relationships that operate in the growing sector, it is possible to build a non-price model. The approach followed is to start with a naive regression model estimated using ordinary least squares. This model does not conform to prior expectations about the regression coefficients and has to be re-estimated using quadratic programming. This provides new ideas for a further model that incorporates adaptive expectations and that is updated using a Box-Jenkins time series approach.
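A sketch of the re-estimation step: an OLS fit whose coefficients may violate prior expectations (here, non-negativity) is re-estimated as a bound-constrained least-squares problem, which is the quadratic-programming idea, done here via scipy's lsq_linear. The data and the choice of non-negativity as the prior expectation are illustrative assumptions, not the paper's actual specification.

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(5)

# hypothetical regressors: chicks hatched in the two preceding quarters (thousands)
hatch_lag1 = rng.uniform(800, 1200, 40)
hatch_lag2 = hatch_lag1 + rng.normal(0, 30, 40)          # nearly collinear with lag 1
pullets = 0.9 * hatch_lag1 + 0.02 * hatch_lag2 + rng.normal(0, 60, 40)

A = np.column_stack([hatch_lag1, hatch_lag2])

# naive OLS: the weakly identified second coefficient may come out negative,
# violating the prior expectation that both coefficients are non-negative
ols = np.linalg.lstsq(A, pullets, rcond=None)[0]
print("OLS coefficients:        ", np.round(ols, 3))

# re-estimate with non-negativity constraints (a bound-constrained least-squares / QP problem)
constrained = lsq_linear(A, pullets, bounds=(0.0, np.inf))
print("constrained coefficients:", np.round(constrained.x, 3))
```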

17.
The design and development of large-scale software projects is a complex endeavor, often facing problems such as cost and schedule overruns as well as low quality. Over the last few years, the management of software development projects has been recognized as the cornerstone of seeking improvements and solutions. Simulation modeling of the software project process is gaining interest among academics and practitioners as a method for tackling the complex questions with which such enterprises are confronted. It offers support on several issues, such as defining software product development strategies and decision-making regarding process improvement and training, over time spans ranging from a short portion of the life cycle to long-term product evolution, with organization-wide implications. The aim of this work is to implement a model simulating a core part of a software project process, enabling the estimation of several project development details such as delivery times and quality metrics. The purpose of the model is to assist project managers in control and monitoring, but also in identifying the best planning alternatives. The model scope covers a portion of the life cycle of an incremental software development venture.

18.
Propensity scorecards allow forecasting of which bank customers would like to be granted new credit in the near future by assessing their willingness to apply for new loans. Kalman filtering can help to monitor scorecard performance. Data from successive months are used to update the baseline model, and the updated scorecard is the output of the Kalman filter. There is no assumption concerning the scoring model specification and no specific estimation method is presupposed; thus, the estimator covariance is derived from the bootstrap. The focus is on the relationship between the score and the natural logarithm of the odds for that score, which is used to determine a customer's propensity level. The propensity levels corresponding to the baseline and updated scores are compared, and that comparison allows monitoring of whether the scorecard is still up to date in terms of assigning the odds. The presented technique is illustrated with an example of a propensity scorecard developed on the basis of credit bureau data.
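A minimal sketch of the monitoring idea: the calibration line relating score to log-odds is treated as a two-dimensional state (intercept and slope) following a random walk, and each month's estimate of that line is fed to a Kalman filter as a noisy observation. The synthetic monthly data, the random-walk assumption and the fixed observation covariance (the paper derives it via the bootstrap) are all illustrative, not the paper's construction.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)

def monthly_calibration(month):
    """Fit log-odds(apply) = a + b*score on one month of synthetic data; the slope drifts."""
    score = rng.normal(600, 50, 2000)
    true_b = 0.010 - 0.0003 * month                      # calibration deteriorates over time
    p = 1 / (1 + np.exp(-(-6.5 + true_b * score)))
    applied = rng.binomial(1, p)
    lr = LogisticRegression(max_iter=1000).fit(score.reshape(-1, 1), applied)
    return np.array([lr.intercept_[0], lr.coef_[0, 0]])  # noisy observation of (a, b)

baseline = monthly_calibration(0)         # baseline scorecard calibration
x = baseline.copy()
P = np.diag([1.0, 1e-4])                  # initial uncertainty
Q = np.diag([1e-2, 1e-7])                 # random-walk state noise
R = np.diag([1e-1, 1e-6])                 # observation noise (bootstrap-derived in the paper)

for month in range(1, 13):
    z = monthly_calibration(month)        # this month's estimate of the calibration line
    P = P + Q                             # predict (random walk: state mean unchanged)
    K = P @ np.linalg.inv(P + R)          # Kalman gain (identity observation matrix)
    x = x + K @ (z - x)                   # update
    P = (np.eye(2) - K) @ P

print("baseline slope:", round(baseline[1], 4),
      " filtered slope after 12 months:", round(x[1], 4))
```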

19.
Forecasting mortality rates is a problem that involves the analysis of high-dimensional time series. Most of the usual mortality models decompose the mortality rates into several latent factors to reduce this complexity. These approaches, in particular those using cohort factors, fit well but are less reliable for forecasting purposes. One of the major challenges is to determine the spatial-temporal dependence structure between mortality rates given a relatively moderate sample size. This paper proposes a large vector autoregressive (VAR) model fitted on the differences in the log-mortality rates, ensuring the existence of long-run relationships between mortality rate improvements. Our contribution is threefold. First, sparsity in the fitted model is ensured by using high-dimensional variable selection techniques without imposing arbitrary constraints on the dependence structure. The main interest is that the structure of the model is driven directly by the data, in contrast to the main factor-based mortality forecasting models; hence, this approach is more versatile and should provide good forecasting performance for any considered population. Additionally, our estimation is a one-step procedure, as we do not need to estimate hyper-parameters, and the variance-covariance matrix of the residuals is then estimated through a parametric form. Second, our approach can be used to detect non-intuitive age dependence in the data, beyond the cohort and period effects that are implicitly captured by our model. Third, our approach can be extended to model several populations in a long-run perspective without raising issues in the estimation process. Finally, in an out-of-sample forecasting study for mortality rates, we obtain rather good performance and more relevant forecasts than classical mortality models on French, US and UK data. We also show that our results shed light on the so-called cohort and period effects for these populations.
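A toy sketch of the estimation idea: the VAR on differenced log-mortality rates is fitted equation by equation with a Lasso penalty, so each age's improvement rate is explained only by the few lagged ages the data select. The synthetic mortality surface and the fixed penalty are illustrative assumptions; the paper's variable-selection and covariance-estimation steps are more elaborate.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n_years, n_ages = 50, 10

# synthetic log-mortality surface with a common downward trend plus noise
log_m = (-4.0 + 0.08 * np.arange(n_ages))[None, :] \
        - 0.015 * np.arange(n_years)[:, None] \
        + rng.normal(0, 0.01, (n_years, n_ages))

d = np.diff(log_m, axis=0)                 # mortality improvements (differences over time)
X, Y = d[:-1], d[1:]                       # lag-1 design and targets, all ages at once

coefs = np.zeros((n_ages, n_ages))
for age in range(n_ages):                  # one sparse regression per age (VAR equation)
    lasso = Lasso(alpha=0.001, max_iter=10000).fit(X, Y[:, age])
    coefs[age] = lasso.coef_

print("non-zero coefficients per equation:", (np.abs(coefs) > 1e-8).sum(axis=1))
print("one-step-ahead forecast of improvements:", np.round(coefs @ d[-1], 4))
```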

20.
This paper reports the results obtained from using project complexity parameters in modeling effort estimates. It highlights the attention that complexity has recently received in the project management area. After considering that traditional knowledge has consistently proved prone to failure when put into practice on actual projects, the paper endorses the belief that there is a need for more open-minded and novel approaches to project management. With a view to providing some insight into the opportunities that integrating complexity concepts into model building offers, we extend the work previously undertaken on the complexity dimension in project management. We do so by analyzing the results obtained with classical linear models and artificial neural networks when complexity is considered as another managerial parameter. For that purpose, we have used the International Software Benchmarking Standards Group data set. The results obtained proved the benefits of integrating the complexity of the projects at hand into the models. They also demonstrated the need for a complex system, such as an artificial neural network, to capture the fine nuances of the complex systems being modeled: the projects themselves.
