Similar Documents
20 similar documents found
1.
ABSTRACT. The economic performance of fisheries is difficult to measure, due to the importance of (multi‐species) biological dynamics, property rights and regulatory issues affecting fishermen's behavior and efficiency. However, an understanding of performance patterns is essential for enhancing the economic and biological viability of fisheries. In this paper we estimate and evaluate alternative primal stochastic approaches to modeling and measuring technical efficiency for the Northern Spain hake fishery. We then compare the resulting efficiency measures to identify variations in their potential interpretation and application to policy guidance. We find that multi‐output models are more theoretically and empirically justifiable than aggregate output production function models, and provide additional policy‐relevant insights, but that relative production and efficiency estimates are not substantively affected by model specification.
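The multi-output stochastic frontier models estimated in this paper do not reduce to a short example, but the underlying idea of measuring technical efficiency against an estimated production frontier can be sketched with a simple corrected-OLS (COLS) frontier on simulated single-output data. All variable names, the Cobb-Douglas form, and the parameter values below are illustrative assumptions, not the authors' specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated Cobb-Douglas fishery "data": log catch as a function of log effort
# and log vessel size, minus a one-sided inefficiency term.
n = 200
log_effort = rng.normal(2.0, 0.5, n)
log_vessel = rng.normal(1.0, 0.3, n)
inefficiency = rng.exponential(0.3, n)            # u >= 0
log_catch = 0.5 + 0.6 * log_effort + 0.3 * log_vessel - inefficiency

# Corrected OLS: fit an average production function, then shift the intercept
# so the frontier envelops the data from above.
X = np.column_stack([np.ones(n), log_effort, log_vessel])
beta, *_ = np.linalg.lstsq(X, log_catch, rcond=None)
residuals = log_catch - X @ beta

# Technical efficiency of each vessel relative to the estimated frontier.
technical_efficiency = np.exp(residuals - residuals.max())

print("OLS coefficients:", np.round(beta, 3))
print("mean technical efficiency:", round(technical_efficiency.mean(), 3))
```

A stochastic frontier would add a two-sided noise term and estimate the one-sided inefficiency by maximum likelihood rather than by shifting the OLS intercept.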

2.
Fluid dynamics models provide a powerful deterministic technique to approximate stochasticity in a variety of application areas. In this paper, we study two classes of fluid models and investigate their relationship as well as some of their applications. This analysis allows us to provide analytical models of travel times as they arise in dynamically evolving environments, such as transportation networks as well as supply chains. In particular, using the laws of hydrodynamic theory, we first propose and examine a general second-order fluid model. We consider a first-order approximation of this model and show how it is helpful in analyzing the dynamic traffic equilibrium problem. Furthermore, we present an alternate class of fluid models that are traditionally used in the context of dynamic traffic assignment. By interpreting travel times as price/inventory–sojourn-time relationships, we are also able to connect this approach with a tractable fluid model in the context of dynamic pricing and inventory management.
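As a rough illustration of the first-order fluid idea, the sketch below integrates a deterministic point-queue model of a single link and reads off the implied travel time; the inflow profile, capacity, and free-flow time are invented for illustration and are not the paper's second-order model.

```python
import numpy as np

# Point-queue (first-order) fluid model of a single link:
#   dq/dt = inflow(t) - capacity   whenever the queue is positive or inflow exceeds capacity,
#   travel_time(t) = free_flow_time + q(t) / capacity.
free_flow_time = 5.0      # minutes
capacity = 10.0           # vehicles per minute
dt = 0.1
horizon = 120.0           # minutes

times = np.arange(0.0, horizon, dt)
# Hypothetical peaked demand profile that exceeds capacity mid-period.
inflow = 6.0 + 8.0 * np.exp(-((times - 40.0) / 15.0) ** 2)

queue = np.zeros_like(times)
for k in range(1, len(times)):
    dq = (inflow[k - 1] - capacity) * dt
    queue[k] = max(queue[k - 1] + dq, 0.0)   # queue cannot go negative

travel_time = free_flow_time + queue / capacity
print("peak queue (veh):", round(queue.max(), 1))
print("peak travel time (min):", round(travel_time.max(), 1))
```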

3.
We describe a strategy for Markov chain Monte Carlo analysis of nonlinear, non-Gaussian state-space models involving batch analysis for inference on dynamic, latent state variables and fixed model parameters. The key innovation is a Metropolis–Hastings method for the time series of state variables based on sequential approximation of filtering and smoothing densities using normal mixtures. These mixtures are propagated through the nonlinearities using an accurate, local mixture approximation method, and we use a regenerating procedure to deal with potential degeneracy of mixture components. This provides accurate, direct approximations to sequential filtering and retrospective smoothing distributions, and hence a useful construction of global Metropolis proposal distributions for simulation of posteriors for the set of states. This analysis is embedded within a Gibbs sampler to include uncertain fixed parameters. We give an example motivated by an application in systems biology. Supplemental materials provide an example based on a stochastic volatility model as well as MATLAB code.
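The sequential mixture construction in the paper is involved, but the role of a normal-mixture approximation as a global independence proposal in Metropolis–Hastings can be seen in the toy sampler below; the bimodal target and the mixture parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # Hypothetical bimodal target (unnormalized mixture of two normals),
    # standing in for an awkward posterior over a latent state.
    return np.logaddexp(-0.5 * ((x + 2.0) / 0.7) ** 2,
                        -0.5 * ((x - 2.5) / 1.0) ** 2)

# Normal-mixture approximation used as a global independence proposal.
weights = np.array([0.5, 0.5])
means = np.array([-2.0, 2.5])
sds = np.array([0.9, 1.2])

def sample_proposal():
    k = rng.choice(2, p=weights)
    return rng.normal(means[k], sds[k])

def log_proposal(x):
    # Log-density of the mixture proposal, up to an additive constant.
    comps = np.log(weights) - 0.5 * ((x - means) / sds) ** 2 - np.log(sds)
    return np.logaddexp.reduce(comps)

x, draws = 0.0, []
for _ in range(20000):
    y = sample_proposal()
    log_alpha = (log_target(y) - log_target(x)) + (log_proposal(x) - log_proposal(y))
    if np.log(rng.uniform()) < log_alpha:
        x = y
    draws.append(x)

print("posterior mean estimate:", round(np.mean(draws[2000:]), 3))
```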

4.
Rapid developments of time series models and methods addressing volatility in computational finance and econometrics have been recently reported in the financial literature. The non-linear volatility theory either extends and complements existing time series methodology by introducing more general structures or provides an alternative framework (see Abraham and Thavaneswaran [B. Abraham, A. Thavaneswaran, A nonlinear time series model and estimation of missing observations, Ann. Inst. Statist. Math. 43 (1991) 493–504] and Granger [C.W.J. Granger, Overview of non-linear time series specification in Economics, Berkeley NSF-Symposia, 1998]). In this work, we consider Gaussian first-order linear autoregressive models with time varying volatility. General properties for process mean, variance and kurtosis are derived; examples illustrate the wide range of properties that can appear under the autoregressive assumptions. The results can be used in identifying some volatility models. The kurtosis of the classical RCA model of Nicholls and Quinn [D.F. Nicholls, B.G. Quinn, Random Coefficient Autoregressive Models: An Introduction, in: Lecture Notes in Statistics, vol. 11, Springer, New York, 1982] is shown to be a special case.
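A minimal simulation of the Nicholls–Quinn RCA(1) model mentioned at the end of the abstract shows the excess kurtosis that randomly varying autoregressive coefficients induce; the parameter values are arbitrary and chosen only to satisfy the stationarity and fourth-moment conditions.

```python
import numpy as np

rng = np.random.default_rng(42)

# RCA(1): x_t = (phi + b_t) * x_{t-1} + e_t, with b_t ~ N(0, sigma_b^2)
# independent of e_t ~ N(0, sigma_e^2).  sigma_b > 0 makes the conditional
# volatility time varying and fattens the tails relative to a plain AR(1).
phi, sigma_b, sigma_e = 0.5, 0.4, 1.0
n = 200_000

x = np.zeros(n)
for t in range(1, n):
    x[t] = (phi + sigma_b * rng.standard_normal()) * x[t - 1] + sigma_e * rng.standard_normal()

def kurtosis(z):
    z = z - z.mean()
    return (z ** 4).mean() / (z ** 2).mean() ** 2

print("sample kurtosis of RCA(1):", round(kurtosis(x), 2))   # noticeably above 3
print("Gaussian benchmark: 3.0")
```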

5.
Regression density estimation is the problem of flexibly estimating a response distribution as a function of covariates. An important approach to regression density estimation uses finite mixture models and our article considers flexible mixtures of heteroscedastic regression (MHR) models where the response distribution is a normal mixture, with the component means, variances, and mixture weights all varying as a function of covariates. Our article develops fast variational approximation (VA) methods for inference. Our motivation is that alternative computationally intensive Markov chain Monte Carlo (MCMC) methods for fitting mixture models are difficult to apply when it is desired to fit models repeatedly in exploratory analysis and model choice. Our article makes three contributions. First, a VA for MHR models is described where the variational lower bound is in closed form. Second, the basic approximation can be improved by using stochastic approximation (SA) methods to perturb the initial solution to attain higher accuracy. Third, the advantages of our approach for model choice and evaluation compared with MCMC-based approaches are illustrated. These advantages are particularly compelling for time series data where repeated refitting for one-step-ahead prediction in model choice and diagnostics and in rolling-window computations is very common. Supplementary materials for the article are available online.

6.
Bank efficiency estimates often serve as a proxy for managerial skill since they quantify sub-optimal production choices. But such deviations can also be due to omitted systematic differences among banks. In this study, we examine the effects of heterogeneity on bank efficiency scores. We compare different specifications of a stochastic cost and alternative profit frontier model with a baseline specification. After conducting a specification test, we discuss heterogeneity effects on efficiency levels, ranks and the tails of the efficiency distribution. We find that heterogeneity controls influence both banks’ optimal costs and profits and their ability to be efficient. Differences in efficiency scores matter for more than methodological reasons. First, different ways of accounting for heterogeneity result in estimates of foregone profits and additional costs that are significantly different from what we infer from our general specification. Second, banks are significantly re-ranked when their efficiency is estimated with a specification other than the preferred, general specification. Third, the general specification gives the most reliable estimates of the probability of distress, although the differences from the other specifications are small.

7.
For dynamic scheduling of multi-class systems where backorder cost is incurred per unit backordered regardless of the time needed to satisfy backordered demand, the following models are considered: the cost model to minimize the sum of expected average inventory holding and backorder costs, and the service model to minimize expected average inventory holding cost under an aggregate fill rate constraint. Use of an aggregate fill rate constraint in the service model instead of an individual fill rate constraint for each class is justified by deriving equivalence relations between the considered cost and service models. Based on numerical evidence that the optimal policy for the cost model is a base-stock policy with switching curves and fixed base-stock levels, an alternative service model is considered over the class of base-stock controlled dynamic scheduling policies to minimize the total inventory (base-stock) investment under an aggregate fill rate constraint. The policy that solves this alternative model is proposed as an approximation of the optimal policy of the original cost and the equivalent service models. Very accurate heuristics are devised to approximate the proposed policy for given base-stock levels. Comparison with base-stock controlled First Come First Served (FCFS) and Longest Queue (LQ) policies and an extension of the LQ policy (Δ policy) shows that the proposed policy performs much better in solving the service models under consideration, especially when the traffic intensity is high.

8.
In this paper we combine the ideas of the ‘power steady model’, the ‘discount factor’ and the ‘power prior’ for a general class of filtering models, more specifically within the class of dynamic generalized linear models (DGLM). We show an optimality property for our proposed method and present the particle filter algorithm for DGLM as an alternative to Markov chain Monte Carlo methods. We also present two applications: one on dynamic Poisson models for hurricane count data in the Atlantic Ocean and the other on a dynamic Poisson regression model for longitudinal count data.
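A bare-bones bootstrap particle filter for a dynamic Poisson count model conveys the flavor of the particle-filter alternative described here; the random-walk state equation, the priors, and the simulated data are illustrative assumptions rather than the authors' hurricane-count model.

```python
import numpy as np

rng = np.random.default_rng(7)

# Dynamic Poisson model:  y_t ~ Poisson(exp(theta_t)),  theta_t = theta_{t-1} + w_t.
T, sigma_w = 60, 0.15
theta_true = np.cumsum(sigma_w * rng.standard_normal(T)) + 1.0
y = rng.poisson(np.exp(theta_true))

# Bootstrap particle filter.
n_particles = 5000
particles = rng.normal(1.0, 0.5, n_particles)     # prior draws for theta_0
filtered_mean = np.empty(T)

for t in range(T):
    # Propagate particles through the random-walk state equation.
    particles = particles + sigma_w * rng.standard_normal(n_particles)
    # Weight by the Poisson likelihood of the observed count (log y! term cancels).
    log_w = y[t] * particles - np.exp(particles)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    filtered_mean[t] = np.sum(w * particles)
    # Multinomial resampling.
    particles = particles[rng.choice(n_particles, n_particles, p=w)]

print("RMSE of filtered state:", round(np.sqrt(np.mean((filtered_mean - theta_true) ** 2)), 3))
```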

9.
In this note we study a deterministic dynamic programming model with generalised discounting. We use a modified weighted-norm approach and an approximation technique to study the Bellman equation for unbounded return functions. Furthermore, we apply this theory to economic growth models.
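To make the Bellman-equation setting concrete, the snippet below runs plain value iteration for a textbook one-sector growth model with ordinary geometric discounting; it is only a baseline illustration and does not implement the generalized discounting or the unbounded-return analysis of the note.

```python
import numpy as np

# One-sector growth model: V(k) = max_{0 < k' < f(k)} { log(f(k) - k') + beta * V(k') }
# with f(k) = k^alpha (full depreciation).
alpha, beta = 0.36, 0.95
k_grid = np.linspace(0.05, 0.6, 300)
f = k_grid ** alpha

def greedy_values(V):
    # utility[i, j] = log(f(k_i) - k'_j); infeasible choices get -inf.
    consumption = f[:, None] - k_grid[None, :]
    utility = np.where(consumption > 0, np.log(np.maximum(consumption, 1e-12)), -np.inf)
    return utility + beta * V[None, :]

V = np.zeros_like(k_grid)
for _ in range(1000):
    V_new = np.max(greedy_values(V), axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

policy = k_grid[np.argmax(greedy_values(V), axis=1)]
k_star = (alpha * beta) ** (1 / (1 - alpha))
print("steady-state capital (analytic):", round(k_star, 3))
print("policy at that point (numeric): ", round(np.interp(k_star, k_grid, policy), 3))
```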

10.
To predict future claims, it is well known that the most recent claims are more predictive than older ones. However, classic panel data models for claim counts, such as the multivariate negative binomial distribution, do not put any time weight on past claims. More complex models can be used to capture this property, but they often require numerical procedures for parameter estimation. When a dependence between different claim count types is added, the task becomes even harder to handle. In this paper, we propose a bivariate dynamic model for claim counts, where past claims experience of a given claim type is used to better predict the other type of claims. This new bivariate dynamic distribution for claim counts is based on random effects that come from the Sarmanov family of multivariate distributions. To obtain a proper dynamic distribution based on this kind of bivariate prior, an approximation of the posterior distribution of the random effects is proposed. The resulting model can be seen as an extension of the dynamic heterogeneity model described in Bolancé et al. (2007). We apply this model to two samples of data from a major Canadian insurance company, where we show that the proposed model is one of the best models for fitting the data. We also show that the proposed model allows more flexibility in computing predictive premiums because closed-form expressions can be easily derived for the predictive distribution, the moments and the predictive moments.
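The closed-form predictive premiums the paper emphasizes can be illustrated with a much simpler univariate analogue, a Poisson count with a gamma random effect, where the predictive mean is available in closed form; the Sarmanov dependence between the two claim types, which is the paper's contribution, is not reproduced, and the numbers are invented.

```python
import numpy as np

# Univariate analogue of a closed-form predictive premium: claim counts
# N_t | Theta ~ Poisson(Theta * lam), Theta ~ Gamma(a, a) (mean 1).
# After t years with observed counts n_1..n_t, the posterior of Theta is
# Gamma(a + sum(n), a + t * lam), so the predictive mean is closed form.
a, lam = 1.5, 0.12                            # illustrative prior shape and a priori frequency
claims_history = np.array([0, 1, 0, 0, 2])    # hypothetical 5-year claim record
t = len(claims_history)

posterior_mean_theta = (a + claims_history.sum()) / (a + t * lam)
predictive_mean = lam * posterior_mean_theta  # expected claim count next year

print("a priori expected claims:   ", lam)
print("predictive expected claims: ", round(predictive_mean, 4))
```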

11.
The paper focuses on the similarity between modelling and knowledge representation, trying to bring together the OR/Systems Science and the Artificial Intelligence views of computer simulation of systems, especially of the discrete-event or network type. The models we consider are generalized activity networks with resources, including either models with a finite lifetime, such as project scheduling networks, or steady-state models, such as queueing networks. By enhancing the structure of entities and states and the logic of transitions within a model specification, modularity is improved and one may adopt a more declarative approach. Relational and rule-based representation formalisms are a convenient choice for that purpose. The use of knowledge bases for both the static (i.e. consultative) and the dynamic (i.e. experimental) study of the model then turns out to be more natural. Moreover, the task of building an expert system for decision support on system analysis or synthesis becomes easier. The paper reports some original work in the above directions, using a logic programming approach and an associated specification methodology based on general systems concepts.

12.
A crucial element in the development of econometric methodology during the past decade has been the concern with testing as opposed to estimating econometric models. In this paper we discuss, especially for the econometric analysis of time series, the main types of test procedures, and we also investigate the opportunities to uphold the Neyman-Pearson theory in the context of thorough model specification testing. In applied work it is quite usual to carry out several tests on the same set of sample data. We consider an extension of the Neyman-Pearson framework to the case of such repeated testing, and examine situations where the various hypotheses under test have a particular nesting structure. For the case where a sequence of superposed alternatives is tested by so-called marginal tests, we prove that the various test statistics are asymptotically independent under a common null hypothesis if the statistics are based on either the likelihood-ratio, the Wald, or the Lagrange-multiplier approach. Testing a particular null hypothesis against a series of juxtaposed alternatives appears to lead to independent test statistics only in specific circumstances. It is shown how independence of test statistics enables control of the overall Type I error probability, which is an essential element in the Neyman-Pearson theory. Using the notions of constructive hypotheses and auxiliary hypotheses, we can draw a clear distinction between specification tests and misspecification tests. Next an overview is given of approaches to and examples of specification and misspecification testing. With respect to the former, attention is paid to the problem of determining the order of dynamics and of discriminating between system dynamics and error dynamics. Misspecification testing is reviewed for specification error, nonconstancy of coefficients, heteroscedasticity, serial dependence, and nonnormality of disturbances. The problem of testing for several misspecifications jointly or sequentially is also considered. Finally we discuss the options and associated difficulties in implementing the various tests in an overall testing strategy.
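The practical payoff of asymptotically independent marginal test statistics is that the overall Type I error of a battery of tests can be computed and controlled directly; the per-test level below is an assumed example.

```python
# If k marginal tests are (asymptotically) independent under the common null,
# the overall size of the battery with per-test level alpha is 1 - (1 - alpha)^k,
# and a Sidak adjustment recovers a desired overall level.
k, alpha = 4, 0.05

overall_size = 1 - (1 - alpha) ** k
sidak_level = 1 - (1 - 0.05) ** (1 / k)   # per-test level giving overall size 0.05

print("overall size with alpha = 0.05 per test:", round(overall_size, 4))   # ~0.1855
print("Sidak per-test level for overall 0.05:  ", round(sidak_level, 4))    # ~0.0127
```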

13.
Simultaneous estimation in nonlinear multivariate regression contexts is a complex problem in inference. In this paper, we compare the methodology suggested in the literature for an unknown covariance matrix among response components, the methodology by Beauchamp and Cornell (B&C), with the standard nonlinear least squares approach (NLS). In the first part of the paper, we contrast B&C and the standard NLS, pointing out, from the theoretical point of view, how a model specification error could affect the estimation. A comprehensive simulation study is also performed to evaluate the effectiveness of B&C versus standard NLS under both correct and misspecified models. Several alternative models are considered to highlight the consequences of different types of specification error. An application to a real dataset within the context of quantitative marketing is presented.
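The standard NLS benchmark in the comparison can be illustrated on a single response component with SciPy's nonlinear least-squares fitter; the exponential-saturation model and the simulated data are invented, and the covariance-weighted simultaneous estimation of B&C is not reproduced.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)

# Hypothetical nonlinear response: sales(t) = a * (1 - exp(-b * t)) + noise.
def model(t, a, b):
    return a * (1.0 - np.exp(-b * t))

t = np.linspace(0.5, 10, 40)
y = model(t, 5.0, 0.7) + rng.normal(0, 0.2, t.size)

# Ordinary nonlinear least squares, one response equation at a time.
params, cov = curve_fit(model, t, y, p0=[1.0, 0.1])
print("NLS estimates (a, b):", np.round(params, 3))
print("asymptotic std errors:", np.round(np.sqrt(np.diag(cov)), 3))
```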

14.
In some multivariate time-series models a matrix power series is involved. Such models can be identified as rational models if the series corresponds to a matrix rational function. Moreover, it is necessary to answer some questions about minimality and uniqueness of representation. The main results of this paper fall within the sphere of matrix Padé approximation. On the basis of formal power series, matrix rational functions of arbitrary dimensions are characterized. Furthermore, we study certain minimality types, namely global minimum degrees and row minimum degrees. In addition, given that the rational representation of the function for the same pair of degrees need not be unique, we obtain conditions for studying the uniqueness of this representation and for finding a “canonical” unique representation. Moreover, we consider an application to special series associated with time-series models; these series lead to new theoretical results on matrix Padé approximation. Finally, we comment on some illustrative examples.

15.
ABSTRACT. During the restoration planning phase of the natural resource damage assessment (NRDA) process, potential injuries to natural resources and services are evaluated in terms of the nature, degree and extent of injury so that the need for and scale of restoration actions can be ascertained. Injuries are quantified by comparing the condition of the injured natural resource relative to baseline (pre‐injury) conditions. The “Type A” procedures are used to quantify damages from smaller spills and rely on a standardized methodology and computer model to calculate injury and value of damages. In this model, fishery stock changes from injuries and resulting changes in user participation are not treated as dynamic. If true stock growth and re‐growth are indeed dynamic, then the Type A model is likely underestimating fishery losses. The purpose of this paper is to illustrate the potential for such underestimation by comparing simulated stock and harvest losses under dynamic treatment and a static treatment that more closely represents the way stock and service losses are estimated under the current NRDA process.
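The potential for understatement can be illustrated with a toy logistic stock model: when the injured stock regrows dynamically, harvest losses accumulate over several seasons, whereas a static treatment books only the initial-period loss. The growth, harvest, and injury figures below are invented for illustration and are not the Type A model's parameters.

```python
import numpy as np

# Logistic stock dynamics with a proportional harvest rate:
#   S_{t+1} = S_t + r * S_t * (1 - S_t / K) - h * S_t
r, K, h = 0.3, 1000.0, 0.10
years = 15

def harvest_path(initial_stock):
    stock, harvests = initial_stock, []
    for _ in range(years):
        catch = h * stock
        stock = stock + r * stock * (1 - stock / K) - catch
        harvests.append(catch)
    return np.array(harvests)

baseline_stock = 650.0          # hypothetical pre-injury stock
injured_stock = 450.0           # hypothetical stock immediately after a spill

dynamic_loss = harvest_path(baseline_stock).sum() - harvest_path(injured_stock).sum()
static_loss = h * (baseline_stock - injured_stock)   # one-period loss only

print("static (single-period) harvest loss:", round(static_loss, 1))
print("dynamic harvest loss over 15 years: ", round(dynamic_loss, 1))
```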

16.
17.
We present a robust model for determining the optimal order quantity and market selection for short-life-cycle products in a single-period newsvendor setting. Because information about the demand distribution is limited, particularly for short-life-cycle products, stochastic modeling approaches may not be suitable. We propose the minimax regret multi-market newsvendor model, where the demands are only known to be bounded within some given interval. In the basic version of the problem, a linear-time solution method is developed. For the capacitated case, we establish some structural results to reduce the problem size, and then propose an approximate solution algorithm based on integer programming. Finally, we compare the performance of the proposed minimax regret model against the typical average-case and worst-case models. Our test results demonstrate that the proposed minimax regret model outperforms the average-case and worst-case models in terms of risk-related criteria and mean profit, respectively.
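For the single-market, uncapacitated case with demand known only to lie in an interval and no salvage value, the minimax-regret order quantity has a simple closed form, which the brute-force grid check below confirms; the price, cost, and demand bounds are illustrative, and the market-selection and capacitated logic of the paper is not reproduced.

```python
import numpy as np

# Single-market newsvendor with interval demand [lo, hi], unit cost c, price p,
# and no salvage.  The maximum regret of ordering q is
#   max{ (p - c) * (hi - q),  c * (q - lo) },
# and the minimax-regret order equates the two terms.
p, c = 10.0, 6.0
lo, hi = 80.0, 200.0

q_closed_form = ((p - c) * hi + c * lo) / p

# Brute-force check over a fine grid of order quantities.
q_grid = np.linspace(lo, hi, 2001)
max_regret = np.maximum((p - c) * (hi - q_grid), c * (q_grid - lo))
q_brute = q_grid[np.argmin(max_regret)]

print("closed-form minimax-regret order:", round(q_closed_form, 1))
print("brute-force minimizer:           ", round(q_brute, 1))
```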

18.
This study considers the specification and estimation of cost functions in public schools. A number of production characteristics are included in the specification to control for observable differences between municipalities in the provision of their school services. Both parametric and non-parametric approaches are used to take into account quality differences in school services. These approaches are then compared with an alternative output measure without any adjustment for quality differences. The sensitivity of different elasticities and returns-to-scale (RTS) measures with respect to alternative model specifications and quality adjustments is also analyzed. In the empirical section we examine the performance of 286 Swedish municipalities in the production of primary and secondary school education during the 1992/3–1994/5 school years.

19.
Modeling of time-varying systems based on fuzzy inference
A fuzzy-inference-based modeling method for time-varying systems is proposed. The basic idea is to partition the time axis so that, on each short interval, the time-varying model is replaced by a time-invariant one; combining these time-invariant models yields an overall nonlinear, time-varying differential equation model. Modeling procedures are developed for both input–output and state-space time-varying systems. Beyond a theoretical guarantee that the resulting model approximates the system, simulation experiments verify that models built with this method approximate nonlinear time-varying systems well.
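The partition-the-time-axis idea can be illustrated on a scalar linear time-varying system: freeze the coefficient at the start of each short interval (a piecewise time-invariant model) and compare against a finely integrated reference; the system, input, and interval length are invented, and the fuzzy-inference blending of the paper is not reproduced.

```python
import numpy as np

# Time-varying system  dx/dt = a(t) * x + u(t).  Approximate it by freezing
# a(t) at the start of each short interval (piecewise time-invariant model)
# and compare with a finely integrated reference trajectory.
a = lambda t: -1.0 - 0.5 * np.sin(0.8 * t)
u = lambda t: np.cos(t)

def integrate(dt, horizon, freeze_every=None):
    steps = int(horizon / dt)
    x, t = 1.0, 0.0
    a_frozen = a(0.0)
    traj = []
    for k in range(steps):
        if freeze_every is None:
            a_val = a(t)                       # fully time-varying reference
        else:
            if k % freeze_every == 0:
                a_frozen = a(t)                # refresh the frozen coefficient
            a_val = a_frozen
        x = x + dt * (a_val * x + u(t))        # explicit Euler step
        t += dt
        traj.append(x)
    return np.array(traj)

dt, horizon = 0.001, 10.0
reference = integrate(dt, horizon)
piecewise = integrate(dt, horizon, freeze_every=500)   # refresh a(t) every 0.5 time units

print("max abs deviation of piecewise model:", round(np.max(np.abs(piecewise - reference)), 4))
```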

20.
To date, models that determine batch plant configurations in the chemical process industry have resorted to formulations with binary variables to represent the different admissible options. This approach allows the problem to be represented in a simple way while considering a significant number of alternatives. Nevertheless, the non-convexity that arises when detailed models are used to represent the operation of the units involved prevents correct resolution or yields poor performance. This work presents a representation of the problem through a superstructure that takes all the alternatives explicitly into account without resorting to binary variables. With extremely simple modeling, it is possible to handle an appropriate number of options for this type of problem by means of a non-linear programming (NLP) model. Moreover, it is possible to consider duplication of production stages in series, an alternative that has not been used until now. The approach is illustrated on a fermentor network. The solution is reached with very modest computing time and without the aforementioned difficulties.
