Similar Articles
1.
Biorefineries can provide a product portfolio from renewable biomass similar to that of crude oil refineries. To operate biorefineries of any kind, however, the availability of biomass inputs is crucial and must be considered during planning. Here, we develop a planning approach that uses Geographic Information Systems (GIS) to account for spatially scattered biomass when optimizing a biorefinery’s location, capacity, and configuration. To deal with the challenges of a non-smooth objective function arising from the geographic data, higher dimensionality, and strict constraints, the planning problem is repeatedly decomposed by nesting an exact nonlinear program (NLP) inside an evolutionary strategy (ES) heuristic, which handles the spatial data from the GIS. We demonstrate the functionality of the algorithm and show how including spatial data improves the planning process by optimizing a synthesis gas biorefinery using this new planning approach.
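The nested decomposition described above can be sketched in a few dozen lines: an outer evolutionary strategy proposes candidate plant sites against scattered biomass points, and an inner smooth solve picks the best capacity for each site. Everything below (field locations, cost and margin coefficients, the profit model) is invented for illustration and is not the authors' model.

```python
# Toy sketch of the nested ES/NLP decomposition (not the paper's code).
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
biomass_xy = rng.uniform(0, 100, size=(200, 2))   # hypothetical field centroids (km)
biomass_t = rng.uniform(50, 500, size=200)        # tonnes available per field

def inner_nlp(site):
    """Given a site, solve the smooth subproblem: choose capacity to maximize profit."""
    dist = np.linalg.norm(biomass_xy - site, axis=1)
    order = np.argsort(dist)                      # procure nearest biomass first
    cum_t = np.cumsum(biomass_t[order])
    cum_cost = np.cumsum(biomass_t[order] * dist[order]) * 0.1  # $/t/km, assumed
    def neg_profit(cap):
        transport = np.interp(cap, cum_t, cum_cost)
        revenue = 120.0 * cap                     # assumed product margin, $/t
        capex = 1000.0 * cap ** 0.7               # assumed economies of scale
        return -(revenue - transport - capex)
    res = minimize_scalar(neg_profit, bounds=(1.0, cum_t[-1]), method="bounded")
    return -res.fun, res.x

def evolve(generations=60, mu=5, lam=20, sigma=10.0):
    """(mu + lambda) evolutionary strategy over the non-smooth location space."""
    pop = rng.uniform(0, 100, size=(mu, 2))
    for _ in range(generations):
        kids = pop[rng.integers(0, mu, lam)] + rng.normal(0, sigma, (lam, 2))
        both = np.vstack([pop, np.clip(kids, 0, 100)])
        fitness = np.array([inner_nlp(s)[0] for s in both])
        pop = both[np.argsort(fitness)[-mu:]]     # survivor selection
        sigma *= 0.97                             # slow step-size decay
    best = pop[-1]
    return best, inner_nlp(best)

site, (profit, capacity) = evolve()
print(f"site={site.round(1)}, capacity={capacity:.0f} t, profit={profit:,.0f}")
```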

2.
In machine learning problems, the availability of several classifiers trained on different data or features makes the combination of pattern classifiers of great interest. To combine distinct sources of information, it is necessary to represent the outputs of classifiers in a common space via a transformation called calibration. The most classical way is to use class membership probabilities. However, a single probability measure may be insufficient to model the uncertainty induced by the calibration step, especially when training data are scarce. In this paper, we extend classical probabilistic calibration methods to the evidential framework. Experimental results from the calibration of SVM classifiers demonstrate the benefit of using belief functions in classification problems.
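As a reference point, here is the classical probabilistic calibration step (Platt scaling) that the paper generalizes: SVM decision scores are mapped to class probabilities by a maximum-likelihood sigmoid fit. The evidential extension itself is not reproduced, and the scores below are synthetic.

```python
# Minimal Platt scaling of SVM decision scores into class probabilities --
# the classical calibration step that the paper extends to belief functions.
import numpy as np
from scipy.optimize import minimize

def platt_fit(scores, y):
    """Fit P(y=1|s) = 1/(1+exp(A*s+B)) by maximizing the Bernoulli likelihood."""
    def nll(params):
        a, b = params
        z = a * scores + b
        return np.sum(np.logaddexp(0.0, z) - (1 - y) * z)  # stable neg. log-lik
    return minimize(nll, x0=[-1.0, 0.0], method="Nelder-Mead").x

def platt_predict(scores, a, b):
    return 1.0 / (1.0 + np.exp(a * scores + b))

# Toy SVM-like scores: positives centred at +1, negatives at -1.
rng = np.random.default_rng(1)
y = rng.integers(0, 2, 200)
scores = rng.normal(2.0 * y - 1.0, 1.0)
a, b = platt_fit(scores, y)
p = platt_predict(scores, a, b)
# With few training points, a and b are themselves uncertain -- the paper's
# motivation for replacing the single probability p with a belief function.
```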

3.
Geospatial reasoning has been an essential aspect of military planning since the invention of cartography. Although maps have always been a focal point for developing situational awareness, the dawning era of network-centric operations brings the promise of unprecedented battlefield advantage through improved geospatial situational awareness. Geographic information systems (GIS) and GIS-based decision support systems are ubiquitous within current military forces, as well as civil and humanitarian organizations. Understanding the quality of geospatial data is essential to using it intelligently. A systematic approach to data quality requires: estimating and describing the quality of data as they are collected; recording the data quality as metadata; propagating uncertainty through models for data processing; exploiting uncertainty appropriately in decision support tools; and communicating to the user the uncertainty in the final product. The state of the practice in GIS applications falls short in dealing with uncertainty, and no single point solution can fully address the problem; a system-wide approach is necessary. Bayesian reasoning provides a principled and coherent framework for representing knowledge about data quality, drawing inferences from data of varying quality, and assessing the impact of data quality on modeled effects. Use of a Bayesian approach also drives a requirement for appropriate probabilistic information in geospatial data quality metadata. This paper describes our research on data quality for military applications of geospatial reasoning and presents model views appropriate for model builders, analysts, and end users.
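The simplest instance of the Bayesian machinery described above is precision-weighted fusion of two position reports whose accuracies come from data-quality metadata. A minimal sketch with invented coordinates and error standard deviations:

```python
# Fusing two geospatial position reports of different quality (hypothetical
# numbers): a conjugate Gaussian update, the simplest case of treating
# data-quality metadata probabilistically.
import numpy as np

# Each source reports (easting, northing) with a circular error std dev (m)
# that would live in its data-quality metadata.
reports = [
    (np.array([3512.0, 8904.0]), 50.0),   # older survey, poor accuracy
    (np.array([3488.0, 8931.0]), 10.0),   # recent GPS fix, good accuracy
]

precision = sum(1.0 / s**2 for _, s in reports)
posterior_mean = sum(x / s**2 for x, s in reports) / precision
posterior_std = precision ** -0.5

print(posterior_mean)   # pulled strongly toward the high-quality report
print(posterior_std)    # ~9.8 m: the fused estimate beats either source alone
```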

4.
The location set covering problem continues to be an important and challenging spatial optimization problem. The range of practical planning applications underscores its importance, spanning fire station siting, warning siren positioning, security monitoring and nature reserve design, to name but a few. It is challenging on a number of fronts. First, it can be difficult to solve for medium- to large-sized problem instances, which are often encountered in combination with geographic information systems (GIS) based analysis. Second, the need to cover a region efficiently often brings about complications associated with the abstraction of geographic space: representation as points can leave significant gaps in actual coverage, whereas representation as polygons can substantially overestimate the number of facilities needed. Computational complexity and sensitivity to spatial abstraction combine to make advances in solving this problem much needed. To this end, a solution framework for ensuring complete coverage of a region with a minimum number of facilities is proposed that eliminates potential error. Applications to emergency warning siren and fire station siting demonstrate the effectiveness of the developed approach, which can be applied to convex, non-convex and non-contiguous regions and is unaffected by arbitrary initial spatial representations of space.
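For context, the classical point-based location set covering integer program, the baseline whose representation errors the paper addresses, can be stated and solved directly. Coordinates, candidate sites, and the service radius below are made up.

```python
# Classical point-representation LSCP: minimize the number of facilities so
# that every demand point lies within the service radius of some open site.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

rng = np.random.default_rng(2)
demand = rng.uniform(0, 100, (60, 2))         # demand points (made up)
sites = rng.uniform(0, 100, (25, 2))          # candidate facility sites
radius = 25.0                                 # assumed service standard

d = np.linalg.norm(demand[:, None, :] - sites[None, :, :], axis=2)
cover = (d <= radius).astype(float)           # cover[i, j]: site j covers point i

# minimize sum(x_j)  s.t.  cover @ x >= 1 for every demand point, x binary
res = milp(c=np.ones(len(sites)),
           constraints=LinearConstraint(cover, lb=1, ub=np.inf),
           integrality=np.ones(len(sites)),
           bounds=Bounds(0, 1))
if res.success:
    chosen = np.flatnonzero(res.x > 0.5)
    print(f"{len(chosen)} facilities cover all {len(demand)} demand points")
```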

5.
Underwriting the risk of rare disorders in long-term insurance often relies on rates of onset estimated from quite small epidemiological studies. These estimates can carry considerable sampling uncertainty, and any function based upon them, such as a premium rate, is itself an estimate subject to uncertainty. This is particularly relevant for genetic disorders, because the acceptable use of genetic information may depend on establishing its reliability as a measure of risk. The sampling distribution of a premium rate is hard to estimate without access to the original data, which is rarely possible. From two studies of adult polycystic kidney disease (APKD) we obtain, not the original data, but the cases and exposures used for Kaplan-Meier estimates of the survival probability. We apply three resampling methods to these data: (a) the standard bootstrap; (b) the weird bootstrap; and (c) simulation of censored random lifetimes. Rates of onset were obtained from each simulated sample using kernel-smoothed Nelson-Aalen estimates, and hence critical illness insurance premium rates for a mutation carrier or a member of an affected family. From 10,000 such samples we estimate the sampling distributions of the premium rates, finding considerable uncertainty. Very careful consideration should be given before using small-sample epidemiological data to address insurance problems.
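Method (a), the standard bootstrap, is easy to illustrate: resample censored (time, event) pairs, recompute a kernel-smoothed Nelson-Aalen hazard on each resample, and read off pointwise sampling bands. The Weibull onset ages and censoring ages below are synthetic stand-ins for the APKD data.

```python
# Standard bootstrap of a kernel-smoothed Nelson-Aalen onset rate.
import numpy as np

rng = np.random.default_rng(3)
n = 120
onset = rng.weibull(2.5, n) * 60              # hypothetical onset ages
censor = rng.uniform(20, 80, n)               # hypothetical censoring ages
time = np.minimum(onset, censor)
event = (onset <= censor).astype(float)

def smoothed_hazard(time, event, grid, bw=5.0):
    """Kernel-smoothed Nelson-Aalen hazard estimate on `grid`."""
    order = np.argsort(time)
    t, e = time[order], event[order]
    m = len(t)
    jumps = e / (m - np.arange(m))            # Nelson-Aalen increments d_i/n_i
    k = np.exp(-0.5 * ((grid[:, None] - t[None, :]) / bw) ** 2)
    return (k * jumps).sum(axis=1) / (bw * np.sqrt(2 * np.pi))

grid = np.linspace(20, 70, 51)
boot = np.empty((2000, grid.size))
for b in range(2000):
    idx = rng.integers(0, n, n)               # resample (time, event) pairs
    boot[b] = smoothed_hazard(time[idx], event[idx], grid)

lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)  # pointwise sampling band
```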

6.
The single term ‘fractals’ covers two rather different meanings, and it is convenient to split it into physical (or empirical) fractals and mathematical ones. The former term is used when one considers real-world or numerically simulated objects exhibiting a particular kind of scaling, the so-called fractal behaviour, in a bounded range of scales between upper and lower cutoffs. The latter term means sets having non-integer fractal dimensions. Mathematical fractals are often used as models for physical fractal objects. Scaling of mathematical fractals is considered using the Barenblatt–Borodich approach, which refers physical quantities to a unit of the fractal measure of the set. To give a rigorous treatment of the notion of fractal measure and to develop the approach, the concepts of upper and lower box-counting quasi-measures are presented and their scaling properties are studied. As examples of possible applications of the approach, scaling properties of the problems of fractal cracking and of adsorption of various substances to fractal rough surfaces are discussed.
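The finite-range scaling mentioned above is what a box-counting experiment measures in practice: count occupied boxes N(δ) between the cutoffs and regress log N(δ) on log(1/δ). A sketch on a simulated set whose dimension is known (log 3 / log 2 ≈ 1.585):

```python
# Box-counting dimension of a simulated "physical fractal" between cutoffs:
# the finite-range counterpart of the upper/lower box-counting quantities.
import numpy as np

rng = np.random.default_rng(4)
# Chaos game for the Sierpinski triangle (dimension log2(3) ~ 1.585).
verts = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])
pts = np.zeros((100_000, 2))
p = np.array([0.1, 0.1])
for i in range(len(pts)):
    p = (p + verts[rng.integers(0, 3)]) / 2.0
    pts[i] = p

sizes = 2.0 ** -np.arange(2, 9)               # box sizes between the cutoffs
counts = [len(np.unique(np.floor(pts / s), axis=0)) for s in sizes]
slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
print(f"estimated box-counting dimension ~ {slope:.3f}  (log2(3) = 1.585)")
```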

7.
Spatial operations research problems seek best locations, often points of minimum aggregate weighted distance, requiring georeferenced data as input. Frequently, maps of such data are incomplete, with holes in their geographic distributions. Spatial statistical procedures are available to complete these data sets with best estimates of the missing values. The impacts of such imputations on 2-median facility location–allocation solutions are explored. The sampling distributions of the spatial mean and standard distance of these medians are studied. Population density is used as the weight attribute in determining location–allocation solutions because it can be accurately described with a relatively simple spatial statistical model.
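A minimal end-to-end sketch of the pipeline just described: fill holes in the weight surface with a naive nearest-neighbour average (the paper uses a proper spatial-statistical model), then solve a weighted 2-median by alternating allocation and Weiszfeld relocation. All data are synthetic.

```python
# Impute missing weights, then solve a weighted 2-median heuristically.
import numpy as np

rng = np.random.default_rng(5)
xy = rng.uniform(0, 50, (300, 2))
w = rng.gamma(2.0, 50.0, 300)                 # population-density weights
w[rng.random(300) < 0.2] = np.nan             # holes in the map

obs = ~np.isnan(w)
for i in np.flatnonzero(~obs):
    near = np.argsort(np.linalg.norm(xy[obs] - xy[i], axis=1))[:8]
    w[i] = w[obs][near].mean()                # naive 8-neighbour imputation

def two_median(xy, w, iters=100):
    c = xy[rng.choice(len(xy), 2, replace=False)].copy()
    for _ in range(iters):
        d = np.linalg.norm(xy[:, None] - c[None], axis=2)
        a = d.argmin(axis=1)                  # allocation step
        for j in (0, 1):                      # weighted Weiszfeld relocation
            m = a == j
            if m.any():
                dj = np.maximum(d[m, j], 1e-9)
                c[j] = (w[m][:, None] * xy[m] / dj[:, None]).sum(0) / (w[m] / dj).sum()
    return c, (w * d[np.arange(len(xy)), a]).sum()

centers, cost = two_median(xy, w)
print(centers.round(2), round(cost, 1))
```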

8.
Under the framework of sublinear expectation, we introduce a new type of G-Gaussian random fields, which contains a type of spatial white noise as a special case. Based on this result, we also introduce a spatial-temporal G-white noise. Unlike the linear-expectation case, in which the probability measure is known, under uncertainty about the probability measure spatial white noise is intrinsically different from its temporal counterpart.

9.
This paper proposes an approach for the robust averaged control of random vibrations for the Bernoulli–Euler beam equation under uncertainty in the flexural stiffness and in the initial conditions. The problem is formulated in the framework of optimal control theory and provides a functional setting general enough to include different types of random variables and second-order random fields as sources of uncertainty. The second-order statistical moment of the random system response at the control time is incorporated in the cost functional as a measure of robustness. The numerical resolution method combines a classical descent method with an adaptive anisotropic stochastic collocation method for the numerical approximation of the statistics of interest. The direct and adjoint stochastic systems are uncoupled, which makes it possible to exploit parallel computing architectures when solving the set of deterministic problems that arise from the stochastic collocation method. As a result, problems with a relatively large number of random variables can be solved at reasonable computational cost. Two numerical experiments illustrate both the performance of the proposed method and the significant differences that may occur when uncertainty is incorporated in this type of control problem.
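The uncoupling that makes the method parallelizable can be seen in a toy version: the robust objective is a mean-plus-variance combination of independent deterministic evaluations at Gauss–Hermite collocation nodes, fed to a descent method. The scalar `response` below is an invented stand-in for the beam solver.

```python
# Skeleton of the mean+variance robust objective via stochastic collocation:
# each Gauss-Hermite node is an independent deterministic solve.
import numpy as np
from scipy.optimize import minimize

nodes, weights = np.polynomial.hermite_e.hermegauss(9)
weights = weights / weights.sum()             # probabilists' weights -> E[.]

def response(u, xi):
    """Toy 'system response at the control time' for control u, random input xi."""
    stiffness = 1.0 + 0.2 * xi                # uncertain flexural stiffness
    return (u - 1.0) ** 2 / stiffness + 0.05 * u ** 2

def robust_cost(u, alpha=1.0):
    vals = np.array([response(u[0], xi) for xi in nodes])
    mean = weights @ vals                     # each node solvable in parallel
    var = weights @ (vals - mean) ** 2        # second statistical moment term
    return mean + alpha * var

res = minimize(robust_cost, x0=[0.0], method="BFGS")
print(res.x, robust_cost(res.x))
```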

10.
This paper develops a framework for examining the effect of demand uncertainty and forecast error on unit costs and customer service levels in the supply chain, including Material Requirements Planning (MRP) type manufacturing systems. The aim is to overcome the methodological limitations and confusion that have arisen in much earlier research. To illustrate the issues, the problem of estimating the value of improving forecasting accuracy for a manufacturer was simulated. The topic is of practical importance because manufacturers spend large sums of money on purchasing and staffing forecasting support systems to achieve more accurate forecasts. To estimate this value, a two-level MRP system with lot sizing, in which the product is manufactured for stock, was simulated. Final product demand was generated by two commonly occurring stochastic processes and with different variances. Different levels of forecasting error were then introduced to arrive at corresponding values for improving forecasting accuracy. The quantitative estimates of improved accuracy were found to depend on both the demand-generating process and the forecasting method. Within this more complete framework, the substantive results confirm earlier research that the lot sizing rules that are best in the deterministic situation are the worst whenever there is uncertainty in demand. However, size matters, both in demand uncertainty and in forecast error. The quantitative differences depend on the service level and on the form of demand uncertainty. Unit costs for a given service level increase exponentially as the uncertainty in the demand data increases. The paper also estimates the effects of mis-specifying different sizes of forecast error in addition to demand uncertainty. In manufacturing problems with high demand uncertainty and high forecast error, improved forecast accuracy should lead to substantial percentage improvements in unit costs. Methodologically, the results demonstrate the need to simulate demand uncertainty and the forecasting process separately.
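The paper's methodological point, that demand generation and forecast error must be simulated as separate processes, can be illustrated with a stripped-down make-to-stock simulation (single level rather than the paper's two-level MRP, i.i.d. demand rather than its two richer processes, and invented cost rates):

```python
# Demand and forecast error generated as separate processes; unit cost and
# service level measured for each forecast-error level. All rates invented.
import numpy as np

rng = np.random.default_rng(6)

def simulate(forecast_sd, demand_sd=20.0, periods=5000, lead=2):
    d = np.maximum(0, 100 + rng.normal(0, demand_sd, periods)).round()
    f = np.maximum(0, d + rng.normal(0, forecast_sd, periods))  # unbiased forecast
    inv, hold, short, lost = 120.0, 0.0, 0.0, 0.0
    pipeline = [0.0] * lead
    for t in range(periods - lead):
        inv += pipeline.pop(0)                # receive the order placed `lead` ago
        target = f[t:t + lead].sum() + 50     # order-up-to: forecast + safety stock
        pipeline.append(max(0.0, target - inv - sum(pipeline)))
        inv -= d[t]
        hold += max(inv, 0.0)
        short += max(-inv, 0.0)
        lost += max(-inv, 0.0)                # lost-sales variant
        inv = max(inv, 0.0)
    total = d[: periods - lead].sum()
    service = 1.0 - lost / total
    unit_cost = (0.5 * hold + 8.0 * short) / total  # holding vs shortage rates
    return service, unit_cost

for sd in (0, 10, 30, 60):                    # increasing forecast error
    print(sd, simulate(forecast_sd=sd))
```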

11.
Traditional approaches to capital budgeting are based on the premise that probability theory is necessary and sufficient to deal with the uncertainty and imprecision which underlie the estimates of required parameters. This paper argues that, in many circumstances, this premise is invalid since the principal sources of uncertainty are often non-random in nature and relate to the fuzziness rather than the frequency of data. To capture and quantify correctly the underlying uncertainty present in non-statistical situations, this paper suggests two alternative representations: interval analysis and possibility distributions. The use of these representations in economic analysis is discussed, and their application is illustrated through numerical examples.
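Interval analysis, the first representation suggested above, is easily demonstrated on a capital budgeting staple: an NPV whose cash flows and discount rate are intervals yields a guaranteed enclosure rather than a probability distribution. The figures are invented, and the bound logic assumes all post-investment cash flows are positive.

```python
# Interval-arithmetic NPV: each parameter is an interval, and the result is
# a guaranteed enclosure of the project's NPV -- no probabilities required.

def npv_interval(cash_lo, cash_hi, r_lo, r_hi):
    """NPV bounds when cash flows at t >= 1 are positive, so NPV is
    decreasing in the discount rate and increasing in each cash flow."""
    lo = sum(c / (1 + r_hi) ** t for t, c in enumerate(cash_lo))
    hi = sum(c / (1 + r_lo) ** t for t, c in enumerate(cash_hi))
    return lo, hi

cash_lo = [-1000, 180, 220, 260, 300, 320]    # pessimistic cash flows
cash_hi = [-1000, 220, 280, 340, 380, 400]    # optimistic cash flows
print(npv_interval(cash_lo, cash_hi, r_lo=0.08, r_hi=0.14))
```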

12.
Conventional open pit mine optimization models for designing mining phases and the ultimate pit limit do not consider expected variations and uncertainty in the metal content available in a mineral deposit (supply) and commodity prices (market demand). Unlike the conventional approach, a stochastic framework relies on multiple realizations of the input data so as to account for uncertainty in metal content and financial parameters, reflecting potential supply and demand. This paper presents a new method that jointly considers uncertainty in metal content and commodity prices, and incorporates time-dependent discounted values of mining blocks when designing optimal production phases and the ultimate pit limit, while honouring production capacity constraints. The structure of a graph representing the stochastic framework is proposed, and it is solved with a parametric maximum flow algorithm. Lagrangian relaxation and the subgradient method are integrated in the proposed approach to facilitate the production of practical designs. An application at a copper deposit in Canada demonstrates the practical aspects of the approach and the quality of its solutions relative to conventional methods, as well as the effectiveness of the proposed stochastic approach in solving mine planning and design problems.
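The deterministic kernel that the stochastic graph generalizes is the classical maximum-closure formulation of the ultimate pit, solvable as a min-cut (Picard's reduction). A toy 2-D version with invented block values and a simple 1:1 slope precedence:

```python
# Ultimate pit as a maximum closure of the block-value graph via min-cut.
# The paper's stochastic, parametric-flow graph generalizes this kernel.
import networkx as nx
import numpy as np

rng = np.random.default_rng(7)
rows, cols = 6, 12
value = rng.normal(-1.0, 3.0, (rows, cols))   # economic block values (invented)

G = nx.DiGraph()
INF = 1e9
for i in range(rows):
    for j in range(cols):
        b = (i, j)
        if value[i, j] > 0:
            G.add_edge("s", b, capacity=value[i, j])
        else:
            G.add_edge(b, "t", capacity=-value[i, j])
        # precedence: mining (i, j) requires the three blocks above it
        if i > 0:
            for dj in (-1, 0, 1):
                if 0 <= j + dj < cols:
                    G.add_edge(b, (i - 1, j + dj), capacity=INF)

cut_value, (S, T) = nx.minimum_cut(G, "s", "t")
pit = [b for b in S if b != "s"]              # source side = maximum closure
profit = sum(value[b] for b in pit)
print(f"{len(pit)} blocks mined, undiscounted value {profit:.1f}")
```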

13.
Geographic information systems (GIS) organize spatial data in multiple two-dimensional arrays called layers. In many applications, a response of interest is observed on a set of sites in the landscape, and it is of interest to build a regression model from the GIS layers to predict the response at unsampled sites. Model selection in this context then consists not only of selecting appropriate layers, but also of choosing appropriate neighborhoods within those layers. We formalize this problem as a linear model and propose the use of the Lasso to simultaneously select variables, choose neighborhoods, and estimate parameters. Spatially dependent errors are accounted for using generalized least squares, and spatial smoothness in the selected coefficients is incorporated through the use of an a priori spatial covariance structure. This leads to a modification of the Lasso procedure, called the spatial Lasso. The spatial Lasso can be implemented by a fast algorithm and performs well in numerical examples, including an application to the prediction of soil moisture. The methodology is also extended to generalized linear models. Supplemental materials, including R computer code and the data analyzed in this article, are available online.
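The GLS ingredient of the spatial Lasso can be sketched as whiten-then-Lasso: decorrelate the model with the Cholesky factor of an assumed spatial covariance and run an ordinary Lasso on the whitened system. The smoothness prior over neighborhood coefficients is omitted, the covariance is assumed exponential, and the data are simulated (the article's own supplemental code is in R).

```python
# GLS + Lasso: whiten with the Cholesky factor of an assumed exponential
# spatial covariance, then apply an ordinary Lasso.
import numpy as np
from scipy.linalg import cholesky, solve_triangular
from sklearn.linear_model import Lasso

rng = np.random.default_rng(8)
n, p = 150, 30
sites = rng.uniform(0, 10, (n, 2))
X = rng.normal(size=(n, p))                   # candidate layer/neighborhood vars
beta = np.zeros(p)
beta[[2, 7, 11]] = [1.5, -2.0, 1.0]           # true active variables

D = np.linalg.norm(sites[:, None] - sites[None], axis=2)
Sigma = np.exp(-D / 2.0)                      # assumed spatial error covariance
L = cholesky(Sigma, lower=True)
y = X @ beta + L @ rng.normal(size=n)         # spatially dependent errors

Xw = solve_triangular(L, X, lower=True)       # whitening: L^{-1} X, L^{-1} y
yw = solve_triangular(L, y, lower=True)
fit = Lasso(alpha=0.05).fit(Xw, yw)
print(np.flatnonzero(fit.coef_))              # typically recovers 2, 7, 11
```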

14.
In this paper, we extend the multi-period mean–variance optimization framework to worst-case design with multiple rival return and risk scenarios. Our approach involves a min–max algorithm and a multi-period mean–variance optimization framework for the stochastic aspects of the scenario tree. Multi-period portfolio optimization entails the construction of a scenario tree representing a discretised estimate of uncertainties and associated probabilities in future stages. The expected value of the portfolio return is maximized simultaneously with the minimization of its variance. There are two sources of further uncertainty that might require a strengthening of the robustness of the decision. The first is that some rival uncertainty scenarios may be too critical to consider in terms of probabilities. The second is that the return variance estimate is usually inaccurate and there are different rival estimates, or scenarios. In either case, the best decision has the additional property that, in terms of risk and return, performance is guaranteed in view of all the rival scenarios. The ex-ante performance of min–max models is tested using historical data and backtesting results are presented.
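The worst-case logic reduces to an epigraph program: minimize t subject to each rival scenario's mean–variance objective lying below t. A single-period sketch with three invented (mu, Sigma) scenarios and a risk-aversion weight lam (the paper treats the full multi-period scenario tree):

```python
# Epigraph form of the min-max portfolio problem over rival scenarios.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(9)
n, lam = 5, 4.0
scenarios = []
for _ in range(3):                            # rival return/risk estimates
    mu = rng.normal(0.06, 0.02, n)
    A = rng.normal(size=(n, n)) * 0.1
    scenarios.append((mu, A @ A.T + 0.02 * np.eye(n)))

def obj(z):                                   # z = (w_1..w_n, t): minimize t
    return z[-1]

cons = [{"type": "eq", "fun": lambda z: z[:-1].sum() - 1.0}]
for mu, Sig in scenarios:                     # every scenario's objective <= t
    cons.append({"type": "ineq",
                 "fun": lambda z, mu=mu, Sig=Sig:
                     z[-1] - (lam * z[:-1] @ Sig @ z[:-1] - mu @ z[:-1])})

z0 = np.append(np.full(n, 1.0 / n), 1.0)
bounds = [(0, 1)] * n + [(None, None)]        # long-only weights, free t
res = minimize(obj, z0, method="SLSQP", bounds=bounds, constraints=cons)
print(res.x[:-1].round(3), res.x[-1])         # weights, worst-case objective
```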

15.
Avoiding concentration or saturation of activities is fundamental in many environmental and urban planning contexts. Examples include dispersing retail and restaurant outlets, sensitivity to impacts in forest utilization, spatial equity of waste disposal, ensuring public safety associated with noxious facilities, and strategic placement of military resources, among others. Dispersion models have been widely applied to ensure spatial separation between activities or facilities. However, existing approaches are deterministic and ignore spatial data uncertainty, which can lead to poor decision making. To address data uncertainty in dispersion modelling, a multi-objective approach that explicitly accounts for spatial uncertainty is proposed, enabling the impacts of uncertainty to be evaluated with statistical confidence. Owing to the integration of spatial uncertainty, this dispersion model is more complex and computationally challenging to solve. In this paper we develop a multi-objective evolutionary algorithm to address the computational challenges posed. The proposed heuristic incorporates problem-specific spatial knowledge to significantly enhance the capability of the evolutionary algorithm for solving this problem. Empirical results demonstrate the performance superiority of the developed approach in supporting facility and service planning.
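At the core of such a model is a chance-constrained separation test. The sketch below estimates by Monte Carlo the probability that two sites remain at least r apart under positional error, then runs a greedy anti-cover pass using it; the paper embeds evaluations of this kind in a multi-objective evolutionary search, and the error level and thresholds here are assumed.

```python
# Monte Carlo separation test under positional uncertainty, used by a
# greedy anti-cover selection (a stand-in for the paper's evolutionary search).
import numpy as np

rng = np.random.default_rng(10)
xy = rng.uniform(0, 100, (40, 2))             # nominal candidate locations
sd = 3.0                                      # positional error std dev (assumed)
r, alpha, draws = 20.0, 0.95, 2000            # separation, confidence, samples

def p_separated(a, b):
    pa = a + rng.normal(0, sd, (draws, 2))
    pb = b + rng.normal(0, sd, (draws, 2))
    return (np.linalg.norm(pa - pb, axis=1) >= r).mean()

selected = []
for i in rng.permutation(len(xy)):            # greedy pass in random order
    if all(p_separated(xy[i], xy[j]) >= alpha for j in selected):
        selected.append(i)
print(f"{len(selected)} sites selected with P(separation >= {r}) >= {alpha}")
```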

16.
Benefit-cost analysis is required by law and regulation throughout the federal government. Robert Dorfman (1996) declares: ‘Three prominent shortcomings of benefit-cost analysis as currently practiced are (1) it does not identify the population segments that the proposed measure benefits or harms; (2) it attempts to reduce all comparisons to a single dimension, generally dollars and cents; and (3) it conceals the degree of inaccuracy or uncertainty in its estimates.’ This paper develops an approach for conducting benefit-cost analysis, derived from data envelopment analysis (DEA), that overcomes each of Dorfman's objections. The models and methodology proposed give decision makers a tool for evaluating alternative policies and projects where there are multiple constituencies who may have conflicting perspectives. The method incorporates multiple incommensurate attributes while allowing for measures of uncertainty. An application is used to illustrate the method. This work was funded by grant N00014-99-1-0719 from the Office of Naval Research.

17.
Applications of traditional data envelopment analysis (DEA) models require knowledge of crisp input and output data. However, real-world problems often deal with imprecise or ambiguous data. In this paper, the problem of uncertainty in the equality constraints is analyzed, and, using the equivalent form of the CCR model, a suitable robust DEA model is derived for analyzing the efficiency of decision-making units (DMUs) under uncertainty in both the input and output spaces. The new model is based on the robust optimization approach. Using the proposed model, it is possible to evaluate the efficiency of DMUs in the presence of uncertainty in fewer steps than other models require. In addition, using the new robust DEA model and the envelopment form of the CCR model, two linear robust super-efficiency models for the complete ranking of DMUs are proposed. Two case studies from different contexts are taken as numerical examples in order to compare the proposed model with other approaches. The examples also illustrate various possible applications of the new models.
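The crisp starting point is the input-oriented CCR envelopment LP, solved once per DMU; the robust model replaces the data with uncertainty sets. A sketch with invented data (8 DMUs, 2 inputs, 1 unit output):

```python
# Crisp input-oriented CCR envelopment LP, one solve per DMU:
#   min theta  s.t.  X @ lam <= theta * x_o,  Y @ lam >= y_o,  lam >= 0
import numpy as np
from scipy.optimize import linprog

X = np.array([[4, 3], [7, 3], [8, 1], [4, 2], [2, 4],
              [10, 1], [3, 7], [6, 6]], dtype=float).T   # 2 inputs x 8 DMUs
Y = np.ones((1, 8))                                      # single unit output
m, n = X.shape
s = Y.shape[0]

def ccr_efficiency(o):
    # variables z = (theta, lambda_1..lambda_n); minimize theta
    c = np.r_[1.0, np.zeros(n)]
    A_in = np.hstack([-X[:, [o]], X])          # X @ lam - theta * x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])  # -Y @ lam <= -y_o
    A = np.vstack([A_in, A_out])
    b = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A, b_ub=b,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun                             # theta* in (0, 1]

for o in range(n):
    print(o, round(ccr_efficiency(o), 3))
```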

18.
A regional modeling framework using national data series is developed to estimate the net cost of land-applying manure under possible policy provisions to limit water- and air-quality emissions. The modeling framework, applied to the Chesapeake Bay watershed, integrates GIS-based spatial data within an optimization model to capture spatial effects at a subwatershed scale.

19.
Estimating the probability of extreme temperature events is difficult because of limited records across time and the need to extrapolate the distributions of these events, as opposed to just the mean, to locations where observations are not available. A related issue is the need to characterize the uncertainty in the estimated probability of extreme events at different locations. Although the tools for statistical modeling of univariate extremes are well developed, extending these tools to model spatial extreme data is an active area of research. In this paper, in order to make inference about spatial extreme events, we introduce a new nonparametric model for extremes. We present a Dirichlet-based copula model that is a flexible alternative to parametric copula models such as the normal and t-copula. The proposed modeling approach is fitted within a Bayesian framework that allows us to take into account different sources of uncertainty in the data and models. We apply our methods to annual maximum temperature values in the east south-central United States.
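The single-site parametric baseline that the Dirichlet-copula model relaxes is a GEV fit to annual maxima, with a return level and bootstrap uncertainty; the maxima below are synthetic stand-ins for the temperature records.

```python
# Single-site GEV baseline: fit annual maxima, read off a 100-year return
# level, and bootstrap its sampling uncertainty. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
annual_max = stats.genextreme.rvs(c=-0.1, loc=37.0, scale=1.8,
                                  size=60, random_state=rng)  # deg C

shape, loc, scale = stats.genextreme.fit(annual_max)
ret_100 = stats.genextreme.ppf(1 - 1 / 100, shape, loc, scale)
print(f"100-year return level ~ {ret_100:.1f} deg C")

# Parametric bootstrap for the return-level sampling distribution
levels = []
for _ in range(200):
    resample = stats.genextreme.rvs(shape, loc, scale, size=60, random_state=rng)
    c_b, l_b, s_b = stats.genextreme.fit(resample)
    levels.append(stats.genextreme.ppf(1 - 1 / 100, c_b, l_b, s_b))
print(np.percentile(levels, [2.5, 97.5]).round(1))
```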
