Similar Documents
15 similar documents found.
1.
Summary  A new approach for multiplicity control (Optimal Subset) is presented. It is based on selecting the best subset of partial (univariate) hypotheses, namely the one producing the minimal p-value. In this work, we show how to perform this new procedure in the permutation framework, choosing suitable combining functions and permutation strategies. The Optimal Subset approach can be very useful in exploratory studies because it performs a weak control of multiplicity, which can be a valid alternative to the False Discovery Rate (FDR). A comparative simulation study and an application to real neuroimaging data show that it is particularly useful in the presence of a large number of hypotheses. We also show how stepwise regression may be viewed as a special case of Optimal Subset procedures, and how to adjust the p-value of the selected model to account for the multiplicity arising from the different models that a stepwise regression could have selected.
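A minimal sketch of the subset-selection idea, under simplifying assumptions (two-sample mean comparisons, a Fisher-type product as the combining function, exhaustive search over small subsets); the function names and data are illustrative, not taken from the paper, and the selected subset's score would itself need a further permutation step to become a selection-adjusted p-value.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

def perm_pvalues(x, y, n_perm=999):
    """Permutation p-values for each univariate hypothesis (difference in means)."""
    obs = np.abs(x.mean(axis=0) - y.mean(axis=0))
    pooled = np.vstack([x, y])
    n_x = x.shape[0]
    count = np.ones_like(obs)          # +1 accounts for the observed statistic itself
    for _ in range(n_perm):
        idx = rng.permutation(pooled.shape[0])
        xp, yp = pooled[idx[:n_x]], pooled[idx[n_x:]]
        count += np.abs(xp.mean(axis=0) - yp.mean(axis=0)) >= obs
    return count / (n_perm + 1)

def fisher_combine(p):
    """Fisher-type combining function: equals the product of the p-values (smaller = stronger)."""
    return float(np.exp(-0.5 * (-2 * np.log(p)).sum()))  # monotone proxy, not a calibrated p-value

def best_subset(pvals, max_size=3):
    """Exhaustively search small subsets for the one with the smallest combined score."""
    best = None
    for k in range(1, max_size + 1):
        for subset in combinations(range(len(pvals)), k):
            score = fisher_combine(pvals[list(subset)])
            if best is None or score < best[1]:
                best = (subset, score)
    return best

x = rng.normal(0.0, 1, size=(20, 6))
y = rng.normal(0.4, 1, size=(25, 6))     # shift on all six variables
p = perm_pvalues(x, y)
print("partial p-values:", np.round(p, 3))
print("selected subset and combined score:", best_subset(p))
```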

2.
This paper uses a fully nonparametric approach to estimate efficiency measures for primary care units while incorporating the effect of (exogenous) environmental factors. The methodology accounts for different types of variables (continuous and discrete) describing the main characteristics of the patients served by those providers. In addition, we use an extension of this nonparametric approach to deal with the presence of undesirable outputs in the data, represented by the rates of hospitalization for ambulatory care sensitive conditions (ACSCs). The empirical results show that all the exogenous variables considered have a significant and negative effect on the efficiency estimates.
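As a hedged illustration of the nonparametric building block, the sketch below computes plain output-oriented FDH efficiency scores on made-up data; the paper's estimator additionally conditions on environmental variables (for example by restricting or weighting the comparison set by similarity in those variables), which is omitted here.

```python
import numpy as np

def fdh_output_efficiency(X, Y):
    """Output-oriented FDH scores: theta_i >= 1 is how far unit i's outputs could be
    scaled up while still being dominated by an observed unit using no more input."""
    n = X.shape[0]
    theta = np.empty(n)
    for i in range(n):
        dominating = np.all(X <= X[i], axis=1)          # units using no more of every input
        ratios = np.min(Y[dominating] / Y[i], axis=1)   # radial output expansion offered by each
        theta[i] = ratios.max()                         # best attainable expansion (>= 1)
    return theta

rng = np.random.default_rng(1)
X = rng.uniform(1, 10, size=(50, 2))                                   # two inputs per unit
Y = np.sqrt(X).sum(axis=1, keepdims=True) * rng.uniform(0.6, 1.0, size=(50, 1))  # one output
theta = fdh_output_efficiency(X, Y)
print("share of FDH-efficient units:", np.mean(np.isclose(theta, 1.0)))
```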

3.
In this paper, we address the changing composition of a customer portfolio, taking into account actions undertaken by the company to adapt its service offer to market conditions and/or technological innovations. We present a specific methodology to identify clusters of customers in different periods and then compare them over time. The classification process takes into account both qualitative and quantitative aspects of the consumption levels of the services or products offered by the company. The possibility of period-to-period variation in the customer portfolio and the service or product offer is also considered, in order to achieve a more realistic scenario. The core of the proposed methodology is related to the family of exploratory factorial and cluster techniques. The customers are classified using a bicriterial clustering methodology based on 'tandem' analysis (multiple factor analysis followed by cluster analysis of the main factors). The bicriterial approach allows for a compromise between customers' consumption levels (a quantitative criterion) and their consumption/non-consumption pattern (a qualitative criterion). The evolution of the customer portfolio composition is explored through multiple correspondence analysis, which allows visual comparison of the position of different clusters over time and the identification of key changes in customer consumption behavior. The methodology is tested on realistic customer portfolio scenarios for a major telecommunication company, and various simulated scenarios show the strengths of the proposal.
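A simplified stand-in for the tandem step, using ordinary PCA on block-weighted data instead of a full multiple factor analysis and k-means instead of the paper's bicriterial procedure; the data, block weights, and cluster count are invented for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n_customers, n_services = 500, 6

# Synthetic consumption table: many zeros (non-consumption) plus positive usage levels.
usage = rng.gamma(2.0, 10.0, size=(n_customers, n_services))
usage *= rng.random((n_customers, n_services)) < 0.6   # roughly 40% non-consumption cells

quant = StandardScaler().fit_transform(usage)           # quantitative criterion: consumption levels
qual = (usage > 0).astype(float)                        # qualitative criterion: consumption pattern

# Simplified stand-in for multiple factor analysis: weight each block so neither dominates,
# then extract common factors from the concatenated table.
blocks = np.hstack([quant / np.sqrt(quant.shape[1]), qual / np.sqrt(qual.shape[1])])
factors = PCA(n_components=4).fit_transform(blocks)

clusters = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(factors)
print("cluster sizes:", np.bincount(clusters))
```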

4.
In this paper, we present a simulation optimization algorithm for solving a two-echelon constrained inventory problem. The goal is to determine the optimal stocking levels that minimize the total inventory investment cost while satisfying the expected response time target for each field depot. The proposed algorithm is more adaptive than ordinary optimization algorithms and can be applied to any multi-item, multi-echelon inventory system whose cost structure and service-level function resemble those assumed here. Empirical studies are performed to compare the efficiency of the proposed algorithm with that of other existing simulation-based algorithms.
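A toy simulation-optimization loop in the same spirit, reduced to a single echelon with Poisson demand and a random search over base-stock levels; the costs, service target, and search scheme are illustrative assumptions rather than the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)
n_depots = 4
unit_cost = np.array([50.0, 80.0, 30.0, 60.0])     # per unit of stock held at each depot
demand_rate = np.array([2.0, 1.5, 3.0, 1.0])       # mean daily demand per depot
target_fill = 0.95                                  # required service level at every depot

def simulate_fill_rate(base_stock, n_days=2000):
    """Crude one-period proxy: a day's demand at a depot is met if it does not exceed its stock."""
    demand = rng.poisson(demand_rate, size=(n_days, n_depots))
    return (demand <= base_stock).mean(axis=0)

def random_search(n_iter=300, max_stock=12):
    best_levels, best_cost = None, np.inf
    for _ in range(n_iter):
        levels = rng.integers(0, max_stock + 1, size=n_depots)
        if np.all(simulate_fill_rate(levels) >= target_fill):     # feasibility checked by simulation
            cost = float(unit_cost @ levels)
            if cost < best_cost:
                best_levels, best_cost = levels, cost
    return best_levels, best_cost

levels, cost = random_search()
print("stocking levels:", levels, "investment cost:", cost)
```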

5.
This paper presents a dynamic distribution and assignment simulation model, based on discrete-time simulation techniques and dynamic route assignment, for the planning, engineering design, and operational analysis of large exhibition events from a pedestrian circulation perspective. Both the distribution and assignment stages are incorporated in an interlaced way, with dynamic behavior along a specific time horizon. In the proposed model, each individual's route choice is dynamically determined as a consequence of facility attractiveness and network congestion. Therefore, in contrast with other simulation approaches, the model does not require the usual origin–destination trip matrices to describe the transportation demand, nor the specification of the different paths to be followed by visitors. This modeling approach turns out to be very appropriate for simulating large exhibition events, where each visitor usually has multiple and a priori unordered destination choices after entering the scenario.
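A rough sketch of congestion-dependent destination choice in discrete time, using a multinomial logit over stand attractiveness penalized by current occupancy; all parameters and the choice rule itself are assumptions for illustration, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(4)
n_visitors, n_stands, n_steps = 300, 5, 60
attractiveness = np.array([3.0, 2.0, 2.0, 1.5, 1.0])
capacity = np.array([60, 50, 50, 40, 30])
beta_congestion = 2.0                     # sensitivity of route choice to crowding

location = np.full(n_visitors, -1)        # -1 = not yet at any stand
remaining = np.zeros(n_visitors, int)     # time left at the current stand

for t in range(n_steps):
    occupancy = np.bincount(location[location >= 0], minlength=n_stands)
    # Utility falls as a stand fills up; choice probabilities come from a multinomial logit.
    utility = attractiveness - beta_congestion * occupancy / capacity
    prob = np.exp(utility) / np.exp(utility).sum()
    choosing = remaining == 0
    location[choosing] = rng.choice(n_stands, size=choosing.sum(), p=prob)
    remaining[choosing] = rng.integers(3, 10, size=choosing.sum())   # dwell time at the chosen stand
    remaining[~choosing] -= 1

print("final occupancy per stand:", np.bincount(location, minlength=n_stands))
```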

6.
We present a heuristic optimization method for stochastic production-inventory systems that defy analytical modelling and optimization. The proposed heuristic takes advantage of simulation while minimizing the impact of the curse of dimensionality by using regression analysis. The heuristic was developed and tested for an oil and gas company, which adopted it as the optimization method for a supply-chain design project. To explore the performance of the heuristic in more general settings, we conducted a simulation experiment on 900 test problems. We found that the average cost error of the proposed heuristic was reasonably low for practical applications.
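A compact illustration of the simulate-then-regress idea on a one-dimensional toy problem: simulate the cost of a few candidate policies, fit a quadratic response surface, and optimize the fitted surface instead of the noisy simulator; the inventory model and cost parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate_cost(base_stock, n_periods=500, demand_mean=20, holding=1.0, shortage=9.0):
    """Noisy simulation of average holding + shortage cost for one candidate base-stock level."""
    demand = rng.poisson(demand_mean, n_periods)
    leftover = np.maximum(base_stock - demand, 0)
    short = np.maximum(demand - base_stock, 0)
    return float((holding * leftover + shortage * short).mean())

# Step 1: evaluate a coarse grid of candidate policies by simulation.
candidates = np.arange(10, 41, 5)
costs = np.array([simulate_cost(s) for s in candidates])

# Step 2: fit a quadratic response surface to smooth out simulation noise.
a, b, c = np.polyfit(candidates, costs, deg=2)

# Step 3: optimize the fitted surface (vertex of the parabola) instead of the simulator.
s_star = -b / (2 * a)
print("regression-suggested base stock:", round(s_star, 1),
      "simulated cost there:", round(simulate_cost(int(round(s_star))), 2))
```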

7.
This paper presents a composite model in which two simulation approaches, discrete-event simulation (DES) and system dynamics (SD), are used together to address a major healthcare problem, the sexually transmitted infection Chlamydia. The paper continues an ongoing discussion in the literature about the potential benefits of linking DES and SD. Previous researchers have argued that DES and SD are complementary approaches and that many real-world problems would benefit from combining both methods. In this paper, a DES model of the hospital outpatient clinic that treats Chlamydia patients is combined with an SD model of the infection process in the community. The two models were developed in commercial software and linked in an automated fashion via an Excel interface. To our knowledge, this is the first time such a composite model has been used in a healthcare setting. The model shows how the prevalence of Chlamydia at the community level affects (and is affected by) operational-level decisions made in the hospital outpatient department. We discuss the additional benefits provided by the composite model over and above those gained from the two individual models.
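A deliberately reduced sketch of the coupling idea: a weekly stock-and-flow (SD-style) infection model exchanges feedback with a capacity-constrained clinic stand-in. The real composite model uses an event-based DES linked to the SD model through Excel; every number below is an assumption.

```python
import numpy as np

# System-dynamics side: community prevalence of the infection (stock-and-flow, weekly step).
population, infected = 100_000.0, 2_000.0
beta, weeks = 0.6, 104                      # transmission parameter, two-year horizon

# Clinic side, collapsed to a capacity rule: only so many patients can be treated per week.
clinic_capacity, waiting = 150, 0.0

prevalence_history = []
for week in range(weeks):
    susceptible = population - infected
    new_infections = beta * infected * susceptible / population
    # Feedback from community to clinic: a fraction of infected people seek care.
    arrivals = 0.05 * infected
    waiting += arrivals
    treated = min(waiting, clinic_capacity)   # operational constraint from the clinic model
    waiting -= treated
    # Feedback from clinic to community: treated patients leave the infected stock.
    infected = max(infected + new_infections - treated, 0.0)
    prevalence_history.append(infected / population)

print("prevalence after two years: %.3f, clinic queue: %.0f" %
      (prevalence_history[-1], waiting))
```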

8.
9.
A parallel (2, n − 2)-system is investigated, in which two units start operating simultaneously and any unit that fails is replaced instantaneously by one of the (n − 2) cold standbys. We assume the availability of n non-identical, non-repairable units for replacement or support. The system reliability is evaluated through recursive relations in terms of the unit lifetimes Ti (i = 1, …, n), which have a general joint distribution function F(t). On the basis of the derived expression, simulation techniques have been developed for evaluating the system reliability and the mean time to failure; these are useful when dealing with large systems, correlated unit lifetimes, or less mathematically tractable distributions. Simulation results are presented for various lifetime distributions, and comparisons are made with the derived analytic results for some special distributions and moderate values of n.
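A Monte Carlo sketch of the described simulation, assuming independent exponential lifetimes purely for illustration; the paper's setting allows a general joint lifetime distribution.

```python
import numpy as np

rng = np.random.default_rng(6)

def system_lifetime(lifetimes):
    """Failure time of a parallel (2, n-2) system: two active positions, instantaneous
    replacement from cold standbys in the given order; the system fails once both
    positions are empty and no standby is left."""
    pool = list(lifetimes[2:])                      # cold standbys, used in order
    fail_times = [lifetimes[0], lifetimes[1]]       # absolute failure times of the two active units
    while True:
        now = min(fail_times)
        i = fail_times.index(now)
        if pool:
            fail_times[i] = now + pool.pop(0)       # cold unit starts aging only when activated
        else:
            fail_times.pop(i)
            if not fail_times:                      # both positions empty -> system failure
                return now
            # otherwise the surviving unit keeps running until its own failure time

def mttf(n_units=6, n_rep=20_000, scale=100.0):
    samples = [system_lifetime(rng.exponential(scale, n_units)) for _ in range(n_rep)]
    return float(np.mean(samples))

print("estimated MTTF with 6 exponential units:", round(mttf(), 1))
```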

10.
A variable annuity (VA) is an equity-linked annuity product that has grown rapidly in popularity around the world in recent years. Research to date on VAs largely focuses on the valuation of guarantees embedded in a single VA contract. However, methods developed for individual VA contracts based on option pricing theory cannot be extended to large VA portfolios. Insurance companies currently use nested simulation to value guarantees for VA portfolios, but efficient valuation under nested simulation for a large VA portfolio has been a real challenge: the computation is highly intensive and often prohibitive. In this paper, we propose a novel approach that combines a clustering technique with a functional data analysis technique to address this issue. We create a highly non-homogeneous synthetic VA portfolio of 100,000 contracts and use it to estimate the dollar Delta of the portfolio at each time step of the outer-loop scenarios under the nested simulation framework over a period of 25 years. Our test results show that the proposed approach performs well in terms of accuracy and efficiency.
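A small-scale sketch of the clustering idea: value an expensive quantity only at one representative per cluster and scale by cluster size. The contract features, the "inner-loop" Delta formula, and the cluster count are all invented, and the functional data analysis part of the paper's method is not shown.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n_contracts, n_clusters = 10_000, 50

# Synthetic contract features: age, account value, guaranteed benefit, years to maturity.
features = np.column_stack([
    rng.uniform(35, 75, n_contracts),
    rng.lognormal(11, 0.5, n_contracts),
    rng.lognormal(11, 0.4, n_contracts),
    rng.uniform(5, 25, n_contracts),
])

def expensive_delta(contract):
    """Stand-in for the costly inner-loop valuation of one contract's dollar Delta."""
    age, account, guarantee, maturity = contract
    return -0.4 * guarantee * np.exp(-0.03 * maturity) + 0.9 * account   # made-up formula

scaled = StandardScaler().fit_transform(features)
km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(scaled)

# Value only one representative per cluster, then scale by cluster size.
sizes = np.bincount(km.labels_, minlength=n_clusters)
reps = np.array([features[km.labels_ == k].mean(axis=0) for k in range(n_clusters)])
approx = sum(sizes[k] * expensive_delta(reps[k]) for k in range(n_clusters))

exact = sum(expensive_delta(c) for c in features)   # feasible here only because the toy is cheap
print("relative error of the cluster approximation: %.3f%%" % (100 * abs(approx - exact) / abs(exact)))
```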

11.
This research theoretically explores the measurement of returns to scale (RTS) using a non-radial DEA (data envelopment analysis) model, with the range-adjusted measure (RAM) serving as a representative of such non-radial models. Historically, the type of RTS has been discussed within the analytical framework of radial models; here, the radial-based RTS measurement is replaced by a non-radial RAM/RTS measurement. In developing the non-radial RAM/RTS measurement, this study identifies a problem of multiple projections that does not arise in the radial measurement. A new linear programming approach is proposed to identify all efficient DMUs (decision making units) on a reference set. The important feature of the proposed approach is that it can handle the simultaneous occurrence of (a) multiple reference sets, (b) multiple supporting hyperplanes and (c) multiple projections. All three difficulties are handled by the proposed RAM/RTS measurement. In particular, we discuss both when these three types of multiple solutions occur in the RAM/RTS measurement and how to deal with them. Our results make it possible to measure not only the type of RTS but also the magnitude of RTS within the RAM framework.
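A minimal RAM model for one DMU, written as a linear program with variable returns to scale; this shows only the base measure on toy data, not the paper's treatment of multiple reference sets, supporting hyperplanes, and projections.

```python
import numpy as np
from scipy.optimize import linprog

def ram_inefficiency(X, Y, k):
    """Range-adjusted measure (RAM) for DMU k: additive DEA model whose slacks are
    scaled by the data ranges; efficiency score = 1 - returned inefficiency."""
    m, n = X.shape           # m inputs, n DMUs (columns)
    s, _ = Y.shape           # s outputs
    Rx = X.max(axis=1) - X.min(axis=1)
    Ry = Y.max(axis=1) - Y.min(axis=1)
    # Decision variables: [lambda_1..lambda_n, input slacks (m), output slacks (s)]
    c = np.concatenate([np.zeros(n), -1 / ((m + s) * Rx), -1 / ((m + s) * Ry)])  # maximize scaled slacks
    A_eq = np.zeros((m + s + 1, n + m + s))
    A_eq[:m, :n] = X;          A_eq[:m, n:n + m] = np.eye(m)        # X·lambda + s_in  = x_k
    A_eq[m:m + s, :n] = Y;     A_eq[m:m + s, n + m:] = -np.eye(s)   # Y·lambda - s_out = y_k
    A_eq[-1, :n] = 1.0                                              # convexity (variable RTS)
    b_eq = np.concatenate([X[:, k], Y[:, k], [1.0]])
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (n + m + s), method="highs")
    return -res.fun

# Toy data: 2 inputs, 1 output, 8 DMUs (one column per DMU).
X = np.array([[2, 3, 5, 4, 6, 7, 3, 8],
              [4, 2, 3, 6, 5, 4, 7, 6]], dtype=float)
Y = np.array([[3, 3, 5, 4, 6, 6, 3, 7]], dtype=float)
scores = [1 - ram_inefficiency(X, Y, k) for k in range(X.shape[1])]
print("RAM efficiency scores:", np.round(scores, 3))
```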

12.
This paper presents a hierarchical Bayesian analysis of the partial adjustment model of financial ratios using mixture models, an approach that allows us to estimate the distribution of the adjustment coefficients. In particular, it enables us to analyse the speed of reaction in the presence of shocks affecting the financial ratio targets, as a basis for establishing homogeneous groups of firms. The proposed methodology is illustrated by examining a set of ratios for a sample of firms operating in the U.S. manufacturing sector.
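A simplified, non-Bayesian stand-in for the grouping idea: estimate a firm-level adjustment coefficient by OLS and then fit a two-component Gaussian mixture to those estimates. The simulated data and the frequentist shortcut are assumptions, not the paper's hierarchical Bayesian mixture model.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(8)
n_firms, n_years = 200, 20

# Simulate partial adjustment r_t - r_{t-1} = delta * (target - r_{t-1}) + noise,
# with two latent groups of firms: fast adjusters and slow adjusters.
true_delta = np.where(rng.random(n_firms) < 0.5, 0.7, 0.2)
target = rng.normal(1.0, 0.2, n_firms)
ratios = np.zeros((n_firms, n_years))
ratios[:, 0] = rng.normal(1.0, 0.3, n_firms)
for t in range(1, n_years):
    ratios[:, t] = (ratios[:, t - 1]
                    + true_delta * (target - ratios[:, t - 1])
                    + rng.normal(0, 0.05, n_firms))

# Firm-by-firm OLS estimate of the adjustment coefficient.
deltas = np.empty(n_firms)
for i in range(n_firms):
    gap = target[i] - ratios[i, :-1]                 # regressor: distance from the target
    change = np.diff(ratios[i])                      # response: observed adjustment
    deltas[i] = (gap @ change) / (gap @ gap)         # OLS slope through the origin

# Group firms by their estimated speed of adjustment with a two-component mixture.
gm = GaussianMixture(n_components=2, random_state=0).fit(deltas.reshape(-1, 1))
print("component means (adjustment speeds):", np.round(gm.means_.ravel(), 2))
print("component weights:", np.round(gm.weights_, 2))
```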

13.
This paper addresses an integrated vector management (IVM) approach for combating Aedes aegypti, the transmission vector of dengue, Zika, and chikungunya, some of the most important viral epidemics worldwide. To tackle this problem, a receding horizon control (RHC) strategy is adopted, considering both a mono-objective and a multiobjective version of an optimal control model that combats the mosquito using chemical and biological control. RHC is essentially a suboptimal scheme for classical optimal control strategies based on discrete-time approximations. The integrated vector control actions used in this work consist of applying insecticides and releasing sterile males, produced by irradiation, into the mosquito population. The cost function is defined in terms of social and economic costs in order to quantify the effectiveness of the proposed epidemiological control over a time window of four months. Numerical simulations show that the obtained results are better than those of the optimal control strategies found in the literature. Furthermore, with the multiobjective approach, varying the scenarios in the mono-objective formulation is no longer necessary and a set of optimal strategies can be obtained at once. Finally, in order to help health authorities choose the best solution from the Pareto-optimal set to implement in practice, a cost-effectiveness analysis is performed and the strategy representing the most cost-effective control policy is identified.
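A bare-bones receding horizon loop on a toy mosquito population with a single (insecticide) control lever; the dynamics, cost weights, and horizon length are invented, and the paper's sterile-male control and multiobjective formulation are not represented.

```python
import numpy as np
from scipy.optimize import minimize

GROWTH, CAPACITY = 1.4, 1000.0        # toy logistic mosquito dynamics
W_POP, W_CTRL = 1.0, 50.0             # social cost of mosquitoes vs. economic cost of insecticide
HORIZON, STEPS = 5, 16                # prediction horizon (weeks) and total simulated weeks

def step(pop, u):
    """One week of logistic growth, reduced by an insecticide effort u in [0, 1]."""
    return max(GROWTH * pop * (1 - pop / CAPACITY) * (1 - 0.6 * u), 0.0)

def horizon_cost(u_seq, pop):
    cost = 0.0
    for u in u_seq:
        pop = step(pop, u)
        cost += W_POP * pop + W_CTRL * u**2
    return cost

pop, applied = 400.0, []
for t in range(STEPS):
    # Optimize the whole horizon, but implement only the first control action (receding horizon).
    res = minimize(horizon_cost, x0=np.full(HORIZON, 0.3), args=(pop,),
                   bounds=[(0.0, 1.0)] * HORIZON, method="L-BFGS-B")
    u_now = float(res.x[0])
    applied.append(u_now)
    pop = step(pop, u_now)

print("final population: %.0f, mean insecticide effort: %.2f" % (pop, np.mean(applied)))
```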

14.
This research proposes a mortality model with an age shift to project future mortality using principal component analysis (PCA). The proposed PCA model is compared with well-known alternatives, namely the Lee-Carter (LC) model, the age-period-cohort model (Renshaw and Haberman, 2006), and the Cairns-Blake-Dowd model, through empirical studies of mortality data from six countries, two each from Asia, Europe, and North America. The mortality data come from the Human Mortality Database and span the period 1970-2005. The proposed PCA model produces smaller prediction errors, in terms of mean absolute percentage error, for almost all of the illustrated countries. To demonstrate longevity risk in annuity pricing, we use the proposed PCA model to project future mortality rates and analyze the extent to which annuity prices are underestimated for whole life and deferred whole life annuity products, respectively. The effect of model risk on annuity pricing is also investigated by comparing the results from the proposed PCA model with those from the LC model. The findings can benefit actuaries in their efforts to deal with longevity risk in pricing and valuation.
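A Lee-Carter-style PCA baseline on synthetic log-mortality data, projecting the first period factor with a random walk with drift; the age-shift feature of the proposed model is not reproduced, and all data below are simulated.

```python
import numpy as np

rng = np.random.default_rng(9)
ages, years = np.arange(60, 91), np.arange(1970, 2006)

# Synthetic log mortality surface: Gompertz-like age profile plus a downward time trend and noise.
log_m = (-9.5 + 0.09 * (ages[:, None] - 60)
         - 0.015 * (years[None, :] - 1970)
         + rng.normal(0, 0.02, (len(ages), len(years))))

# Lee-Carter-style decomposition via PCA/SVD: log m(x,t) ~ a(x) + b(x) * k(t).
a_x = log_m.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(log_m - a_x, full_matrices=False)
b_x, k_t = U[:, 0], S[0] * Vt[0]

# Project the period index k(t) with a random walk with drift, then rebuild future log rates.
drift = np.diff(k_t).mean()
k_future = k_t[-1] + drift * np.arange(1, 21)            # 20-year projection
log_m_future = a_x + np.outer(b_x, k_future)

print("projected central death rate at age 75 in 20 years: %.4f"
      % np.exp(log_m_future[ages.tolist().index(75), -1]))
```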

15.
This paper develops an approach to dealing with risk in agricultural decisions. Although the model is in line with the Prospect ranking theory and the Partitioned multiobjective risk method, which recognise the multidimensional character of any risk measure used in agricultural decision-making problems, its behavioural hypothesis and analytical development are totally different. The way in which the approach works is illustrated through a simple but well-known example in agricultural planning.
