Similar Documents
20 similar documents found; search time: 46 ms
1.
Structured treatment interruptions (STI) were proposed to reduce costs and side effects for HIV-infected individuals, but whether the possible viral rebound within hosts after treatment interruption would lead to more new infections and additional costs at the population level remains unknown. Oral pre-exposure prophylaxis (PrEP) is an effective but expensive strategy for preventing the acquisition of HIV infection. To investigate the effectiveness and cost-effectiveness of STI and PrEP, we develop a multi-scale model linking within-host and between-host dynamics in the presence of drug resistance. Lyapunov functionals are constructed to analyze the global dynamics of the coupled system. We fit this model to annual AIDS incidence and death data from 1980 to 2014 among men who have sex with men (MSM) in San Francisco and compare the impact of six intervention scenarios (low, medium, or high PrEP coverage, each with or without STI) on new infections and cost-effectiveness over the next 20 years. We estimate the PrEP efficacy needed to eliminate the disease for different fractions of acquired drug resistance under the above six scenarios. Numerical simulations show that expanding PrEP coverage is very cost-effective, but whether implementing STI is cost-saving depends on the efficacy of second-line drugs. That is, STI always saves money, but when the efficacy of second-line drugs is high (low) it may lead to more (fewer) new infections than continuous therapy and thus less (more) health benefit. These results provide insights into the long-term effects of STI and PrEP on disease control and cost-effectiveness.

2.
Current sterilization techniques may not be completely effective at removing prions from surgical instruments, which can then infect patients on whom these instruments are subsequently used. This risk is increased due to the current level of instrument migration. With wide uncertainty in the numbers of patients that are incubating variant Creutzfeldt–Jakob disease (vCJD) and the effectiveness of decontamination, the UK is facing a potentially self-sustaining epidemic, which could be averted with the introduction of single-use instruments. This paper focuses on the cost-effectiveness of management strategies concerning the introduction of single-use instruments and measures to prevent migration. We formulated a discrete event simulation model of the dynamics of infection transmission, surgical instrument contamination and migration, to produce results that were pivotal in shaping government policy. Field data about vCJD transmission have since been used to update cost-effectiveness assessments as part of a retrospective analysis, which reinforces the initial decision.

3.
Inventory control is especially difficult when demand is stochastic and nonstationary. We consider a spare part inventory control problem with multiple-period replenishment lead time, and describe a static-dynamic strategy for the problem. By solving a static-dynamic uncertainty model, the strategy first makes decisions on the replenishment periods and order-up-to levels over the planning horizon, but implements only the decisions of the first period. It then uses the rolling horizon approach in the next period, when the inventory status is revised and the multi-period problem is updated as better forecasts become available. In light of the structural properties of the developed static-dynamic uncertainty model, the optimal solution to the model can be obtained without much computational effort, and thus the strategy can be easily implemented. Computational experiments and the results of a case study verify the efficacy of the proposed strategy.
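The rolling-horizon idea in this abstract — re-plan each period, but implement only the first period's decision — can be sketched as follows. This is a minimal illustrative base-stock rule, not the paper's actual static-dynamic uncertainty model; the safety factor, forecasts and demands are all hypothetical.

```python
import statistics

def order_up_to_level(forecast, z=1.64):
    """Order-up-to level: mean lead-time demand plus a safety margin
    (illustrative base-stock rule, assuming independent period demands)."""
    mean = sum(forecast)
    sd = statistics.pstdev(forecast) * len(forecast) ** 0.5
    return mean + z * sd

def rolling_horizon(demands, forecasts, lead_time=2, start_inventory=0):
    """Re-plan each period with updated forecasts, implement only the
    first-period order, then roll the horizon forward."""
    inventory, orders = start_inventory, []
    for t, demand in enumerate(demands):
        window = forecasts[t:t + lead_time]      # latest forecast window
        S = order_up_to_level(window)
        q = max(0.0, S - inventory)              # first-period order only
        orders.append(q)
        inventory = inventory + q - demand       # realised demand updates state
    return orders

orders = rolling_horizon(
    demands=[8, 12, 9, 15],
    forecasts=[10, 10, 11, 12, 12, 13],
)
```

Each pass through the loop stands in for one re-solve of the static-dynamic model; in the paper the order-up-to levels come from the optimization model rather than a fixed safety-factor formula.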

4.
In this paper we present the first application to a healthcare problem of discrete-event simulation (DES) embedded in an ant colony optimisation (ACO) model. We are concerned with choosing optimal screening policies for retinopathy, a sight-threatening complication of diabetes. The early signs of retinopathy can be detected by screening before the patient is aware of symptoms, and blindness prevented by laser treatment. In this paper we describe the methodology used to combine the purpose-written DES model with the ACO algorithm. We simulate the effects of different screening strategies on a population of diabetic patients, and compare them in terms of two objective functions: Min C/E, cost-effectiveness (minimum incremental cost per year of sight saved, compared with a no-screening baseline) and Max E, maximum effectiveness (years of sight saved). We describe how ACO is used to optimise these two objectives, and discuss the issues involved in optimising stochastic variables. We present results for a range of different assumptions and scenarios about the format of screening programmes, using realistic data, and make policy recommendations on the basis of our findings.

5.
This paper discusses the associations between traits and haplotypes based on FI (fluorescent intensity) data sets. We consider a clustering algorithm based on mixtures of t distributions to obtain all possible genotypes of each individual (i.e. the "GenoSpectrum"). We then propose a likelihood-based approach that incorporates genotyping uncertainty to assess the associations between traits and haplotypes through a haplotype-based logistic regression model. Simulation studies show that our likelihood-based method can reduce the impact induced by genotyping errors.

6.
This paper develops a λ mean-hybrid entropy model to deal with the portfolio selection problem under both random uncertainty and fuzzy uncertainty. Solving this model provides the investor a tradeoff frontier between security return and risk. We model the security return as a triangular fuzzy random variable, where the investor’s individual preference is reflected by the pessimistic-optimistic parameter λ, and measure the security risk using the hybrid entropy. An algorithm is developed to solve this bi-objective portfolio selection model. Besides, a numerical example is presented to illustrate the approach.

7.
The internal dynamics of a hospital represent a complex non-linear structure. Planning and management of bed capacities must be evaluated within an environment of uncertainty, variability and limited resources. A common approach is to plan and manage capacities based on simple deterministic spreadsheet calculations. This paper demonstrates that these calculations typically do not provide the appropriate information and result in underestimating true bed requirements. More sophisticated, flexible and necessarily detailed capacity models are needed. The development and use of such a simulation model is presented in this paper. The modelling work, in conjunction with a major UK NHS Trust, considers various types of patient flows, at the individual patient level, and resulting bed needs over time. The consequence of changes in capacity planning policies and management of existing capacities can be readily examined. The work has highlighted the need for evaluating hospital bed capacities in light of both bed occupancies and refused admission rates. The relationship between occupancy and refusals is complex and often overlooked by hospital managers.

8.
We develop technology to plan delivery routes for the supply of blood products to hospitals by a blood bank. The technology produces low cost, robust plans that hedge against the natural uncertainty associated with blood product usage at hospitals. The technology relies on sampling-based approaches involving integer programming and variable neighborhood search. An extensive computational study shows the efficacy of the two approaches and highlights the impact of product usage uncertainty on the resulting delivery plans.

9.
Colorectal cancer includes cancerous growths in the colon, rectum and appendix and affects around 30,000 people in England each year. Maximizing health benefits for patients with colorectal cancer requires consideration of costs and outcomes across the whole service. In an era of scarce healthcare resources, there is a need to consider not only whether technologies and services may be considered clinically effective, but also whether they are cost-effective, that is, whether they represent value for money for the health service. Through the development of a whole disease model, it is possible to evaluate the cost-effectiveness of a range of options for service development consistently within a common framework. Discrete event simulation has been used to model the complete colorectal cancer patient pathway from patient presentation through to referral and diagnosis, treatment, follow-up, potential recurrence, treatment of metastases and end-of-life care. This simulation model has been used to examine the potential cost-effectiveness of different options for change across the entire colorectal cancer pathway. This paper provides an empirical demonstration of the potential application of modelling entire disease areas to inform clinical policy and resource allocation decision-making.

10.
We present an uncertainty model for geometric transformations based on polygonal uncertainty regions and transformation polytopes. The main contribution of this paper is a systematic approach for the computation of regions of interest for features by using the uncertainty model for affine and projective transformations. The focus is on the solution of transformation problems for geometric primitives, especially lines, so that regions of interest can be computed for corresponding geometric features in distinct images.

11.
In model-based analysis for comparative evaluation of strategies for disease treatment and management, the model of the disease is arguably the most critical element. A fundamental challenge in identifying model parameters arises from the limitations of available data, which challenges the ability to uniquely link model parameters to calibration targets. Consequently, the calibration of disease models leads to the discovery of multiple models that are similarly consistent with available data. This phenomenon is known as calibration uncertainty and its effect is transferred to the results of the analysis. Insufficient examination of the breadth of potential model parameters can create a false sense of confidence in the model recommendation, and ultimately cast doubt on the value of the analysis. This paper introduces a systematic approach to the examination of calibration uncertainty and its impact. We begin with a model of the calibration process as a constrained optimization problem and introduce the notion of plausible models which define the uncertainty region for model parameters. We illustrate the approach using a fictitious disease, and explore various methods for interpreting the outputs obtained.

12.
A hierarchical model is developed for the joint mortality analysis of pension scheme datasets. The proposed model allows for a rigorous statistical treatment of missing data. While our approach works for any missing data pattern, we are particularly interested in a scenario where some covariates are observed for members of one pension scheme but not the other. Therefore, our approach allows for the joint modelling of datasets which contain different information about individual lives. The proposed model generalizes the specification of parametric models when accounting for covariates. We consider parameter uncertainty using Bayesian techniques. Model parametrization is analysed in order to obtain an efficient MCMC sampler, and model selection is addressed. The inferential framework described here accommodates any missing-data pattern, and turns out to be useful for analysing statistical relationships among covariates. Finally, we assess the financial impact of using the covariates, and of the optimal use of the whole available sample when combining data from different mortality experiences.

13.
We propose three strategies by which a professor of a university course can assign final letter grades taking into account the natural uncertainty in students’ individual assignment and final numerical grades. The first strategy formalizes a common technique that identifies large gaps in the final numerical grades. For the second and third strategies, we introduce the notion of a borderline student, that is, a student who is close to, but below, the breakpoint for the next highest letter grade. Using mixed-integer linear programming and a tailor-made branch-and-bound algorithm, we choose the letter-grade breakpoints to minimize the number of borderline students. In particular, the second strategy treats the uncertainty implicitly and minimizes the number of borderline students, while the third strategy uses a robust-optimization approach to minimize the maximum number of borderline students that could occur based on an explicit uncertainty set. We compare the three strategies on realistic instances and identify overall trends as well as some interesting exceptions. While no strategy appears best in all cases, each can be computed in a reasonable amount of time for a moderately sized course. Moreover, they collectively provide the professor important insight into how uncertainty affects the assignment of final letter grades.
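The borderline-minimizing idea can be illustrated with a tiny brute-force sketch: enumerate candidate breakpoints and count students just below each one. This stands in for the paper's MILP/branch-and-bound machinery; the grades, candidate breakpoints and the width `delta` of the borderline band are all hypothetical.

```python
def count_borderline(grades, breakpoint, delta=2.0):
    """A student is 'borderline' if just below the breakpoint for the
    next highest letter grade (within delta points of it)."""
    return sum(1 for g in grades if breakpoint - delta <= g < breakpoint)

def best_breakpoint(grades, candidates, delta=2.0):
    """Pick the candidate breakpoint with the fewest borderline students
    (brute-force enumeration standing in for the paper's MILP)."""
    return min(candidates, key=lambda b: count_borderline(grades, b, delta))

grades = [58.0, 61.5, 69.0, 70.5, 74.0, 79.5, 81.0, 88.0]
b = best_breakpoint(grades, candidates=[68.0, 70.0, 72.0])
```

The robust third strategy described in the abstract would instead minimize the worst-case borderline count over an uncertainty set of possible grade realizations, which is what requires the tailored branch-and-bound.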

14.
The Korean government has been funding small and medium enterprises (SMEs) with superior technology, selected on the basis of a scorecard. However, a high default rate among funded SMEs has been reported. To manage such government funds effectively, it is important to develop an accurate scoring model for SMEs. In this paper, we provide a random effects logistic regression model to predict the default of funded SMEs based on both financial and non-financial factors. The advantage of such a random effects model lies in its ability to accommodate not only the individual characteristics of each SME but also the uncertainty that cannot be explained by such individual factors. We expect our study to contribute to the effective management of government funds by proposing prediction models for defaults of funded SMEs.
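The scoring idea behind this abstract — a logistic default probability with a firm-specific random intercept on top of the fixed effects — can be sketched in a few lines. The features and coefficients below are invented for illustration, not the paper's fitted model.

```python
import math

def default_probability(x, beta, random_intercept=0.0):
    """Logistic default probability: linear predictor = fixed effects
    x·beta plus a firm-specific random intercept capturing variation
    the observed factors cannot explain (all coefficients illustrative)."""
    eta = random_intercept + sum(b * xi for b, xi in zip(beta, x))
    return 1.0 / (1.0 + math.exp(-eta))

# hypothetical financial / non-financial features and fitted coefficients
beta = [-1.2, 0.8, 0.5]                  # e.g. profitability, leverage, firm age
p_avg = default_probability([0.3, 0.6, 0.2], beta)         # average random effect
p_risky = default_probability([0.3, 0.6, 0.2], beta, 0.9)  # high firm-level effect
```

In a full random effects analysis the intercepts are not set by hand but integrated out of (or sampled from) their estimated distribution when computing predictions.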

15.
Community collective efficacy is an important concept in studying various community problems and in developing community well-being. While existing research on collective efficacy mainly focuses on building analytical or statistical models from informant survey data, this paper develops a complexity science-informed agent-based model to study the dynamic process of collective efficacy formation in a community. We model the individual-level cognitive process for participating in community tasks and simulate the emergent spatial patterns of agents’ collective efficacies in a community. The developed model is based on self-efficacy theory and the Theory of Planned Behavior. Interesting patterns such as spatial segregation have been observed through simulations, and an application of the developed model is presented.

16.
We consider a robust credit risk optimization problem. Using the worst-case conditional value-at-risk as the measure of credit risk, we build a model of the credit risk optimization problem. Since the loss distribution of credit risk is uncertain, two types of uncertainty sets are considered: box-type and ellipsoidal. The robust credit risk optimization problem is transformed into a linear programming problem and a second-order cone programming problem, respectively. Finally, a numerical credit risk example illustrates the effectiveness of the model.
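Why the box-uncertainty case reduces to a linear program can be seen in one line: with nonnegative portfolio weights, the adversary maximizing the loss pushes every uncertain loss to the upper end of its interval, so the worst case stays linear in the weights. The sketch below (with invented numbers) computes that worst-case loss; it is an illustration of the box case only, not the paper's full worst-case CVaR model.

```python
def worst_case_loss(weights, nominal_loss, deviation):
    """Worst-case portfolio loss under box uncertainty: each obligor's
    loss may lie anywhere in [nominal - dev, nominal + dev]; with
    nonnegative weights the maximizing adversary sets every loss to its
    upper bound, so the robust counterpart remains linear (hence an LP)."""
    return sum(w * (l + d) for w, l, d in zip(weights, nominal_loss, deviation))

wc = worst_case_loss(
    weights=[0.5, 0.3, 0.2],
    nominal_loss=[0.02, 0.05, 0.10],
    deviation=[0.01, 0.02, 0.03],
)
```

The ellipsoidal set instead bounds the deviation vector in a weighted 2-norm, which adds a norm term to the objective and yields the second-order cone program mentioned in the abstract.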

17.
A high job turnover rate can cause many problems, and each company needs proper strategies to prevent the brain-drain of its manpower. For effective human resource management, it is important to predict the occupational life expectancy, or mean residual life, of those who are likely to leave and join another company. In this paper, we propose a random effects Weibull regression model for forecasting the occupational lifetime of employees who join another company, based on their characteristics. The advantage of using such a random effects model is its ability to accommodate not only the individual characteristics of each employee but also the uncertainty that cannot be explained by individual factors. We apply the proposed model to occupational lifetime data obtained from a general trading company in Korea. From our analyses, we can infer the characteristics of those who have a relatively longer occupational lifetime: the managing director level, the relatively old, those who entered the company earlier, high school graduates, those involved in technical service, and married female employees. Accordingly, an effective human resources management policy is needed to retain those who perform well but want to leave, and to develop those who stay but need further improvement, for the betterment of the company.

18.
We provide a framework for simulating the entire patient journey across different phases (such as diagnosis, treatment, rehabilitation and long-term care) and different sectors (such as GP, hospital, social and community services), with the aim of providing better understanding of such processes and facilitating evaluation of alternative clinical and care strategies. A phase-type modelling approach is used to promote better modelling and management of the specific elements of a patient pathway, using performance measures such as clinical outcomes, patient quality of life, and cost. The approach is illustrated using stroke disease. Approximately 5% of the United Kingdom National Health Service budget is spent treating stroke disease each year. There is an urgent need to assess whether existing services are cost-effective or new interventions could increase efficiency. This assessment can be made using models across primary and secondary care; in particular we evaluate the cost-effectiveness of thrombolysis (clot busting therapy), using discrete event simulation. Using our model, patient quality of life and the costs of thrombolysis are compared under different regimes. In addition, our simulation framework is used to illustrate the impact of internal discharge queues, which can develop while patients are awaiting placement. Probabilistic Sensitivity Analysis of the value parameters is also carried out.

19.
Expert knowledge in the form of mathematical models can be considered sufficient statistics of all prior experimentation in the domain, embodying generic or abstract knowledge of it. When used in a probabilistic framework, such models provide a sound foundation for data mining, inference, and decision making under uncertainty. We describe a methodology for encapsulating knowledge in the form of ordinary differential equations (ODEs) in dynamic Bayesian networks (DBNs). The resulting DBN framework can handle both data and model uncertainty in a principled manner, can be used for temporal data mining with noisy and missing data, and can be used to re-estimate model parameters automatically using data streams. A standard assumption when performing inference in DBNs is that time steps are fixed. Generally, the time step chosen is small enough to capture the dynamics of the most rapidly changing variable. This can result in DBNs having a natural time step that is very short, leading to inefficient inference; this is particularly an issue for DBNs derived from ODEs and for systems where the dynamics are not uniform over time. We propose an alternative to the fixed time step inference used in standard DBNs. In our algorithm, the DBN automatically adapts the time step lengths to suit the dynamics in each step. The resulting system allows us to efficiently infer probable values of hidden variables using multiple time series of evidence, some of which may be sparse, noisy or incomplete. We evaluate our approach with a DBN based on a variant of the van der Pol oscillator, and demonstrate an example where it gives more accurate results than the standard approach, but using only one tenth the number of time steps. We also apply our approach to a real-world example in critical care medicine.
By incorporating knowledge in the form of an existing ODE model, we have built a DBN framework for efficiently predicting individualised patient responses using the available bedside and lab data.
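The core idea of adapting the step length to the local dynamics can be shown on the van der Pol oscillator itself, using simple step-doubling error control on an Euler integrator. This is an ODE-integration analogue of the paper's adaptive DBN inference, not the DBN algorithm; the tolerance and step bounds are arbitrary.

```python
def van_der_pol(state, mu=1.0):
    """Van der Pol oscillator: x' = v, v' = mu*(1 - x^2)*v - x."""
    x, v = state
    return (v, mu * (1 - x * x) * v - x)

def adaptive_steps(state, t_end, tol=1e-3, h=0.1):
    """Euler integration with step doubling: compare one full step against
    two half steps; halve h when they disagree (fast dynamics), grow h
    when they agree well (slow dynamics) -- the same motivation as the
    adaptive time step in the DBN, which avoids a uniformly tiny step."""
    t, steps = 0.0, 0
    while t < t_end:
        full = tuple(s + h * d for s, d in zip(state, van_der_pol(state)))
        half = tuple(s + h / 2 * d for s, d in zip(state, van_der_pol(state)))
        two = tuple(s + h / 2 * d for s, d in zip(half, van_der_pol(half)))
        err = max(abs(a - b) for a, b in zip(full, two))
        if err > tol and h > 1e-4:
            h /= 2                       # dynamics changing fast: refine
            continue
        state, t, steps = two, t + h, steps + 1
        if err < tol / 4:
            h = min(h * 2, 0.5)          # dynamics smooth: coarsen
    return state, steps

final_state, n_steps = adaptive_steps((2.0, 0.0), t_end=1.0)
```

In the DBN setting the "error" being controlled is the disagreement between inference at one step and at two half steps, but the accept/refine/coarsen loop has the same shape.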

20.
Stochastic processes are natural models for the progression of many individual and team sports. Such models have been applied successfully to select strategies and to predict outcomes in the context of games, tournaments and leagues. This information is useful to participants and gamblers, who often need to make decisions while the sports are in progress. In order to apply these models, much of the published research uses parameters estimated from historical data, thereby ignoring the uncertainty of the parameter values and the most relevant information that arises during competition. In this paper, we investigate candidate stochastic processes for familiar sporting applications that include cricket, football and badminton, reviewing existing models and offering some new suggestions. We then consider how to model parameter uncertainty with prior and posterior distributions, how to update these distributions dynamically during competition and how to use these results to make optimal decisions. Finally, we combine these ideas in a case study aimed at predicting the winners of next year’s University Boat Race.
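The dynamic-updating step described here — a prior from historical data revised with in-play evidence — can be sketched with the simplest conjugate case, a Beta-Binomial update of a point-winning probability. This is an illustrative fragment, not any of the paper's match models; the prior pseudo-counts are invented.

```python
def update_win_prob(prior_a, prior_b, points_won, points_lost):
    """Conjugate Beta-Binomial update: the Beta(prior_a, prior_b) prior
    (from historical data) is revised with in-play point counts, and the
    posterior mean is returned as the updated point-winning probability."""
    a = prior_a + points_won
    b = prior_b + points_lost
    return a / (a + b)          # posterior mean of Beta(a, b)

p0 = update_win_prob(10, 10, 0, 0)    # prior mean before play starts
p1 = update_win_prob(10, 10, 15, 5)   # revised after a strong run of points
```

Feeding the updated parameter back into a stochastic model of the remaining match is what lets decisions and predictions respond to the competition as it unfolds.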


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号