Similar Documents
20 similar documents retrieved.
1.
A method is considered for combining two types of expert judgments about an analyzed object. It is assumed that the probability distribution of a random variable is known, but its parameters may be determined by experts. The method is based on imprecise probability theory and allows us to take into account the quality of the expert judgments, as well as the heterogeneity and imprecision of the information supplied by the experts. An approach for computing “cautious” expert beliefs under the condition that the experts are unknown is studied. Numerical examples illustrate the proposed method.

2.
A two-stage prognosis model in condition based maintenance
We often observe in practice that the life of a piece of production equipment can be divided into two stages. The first stage is referred to as the normal working stage, where no significant deviation from the normal operating state is observed. The second stage is called the failure delay period: a defect may be initiated and progressively develop into an actual failure, i.e., the equipment is defective but still working during this stage. With the help of condition monitoring, hidden defects already present in the equipment may be detected, but for maintenance planning purposes the prediction of the initiation point of the second stage, and more importantly of the residual life thereafter, is important. This paper reports on the development of a probability model to predict the initiation point of the second stage and the remaining life based on available condition monitoring information. The method for estimating the model parameters is discussed and applied to real data.
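The two-stage (delay-time) structure described above lends itself to a quick Monte Carlo illustration. The sketch below is not the paper's model: it simply assumes an exponential normal-working stage and a Weibull failure delay period (all parameter values invented) and estimates the mean residual life, and the chance the item is already defective, given survival to time t.

```python
import random

def simulate_life(lam1=0.5, k=2.0, theta=2.0, rng=random):
    """One lifetime under the two-stage delay-time concept:
    a normal working period X ~ Exp(lam1) followed by a failure
    delay period Y ~ Weibull(shape k, scale theta).
    All parameter values here are illustrative assumptions."""
    x = rng.expovariate(lam1)          # defect initiation time (stage-2 start)
    y = rng.weibullvariate(theta, k)   # delay from defect to actual failure
    return x, x + y                    # (stage-2 start, failure time)

def residual_life_given_working(t, n=100_000, seed=1):
    """Monte Carlo estimate of E[T - t | T > t] and of the probability
    that the item is already in the defective (second) stage at t."""
    rng = random.Random(seed)
    total, in_stage2, survivors = 0.0, 0, 0
    for _ in range(n):
        start2, fail = simulate_life(rng=rng)
        if fail > t:
            survivors += 1
            total += fail - t
            if start2 <= t:
                in_stage2 += 1
    return total / survivors, in_stage2 / survivors

mrl, p_defective = residual_life_given_working(t=1.0)
```

In the paper this prediction is driven by monitored condition information rather than by assumed lifetime distributions alone.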

3.
In condition monitoring practice, one of the primary concerns of maintenance managers is how long the monitored item can survive given the condition information obtained to date. This relates to the concept of the conditional residual time, where the survival time depends not only upon the age of the monitored item, but also upon the condition information obtained. Once such a probability density function of the conditional residual time is available, a consequential decision model can readily be established to recommend a ‘best’ maintenance policy based upon all information available to date. This paper reports on a study using monitored vibration signals to predict the residual life of a set of rolling element bearings on the basis of a chosen distribution. A set of complete life data for six identical bearings, along with the history of their monitored vibration signals, is available to us. The data were obtained from a laboratory fatigue experiment conducted under identical conditions. We use stochastic filtering to predict the residual life distribution given the monitored condition monitoring history to date. As the life data are available, we can compare them with the prediction. The predicted results are satisfactory and provide a basis for further studies. It should be pointed out that although the model itself was developed for the bearings concerned, it can be generalized to modelling general condition-based maintenance decision making provided similar conditions are met.
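Stochastic filtering of a residual-life distribution can be illustrated with a toy discrete Bayes update. The observation model below (a vibration reading that is Normal with mean 1/r for residual life r, noise level 0.3) is an invented stand-in for the authors' filter, meant only to show how a monitored signal reshapes the residual-life density.

```python
import math

def bayes_update(prior, grid, y, noise_sd=0.3):
    """One step of a discrete Bayes filter for the residual life r.
    Assumed (illustrative) observation model: the monitored vibration
    level y is Normal with mean 1/r, so signals grow as failure nears.
    A full filter would also shift the grid between observations."""
    like = [math.exp(-0.5 * ((y - 1.0 / r) / noise_sd) ** 2) for r in grid]
    post = [p * l for p, l in zip(prior, like)]
    z = sum(post)
    return [p / z for p in post]

# residual-life grid (arbitrary units) and a flat prior
grid = [r / 10 for r in range(1, 101)]          # 0.1 .. 10.0
prior = [1.0 / len(grid)] * len(grid)

# a high vibration reading should shift mass toward small residual lives
post = bayes_update(prior, grid, y=2.0)
expected_r = sum(r * p for r, p in zip(grid, post))
```

With a flat prior centred at 5.05, the high reading pulls the expected residual life down sharply, which is the qualitative behaviour the filtering approach exploits.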

4.
This paper reports on a study of modelling condition monitoring intervals. The model is formulated based upon two important concepts. One is the failure delay time concept, which divides the failure process of the item into two periods: a normal working period followed by a failure delay time period, from a defect first being identified to the actual failure. The other is the conditional residual time concept, which assumes that the residual time also depends on the history of condition information obtained. Stochastic filtering theory is used to predict the residual time distribution given all monitored information obtained to date over the failure delay time period. The solution procedure is carried out in two stages. We first propose a static model that is used to determine a fixed condition monitoring interval over the item life. Once the monitored information indicates a possible abnormality of the item concerned, that is, the start of the failure delay time, a dynamic approach is employed to determine the next monitoring time at the current monitoring point, given that the item is not scheduled for a preventive replacement before that time. This implies that the dynamic model overrides the static model over the failure delay time, since more frequent monitoring might be needed to keep the item under close attention before an appropriate replacement is made prior to failure. Two key problems are addressed in the paper. The first is which criterion function should be used in determining the monitoring check interval; the second is the optimization process for both models, which can be solved neither analytically nor purely numerically, since they depend on two unknown quantities, namely the available condition information and the decision of when to replace the item over the failure delay time. For the first problem, we propose five candidate criterion functions and test them using simulations to see which performs best.
The second problem was solved using a hybrid of simulation and analytical solution procedures. We finally present a numerical example to demonstrate the modelling methodology.
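One classic ingredient of such criterion functions is the delay-time result for the fraction of defects that become failures between inspections. The sketch below uses it inside a simple expected-cost-per-unit-time criterion and grid-searches for the optimal fixed interval; the cost figures are hypothetical and this is one plausible criterion, not one of the paper's five.

```python
import math

def failure_fraction(delta, beta=1.0):
    """Fraction of defects arriving uniformly over an inspection interval
    of length delta, with Exp(beta) delay time, that turn into failures
    before the next inspection: 1 - (1 - exp(-beta*delta)) / (beta*delta).
    A standard delay-time result, used here as a criterion ingredient."""
    return 1.0 - (1.0 - math.exp(-beta * delta)) / (beta * delta)

def cost_rate(delta, lam=0.3, c_insp=10.0, c_fail=500.0, c_repair=50.0):
    """Expected cost per unit time for inspection interval delta:
    inspection cost plus failure/repair costs for the lam*delta defects
    expected per interval (all cost figures are invented)."""
    p = failure_fraction(delta)
    n_defects = lam * delta
    return (c_insp + n_defects * (p * c_fail + (1 - p) * c_repair)) / delta

# grid search for the interval minimising the cost-rate criterion
candidates = [d / 10 for d in range(1, 201)]    # 0.1 .. 20.0
best = min(candidates, key=cost_rate)
```

The trade-off is visible directly: very short intervals pay too much for inspection, very long ones let too many defects mature into failures, so the criterion has an interior minimum.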

5.
In this paper the modelling of condition monitoring information for three critical water pumps at a large soft-drinks manufacturing plant is described. The purpose of the model is to predict the distribution of the residual lifetimes of the individual pumps. This information is used to aid maintenance management decision-making, principally relating to overhaul. We describe a simple decision rule to determine whether maintenance action is necessary given monitoring information to date.

6.
The proportional hazards model (PHM) is a convenient statistical tool that can be successfully applied to industrial problems, such as accelerated life testing and condition-based maintenance, as well as in the biomedical sciences. Estimation of a PHM requires lifetime data as well as condition monitoring data, which are often incomplete or missing, necessitating the use of expert knowledge to compensate. This paper describes a methodology for eliciting the expert beliefs and experience necessary to estimate the parameters of a PHM with time-dependent covariates. The paper gives background on the PHM, reviews the literature related to the knowledge elicitation problem, and lays the foundation for the proposed methodology. The knowledge elicitation process is based on case analyses and comparisons. The method results in a set of inequalities, which in turn define a feasible space for the parameters of the PHM. By sampling from the feasible space, an empirical prior distribution of the parameters can be estimated. Then, using Bayes' rule and statistical data, the posterior distribution can be obtained. The technique can also provide reliable outcomes when no statistical data are available. It has been tested several times in laboratory experiments and in a real industrial case, with promising results.
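A PHM with a time-dependent covariate can be sketched numerically: the hazard is a baseline multiplied by exp(beta * z(t)), and reliability follows by integrating the cumulative hazard. The Weibull baseline, the covariate path, and every parameter value below are illustrative assumptions, not estimates from the paper.

```python
import math

def hazard(t, z, beta=0.8, shape=2.0, scale=10.0):
    """PHM hazard with a Weibull baseline and time-dependent covariate:
    h(t) = h0(t) * exp(beta * z(t)).  All parameter values are invented."""
    h0 = (shape / scale) * (t / scale) ** (shape - 1.0)
    return h0 * math.exp(beta * z(t))

def reliability(t, z, steps=2000, **kw):
    """R(t) = exp(-integral_0^t h(u) du), by trapezoidal integration."""
    dt = t / steps
    acc = 0.0
    for i in range(steps):
        u0, u1 = i * dt, (i + 1) * dt
        acc += 0.5 * (hazard(u0, z, **kw) + hazard(u1, z, **kw)) * dt
    return math.exp(-acc)

# a covariate (say, a monitored temperature deviation) rising over time
z = lambda t: 0.1 * t
r5 = reliability(5.0, z)
```

With the covariate set to zero the formula collapses to the plain Weibull survival function, which gives a convenient correctness check.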

7.
Summary  An α-percentile residual life function does not uniquely determine a life distribution; however, a continuous life distribution can be uniquely determined by its α-percentile and β-percentile residual life functions if α and β satisfy a certain condition. Two characterizations in terms of percentile residual lifetimes are given for the Beta(1, θ, K), Exponential(λ) and Pareto(θ, K) families of distributions.
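The α-percentile residual life function is easy to compute for any distribution with a known quantile function: q_α(t) = F⁻¹(1 − (1 − α)·S(t)) − t, the α-quantile of the remaining life given survival to t. The sketch below checks the textbook fact that for the exponential distribution this function is constant in t (memorylessness); it illustrates the definition only, not the paper's characterization results.

```python
import math

def percentile_residual_life(alpha, t, cdf, inv_cdf):
    """alpha-percentile residual life q_alpha(t): the alpha-quantile of
    the remaining life T - t given T > t, i.e.
    inv_cdf(1 - (1 - alpha) * (1 - cdf(t))) - t."""
    s_t = 1.0 - cdf(t)
    return inv_cdf(1.0 - (1.0 - alpha) * s_t) - t

# Exponential(lam): the residual-life percentile should not depend on t
lam = 0.5
cdf = lambda x: 1.0 - math.exp(-lam * x)
inv = lambda p: -math.log(1.0 - p) / lam

q_at_0 = percentile_residual_life(0.5, 0.0, cdf, inv)
q_at_3 = percentile_residual_life(0.5, 3.0, cdf, inv)
```

For the exponential case both values equal ln(2)/λ, the median of the full lifetime, exactly as memorylessness requires.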

8.
Established condition-based maintenance modelling techniques can be computationally expensive. In this paper we propose an approximate methodology that uses extended Kalman filtering and condition monitoring information to recursively establish a conditional probability density function for the residual life of a component. The conditional density is then used in the construction of a maintenance/replacement decision model. The advantages of the methodology over alternative approaches are the direct use of the often multi-dimensional condition monitoring data and the on-line automation opportunity provided by the computational efficiency of the model, which potentially enables simultaneous condition monitoring and associated inference for a large number of components and monitored variables. The methodology is applied to a vibration monitoring scenario and compared with alternative models using the case data.
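The recursive filtering idea can be sketched with a scalar linear Kalman filter; the paper uses an extended filter on multi-dimensional data, so this is a deliberately minimal stand-in with invented parameters. A crude residual-life read-out then asks how long the filtered degradation level needs to reach a failure threshold.

```python
def kalman_step(x, P, y, a=1.05, q=0.01, h=1.0, r=0.04):
    """One predict/update step of a scalar Kalman filter for a latent
    degradation level x_t = a*x_{t-1} + noise, observed through a noisy
    condition-monitoring reading y_t = h*x_t + noise.  The paper's model
    is an extended (nonlinear) filter; this linear version is a sketch."""
    # predict
    x_pred = a * x
    P_pred = a * a * P + q
    # update with the new reading
    K = P_pred * h / (h * h * P_pred + r)
    x_new = x_pred + K * (y - h * x_pred)
    P_new = (1.0 - K * h) * P_pred
    return x_new, P_new

# feed a run of rising vibration readings through the filter
x, P = 0.5, 1.0
for y in [0.52, 0.60, 0.66, 0.75, 0.83]:
    x, P = kalman_step(x, P, y)

# crude plug-in residual-life estimate: steps until the mean path
# (growth factor a = 1.05 per step) crosses a failure threshold
threshold = 2.0
k, xx = 0, x
while xx < threshold:
    xx *= 1.05
    k += 1
```

A decision model would replace this plug-in step with the full conditional density of the crossing time, but the recursion itself is the computationally cheap part the abstract highlights.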

9.
Hedging a contingent claim with an asset which is not perfectly correlated with the underlying asset results in unhedgeable residual risk. Even if the residual risk is considered diversifiable, the option writer is faced with the problem of uncertainty in the estimation of the drift rates of the underlying and the hedging instrument. If the residual risk is not considered diversifiable, then this risk can be priced using an actuarial standard deviation principle in infinitesimal time. In both cases, these models result in the same nonlinear partial differential equation (PDE). A fully implicit, monotone discretization method is developed for solution of this pricing PDE. This method is shown to converge to the viscosity solution. Certain grid conditions are required to guarantee monotonicity. An algorithm is derived which, given an initial grid, inserts a finite number of nodes in the grid to ensure that the monotonicity condition is satisfied. At each timestep, the nonlinear discretized algebraic equations are solved using an iterative algorithm, which is shown to be globally convergent. Monte Carlo hedging examples are given to illustrate the profit and loss distribution at the expiry of the option.
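The role of monotonicity in an implicit discretization can be illustrated on the linear heat equation, a drastic simplification of the nonlinear pricing PDE: a fully implicit step whose matrix is an M-matrix produces no spurious oscillations, regardless of the timestep. The tridiagonal solve below uses the Thomas algorithm; the grid and coefficients are arbitrary.

```python
def implicit_heat_step(u, dt, dx, sigma=1.0):
    """One fully implicit (backward Euler) step of u_t = sigma * u_xx on a
    uniform grid with fixed (Dirichlet) ends.  The system matrix has
    positive diagonal and non-positive off-diagonals (an M-matrix), which
    is the discrete monotonicity property: new values stay inside the
    range of the old ones.  Tridiagonal solve by the Thomas algorithm."""
    n = len(u)
    r = sigma * dt / (dx * dx)
    a = [-r] * n            # sub-diagonal
    b = [1.0 + 2.0 * r] * n  # diagonal
    c = [-r] * n            # super-diagonal
    d = list(u)
    # Dirichlet boundaries: keep end values fixed
    b[0] = b[-1] = 1.0
    a[0] = c[0] = a[-1] = c[-1] = 0.0
    # forward elimination
    for i in range(1, n):
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    # back substitution
    x = [0.0] * n
    x[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (d[i] - c[i] * x[i + 1]) / b[i]
    return x

u = [0.0] * 21
u[10] = 1.0                      # a spike that should diffuse, not oscillate
u1 = implicit_heat_step(u, dt=0.1, dx=0.1)
```

The pricing PDE in the paper is nonlinear, so its scheme additionally needs the grid-insertion step and an iterative solve at each timestep; the monotone building block, however, looks like the above.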

10.
Eugenia M. Furems. TOP, 2011, 19(2): 402–420
Classification problems in decision making are at least ill-structured, if not unstructured, since, among other things, human judgments (i.e., decision-maker preferences and/or expert knowledge) are the primary source of information for solving them. Thus not only the elicitation of classification rules, but the structuring of the application domain as well, is a complex problem in itself. The paper focuses on knowledge-based classification problem structuring in the context of constructing a complete (up to the expert's knowledge) and consistent knowledge base for a diagnostic decision support system. Two structuring techniques are proposed as expert aids, along with an approach to decomposing large-size problems. It is argued that application domain structuring and classification rule elicitation have to be arranged as interconnected procedures.

11.
One feasible approach to aggregating uncertainty judgments in risk assessments is to use calibration variables (or seed questions) and the Kullback-Leibler (K-L) distance to evaluate experts' substantive or normative expertise and to assign weights based on the corresponding scores. However, the reliability of this aggregation model, and the effect of the number of seed questions or experts on the stability of the aggregated results, are still at issue. To assess the stability of the aggregation model, this study applies the jackknife re-sampling technique to a large data set of real-world expert opinions. We also use a nonlinear regression model to analyze and interpret the resulting jackknife estimates. Our statistical model indicates that the stability of Cooke's classical model, in which the components of the scoring rule are determined by the K-L distance, increases exponentially as the number of seed questions increases. Considering the difficulty and importance of creating and choosing appropriate seed variables, the results of this study justify the use of the K-L distance to determine and aggregate better probability interval or distribution estimates.
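The K-L-distance-based calibration component of Cooke's classical model can be sketched as follows: an expert's 5%/50%/95% quantiles split each seed variable's realization into four bins with theoretical probabilities (0.05, 0.45, 0.45, 0.05), and the calibration score is the chi-square tail probability of 2N·D(s||p). This sketch omits the information score and the weighting step.

```python
import math

def kl_divergence(s, p):
    """Relative entropy D(s || p), with the convention 0 * log 0 = 0."""
    return sum(si * math.log(si / pi) for si, pi in zip(s, p) if si > 0)

def chi2_3_sf(x):
    """Survival function of the chi-square distribution with 3 degrees of
    freedom (closed form, so no scipy is needed)."""
    if x <= 0:
        return 1.0
    return (math.erfc(math.sqrt(x / 2.0))
            + math.sqrt(2.0 * x / math.pi) * math.exp(-x / 2.0))

def calibration_score(bin_counts, p=(0.05, 0.45, 0.45, 0.05)):
    """Cooke-style calibration: seed realizations are binned by the
    expert's 5/50/95% quantiles; the score is the chi-square tail
    probability of 2 * N * D(s || p), the K-L-distance component of the
    classical model's scoring rule (asymptotic approximation)."""
    n = sum(bin_counts)
    s = [c / n for c in bin_counts]
    return chi2_3_sf(2.0 * n * kl_divergence(s, p))

cal_good = calibration_score([1, 9, 9, 1])    # statistically well calibrated
cal_over = calibration_score([8, 2, 2, 8])    # overconfident expert
```

The stability question the abstract studies is how scores like these move when seed questions are removed, which is exactly what the jackknife probes.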

12.
In this paper, we study upper and lower bounds for the reliability function within the harmonic new better than used in expectation (HNBUE) class of life distributions with known first two moments. A life distribution is HNBUE if the integral harmonic mean value of the residual life over any interval [0, t] is no more than its mean. By a constructive proof, we determine the lower and upper reliability bounds analytically and show that these bounds are all sharp.

13.
One of the hardest challenges in building a realistic Bayesian Network (BN) model is to construct the node probability tables (NPTs). Even with a fixed predefined model structure and very large amounts of relevant data, machine learning methods do not consistently achieve high accuracy relative to the ground truth when learning the NPT entries (parameters). Hence, it is widely believed that incorporating expert judgments can improve the learning process. We present a multinomial parameter learning method which can easily incorporate both expert judgments and data during the parameter learning process. This method uses an auxiliary BN model to learn the parameters of a given BN. The auxiliary BN contains continuous variables, and the parameter estimation amounts to updating these variables using an iterative discretization technique. The expert judgments are provided in the form of constraints on parameters, divided into two categories: linear inequality constraints and approximate equality constraints. The method is evaluated with experiments based on a number of well-known sample BN models (such as Asia, Alarm and Hailfinder) as well as a real-world software defects prediction BN model. Empirically, the new method achieves much greater learning accuracy (compared to both state-of-the-art machine learning techniques and directly competing methods) with much less data. For example, in the software defects BN, for a sample size of 20 (which would be considered difficult to collect in practice) and with a small number of real expert constraints provided, our method achieves a level of accuracy in parameter estimation that can only be matched by other methods with much larger sample sizes (320 samples required for the standard machine learning method, and 105 for the directly competing method with constraints).
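The idea of combining sparse data with expert inequality constraints can be sketched with plain rejection sampling over a Dirichlet posterior. This is only a stand-in for the paper's auxiliary-BN machinery, with invented counts and a single ordering constraint.

```python
import random

def dirichlet_sample(alphas, rng):
    """Draw from Dirichlet(alphas) via normalised Gamma variates."""
    g = [rng.gammavariate(a, 1.0) for a in alphas]
    s = sum(g)
    return [x / s for x in g]

def constrained_posterior_mean(counts, constraints, n=20_000, seed=7):
    """Sketch of combining data with expert constraints when learning one
    multinomial NPT column: sample the Dirichlet posterior implied by the
    data counts (flat prior), keep only draws satisfying the expert's
    inequality constraints, and average the survivors.  This is simple
    rejection sampling, not the paper's auxiliary-BN method."""
    rng = random.Random(seed)
    alphas = [c + 1.0 for c in counts]
    kept = []
    for _ in range(n):
        p = dirichlet_sample(alphas, rng)
        if all(con(p) for con in constraints):
            kept.append(p)
    k = len(kept)
    return [sum(p[i] for p in kept) / k for i in range(len(counts))]

# a tiny data sample plus an expert who insists state 0 is the most likely
counts = [3, 2, 1]
constraints = [lambda p: p[0] >= p[1] >= p[2]]
est = constrained_posterior_mean(counts, constraints)
```

With only six observations the data alone are weak; the constraint plays the role the paper assigns to expert judgment, ruling out the implausible region of parameter space.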

14.
For large-scale group decision-making problems, a method is proposed for the stepwise aggregation of group judgment information based on the similarity of expert opinions. First, the combined similarity between each pair of experts' judgments is constructed from the grey relational degree and the cosine of the angle between their alternative-ranking vectors. Second, taking judgment similarity as the criterion, a breadth-first neighbour search algorithm is used to cluster the experts. Then, with minimal judgment deviation as the objective, a nonlinear constrained programming model aggregates the opinions within each class, yielding the aggregated information of the experts in that class. Finally, starting from the class containing the most experts, the aggregated judgment information of each class is combined in turn to obtain the final evaluation result. The method transforms a large-scale, complex group decision into a low-complexity, multi-stage expert information aggregation problem, while guaranteeing the consistency of the group result. A worked example verifies the feasibility and effectiveness of the method.
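The similarity-then-cluster stage described above can be sketched directly: combine the cosine of the angle between two experts' ranking vectors with their grey relational degree, then group experts by a breadth-first search over the similarity graph. The equal weighting, the threshold, and the example rankings below are all invented.

```python
import math

def cosine(x, y):
    """Cosine of the angle between two ranking vectors."""
    num = sum(a * b for a, b in zip(x, y))
    return num / (math.sqrt(sum(a * a for a in x))
                  * math.sqrt(sum(b * b for b in y)))

def grey_relational(x, y, rho=0.5):
    """Grey relational degree between two sequences (resolution rho)."""
    d = [abs(a - b) for a, b in zip(x, y)]
    dmin, dmax = min(d), max(d)
    if dmax == 0:
        return 1.0
    return sum((dmin + rho * dmax) / (di + rho * dmax) for di in d) / len(d)

def combined_similarity(x, y, w=0.5):
    """Equal-weight combination of the two similarity measures."""
    return w * cosine(x, y) + (1 - w) * grey_relational(x, y)

def cluster(rank_vectors, threshold=0.8):
    """Breadth-first clustering: experts join the same cluster when their
    combined judgment similarity reaches the threshold (transitively)."""
    n = len(rank_vectors)
    unvisited = set(range(n))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        group, frontier = [seed], [seed]
        while frontier:
            i = frontier.pop()
            near = [j for j in unvisited
                    if combined_similarity(rank_vectors[i],
                                           rank_vectors[j]) >= threshold]
            for j in near:
                unvisited.remove(j)
            group += near
            frontier += near
        clusters.append(sorted(group))
    return clusters

# ranking vectors over 4 alternatives from 5 hypothetical experts
experts = [[1, 2, 3, 4], [1, 2, 4, 3], [4, 3, 2, 1], [4, 3, 1, 2], [1, 2, 3, 4]]
groups = cluster(experts)
```

Each resulting cluster would then be aggregated by the method's constrained-programming step, which is not reproduced here.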

15.
There exists a wide variety of models for returns, and the chosen model determines the tool required to calculate the value at risk (VaR). This paper introduces an alternative to model-based simulation, using Monte Carlo simulation of the Dirichlet process. The model is constructed in a Bayesian framework, using properties first described by Ferguson. A notable advantage of this model is that, on average, the random draws are sampled from a mixed distribution consisting of a prior guess supplied by an expert and the empirical process based on a random sample of historical asset returns. The method is relatively automatic and, similar to machine learning tools, the estimate is updated as new data arrive.
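Ferguson's posterior-mean property gives the simulation rule sketched below: each draw comes from the expert's prior guess G0 with probability α/(α + n) and from the empirical sample otherwise, and VaR is then read off the simulated quantile. The return sample, the prior guess, and α are all synthetic here.

```python
import random

def dp_var(returns, prior_draw, alpha=5.0, level=0.05, n_sims=50_000, seed=3):
    """VaR via Monte Carlo from a Dirichlet-process posterior mean: each
    simulated return is drawn from the expert's prior guess G0 with
    probability alpha / (alpha + n) and from the empirical sample of n
    historical returns otherwise (alpha is the prior weight; a sketch of
    Ferguson's mixture property, not the paper's full procedure)."""
    rng = random.Random(seed)
    n = len(returns)
    w = alpha / (alpha + n)
    sims = [prior_draw(rng) if rng.random() < w else rng.choice(returns)
            for _ in range(n_sims)]
    sims.sort()
    return -sims[int(level * n_sims)]   # positive number = loss at `level`

# 250 synthetic "historical" daily returns plus a Normal(0, 2%) prior guess
hist_rng = random.Random(0)
returns = [hist_rng.gauss(0.0005, 0.01) for _ in range(250)]
prior_draw = lambda rng: rng.gauss(0.0, 0.02)
var95 = dp_var(returns, prior_draw)
```

As new observations arrive, n grows and the prior weight α/(α + n) shrinks automatically, which is the updating behaviour the abstract emphasizes.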

16.
In this paper, we consider a two-state (up and down) network consisting of n links. We study the D-spectrum based dynamic reliability of the network under the assumption that the links are subject to failure according to a nonhomogeneous Poisson process. Several mixture representations are provided for the reliability function of residual lifetime of used networks, under different conditions on the status of the network or its links. These representations enable us to explore the residual reliability of operating networks in terms of the reliability functions of residual lifetimes of upper record values. The distribution function of inactivity time of a network is examined under the condition that the network has failed by inspection time t. Stochastic ordering properties of the residual lifetimes of networks under conditional D-spectra are investigated. Several examples and graphs are also provided to illustrate the established results.
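The D-spectrum (destruction spectrum) itself can be computed exactly for a small network by enumerating link-failure orders; the sketch below does this for the classic 5-link bridge network, which is an invented example rather than one from the paper. The network's reliability then follows from the spectrum and the order statistics of the link lifetimes; only the spectrum is computed here.

```python
from itertools import permutations
import math

# bridge network: terminals s and t, intermediate nodes a and b
EDGES = {0: ("s", "a"), 1: ("s", "b"), 2: ("a", "t"), 3: ("b", "t"),
         4: ("a", "b")}

def connected(alive):
    """Is t reachable from s using only the alive edges?  (DFS)"""
    adj = {}
    for e in alive:
        u, v = EDGES[e]
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    stack, seen = ["s"], {"s"}
    while stack:
        u = stack.pop()
        if u == "t":
            return True
        for v in adj.get(u, []):
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return False

def d_spectrum():
    """Destruction spectrum: spec[i] = P(the network fails exactly at the
    (i+1)-th link failure) under a uniformly random failure order,
    obtained by enumerating all 5! orders of this small network."""
    n = len(EDGES)
    counts = [0] * n
    for order in permutations(range(n)):
        alive = set(range(n))
        for i, e in enumerate(order):
            alive.remove(e)
            if not connected(alive):
                counts[i] += 1
                break
    total = math.factorial(n)
    return [c / total for c in counts]

spec = d_spectrum()
```

For larger networks the enumeration is replaced by Monte Carlo estimation of the spectrum, but the definition being sampled is exactly the one above.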

17.
One of the challenges managers face when trying to understand complex, technological systems (in their efforts to mitigate system risks) is the quantification of accident probability, particularly in the case of rare events. Once this risk information has been quantified, managers and decision makers can use it to develop appropriate policies, design projects, and/or allocate resources that will mitigate risk. However, rare event risk information inherently suffers from a sparseness of accident data. Therefore, expert judgment is often elicited to develop frequency data for these high-consequence rare events. When applied appropriately, expert judgment can serve as an important (and, at times, the only) source of risk information. This paper presents a Bayesian methodology for assessing relative accident probabilities and their uncertainty using paired comparison to elicit expert judgments. The approach is illustrated using expert judgment data elicited for a risk study of the largest passenger ferry system in the US.
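Paired-comparison judgments can be turned into relative probabilities with a Bradley-Terry fit. The MM iteration below is a maximum-likelihood stand-in for the paper's Bayesian treatment, and the comparison counts are invented.

```python
def bradley_terry(wins, n_iter=200):
    """Relative 'accident likelihood' weights from paired-comparison
    counts via the Bradley-Terry model, fitted with the classic MM
    (minorize-maximize) iteration.  wins[i][j] = number of experts who
    judged scenario i more likely than scenario j."""
    n = len(wins)
    w = [1.0] * n
    for _ in range(n_iter):
        new = []
        for i in range(n):
            num = sum(wins[i][j] for j in range(n) if j != i)
            den = sum((wins[i][j] + wins[j][i]) / (w[i] + w[j])
                      for j in range(n) if j != i)
            new.append(num / den)
        s = sum(new)
        w = [x / s for x in new]   # normalise to relative probabilities
    return w

# three accident scenarios compared pairwise by 10 hypothetical experts
wins = [[0, 8, 9],
        [2, 0, 7],
        [1, 3, 0]]
weights = bradley_terry(wins)
```

A Bayesian version, as in the paper, would additionally carry uncertainty bands on these weights rather than point estimates.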

18.
A new life distribution, the two-parameter generalized exponential sum distribution, is proposed. We study its density function, failure rate function, average failure rate function, shape characteristics, and the numerical characteristics of its mean residual life. Several methods of point estimation for the parameters are discussed, and their precision is compared through Monte Carlo simulation; the best linear unbiased estimator proves the strongest of these methods. Several methods of interval estimation for the parameters are also given; their precision is likewise assessed by Monte Carlo simulation, and the best linear unbiased and best linear invariant estimators are used to construct interval estimates that outperform the other methods. Finally, several simulation examples and a tank-maintenance case illustrate the application of the methods presented in this paper.

19.
Over the last years, the valuation of life insurance contracts using concepts from financial mathematics has become a popular research area for actuaries as well as financial economists. In particular, several methods have been proposed for modelling and pricing participating policies, which are characterized by an annual interest-rate guarantee and bonus distribution rules. However, despite the long terms of life insurance products, most valuation models that allow for sophisticated bonus distribution rules and the inclusion of frequently offered options assume a simple Black–Scholes setup and, more specifically, deterministic or even constant interest rates. We present a framework in which participating life insurance contracts, including the predominant kinds of guarantees and options, can be valued and analyzed in a stochastic interest-rate environment. In particular, the different option elements can be priced and analyzed separately. We use Monte Carlo and discretization methods to derive the respective values. The sensitivity of the contract and guarantee values with respect to multiple parameters is studied using the bonus distribution schemes introduced in [Bauer, D., Kiesel, R., Kling, A., Ruß, J., 2006. Risk-neutral valuation of participating life insurance contracts. Insurance: Math. Econom. 39, 171–183]. Surprisingly, even though the value of the contract as a whole is only moderately affected by the stochasticity of the short rate of interest, the values of the different embedded options change considerably in comparison with their values under constant interest rates. Furthermore, using a simplified asset portfolio and empirical parameter estimates, we show that the proportion of stock within the insurer's asset portfolio substantially affects the value of the contract.

20.
The returns on most financial assets exhibit kurtosis, and many also have probability distributions that possess skewness. In this paper a general multivariate model for the probability distribution of asset returns, incorporating both kurtosis and skewness, is described. It is based on the multivariate extended skew-Student-t distribution. Salient features of the distribution are described and applied to the task of asset pricing. The paper shows that the market model is non-linear in general and that the sensitivity of asset returns to the return on the market portfolio is not the same as the conventional beta, although this measure does arise in special cases. It is shown that the variance of asset returns is time-varying and depends on the squared deviation of the market portfolio return from its location parameter. The first-order conditions for portfolio selection are described. Expected utility maximisers will select portfolios from an efficient surface, an analogue of the familiar mean-variance frontier, which may be implemented using quadratic programming.


Copyright © 北京勤云科技发展有限公司 | 京ICP备09084417号