Similar Documents
20 similar documents found (search time: 10 ms)
1.
Nonstationary shock models
This paper extends results obtained by Esary, Marshall and Proschan [10]. Life distribution properties of a device subject to shocks governed by a nonhomogeneous Poisson process are related to corresponding properties of the probability of failing after experiencing a given number of shocks. Physically motivated models are analyzed in which shocks cause damage to a set of components, the damages accumulate additively, and when the accumulated damage exceeds a critical threshold (possibly random) for any of the components, the device fails. Bounds are obtained on the moments of the life length of the device.
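The additive-damage mechanism described here is easy to exercise numerically. A minimal Monte Carlo sketch follows; for simplicity it uses homogeneous Poisson shock arrivals (the paper treats the nonhomogeneous case), and all rates, damage means and the exponential threshold are illustrative assumptions, not values from the paper:

```python
import random

def device_lifetime(rate=1.0, damage_mean=1.0, threshold_mean=5.0,
                    horizon=1000.0, rng=random):
    """Simulate one device: shocks arrive as a Poisson process, damages
    accumulate additively, and the device fails once the accumulated
    damage crosses a random critical threshold."""
    threshold = rng.expovariate(1.0 / threshold_mean)      # random threshold
    t, damage = 0.0, 0.0
    while t < horizon:
        t += rng.expovariate(rate)                         # next shock arrival
        damage += rng.expovariate(1.0 / damage_mean)       # additive damage
        if damage > threshold:
            return t                                       # failure time
    return horizon                                         # censored at horizon

random.seed(42)
lifetimes = [device_lifetime() for _ in range(10000)]
mean_life = sum(lifetimes) / len(lifetimes)
```

With these particular exponential choices, memorylessness makes each shock fatal with probability 1/6, so the mean lifetime is close to 6; the paper's bounds concern exactly such moments of the life length.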

2.
In this paper, a random fuzzy shock model and a random fuzzy fatal shock model are proposed. The bivariate random fuzzy exponential distribution is then derived from the random fuzzy fatal shock model, and some of its properties are established. Finally, an example is given to show the application of the bivariate random fuzzy exponential distribution.

3.
We construct an independent increments Gaussian process associated to a class of multicolor urn models. The construction uses random variables from the urn model which are different from the random variables for which central limit theorems are available in the two color case.

4.
Fabio Boschetti, Complexity, 2016, 21(6): 202-213
Computer models can help humans gain insight into the functioning of complex systems. Used for training, they can also help gain insight into the cognitive processes humans use to understand these systems. By influencing humans' understanding (and consequent actions), computer models can thus generate an impact both on these actors and on the very systems they are designed to simulate. When these systems also include humans, a number of self-referential relations emerge which can lead to very complex dynamics. This is particularly true when we explicitly acknowledge and model the existence of multiple conflicting representations of reality among different individuals. Given the increasing availability of computational devices, the use of computer models to support individual and shared decision making could have implications far wider than the ones usually discussed within the Information and Communication Technologies community in terms of computational power and network communication. We discuss some theoretical implications and describe some initial numerical simulations. © 2015 Wiley Periodicals, Inc. Complexity 21: 202-213, 2016

5.
Standard assumptions in shock models are that failures of items are related either to the cumulative effect of shocks (cumulative models) or that they are caused by shocks that exceed a certain critical level (extreme shock models). In this paper, we present useful generalizations of this setting to the case when an item is deteriorating itself, for example, when the boundary for the fatal shock magnitude is decreasing with time. Three stochastic failure models describing different impacts of shocks on items are considered. The cumulative effect of shocks is modeled in a way similar to the proportional hazards model. Explicit formulas for the corresponding survival functions are derived and several simple examples are considered. Copyright © 2012 John Wiley & Sons, Ltd.
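The deteriorating-item idea can be sketched in a few lines. The exponentially decaying boundary, unit-rate Poisson arrivals and exponential shock magnitudes below are my illustrative assumptions, not the paper's specification; the point is only that a shrinking boundary makes late shocks more dangerous than early ones:

```python
import math
import random

def failure_time(rate=1.0, b0=3.0, decay=0.1, horizon=1000.0, rng=random):
    """Extreme-shock model with a deteriorating item: the fatal-magnitude
    boundary b(t) = b0 * exp(-decay * t) shrinks over time."""
    t = 0.0
    while t < horizon:
        t += rng.expovariate(rate)                 # Poisson shock arrivals
        magnitude = rng.expovariate(1.0)           # i.i.d. shock magnitudes
        if magnitude > b0 * math.exp(-decay * t):  # time-dependent boundary
            return t
    return horizon

random.seed(0)
times = [failure_time() for _ in range(5000)]
mean_t = sum(times) / len(times)
```

Averaging such simulated failure times is a quick sanity check against the explicit survival-function formulas the paper derives.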

6.
We consider the extreme values of a portfolio of independent continuous Gaussian processes that are asymptotically locally stationary, with given expectations, variances, and a linear drift determined by certain constants. We derive the exceedance probability, which may be interpreted as a ruin probability. AMS 2000 Subject Classifications: Primary 60G15, 62G32, 91B28

7.
Bóna (2007) [6] studied the distribution of ascents, plateaux and descents in the class of Stirling permutations, introduced by Gessel and Stanley (1978) [13]. Recently, Janson (2008) [17] showed the connection between Stirling permutations and plane recursive trees and proved a joint normal law for the parameters considered by Bóna. Here we consider generalized Stirling permutations, extending the earlier results of Bóna (2007) [6] and Janson (2008) [17], and relate them to certain families of generalized plane recursive trees and also to (k+1)-ary increasing trees. We also give two different bijections between certain families of increasing trees, both of which yield as a special case a bijection between ternary increasing trees and plane recursive trees. In order to describe the (asymptotic) behaviour of the parameters of interest, we study three (generalized) Pólya urn models using various methods.
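For readers unfamiliar with Pólya urns, the plain two-color case already shows the characteristic behaviour: the color fraction converges almost surely to a random (Beta-distributed) limit. The paper's generalized urns add more colors and richer replacement rules; this sketch is only the textbook baseline:

```python
import random

def polya_urn(draws=2000, add=1, rng=random):
    """Classic two-color Polya urn: draw a ball uniformly at random and
    return it together with `add` extra balls of the same color.
    Starting from one ball of each color, the limiting white fraction
    is Uniform(0, 1)."""
    white, black = 1, 1
    for _ in range(draws):
        if rng.random() < white / (white + black):
            white += add
        else:
            black += add
    return white / (white + black)

random.seed(1)
limits = [polya_urn() for _ in range(200)]
avg = sum(limits) / len(limits)
```

Each run settles near a different limit, while the average over many runs stays near 1/2, reflecting the uniform limiting law.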

8.
Optimization, 2012, 61(4): 629-636
A general shock model associated with a correlated pair (X_n, Y_n) of renewal sequences is considered. The system fails when the magnitude of a shock exceeds a random threshold Z following an exponential law. The distribution of the system failure time T_Z is found, and the first two moments of T_Z are derived. A class of correlated cumulative shock models is also studied. As an application, a stochastic clearing system is examined in detail.

9.
Estimating financial risk is a critical issue for banks and insurance companies. Recently, quantile estimation based on extreme value theory (EVT) has found a successful domain of application in such a context, outperforming other methods. Given a parametric model provided by EVT, a natural approach is maximum likelihood estimation. Although the resulting estimator is asymptotically efficient, the number of observations available to estimate the parameters of the EVT models is often too small to make the large-sample property trustworthy. In this paper, we study a new estimator of the parameters, the maximum Lq-likelihood estimator (MLqE), introduced by Ferrari and Yang (Estimation of tail probability via the maximum Lq-likelihood method, Technical Report 659, School of Statistics, University of Minnesota, 2007). The MLqE is characterized by a distortion parameter q and extends the traditional log-likelihood maximization procedure: when q→1, the new estimator approaches the traditional maximum likelihood estimator (MLE), recovering its desirable asymptotic properties; when q ≠ 1 and the sample size is moderate or small, the MLqE successfully trades bias for variance, resulting in an overall gain in accuracy (mean squared error). We show that the MLqE outperforms the standard MLE when estimating tail probabilities and quantiles of the generalized extreme value (GEV) and generalized Pareto (GP) distributions. First, we assess the relative efficiency of the MLqE and the MLE for various sample sizes using Monte Carlo simulations. Second, we analyze the performance of the MLqE for extreme quantile estimation using real-world financial data.
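The Lq mechanism itself is compact: replace log(u) in the log-likelihood by the deformed logarithm (u^(1-q) - 1)/(1 - q). The sketch below applies it to a simple exponential model with a crude grid search, purely to show the q→1 limit; the paper fits GEV/GP distributions, and the function names here are my own:

```python
import math
import random

def Lq(u, q):
    """Deformed logarithm used by the maximum Lq-likelihood estimator;
    recovers log(u) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.log(u)
    return (u ** (1.0 - q) - 1.0) / (1.0 - q)

def mlq_rate(sample, q, grid=None):
    """MLq estimate of an exponential rate by grid search (an illustrative
    stand-in for the paper's GEV/GP fits)."""
    grid = grid or [0.01 * k for k in range(1, 1000)]
    def objective(lam):
        return sum(Lq(lam * math.exp(-lam * x), q) for x in sample)
    return max(grid, key=objective)

random.seed(7)
data = [random.expovariate(2.0) for _ in range(30)]   # true rate = 2
mle = mlq_rate(data, q=1.0)    # q = 1 reproduces the ordinary MLE (on the grid)
mlqe = mlq_rate(data, q=0.9)   # q < 1 distorts the likelihood, trading bias for variance
```

For q = 1 the grid maximizer agrees with the closed-form MLE 1/mean up to grid resolution; varying q shows how the distortion shifts the estimate in small samples.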

10.
This paper introduces a multiple quantile utility model of Cumulative Prospect Theory in an ambiguous setting. We prove a representation theorem in which a prospect is valued by a composite value function. The composite value function is able to represent asymmetric attitudes toward extreme events and rational prudence toward ordinary events.

11.
This paper proposes a conditional technique for the estimation of VaR and expected shortfall measures based on the skewed generalized t (SGT) distribution. The estimation of the conditional mean and conditional variance of returns is based on ten popular variations of the GARCH model. The results indicate that the TS-GARCH and EGARCH models have the best overall performance. The remaining GARCH specifications, except in a few cases, produce acceptable results. An unconditional SGT-VaR performs well in-sample but fails the out-of-sample tests, indicating the need to incorporate time-varying mean and volatility estimates in the computation of VaR and expected shortfall measures.
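A bare-bones version of the conditional VaR computation looks as follows. Fixed GARCH(1,1) parameters and normal innovations are my simplifying assumptions so the sketch stays self-contained; the paper estimates ten GARCH variants and uses SGT innovations instead:

```python
import random
from statistics import NormalDist

def garch_var(returns, omega=1e-6, alpha=0.05, beta=0.90, level=0.99):
    """One-step-ahead Value-at-Risk from a GARCH(1,1) variance recursion
    with fixed (assumed, not estimated) parameters, zero conditional mean,
    and a normal innovation."""
    var_t = sum(r * r for r in returns) / len(returns)  # start at sample variance
    for r in returns:
        var_t = omega + alpha * r * r + beta * var_t    # sigma_{t+1}^2 recursion
    z = NormalDist().inv_cdf(1.0 - level)               # lower-tail quantile
    return -z * var_t ** 0.5                            # reported as a positive loss

random.seed(3)
rets = [random.gauss(0.0, 0.01) for _ in range(500)]
var99 = garch_var(rets)
```

Swapping the normal quantile for an SGT quantile, and the fixed parameters for estimated ones, turns this skeleton into the paper's conditional procedure; expected shortfall replaces the quantile with a tail expectation.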

12.
This paper presents variable acceptance sampling plans based on the assumption that consecutive observations on a quality characteristic (X) are autocorrelated and are governed by a stationary autoregressive moving average (ARMA) process. The sampling plans are obtained under the assumption that an adequate ARMA model can be identified from historical data on the process. Two types of acceptance sampling plans are presented: (1) Non-sequential acceptance sampling: an ARMA model is identified from historical data, parameter estimates are used to determine the action limit (k) and the sample size (n), and a decision regarding acceptance of the process is made after a complete sample of size n is selected. (2) Sequential acceptance sampling: here too an ARMA model is identified from historical data, but a decision regarding whether or not to accept the process is made after each individual sample observation becomes available; the concept of the Sequential Probability Ratio Test (SPRT) is used to derive the sampling plans. Simulation studies are used to assess the effect of uncertainties in parameter estimates and the effect of model misidentification (based on historical data) on the sample size of the plans. Macros for computing the required sample size using both methods, based on several ARMA models, can be found on the author's web page.
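The SPRT core of the sequential plan is short. The sketch below tests a normal mean with i.i.d. observations; the paper works with ARMA-dependent data, which changes the likelihood-ratio increments but not the boundary logic, and the numerical settings here are illustrative:

```python
import math
import random

def sprt(observations, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05, beta=0.10):
    """Wald's SPRT for H0: mean = mu0 vs H1: mean = mu1 with known sigma.
    Returns the decision and the number of observations consumed."""
    upper = math.log((1 - beta) / alpha)   # cross upward  -> decide H1 (reject)
    lower = math.log(beta / (1 - alpha))   # cross downward -> decide H0 (accept)
    llr = 0.0
    for n, x in enumerate(observations, 1):
        # log-likelihood-ratio increment for normal densities
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "reject", n
        if llr <= lower:
            return "accept", n
    return "undecided", len(observations)

random.seed(11)
results = [sprt([random.gauss(0.0, 1.0) for _ in range(200)]) for _ in range(100)]
accept_rate = sum(d == "accept" for d, _ in results) / len(results)
avg_n = sum(n for _, n in results) / len(results)
```

On data from H0 the procedure accepts with probability roughly 1 - alpha and typically after only a handful of observations, which is exactly the economy the sequential plan exploits.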

13.
Sharp comparisons between aging renewal process shock models and the corresponding Esary-Marshall-Proschan (EMP) shock model are considered. The usefulness of such comparisons derives from the simplicity of the latter model. Simple conditions are derived under which such aging renewal process shock models are stochastically ordered relative to a corresponding EMP model. Applications to renewal functions and single-server queues are indicated.

14.
We show that a simple mixing idea allows one to establish a number of explicit formulas for ruin probabilities and related quantities in collective risk models with dependence among claim sizes and among claim inter-occurrence times. Examples include compound Poisson risk models with completely monotone marginal claim size distributions that are dependent according to Archimedean survival copulas as well as renewal risk models with dependent inter-occurrence times.

15.
Two devices are subjected to shocks arriving according to a general counting process. Let M1 and M2 be the random number of shocks that cause the failure of the first and the second device, respectively. We find conditions on the counting process such that the mean residual life ordering, the increasing convex ordering and the expectation ordering between M1 and M2 are preserved in the random lifetimes of the two devices.

16.
Mathematical Programming - We consider maximum likelihood estimation for Gaussian Mixture Models (GMMs). This task is almost invariably solved (in theory and practice) via the Expectation...
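The standard workhorse for maximizing the GMM likelihood is the EM iteration; a minimal self-contained 1-D, two-component sketch follows (the initialization at the sample extremes and the fixed iteration count are ad hoc choices for illustration):

```python
import math
import random

def em_gmm_1d(data, iters=50):
    """Plain EM for a two-component 1-D Gaussian mixture: alternate
    computing responsibilities (E-step) and weighted parameter updates
    (M-step). Converges to a local maximum of the likelihood."""
    mu = [min(data), max(data)]          # crude but separating initialization
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for x in data:
            w = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = sum(w)
            resp.append([wk / s for wk in w])
        # M-step: responsibility-weighted means, variances and weights
        for k in range(2):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
            pi[k] = nk / len(data)
    return mu, var, pi

random.seed(5)
data = ([random.gauss(-3, 1) for _ in range(200)]
        + [random.gauss(3, 1) for _ in range(200)])
mu, var, pi = em_gmm_1d(data)
```

On this well-separated two-cluster sample the estimated means land near -3 and 3; EM's guarantee is only a local optimum, which is precisely the theory-versus-practice gap the abstract alludes to.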

17.
An extended stochastic failure model for a system subject to random shocks
In this article, a stochastic failure model for a system subject to a random shock process is studied. It is assumed that a fatal shock results in an immediate system failure, whereas a non-fatal shock may increase the susceptibility of the system to failure. The lifetime distribution of the system and its failure rate function are derived, and the effect of environmental factors on the failure process of the system is also investigated. Lifetimes of systems operated under different environmental conditions are stochastically compared.

18.
The theory and behavior of the clock version of the ascending auction has been well understood for at least 20 years. The more widely used oral outcry version, which allows bidders to submit their own bids, has been the subject of some recent controversy, mostly in regard to whether or not jump bidding, i.e. bidders submitting bids higher than required by the auctioneer, should be allowed. Isaac, Salmon & Zillante (2005) show that the standard equilibrium for the clock auction does not apply to the non-clock format and construct an equilibrium bid function intended to match field data on ascending auctions. In this study, we use economic experiments to provide a direct empirical test of that model while simultaneously providing empirical evidence to resolve the policy disputes centered around the place of jump bidding in ascending auctions. Received: March 2005. The authors would like to thank Florida State University for providing the funding for the experiments in this paper and Bradley Andrews for programming assistance.

19.
Consider the class of linear models (with uncorrelated observations, each having variance σ^2) in which it is known that at most k (location) parameters are negligible, but it is not known which. The problem is to identify the nonnegligible parameters. In this paper, for k = 1, and under certain restrictions on the model, a technique is developed for solving this problem which has the feature of requiring (in an information-theoretic sense) the minimum amount of computation: it can "search through" 2^m objects using m "steps." The technique consists of dichotomizing the set of parameters (one known subset possibly containing the nonnegligible element, and the other not) using chi-square variables. A method for computing the probability that the correct parameter is identified is presented, and an important application to factorial search designs is established.
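The m-steps-for-2^m economy is, at its core, a bisection over candidate sets. The noise-free caricature below makes that concrete; the function names are mine, and where the paper applies a chi-square test to a subset (with the attendant error probability), this sketch assumes a perfect `measure` oracle:

```python
def locate_nonnegligible(effects, measure):
    """Idealized dichotomizing search: each step 'tests' one half of the
    remaining candidate indices, so 2**m candidates need only m steps.
    `measure(indices)` answers whether the nonnegligible parameter lies
    in the given index set (here, a noise-free oracle)."""
    lo, hi = 0, len(effects)
    steps = 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        steps += 1
        if measure(range(lo, mid)):   # does the left half contain the signal?
            hi = mid
        else:
            lo = mid
    return lo, steps

m = 5
effects = [0.0] * (2 ** m)
effects[19] = 4.2                     # the single nonnegligible parameter
found, steps = locate_nonnegligible(
    effects, lambda idx: any(effects[i] != 0.0 for i in idx))
```

With 2^5 = 32 candidates, exactly 5 dichotomies pin down index 19, matching the information-theoretic lower bound the paper invokes.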

20.
The quality of the estimation of a latent segment model when only store-level aggregate data are available appears to depend on the computational methods selected, and in particular on the optimization methodology used. Following the stream of work that emphasizes the estimation of a segmentation structure with aggregate data, this work proposes a deterministic optimization method that can provide estimates of segment characteristics as well as size, brand/product preferences, and sensitivity to price and price-promotion variation, estimates that can be accommodated in dynamic models. It is shown that, among the gradient-based optimization methods tested, the Sequential Quadratic Programming (SQP) method is the only one that, for all scenarios tested for this type of problem, offers reliability, precision and efficiency while being robust, i.e., always able to deliver a solution. Therefore, latent segment models can be estimated using the SQP method when only aggregate market data are available.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)