Similar Articles
20 similar articles found.
1.
Data generated in forestry biometrics are often not normal in the statistical sense, as they rarely follow the normal regression model. Hence there is a need to develop non-normal models and methods for forest biometric applications. Owing to their generality, Bayesian methods can be applied in situations where Gaussian regression models do not fit the data. Data on diameter at breast height (dbh), a very important characteristic in forestry, are fitted to Weibull and gamma models in the Bayesian paradigm, and comparisons are made with the classical counterparts. MCMC simulation tools are used in this study, and the Bayesian simulation is carried out in the R software.
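The abstract names R, but purely as an illustration of the kind of computation involved, here is a minimal Metropolis-Hastings sketch in Python/NumPy for fitting a Weibull model to dbh data; the synthetic data, the vague Gaussian priors, and all tuning constants are assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
dbh = rng.weibull(2.5, 200) * 30.0   # synthetic dbh sample (cm); the paper's data are not given

def log_post(shape, scale, x):
    """Weibull log-likelihood plus vague Gaussian priors (the priors are an assumption)."""
    if shape <= 0 or scale <= 0:
        return -np.inf
    loglik = np.sum(np.log(shape) - np.log(scale)
                    + (shape - 1) * np.log(x / scale) - (x / scale) ** shape)
    return loglik - 1e-3 * (shape ** 2 + scale ** 2)

shape, scale, draws = 1.0, 20.0, []
for _ in range(20000):
    cand = (shape + 0.1 * rng.normal(), scale + 1.0 * rng.normal())  # random-walk proposal
    if np.log(rng.random()) < log_post(*cand, dbh) - log_post(shape, scale, dbh):
        shape, scale = cand                                          # accept
    draws.append((shape, scale))
post = np.array(draws[5000:])        # discard burn-in
print("posterior means (shape, scale):", post.mean(axis=0))
```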

2.
This paper presents a competing risks reliability model for a system that releases signals each time its condition deteriorates. The released signals are used to inform opportunistic maintenance. The model provides a framework for determining the underlying system lifetime from right-censored data, without requiring explicit assumptions about the type of censoring. The parameters of the model are estimated from observational data by maximum likelihood estimation. We illustrate the estimation process through a simulation study. The proposed signal model can be used to support decision-making in optimising preventive maintenance: at the component level, estimates of the underlying failure distribution can be used to identify the critical signal that would trigger maintenance of the individual component; at the multi-component system level, accurate estimates of the components' underlying lifetimes are important when making general maintenance decisions. The benefit of good estimation from censored data, when adequate knowledge about the dependence structure is not available, may justify the cost of collecting full signal data in cases where it is not already available.
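To make the maximum-likelihood step concrete, a minimal sketch in Python/SciPy, assuming a Weibull lifetime and independent random right-censoring; neither assumption comes from the paper, which deliberately avoids fixing the censoring mechanism.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
t_true = rng.weibull(1.8, 300) * 100.0   # latent failure times (synthetic)
c = rng.uniform(20, 150, 300)            # censoring times, e.g. opportunistic maintenance
t = np.minimum(t_true, c)                # observed times
failed = t_true <= c                     # event indicator (False = right-censored)

def neg_loglik(theta):
    k, lam = np.exp(theta)               # optimise on the log scale to keep parameters positive
    z = (t / lam) ** k
    # failures contribute log f(t); censored units contribute log S(t) = -z
    ll = np.sum(failed * (np.log(k) - np.log(lam) + (k - 1) * np.log(t / lam)) - z)
    return -ll

res = minimize(neg_loglik, x0=np.log([1.0, 50.0]))
print("estimated (shape, scale):", np.exp(res.x))
```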

3.
The dynamics of mechanical systems with a finite number of degrees of freedom (discrete mechanical systems) is governed by the Lagrange equation, a second-order differential equation on a Riemannian manifold (the configuration manifold). The handling of perfect (frictionless) unilateral constraints in this framework, that of Lagrange's analytical dynamics, was undertaken by Schatzman and Moreau at the beginning of the 1980s. A mathematically sound and consistent evolution problem was obtained, paving the road for many subsequent theoretical investigations. In this general evolution problem, the only reaction force involved is a generalized reaction force, consistent with the virtual power philosophy of Lagrange. Surprisingly, such a general formulation was never derived in the case of frictional unilateral multibody dynamics. Instead, the paradigm of the Coulomb law applying to reaction forces in the real world is generally invoked. So far, this paradigm has yielded a consistent evolution problem in only a very few specific examples, and has suggested numerical algorithms for producing computational examples (numerical modeling). In particular, it is not clear what evolution problem underlies the computational examples. Moreover, some of the few specific cases in which this paradigm enables one to write down a precise evolution problem are known to exhibit paradoxes: the Painlevé paradox (indeterminacy) and the Kane paradox (increase in kinetic energy due to friction). In this paper, we follow Lagrange's philosophy and formulate frictional unilateral multibody dynamics in terms of the generalized reaction force and not in terms of the real-world reaction force. A general evolution problem that governs the dynamics is obtained for the first time. We prove that all solutions are dissipative; that is, the new formulation is free of the Kane paradox. We also prove that some of the indeterminacy of the Painlevé paradox is resolved in this formulation.
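For background, the frictionless evolution problem of Schatzman and Moreau can be stated schematically for smooth motion (the notation here is assumed, and impacts require a measure-differential formulation that this sketch omits):

```latex
M(q)\,\ddot{q} = f(q,\dot{q},t) + \sum_{i} \lambda_i \,\nabla\varphi_i(q),
\qquad 0 \le \lambda_i \;\perp\; \varphi_i(q) \ge 0,
```

where the unilateral constraints are \varphi_i(q) \ge 0 and the sum is the generalized reaction force; the paper's contribution is to carry this generalized-force viewpoint over to the frictional case.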

4.
Granular Computing is an emerging conceptual and computing paradigm of information processing. A central notion is an information-processing pyramid with different levels of clarification. Each level is usually represented by 'chunks' of data, or granules, also known as information granules. Rough Set Theory is one of the most widely used methodologies for handling and defining granules. Ontologies are used to represent the knowledge of a domain for specific applications. A challenge is to define semantic knowledge at different levels of human-dependent detail. In this paper we propose four operations that yield several granular perspectives for a specific ontological commitment. These operations are then used to obtain various views of an ontology built with a rough-set approach. In particular, a rough-set methodology is introduced to construct a specific granular view of an ontology.
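As a minimal illustration of the rough-set granules involved, a toy Python sketch of lower and upper approximations; the paper's actual ontological operations are not reproduced here.

```python
def rough_approx(equiv_classes, target):
    """Lower/upper approximations of a target set from the granules
    (equivalence classes) of an indiscernibility relation."""
    lower = set().union(*([c for c in equiv_classes if c <= target] or [set()]))
    upper = set().union(*([c for c in equiv_classes if c & target] or [set()]))
    return lower, upper

# toy universe {1..5} partitioned into three granules
granules = [{1, 2}, {3, 4}, {5}]
target = {1, 2, 3}
print(rough_approx(granules, target))  # lower = {1, 2}, upper = {1, 2, 3, 4}
```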

5.
Personnel rostering problems are highly constrained resource allocation problems. Human rostering experts have many years of experience in making rostering decisions that reflect their individual goals and objectives. We present a novel method for capturing nurse rostering decisions and adapting them to solve new problems using the Case-Based Reasoning (CBR) paradigm. This method stores examples of previously encountered constraint violations and the operations that were used to repair them. The violations are represented as vectors of feature values. We investigate the problem of selecting and weighting features so as to improve the performance of the case-based reasoning approach. A genetic algorithm is developed for off-line feature selection and weighting using the complex data types needed to represent real-world nurse rostering problems. This approach significantly improves the accuracy of the CBR method and reduces the number of features that need to be stored for each problem. The relative importance of different features is also determined, providing insight into the nature of expert decision making in personnel rostering.
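The retrieval step can be sketched as weighted nearest-neighbour matching over violation features. This toy Python sketch, with invented case data, feature names, and weights, illustrates the idea rather than the paper's implementation:

```python
import numpy as np

def retrieve(case_base, query, weights):
    """Return the repair of the nearest stored violation case under a
    weighted Euclidean distance; in the paper the weights are evolved off-line by a GA."""
    feats = np.array([c["features"] for c in case_base])
    d = np.sqrt((((feats - query) ** 2) * weights).sum(axis=1))
    return case_base[int(np.argmin(d))]["repair"]

# hypothetical violation cases: feature vectors plus the repair operator that fixed them
case_base = [
    {"features": [3, 0, 1], "repair": "swap-shifts"},
    {"features": [0, 2, 4], "repair": "reassign-nurse"},
]
weights = np.array([0.9, 0.1, 0.5])   # stand-in for GA-evolved feature weights
print(retrieve(case_base, np.array([2, 1, 1]), weights))  # -> swap-shifts
```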

6.
In this paper we introduce the Single Period Coverage Facility Location Problem. It is a multi-period discrete location problem in which each customer is serviced in exactly one period of the planning horizon. The locational decisions are made independently for each period, so the facilities that are open need not be the same in different time periods. It is also assumed that in each period there is a minimum number of customers that must be assigned to the open facilities. The decisions to be made include not only the facilities to open in each time period and the time period in which each customer will be served, but also the allocation of customers to open facilities in their service period.
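The abstract does not give the formulation; a schematic mixed-integer program consistent with its description, in which the notation and the exact form of the minimum-assignment constraint are assumptions, is:

```latex
\min \;\sum_{t}\sum_{j} f_{jt}\, y_{jt} + \sum_{t}\sum_{i}\sum_{j} c_{ijt}\, x_{ijt}
\quad\text{s.t.}\quad
\sum_{t}\sum_{j} x_{ijt} = 1 \;\;\forall i,\qquad
x_{ijt} \le y_{jt} \;\;\forall i,j,t,\qquad
\sum_{i} x_{ijt} \ge \ell\, y_{jt} \;\;\forall j,t,\qquad
x_{ijt},\, y_{jt} \in \{0,1\},
```

where y_{jt} opens facility j in period t, x_{ijt} assigns customer i to facility j in its unique service period t, and \ell is the minimum assignment level.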

7.
In Bayesian analysis of the multidimensional scaling model with an MCMC algorithm, we encounter indeterminacy of rotation, reflection, and translation of the parameter matrix of interest. This type of indeterminacy arises in other multivariate latent variable models as well. In this paper, we propose to address this indeterminacy with a novel offline post-processing method that is easily implemented with standard, easy-to-use Markov chain Monte Carlo (MCMC) software. Specifically, we propose a post-processing method based on generalized extended Procrustes analysis. The proposed method is compared with four existing methods for dealing with indeterminacy through analyses of artificial as well as real datasets. The proposed method achieved at least as good a performance as the best existing method. The benefit of the offline processing approach in the era of easy-to-use MCMC software is discussed.
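A simplified version of the idea, using plain orthogonal Procrustes alignment from SciPy rather than the paper's generalized extended Procrustes analysis, with synthetic draws standing in for MCMC output:

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes

def align_draws(draws):
    """Remove translation by centring, then rotate/reflect each draw onto a reference."""
    ref = draws[0] - draws[0].mean(axis=0)
    aligned = []
    for X in draws:
        Xc = X - X.mean(axis=0)                 # remove translation
        R, _ = orthogonal_procrustes(Xc, ref)   # best rotation/reflection onto ref
        aligned.append(Xc @ R)
    return np.array(aligned)

rng = np.random.default_rng(2)
base = rng.normal(size=(10, 2))
# fake 'posterior draws': the same configuration, randomly rotated each iteration
draws = []
for _ in range(5):
    a = rng.uniform(0, 2 * np.pi)
    Q = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
    draws.append(base @ Q)
print(align_draws(draws).std(axis=0).max())     # near zero after alignment
```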

8.
In this paper we construct and study a natural invariant measure for a birational self-map of the complex projective plane. Our main hypothesis, that the birational map be "separating", is a condition on the indeterminacy set of the map. We prove that the measure is mixing and that it has distinct Lyapunov exponents. Under a further hypothesis on the indeterminacy set, we show that the measure is hyperbolic in the sense of Pesin theory. In this case, we also prove that saddle periodic points are dense in the support of the measure.

9.
Stochastic programs with recourse provide an effective modeling paradigm for sequential decision problems with uncertain or noisy data, when the uncertainty can be modeled by a discrete set of scenarios. In two-stage problems the decision variables are partitioned into two groups: a set of structural, first-stage decisions, and a set of second-stage, recourse decisions. The structural decisions are scenario-invariant, but the recourse decisions are scenario-dependent and can vary substantially across scenarios. In several applications it is important to restrict the variability of recourse decisions across scenarios, or to investigate the tradeoffs between the stability of recourse decisions and the expected cost of a solution. We present formulations of stochastic programs with restricted recourse that trade off recourse stability against expected cost. The models generate a sequence of solutions in which recourse robustness is progressively enforced via parameterized satisficing constraints. We investigate the behavior of the models on several test cases and examine the performance of solution procedures based on the primal-dual interior point method.
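A schematic statement of such a restricted-recourse model, with assumed notation (the paper's exact satisficing constraints may differ):

```latex
\min_{x,\;y_s}\;\; c^{\top}x + \sum_{s} p_s\, q_s^{\top} y_s
\quad\text{s.t.}\quad
Ax = b,\qquad
T_s x + W y_s = h_s \;\;\forall s,\qquad
\lVert y_s - \bar{y} \rVert \le \rho \;\;\forall s,
```

where p_s is the probability of scenario s, \bar{y} is a reference recourse plan, and tightening the parameter \rho trades expected cost for recourse stability.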

10.
Managers are typically unaware of the significant impact their decisions can have on the random mechanism driving a data-generating process. Here, a new parametric Bayesian technique is introduced that allows managers to estimate the impact of their decisions on the stochastic process driving the data; this, in turn, should enhance a company's overall decision-making capabilities. This general approach to modeling decision-dependency is carried out via an efficient Markov chain Monte Carlo method. A simulated example and a real-life example, using historical maintenance and failure time data from a system at the South Texas Project Nuclear Operating Company, exemplify the paper's theoretical contributions. Conclusive evidence of decision dependence in the failure time distribution is reported, which in turn points to an optimal maintenance policy that could yield large financial savings for the Texas-based company.

11.
Convex demand functions, although commonly used in consumer theory and supported by a large body of empirical evidence, are known to be problematic in the analysis of firms' behavior; they are therefore rarely used in oligopoly theory, owing to the possible lack of concavity of the firms' profit functions and the indeterminacy arising in the limit as marginal costs tend to zero. We investigate a dynamic oligopoly model with hyperbolic demand and sticky prices, characterizing the open-loop optimal control and the related steady-state equilibrium, to show that the indeterminacy associated with the limit of the static model is indeed confined to the steady state of the dynamic model, while the latter admits a well-behaved solution at any time during the game. Although the feedback solution cannot be obtained analytically, since the model is not of linear-quadratic form, we show that analogous considerations apply to the Bellman equation of the individual firm.
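A standard sticky-price specification consistent with this description (the notation is assumed, not taken from the paper): with hyperbolic inverse demand, the market price adjusts gradually toward the notional demand price,

```latex
\hat{p}(Q) = \frac{A}{Q}, \qquad
\dot{p}(t) = s\,\bigl(\hat{p}(Q(t)) - p(t)\bigr), \quad s > 0,
```

so that in steady state \dot{p} = 0 and p = A/Q, which is where the static model's indeterminacy at vanishing marginal cost reappears.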

12.
Haplotype inference by pure parsimony (Hipp) is a well-known paradigm for haplotype inference. In order to assess the biological significance of this paradigm, we generalize the Hipp problem to the problem of finding all optimal solutions, which we call Chipp. We study intrinsic haplotype features, such as backbone haplotypes and fat genotypes, as well as equal columns and decomposability. We explicitly exploit these features in three computational approaches based on integer linear programming, depth-first branch-and-bound, and Boolean satisfiability. Furthermore, we introduce two hybrid algorithms that draw upon the diverse strengths of the approaches. Our experimental analysis shows that our optimized algorithms are significantly superior to the baseline algorithms, often with running times that are orders of magnitude faster. Finally, our experiments provide some useful insights into the intrinsic features of this important problem.
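For intuition about the pure-parsimony objective itself (not the paper's ILP/SAT machinery), a brute-force toy sketch in Python; genotypes use the usual 0/1/2 encoding, with 2 marking heterozygous sites:

```python
from itertools import product

def resolutions(g):
    """All haplotype pairs compatible with genotype g (0/1 homozygous, 2 heterozygous)."""
    het = [i for i, a in enumerate(g) if a == 2]
    pairs = []
    for bits in product((0, 1), repeat=len(het)):
        h1, h2 = list(g), list(g)
        for i, b in zip(het, bits):
            h1[i], h2[i] = b, 1 - b    # heterozygous sites get complementary alleles
        pairs.append((tuple(h1), tuple(h2)))
    return pairs

def hipp_brute_force(genotypes):
    """Pure parsimony by exhaustion: pick one resolution per genotype
    so that the number of distinct haplotypes used is minimal."""
    best = None
    for choice in product(*map(resolutions, genotypes)):
        used = {h for pair in choice for h in pair}
        if best is None or len(used) < len(best):
            best = used
    return best

# three genotypes over three sites; the optimum needs only 3 distinct haplotypes
print(len(hipp_brute_force([(2, 2, 0), (0, 2, 0), (2, 0, 0)])))  # -> 3
```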

13.
This paper is based on an invited lecture given by the author at the ORSA/TIMS Special Interest Group on Applied Probability Conference on Statistical and Computational Problems in Probability Modeling, held at Williamsburg, Virginia, January 7–9, 1985. The theme of this paper is twofold. First, that members of the above group should be seriously concerned with issues of statistical inference; they should not stop short upon proposing a probability model. Second, that inference be undertaken via a strict adherence to the rules of probability: the Bayesian paradigm. To underscore the need for emphasizing the first theme, it may be pertinent to note that an overwhelming majority of the papers dealing with statistical and inferential issues that were presented at this conference were authored by members who did not claim to belong to the ORSA/TIMS Special Interest Group on Applied Probability. The lecture was followed by a panel discussion, with Drs. Lyle Broemeling and Edward Wegman of the Office of Naval Research as discussants. Dr. Robert Launer of the Army Research Office served as moderator. Discussions from the floor included comments by Professors D. Harrington of Harvard University, E. Parzen of Texas A&M University, and R. Smith of Imperial College, London, England. This paper, and the comments of the panelists, are published in this volume of the Annals of Operations Research, which serves as the Proceedings of the Conference. Supported by Contract No. N00014-85-K-0202, Office of Naval Research, and Grant No. DAAG 29-84-K-0160, Army Research Office.

14.
Thomas Sattig, Metaphysica (2013) 14(2): 211-223
The problem of the many poses the task of explaining mereological indeterminacy of ordinary objects in a way that sustains our familiar practice of counting these objects. The aim of this essay is to develop a solution to the problem of the many that is based on an account of mereological indeterminacy as having its source in how ordinary objects are, independently of how we represent them. At the center of the account stands a quasi-hylomorphic ontology of ordinary objects as material objects with multiple individual forms.

15.
The main purpose of this paper is to discuss how college students enrolled in a college-level elementary algebra course exercised control decisions while working on routine and non-routine problems, and how their personal belief systems shaped those control decisions. In order to prepare students for success in mathematics, we as educators need to understand the process steps they use to solve homework or examination questions; in other words, understand how they “do” mathematics. The findings in this study suggest that an individual’s belief system affects how they approach a problem. Lack of confidence and previous lack of success combined to prompt swift decisions to stop working. Further findings indicate that students continue with unsuccessful strategies when working on unfamiliar problems due to a perceived dependence of solution strategies on specific problem types. In this situation, the students persisted with an inappropriate solution strategy, never reaching a correct solution. Control decisions concerning the pursuit of alternative strategies are not an issue if the students are unaware that they might need to make different choices during their solutions. More successful control decisions were made when working with familiar problems.

16.
The physics of granular materials is interesting from many points of view because they exhibit a wealth of phenomena that have both fluid and solid aspects [C.S. Campbell, Annu. Rev. Fluid. Mech. 22 (1990) 57; H.M. Jaeger, S.R. Nagel, R.P. Behringer, Phys. Today 49(4) (1996) 32]. Recently, a striking pattern was observed when sand falls in the space between two plates and passes an obstacle [Y. Amarouchene, J.F. Boudet, H. Kellay, Phys. Rev. Lett. 86 (2001) 4286]. The interesting behaviour occurs on top of the obstacle, where a dynamic dune with a parabolic tip is formed. Inside this parabola, a triangular region of non-flowing or very slowly flowing sand is observed. Using factor analysis it is possible to extract latent parameters from a dynamic process. Applying a three-factor model, we can clearly identify the inner triangle (first factor) and the outer parabolic pattern (third factor). The second factor we interpret as a shock wave. We show that the pattern formation process depends on the restitution coefficients (particle-particle and particle-obstacle) and also on the particle size. These findings cannot be observed if standard velocity profiles are used to analyse the data. Our findings show that most interactions between particles take place in a relatively small region, correlating with the particle size. If the collisions between particles and between particle and obstacle are elastic, the formation of a non-flowing triangular region is more difficult than if the collisions are inelastic. The factor curves also clearly show that one pattern formation process has to finish before the next pattern can form.
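To illustrate the latent-factor extraction step, a toy sketch using scikit-learn, with synthetic data standing in for the sand-flow measurements; the three-factor choice follows the abstract, everything else is an assumption:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
# stand-in for the measured flow field: 200 snapshots of 50 spatial cells,
# generated from 3 latent spatial patterns plus noise (the real data are sand-flow images)
latent = rng.normal(size=(200, 3))
loadings = rng.normal(size=(3, 50))
X = latent @ loadings + 0.1 * rng.normal(size=(200, 50))

fa = FactorAnalysis(n_components=3)   # three-factor model, as in the abstract
scores = fa.fit_transform(X)
print(fa.components_.shape, scores.shape)  # (3, 50) spatial patterns, (200, 3) factor curves
```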

17.
In this paper, we propose a new methodology for handling optimization problems with uncertain data. Under the usual Robust Optimization paradigm, one looks for decisions ensuring a required performance for all realizations of the data from a given bounded uncertainty set; under the proposed approach, we also require a controlled deterioration in performance when the data fall outside the uncertainty set. The extension of the Robust Optimization methodology developed in this paper opens up new possibilities for efficiently solving multi-stage finite-horizon uncertain optimization problems, and in particular for analyzing and synthesizing linear controllers for discrete-time dynamical systems. Research was supported by the Binational Science Foundation grant #2002038.
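Schematically, with assumed notation, a constraint f(x, ζ) ≤ 0 is required to hold on the uncertainty set and to degrade in a controlled way outside it:

```latex
f(x,\zeta) \le 0 \;\;\forall \zeta \in \mathcal{U},
\qquad
f(x,\zeta) \le \alpha \cdot \operatorname{dist}(\zeta,\mathcal{U}) \;\;\forall \zeta,
```

where \alpha bounds the rate at which performance may deteriorate with the distance of the data from \mathcal{U}; this is a sketch of the requirement described in the abstract, not the paper's exact counterpart.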

18.
Matti Eklund, Metaphysica (2013) 14(2): 165-179
The topic of this paper is whether there is metaphysical vagueness. It is shown that it is important to distinguish between the general phenomenon of indeterminacy and the narrower phenomenon of vagueness (the phenomenon that paradigmatically rears its head in sorites reasoning). Relatedly, it is important to distinguish between metaphysical indeterminacy and metaphysical vagueness. One may wish to allow metaphysical indeterminacy but rule out metaphysical vagueness. As is discussed in the paper, central arguments against metaphysical vagueness, like those of Gareth Evans and Mark Sainsbury, would, if successful, rule out metaphysical indeterminacy. One way to argue specifically against the possibility of metaphysical vagueness might be to argue for a specific theory of the nature of vagueness according to which vagueness is a semantic phenomenon. But it is shown that there are complications pertaining to arguments with that structure as well. Toward the end of the paper, I discuss Trenton Merricks' well-known argument against a semantic view of vagueness and for a metaphysical view.

19.
A commonly used paradigm in modeling count data is to assume that individual counts are generated from a Binomial distribution, with probabilities varying between individuals according to a Beta distribution. The marginal distribution of the counts is then Beta-Binomial. Bradlow, Hardie, and Fader (2002, p. 189) make use of polynomial expansions to simplify Bayesian computations with Negative-Binomial distributed data. This article exploits similar expansions to facilitate Bayesian inference with data from the Beta-Binomial model. This is of great practical and computational importance for many problems, as previous research has resorted to computationally intensive numerical integration or Markov chain Monte Carlo techniques.
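The marginal referred to here is standard: integrating the Binomial likelihood against a Beta(α, β) mixing density gives

```latex
\Pr(X = k \mid n, \alpha, \beta)
= \binom{n}{k} \int_0^1 p^{k}(1-p)^{\,n-k}\,
  \frac{p^{\alpha-1}(1-p)^{\beta-1}}{B(\alpha,\beta)}\, dp
= \binom{n}{k}\,\frac{B(k+\alpha,\; n-k+\beta)}{B(\alpha,\beta)},
```

a ratio of Beta functions for which exact posterior computations become awkward once priors are placed on \alpha and \beta, which is where expansion techniques come in.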

20.
Inventory levels are critical to the operations, management, and capacity decisions of inventory systems, but can be difficult to model in heterogeneous, non-stationary throughput systems. The inpatient hospital is a complicated throughput system and, like most inventory systems, hospitals dynamically make managerial decisions based on short-term subjective demand predictions. Specifically, short-term hospital staffing, resource capacity, and finance decisions are made according to hospital inpatient inventory predictions. Inpatient inventory systems have non-stationary patient arrival and service processes. Previously developed models give poor inventory predictions owing to model subjectivity, high model complexity, reliance on expected-value predictions alone, and assumed stationary arrival and service processes. Also, no existing models provide statistical tests of model significance and quality of fit. This paper presents a Markov chain probability model that uses maximum likelihood regression to predict the expectations and discrete distributions of transient inpatient inventories. The approach has a foundation in throughput theory, has low model complexity, and provides statistical significance and quality-of-fit tests unique to this Markov chain. The Markov chain is shown to have superior predictive power over seasonal ARIMA models.
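The core prediction step of any such Markov chain model is transient distribution propagation; a toy Python sketch follows, in which the three-state census space and the transition matrix are invented (the paper fits its transition probabilities by maximum likelihood regression):

```python
import numpy as np

# hypothetical one-step transition matrix over inpatient-census states {0, 1, 2}
P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.4, 0.5]])

pi = np.array([1.0, 0.0, 0.0])   # start with an empty unit
for t in range(1, 5):
    pi = pi @ P                  # transient distribution; a non-stationary model would use P_t
    print(t, pi, "expected census:", pi @ np.arange(3))
```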
