Similar Documents
20 similar documents found (search time: 15 ms)
1.
Inference algorithms in directed evidential networks (DEVNs) derive their efficiency from exploiting the independencies between variables represented in the model. This can be done using the disjunctive rule of combination (DRC) and the generalized Bayesian theorem (GBT), both proposed by Smets [Ph. Smets, Belief functions: the disjunctive rule of combination and the generalized Bayesian theorem, International Journal of Approximate Reasoning 9 (1993) 1–35]. These rules make it possible to reason with conditional belief functions in directed evidential networks, avoiding computation of the joint belief function on the product space. In this paper, new algorithms based on these two rules are proposed for propagating belief functions in singly and multiply connected directed evidential networks.
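As a concrete illustration (ours, not taken from the paper), the DRC combines two mass functions on a finite frame by distributing the product of their masses over the *unions* of focal elements:

```python
from itertools import product

def disjunctive_combination(m1, m2):
    """Disjunctive rule of combination (DRC) for two mass functions.

    m1, m2: dicts mapping frozenset focal elements to masses summing to 1.
    Returns m with m(C) = sum over A union B = C of m1(A) * m2(B).
    """
    out = {}
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        c = a | b  # union, unlike the intersection in Dempster's conjunctive rule
        out[c] = out.get(c, 0.0) + wa * wb
    return out

# Toy frame {x, y}: one source partly committed to x, one undecided between x and y.
m1 = {frozenset({"x"}): 0.8, frozenset({"x", "y"}): 0.2}
m2 = {frozenset({"x"}): 0.5, frozenset({"y"}): 0.5}
m = disjunctive_combination(m1, m2)
```

Because it distributes mass over unions, the DRC never creates conflict mass on the empty set, which is what makes it appropriate when only one of the sources is assumed reliable.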

2.
We study a new approach to statistical prediction in the Dempster–Shafer framework. Given a parametric model, the random variable to be predicted is expressed as a function of the parameter and a pivotal random variable. A consonant belief function on the parameter space is constructed from the likelihood function and combined with the pivotal distribution to yield a predictive belief function quantifying the uncertainty about future data. The method reduces to Bayesian prediction when a probabilistic prior is available. Its asymptotic consistency is established in the i.i.d. case, under some assumptions. The predictive belief function can be approximated to any desired accuracy using Monte Carlo simulation and nonlinear optimization. As an illustration, the method is applied to multiple linear regression.
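In this line of work, the consonant belief function built from the likelihood has a contour (plausibility) function equal to the relative likelihood. A minimal Bernoulli sketch (the counts below are invented for illustration):

```python
def relative_likelihood_contour(k, n):
    """Plausibility contour pl(theta) = L(theta) / sup L(theta) for a
    Bernoulli(theta) sample with k successes in n trials; this is the
    contour function of the consonant belief function built from the
    likelihood."""
    def L(t):
        return t ** k * (1 - t) ** (n - k)
    Lmax = L(k / n)  # likelihood is maximised at the MLE k/n
    return lambda t: L(t) / Lmax

pl = relative_likelihood_contour(k=7, n=10)
```

The contour peaks at 1 at the maximum-likelihood estimate and decreases away from it; the belief function's focal sets are the nested level sets of this contour.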

3.
Data-driven identification is key to the cheap and rapid development of models of hybrid systems of industrial interest, which in turn enable model-based techniques such as control and fault diagnosis. In the present work, a novel identification method is proposed for a class of hybrid systems that are linear and separable in the discrete variables (that is, discrete states and discrete inputs). The method exploits the fact that the separable structure of the hybrid system constrains the evolution of the system dynamics. In particular, the proposed method identifies models for only a certain number of modes, far fewer than the total number of possible modes, and then generates the models for the remaining modes, without any further input–output data, by exploiting the separable structure. We experimentally validate the method by identifying a model of a three-tank benchmark hybrid system, followed by model predictive control using the identified model.
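The paper's algorithm itself is not reproduced in the abstract; as a generic sketch of the per-mode identification step, each visited mode can be fitted by least squares (the mode labelling and regressor layout here are our assumptions, not the paper's):

```python
import numpy as np

def identify_mode_models(segments):
    """Fit a discrete-time linear model x[k+1] = A x[k] + B u[k] per mode
    by least squares. segments: dict mode -> list of (X, U, Xnext) arrays,
    one row per sample. Returns dict mode -> (A, B)."""
    models = {}
    for mode, data in segments.items():
        X = np.vstack([x for x, _, _ in data])
        U = np.vstack([u for _, u, _ in data])
        Xn = np.vstack([xn for _, _, xn in data])
        Phi = np.hstack([X, U])                    # regressors [x u]
        Theta, *_ = np.linalg.lstsq(Phi, Xn, rcond=None)
        n = X.shape[1]
        models[mode] = (Theta[:n].T, Theta[n:].T)  # split into A and B
    return models

# Recover a known scalar model (A = 0.9, B = 0.1) from one mode's data.
X = np.array([[1.0], [2.0], [3.0]])
U = np.array([[1.0], [0.0], [-1.0]])
models = identify_mode_models({0: [(X, U, 0.9 * X + 0.1 * U)]})
```

The separability exploited by the paper would then let the remaining modes' (A, B) pairs be composed from the identified ones rather than fitted from fresh data.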

4.
Rough set theory, a mathematical tool for dealing with inexact or uncertain knowledge in information systems, originally described the indiscernibility of elements by equivalence relations. Covering rough sets are a natural extension of classical rough sets, obtained by relaxing the partitions arising from equivalence relations to coverings. Recently, topological concepts such as the neighborhood have been applied to covering rough sets. In this paper, we further investigate neighborhood-based covering rough sets through their approximation operations. We show that the upper approximation based on neighborhoods can be defined equivalently without using neighborhoods. To analyze the coverings themselves, we introduce unary and composition operations on coverings. A notion of homomorphism is provided to relate two covering approximation spaces. We also examine the properties of approximations preserved by the operations and homomorphisms, respectively.
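A minimal sketch of the neighborhood-based approximations discussed here, using the standard definitions (the neighborhood of x is the intersection of all covering blocks containing x):

```python
def neighborhood(x, cover):
    """Neighborhood of x: intersection of all covering blocks containing x."""
    return set.intersection(*(set(K) for K in cover if x in K))

def lower_approx(X, universe, cover):
    """Elements whose whole neighborhood lies inside X."""
    return {x for x in universe if neighborhood(x, cover) <= set(X)}

def upper_approx(X, universe, cover):
    """Elements whose neighborhood meets X."""
    return {x for x in universe if neighborhood(x, cover) & set(X)}

# A covering of U = {1, 2, 3, 4} that is not a partition.
U = {1, 2, 3, 4}
C = [{1, 2}, {2, 3, 4}, {4}]
X = {1, 2}
```

For this covering, the neighborhood of 3 is {2, 3, 4}, which meets X = {1, 2} without being contained in it, so 3 belongs to the upper but not the lower approximation.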

5.
6.
Recently, Gauthier introduced into the mathematics education research literature a method for constructing solutions to the equations of motion of oscillating systems. Gauthier's approach involved certain manipulations of the differential equations and drew on the theory of complex variables.

Motivated by Gauthier's work, we construct an alternative pedagogical approach to the learning and teaching of solution methods for these equations. The innovation lies in drawing on factorization techniques for differential equations and harmonizing them with Gauthier's complex-variable approach. Blended together into a new approach, its significance lies in its accessibility, justifiability and transferability to other problems.

We pedagogically ground our approach in Piaget's theory of educational development, with the results informing the learning and teaching of solution methods for differential equations for lecturers, teachers and learners in universities, colleges, polytechnics and schools around the world.
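As a one-line illustration of the factorization idea (our example, not necessarily the one used by Gauthier), the harmonic oscillator operator factors over the complex numbers:

```latex
\ddot{x} + \omega^{2}x
  = \Bigl(\frac{d}{dt} - i\omega\Bigr)\Bigl(\frac{d}{dt} + i\omega\Bigr)x = 0
```

Setting $u = \dot{x} + i\omega x$ reduces the second-order equation to the first-order equation $\dot{u} - i\omega u = 0$, with solution $u = u_{0}e^{i\omega t}$, from which $x(t)$ is recovered by one further integration.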

7.
8.
The notion of ageing plays an important role in reliability analysis and in identifying life distributions. Most ageing concepts in the literature are described by measures defined in terms of the distribution function. Recently, quantile functions have also been recognized as lifetime models, and reliability functions based on distribution functions have been redefined in terms of quantile functions. In the present paper, we redefine some important and popular ageing concepts using quantile functions. The use of the new definitions is illustrated by discussing the ageing properties of some quantile function models.
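One standard quantile-based reliability function in this literature (the paper's own definitions are not reproduced in the abstract) is the hazard quantile function H(u) = 1/[(1 - u) q(u)], where q(u) = Q'(u) is the quantile density. A numerical sketch for the exponential distribution, whose hazard is constant:

```python
def hazard_quantile(q, u):
    """Hazard quantile function H(u) = 1 / ((1 - u) * q(u)),
    where q(u) = Q'(u) is the quantile density function."""
    return 1.0 / ((1.0 - u) * q(u))

# Exponential(lam): Q(u) = -ln(1 - u)/lam gives q(u) = 1/(lam*(1 - u)),
# so the hazard quantile function is constant and equal to lam.
lam = 2.0
q_exp = lambda u: 1.0 / (lam * (1.0 - u))
values = [hazard_quantile(q_exp, u) for u in (0.1, 0.5, 0.9)]
```

The constant hazard recovered here is the quantile-form statement of the no-ageing property of the exponential distribution; increasing H(u) would correspond to positive ageing.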

9.
On account of a basic physical feature of elastoplastic deformations, namely the occurrence of an internal self-equilibrated stress field upon removal of external loading, destressing an elastoplastic body with a non-uniform stress field inevitably leads to loss of the line-element concept in the resulting incompatible unstressed configuration. This means the loss of the kinematic prerequisite for defining kinematic quantities in the usual material continua. An additional variable labelled "elastic" or "plastic" via destressing therefore cannot be endowed with the kinematic and physical content characterising a deformation quantity and may, in fact, be a merely formal variable in the mathematical sense. It follows that elastoplastic deformation is an inherently inseparable physical entity. (© 2005 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

10.
We are interested in the problem of multi-source information fusion when the information provided carries some uncertainty. Sensor-provided information generally has a probabilistic type of uncertainty, whereas linguistic information typically introduces a possibilistic type. More generally, we face a problem in which we must fuse information with different types of uncertainty. To provide a unified framework for representing these different types of uncertain information, we use a set measure approach. In the multi-source fusion problem, in addition to the collection of pieces of information to be fused, we need expert-provided instructions on how to fuse them; generally these instructions involve a combination of linguistically and mathematically expressed directions. In the course of this work we begin to consider the fundamental task of translating these instructions into formal operations that can be applied to our information, which requires us to investigate the important problem of the aggregation of set measures.
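The abstract does not commit to a specific aggregation operator; one common way to aggregate scores with respect to a set measure (capacity) is the Choquet integral, sketched here as an illustration:

```python
def choquet_integral(values, measure):
    """Choquet integral of scores w.r.t. a monotone set measure (capacity).

    values: dict element -> score; measure: dict frozenset -> weight, with
    the empty set mapped to 0 and the full set to 1 by convention.
    Sorts elements by descending score and weights each score increment
    by the measure of the corresponding 'at least this good' level set.
    """
    items = sorted(values.items(), key=lambda kv: kv[1], reverse=True)
    scores = [v for _, v in items] + [0.0]
    level, total = set(), 0.0
    for i, (x, v) in enumerate(items):
        level.add(x)
        total += (v - scores[i + 1]) * measure[frozenset(level)]
    return total

# Sanity check: with an additive measure, the Choquet integral reduces
# to an ordinary weighted mean.
mu = {frozenset({"a"}): 0.5, frozenset({"b"}): 0.5, frozenset({"a", "b"}): 1.0}
score = choquet_integral({"a": 0.9, "b": 0.5}, mu)
```

A non-additive measure lets the aggregation model interaction between sources, which an ordinary weighted mean cannot express.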

11.
12.
The author presents an intuitive approach to elementary concepts in probability using familiar ideas already mastered by sixth-grade students.

13.
A new approach to estimating the reliability of classification algorithms is proposed, based on an unconventional information model of such algorithms. Examples of the new estimates are given and compared with conventional statistical estimates.

14.
A method for constructing priors is proposed that allows the off-diagonal elements of the concentration matrix of Gaussian data to be zero. The priors have the property that the marginal prior distribution of the number of nonzero off-diagonal elements of the concentration matrix (referred to below as the model size) can be specified flexibly. The priors have normalizing constants for each model size, rather than for each model, giving a tractable number of normalizing constants to be estimated. The article shows how to estimate the normalizing constants using Markov chain Monte Carlo simulation, improving on the method of Wong et al. (2003) [24] in both accuracy and generality. The method is applied to two examples. The first is a mixture of constrained Wisharts. The second, from Wong et al. (2003) [24], decomposes the concentration matrix into a function of partial correlations and conditional variances, using a mixture distribution on the matrix of partial correlations. The approach detects structural zeros in the concentration matrix and estimates the covariance matrix parsimoniously when the concentration matrix is sparse.
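For context, a standard identity (not specific to this paper) connects the concentration matrix to partial correlations: rho_ij = -omega_ij / sqrt(omega_ii * omega_jj), so a structural zero off the diagonal means conditional independence of the two Gaussian variables given the rest. A minimal sketch:

```python
import math

def partial_correlations(omega):
    """Partial correlations from a concentration (precision) matrix:
    rho_ij = -omega_ij / sqrt(omega_ii * omega_jj). A structural zero in
    omega therefore encodes conditional independence of variables i, j."""
    p = len(omega)
    return [[1.0 if i == j else
             -omega[i][j] / math.sqrt(omega[i][i] * omega[j][j])
             for j in range(p)] for i in range(p)]

# Tridiagonal concentration matrix: variables 0 and 2 are conditionally
# independent given variable 1 (structural zero at position (0, 2)).
omega = [[2.0, -1.0, 0.0],
         [-1.0, 2.0, -1.0],
         [0.0, -1.0, 2.0]]
rho = partial_correlations(omega)
```

This is the decomposition direction used in the second example: a prior on the matrix of partial correlations, together with conditional variances, induces a prior on the concentration matrix.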

15.
An epidemiological approach is adopted to develop a model of viral meme propagation. Its successful application to meme spread, as reflected in Internet search data, shows that memes may be treated as infectious entities when modelling their propagation over time and across societies.
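The abstract does not name the compartmental model used; a minimal SIR sketch (the parameter values are invented for illustration) shows the kind of dynamics an epidemiological meme model produces:

```python
def sir_step(s, i, r, beta, gamma, dt):
    """One forward-Euler step of the SIR equations
    ds/dt = -beta*s*i,  di/dt = beta*s*i - gamma*i,  dr/dt = gamma*i."""
    new_inf = beta * s * i * dt
    recov = gamma * i * dt
    return s - new_inf, i + new_inf - recov, r + recov

def simulate(beta=0.5, gamma=0.1, days=200, dt=0.1):
    s, i, r = 0.999, 0.001, 0.0   # fractions of the population
    traj = [(s, i, r)]
    for _ in range(int(days / dt)):
        s, i, r = sir_step(s, i, r, beta, gamma, dt)
        traj.append((s, i, r))
    return traj

traj = simulate()
```

The infected fraction rises to a peak and then decays as the susceptible pool is exhausted, the same rise-and-fall shape observed in search-interest time series for viral memes.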

16.
By introducing a time factor into the local deformation function, the theory of local deformations can be used to unite the theories of plasticity and creep. The local deformation function is taken as a rheological dependence in the form of an integral equation, and it is assumed to vary in a way that depends on the direction relative to the principal isotropic axes. Thus, equations are obtained for an orthotropic material with nonlinear creep. The relations obtained also allow for the variation of Poisson's ratio with time. It is shown that special cases of these expressions were previously used to describe the creep of glass-reinforced plastics at low stresses.

Mekhanika polimerov, Vol. 1, No. 1, pp. 44–49, 1965
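The abstract does not reproduce the integral equation; a typical linear hereditary (Boltzmann-type) creep law of the kind involved, stated here only as an assumption about its general form, reads

```latex
\varepsilon(t) = \frac{\sigma(t)}{E} + \int_{0}^{t} K(t-\tau)\,\sigma(\tau)\,d\tau
```

where $E$ is the instantaneous modulus and $K$ a creep kernel; the time factor enters through $K$, and direction dependence of the kernel yields the orthotropic relations.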

17.
This paper addresses the problem of designing LAN-WAN (Local Area Network, Wide Area Network) computer networks with transparent bridges. Bridges are high-performance devices used to interconnect LANs at the MAC (Medium Access Control) level in the protocol hierarchy. LANs in remote areas are connected by leased telecommunications lines such as T1 and DDS. A requirement of transparent bridges is that the network must be configured in a spanning-tree topology. We have developed mathematical models of LAN-WAN networks and formulated an optimization problem, which is a nonconvex, nonlinear, mixed-integer program. A simulated annealing algorithm is proposed: it generates sequences of neighboring spanning trees and evaluates design constraints based on maximum flow, bridge capacity, and end-to-end delay. As the annealing temperature parameter is lowered, the algorithm moves towards the globally optimal solution. Experimental results show that LAN-WAN designs obtained by simulated annealing are better than 99.99% of all feasible designs.
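The annealing loop itself is generic; a minimal skeleton (with a toy stand-in objective, since the paper's flow, capacity and delay constraints are not reproduced in the abstract) looks like this:

```python
import math
import random

def simulated_annealing(initial, neighbor, cost,
                        t0=1.0, cooling=0.995, steps=5000, seed=0):
    """Generic simulated annealing: always accept improving neighbors,
    accept worsening ones with probability exp(-delta / T), and lower
    T geometrically. In the paper's setting the states would be spanning
    trees and `neighbor` a tree-perturbation move."""
    rng = random.Random(seed)
    x, cx = initial, cost(initial)
    best, cbest = x, cx
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        cy = cost(y)
        if cy <= cx or rng.random() < math.exp(-(cy - cx) / t):
            x, cx = y, cy
            if cx < cbest:
                best, cbest = x, cx
        t *= cooling
    return best, cbest

# Toy objective: minimise a quadratic over the integers, starting at 10.
best, cbest = simulated_annealing(
    initial=10,
    neighbor=lambda x, rng: x + rng.choice([-1, 1]),
    cost=lambda x: (x - 3) ** 2,
)
```

At high temperature the search explores freely; as T falls the acceptance rule becomes greedy, which is what drives the walk toward the optimum.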

18.
19.
We present a new approach for removing nonspecific noise from Drosophila segmentation gene expression profiles. The filtering algorithm used here is an enhanced version of singular spectrum analysis, which decomposes a gene profile into the sum of a signal and noise. Because the main issue in extracting the signal with singular spectrum analysis lies in identifying the number of eigenvalues needed for signal reconstruction, this paper explores the applicability of the newly proposed method for eigenvalue identification on four different gene expression profiles. Our findings indicate that, for optimal separation of signal and noise, a different number of eigenvalues needs to be chosen for each gene. Copyright © 2016 John Wiley & Sons, Ltd.
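A sketch of the textbook SSA baseline that the paper enhances (this is the standard procedure, not the paper's variant): embed the series in a Hankel trajectory matrix, truncate its SVD, and reconstruct by anti-diagonal averaging.

```python
import numpy as np

def ssa_denoise(series, window, rank):
    """Basic singular spectrum analysis: embed the series in a Hankel
    trajectory matrix, keep the top `rank` SVD components, and
    reconstruct by anti-diagonal (Hankel) averaging."""
    x = np.asarray(series, dtype=float)
    n, k = len(x), len(x) - window + 1
    X = np.column_stack([x[i:i + window] for i in range(k)])  # window x k
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    # Hankelize: average the reconstructed entries along each anti-diagonal.
    out, counts = np.zeros(n), np.zeros(n)
    for j in range(k):
        out[j:j + window] += Xr[:, j]
        counts[j:j + window] += 1
    return out / counts

# A noisy sinusoid: a pure sine occupies two singular components,
# so rank 2 recovers it.
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 200)
clean = np.sin(t)
noisy = clean + 0.3 * rng.standard_normal(200)
denoised = ssa_denoise(noisy, window=40, rank=2)
```

The choice of `rank` here plays exactly the role of the eigenvalue count discussed in the abstract: too few components clip the signal, too many readmit noise.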

20.
This paper presents a consensus model for group decision making with interval multiplicative and fuzzy preference relations, based on two criteria: (1) a consensus measure, which indicates the agreement between the experts' preference relations, and (2) a proximity measure, which indicates how far the individual opinions are from the group opinion. These measures are calculated using the relative projections of the individual preference relations on the collective one, obtained by extending the relative projection of vectors. First, the weights of the experts are determined by the relative projections of the individual preference relations on the initial collective one. Using these weights, all individual preference relations are then aggregated into a collective one, and the consensus and proximity measures are calculated from the relative projections of the experts' preference relations. The consensus measure is used to guide the consensus process until the collective solution is achieved, while the proximity measure guides the discussion phase of the consensus-reaching process. In this way, an iterative algorithm is designed to guide the experts through the consensus-reaching process. Finally, expected-value preference relations are defined to transform the interval collective preference relation into a crisp one, and the weights of the alternatives are obtained from the expected-value preference relations. Two numerical examples illustrate the models and approaches.
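The abstract does not give the paper's exact definition of relative projection; one common form, sketched here as an assumption, normalises the projection of a on b so that it equals 1 when the two vectors coincide, with preference relations flattened into vectors:

```python
def relative_projection(a, b):
    """Projection of vector a onto vector b, normalised so that
    identical vectors give 1: (a . b) / |b|^2. This is an assumed
    definition, not necessarily the paper's."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / sum(y * y for y in b)

def consensus_measure(individuals, collective):
    """Mean relative projection of each expert's flattened preference
    relation onto the flattened collective relation."""
    flat_c = [v for row in collective for v in row]
    projs = [relative_projection([v for row in m for v in row], flat_c)
             for m in individuals]
    return sum(projs) / len(projs)

# Two experts, one agreeing exactly with the collective relation.
collective = [[0.5, 0.7], [0.3, 0.5]]
experts = [collective, [[0.5, 0.6], [0.4, 0.5]]]
c = consensus_measure(experts, collective)
```

A consensus measure near 1 would terminate the iteration; a lower value would trigger the discussion phase guided by the per-expert proximity values.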
