Similar Documents
20 similar documents found (search time: 15 ms)
1.
Within this paper, a modeling approach for flexure hinges based on the Euler-Bernoulli beam theory for beams of variable cross section is investigated in a static analysis. The proposed approach is implemented in a finite beam element routine, for which two different discretizations are discussed. The results are compared to a full scale three dimensional model. It is shown that a circular flexure hinge cannot be modeled accurately with one element. An improved model with three elements across the flexure hinge length is presented which shows excellent accordance with the reference model. A geometry optimization is realized based on the improved, low-DOF model. (© 2013 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim)
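The one-element versus three-element comparison described in this abstract can be sketched as follows. This is a minimal pure-Python illustration, not the paper's code: all geometry and material numbers (steel modulus, a 10 mm hinge, the `hinge_I` profile with hypothetical parameters `t` and `R`) are assumptions, and each element simply uses a constant second moment of area sampled at its midpoint.

```python
import math

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def element_stiffness(E, I, L):
    """4x4 Euler-Bernoulli beam element stiffness, DOFs [w1, th1, w2, th2]."""
    c = E * I / L**3
    return [[ 12 * c,       6 * c * L,    -12 * c,       6 * c * L],
            [  6 * c * L,   4 * c * L**2,  -6 * c * L,   2 * c * L**2],
            [-12 * c,      -6 * c * L,     12 * c,      -6 * c * L],
            [  6 * c * L,   2 * c * L**2,  -6 * c * L,   4 * c * L**2]]

def tip_deflection(E, inertias, L_total, P):
    """Cantilever of equal-length elements: clamp node 0, load P at the tip.
    `inertias` holds one (piecewise-constant) second moment per element."""
    n_el = len(inertias)
    Le = L_total / n_el
    ndof = 2 * (n_el + 1)
    K = [[0.0] * ndof for _ in range(ndof)]
    for e, I in enumerate(inertias):
        ke = element_stiffness(E, I, Le)
        for a in range(4):
            for bb in range(4):
                K[2 * e + a][2 * e + bb] += ke[a][bb]
    F = [0.0] * ndof
    F[ndof - 2] = P                      # transverse load at the free end
    keep = list(range(2, ndof))          # clamp w and theta at node 0
    Kr = [[K[i][j] for j in keep] for i in keep]
    u = solve(Kr, [F[i] for i in keep])
    return u[-2]                         # tip transverse deflection

# Illustrative numbers (assumed, not from the paper): steel hinge, 10 mm long.
E, b, L, P = 2.1e11, 0.01, 0.01, 1.0
I_const = b * 0.001**3 / 12              # uniform 1 mm thick reference beam
exact = P * L**3 / (3 * E * I_const)     # analytic cantilever tip deflection

def hinge_I(x, t=0.0005, R=0.004):
    """Second moment of an assumed circular flexure-hinge profile at x."""
    h = t + 2 * (R - math.sqrt(max(R**2 - (x - L / 2)**2, 0.0)))
    return b * h**3 / 12

one = tip_deflection(E, [hinge_I(L / 2)], L, P)       # one-element model
three = tip_deflection(E,
    [hinge_I(L / 6), hinge_I(L / 2), hinge_I(5 * L / 6)], L, P)
```

For a uniform beam the Hermite-cubic element reproduces the analytic tip deflection exactly, which serves as a sanity check; for the circular hinge, the single element sees only the thinnest section and therefore overestimates compliance relative to the three-element model.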

2.
A multiresolutional approach for measurement decomposition and system modeling is presented in this paper. The decomposition is performed in both the spatial and time domains and provides an excellent platform for developing computationally efficient algorithms. Using multiresolutional decomposition and modeling, a multiresolutional joint probabilistic data association (MR-JPDA) algorithm is developed for multiple target tracking. Monte Carlo simulations demonstrate that the computational cost of the MR-JPDA algorithm is much lower than that of the traditional joint probabilistic data association (JPDA) algorithm, with comparable performance.

3.
Both analytic and simulation models were used to analyze the capabilities and requirements of an automated circuit card manufacturing system. Analytic models were used to determine the sensitivity of the measures of effectiveness (MOEs) to various design parameters. This analysis gave approximate results and bounded the range of input parameters for the simulation model. A detailed simulation model was required for use during both the design and production phases of the project. This simulation model incorporated only those variables to which the MOEs are most sensitive, and provided additional features to observe system behavior. The benefits and appropriate uses for each class of models are discussed.

4.
5.
Finite mixture regression models are useful for modeling the relationship between response and predictors arising from different subpopulations. In this article, we study high-dimensional predictors and high-dimensional responses and propose two procedures to cluster observations according to the link between predictors and the response. To reduce the dimension, we propose to use the Lasso estimator, which takes into account the sparsity, and a maximum likelihood estimator penalized by the rank, which takes into account the matrix structure. To choose the number of components and the sparsity level, we construct a collection of models varying those two parameters, and we select a model from this collection with a non-asymptotic criterion. We extend these procedures to functional data, where predictors and responses are functions. For this purpose, we use a wavelet-based approach. For each situation, we provide algorithms and apply and evaluate our methods on both simulated and real datasets, to understand how they work in practice.

6.
In this paper, a numerical study of a diffusion problem in the presence of wells on which integral boundary conditions are used is performed. It is shown that a method proposed earlier is fully efficient and offers certain advantages as compared with direct modeling of wells based on the finite element method. The results of calculations for two wells are presented.

7.
We propose a new parametric model for continuous data, a “g-model”, on the basis of gradient maps of convex functions. It is known that any multivariate probability density on the Euclidean space is uniquely transformed to any other density by the gradient map of a convex function. Therefore, statistical modeling for quantitative data is equivalent to the design of the gradient maps. The explicit expression for the gradient map enables exact sampling from the corresponding probability distribution. We define the g-model as a convex subset of the space of all gradient maps. It is shown that the g-model has many desirable properties, such as the concavity of the log-likelihood function. An application to detecting three-dimensional interactions in data is investigated.
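The exact-sampling property mentioned in this abstract can be illustrated in one dimension. The sketch below is an assumption-laden toy, not the authors' g-model: it uses a hypothetical convex potential `psi(z) = z^2/2 + A*log(cosh(z))`, whose derivative is a strictly increasing gradient map, so pushing standard normal draws through it samples exactly from the transformed density.

```python
import math
import random

A = 0.8  # hypothetical model parameter; any A > 0 keeps psi strictly convex

def gradient_map(z):
    """T = psi' for the convex potential psi(z) = z^2/2 + A*log(cosh(z)).
    T is strictly increasing, so it pushes N(0,1) forward to a valid density."""
    return z + A * math.tanh(z)

def sample(n, seed=0):
    """Exact sampling from the pushforward model: draw z ~ N(0,1), return x = T(z)."""
    rng = random.Random(seed)
    return [gradient_map(rng.gauss(0.0, 1.0)) for _ in range(n)]
```

Because `T` has a closed form, no accept-reject step or MCMC is needed: every draw is an exact sample, which is the appeal of parameterizing the model through the gradient map itself.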

8.
We study a mathematical model for the quantitative evaluation of investment projects for the development of oil fields at the conceptual design stage. The model treats the field as a cluster of identical elements of an areal well pattern. It operates with the net present value as a continuous function of the process parameters and enables analysis of a broad spectrum of possible options for implementing the investment project. Several important relations between the technical and economic parameters are obtained in concise, practically usable form by means of operational calculus and the Laplace transform.

9.
The analysis of data concerning the deterioration of pavement over time yielded a problem of aggregating the data in a manner that preserves independence of the aggregated data points and maximizes the number of points. We show that this problem can be modeled as a maximum cardinality vertex packing problem on a proper interval graph and thus can be solved very efficiently by a greedy algorithm.
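On an interval graph, a maximum cardinality vertex packing (independent set) is a largest set of pairwise non-overlapping intervals, and the classic greedy scan over right endpoints finds one. The sketch below shows that idea only; the interval data and the strict-overlap convention are illustrative assumptions, not the paper's formulation.

```python
def max_packing(intervals):
    """Greedy maximum independent set on an interval graph.
    intervals: list of (left, right) pairs; touching endpoints count as overlap."""
    chosen, last_right = [], float("-inf")
    for left, right in sorted(intervals, key=lambda iv: iv[1]):
        if left > last_right:          # no overlap with the last chosen interval
            chosen.append((left, right))
            last_right = right         # greedily commit to the earliest finisher
    return chosen
```

For example, `max_packing([(1, 3), (2, 5), (4, 7), (6, 9), (8, 10)])` keeps `(1, 3)`, `(4, 7)`, and `(8, 10)`. Sorting by right endpoint is what makes the greedy choice safe: the interval finishing earliest can always be part of some optimal packing.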

10.
The first step in parameter estimation is to reduce the dimensionality of the problem by deriving estimates from independent experimentation and from the literature. In addition, insensitive parameters are either removed or fixed. In the remaining lower-dimensional problem, parameter-space delimitation is possible by analytical means. Three conjunctive methods are derived: period-average analysis, extremum analysis, and quasisteady-state analysis. The basic idea is to find conditions for the parameters that must be fulfilled in order to comply with average and extreme values in the observations. The approach is applied to the modeling of the phytoplankton dynamics of Lake Balaton. The analytical techniques prove to supply valuable insight into parameter interrelationships and model adequacy, and can serve as satisfactory substitutes for formal parameter-estimation techniques in the early stages of model development.

11.
We describe a software package for computing and manipulating the subdivision of a sphere by a collection of (not necessarily great) circles and for computing the boundary surface of the union of spheres. We present problems that arise in the implementation of the software and the solutions that we have found for them. At the core of the paper is a novel perturbation scheme to overcome degeneracies and precision problems in computing spherical arrangements while using floating point arithmetic. The scheme is relatively simple, it balances between the efficiency of computation and the magnitude of the perturbation, and it performs well in practice. In one O(n) time pass through the data, it perturbs the inputs as necessary to ensure no potential degeneracies and then passes the perturbed inputs on to the geometric algorithm. We report and discuss experimental results. Our package is a major component in a larger package aimed to support geometric queries on molecular models; it is currently employed by chemists working in “rational drug design”. The spherical subdivisions are used to construct a geometric model of a molecule where each sphere represents an atom.

12.
This paper will describe the application of an interactive queueing network analyzer and an interactive graphics system to the analysis of a multiple processor computer system. The application of these tools greatly increased the productivity of the modelers and resulted in insights which would have otherwise been difficult, if not impossible, to obtain. With this experience as background, we discuss how the increasing availability of computing resources, especially high resolution interactive computer graphics and sophisticated modeling packages, is likely to have a profound influence on the applied stochastic modeler.

13.
This paper analyses the performance of a counterflow concentric tube heat exchanger and estimates the uncertainties in the temperature response of the heat exchanging fluids caused by uncertainties in their inlet temperatures and in the overall heat transfer coefficients between the fluid streams. The analysis is based upon the two-point distribution technique. This paper describes the application of this technique to a heat exchanger for which the exact solution for the steady state temperature response is available. Results show that the uncertainty in inlet temperature has the stronger influence. The uncertainty in the heat transfer coefficients, although generally very high, has only a weak influence. The accuracy of the predicted temperature response can thus be significantly improved by accurate measurement of the inlet fluid temperatures.
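A two-point (Rosenblueth-style) uncertainty propagation of the kind this abstract describes can be sketched as follows. This is not the paper's model: the effectiveness-NTU counterflow relation is standard, but every numerical value (areas, capacity rates, means, sigmas) is a made-up assumption for illustration.

```python
import math

def counterflow_outlet(Th_in, Tc_in, U, A=2.0, Ch=500.0, Cc=400.0):
    """Hot-side outlet temperature from the effectiveness-NTU relations for a
    counterflow exchanger (all parameter values here are illustrative)."""
    Cmin, Cmax = min(Ch, Cc), max(Ch, Cc)
    Cr = Cmin / Cmax
    NTU = U * A / Cmin
    e = math.exp(-NTU * (1 - Cr))
    eff = (1 - e) / (1 - Cr * e)           # counterflow effectiveness, Cr != 1
    return Th_in - eff * Cmin / Ch * (Th_in - Tc_in)

def two_point(model, means, sigmas):
    """Two-point estimate: perturb one input at a time to mu +/- sigma and
    combine the half-differences into an overall standard deviation."""
    base = model(*means)
    contribs = []
    for i, s in enumerate(sigmas):
        hi, lo = list(means), list(means)
        hi[i] += s
        lo[i] -= s
        contribs.append((model(*hi) - model(*lo)) / 2)
    return base, contribs, math.sqrt(sum(c * c for c in contribs))

# Inputs: hot inlet, cold inlet, overall U; the sigmas are assumed (10% on U).
base, contribs, std = two_point(counterflow_outlet,
                                [360.0, 300.0, 800.0], [2.0, 2.0, 80.0])
```

With these assumed numbers, the inlet-temperature contribution dominates the contribution from a 10% perturbation of U, mirroring the qualitative finding reported in the abstract.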

14.
This paper briefly reviews the existing methods of measuring capacity utilization in a nonparametric framework from an economic perspective, and then suggests an alternative in light of their limitations. In the spirit of work by Coelli et al. [Coelli, T.J., Grifell-Tatje, E., Perelman, S., 2002. Capacity utilisation and profitability: A decomposition of short run profit efficiency. International Journal of Production Economics 79, 261–278], we propose two methods, radial and non-radial, to decompose input-based physical (technological) capacity utilization into various meaningful components, viz., technical inefficiency, ray economic capacity utilization and optimal capacity idleness. A case study of the Indian banking industry is taken as an example to illustrate the potential application of these two methods of decomposition. Our two broad empirical findings are that, first, the competition created after the financial sector reforms generated high efficiency growth and reduced excess capacity; second, the gap between the short-run cost and the actual cost is higher for the nationalized banks than for the private banks, indicating that the former, though older, do not reflect their learning experience in their cost-minimizing behavior.

15.
A density forecast is an estimate of the probability distribution of the possible future values of a random variable. According to the current literature, an economic time series may exhibit three types of asymmetry: asymmetry in the unconditional distribution, asymmetry in the conditional distribution, and volatility asymmetry. In this paper, we propose three density forecasting methods under the two-piece normal assumption to capture these asymmetric features. A GARCH model with two-piece normal distribution is developed to capture asymmetries in the conditional distributions. In this approach, we first estimate the parameters of a GARCH model by assuming normal innovations, and then fit a two-piece normal distribution to the empirical residuals. A block bootstrap procedure and a moving average method with two-piece normal distribution are presented for volatility asymmetry and asymmetry in the conditional distributions. Application of the developed methods to the weekly S&P500 returns illustrates that forecast quality can be significantly improved by modeling these asymmetric features.
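The two-piece normal ingredient of this approach is easy to sketch. Below is an illustrative version only: the density formula is the standard two-piece normal, but the residual-fitting step is a crude split-sample moment estimator with the mode fixed at zero, an assumption standing in for whatever estimator the paper actually uses.

```python
import math

def tpn_pdf(x, mu, s1, s2):
    """Two-piece normal density: scale s1 left of the mode mu, s2 to the right.
    The shared normalizing constant makes the density continuous at the mode."""
    s = s1 if x < mu else s2
    return (2.0 / (math.sqrt(2 * math.pi) * (s1 + s2))) * \
        math.exp(-(x - mu) ** 2 / (2 * s ** 2))

def fit_tpn(residuals, mu=0.0):
    """Crude split-sample moment fit (an assumption, not the paper's estimator):
    estimate one scale from residuals on each side of the fixed mode mu."""
    left = [r for r in residuals if r < mu]
    right = [r for r in residuals if r >= mu]
    s1 = math.sqrt(sum((r - mu) ** 2 for r in left) / max(len(left), 1))
    s2 = math.sqrt(sum((r - mu) ** 2 for r in right) / max(len(right), 1))
    return s1, s2
```

When `s2 > s1` the fitted density puts more mass in the right tail, which is exactly the kind of distributional asymmetry a symmetric normal residual model cannot represent.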

16.
It is very common in AIDS studies that the response variable (e.g., HIV viral load) may be subject to censoring due to detection limits, while covariates (e.g., CD4 cell count) may be measured with error. Failure to take censoring in the response variable and measurement errors in covariates into account may introduce substantial bias in estimation and thus lead to unreliable inference. Moreover, with non-normal and/or heteroskedastic data, traditional mean regression models are not robust to tail behavior. In this case, one may find it attractive to estimate extreme causal relationships of covariates to a dependent variable, which can be suitably studied in a quantile regression framework. In this paper, we consider joint inference of a mixed-effects quantile regression model with right-censored responses and errors in covariates. The inverse censoring probability weighted method and the orthogonal regression method are combined to reduce the biases of estimation caused by censored data and measurement errors. Under some regularity conditions, the consistency and asymptotic normality of the estimators are derived. Finally, some simulation studies are implemented and an HIV/AIDS clinical data set is analyzed to illustrate the proposed procedure.

17.
The article presents a 3-D numerical simulation study for the growth of single crystal semiconductors under strong magnetic fields. In such high fields, the magnetic body force components, which also depend on the flow velocity components, present numerical challenges, particularly in terms of convergence of the iterations. To remedy such difficulties, a novel numerical approach was introduced in an in-house 3-D finite volume-based computer code. As an application, the Travelling Heater Method (THM) was selected for the growth of CdTe crystals under a static vertical magnetic field.

18.
This paper describes the modeling of the potential of an organization to develop an insider threat given certain attributes of its culture. The model represents all employees of the organization and their likelihood of becoming an insider threat. Each employee is instantiated in an agent-zero construct, which accounts for affective, rational, and social behavioral influences. The main driver of behavior is the employee’s level of disgruntlement against the organization. The simulation is run over a period of 10 years, and the total number of employees that exceed a certain threshold of becoming an insider threat is computed. This number is compared with survey data on work force ethics as a measure of validity of the simulation results.

19.
Ridge regression is an important approach in linear regression when explanatory variables are highly correlated. Although expressions for the estimators of ridge regression parameters have been successfully obtained via matrix operations after the observed data are standardized, they cannot be applied to big data, since it is impossible to load the entire data set into the memory of a single computer and it is hard to standardize the original observed data. To overcome these difficulties, the present article proposes new methods and algorithms. The basic idea is to compute a matrix of sufficient statistics by rows. Once the matrix is derived, it is not necessary to use the original data again. Since the entire data set is only scanned once, the proposed methods and algorithms can be extremely efficient in the computation of estimates of ridge regression parameters. It is expected that the basic knowledge gained in this article will have a great impact on statistical approaches to big data.
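The row-by-row sufficient-statistics idea can be sketched directly: accumulate X'X and X'y in a single pass, then solve the small p x p ridge system. The sketch below is a minimal illustration, not the article's algorithm; it assumes data already centered/scaled and ignores the streaming-standardization issue the article addresses.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for the small p x p system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def ridge_streaming(rows, lam):
    """One pass over (x, y) rows: accumulate the sufficient statistics X'X and
    X'y, then solve (X'X + lam*I) beta = X'y. Rows can come from disk or a
    network stream; the raw data never need to fit in memory at once."""
    XtX, Xty = None, None
    for x, y in rows:
        if XtX is None:                       # size everything from the first row
            p = len(x)
            XtX = [[0.0] * p for _ in range(p)]
            Xty = [0.0] * p
        for i in range(p):
            Xty[i] += x[i] * y
            for j in range(p):
                XtX[i][j] += x[i] * x[j]
    for i in range(len(Xty)):
        XtX[i][i] += lam                      # ridge penalty on the diagonal
    return solve(XtX, Xty)
```

Only the p x p statistics survive the pass, so memory is O(p^2) regardless of the number of rows, which is the point the abstract makes about scanning the data once.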

20.
A multi-objective linear programming model is developed for the Indian sugar industry to plan for additional output by production technique, geographical region and forecasted year. Various policy scenarios generated by assigning different values to the policy variables in the model are studied. Thus a useful planning tool which demonstrates the exact impact of the policy parameters on various objectives is provided to the central decision maker. A satisficing multi-objective decision making method is developed based on an existing method of solution and used in policy analysis. The solution method is ideally suited to any general planning problem.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号