Full-text access type
Paid full text | 145 articles |
Free | 2 articles |
Free (domestic) | 3 articles |
Subject classification
Chemistry | 41 articles |
Mechanics | 10 articles |
Mathematics | 71 articles |
Physics | 28 articles |
Publication year
2024 | 1 article |
2023 | 3 articles |
2022 | 3 articles |
2021 | 1 article |
2020 | 1 article |
2019 | 7 articles |
2018 | 5 articles |
2017 | 4 articles |
2016 | 1 article |
2015 | 3 articles |
2014 | 6 articles |
2013 | 9 articles |
2012 | 5 articles |
2011 | 11 articles |
2010 | 8 articles |
2009 | 9 articles |
2008 | 11 articles |
2007 | 11 articles |
2006 | 8 articles |
2005 | 7 articles |
2004 | 7 articles |
2003 | 4 articles |
2001 | 6 articles |
2000 | 1 article |
1998 | 2 articles |
1997 | 3 articles |
1995 | 3 articles |
1994 | 1 article |
1993 | 2 articles |
1992 | 1 article |
1991 | 2 articles |
1989 | 1 article |
1983 | 2 articles |
1980 | 1 article |
Sort by: 150 results found, search took 15 ms
1.
2.
3.
The present paper provides a statistical model for the size effect on the tensile strength of grained materials, based on an Extreme Value Theory approach. Since the weakest link in grained materials is usually the interface between the matrix and the aggregates, it is assumed that the flaw distribution can be represented by the aggregate distribution, expressed as a probability density function (pdf) of the grain diameters. Under the hypothesis that the strength of the material depends on the largest flaw, the tensile strength is computed as a function of the specimen size. In this way, two remarkable results are obtained: (i) a size effect on the average tensile strength that substantially agrees with the multifractal scaling law (MFSL) proposed by the first author, and (ii) an increase in the scatter of tensile strength values when testing small specimens. Both trends are confirmed by experimental data available in the literature.
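A minimal numerical sketch of the weakest-link idea described above: strength is set by the largest flaw in the specimen, so a specimen containing more grains samples a larger extreme flaw, lowering the mean strength, while small specimens show more scatter. The lognormal grain-size distribution, the Griffith-type law sigma = k/sqrt(d_max), and all parameter values below are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

def tensile_strength(n_grains, n_samples=2000, k=1.0):
    """Weakest-link sketch: the strength of each simulated specimen is set
    by its largest grain diameter via a Griffith-type law k / sqrt(d_max).
    Grain diameters are drawn from an (assumed) lognormal distribution."""
    d = rng.lognormal(mean=0.0, sigma=0.5, size=(n_samples, n_grains))
    d_max = d.max(axis=1)          # largest flaw in each specimen
    return k / np.sqrt(d_max)

small = tensile_strength(10)       # small specimen: few grains
large = tensile_strength(10_000)   # large specimen: many grains

# larger specimens contain larger extreme flaws -> lower mean strength;
# small specimens show a wider spread of strengths
assert small.mean() > large.mean()
assert small.std() > large.std()
```

Both asserted trends mirror results (i) and (ii) of the abstract, though the exact scaling here depends on the assumed flaw distribution.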
4.
《Journal of computational science》2014,5(2):126-134
Traditional debuggers are of limited value for modern scientific codes that manipulate large, complex data structures. Current parallel machines make this even more complicated, because a data structure may be distributed across processors, making it difficult to view, interpret, and validate its contents. Many application developers therefore resort to placing validation code directly in the source program. This paper discusses a novel debug-time assertion, called a "statistical assertion", that uses extracted statistics instead of raw data to reason about large data structures, thereby helping to locate coding defects. We present the design and implementation of an extensible statistical framework that executes the assertion in parallel by exploiting the underlying parallel system. We illustrate the debugging technique with a molecular dynamics simulation. Performance is evaluated on a 20,000-processor Cray XE6 to show that the technique is useful for real-time debugging.
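A serial sketch of what such a statistical assertion can look like: the check is expressed over summary statistics of a large array rather than over its raw elements. The function name, the chosen statistics, and the bounds below are illustrative assumptions; the paper's framework computes the statistics in parallel on the target machine.

```python
import numpy as np

def statistical_assert(array, mean_range=None, max_abs=None):
    """Debug-time 'statistical assertion' sketch: validate a large array
    through summary statistics instead of inspecting raw values."""
    stats = {"mean": float(np.mean(array)),
             "max_abs": float(np.max(np.abs(array)))}
    if mean_range is not None:
        lo, hi = mean_range
        assert lo <= stats["mean"] <= hi, \
            f"mean {stats['mean']:.4f} outside {mean_range}"
    if max_abs is not None:
        assert stats["max_abs"] <= max_abs, \
            f"|value| up to {stats['max_abs']:.4f} exceeds {max_abs}"
    return stats

# e.g. particle velocities in a molecular-dynamics step should stay
# bounded and centred near zero (illustrative data, not the paper's)
velocities = np.random.default_rng(1).normal(0.0, 1.0, size=100_000)
stats = statistical_assert(velocities, mean_range=(-0.1, 0.1), max_abs=10.0)
```

The appeal of the approach is that the bounds express physical expectations (bounded speeds, zero drift) and stay meaningful even when the array is distributed across processors.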
5.
We continue our use of "simple" energetic patterns, where simple means the use of parameters derived only from the stoichiometry of these species, in our studies of the entropy of formation (TΔfS°) of aqueous anions. Relationships between the entropy of formation and different parameters, such as the number of oxygen atoms, the natural logarithm of the molecular weight, and the total number of atoms, are explored. The charge of the species, z, continues to be explicitly considered; we now explore various choices of p and the use of z^p as a parameter.
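A sketch of the kind of fit such an exploration involves: least squares of the entropy term against stoichiometric parameters, including one choice of the exponent p in z^p. All numbers below are synthetic placeholders, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n_species = 30

# SYNTHETIC stoichiometric parameters for hypothetical anions
n_O = rng.integers(1, 7, n_species)              # number of oxygen atoms
mw = rng.uniform(40.0, 200.0, n_species)         # molecular weight
n_atoms = n_O + rng.integers(1, 5, n_species)    # total atom count
z = rng.integers(1, 4, n_species)                # |charge| of the species
p = 2                                            # one choice of exponent

# SYNTHETIC response standing in for T*Delta_f S of each anion
y = 3.0 * n_O - 5.0 * np.log(mw) + 1.5 * z**p \
    + rng.normal(0.0, 0.5, n_species)

# least-squares fit against the candidate parameters
X = np.column_stack([np.ones(n_species), n_O, np.log(mw), n_atoms, z**p])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
residual = y - X @ coef
```

Comparing residuals across several choices of p is one way to carry out the exploration the abstract describes.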
6.
Mark P. Holland, Renato Vitolo, Pau Rabassa, Alef E. Sterk, Henk W. Broer 《Physica D: Nonlinear Phenomena》2012,241(5):497-513
Extreme value theory for chaotic deterministic dynamical systems is a rapidly expanding area of research. Given a system and a real function (observable) defined on its phase space, extreme value theory studies the limit probabilistic laws obeyed by large values attained by the observable along orbits of the system. Based on this theory, the so-called block maximum method is often used in applications for statistical prediction of large-value occurrences. In this method, one performs statistical inference for the parameters of the Generalised Extreme Value (GEV) distribution, using maxima over blocks of regularly sampled observable values along an orbit of the system. The observables studied so far in the theory are expressed as functions of the distance with respect to a point, which is assumed to be a density point of the system's invariant measure. However, at least with respect to the ambient (usually Euclidean) metric, this is not the structure of the observables typically encountered in physical applications, such as wind speed or vorticity in atmospheric models. In this paper we consider extreme value limit laws for observables which are not expressed as functions of the distance (in the ambient metric) from a density point of the dynamical system. In such cases, the limit laws are no longer determined by the functional form of the observable and the dimension of the invariant measure: they also depend on the specific geometry of the underlying attractor and of the observable's level sets. We present a collection of analytical and numerical results, starting with a toral hyperbolic automorphism as a simple template to illustrate the main ideas. We then formulate our main results for a uniformly hyperbolic system, the solenoid map. We also discuss non-uniformly hyperbolic examples of maps (Hénon and Lozi maps) and of flows (the Lorenz63 and Lorenz84 models). Our purpose is to outline the main ideas and to highlight several serious problems found in the numerical estimation of the limit laws.
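The block maximum method described above can be sketched as follows, using Arnold's cat map (a toral hyperbolic automorphism, the simple template mentioned in the abstract) with a classical distance observable. The orbit length, block size, and reference point are arbitrary illustrative choices, and the naive floating-point iteration only approximates a true orbit.

```python
import numpy as np
from scipy.stats import genextreme

def cat_map_orbit(n, x0=0.3, y0=0.7):
    """Naive float iteration of Arnold's cat map (x, y) -> (2x+y, x+y) mod 1."""
    pts = np.empty((n, 2))
    x, y = x0, y0
    for i in range(n):
        pts[i] = (x, y)
        x, y = (2.0 * x + y) % 1.0, (x + y) % 1.0
    return pts

orbit = cat_map_orbit(200_000)

# classical distance-based observable: -log distance to a reference point
ref = np.array([0.5, 0.5])
obs = -np.log(np.linalg.norm(orbit - ref, axis=1) + 1e-12)

# block maximum method: split the series into blocks, keep each block's
# maximum, then fit the Generalised Extreme Value distribution to them
block = 1000
maxima = obs[: len(obs) // block * block].reshape(-1, block).max(axis=1)
shape, loc, scale = genextreme.fit(maxima)
```

For observables that are not distance-based, the abstract's point is precisely that this fitted shape parameter also reflects the geometry of the attractor and of the observable's level sets, not just the observable's functional form.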
7.
We generalize some identities and q-identities previously known for the symmetric group to Coxeter groups of types B and D. The extended results include theorems of Foata and Schützenberger, Gessel, and Roselle on various distributions of statistics such as the inversion number, major index, and descent number. To prove our results, we provide explicit characterizations of the systems of minimal coset representatives of Coxeter groups of types B and D.
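The permutation statistics named above can be computed directly. The snippet below also checks MacMahon's classical fact that the inversion number and the major index are equidistributed over the symmetric group (here S4), the type-A starting point that such results extend to types B and D.

```python
from itertools import permutations
from collections import Counter

def inv(p):
    """Inversion number: pairs i < j with p[i] > p[j]."""
    return sum(p[i] > p[j]
               for i in range(len(p)) for j in range(i + 1, len(p)))

def descents(p):
    """Descent positions i (0-based) with p[i] > p[i+1]."""
    return [i for i in range(len(p) - 1) if p[i] > p[i + 1]]

def maj(p):
    """Major index: sum of descent positions, counted 1-based."""
    return sum(i + 1 for i in descents(p))

def des(p):
    """Descent number."""
    return len(descents(p))

S4 = list(permutations(range(4)))

# MacMahon: inv and maj have the same distribution over the symmetric group
assert Counter(map(inv, S4)) == Counter(map(maj, S4))
```

For example, the permutation (2, 0, 1) has inv = 2, a single descent at position 1, and hence maj = 1.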
8.
We report a case study that explored how three college students mentally represented their knowledge of inferential statistics, how this knowledge was connected, and how it was applied in two problem-solving situations. A concept map task and two problem categorization tasks were used, along with interviews, to gather the data. We found that the students' representations were based on incomplete statistical understanding. Although they grasped various concepts and inferential tests, the students rarely linked key concepts to one another or to the tests, nor did they accurately apply that knowledge to categorize word problems. We suggest that one reason the students had difficulty applying their knowledge is that it was not sufficiently integrated. In addition, we found that varying the instructions for the categorization task elicited different mental representations. One instruction was particularly effective in revealing students' partial understandings. This finding suggests that modifying the task format as we have done could be a useful diagnostic tool.
9.
10.
Based on data obtained from a survey recently conducted in Shanghai, this paper presents a hybrid technique for risk analysis and evaluation of certain diseases. After determining the main risk factors of these diseases by analysis of variance, the authors introduce a new concept, the 'Illness Fuzzy Set', and use fuzzy comprehensive evaluation to evaluate residents' risk of suffering from a disease. An optimization technique is used to determine the weights w_i in the fuzzy comprehensive evaluation, and a new method, 'Improved Information Distribution', is also introduced to treat the small-sample problem. It is shown that the results obtained with the hybrid technique are better than those obtained with a single fuzzy technique or a single statistical method.
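A minimal sketch of the fuzzy comprehensive evaluation step referred to above: factor weights w are combined with a membership matrix R into a grade vector b = w · R. The weights and membership values below are made-up illustrations; the paper determines the weights by an optimization technique rather than fixing them by hand.

```python
import numpy as np

# three risk factors, three risk grades (low, medium, high)
w = np.array([0.5, 0.3, 0.2])       # illustrative factor weights
R = np.array([[0.6, 0.3, 0.1],      # row i: memberships of factor i
              [0.2, 0.5, 0.3],      # in grades (low, medium, high)
              [0.1, 0.3, 0.6]])

b = w @ R                           # weighted-average composition operator
b = b / b.sum()                     # normalise to a grade distribution
risk_grade = ["low", "medium", "high"][int(np.argmax(b))]
```

With these illustrative numbers the grade vector is (0.38, 0.36, 0.26), so the evaluated risk grade is "low"; other composition operators (e.g. max-min) are also common in fuzzy comprehensive evaluation.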