3779 results found (search took 15 ms); results 141–150 are shown below.
141.
Iterative Estimation of the Extreme Value Index   (total citations: 1; self-citations: 0; citations by others: 1)
Let {Xn, n ≥ 1} be a sequence of independent random variables with common continuous distribution function F having a finite and unknown upper endpoint. A new iterative estimation procedure for the extreme value index γ is proposed, and one implemented iterative estimator is investigated in detail; it is asymptotically as good as the uniform minimum variance unbiased estimator in an ideal model. Moreover, the superiority of the iterative estimator over its non-iterated counterpart in the non-asymptotic case is shown in a simulation study. AMS 2000 Subject Classification: 62G32. Supported by the Swiss National Science Foundation.
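The abstract does not spell out the iterative procedure. As a point of reference only, here is a minimal sketch of the classical Pickands estimator of γ, which (unlike the Hill estimator) remains valid in the finite-upper-endpoint case γ < 0 considered above; the function name and the choice of level k are illustrative assumptions, not the paper's method:

```python
import numpy as np

def pickands_estimator(x, k):
    """Classical Pickands estimator of the extreme value index gamma.

    Valid for any real gamma (heavy, light, or finite-endpoint tails).
    Requires 4*k <= n.  This is a textbook baseline, not the iterative
    scheme of the abstract.
    """
    xs = np.sort(x)
    n = len(xs)
    if 4 * k > n:
        raise ValueError("need 4*k <= n")
    a = xs[n - k] - xs[n - 2 * k]          # spacing of upper order statistics
    b = xs[n - 2 * k] - xs[n - 4 * k]
    return np.log(a / b) / np.log(2.0)
```

For an exactly uniform grid of points on (0, 1] (a distribution with γ = -1), the estimator returns -1 exactly.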
142.
Summary  Common non-parametric estimators of a probability density function (PDF) perform poorly for heavy-tailed PDFs. Using a parametric approximation of the true cumulative distribution function (CDF), transformation-retransformation of the data is explored here as a useful tool for reliable PDF prediction. The PDF estimators are compared by their capacity to solve a classification problem. Simulation results and an application to Web data analysis are also presented.
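A minimal sketch of the transformation-retransformation idea, assuming a simple log transform as a stand-in for the parametric CDF fit described in the abstract (the function name, bandwidth, and grid are illustrative): estimate the density on the lighter-tailed transformed scale with a standard Gaussian kernel, then map it back with the Jacobian of the transform.

```python
import numpy as np

def transform_kde(x, grid, bw):
    """Transformation-retransformation density estimate for heavy tails.

    Sketch only: a log transform (crude surrogate for the parametric CDF
    transform of the abstract) followed by a Gaussian KDE; the Jacobian
    1/x maps the estimate back to the original scale.  Assumes x > 0.
    """
    y = np.log(x)                      # transform to a lighter-tailed scale
    t = np.log(grid)
    u = (t[:, None] - y[None, :]) / bw # Gaussian kernel on transformed scale
    g = np.exp(-0.5 * u ** 2).sum(axis=1) / (len(y) * bw * np.sqrt(2 * np.pi))
    return g / grid                    # retransform with the Jacobian
```

On the retransformed scale the estimate is non-negative and integrates to approximately one over any grid covering the bulk of the data.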
143.
In this note we show that the mathematical tools of cooperative game theory allow a successful approach to the statistical problem of estimating a density function. Specifically, any random sample of an absolutely continuous random variable determines a transferable utility game, the Shapley value of which proves to be an estimator of the density function of binned kernel and WARPing types, with good computational and statistical properties. The authors acknowledge the financial support of the Spanish Ministry for Science and Technology and FEDER through projects BFM2002-03213 and BEC2002-04102-C02-02, and of the Xunta de Galicia through projects PGIDT00PXI20104PR and PGIDT03PXIC20701PN. They also thank two anonymous referees for their comments.
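The game-theoretic derivation cannot be reconstructed from the abstract alone, but the binned kernel/WARPing family it connects to can be sketched as an Average Shifted Histogram: m shifted histograms of bin width h = m·δ are averaged, approximating a triangular-kernel estimate at low cost. The function name and parameter choices below are illustrative assumptions:

```python
import numpy as np

def ash_density(x, grid_lo, grid_hi, nbins=200, m=5):
    """Average Shifted Histogram (WARPing-type) density estimate.

    Sketch of the binned-kernel/WARPing family the abstract refers to
    (the Shapley-value derivation is not shown).  Averaging m shifted
    histograms yields triangular weights w_k = 1 - |k|/m on neighbouring
    bins of width delta, with effective bandwidth h = m * delta.
    """
    delta = (grid_hi - grid_lo) / nbins
    counts, edges = np.histogram(x, bins=nbins, range=(grid_lo, grid_hi))
    centers = (edges[:-1] + edges[1:]) / 2
    f = np.zeros(nbins)
    for k in range(1 - m, m):
        shifted = np.roll(counts, k)
        if k > 0:                      # zero out wrapped-around bins
            shifted[:k] = 0
        elif k < 0:
            shifted[k:] = 0
        f += (1 - abs(k) / m) * shifted
    f /= len(x) * m * delta
    return centers, f
```

The weights sum to m, so the estimate integrates to one up to boundary truncation.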
144.
In this paper we give an account of a new change of perspective in non-linear modelling and prediction as applied to smooth systems. The core element of these developments is the Gamma test, a non-linear modelling and analysis tool which allows us to examine the nature of a hypothetical input/output relationship in a numerical data-set. In essence, the Gamma test allows us to efficiently calculate that part of the variance of the output which cannot be accounted for by the existence of any smooth model based on the inputs, even though this model is unknown. A key aspect of this tool is its speed: the Gamma test has time complexity O(M log M), where M is the number of data-points. For data-sets consisting of a few thousand points and a reasonable number of attributes, a single run of the Gamma test typically takes a few seconds. Around this essentially simple procedure a new set of analytical tools has evolved which allow us to model smooth non-linear systems directly from the data with a precision and confidence that was hitherto inaccessible. In this paper we briefly describe the Gamma test and its benefits in model identification and model building, and then in more detail explain and motivate the procedures which facilitate a Gamma analysis. We briefly report on a case study applying these ideas to the practical problem of predicting level and flow rates in the Thames valley river basin. Finally we speculate on the future development and enhancement of these techniques into areas such as data mining and the production of complex non-linear models directly from data via graphical representations of process charts and automated Gamma analysis of each input-output node.
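A rough sketch of the Gamma statistic described above: for k = 1..p, pair the mean squared distance to each point's k-th nearest input-space neighbour with half the mean squared difference of the corresponding outputs, then read off the intercept of the least-squares line as a noise-variance estimate. A brute-force O(M²) neighbour search is used here for clarity (the O(M log M) figure assumes tree-based searches); the function name and the number of neighbours p are illustrative assumptions:

```python
import numpy as np

def gamma_test(X, y, p=10):
    """Gamma test sketch: estimate the noise variance Var(r) in y = f(X) + r.

    delta(k) = mean squared distance to the k-th nearest neighbour in input
    space; gamma(k) = half the mean squared difference of the corresponding
    outputs.  The intercept of the regression gamma ~ delta estimates the
    variance that no smooth model of the inputs can explain.
    """
    X = np.asarray(X, float)
    y = np.asarray(y, float).ravel()
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise sq. dists
    np.fill_diagonal(d2, np.inf)                          # exclude self-pairs
    order = np.argsort(d2, axis=1)[:, :p]                 # p nearest neighbours
    deltas = np.take_along_axis(d2, order, axis=1).mean(axis=0)
    gammas = 0.5 * ((y[order] - y[:, None]) ** 2).mean(axis=0)
    slope, intercept = np.polyfit(deltas, gammas, 1)
    return intercept
```

On a smooth target with additive noise, the returned intercept should approach the true noise variance as the sample grows.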
145.
Cardiac elastography is a useful diagnostic technique for detection of heart function abnormalities, based on analysis of echocardiograms. The analysis of regional heart motion allows assessing the extent of myocardial ischemia and infarction. In this paper, a new two-stage algorithm for cardiac motion estimation is proposed, where the data is taken from a sequence of 2D echocardiograms. The method combines the advantages of block-matching and optical flow techniques. The first stage employs a standard block-matching algorithm (sum of absolute differences) to provide a displacement estimate with accuracy of up to one pixel. At the second stage, this estimate is corrected by estimating the parameters of a local image transform within a test window. The parameters of the image transform are estimated in the least-squares sense. In order to account for typical heart motions, such as contraction/expansion, translation and rotation, a local affine model is assumed within the test window. The accuracy of the new algorithm is evaluated using a sequence of 500 grayscale B-mode images, which are generated as distorted, but known, copies of an original ROI taken from a real echocardiogram. The accuracy of the motion estimation is expressed in terms of errors: maximum absolute error, root-mean-square error, average error and standard deviation. The errors of the proposed algorithm are compared with those of the known block-matching technique with cross-correlation and interpolation in the sub-pixel space. Statistical analysis of the errors shows that the proposed algorithm provides more accurate estimates of the heart motion than the cross-correlation technique with interpolation in the sub-pixel space.
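The first (block-matching) stage described above can be sketched as follows; the block size, search range and function name are illustrative assumptions, and the second-stage local affine refinement is omitted:

```python
import numpy as np

def block_match_sad(prev, curr, top, left, bsize=8, search=4):
    """First-stage block matching: integer-pixel displacement by SAD.

    Finds the displacement (dy, dx) within +/-search pixels that minimises
    the sum of absolute differences between a block of `prev` and the
    corresponding window of `curr`.
    """
    block = prev[top:top + bsize, left:left + bsize]
    best, best_dy, best_dx = np.inf, 0, 0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            t, l = top + dy, left + dx
            if t < 0 or l < 0 or t + bsize > curr.shape[0] or l + bsize > curr.shape[1]:
                continue                                  # window off-image
            sad = np.abs(curr[t:t + bsize, l:l + bsize] - block).sum()
            if sad < best:
                best, best_dy, best_dx = sad, dy, dx
    return best_dy, best_dx
```

Applied to a frame and a copy of it shifted by a whole number of pixels, the routine recovers the shift exactly; real echo sequences then need the sub-pixel correction stage.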
146.
Viscosity is an important property that influences industrial processes involving fluids. The transfer rate of impurities such as S, P and N is affected by the viscosity of metallic melts. Interfacial reactions and impurity removal depend on the viscosity of both the slag and the metallic melt. The viscosities of gases and liquids likewise affect the transfer process and its velocity. However, the amount of available viscosity data falls far short of the needs of today's technology, especially…
147.
The general theory of approximation of (possibly generalized) Young measures is presented, and concrete cases are investigated. An adjoint-operator approach, combined with quasi-interpolation of test integrands, is systematically used. Applicability is demonstrated on an optimal control problem for an elliptic system, together with one-dimensional illustrative calculations of various options.
148.
149.
A new a posteriori L2 norm error estimator is proposed for the Poisson equation. The error estimator can be applied to anisotropic tetrahedral or triangular finite element meshes. The estimator is rigorously analysed for Dirichlet and Neumann boundary conditions. The lower error bound relies on specifically designed anisotropic bubble functions and the corresponding inverse inequalities. The upper error bound utilizes non-standard anisotropic interpolation estimates. Its proof requires H2 regularity of the Poisson problem, and its quality depends on how well the anisotropic mesh resolves the anisotropy of the problem. This is measured by a so-called ‘matching function’. A numerical example supports the anisotropic error analysis.
150.
The main objective of statistics of extremes is the prediction of rare events, and its primary problem has been the estimation of the tail index γ, usually performed on the basis of the largest k order statistics in the sample or of the excesses over a high level u. The question often addressed in practical applications of extreme value theory is the choice of either k or u, and an adaptive estimation of γ. We shall here be mainly interested in the use of the bootstrap methodology to estimate γ adaptively, and although the methods provided may be applied, with adequate modifications, to the general domain of attraction of Gγ, γ ∈ ℝ, we shall here illustrate the methods for heavy right tails, i.e. for γ > 0. Special relevance will be given to the use of an auxiliary statistic that is merely the difference of two estimators with the same functional form as the estimator under study, computed at two different levels. We shall also compare, through Monte Carlo simulation, these bootstrap methodologies with other data-driven choices of the optimal sample fraction available in the literature.
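Assuming the estimator under study is the Hill estimator (the heavy-tail case γ > 0 illustrated in the abstract), the auxiliary statistic, i.e. the difference of the same estimator computed at two levels, might be sketched as follows; the choice of the second level k//2 and the function names are illustrative assumptions:

```python
import numpy as np

def hill(x, k):
    """Hill estimator of gamma > 0 from the k largest order statistics."""
    xs = np.sort(x)[::-1]                       # descending order statistics
    return np.mean(np.log(xs[:k]) - np.log(xs[k]))

def auxiliary_statistic(x, k):
    """Difference of the Hill estimator at two levels, k and k//2.

    Sketch of the auxiliary statistic of the abstract: it has the same
    functional form at both levels, and its mean tends to zero, so
    bootstrap replicates of it can drive the adaptive choice of the
    sample fraction k (the bootstrap machinery itself is omitted).
    """
    return hill(x, k) - hill(x, k // 2)
```

For a strict Pareto sample with γ = 0.5 the Hill estimate concentrates near 0.5 and the auxiliary statistic near zero.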