Paid full text: 2932 articles
Free: 187 articles
Free (domestic): 3 articles
Chemistry: 209 articles
Crystallography: 3 articles
Mechanics: 134 articles
General: 34 articles
Mathematics: 1639 articles
Physics: 1103 articles
Articles by year:
2023: 25   2022: 15   2021: 46   2020: 67   2019: 77   2018: 63   2017: 69   2016: 86
2015: 35   2014: 116   2013: 296   2012: 105   2011: 175   2010: 110   2009: 137   2008: 144
2007: 173   2006: 155   2005: 130   2004: 133   2003: 114   2002: 125   2001: 101   2000: 88
1999: 84   1998: 73   1997: 40   1996: 41   1995: 34   1994: 27   1993: 29   1992: 13
1991: 19   1990: 9   1989: 20   1988: 16   1987: 12   1986: 14   1985: 18   1984: 18
1983: 9   1982: 8   1981: 8   1980: 7   1979: 6   1978: 7   1977: 6   1976: 5
1974: 4   1970: 3
Sort order: a total of 3122 results found; search time 15 ms.
1.
Discrete Mathematics, 2023, 346(5): 113303
As is widely acknowledged, one of the most classical and remarkable tools for establishing the asymptotic normality of combinatorial statistics is Harper's real-rooted method, proposed in 1967. However, this classical theorem has some obvious shortcomings; for example, it requires knowledge of all the roots of the corresponding generating function, which is impossible in general. Aiming to overcome this shortcoming to some extent, in this paper we present an improved asymptotic normality criterion, along with several variant versions, which usually require only one coefficient of the generating function and no knowledge of its roots. By virtue of these new criteria, the asymptotic normality of several common combinatorial statistics can be established and extended. Among these, we treat the applications to matching numbers and Laplacian coefficients in detail. Some relevant conjectures, proposed by Godsil (Combinatorica, 1981) and Wang et al. (J. Math. Anal. Appl., 2017), are generalized and verified as corollaries.
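For orientation, here is a sketch of the classical criterion that the improved results relax; this is the textbook statement of Harper's real-rooted method, summarized as background rather than quoted from the paper.

Let P_n(x) = \sum_{k} a_{n,k} x^k have nonnegative coefficients and only real roots, and let X_n be the statistic with \Pr[X_n = k] = a_{n,k} / P_n(1). With

    \mu_n = \frac{P_n'(1)}{P_n(1)}, \qquad
    \sigma_n^2 = \frac{P_n''(1)}{P_n(1)} + \mu_n - \mu_n^2,

Harper's theorem states that if \sigma_n^2 \to \infty, then (X_n - \mu_n)/\sigma_n converges in distribution to N(0,1). The hypothesis that every P_n be real-rooted is exactly the requirement the criteria of this paper aim to avoid, replacing it by conditions on a single coefficient.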
2.
Hydrologic design is often based on assessments of large-return-interval measures, so it is vital to estimate them as precisely as possible. Hence, the selection of a probability distribution is crucial in such cases. In view of this, we propose and study a flexible probability distribution for precipitation data analysis. Some of its mathematical and statistical properties are analyzed. In order to make stronger predictions and judge the realistic return period, we also characterize the model via the Laplace transformation. We estimate its parameters via maximum likelihood estimation and construct its information matrix for developing confidence intervals for the population parameters. Moreover, a real-life setting is considered by applying the model to precipitation data from diverse regions, including Jacksonville, Florida (USA), Barkhan (Pakistan), British Columbia (Canada), and Alexandria (Egypt). This investigation is based on various parametric and nonparametric statistical tests, which indicate that the proposed model is one of the better strategies for precipitation data analysis when compared with the well-known three-parameter Kappa model.
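As a rough illustration of the fit-and-return-period workflow described above, the sketch below fits a generalized extreme value distribution to made-up annual precipitation maxima with scipy and reads off a 100-year return level. The GEV is only a stand-in for the paper's proposed model and the Kappa benchmark, and all values are hypothetical.

# Minimal sketch: fit a candidate distribution by maximum likelihood and
# estimate a T-year return level. The GEV here is only a stand-in for the
# paper's proposed model; the precipitation values are illustrative.
import numpy as np
from scipy import stats

annual_max_precip = np.array([112.0, 98.5, 140.2, 87.3, 160.8,
                              121.4, 95.0, 133.7, 150.1, 108.9])  # mm, hypothetical

# Maximum likelihood estimates of shape, location and scale.
shape, loc, scale = stats.genextreme.fit(annual_max_precip)

# Nonparametric goodness-of-fit check, in the spirit of the paper's comparison.
ks_stat, ks_pvalue = stats.kstest(annual_max_precip, "genextreme",
                                  args=(shape, loc, scale))

# T-year return level: the (1 - 1/T) quantile of the fitted distribution.
T = 100
return_level = stats.genextreme.ppf(1.0 - 1.0 / T, shape, loc, scale)

print(f"MLE shape={shape:.3f}, loc={loc:.2f}, scale={scale:.2f}")
print(f"KS p-value: {ks_pvalue:.3f}, {T}-year return level: {return_level:.1f} mm")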
3.
Discrete Mathematics, 2022, 345(12): 113074
It has previously been observed that the limiting gap distribution of the directions to visible points of planar quasicrystals may vanish near zero, that is, there exist planar quasicrystals with a positive limiting minimal normalised gap between the angles of visible points. The exact values of these limiting minimal normalised gaps have not been determined. In this paper we give explicit formulas for the densities of visible points for planar quasicrystals from several families, which include the Ammann–Beenker point set and the vertex sets of some rhombic Penrose tilings. Combining these results with a known characterisation of the limiting minimal gap in terms of a probability measure on an associated homogeneous space of quasicrystals, we give explicit values of the limiting minimal normalised gap between the angles of visible points for several families of planar quasicrystals, in particular, for the Ammann–Beenker point set and for the vertex sets of some rhombic Penrose tilings. We also compare our results with numerical observations.
4.
Metabolomics is a truly interdisciplinary field of science, which combines analytical chemistry, platform technology, mass spectrometry, and NMR spectroscopy with sophisticated data analysis. Applied to biomarker discovery, it includes aspects of pathobiochemistry, systems biology/medicine, and molecular diagnostics and requires bioinformatics and multivariate statistics. While successfully established in the screening of inborn errors in neonates, metabolomics is now widely used in the characterization and diagnostic research of an ever-increasing number of diseases. In this Review we highlight important technical prerequisites as well as recent developments in metabolomics and metabolomics data analysis, with special emphasis on their utility in biomarker identification and qualification, as well as targeted metabolomics employing high‐throughput mass spectrometry.
5.
In this paper, we explore a novel approach for assessing the impact of a professional development programme on the classroom practice of in-service middle school mathematics teachers. The particular focus of this study is assessing the impact on teachers' use of classroom strategies that foster the mathematical habits of mind and mathematical self-efficacy of their students. We describe the creation and testing of a student survey designed to assess teacher classroom practice based primarily on students' ratings of teacher practices.
6.
7.
Risk-adjusted distributions are commonly used in actuarial science to define premium principles. In this paper, we argue that an appropriate risk-adjusted distribution, besides satisfying other desirable properties, should be well behaved under conditioning with respect to the original risk distribution. Based on a sequence of such risk-adjusted distributions, we introduce a family of premium principles that gradually incorporate the insurer's degree of risk aversion into the risk loading. Members of this family are particular distortion premium principles that can be represented as mixtures of TVaRs, where the weights in the mixture reflect the insurer's attitude toward risk. We make a systematic study of this family of premium principles.
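For context, these are the standard definitions behind the TVaR-mixture representation mentioned above; they are textbook forms stated as background, not quoted from the paper.

    \mathrm{TVaR}_{\alpha}(X) = \frac{1}{1-\alpha} \int_{\alpha}^{1} \mathrm{VaR}_{u}(X)\, du, \qquad 0 \le \alpha < 1,

while a distortion premium principle with distortion function g (increasing, g(0) = 0, g(1) = 1) applied to a nonnegative risk X is

    \rho_g(X) = \int_0^{\infty} g\big(S_X(x)\big)\, dx, \qquad S_X(x) = \Pr[X > x].

When g is concave, \rho_g admits a representation as a mixture of TVaRs,

    \rho_g(X) = \int_{[0,1)} \mathrm{TVaR}_{\alpha}(X)\, \mu(d\alpha),

for some probability measure \mu on [0,1); in the family studied in the paper, such mixing weights are what encode the insurer's degree of risk aversion.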
8.
The calculation of Net Asset Values and Solvency Capital Requirements in a Solvency 2 context, together with the derivation of sensitivity analyses with respect to the main financial and actuarial risk drivers, is a complex procedure at the level of a real company, where it is illusory to rely on closed-form formulas. The most general approach to performing these computations is nested simulation. However, this method is hardly realistic in practice because of its huge demand for computational resources. The least-squares Monte Carlo method has recently been suggested as a way to overcome these difficulties. The present paper confirms that this method is indeed relevant for Solvency 2 computations at the level of a company.
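A minimal sketch of the least-squares Monte Carlo idea in this setting: many outer (real-world) scenarios, only a couple of inner valuation paths per scenario, and a regression that replaces the full inner re-valuation. The single risk factor, the quadratic basis and the toy payoff below are illustrative assumptions, not the paper's model.

# Least-squares Monte Carlo sketch: approximate the 1-year value distribution
# with a regression on outer risk factors instead of full nested simulation.
# One risk factor and a quadratic basis are illustrative choices only.
import numpy as np

rng = np.random.default_rng(0)
n_outer, n_inner = 10_000, 2          # many outer scenarios, few inner paths each

# Outer step: real-world risk factor at t = 1 (e.g. a log-return shock).
x = rng.normal(0.0, 0.2, size=n_outer)

# Inner step: crude estimate of the liability value given x.
# (Stand-in payoff; a real model would discount full cash-flow projections.)
inner_noise = rng.normal(0.0, 0.3, size=(n_outer, n_inner))
rough_value = np.mean(np.maximum(1.0 - np.exp(x[:, None] + inner_noise), 0.0), axis=1)

# Regression step: fit the conditional value as a polynomial in x.
basis = np.column_stack([np.ones_like(x), x, x**2])
coef, *_ = np.linalg.lstsq(basis, rough_value, rcond=None)
fitted_value = basis @ coef

# SCR-style quantity: 99.5% quantile of the fitted 1-year loss distribution.
loss = fitted_value - fitted_value.mean()
scr_estimate = np.quantile(loss, 0.995)
print(f"approximate 99.5% loss quantile: {scr_estimate:.4f}")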
9.
We investigate the full counting statistics of a voltage-driven normal metal (N)–superconductor (S) contact. In the low-bias regime, below the superconducting gap, the NS contact can be mapped onto a purely normal contact, albeit with doubled voltage and counting fields. Hence, in this regime, the transport characteristics can be obtained from the normal-metal results by the corresponding substitution. The elementary processes are single Andreev transfers and electron- and hole-like Andreev transfers. Considering Lorentzian voltage pulses, we find an optimal quantization for half-integer Levitons.
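The mapping described above can be written compactly. The following is a standard zero-temperature, subgap summary (a Levitov–Lesovik-type formula with Andreev reflection probabilities), given as background rather than quoted from the paper, and with spin-degeneracy factors left implicit.

For a channel of normal transmission T_n, the Andreev reflection probability is

    R_{A,n} = \frac{T_n^2}{(2 - T_n)^2},

and for eV \ll \Delta the cumulant generating function takes the normal-state form with doubled voltage and counting field,

    \ln \chi(\lambda) \simeq \frac{2 e V t_0}{h} \sum_n \ln\!\big[\, 1 + R_{A,n}\,(e^{2 i \lambda} - 1) \,\big],

i.e. the NS statistics follow from the normal-contact result via the substitutions T_n \to R_{A,n}, V \to 2V and \lambda \to 2\lambda.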
10.
Although it is known that a low signal-to-noise ratio (SNR) can affect tensor metrics, few studies reporting disease or treatment effects on fractional anisotropy (FA) report SNR; the implicit assumption is that SNR is adequate. However, the level at which low SNR biases FA may vary with tissue FA, field strength and analytical methodology. We determined the SNR thresholds at 1.5 T vs. 3 T in regions of white matter (WM) with different FA and compared FA derived using manual region-of-interest (ROI) analysis with that from tract-based spatial statistics (TBSS), an operator-independent whole-brain analysis tool. Using ROI analysis, SNR thresholds on our hardware-software magnetic resonance platforms were 25 at 1.5 T and 20 at 3 T in the callosal genu (CG), 40 at 1.5 and 3 T in the anterior corona radiata (ACR), and 50 at 1.5 T and 70 at 3 T in the putamen (PUT). Using TBSS, SNR thresholds were 20 at 1.5 T and 3 T in the CG, and 35 at 1.5 T and 40 at 3 T in the ACR. Below these thresholds, the mean FA increased logarithmically and the standard deviations widened. Achieving bias-free SNR in the PUT required at least nine acquisitions at 1.5 T and six at 3 T. In the CG and ACR, bias-free SNR was achieved with at least three acquisitions at 1.5 T and one at 3 T. When diffusion tensor imaging (DTI) is used to study regions of low FA, e.g., the basal ganglia, cerebral cortex, and WM in the abnormal brain, SNR should be documented. The SNR thresholds below which FA is biased varied with the analytical technique, the inherent tissue FA and the field strength. Studies using DTI to examine WM injury should document, as part of quality control, that bias-free SNR has been achieved in the region of the brain being studied.
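For readers checking their own data against thresholds like those above, a common rough SNR estimate divides the mean b = 0 signal in the tissue ROI by the background noise standard deviation, corrected for the Rayleigh statistics of magnitude images. The sketch below assumes single-channel magnitude data and uses synthetic numbers purely so that it runs; the thresholds quoted in its output are the ones reported in the abstract, for comparison only.

# Rough ROI-based SNR estimate for a b=0 diffusion image (magnitude data,
# single-channel assumption). The correction factor ~0.655 accounts for the
# Rayleigh-distributed background noise in magnitude images.
import numpy as np

def roi_snr(b0_volume: np.ndarray, tissue_mask: np.ndarray,
            background_mask: np.ndarray) -> float:
    signal = b0_volume[tissue_mask].mean()
    noise_sd = b0_volume[background_mask].std() / 0.655
    return float(signal / noise_sd)

# Hypothetical synthetic volume, only to make the function runnable here.
rng = np.random.default_rng(1)
vol = rng.rayleigh(scale=20.0, size=(64, 64, 30))
tissue = np.zeros(vol.shape, dtype=bool)
tissue[20:40, 20:40, 10:20] = True
vol[tissue] += 600.0                         # pretend tissue signal
background = np.zeros(vol.shape, dtype=bool)
background[:8, :8, :] = True

snr = roi_snr(vol, tissue, background)
print(f"estimated SNR: {snr:.1f}  (ROI thresholds quoted above: genu ~20-25, ACR ~40)")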