121.
We develop a method of randomizing units to treatments that relies on subjective judgement or on possible coarse modeling to produce restrictions on the randomization. The procedure thus fits within the general framework of ranked set sampling. However, instead of selecting a single unit from each set for full measurement, all units within a set are used. The units within a set are assigned to different treatments. Such an assignment translates the positive dependence among units within a set into a reduction in variation of contrasting features of the treatments. A test for treatment versus control comparison, with controlled familywise error rate, is developed along with the associated confidence intervals. The new procedure is shown to be superior to corresponding procedures based on completely randomized or ranked set sample designs. The superiority appears both in asymptotic relative efficiency and in power for finite sample sizes. Importantly, this test does not rely on perfect rankings; rather, the information in the data on the quality of rankings is exploited to maintain the level of the test when rankings are imperfect. The asymptotic relative efficiency of the test is not affected by estimation of the quality of rankings, and the finite sample performance is only mildly affected.
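For context, a minimal Python sketch of one plausible reading of this design: units are grouped into small sets, ranked within each set by a rough judgment score, and every unit in a set is then assigned to a different treatment, so that the positive within-set dependence shows up as reduced variation in treatment contrasts. The function name, the random formation of sets, and the random permutation of treatment labels within each set are illustrative assumptions, not the paper's exact scheme.

import random

def ranked_set_assignment(units, judgment_score, treatments, seed=0):
    """Assign every unit in each judgment-ranked set to a different treatment.

    units          : list of experimental units
    judgment_score : callable giving a rough (possibly imperfect) ranking value
    treatments     : list of k treatment labels; the set size equals k
    Returns a list of (unit, treatment) pairs.
    """
    rng = random.Random(seed)
    k = len(treatments)
    pool = units[:]
    rng.shuffle(pool)                              # random formation of sets
    assignment = []
    for i in range(0, len(pool) - len(pool) % k, k):
        ranked_set = sorted(pool[i:i + k], key=judgment_score)   # judgment ranking
        labels = treatments[:]
        rng.shuffle(labels)                        # randomize rank-to-treatment mapping
        assignment.extend(zip(ranked_set, labels))
    return assignment

# Example: 12 units judged through a noisy covariate, one control and two treatments.
pairs = ranked_set_assignment(list(range(12)),
                              judgment_score=lambda u: u + random.gauss(0, 1),
                              treatments=["control", "A", "B"])
print(pairs)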
122.
123.
124.
The general theory of the cyclotron maser instability driven by a weakly relativistic electron beam is discussed in detail. The derivations of, and the details behind, several analytical expressions that arise in obtaining a practical expression for the growth rate are carefully supplemented and explained, and an approximate expression for the growth rate is added. From this, an analytical account of the main features of the cyclotron maser instability is obtained and compared with exact calculations, so that the whole theory is given a complete description. The emphasis is on the analytical discussion, but some general numerical results are also provided. Keywords: cyclotron maser instability; weakly relativistic electron beam; growth rate
125.
What strategy should a football (soccer, in American parlance) club adopt when deciding whether to sack its manager? This paper introduces a simple model assuming that a club's objective is to maximize the number of league points that it scores per season. The club's strategy consists of three choices: the length of the honeymoon period during which it will not consider sacking a new manager, the level of the performance trapdoor below which the manager gets the sack, and the weight that it will give to more recent games compared to earlier ones. Some data from the last six seasons of the English Premiership are used to calibrate the model. At this early stage of the research, the best strategy appears to be to have only a short honeymoon period of eight games (much less than the actual shortest period of 12 games), to set the trapdoor at 0.74 points per game, and to put 47% of the weight on the last five games. A club adopting this strategy would obtain on average 56.8 points per season, compared to a Premiership average of 51.8 points.
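As a rough illustration only, here is a minimal Python sketch of the decision rule described above, using the calibrated values the abstract reports (an eight-game honeymoon, a trapdoor of 0.74 points per game, and 47% of the weight on the last five games). The function name and the exact way the weight is split between recent and earlier games are assumptions made for the sketch.

def should_sack(points_history, honeymoon=8, trapdoor=0.74,
                recent_weight=0.47, recent_window=5):
    """Decide whether to sack the manager after the latest game.

    points_history : league points per game under this manager (0, 1 or 3 each)
    Returns True when the weighted points-per-game rate falls below the
    trapdoor once the honeymoon period has passed.
    """
    n = len(points_history)
    if n <= honeymoon:
        return False                       # never sack during the honeymoon
    recent = points_history[-recent_window:]
    earlier = points_history[:-recent_window]
    recent_ppg = sum(recent) / len(recent)
    earlier_ppg = sum(earlier) / len(earlier) if earlier else recent_ppg
    weighted_ppg = recent_weight * recent_ppg + (1 - recent_weight) * earlier_ppg
    return weighted_ppg < trapdoor

# Example: a manager whose early form was poor and whose recent form is worse.
form = [1, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0]
print(should_sack(form))   # True: the weighted rate is about 0.36, below the 0.74 trapdoor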
126.
127.
We consider estimation of loss for generalized Bayes or pseudo-Bayes estimators of a multivariate normal mean vector, θ. In 3 and higher dimensions, the MLE X is UMVUE and minimax but is inadmissible. It is dominated by the James-Stein estimator and by many others. Johnstone (1988, On inadmissibility of some unbiased estimates of loss, Statistical Decision Theory and Related Topics, IV (eds. S. S. Gupta and J. O. Berger), Vol. 1, 361–379, Springer, New York) considered the estimation of loss for the usual estimator X and the James-Stein estimator. He found improvements over the Stein unbiased estimator of risk. In this paper, for a generalized Bayes point estimator of θ, we compare generalized Bayes estimators to unbiased estimators of loss. We find, somewhat surprisingly, that the unbiased estimator often dominates the corresponding generalized Bayes estimator of loss for priors which give minimax estimators in the original point estimation problem. In particular, we give a class of priors for which the generalized Bayes estimator of θ is admissible and minimax but for which the unbiased estimator of loss dominates the generalized Bayes estimator of loss. We also give a general inadmissibility result for a generalized Bayes estimator of loss. Research supported by NSF Grant DMS-97-04524.
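For orientation, the standard textbook objects the abstract refers to, written out: with X ~ N_p(θ, I_p), p ≥ 3, and squared-error loss, the James-Stein estimator and its usual unbiased loss (risk) estimate are

\[
  \hat{\theta}_{\mathrm{JS}}(X) \;=\; \Bigl(1 - \frac{p-2}{\lVert X\rVert^{2}}\Bigr) X,
  \qquad
  \hat{\lambda}_{\mathrm{JS}}(X) \;=\; p \;-\; \frac{(p-2)^{2}}{\lVert X\rVert^{2}},
\]

while the trivial unbiased loss estimate for the MLE X is the constant p. These are the classical forms only; they are not the generalized Bayes priors or loss estimators studied in the paper.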
128.
This paper considers how to exploit, in a discretized model, the duality and complementarity of two saddle-point variational principles. A homology family of optimality conditions, different from the conventional saddle-point conditions of the domain-decomposed Hellinger-Reissner principle, is derived to enhance the stability of hybrid finite element schemes. On this basis, a stabilized hybrid method is presented by associating the element-interior displacement with an element-boundary one in a nonconforming manner. In addition, energy compatibility of the strain-enriched displacements with respect to the stress terms is introduced to circumvent Poisson locking.
129.
On effectiveness of wiretap programs in mapping social networks (cited 1 time: 0 self-citations, 1 by others)
Snowball sampling methods are known to be biased toward highly connected actors and consequently produce core-periphery networks when these may not necessarily be present. This leads to a biased perception of the underlying network, which can have negative policy consequences, as in the identification of terrorist networks. When snowball sampling is used, the potential overload of the information collection system is a distinct problem due to the exponential growth of the number of suspects to be monitored. In this paper, we focus on evaluating the effectiveness of a wiretapping program in terms of its ability to map the rapidly evolving networks within a covert organization. By running a series of simulation-based experiments, we are able to evaluate a broad spectrum of information gathering regimes based on a consistent set of criteria. We conclude by proposing a set of information gathering programs that achieve higher effectiveness than snowball sampling, and at a lower cost.

Maksim Tsvetovat is an Assistant Professor at the Center for Social Complexity and the Department of Public and International Affairs at George Mason University, Fairfax, VA. He received his Ph.D. from the Computation, Organizations and Society program in the School of Computer Science, Carnegie Mellon University. His dissertation centered on the use of artificial intelligence techniques such as planning and semantic reasoning as a means of studying the behavior and evolution of complex social networks, such as those of terrorist organizations. He received a Master of Science degree from the University of Minnesota with a specialization in Artificial Intelligence and the design of Multi-Agent Systems, and has also extensively studied organization theory and social science research methods. His research centers on building high-fidelity simulations of social and organizational systems using concepts from distributed artificial intelligence and multi-agent systems. Other projects focus on social network analysis for mapping internal corporate networks and on the study of covert and terrorist organizations. Maksim's vita and publications can be found on

Kathleen M. Carley is a professor in the School of Computer Science at Carnegie Mellon University and the director of the Center for Computational Analysis of Social and Organizational Systems (CASOS), which has over 25 members, both students and research staff. Her research combines cognitive science, social networks and computer science to address complex social and organizational problems. Her specific research areas are dynamic network analysis, computational social and organization theory, adaptation and evolution, text mining, and the impact of telecommunication technologies and policy on communication, information diffusion, disease contagion and response within and among groups, particularly in disaster or crisis situations. She and her lab have developed infrastructure tools for analyzing large-scale dynamic networks and various multi-agent simulation systems. The infrastructure tools include ORA, a statistical toolkit for analyzing and visualizing multi-dimensional networks. ORA results are organized into reports that meet various needs, such as the management report, the mental model report, and the intelligence report. Another tool is AutoMap, a text-mining system for extracting semantic networks from texts and then cross-classifying them, using an organizational ontology, into the underlying social, knowledge, resource and task networks.

Her simulation models meld multi-agent technology with network dynamics and empirical data. Three of the large-scale multi-agent network models she and the CASOS group have developed in the counter-terrorism area are: BioWar, a city-scale dynamic-network agent-based model for understanding the spread of disease and illness due to natural epidemics, chemical spills, and weaponized biological attacks; DyNet, a model of the change in covert networks, naturally and in response to attacks, under varying levels of information uncertainty; and RTE, a model for examining state failure and the escalation of conflict at the city, state, nation, and international levels as changes occur within and among red, blue, and green forces. She is the founding co-editor, with Al Wallace, of the journal Computational Organization Theory and has co-edited several books and written over 100 articles in the computational organizations and dynamic network area. Her publications can be found at: http://www.casos.cs.cmu.edu/bios/carley/publications.php
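To make the sampling scheme being criticized concrete, here is a minimal, self-contained Python sketch of snowball sampling on a graph stored as an adjacency dictionary. The function name, the fixed fan-out per wave, and the toy network are illustrative assumptions, not the simulation setup used in the paper.

import random

def snowball_sample(adjacency, seeds, waves=2, fanout=3, seed=0):
    """Snowball sampling: start from seed actors, then repeatedly follow a few
    neighbours of every actor discovered in the previous wave.

    adjacency : dict mapping each actor to a list of its neighbours
    Returns the set of actors observed.
    """
    rng = random.Random(seed)
    observed = set(seeds)
    frontier = list(seeds)
    for _ in range(waves):
        next_frontier = []
        for actor in frontier:
            neighbours = adjacency.get(actor, [])
            for other in rng.sample(neighbours, min(fanout, len(neighbours))):
                if other not in observed:
                    observed.add(other)
                    next_frontier.append(other)
        frontier = next_frontier
    return observed

# Toy network: one well-connected hub plus a sparse periphery.
graph = {
    "hub": ["a", "b", "c", "d", "e"],
    "a": ["hub", "b"], "b": ["hub", "a"], "c": ["hub"],
    "d": ["hub"], "e": ["hub", "f"], "f": ["e"],
}
print(snowball_sample(graph, seeds=["a"]))
# High-degree actors such as "hub" are reached almost immediately, which is the
# bias toward highly connected actors that the abstract discusses.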
130.
A new stochastic method of reconstructing porous media (cited 1 time: 0 self-citations, 1 by others)
We present a new stochastic method of reconstructing a porous medium from limited morphological information obtained from two-dimensional micro-images of a real porous medium. The method is similar to the simulated annealing method in its capability of reconstructing both isotropic and anisotropic multi-phase structures, but differs from the latter in that voxels for exchange are not selected completely at random, since their neighborhoods are also checked; the new method is also much simpler to implement and program. We applied it to reconstruct real sandstone using the morphological information contained in the porosity, the two-point probability function and the linear-path function. Good agreement with these reference functions verifies the capability of the developed method. The isolated regions that exist in both the pore phase and the matrix phase do only minor harm to their connectivity. The lattice Boltzmann method (LBM) is used to compute the permeability of the reconstructed system, and the results show its good isotropy and conductivity. However, because the connectivity of the reconstructed pore space decreases as the porosity becomes small, we suggest that the porosity of the system to be reconstructed be no less than 0.2 to ensure its connectivity and conductivity.
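As background on one of the reference statistics mentioned above, here is a short NumPy sketch of the standard two-point probability function for a binary pore-phase image: S2(r) is the probability that two points a distance r apart along a chosen axis both lie in the pore phase. The function name, the axis-by-axis sampling and the toy image are assumptions for illustration; this is not the authors' reconstruction code.

import numpy as np

def two_point_probability(pore, max_lag, axis=0):
    """Two-point probability function S2(r) of a binary image along one axis.

    pore    : boolean array, True where a voxel/pixel belongs to the pore phase
    max_lag : largest separation r (in voxels) to evaluate
    Returns s2 with s2[r] = P(both x and x + r along the axis are pore).
    """
    pore = np.asarray(pore, dtype=bool)
    n = pore.shape[axis]
    s2 = np.empty(max_lag + 1)
    for r in range(max_lag + 1):
        a = np.take(pore, range(n - r), axis=axis)
        b = np.take(pore, range(r, n), axis=axis)
        s2[r] = np.mean(a & b)
    return s2

# Example: an uncorrelated 2-D image with porosity 0.2 (the suggested lower bound).
rng = np.random.default_rng(0)
image = rng.random((128, 128)) < 0.2
s2 = two_point_probability(image, max_lag=10)
print(s2[0])    # equals the porosity, about 0.2
print(s2[1:])   # for an uncorrelated image these are all close to porosity**2 (about 0.04)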