231.
Near-infrared spectra of southern Xinjiang mutton were collected over the 900–1700 nm wavelength range to study rapid, non-destructive detection of moisture content. To weaken the influence of non-target factors on the spectra, the spectral data were preprocessed with standard normal variate (SNV) transformation and detrending. To reduce modeling complexity and remove the effect of collinearity, the successive projections algorithm combined with the correlation coefficient method was used to select 8 characteristic wavelength variables, and PLS and ELM models were then built separately. Experiments show that, compared with modeling on the full spectral band, modeling on the characteristic wavelength variables greatly shortens the running time of both PLS and ELM, and ELM outperforms PLS in both running time and prediction accuracy. With the 8 characteristic wavelength variables, the ELM model reaches a prediction accuracy of 0.9768, a mean squared error of 4.4291e-04, and a correlation coefficient of 0.7603, with a running time kept below 1e-04 s. These results can serve as a theoretical reference for developing a portable device for detecting the moisture content of mutton.
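For illustration, below is a minimal sketch of the SNV preprocessing step and an extreme learning machine (ELM) regressor of the kind described above. The synthetic spectra, the number of hidden nodes, and the choice of eight "selected" columns are assumptions made for the example, not the study's data or code.

```python
import numpy as np

def snv(spectra):
    """Standard normal variate: center and scale each spectrum (row) individually."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

class ELMRegressor:
    """Minimal extreme learning machine: random hidden layer + least-squares output weights."""
    def __init__(self, n_hidden=30, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_features = X.shape[1]
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                    # hidden-layer outputs
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # output weights by least squares
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Illustrative usage on synthetic "spectra": 120 samples x 200 wavelengths,
# with 8 equally spaced columns standing in for the selected wavelength variables.
rng = np.random.default_rng(1)
spectra = rng.random((120, 200))
y = spectra[:, ::25].sum(axis=1) + 0.01 * rng.normal(size=120)
X = snv(spectra)[:, ::25]                                   # 8 "selected" columns after SNV
model = ELMRegressor(n_hidden=30).fit(X, y)
print(np.corrcoef(model.predict(X), y)[0, 1])               # correlation on the training data
```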
232.
How to match doctors and patients into a reasonable and effective pairing of medical-service supply and demand, according to patients' differentiated needs, is an important research problem in healthcare operations management. Addressing the actual needs of doctors and patients in medical services, this paper proposes a matching decision method that takes patients' appointment behavior into account. In this method, patients are first classified according to their appointment behavior and characteristics; then, by computing the degree of difference between doctors and patients under different scenarios, satisfaction matrices for the two sides are obtained. On this basis, an E-HR algorithm is proposed to match booked patients with doctors, and a multi-objective optimization model is further constructed to match the remaining patients with doctors; solving the model yields the optimal matching result. Finally, a numerical example illustrates the feasibility and practicality of the proposed method.
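The E-HR algorithm itself is not spelled out in the abstract. As a rough illustration of capacity-constrained doctor–patient matching, the following is a minimal deferred-acceptance (hospital/residents style) sketch with made-up preference lists, satisfaction scores and capacities; it is not the authors' method.

```python
def match_patients(patient_pref, doctor_score, capacity):
    """Deferred-acceptance sketch: patients propose to doctors in their own order of
    preference; each doctor keeps its highest-scored patients up to its capacity."""
    assigned = {d: [] for d in capacity}
    next_choice = {p: 0 for p in patient_pref}
    free = list(patient_pref)
    while free:
        p = free.pop()
        prefs = patient_pref[p]
        if next_choice[p] >= len(prefs):
            continue                                  # patient exhausted the list, stays unmatched
        d = prefs[next_choice[p]]
        next_choice[p] += 1
        assigned[d].append(p)
        if len(assigned[d]) > capacity[d]:
            # doctor drops the patient it is least satisfied with
            worst = min(assigned[d], key=lambda q: doctor_score[d][q])
            assigned[d].remove(worst)
            free.append(worst)
    return assigned

# Toy data: two doctors with capacity 1, three patients.
patient_pref = {"p1": ["d1", "d2"], "p2": ["d1", "d2"], "p3": ["d2", "d1"]}
doctor_score = {"d1": {"p1": 0.9, "p2": 0.6, "p3": 0.4},
                "d2": {"p1": 0.5, "p2": 0.8, "p3": 0.7}}
print(match_patients(patient_pref, doctor_score, {"d1": 1, "d2": 1}))
```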
233.
234.
闫熙  马昌凤 《计算数学》2019,41(1):37-51
For the parameter iteration method for computing the unique solution of the matrix equation AXB + CXD = F, this paper derives the eigenvalue expression of the iteration matrix when A, B, C and D are all Hermitian positive (negative) definite, gives a method for determining the optimal parameter, and proposes a corresponding acceleration algorithm.
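As a loose illustration only, here is a simple relaxation-parameter (Richardson-type) iteration on the residual of AXB + CXD = F. This is not necessarily the paper's parameter iteration, and the conservative step size below is chosen from operator-norm bounds rather than by the paper's optimal-parameter formula.

```python
import numpy as np

def parameter_iteration(A, B, C, D, F, tau=None, tol=1e-10, max_iter=20000):
    """Fixed-point sketch X_{k+1} = X_k + tau * (F - A X_k B - C X_k D).
    Convergence depends on tau and on the spectra of the coefficient matrices."""
    if tau is None:
        # conservative step bounded by the norm of the operator X -> A X B + C X D
        tau = 1.0 / (np.linalg.norm(A, 2) * np.linalg.norm(B, 2)
                     + np.linalg.norm(C, 2) * np.linalg.norm(D, 2))
    X = np.zeros_like(F)
    for _ in range(max_iter):
        R = F - A @ X @ B - C @ X @ D        # residual of the matrix equation
        if np.linalg.norm(R) < tol:
            break
        X = X + tau * R
    return X

# Small example with Hermitian positive definite coefficients.
rng = np.random.default_rng(0)
M = rng.normal(size=(4, 4))
A = B = C = D = M @ M.T + 4 * np.eye(4)
X_true = rng.normal(size=(4, 4))
F = A @ X_true @ B + C @ X_true @ D
X = parameter_iteration(A, B, C, D, F)
print(np.linalg.norm(X - X_true))            # error of the recovered solution
```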
235.
L1 regularization and Lp regularization are used for reconstructing images recovered by compressed sensing (CS). L1 regularization can be solved as a convex optimization problem but yields less sparse solutions than Lp regularization (0 < p < 1); Lp regularization is sparser than L1 but harder to solve. This paper proposes a joint L1/Lp (0 < p < 1) regularization that combines the two and applies it to recover remote-sensing video based on CS. The joint regularization is sparser than L1 regularization yet as easy to solve. A linearized Bregman reweighted iteration algorithm is proposed to solve the joint L1/Lp regularization problem, and the performance of the linearized Bregman and linearized Bregman reweighted algorithms for this model is analyzed and compared through numerical simulations.
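A minimal sketch of the standard linearized Bregman iteration for the L1-constrained problem min ||u||_1 s.t. Au = b is given below. The reweighting step the paper adds for the joint L1/Lp term is omitted, and the measurement matrix, threshold mu and iteration count are illustrative assumptions.

```python
import numpy as np

def shrink(x, mu):
    """Soft thresholding (the L1 proximal step)."""
    return np.sign(x) * np.maximum(np.abs(x) - mu, 0.0)

def linearized_bregman(A, b, mu=10.0, delta=None, n_iter=5000):
    """Standard linearized Bregman iteration for min ||u||_1 subject to Au = b.
    (Without the usual 'kicking' acceleration, convergence can be slow.)"""
    m, n = A.shape
    if delta is None:
        delta = 1.0 / np.linalg.norm(A, 2) ** 2    # step size from the spectral norm
    u = np.zeros(n)
    v = np.zeros(n)
    for _ in range(n_iter):
        v += A.T @ (b - A @ u)                     # Bregman / gradient update
        u = delta * shrink(v, mu)                  # thresholding step
    return u

# Toy compressed-sensing example: recover a sparse vector from random measurements.
rng = np.random.default_rng(0)
n, m, k = 200, 80, 8
A = rng.normal(size=(m, n)) / np.sqrt(m)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(size=k)
b = A @ x
x_hat = linearized_bregman(A, b)
print(np.linalg.norm(x_hat - x) / np.linalg.norm(x))   # relative reconstruction error
```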
236.
The design of the mold and the choice of the injection parameters for metal injection molding (MIM) must maintain the homogeneity of the filled mixture. However, powder segregation is unavoidable in MIM because of the large difference in density between the metallic powder and the polymer binder. To predict the segregation effect, a biphasic model based on mixture theory is employed; the viscous behavior of each phase and the interaction coefficient between the flows of the two phases must be determined. Solving two coupled Navier–Stokes equations requires a tremendous computational effort. A previously developed explicit algorithm already makes the biphasic simulation much faster than classic methods, but it remains highly desirable to reduce or even eliminate the numerous global solutions for the pressure field at each time step. Hence, a new vectorial algorithm is proposed that performs the simulation using only vectorial operations. It provides the anticipated efficiency in biphasic simulation and retains the advantage of using classic equal-order interpolation elements. Results produced by the two algorithms are compared with experimental values to validate the new vectorial algorithm.
237.
Hydrohaloalkanes have attracted much attention as potential substitutes for chlorofluorocarbons (CFCs), which deplete the ozone layer and contribute strongly to global warming. A short atmospheric lifetime is essential for any substitute to be put into use, since substitutes may themselves deplete ozone or yield gases with high global warming potential. Quantitative structure–activity relationship (QSAR) studies of their lifetimes are presented, aided by quantum-chemistry parameters including net charges, Mulliken overlaps, E_HOMO and E_LUMO from density functional theory (DFT) at the B3PW91 level, and the C–H bond dissociation energy from AM1 calculations. The logistic mapping, a simple chaotic system, is exploited for its inherent ability to search the space of interest exhaustively. The chaotic-mapping-aided genetic algorithm artificial neural network training scheme (CGANN) performed better than conventional genetic-algorithm ANN training when the structure of the data set was unfavorable. The lifetimes of HFCs and HCs appear to depend strongly on the energies of their highest occupied molecular orbitals. Using RMSRE rather than RMSE as the objective function of ANN training gave better results for the samples of interest with relatively short lifetimes. C2H6 and C3H8, as potential green substitutes for CFCs, have relatively short lifetimes.
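As a small illustration of the chaotic-mapping idea, the sketch below uses the logistic map to seed a GA population over a search box. The map parameters, population size and bounds are assumptions, and the full GA-ANN training scheme is not reproduced here.

```python
import numpy as np

def logistic_map_sequence(n, x0=0.345, r=4.0):
    """Iterate the logistic map x_{k+1} = r * x_k * (1 - x_k); with r = 4 the orbit
    is chaotic and wanders over (0, 1), which is what makes it useful for search."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

def chaotic_initial_population(pop_size, n_genes, lower, upper, x0=0.345):
    """Map a chaotic sequence onto the search box to seed a GA population
    (illustrative seeding step only, not the complete training scheme)."""
    seq = logistic_map_sequence(pop_size * n_genes, x0=x0).reshape(pop_size, n_genes)
    return lower + seq * (upper - lower)

pop = chaotic_initial_population(pop_size=20, n_genes=10, lower=-1.0, upper=1.0)
print(pop.shape, pop.min(), pop.max())
```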
238.
In recent years, hierarchical model-based clustering has provided promising results in a variety of applications. However, its use with large datasets has been hindered by a time and memory complexity that are at least quadratic in the number of observations. To overcome this difficulty, this article proposes to start the hierarchical agglomeration from an efficient classification of the data in many classes rather than from the usual set of singleton clusters. This initial partition is derived from a subgraph of the minimum spanning tree associated with the data. To this end, we develop graphical tools that assess the presence of clusters in the data and uncover observations difficult to classify. We use this approach to analyze two large, real datasets: a multiband MRI image of the human brain and data on global precipitation climatology. We use the real datasets to discuss ways of integrating the spatial information in the clustering analysis. We focus on two-stage methods, in which a second stage of processing using established methods is applied to the output from the algorithm presented in this article, viewed as a first stage.
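A rough sketch of the idea of starting agglomeration from an MST-derived partition is given below: cut the longest edges of the minimum spanning tree to form many small initial classes. This is an assumption-laden illustration, not the article's exact procedure or its graphical diagnostics.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

def mst_initial_partition(X, n_init_clusters):
    """Build the minimum spanning tree of the data, drop its longest edges, and
    return the connected components as an initial partition with roughly
    n_init_clusters classes."""
    D = squareform(pdist(X))                              # dense pairwise distances
    mst = minimum_spanning_tree(csr_matrix(D)).tocoo()
    order = np.argsort(mst.data)[::-1]                    # edges from longest to shortest
    keep = np.ones(len(mst.data), dtype=bool)
    keep[order[: n_init_clusters - 1]] = False            # drop the k-1 longest edges
    pruned = csr_matrix((mst.data[keep], (mst.row[keep], mst.col[keep])), shape=mst.shape)
    _, labels = connected_components(pruned, directed=False)
    return labels

# Toy data: three Gaussian blobs, asking for about 10 initial classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in (0.0, 3.0, 6.0)])
labels = mst_initial_partition(X, n_init_clusters=10)
print(np.bincount(labels))                                # sizes of the initial classes
```

A second-stage method (e.g., model-based agglomeration of these classes) would then operate on this much smaller set of initial clusters instead of on singletons.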
239.
We consider a discrete latent variable model for two-way data arrays, which allows one to simultaneously produce clusters along one of the data dimensions (e.g., exchangeable observational units or features) and contiguous groups, or segments, along the other (e.g., consecutively ordered times or locations). The model relies on a hidden Markov structure but, given its complexity, cannot be estimated by full maximum likelihood. Therefore, we introduce a composite likelihood methodology based on considering different subsets of the data. The proposed approach is illustrated by simulation, and with an application to genomic data.
240.
An improved watermarking method is proposed, based on the double random phase encoding technique together with a cascaded-phases iterative algorithm and a random-phase-shift algorithm. The method significantly reduces the watermarking information that must be stored for different multimedia products and provides the copyright owner with a reasonable criterion for determining the authenticity of a product. It can also be applied to track the source of copies. Its effectiveness was verified through numerical simulations.
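For reference, a minimal sketch of basic double random phase encoding with NumPy FFTs follows. The cascaded-phases iterative and random-phase-shift components of the proposed method are not reproduced, and the image and phase keys are synthetic.

```python
import numpy as np

def drpe_encode(image, key1, key2):
    """Double random phase encoding: multiply by a random phase in the spatial
    domain, transform, multiply by a second random phase in the Fourier domain,
    and transform back."""
    phase1 = np.exp(2j * np.pi * key1)
    phase2 = np.exp(2j * np.pi * key2)
    return np.fft.ifft2(np.fft.fft2(image * phase1) * phase2)

def drpe_decode(cipher, key1, key2):
    """Invert the two phase masks to recover the original image."""
    phase2_conj = np.exp(-2j * np.pi * key2)
    phase1_conj = np.exp(-2j * np.pi * key1)
    return np.fft.ifft2(np.fft.fft2(cipher) * phase2_conj) * phase1_conj

rng = np.random.default_rng(0)
img = rng.random((64, 64))                    # stand-in for a watermark image
k1, k2 = rng.random((64, 64)), rng.random((64, 64))
cipher = drpe_encode(img, k1, k2)
recovered = drpe_decode(cipher, k1, k2).real
print(np.allclose(recovered, img))            # exact recovery when both keys are known
```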