21.
This paper describes the k-means range algorithm, a combination of the partitional k-means clustering algorithm with a well-known spatial data structure, the range tree, which supports fast range searches. It offers a real-time solution for building distributed interactive decision aids in e-commerce, since it allows consumers to model their preferences along multiple dimensions, search for product information, and then cluster the retrieved products to support their purchase decisions. The paper also discusses the implications and advantages of this approach for the development of on-line shopping environments and consumer decision aids in traditional and mobile e-commerce applications.
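The abstract gives no code; the following is a minimal Python sketch of the "range search, then cluster" idea under stated assumptions. A brute-force box query stands in for the range tree, scikit-learn's KMeans performs the clustering, and the product catalogue, column meanings, and parameters are entirely hypothetical.

```python
# Minimal sketch of the "range search, then cluster" idea (illustrative only).
# A real range tree answers queries in roughly O(log^d n + k); a boolean mask stands in here.
import numpy as np
from sklearn.cluster import KMeans

def range_query(products: np.ndarray, lo: np.ndarray, hi: np.ndarray) -> np.ndarray:
    """Return the rows of `products` falling inside the axis-aligned box [lo, hi]."""
    mask = np.all((products >= lo) & (products <= hi), axis=1)
    return products[mask]

def kmeans_range(products, lo, hi, n_clusters=3):
    """Filter products by the consumer's preference ranges, then cluster the matches."""
    hits = range_query(products, np.asarray(lo), np.asarray(hi))
    if len(hits) < n_clusters:          # not enough matches to form the requested clusters
        return hits, None
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(hits)
    return hits, labels

# Hypothetical catalogue: columns = (price, rating, delivery days).
rng = np.random.default_rng(0)
catalogue = rng.uniform([10, 1, 1], [500, 5, 14], size=(1000, 3))
hits, labels = kmeans_range(catalogue, lo=[50, 3, 1], hi=[200, 5, 7], n_clusters=3)
print(len(hits), "matching products,", 0 if labels is None else labels.max() + 1, "clusters")
```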
22.
There is general interest in ranking schemes applied to complex entities described by multiple attributes. Published rankings of universities are in great demand but are also highly controversial. We compare two classification and ranking schemes for universities: one from the published report 'Top American Research Universities' by the University of Florida's TheCenter, and the other using data envelopment analysis (DEA). Both approaches use the same data and model. We compare the two methods and discover important equivalences. We conclude that the critical aspect in classification and ranking is the model, which suggests that DEA is a suitable tool for these types of studies.
23.
The paper is concerned with the problem of binary classification of data records, given an already classified training set of records. Among the various approaches to the problem, the methodology of the logical analysis of data (LAD) is considered. This approach is based on discrete mathematics, with special emphasis on Boolean functions. Enhancements of the standard LAD procedure based on probability considerations are presented. In particular, the selection of the optimal support set is formulated as a weighted set covering problem, and testable statistical hypotheses are used. The accuracy of the modified LAD procedure is compared to that of the standard LAD procedure on datasets from the UCI repository, and encouraging results are obtained and discussed.
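The support-set selection step mentioned above reduces to a weighted set covering problem. Below is a minimal greedy sketch in Python, not the paper's exact formulation: each candidate attribute is treated as a "set" covering the record pairs it can separate, attribute costs are arbitrary weights, and the instance data are made up for illustration.

```python
# Greedy weighted set covering (illustrative): pick low-cost attributes until every element
# (e.g., every positive/negative record pair that must be separated) is covered.
def greedy_weighted_set_cover(universe, sets, weights):
    """universe: set of elements; sets: dict name -> covered elements; weights: dict name -> cost."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        # Choose the set with the smallest cost per newly covered element.
        best = min(
            (name for name in sets if sets[name] & uncovered),
            key=lambda name: weights[name] / len(sets[name] & uncovered),
            default=None,
        )
        if best is None:                 # remaining elements cannot be covered at all
            raise ValueError("instance is infeasible")
        chosen.append(best)
        uncovered -= sets[best]
    return chosen

# Hypothetical instance: pairs {1..6} to separate, attributes a..d with costs.
pairs = {1, 2, 3, 4, 5, 6}
covers = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6}, "d": {1, 6}}
costs = {"a": 1.0, "b": 0.5, "c": 1.2, "d": 0.7}
print(greedy_weighted_set_cover(pairs, covers, costs))   # prints the greedily chosen attributes
```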
24.
Managerial efficiency in performing-arts programming can be understood as the technical efficiency with which cultural managers transform the resources available to them into a given cultural output. Different conceptions of the finished performance product lead us to select two different output variables (number of performances and number of attendances), and three different models are considered to reflect these conceptual points of view. Data on the Circuït Teatral Valencià, a Spanish regional theatre network, are used to develop the concept of managerial efficiency empirically and to set up a framework that allows us to monitor it.
25.
Orthogonal WAVElet correction (OWAVEC) is a pre-processing method aimed at simultaneously meeting two essential needs in multivariate calibration, signal correction and data compression, by combining an orthogonal signal correction algorithm, which removes information unrelated to a given response, with the great potential that wavelet analysis has shown for signal processing. In the previous version of OWAVEC, once the wavelet coefficient matrix had been computed from NIR spectra and deflated of irrelevant information in the orthogonalization step, data compression was achieved by selecting the wavelet coefficients with the largest correlation/variance, which then served as the basis for a reliable regression model. This paper presents an evolution of OWAVEC that keeps the first two stages of the procedure (wavelet signal decomposition and direct orthogonalization) intact but incorporates genetic algorithms as the wavelet coefficient selection method, both to perform data compression and to improve the quality of the regression models developed later. Several applications dealing with diverse NIR regression problems are analyzed to evaluate the performance of the new OWAVEC method, and its results are compared with those obtained from the original data and with other orthogonal signal correction methods.
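As a rough illustration of the decomposition and coefficient-selection stages only, here is a hedged Python sketch using PyWavelets. It omits the orthogonalization and genetic-algorithm steps and instead ranks coefficients by absolute correlation with the response, as in the earlier OWAVEC variant; the wavelet, level, data, and parameter choices are assumptions, not the paper's settings.

```python
# Sketch of wavelet decomposition plus correlation-based coefficient selection (illustrative only).
import numpy as np
import pywt

def wavelet_coefficient_matrix(spectra: np.ndarray, wavelet: str = "db4", level: int = 4) -> np.ndarray:
    """Decompose each spectrum (row) and flatten its wavelet coefficients into one row."""
    rows = [np.concatenate(pywt.wavedec(s, wavelet, level=level)) for s in spectra]
    return np.vstack(rows)

def select_by_correlation(W: np.ndarray, y: np.ndarray, n_keep: int = 50) -> np.ndarray:
    """Indices of the n_keep coefficients most correlated (in absolute value) with y."""
    y_c = y - y.mean()
    W_c = W - W.mean(axis=0)
    denom = np.linalg.norm(W_c, axis=0) * np.linalg.norm(y_c) + 1e-12
    corr = np.abs(W_c.T @ y_c) / denom
    return np.argsort(corr)[::-1][:n_keep]

# Hypothetical NIR-like data: 100 spectra of 256 points, synthetic response.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 256))
y = 2.0 * X[:, 40] + rng.normal(scale=0.1, size=100)
W = wavelet_coefficient_matrix(X)
keep = select_by_correlation(W, y, n_keep=20)
coef, *_ = np.linalg.lstsq(W[:, keep], y, rcond=None)   # simple regression on the kept coefficients
print("selected", len(keep), "coefficients; regression uses", coef.shape[0], "terms")
```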
26.
Non-negative matrix factorization (NMF) is a technique for dimensionality reduction that places non-negativity constraints on the matrix. Based on the PARAFAC model, NMF is extended here to three-dimensional data decomposition. The three-dimensional non-negative matrix factorization (NMF3) algorithm, which is concise and easy to implement, is presented in this paper. The NMF3 implementation operates on individual elements rather than on vectors, so it can decompose a data array directly without unfolding, unlike traditional algorithms. It has been applied to the decomposition of simulated data arrays and has produced reasonable results, showing that NMF3 can be used for curve resolution in chemometrics.
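The abstract gives no formulas, so the following is a generic non-negative three-way PARAFAC (CP) sketch with multiplicative updates, written with einsum so that no explicit unfolding is built. It illustrates the kind of decomposition NMF3 performs, but it is not the paper's element-wise algorithm; the rank, iteration count, and test data are assumptions.

```python
# Generic non-negative PARAFAC (CP) sketch: X[i,j,k] ~ sum_r A[i,r] * B[j,r] * C[k,r].
# Multiplicative updates keep all factors non-negative; einsum avoids explicit unfolding.
import numpy as np

def nmf3(X: np.ndarray, rank: int, n_iter: int = 200, eps: float = 1e-9, seed: int = 0):
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A, B, C = (rng.random((d, rank)) for d in (I, J, K))
    for _ in range(n_iter):
        A *= np.einsum("ijk,jr,kr->ir", X, B, C) / (A @ ((B.T @ B) * (C.T @ C)) + eps)
        B *= np.einsum("ijk,ir,kr->jr", X, A, C) / (B @ ((A.T @ A) * (C.T @ C)) + eps)
        C *= np.einsum("ijk,ir,jr->kr", X, A, B) / (C @ ((A.T @ A) * (B.T @ B)) + eps)
    return A, B, C

# Hypothetical check: rebuild a small rank-2 non-negative array and measure the fit.
rng = np.random.default_rng(1)
A0, B0, C0 = rng.random((6, 2)), rng.random((5, 2)), rng.random((4, 2))
X = np.einsum("ir,jr,kr->ijk", A0, B0, C0)
A, B, C = nmf3(X, rank=2, n_iter=500)
X_hat = np.einsum("ir,jr,kr->ijk", A, B, C)
print("relative reconstruction error:", np.linalg.norm(X - X_hat) / np.linalg.norm(X))
```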
27.
Formylation is one of the more recently discovered post-translational modifications of lysine residues and is implicated in various diseases. In this work, a novel predictor named predForm-Site has been developed to predict formylation sites with higher accuracy. Multiple sequence features are integrated to build a more informative representation of formylation sites, and the decision function of the underlying classifier is optimized on the skewed formylation dataset during model training to improve prediction quality. On the dataset used by the LFPred and Formator predictors, predForm-Site achieved 99.5% sensitivity, 99.8% specificity and 99.8% overall accuracy, with an AUC of 0.999 in the jackknife test. In the independent test, it also achieved more than 97% sensitivity and 99% specificity. In benchmarking against the recent method CKSAAP_FormSite, the proposed predictor significantly outperformed it on all measures, improving sensitivity by around 20%, specificity by nearly 30% and overall accuracy by more than 22%. These experimental results show that predForm-Site can be used as a complementary tool for the fast exploration of formylation sites. For the convenience of the scientific community, predForm-Site has been deployed as an online tool, accessible at http://103.99.176.239:8080/predForm-Site.
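As a small, generic illustration of the "optimize the decision function on a skewed dataset" idea, the sketch below fits a classifier on synthetic imbalanced data and then picks the probability threshold that maximizes a validation F1 score rather than defaulting to 0.5. It is not predForm-Site's feature set, model, or data.

```python
# Illustrative decision-threshold tuning on an imbalanced binary problem (not predForm-Site itself).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=30, weights=[0.95, 0.05], random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, stratify=y, random_state=0)

clf = LogisticRegression(max_iter=1000, class_weight="balanced").fit(X_tr, y_tr)
probs = clf.predict_proba(X_va)[:, 1]

# Scan candidate thresholds and keep the one with the best validation F1.
thresholds = np.linspace(0.05, 0.95, 91)
scores = [f1_score(y_va, probs >= t) for t in thresholds]
best_t = thresholds[int(np.argmax(scores))]
print(f"best threshold {best_t:.2f}, F1 {max(scores):.3f} "
      f"(default 0.5 gives {f1_score(y_va, probs >= 0.5):.3f})")
```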
28.
This paper develops a method for analyzing time-of-flight spectra of secondary dissociation products in molecular-beam photodissociation experiments, improving on the original treatment by Kroger and Riley. It shows that much important information can be extracted from highly averaged experimental data, including the average translational energy distributions of the secondary dissociation products, the spatial anisotropy parameters, and the branching ratios between competing parallel channels. The simulated results reproduce the main features of the secondary dissociation reactions.
29.
The performances of several numerical methods for improving the signal-to-noise ratio are compared and applied to enhance noisy signals obtained in gas chromatography with capillary columns and a flame ionization detector. The methods considered are: cutoffs in the Fourier transform of the recorded signal; real-time numerical filtering; fitting of a theoretical model curve; and correlation of a chromatogram recorded from a pseudorandomly injected sample with the pseudorandom injection function. Real-time numerical filtering is shown to be the most convenient method once the main periodic component of the noise has been determined by Fourier analysis.
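A minimal NumPy sketch of two of the approaches listed above, a hard cutoff in the Fourier transform and a simple moving-average filter, applied to a synthetic noisy peak. The signal, noise level, cutoff fraction, and window width are made up for illustration and do not reproduce the paper's chromatograms.

```python
# Two simple smoothing approaches from the list above, applied to a synthetic noisy peak.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 1024)
clean = np.exp(-((t - 5.0) ** 2) / 0.08)              # Gaussian "chromatographic" peak
noisy = clean + 0.1 * rng.normal(size=t.size)

def fourier_lowpass(signal, keep_fraction=0.05):
    """Zero all Fourier components above a chosen fraction of the spectrum."""
    spec = np.fft.rfft(signal)
    cutoff = int(len(spec) * keep_fraction)
    spec[cutoff:] = 0.0
    return np.fft.irfft(spec, n=len(signal))

def moving_average(signal, width=15):
    """Symmetric moving average; a real-time version would use only past samples."""
    kernel = np.ones(width) / width
    return np.convolve(signal, kernel, mode="same")

for name, filtered in [("fourier cutoff", fourier_lowpass(noisy)), ("moving average", moving_average(noisy))]:
    rmse = np.sqrt(np.mean((filtered - clean) ** 2))
    print(f"{name}: RMSE vs clean signal = {rmse:.4f}")
```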
30.
Using statistically designed experiments, 12,500 observations are generated from a 4-piece Cobb-Douglas function exhibiting increasing and decreasing returns to scale in its different pieces. The performances of DEA and frontier regressions, represented by COLS (Corrected Ordinary Least Squares), are compared at sample sizes of n = 50, 100, 150 and 200. Statistical consistency is exhibited, with performances improving as sample sizes increase. Both DEA and COLS generally give good results at all sample sizes. In evaluating efficiency, DEA generally shows superior performance, with BCC models being best (except at corner points), followed by the CCR model and then by COLS, with log-linear regressions performing better than their translog counterparts at almost all sample sizes. Because of the need to consider locally varying behavior, only the CCR and translog models are used for returns to scale, with CCR being the better performer. An additional set of 7,500 observations was generated under conditions that made it possible to compare efficiency evaluations in the presence of collinearity and with model misspecification in the form of added and omitted variables. The results were similar to those of the larger experiment: the BCC model is the best performer. However, COLS exhibited surprisingly good performance, which suggests that COLS may have previously unidentified robustness properties, while the CCR model is the poorest performer when one of the variables used to generate the observations is omitted.
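For readers unfamiliar with the CCR model referred to above, here is a hedged sketch of the input-oriented CCR efficiency score as a linear program solved with scipy.optimize.linprog on made-up data; the BCC variant would add a convexity constraint (the lambdas summing to one). The input/output table and function names are illustrative assumptions, not the experiment's design.

```python
# Input-oriented CCR (constant returns to scale) efficiency by linear programming.
# Illustrative only: tiny made-up data, one LP per decision-making unit (DMU).
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs: np.ndarray, outputs: np.ndarray, unit: int) -> float:
    """inputs: (m, n) matrix, outputs: (s, n) matrix, n = number of DMUs; returns theta for `unit`."""
    m, n = inputs.shape
    s = outputs.shape[0]
    # Decision variables: [theta, lambda_1 ... lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # Inputs:  sum_j lambda_j * x_ij - theta * x_i,unit <= 0
    A_in = np.hstack([-inputs[:, [unit]], inputs])
    # Outputs: -sum_j lambda_j * y_rj <= -y_r,unit  (i.e., outputs at least match the unit's)
    A_out = np.hstack([np.zeros((s, 1)), -outputs])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -outputs[:, unit]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

# Made-up data: 2 inputs, 1 identical output, 5 DMUs.
X = np.array([[2.0, 4.0, 3.0, 5.0, 6.0],
              [3.0, 2.0, 5.0, 4.0, 2.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0, 1.0]])
for j in range(X.shape[1]):
    print(f"DMU {j}: CCR efficiency = {ccr_efficiency(X, Y, j):.3f}")
```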