151.
152.
A system-on-chip (SOC) usually consists of many memory cores with different sizes and functionality; they typically represent a significant portion of the SOC and therefore dominate its yield. Diagnosis for yield enhancement of the memory cores is thus a very important issue. In this paper we present two data compression techniques that can be used to speed up the transmission of diagnostic data from an embedded RAM built-in self-test (BIST) circuit with diagnostic support to the external tester. The proposed syndrome-accumulation approach compresses the faulty-cell address and March syndrome to about 28% of the original size on average under the March-17N diagnostic test algorithm. The key component of the compressor is a novel syndrome-accumulation circuit, which can be realized by a content-addressable memory. Experimental results show that the area overhead is about 0.9% for a 1 Mb SRAM with 164 faults. A tree-based compression technique for word-oriented memories is also presented. By using a simplified Huffman coding scheme and partitioning each 256-bit Hamming syndrome into fixed-size symbols, the average compression ratio (size of original data to that of compressed data) is about 10, assuming 16-bit symbols. The additional hardware needed to implement the tree-based compressor is also very small. The proposed compression techniques effectively reduce the memory diagnosis time as well as the tester storage requirement.
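The tree-based scheme can be illustrated with a rough software sketch of the idea (this is only an approximation for intuition, not the authors' hardware compressor; the symbol stream below is invented): partition the Hamming syndrome bit-stream into 16-bit symbols, build a Huffman code from the symbol frequencies, and measure the resulting compression ratio.

```python
import heapq
from collections import Counter

def huffman_code_lengths(freqs):
    """Return the Huffman code length for each symbol given its frequency."""
    # Heap entries: (subtree frequency, tie-breaker, {symbol: depth}).
    heap = [(f, i, {sym: 0}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate single-symbol alphabet
        return {sym: 1 for sym in freqs}
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        # Merging two subtrees pushes every leaf one level deeper.
        merged = {s: d + 1 for s, d in {**left, **right}.items()}
        tie += 1
        heapq.heappush(heap, (f1 + f2, tie, merged))
    return heap[0][2]

def compression_ratio(symbols, symbol_bits=16):
    """Original size over Huffman-coded size for a list of symbols."""
    freqs = Counter(symbols)
    lengths = huffman_code_lengths(freqs)
    coded_bits = sum(lengths[s] * n for s, n in freqs.items())
    return (symbol_bits * len(symbols)) / coded_bits
```

Because fault-free memory words produce all-zero syndrome symbols, the zero symbol dominates the frequency table and receives a very short code, which is the intuition behind large average ratios on real diagnostic data.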
153.
Statisticians are accustomed to processing numerical, ordinal or nominal data. In many circumstances, such as socio-economic and epidemiologic sample surveys and documentary databases, such data are juxtaposed with textual data (for example, responses to open questions in surveys). This article presents a series of language-independent procedures based upon applying multivariate techniques (such as correspondence analysis and clustering) to sets of generalized lexical profiles. The generalized lexical profile of a text is a vector whose components are the frequencies of each word (graphical form) or ‘repeated segment’ (sequence of words appearing with a significant frequency in the text). The processing of such large (and often sparse) vectors and matrices requires special algorithms. The main outputs are the following: (1) printouts of the characteristic words and characteristic responses for each category of respondent (these categories are generally derived from available nominal variables); (2) graphical displays of the proximities between words or segments and categories of respondents; (3) when analysing a combination of several texts, graphical displays of proximities between words or segments and each text, or between words or segments and groupings of texts. The systematic use of ‘repeated segments’ provides valuable help in interpreting the results from a semantic point of view.
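The generalized lexical profile can be sketched in a few lines (a minimal illustration, not the article's software; the tokenizer and example texts are invented): a sparse word-frequency vector per text, with a proximity measure between profiles of the kind that feeds correspondence analysis or clustering.

```python
import math
import re
from collections import Counter

def lexical_profile(text):
    """Frequency vector of graphical forms (words) in a text."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine_proximity(p, q):
    """Proximity between two sparse lexical profiles."""
    dot = sum(p[w] * q[w] for w in p.keys() & q.keys())
    norm = math.sqrt(sum(v * v for v in p.values())) * \
           math.sqrt(sum(v * v for v in q.values()))
    return dot / norm if norm else 0.0
```

Counting ‘repeated segments’ would extend this by tallying word n-grams whose frequency exceeds a threshold; the resulting sparse vectors are what the special algorithms mentioned above must handle at scale.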
154.
155.
Stefanie M. Walz, Thomas E. Malner, Ulrich Mueller, Rolf Muelhaupt. Journal of Polymer Science Part B: Polymer Physics, 2003, 41(4): 360–367
We explored phase separation and self‐assembly of perfluoroalkyl segments at the surface of polymer films obtained from latices of semifluorinated acrylate copolymers and the corresponding latex blends of nonfluorinated and semifluorinated polyacrylates. With laser‐induced secondary mass spectrometry the fluorine distribution was measured after annealing above the minimum film‐forming temperature of the polymers up to a depth of several micrometers. Depth profiles of a semifluorinated acrylate homopolymer and latex blends thereof with fluorine‐free alkylacrylates with 25, 50, and 75 mol % semifluorinated acrylate as well as a copolymer comprised of alkyl acrylate and semifluorinated acrylate (50/50 mol %) were investigated. In the case of latex blends containing both semifluorinated polyacrylates and fluorine‐free or low‐fluorine polymers, self‐assembly accounted for enrichment of the perfluoroalkyl segments at the surface. Coatings exhibiting low surface energy and having a substantially reduced total fluorine content were obtained. © 2003 Wiley Periodicals, Inc. J Polym Sci Part B: Polym Phys 41: 360–367, 2003
156.
Validation and verification of social processes within agent-based computational organization models (total citations: 1; self-citations: 0; citations by others: 1)
Levent Yilmaz. Computational & Mathematical Organization Theory, 2006, 12(4): 283–312
The use of simulation modeling in computational analysis of organizations is becoming a prominent approach in social science research. However, relying on simulations to gain intuition about social phenomena has significant implications. While simulations may give rise to interesting macro-level phenomena, and sometimes even mimic empirical data, the underlying micro- and macro-level processes may be far from realistic. Yet, this realism may be important to infer results that are relevant to existing theories of social systems and to policy making. Therefore, it is important to assess not only predictive capability but also explanation accuracy of formal models in terms of the degree of realism reflected by the embedded processes. This paper presents a process-centric perspective for the validation and verification (V&V) of agent-based computational organization models. Following an overview of the role of V&V within the life cycle of a simulation study, emergent issues in agent-based organization model V&V are outlined. The notion of social contract, which facilitates capturing micro-level processes among agents, is introduced to enable reasoning about the integrity and consistency of agent-based organization designs. Social contracts are shown to enable modular compositional verification of interaction dynamics among peer agents. Two types of consistency are introduced: horizontal and vertical consistency. It is argued that such local consistency analysis is necessary, but insufficient to validate emergent macro processes within multi-agent organizations. As such, new formal validation metrics are introduced to substantiate the operational validity of emergent macro-level behavior.

Levent Yilmaz is Assistant Professor of Computer Science and Engineering in the College of Engineering at Auburn University and co-founder of the Auburn Modeling and Simulation Laboratory of the M&SNet. Dr. Yilmaz received his Ph.D. and M.S. degrees from Virginia Polytechnic Institute and State University (Virginia Tech). His research interests are in advancing the theory and methodology of simulation modeling, agent-directed simulation (to explore dynamics of socio-technical systems, organizations, and human/team behavior), and education in simulation modeling. Dr. Yilmaz is a member of ACM, IEEE Computer Society, Society for Computer Simulation International, and Upsilon Pi Epsilon. URL: http://www.eng.auburn.edu/~yilmaz
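The social-contract idea can be caricatured in a few lines. This is a hypothetical encoding (the class, roles, and performatives below are invented, not taken from the paper), assuming a contract is an expected sequence of (agent role, performative) steps; checking an observed interaction trace against it is a minimal, modular form of the horizontal consistency mentioned above.

```python
class SocialContract:
    """Expected interaction protocol between peer agents."""

    def __init__(self, steps):
        # steps: ordered (agent_role, performative) pairs, e.g.
        # [("buyer", "request"), ("seller", "offer"), ("buyer", "accept")]
        self.steps = list(steps)

    def horizontally_consistent(self, trace):
        """True if the observed trace follows the contracted sequence."""
        return list(trace) == self.steps
```

Because each contract constrains only the peers it binds, contracts can be checked independently and composed, which is the point of modular compositional verification; validating emergent macro-level behavior requires the additional metrics the paper proposes.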
157.
There is a general interest in ranking schemes applied to complex entities described by multiple attributes. Published rankings for universities are in great demand but are also highly controversial. We compare two classification and ranking schemes involving universities: one from a published report, ‘Top American Research Universities’ by the University of Florida's TheCenter, and the other using data envelopment analysis (DEA). Both approaches use the same data and model. We compare the two methods and discover important equivalences. We conclude that the critical aspect in classification and ranking is the model. This suggests that DEA is a suitable tool for these types of studies.
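In the degenerate single-input, single-output case, the DEA efficiency score reduces to each unit's output/input ratio normalized by the best observed ratio, so frontier units score exactly 1.0. The sketch below shows only this special case (real DEA models solve a linear program per unit, and the numbers in the test are invented):

```python
def dea_efficiency_single(inputs, outputs):
    """CCR-style efficiency for units with one input and one output.

    Each unit's productivity ratio is scaled so that the best unit
    (the efficient frontier) scores exactly 1.0.
    """
    ratios = [o / i for i, o in zip(inputs, outputs)]
    frontier = max(ratios)
    return [r / frontier for r in ratios]
```

For universities one might take, say, research expenditure as the input and publication counts as the output; with multiple attributes, as in the study above, the normalization becomes a per-unit linear program rather than a simple division.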
158.
Renato Bruni. Annals of Operations Research, 2007, 150(1): 79–92
The paper is concerned with the problem of binary classification of data records, given an already classified training set of records. Among the various approaches to the problem, the methodology of the logical analysis of data (LAD) is considered. This approach is based on discrete mathematics, with special emphasis on Boolean functions. With respect to the standard LAD procedure, enhancements based on probability considerations are presented. In particular, the problem of the selection of the optimal support set is formulated as a weighted set covering problem. Testable statistical hypotheses are used. Accuracy of the modified LAD procedure is compared to that of the standard LAD procedure on datasets from the UCI repository. Encouraging results are obtained and discussed.
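Since support-set selection is cast as weighted set covering, a standard greedy heuristic illustrates the optimization flavor (a generic sketch, not the paper's exact formulation or weights): repeatedly take the subset with the smallest weight per newly covered element.

```python
def greedy_weighted_set_cover(universe, subsets, weights):
    """Pick subset indices covering `universe` at (approximately) low cost.

    Classic greedy heuristic with a ln(n) approximation guarantee:
    at each step choose the subset minimizing weight / new coverage.
    """
    uncovered = set(universe)
    chosen = []
    while uncovered:
        candidates = [i for i, s in enumerate(subsets) if s & uncovered]
        if not candidates:
            raise ValueError("universe cannot be covered by the given subsets")
        best = min(candidates,
                   key=lambda i: weights[i] / len(subsets[i] & uncovered))
        chosen.append(best)
        uncovered -= subsets[best]
    return chosen
```

In a LAD-style setting one would think of the universe as the record pairs to be separated and of each candidate attribute as the subset of pairs it separates, with weights reflecting the probability considerations the paper introduces.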
159.
Mathematical Diagnostics (MD) deals with identification problems arising in different practical areas. Some of these problems can be described by mathematical models in which it is required to identify points belonging to two or more sets of points. Most of the existing tools provide some identification rule (a classifier) by means of which a given point is assigned (attributed) to one of the given sets. Each classifier can be viewed as a virtual expert. If there exist several classifiers (experts), the problem of evaluating the experts' conclusions arises. In the paper, the method of virtual experts (the VE method) is described for the case of supervised classification. Based on this method, a generalized VE method is proposed in which each of the classifiers can be chosen from a given family of classifiers. As a result, a new optimization problem with a discontinuous functional is stated. Examples illustrating the proposed approach are provided.

The work of the second author was supported by the Russian Foundation for Fundamental Studies (RFFI) under Grant No. 03-01-00668.
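A minimal sketch of combining virtual experts (assuming each expert is simply a function from a point to a set label; majority voting is one simple evaluation rule for the experts' conclusions, not the paper's optimization formulation with a discontinuous functional):

```python
def virtual_expert_decision(experts, point):
    """Assign `point` to a set by majority vote of the virtual experts.

    Ties are broken arbitrarily; a real scheme would weight experts
    by their estimated reliability.
    """
    votes = [expert(point) for expert in experts]
    return max(set(votes), key=votes.count)
```

In the generalized setting each expert would itself be selected from a family of classifiers, turning the choice of experts into the optimization problem described above.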
160.
Optical storage experiments were carried out with femtosecond laser pulses (200 fs, 1 kHz, 800 nm) in poly(methyl methacrylate) (PMMA) films doped with the rare-earth ion Ce3+, including measurement and discussion of the samples' absorption spectra and of the electron spin resonance (ESR) spectra before and after laser irradiation. The results show that Ce3+-doped PMMA films have a relatively low writing threshold, which favors high-speed, parallel three-dimensional optical storage. The stored data were read out in parallel with a conventional optical microscope. Four-layer storage results are presented (bit spacing 4 μm, layer spacing 16 μm), and the influence of pulse energy on void size is discussed: for high-density storage, the writing pulse energy should be kept as small as possible while keeping the gray value of the readout signal sufficiently large. The results indicate that this material is applicable to three-dimensional optical information storage.