Full-text access type
Paid full text | 527 articles |
Free | 23 articles |
Free (domestic) | 3 articles |
Subject category
Chemistry | 56 articles |
Mechanics | 2 articles |
General | 1 article |
Mathematics | 254 articles |
Physics | 45 articles |
Radio and electronics | 195 articles |
Publication year
2024 | 1 article |
2023 | 39 articles |
2022 | 19 articles |
2021 | 24 articles |
2020 | 33 articles |
2019 | 12 articles |
2018 | 21 articles |
2017 | 18 articles |
2016 | 20 articles |
2015 | 11 articles |
2014 | 38 articles |
2013 | 50 articles |
2012 | 29 articles |
2011 | 30 articles |
2010 | 23 articles |
2009 | 26 articles |
2008 | 22 articles |
2007 | 25 articles |
2006 | 13 articles |
2005 | 15 articles |
2004 | 9 articles |
2003 | 2 articles |
2002 | 9 articles |
2001 | 3 articles |
2000 | 4 articles |
1999 | 11 articles |
1998 | 5 articles |
1997 | 2 articles |
1996 | 13 articles |
1995 | 8 articles |
1994 | 4 articles |
1993 | 4 articles |
1992 | 5 articles |
1990 | 1 article |
1986 | 1 article |
1985 | 1 article |
1980 | 1 article |
1971 | 1 article |
Sort order: 553 results found (search time: 657 ms)
81.
This paper analyzes the contamination particles generated by the base materials, equipment, and process technologies used in integrated-circuit manufacturing, and the effect of the resulting random defects on IC product yield. It focuses on a yield-improvement model, derived from the classical learning-curve concept, that has been used to track gains in product yield; the model rests on identifying and rapidly resolving, in order of priority, the problems that most affect yield. Comparison of actual yields with modeled yields confirms the practical value of the yield-improvement model.
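The abstract does not reproduce the model's functional form. As an illustrative sketch only, two standard defect-limited yield models (the function names, parameter values, and the choice of Poisson and negative-binomial forms are my own assumptions, not the paper's) show how falling defect density translates into rising yield:

```python
import math

def poisson_yield(defect_density, area):
    """Poisson yield model: Y = exp(-A * D), valid when random
    defects fall uniformly and independently over the die area."""
    return math.exp(-area * defect_density)

def neg_binomial_yield(defect_density, area, alpha):
    """Negative-binomial yield model: Y = (1 + A*D/alpha)^(-alpha).
    The clustering parameter alpha relaxes the independence
    assumption; alpha -> infinity recovers the Poisson model."""
    return (1.0 + area * defect_density / alpha) ** (-alpha)

# As defect density D falls over time (the learning-curve effect the
# abstract describes), modeled yield rises toward 1.
for d in (1.0, 0.5, 0.25):
    print(f"D={d:.2f} defects/cm^2: "
          f"Poisson={poisson_yield(d, 1.0):.3f}, "
          f"NB(alpha=2)={neg_binomial_yield(d, 1.0, 2.0):.3f}")
```

Tracking the ratio of actual to modeled yield over successive lots is one way such a model supports the comparison the abstract mentions.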
82.
In studies of scheduling problems, the setup and removal times of jobs have generally been neglected or absorbed into processing times. However, in some production systems, setup and removal times are important enough that they should be treated independently of processing times. Since jobs in production systems are generally processed by automated machines, processing times do not vary with the process sequence; but because the human factor becomes influential when setup and removal times are considered, setup times decrease as setup operations are repeated. This fact is known as the learning effect in the scheduling literature. In this study, a bicriteria m-identical parallel-machine scheduling problem with a learning effect on setup and removal times is considered. The objective is to minimize the weighted sum of total completion time and total tardiness. A mathematical programming model is developed for the problem, which belongs to the NP-hard class. Computational tests show that the proposed model is effective for problems with up to 15 jobs and five machines. Three heuristic approaches are also proposed for solving larger problems. To the best of our knowledge, no prior work minimizes the weighted sum of total completion time and total tardiness with a learning effect on setup and removal times.
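The abstract names the objective but not the learning model. A minimal sketch of evaluating one candidate schedule, assuming a positional learning effect of the form s·k^a (the field names, the specific power-law form, and the value of `learn` are illustrative assumptions, not the paper's formulation):

```python
def schedule_objective(machines, w1=0.5, w2=0.5, learn=-0.322):
    """Evaluate a parallel-machine schedule for the bicriteria objective
    w1 * (total completion time) + w2 * (total tardiness).

    machines: one job sequence per machine; each job is a dict with base
    processing time 'p', base setup 's', base removal 'r', and due date
    'due'. The job in position k on a machine incurs s * k**learn setup
    and r * k**learn removal time (learn < 0, so repeated setups shrink),
    while processing times are unaffected, matching the abstract's
    premise that they are machine-driven.
    """
    total_completion = total_tardiness = 0.0
    for seq in machines:
        t = 0.0
        for k, job in enumerate(seq, start=1):
            factor = k ** learn                      # learning effect
            t += job['s'] * factor + job['p'] + job['r'] * factor
            total_completion += t                    # completion time C_j
            total_tardiness += max(0.0, t - job['due'])
    return w1 * total_completion + w2 * total_tardiness
```

A heuristic for the large instances the abstract mentions would search over job-to-machine assignments and sequences, calling an evaluation like this for each candidate.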
83.
This paper studies single-machine scheduling problems with a learning effect and deteriorating jobs simultaneously. In this model, the processing times of jobs are defined as functions of their starting times and their positions in the sequence. It is shown that even with both a learning effect and job deterioration in the processing times, the minimization problems for the makespan, the total completion time, and the sum of the kth powers of completion times each remain polynomially solvable. For the total weighted completion time and maximum lateness objectives, the paper proves that the weighted shortest processing time (WSPT) rule and the earliest due date (EDD) rule construct optimal sequences in certain special cases.
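The paper's exact processing-time function is not reproduced in this abstract. The sketch below pairs the two priority rules it names with one common combined time-and-position model, p·(1 + b·t)·k^a; the dict fields and the parameters `a` and `b` are illustrative assumptions:

```python
def wspt_order(jobs):
    """WSPT rule: sequence jobs by non-decreasing p_j / w_j
    (weighted shortest processing time first)."""
    return sorted(jobs, key=lambda j: j['p'] / j['w'])

def edd_order(jobs):
    """EDD rule: sequence jobs by non-decreasing due date."""
    return sorted(jobs, key=lambda j: j['due'])

def makespan(sequence, b=0.1, a=-0.2):
    """Makespan under an illustrative combined model: the job in
    position k that starts at time t takes p * (1 + b*t) * k**a,
    where b > 0 models deterioration and a < 0 models learning."""
    t = 0.0
    for k, job in enumerate(sequence, start=1):
        t += job['p'] * (1.0 + b * t) * k ** a
    return t
```

With b = 0 and a = 0 the model collapses to fixed processing times, which is a useful sanity check when experimenting with such rules.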
84.
Much of contemporary research in Artificial Immune Systems (AIS) has partitioned into either algorithmic machine learning and optimisation or the modelling of biologically plausible dynamical systems, with little overlap between them. We propose that this dichotomy is partly to blame for the lack of significant advancement in either direction, and demonstrate how a simplistic interpretation of Perelson's shape-space formalism may have largely contributed to it. In this paper, we motivate and derive an alternative representational abstraction. To do so, we consider the validity of shape-space from both the biological and machine-learning perspectives. We then take steps towards formally integrating these perspectives into a coherent computational model of notions such as life-long learning, degeneracy, constructive representations, and contextual recognition: rhetoric that has long inspired work in AIS while remaining largely devoid of operational definition.
85.
Rapid identification of coix seed varieties by near-infrared spectroscopy. Cited 1 time (0 self-citations, 1 by others)
Coix seed is a resource used both as food and as medicine, and demand for rapid quality identification is growing; near-infrared spectroscopy (NIRS), a fast, non-destructive, and environmentally friendly method, is well suited to this need. Based on near-infrared spectra of coix seeds from different origins and varieties, chemometric methods were applied to identify seed types. Qualitative discriminant analysis was performed on the raw spectra using the unsupervised principal component analysis (PCA) and the supervised learning vector quantization (LVQ) neural network and support vector machine (SVM). Because coix seeds from different regions and varieties have complex and closely similar nutrient compositions, the characteristic variables of the two seed classes are very similar; the PCA score plot therefore overlapped severely and could hardly separate them, whereas both the LVQ neural network and the SVM gave satisfactory results: the LVQ network achieved a prediction accuracy of 90.91%, and the SVM, after optimization of its penalty and kernel parameters, reached a classification accuracy of 100%. The results show that NIRS combined with chemometrics can serve as a rapid, non-destructive, and reliable method for identifying coix seed types, and provides a technical reference for market regulation.
86.
Signal-strength-based location estimation in wireless sensor networks locates the physical positions of unknown sensors via received signal strengths. Few localization studies in this field sufficiently exploit the topological structure of the network in both the signal space and the physical space. The goal of this paper is first to establish two effective localization models based on specific manifold (local) structures of both the signal space and the physical (location) space, using our previous locality preserving canonical correlation analysis (LPCCA) model and a newly proposed locality correlation analysis (LCA) model, and then to develop the corresponding novel localization algorithms, called location estimation-LPCCA (LE-LPCCA) and location estimation-LCA (LE-LCA). Since both LPCCA and LCA take relatively full account of the locality characteristics of the manifold structures in both spaces, the localization algorithms developed from them achieve better localization accuracy than other publicly available advanced algorithms. Copyright © 2011 John Wiley & Sons, Ltd.
87.
88.
Journal of Visual Communication and Image Representation, 2013, 24(5): 579-591
This paper proposes a content-adaptive sharpening algorithm using two-dimensional (2D) FIR filters trained by pre-emphasis on a variety of image pairs. In the learning stage, all low-quality (LQ) and high-quality (HQ) image pairs are first pre-emphasized, i.e., properly sharpened. Selective 2D FIR filter coefficients for high-frequency synthesis are then trained on the pre-emphasized LQ-HQ image pairs and stored in a dictionary that resembles a look-up table (LUT). In the inference stage, each input image is pre-emphasized in the same manner as in the learning stage. The best-matched 2D filter for each LQ patch is found in the dictionary, and an HQ patch corresponding to the input LQ patch is synthesized using that 2D FIR filter. Experimental results show that the proposed algorithm visually outperforms existing ones, with mean absolute errors (MAE) about 10% to 60% lower and MS-SSIM (multi-scale structural similarity) about 0.002 to 0.053 higher than those of the existing algorithms.
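To make the filtering step concrete, here is a minimal pure-Python sketch of applying a 3x3 FIR filter to a grayscale image. The fixed sharpening kernel is only a stand-in: the paper instead selects trained, patch-adaptive coefficients from its dictionary, and the function name and zero-padding choice are my own assumptions:

```python
def fir_filter_2d(img, kernel):
    """Apply a 3x3 FIR filter to a grayscale image (list of rows),
    zero-padding outside the image border."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx] * kernel[dy + 1][dx + 1]
            out[y][x] = acc
    return out

# One fixed pre-emphasis (sharpening) kernel; its coefficients sum to 1,
# so flat regions pass through unchanged while edges are boosted.
SHARPEN = [[ 0, -1,  0],
           [-1,  5, -1],
           [ 0, -1,  0]]
```

In the paper's scheme, each LQ patch would index the dictionary to pick its own trained coefficient set rather than reusing one kernel everywhere.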
89.
The key idea of this model is that firms are the result of an evolutionary process. Based on demand and supply considerations, the evolutionary model presented here explicitly derives Gibrat's law of proportionate effects as the outcome of competition between products. Applying a preferential-attachment mechanism to firms, the theory establishes the size distribution of products and firms, as well as the growth-rate and price distributions of consumer goods. Taking into account the characteristic tendency of human activities to occur in bursts, the model also explains the size-variance relationship of the growth-rate distribution of products and firms. Furthermore, the product life cycle, the learning (experience) curve, and the market size, in terms of the mean number of firms that can survive in a market, are derived. The model also suggests the existence of a market invariant: the ratio of total profit to total revenue. The relationship between a neo-classical and an evolutionary view of a market is discussed. Comparison with empirical investigations suggests that the theory describes the main stylized facts concerning the size and growth of firms.
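The preferential-attachment mechanism the abstract invokes can be sketched in a few lines: each new product joins an existing firm with probability proportional to the firm's current size, so expected growth is proportional to size (the Gibrat-type property). The function name, parameters, and this particular simulation setup are illustrative assumptions, not the paper's model:

```python
import random

def firm_size_distribution(n_products=2000, n_firms=50, seed=7):
    """Preferential-attachment sketch: each new product attaches to an
    existing firm with probability proportional to the firm's current
    size (its product count), producing a heavily skewed firm-size
    distribution."""
    rng = random.Random(seed)
    sizes = [1] * n_firms                  # every firm starts with one product
    for _ in range(n_products - n_firms):
        r = rng.uniform(0.0, sum(sizes))   # weighted pick over firm sizes
        acc = 0.0
        for i, s in enumerate(sizes):
            acc += s
            if r <= acc:
                sizes[i] += 1
                break
    return sorted(sizes, reverse=True)
```

Running this and plotting the sorted sizes on log-log axes shows the skewed, heavy-tailed shape that such mechanisms generate; the paper's derivations are analytic rather than simulation-based.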
90.