91.
In this paper, we focus on extended informational measures based on a convex function ϕ: entropies, extended Fisher information, and generalized moments. Both the generalization of the Fisher information and that of the moments rely on the definition of an escort distribution linked to the (entropic) functional ϕ. We revisit the usual maximum entropy principle, or more precisely its inverse problem of starting from the distribution and constraints, which leads to the introduction of state-dependent ϕ-entropies. We then examine interrelations between the extended informational measures and generalize relationships such as the Cramér–Rao inequality and the de Bruijn identity to this broader context, in which the maximum entropy distributions play a central role. All the results derived in the paper include the usual ones as special cases.
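For orientation, the classical identities that the paper generalizes read as follows in standard (non-ϕ-extended) notation:

```latex
% Cramér–Rao inequality for an unbiased estimator \hat{\theta},
% with Fisher information I(\theta):
\operatorname{Var}(\hat{\theta}) \;\ge\; \frac{1}{I(\theta)},
\qquad
I(\theta) \;=\; \mathbb{E}\!\left[\bigl(\partial_{\theta}\log f(X;\theta)\bigr)^{2}\right].

% De Bruijn identity linking Shannon entropy H and Fisher information J
% along Gaussian perturbation:
\frac{\mathrm{d}}{\mathrm{d}t}\, H\!\bigl(X+\sqrt{t}\,Z\bigr)
\;=\;
\tfrac{1}{2}\, J\!\bigl(X+\sqrt{t}\,Z\bigr),
\qquad Z\sim\mathcal{N}(0,1)\ \text{independent of}\ X.
```

The paper's contribution is to replace Shannon entropy and the ordinary score with their ϕ-entropy and escort-distribution counterparts while preserving these relationships.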
92.
Coronary heart disease (CHD) is the leading cause of cardiovascular death. This study aimed to propose an effective method for mining cardiac mechano-electric coupling information and to evaluate its ability to distinguish patients with varying degrees of coronary artery stenosis (VDCAS). Five minutes of electrocardiogram and phonocardiogram signals were collected synchronously from 191 VDCAS patients to construct heartbeat interval (RRI)–systolic time interval (STI), RRI–diastolic time interval (DTI), HR-corrected QT interval (QTcI)–STI, QTcI–DTI, Tpeak–Tend interval (TpeI)–STI, TpeI–DTI, Tpe/QT interval (Tpe/QTI)–STI, and Tpe/QTI–DTI series. The cross sample entropy (XSampEn), cross fuzzy entropy (XFuzzyEn), joint distribution entropy (JDistEn), magnitude-squared coherence function, cross power spectral density, and mutual information were then applied to evaluate the coupling of the series. Subsequently, support vector machine recursive feature elimination and XGBoost were used for feature selection and classification, respectively. Results showed that the joint analysis of XSampEn, XFuzzyEn, and JDistEn best distinguished patients with VDCAS. The classification accuracies for the severe CHD vs. mild-to-moderate CHD, severe CHD vs. chest pain with normal coronary angiography (CPNCA), and mild-to-moderate CHD vs. CPNCA groups were 0.8043, 0.7659, and 0.7500, respectively. The study indicates that the joint analysis of XSampEn, XFuzzyEn, and JDistEn can effectively capture the cardiac mechano-electric coupling information of patients with VDCAS, providing valuable information for clinicians in diagnosing CHD.
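A minimal sketch of one of the coupling measures, cross sample entropy (XSampEn), for two synchronized series. The function name, template length `m`, and tolerance `r` below are illustrative defaults, not the paper's exact settings:

```python
import numpy as np

def xsampen(u, v, m=2, r=0.2):
    """Cross sample entropy between two equal-length series.

    Both series are z-scored, so the tolerance r is a fraction of the
    (unit) standard deviation. Returns -ln(A/B), where B counts template
    matches of length m between u and v (Chebyshev distance <= r) and A
    counts matches of length m+1. Higher values mean weaker coupling.
    """
    u = (u - u.mean()) / u.std()
    v = (v - v.mean()) / v.std()
    n = len(u)

    def count(mm):
        Um = np.array([u[i:i + mm] for i in range(n - mm)])
        Vm = np.array([v[i:i + mm] for i in range(n - mm)])
        # Chebyshev (max-abs) distance between every u-template / v-template pair
        d = np.max(np.abs(Um[:, None, :] - Vm[None, :, :]), axis=2)
        return np.sum(d <= r)

    B, A = count(m), count(m + 1)
    return -np.log(A / B)
```

Two strongly coupled series share many templates, so A/B stays high and the entropy low; weakly coupled series yield a larger value.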
93.
In this study, a novel neurocomputing technique is presented for solving the nonlinear heat transfer and natural convection porous fin problems that arise throughout engineering and technology, especially in mechanical engineering. The mathematical models of the problems are handled by Euler-polynomial-based Euler neural networks (ENNs), optimized with a generalized normal distribution optimization (GNDO) algorithm and an interior point algorithm (IPA). In this scheme, ENN-based differential equation models are constructed in an unsupervised manner: the neurons are trained by GNDO, an effective global search technique, and by IPA, which accelerates local convergence. The temperature distribution of the heat transfer and natural convection porous fin is then investigated with the ENN-GNDO-IPA algorithm under variations in specific heat, thermal conductivity, internal heat generation, and heat transfer rate. A large number of runs are performed for different cases to establish reliability and effectiveness through performance indicators including the Nash–Sutcliffe efficiency (NSE), error in Nash–Sutcliffe efficiency (ENSE), mean absolute error (MAE), and Theil's inequality coefficient (TIC). Extensive graphical and statistical analysis shows the advantage of the proposed algorithm over state-of-the-art algorithms and the numerical solver RK-4.
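The unsupervised training idea can be sketched on the simplest constant-property fin special case, θ″ = M²θ with θ(0) = 1 and θ′(1) = 0, whose exact solution is cosh(M(1−x))/cosh(M). The sketch below uses a plain polynomial trial solution and a gradient-based local solve standing in for the GNDO/IPA pair; the loss construction (collocation residual plus boundary penalties) is the part that mirrors the paper's scheme:

```python
import numpy as np
from numpy.polynomial import polynomial as P
from scipy.optimize import minimize

M = 1.0                               # fin parameter (illustrative value)
xs = np.linspace(0.0, 1.0, 21)        # collocation points on [0, 1]

def theta(c, x):
    """Polynomial trial solution with coefficients c (stand-in for an ENN)."""
    return P.polyval(x, c)

def loss(c):
    """Unsupervised loss: mean squared ODE residual + boundary penalties."""
    d1 = P.polyder(c)
    d2 = P.polyder(c, 2)
    residual = P.polyval(xs, d2) - M**2 * theta(c, xs)
    bc = (theta(c, 0.0) - 1.0) ** 2 + P.polyval(1.0, d1) ** 2
    return np.mean(residual**2) + bc

c0 = np.zeros(6)                      # degree-5 trial polynomial
c_opt = minimize(loss, c0, method="BFGS").x
```

Because residual and boundary terms are both driven toward zero, no labeled solution data is needed; that is the sense in which the training is unsupervised.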
94.
In the rate-distortion function and the Maximum Entropy (ME) method, Minimum Mutual Information (MMI) distributions and ME distributions are expressed by Bayes-like formulas that include Negative Exponential Functions (NEFs) and partition functions. Why do these non-probability functions appear in Bayes-like formulas? On the other hand, the rate-distortion function has three disadvantages: (1) the distortion function is subjectively defined; (2) the distortion function between instances and labels is often difficult to define; (3) it cannot be used for data compression according to the labels' semantic meanings. The author previously proposed the semantic information G measure, which combines statistical probability and logical probability. We can now interpret NEFs as truth functions, partition functions as logical probabilities, Bayes-like formulas as semantic Bayes' formulas, MMI as Semantic Mutual Information (SMI), and ME as extreme ME minus SMI. To overcome the above disadvantages, this paper establishes the relationship between truth functions and distortion functions, obtains truth functions from samples by machine learning, and constructs constraint conditions with truth functions to extend rate-distortion functions. Two examples help readers understand the MMI iteration and support the theoretical results. Using truth functions and the semantic information G measure, we can combine machine learning and data compression, including semantic compression. Further studies are needed to explore general data compression and recovery according to semantic meaning.
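The "Bayes-like formula with an NEF and a partition function" appears concretely in the classical Blahut–Arimoto iteration for the rate-distortion function, which the MMI iteration extends. A minimal sketch (variable names are my own):

```python
import numpy as np

def blahut_arimoto(p_x, d, s, iters=200):
    """One point on the rate-distortion curve for source p_x with
    distortion matrix d[x, y] and slope parameter s > 0.

    The update q(y|x) ∝ q(y)·exp(-s·d(x, y)) is exactly the Bayes-like
    formula: exp(-s·d) is the negative exponential function and the
    normalizer Z(x) is the partition function. Returns (rate, distortion)
    in nats.
    """
    ny = d.shape[1]
    q_y = np.full(ny, 1.0 / ny)                 # initial output marginal
    for _ in range(iters):
        w = q_y * np.exp(-s * d)                # NEF weighting
        Z = w.sum(axis=1, keepdims=True)        # partition function Z(x)
        q_yx = w / Z                            # Bayes-like conditional
        q_y = p_x @ q_yx                        # marginal update
    R = np.sum(p_x[:, None] * q_yx * np.log(q_yx / q_y))
    D = np.sum(p_x[:, None] * q_yx * d)
    return R, D
```

Sweeping `s` traces out the whole R(D) curve; larger `s` penalizes distortion more heavily, giving lower D at the cost of higher R.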
95.
Zhongqi Sun, Chinese Physics B, 2021, 30(11): 110303
Reference-frame-independent quantum key distribution (RFI-QKD) allows a QKD system to attain the ideal key rate and transmission distance without reference-frame calibration, and has therefore attracted much attention. Here, we propose an RFI-QKD protocol based on wavelength division multiplexing (WDM) that accounts for finite-key effects and crosstalk. The finite-key bound for decoy-state RFI-QKD is derived under WDM crosstalk, yielding a more rigorous secret key rate. Simulation results reveal that the secret key rate of WDM-based RFI-QKD depends on the number of multiplexed channels as well as on crosstalk between adjacent channels.
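For orientation, RFI-QKD protocols are built on a correlation quantity that is invariant under slow rotation of the X–Y measurement frames. In the original RFI-QKD formulation (this paper's finite-key bound is not reproduced here), that quantity reads:

```latex
C \;=\; \langle X_A X_B\rangle^{2} + \langle X_A Y_B\rangle^{2}
      + \langle Y_A X_B\rangle^{2} + \langle Y_A Y_B\rangle^{2},
```

with C = 2 in the ideal case; the aligned Z basis is used for key generation, while C bounds the eavesdropper's information without any X–Y frame calibration.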
96.
Titanium alloys, with their high strength, good corrosion resistance, and high heat resistance, are widely used in aerospace, marine, biomedical, and many other fields. Among them, Ti-6Al-4V (TC4) combines good heat resistance, strength, ductility, toughness, formability, weldability, corrosion resistance, and biocompatibility, making it the flagship alloy of the titanium industry. In laser welding of titanium alloys, adding a surface activating flux can increase weld penetration, improve welding efficiency, and reduce microstructural inhomogeneity in the weld, but it may also alter the content and distribution of elements in the fusion zone and weld zone, which in turn may affect material performance. In this work, laser-induced breakdown spectroscopy (LIBS) surface scanning was applied to welded TC4 specimens to obtain multi-element composition information simultaneously and, combined with original position statistical distribution analysis (OPA), enabled rapid characterization of the composition and its distribution in the base metal, fusion zone, and weld, providing a new means of evaluating activating-flux selection and post-weld material performance. Two TC4 sheet specimens welded with different activating fluxes were selected; the longitudinal section of the weld was taken as the analysis surface and ground with 320-grit alumina sandpaper, and composition distributions were characterized statistically with the LIBS-OPA system. First, the ablation spot size and ablation conditions were optimized: a 200 μm spot with 10 pre-ablation pulses and 10 ablation pulses was chosen, and calibration curves were established for C, Al, V, Fe, Si, and Ti (the Si originating mainly from the activating flux). Area scans of the welded specimens were then performed and the element contents and distributions characterized statistically. Meanwhile, samples were taken from different regions of the welded specimens and the C content measured by high-frequency infrared combustion analysis; the results agreed with those from LIBS-OPA. The distributions of Al, V, Fe, Si, and Ti also corresponded well with micro-beam X-ray fluorescence measurements. LIBS-OPA thus enables multi-element composition-distribution characterization of the base metal, fusion zone, and weld of titanium alloys, offering a new way to rapidly assess weld composition and its distribution.
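The calibration-curve step can be sketched as a least-squares line relating measured line intensity to certified concentration, then inverted to quantify an unknown. All numbers below are hypothetical; real LIBS calibration typically also uses internal standardization (intensity ratios) and checks linearity:

```python
import numpy as np

# Hypothetical certified standards: net LIBS line intensity (counts)
# versus certified mass fraction (wt%), assuming a linear response I = a·c + b.
conc = np.array([0.5, 1.0, 2.0, 4.0, 6.0])                   # wt%
intensity = np.array([120.0, 235.0, 470.0, 930.0, 1400.0])   # counts

a, b = np.polyfit(conc, intensity, 1)    # slope and intercept of the calibration line

def predict_conc(i):
    """Invert the calibration line to estimate concentration from intensity."""
    return (i - b) / a
```

Applying `predict_conc` pixel by pixel over an area scan yields the element's spatial concentration map, which is what the OPA step then summarizes statistically.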
97.
An image-to-image reconstruction method based on a deconvolutional neural network is developed for tomographic inversion of the radiation distribution in fusion plasmas. By introducing the structural similarity index (SSIM) as the loss function, the method achieves good reconstruction quality in experiments on simulated data. The simulation results show that the reconstructions remain accurate and robust when the chord-integrated signals carry noise levels of 10%, 15%, and 20%.
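The SSIM loss can be sketched in its single-window (global) form; the paper presumably uses the standard windowed variant, and the constants follow the usual Wang et al. convention with `L` the dynamic range:

```python
import numpy as np

def ssim_global(x, y, L=1.0):
    """Single-window SSIM between two images with values in [0, L]."""
    C1, C2 = (0.01 * L) ** 2, (0.03 * L) ** 2   # stabilizing constants
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + C1) * (2 * cov + C2)) / \
           ((mx**2 + my**2 + C1) * (vx + vy + C2))

def ssim_loss(pred, target):
    """Loss to minimize during training: 1 - SSIM (0 for a perfect match)."""
    return 1.0 - ssim_global(pred, target)
```

Unlike a pixel-wise MSE loss, SSIM compares local luminance, contrast, and structure jointly, which is why it tends to preserve the shape of the reconstructed radiation profile.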
98.
99.
In this article, we propose the exponentiated sine-generated family of distributions. Some important properties are demonstrated, such as the series representation of the probability density function, quantile function, moments, stress-strength reliability, and Rényi entropy. A particular member, called the exponentiated sine Weibull distribution, is highlighted; we analyze its skewness and kurtosis, moments, quantile function, residual mean and reversed mean residual life functions, order statistics, and extreme value distributions. Maximum likelihood estimation and Bayes estimation under the squared error loss function are considered. Simulation studies assess the techniques, whose performance is satisfactory as measured by the mean square errors, confidence intervals, and coverage probabilities of the estimates. The stress-strength reliability parameter of the exponentiated sine Weibull model is derived and estimated by maximum likelihood. Nonparametric bootstrap techniques are also used to approximate the confidence interval of the reliability parameter, and a simulation examines its mean square error, standard deviation, confidence intervals, and coverage probabilities. Finally, three real applications of the exponentiated sine Weibull model are provided, one of which involves stress-strength data.
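A sketch of the exponentiated sine Weibull CDF, quantile function, and inverse-transform sampler, assuming the family takes the common sine-generated form F(x) = [sin((π/2)·G(x))]^a with Weibull baseline G; the paper's exact parametrization may differ:

```python
import numpy as np

def esw_cdf(x, a, k, lam):
    """Assumed exponentiated sine Weibull CDF: F(x) = [sin((π/2)·G(x))]^a,
    with Weibull baseline G(x) = 1 - exp(-(x/lam)^k)."""
    G = 1.0 - np.exp(-(x / lam) ** k)
    return np.sin(0.5 * np.pi * G) ** a

def esw_quantile(u, a, k, lam):
    """Closed-form quantile: solve F(x) = u for x."""
    G = (2.0 / np.pi) * np.arcsin(u ** (1.0 / a))
    return lam * (-np.log(1.0 - G)) ** (1.0 / k)

def esw_sample(n, a, k, lam, rng):
    """Inverse-transform sampling from the assumed model."""
    return esw_quantile(rng.random(n), a, k, lam)
```

Because the quantile function is available in closed form, simulation studies of the kind the paper describes reduce to feeding uniform draws through `esw_quantile`.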
100.
Point and interval estimates for the unknown parameters of an exponentiated half-logistic distribution based on adaptive type II progressive censoring are obtained in this article. First, the maximum likelihood estimators are derived. The observed and expected Fisher information matrices are then obtained to construct asymptotic confidence intervals, and the percentile bootstrap and bootstrap-t methods are put forward for establishing confidence intervals. For Bayesian estimation, the Lindley method is used under three different loss functions. The importance sampling method is also applied to compute Bayesian estimates and to construct the corresponding highest posterior density (HPD) credible intervals. Finally, extensive simulation studies based on Markov chain Monte Carlo (MCMC) samples compare the performance of the estimators, and a real data set is analyzed as an illustration.
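The percentile bootstrap step is generic and can be sketched for an arbitrary statistic; the function below is a plain resampling loop, not the paper's censored-data implementation:

```python
import numpy as np

def percentile_bootstrap_ci(x, stat, alpha=0.05, B=2000, seed=0):
    """Percentile bootstrap (1 - alpha) confidence interval for stat(x).

    Resamples x with replacement B times, recomputes the statistic on
    each resample, and returns the alpha/2 and 1 - alpha/2 quantiles
    of the bootstrap replicates.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    reps = np.array([stat(x[rng.integers(0, n, n)]) for _ in range(B)])
    lo, hi = np.quantile(reps, [alpha / 2, 1 - alpha / 2])
    return lo, hi
```

The bootstrap-t variant additionally studentizes each replicate by its own standard error before taking quantiles, which generally improves coverage for skewed sampling distributions.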