196 search results (query time: 390 ms)
1.
X-ray crystallography and NMR spectroscopy provide the only sources of experimental data from which protein structures can be analyzed at high or even atomic resolution. The degree to which these methods complement each other as sources of structural knowledge is a matter of debate; it is often proposed that small proteins yielding high-quality, readily analyzed NMR spectra are a subset of those that readily yield strongly diffracting crystals. We have examined the correlation between NMR spectral quality and success in structure determination by X-ray crystallography for 159 prokaryotic and eukaryotic proteins, prescreened to exclude polydisperse and/or aggregated samples. This study demonstrates that, across this protein sample set, the quality of a protein's [15N-1H]-heteronuclear correlation (HSQC) spectrum recorded under conditions generally suitable for 3D structure determination by NMR, a key predictor of the ability to determine a structure by NMR, is not correlated with successful crystallization and structure determination by X-ray crystallography. These results, together with similar results from an independent study presented in the accompanying paper (Yee et al., J. Am. Chem. Soc., accompanying paper), demonstrate that X-ray crystallography and NMR often provide complementary sources of structural data, and that both methods are required to maximize success across targets in large-scale structural proteomics efforts.
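The kind of association test behind this conclusion can be sketched with a 2x2 contingency table relating HSQC quality to crystallization outcome. The counts and the phi-coefficient helper below are illustrative inventions, not the study's 159-protein data.

```python
# Hypothetical sketch: is HSQC spectral quality associated with
# crystallization success? Counts are invented for illustration.
def phi_coefficient(a, b, c, d):
    """Phi (Pearson) correlation for a 2x2 contingency table:
    rows = good/poor HSQC, columns = crystallized yes/no."""
    num = a * d - b * c
    den = ((a + b) * (c + d) * (a + c) * (b + d)) ** 0.5
    return num / den

# good HSQC: 40 crystallized, 38 not; poor HSQC: 42 crystallized, 39 not
phi = phi_coefficient(40, 38, 42, 39)
print(f"phi = {phi:.3f}")  # a value near zero indicates no association
```

A phi coefficient near zero for such a table is what "not correlated" means operationally here.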
3.
The electrochemical and chemical polymerization of acrylamide (AA) has been studied. Electrolysis of the monomer in N,N-dimethylformamide (DMF) containing (C4H9)4NClO4 as the supporting electrolyte leads to polymer formation in both the anode and cathode compartments. The cathodic polymer dissolves in the reaction mixture, while the anodic polymer precipitates during the course of polymerization. A plausible mechanism for the anodic and cathodic initiation reactions is proposed. The chemical polymerization of acrylamide initiated by HClO4 is analogous to its anodic polymerization. The polymer yield increases with increasing concentration of the monomer and of HClO4, and raising the reaction temperature also enhances the polymerization rate. The overall apparent activation energy of the polymerization was determined to be ca. 19 kcal/mole. Copolymerization of acrylamide with methyl methacrylate (MMA) was carried out in a solution of HClO4 in DMF; the reactivity ratios are r1 (AA) = 0.25 and r2 (MMA) = 2.50. The polymerization with HClO4 appears to proceed by a free-radical mechanism. When the polymerization of acrylamide is carried out with HClO4 in H2O, a crosslinked, water-insoluble gel forms.
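The reported reactivity ratios determine the instantaneous copolymer composition through the standard Mayo-Lewis equation. A minimal sketch using the abstract's r1 and r2 values (the feed fraction is an illustrative choice):

```python
# Instantaneous copolymer composition from the reported reactivity
# ratios via the Mayo-Lewis equation; r1, r2 come from the abstract,
# the feed mole fraction f1 is illustrative.
def mayo_lewis(f1, r1=0.25, r2=2.50):
    """Mole fraction F1 of monomer 1 (AA) incorporated into the
    copolymer from a feed with mole fraction f1 of monomer 1."""
    f2 = 1.0 - f1
    num = r1 * f1**2 + f1 * f2
    den = r1 * f1**2 + 2 * f1 * f2 + r2 * f2**2
    return num / den

# With r1 < 1 and r2 > 1, MMA is preferentially incorporated:
print(mayo_lewis(0.5))  # F1 < 0.5, i.e. the copolymer is AA-poor
```

This quantifies what r1 = 0.25 and r2 = 2.50 imply: from an equimolar feed, the growing chain incorporates MMA preferentially.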
4.
A large number of software reliability growth models have been proposed to analyse the reliability of a software application based on the failure data collected during its testing phase. To ensure analytical tractability, most of these models rely on the simplifying assumptions of instantaneous and perfect debugging. As a result, the estimates of the residual number of faults, failure rate, reliability, and optimal software release time obtained from these models tend to be optimistic. To obtain realistic estimates, the assumptions of instantaneous and perfect debugging must be relaxed. In this paper we discuss the various policies according to which debugging may be conducted. We then describe a rate-based simulation framework that incorporates explicit debugging activities, conducted according to any of these policies, into software reliability growth models. The framework can also account for imperfect debugging in conjunction with any of the debugging policies. We further present a technique to compute the failure rate and the reliability of the software in the presence of explicit debugging, and an economic cost model to determine the optimal software release time. We illustrate the potential of the simulation framework with two case studies.
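The core idea of a rate-based failure simulation with imperfect debugging can be sketched in a few lines. This is not the paper's framework; the fault count, per-fault hazard rate, and fix probability below are illustrative assumptions.

```python
import random

# Minimal rate-based simulation sketch (illustrative, not the paper's
# framework): failures occur at a rate proportional to the residual
# fault count, and each debugging attempt removes the fault only with
# probability p_perfect (imperfect debugging).
def simulate(n_faults=50, per_fault_rate=0.02, p_perfect=0.8,
             horizon=500.0, seed=1):
    rng = random.Random(seed)
    t, residual, failures = 0.0, n_faults, 0
    while residual > 0:
        rate = per_fault_rate * residual     # current failure intensity
        t += rng.expovariate(rate)           # time to next failure
        if t > horizon:
            break
        failures += 1
        if rng.random() < p_perfect:         # fix succeeds only sometimes
            residual -= 1
    return failures, residual

print(simulate())
```

With p_perfect < 1 the observed failure count exceeds the number of faults removed, which is exactly why perfect-debugging models yield optimistic residual-fault estimates.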
5.

Particle size distribution of nanoparticles plays an important role in modelling many scientific and engineering problems. In this article, we propose a Finite Volume Method (FVM) to model TiO2 nanoparticle formation using population balance equations (PBEs) that incorporate simultaneous agglomeration and disintegration. Superposing the PBEs for agglomeration and disintegration with different kernels leads to a system of partial integro-differential equations, which are solved numerically using the FVM. The precipitation of TiO2 nanoparticles in a batch reactor is studied both experimentally and by numerical simulations based on the Austin and Diemer disintegration kernels and a shear agglomeration kernel. Finally, the capability of the precipitation model is evaluated by comparing the experimental particle-size results with the numerical results.
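The article's coupled agglomeration/disintegration FVM is too involved for a short sketch, but the structure of a sectional (discretized) population balance can be illustrated with the pure-agglomeration case, the discrete Smoluchowski equation. The constant kernel, grid size, and time step below are illustrative assumptions, not the article's Austin/Diemer/shear kernels.

```python
# Sketch of a sectional population balance for pure agglomeration
# (discrete Smoluchowski equation) with a constant kernel K -- a
# simplified stand-in for the article's coupled FVM. n[k] is the
# number density of clusters containing k+1 monomer units.
def step(n, K=1.0, dt=1e-3):
    """One explicit-Euler step: birth of size i+j+2 from pairs
    (i+1, j+1), with matching loss terms for both partners."""
    N = len(n)
    dn = [0.0] * N
    for i in range(N):
        for j in range(N):
            loss = K * n[i] * n[j]
            dn[i] -= loss                    # cluster i consumed
            if i + j + 1 < N:                # birth within the grid
                dn[i + j + 1] += 0.5 * loss  # 0.5 avoids double count
    return [v + dt * d for v, d in zip(n, dn)]

n = [1.0] + [0.0] * 9                        # monodisperse initial state
for _ in range(100):
    n = step(n)
print(sum(n))                                # total number decreases
```

The key structural check for any such discretization, as in the article's FVM, is that total mass is conserved under agglomeration while total particle number decreases.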
6.
Many seemingly simple questions that individual users face in their daily lives may actually require substantial computing resources to answer. For example, a user may want to determine the right thermostat settings for different rooms of a house, within a tolerance range, such that energy consumption and costs are maximally reduced while the house remains at a comfortable temperature. Such answers can be determined through simulations. However, some simulation models, as in this example, are stochastic: they require the execution of a large number of simulation tasks and the aggregation of their results to ascertain whether the outcomes lie within specified confidence intervals. Other simulation models, such as those used to study traffic conditions, may need multiple instances executed over a range of parameters. Cloud computing has opened up new avenues for individuals and organizations with limited resources to obtain answers to problems that hitherto required expensive, computationally intensive resources. This paper presents SIMaaS, a cloud-based Simulation-as-a-Service that addresses these challenges. We demonstrate that lightweight solutions using Linux containers (e.g., Docker) are better suited to support such services than heavyweight hypervisor-based solutions, which are shown to incur substantial overhead when provisioning virtual machines on demand. Empirical results validating our claims are presented in the context of two case studies.
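The workload pattern described above (many independent stochastic replicas aggregated into a confidence interval) can be sketched locally; SIMaaS would dispatch each replica to its own container instead of a thread. The toy "energy cost" model and replica count below are purely illustrative.

```python
import random
import statistics
from concurrent.futures import ThreadPoolExecutor

# Sketch of the replica-aggregation pattern (not the SIMaaS
# implementation): run independent replicas of a stochastic
# simulation in parallel and report a ~95% confidence interval.
def one_replica(seed):
    rng = random.Random(seed)
    return 10.0 + rng.gauss(0.0, 1.0)   # toy stochastic daily cost

def run_experiment(n_replicas=200):
    with ThreadPoolExecutor(max_workers=8) as pool:
        results = list(pool.map(one_replica, range(n_replicas)))
    mean = statistics.fmean(results)
    sem = statistics.stdev(results) / n_replicas ** 0.5
    return mean, (mean - 1.96 * sem, mean + 1.96 * sem)

mean, ci = run_experiment()
print(f"mean={mean:.2f}, 95% CI=({ci[0]:.2f}, {ci[1]:.2f})")
```

Whether the interval is narrow enough decides if more replicas (containers) must be provisioned, which is the elasticity argument for container-based provisioning.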
9.
An efficient stereoselective three-component reaction for the synthesis of functionalized spiro[4H-pyran-3,3′-oxindole] derivatives was realized through an organocatalyzed domino Knoevenagel/Michael/cyclization sequence using a cinchonidine-derived thiourea as the catalyst. Using water as an additive was found to improve the product ee values significantly. Under the optimized conditions, the reactions between isatins, malononitrile, and 1,3-dicarbonyl compounds afford the desired spirooxindole products in good yields (71–92%) with moderate to high enantioselectivities (up to 87% ee).
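The reported ee values relate measured enantiomer amounts (e.g., chiral HPLC peak areas) through a simple formula. The peak areas below are illustrative, not the paper's data.

```python
# Enantiomeric excess from the amounts of the two enantiomers
# (e.g., chiral HPLC peak areas); the numbers are illustrative.
def enantiomeric_excess(major, minor):
    """ee (%) = 100 * (major - minor) / (major + minor)."""
    return 100.0 * (major - minor) / (major + minor)

# A 93.5 : 6.5 enantiomer ratio corresponds to the paper's top value:
print(enantiomeric_excess(93.5, 6.5))  # close to 87% ee
```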
10.
An ongoing major outbreak of mountain pine beetle in Western Canada has provided a clear opportunity to utilize waste pinewood as a source of renewable energy. Hydrothermal processing of waste pinewood as a feedstock for bio-oil and biochar production using subcritical and supercritical water was therefore carried out in semi-batch mode to investigate the effect of pressure (200–400 bar) and temperature (300–400 °C) on the yield and composition of the bio-oil. The pinewood samples have very high cellulose and hemicellulose content but low ash content, and are thus a promising feedstock for bioenergy production. The optimum conditions for hydrothermal processing of the pinewood in a tubular reactor were found to be 400 °C and 250 bar with respect to biochar and bio-oil yield, based on the highest calorific values. Detailed characterization of the bio-oil and biochar was performed using GC-MS, NMR, SEM, calorific value, and elemental analysis. The critical components of the bio-oil were found to be phenols, methoxyphenols, hydroxymethylfurfural (HMF), and vanillin, while the biochar had considerably lower H:C and O:C ratios than the unprocessed pinewood. Analysis of the bio-oil by GC-MS and 1H NMR showed that it was mainly composed of heterocyclic compounds, phenols, aldehydes, and acids.
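The H:C and O:C comparison above is the standard van Krevelen analysis, computed from elemental-analysis mass percentages. The compositions below are typical wood-like and char-like values chosen for illustration, not the paper's measurements.

```python
# Atomic H:C and O:C ratios from elemental analysis (wt%), the
# quantities compared between biochar and raw pinewood; the wt%
# compositions are illustrative assumptions.
def atomic_ratios(c_wt, h_wt, o_wt):
    """Return (H:C, O:C) molar ratios from mass percentages,
    dividing each wt% by the element's atomic mass."""
    C = c_wt / 12.011
    H = h_wt / 1.008
    O = o_wt / 15.999
    return H / C, O / C

raw = atomic_ratios(50.0, 6.0, 43.0)    # wood-like composition
char = atomic_ratios(75.0, 4.0, 20.0)   # char-like composition
print(raw, char)                        # char is lower in both ratios
```

Lower H:C and O:C after processing indicate dehydration and deoxygenation, i.e. a more carbon-dense, coal-like solid.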