Full-text access type
Paid full text | 146 articles |
Free | 1 article |
Domestic free | 3 articles |
Subject classification
Chemistry | 41 articles |
Mechanics | 10 articles |
Mathematics | 71 articles |
Physics | 28 articles |
Publication year
2024 | 1 article |
2023 | 3 articles |
2022 | 3 articles |
2021 | 1 article |
2020 | 1 article |
2019 | 7 articles |
2018 | 5 articles |
2017 | 4 articles |
2016 | 1 article |
2015 | 3 articles |
2014 | 6 articles |
2013 | 9 articles |
2012 | 5 articles |
2011 | 11 articles |
2010 | 8 articles |
2009 | 9 articles |
2008 | 11 articles |
2007 | 11 articles |
2006 | 8 articles |
2005 | 7 articles |
2004 | 7 articles |
2003 | 4 articles |
2001 | 6 articles |
2000 | 1 article |
1998 | 2 articles |
1997 | 3 articles |
1995 | 3 articles |
1994 | 1 article |
1993 | 2 articles |
1992 | 1 article |
1991 | 2 articles |
1989 | 1 article |
1983 | 2 articles |
1980 | 1 article |
Sort order: 150 results found (search time: 31 ms)
1.
We have investigated the quantum-statistics behavior of the exciton-biexciton system from the photoluminescence properties of (GaAs)_m/(AlAs)_m type-II superlattices with m = 12 and 13 monolayers, where the lowest-energy type-II exciton consists of the n = 1 X electron of AlAs and the n = 1 Γ heavy hole of GaAs. The long exciton lifetime, of the order of μs due to the indirect nature of the transition, enables us to obtain precisely the density relation between excitons and biexcitons from a line-shape analysis of time-resolved photoluminescence spectra. In the relatively low exciton-density region, the biexciton density obeys the well-known square law. At an exciton density around 1.2 × 10^10 cm^−2, the biexciton density suddenly increases in a threshold-like manner. This behavior, which is realized at bath temperatures up to 8 K under an excitation power of the order of 100 mW/cm^2, results from the Bose-Einstein statistics of the exciton-biexciton system.
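As an illustrative aside (not the authors' line-shape analysis), the square law relating biexciton density to exciton density can be checked on synthetic data by fitting the slope of a log-log relation; the densities and noise level below are made up for the sketch.

```python
import numpy as np

# Synthetic exciton densities (cm^-2) in the low-density regime
n_x = np.logspace(8, 10, 20)

# Square-law biexciton density with mild multiplicative noise
rng = np.random.default_rng(0)
n_xx = 1e-10 * n_x**2 * rng.lognormal(0.0, 0.02, n_x.size)

# In log-log coordinates a square law is a straight line of slope 2
slope, intercept = np.polyfit(np.log(n_x), np.log(n_xx), 1)
print(f"fitted log-log slope: {slope:.2f}")
```

A fitted slope close to 2 over the low-density range is the signature of the square law; the threshold-like deviation reported above would appear as a break in this line at high density.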
2.
We compute the analytic expression of the probability distributions F_AEX,+ and F_AEX,− of the normalized positive and negative daily returns r(t) of the AEX (Netherlands) index. Furthermore, we define the α-rescaled AEX daily positive returns r(t)^α and negative returns (−r(t))^α, which we call, after normalization, the α positive fluctuations and α negative fluctuations. We use the Kolmogorov-Smirnov statistical test to find the values of α that optimize the data collapse of the histogram of the α fluctuations onto the Bramwell-Holdsworth-Pinton (BHP) probability density function. The optimal parameters we found are α+ = 0.46 and α− = 0.43. Since the BHP probability density function appears in several other dissimilar phenomena, our result reveals a universal feature of stock exchange markets.
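The α-scan procedure can be sketched as follows. This is a minimal illustration, not the paper's analysis: synthetic Student-t returns stand in for the AEX series, and a standard normal stands in for the BHP density (which is not available in SciPy); only the mechanics of rescaling, normalizing, and minimizing the Kolmogorov-Smirnov distance are the same.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Heavy-tailed synthetic daily returns standing in for the AEX series
r = rng.standard_t(df=4, size=5000) * 0.01
pos = r[r > 0]  # positive returns

def ks_distance(alpha, sample):
    """KS distance between the normalized alpha-rescaled fluctuations
    and a standard normal (stand-in for the BHP density)."""
    x = sample**alpha
    z = (x - x.mean()) / x.std()
    return stats.kstest(z, "norm").statistic

# Scan alpha and keep the value giving the best data collapse
alphas = np.linspace(0.2, 1.0, 41)
dists = [ks_distance(a, pos) for a in alphas]
best_alpha = alphas[int(np.argmin(dists))]
print(f"optimal alpha for positive fluctuations: {best_alpha:.2f}")
```

With the real AEX data and the BHP density as reference, this scan is what yields the reported α+ = 0.46 and α− = 0.43.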
3.
Vitalis Musara Samuel K. Fosuhene Winston T. Ireeta Lorinda Wu Andrew W.R. Leitch 《Optics Communications》2011,284(12):2690-2694
We design a polarisation mode dispersion (PMD) emulator by subdividing a 22 m long polarisation-maintaining fibre (PMF). The aim of this emulator design is to show that first-order and second-order PMD can be inversely proportional to each other. Furthermore, the emulator is used to show that the magnitude of PMD is independent of whether its statistics approach the theoretical distributions; what matters most is the degree of mode coupling. The same applies to its autocorrelation function (ACF). The PMD control mechanism of the emulator does not operate in real time.
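The role of mode coupling can be illustrated with a toy model (this is not the authors' emulator). Under strong mode coupling, the PMD vector of a concatenation of birefringent segments performs a three-dimensional random walk, one step per segment, so the mean differential group delay (DGD) grows like the square root of the number of segments; the segment delay and trial counts below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_dgd(n_segments, dtau=1.0, trials=500):
    """Mean DGD of a chain of strongly mode-coupled birefringent
    segments: the PMD vector takes one random step of length dtau
    per segment, and the DGD is the length of the resulting vector."""
    totals = []
    for _ in range(trials):
        omega = np.zeros(3)
        for _ in range(n_segments):
            step = rng.standard_normal(3)
            omega += dtau * step / np.linalg.norm(step)
        totals.append(np.linalg.norm(omega))
    return float(np.mean(totals))

# Mean DGD scales like sqrt(number of segments) under strong coupling
d25, d100 = mean_dgd(25), mean_dgd(100)
print(f"mean DGD ratio (100 vs 25 segments): {d100 / d25:.2f}")
```

Quadrupling the number of segments roughly doubles the mean DGD, which is why the degree of mode coupling, rather than the raw PMD magnitude, governs whether the statistics approach the theoretical (Maxwellian) distribution.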
4.
Multidimensional (MD) separations, especially comprehensive two-dimensional (2D) separations such as comprehensive 2D LC (LC × LC) and comprehensive 2D GC (GC × GC), are potentially powerful separation techniques. It is important to have a clear definition of MD techniques to better understand the scope and boundaries of the subject. Widely accepted definitions of MD separations have their roots in the definition proposed by Giddings, who also added several comments that clarified the scope of his definition. However, some researchers extend Giddings' definition beyond its intended scope. Doing so disqualifies such comprehensive 2D techniques as LC × LC, GC × GC and 2D TLC from being considered 2D techniques. In other instances, extended treatment of Giddings' definition is used as a basis to justify design parameters of comprehensive 2D separations even though these parameters lead to sub-optimal implementations. We believe that the shortcomings in the definition and its popular interpretations are serious enough to warrant attention, especially by those interested in designing optimal instrumentation for MD separations such as comprehensive 2D GC. After discussing the weaknesses in the currently used definitions, we propose to define an n-dimensional analysis as one that generates n-dimensional displacement information. We believe that this definition captures the spirit of Giddings' definition while avoiding the problems associated with its popular interpretations.
5.
We continue our use of "simple" energetic patterns, where simple means the use of parameters derived only from the stoichiometry of these species, in our studies of the entropy of formation (TΔ_fS°) of aqueous anions. Relationships between the entropy of formation and different parameters, such as the number of oxygen atoms, the natural logarithm of the molecular weight and the total number of atoms, are explored. The charge of the species, z−, continues to be explicitly considered; we now explore various choices of p and the use of z^p as a parameter.
6.
"Simple" energetic patterns, where simple means the use of parameters derived only from the stoichiometry of these species, are relatively rarely discussed in the literature. In addition, entropy studies have been dominated by derivation of the absolute quantity S° rather than the entropy of formation (TΔ_fS°). Relationships between the entropy of formation and different parameters (the negative value of the charge of the species, the number of oxygen atoms, the natural logarithm of the molecular weight, the total number of atoms and the number of central atoms that are gases) were recently discussed by us for aqueous polynuclear oxyanions. As shown here, hydrogen-containing anions do not follow this pattern. In this study, new approaches for the estimation of the entropy of formation of aqueous hydrogen-containing mono- and polynuclear oxyanions are suggested, evaluated and recommended.
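A stoichiometry-only correlation of the kind described in these two abstracts amounts to a multilinear fit of TΔ_fS° against the counted parameters. The sketch below exercises such a fit on synthetic data; the coefficients, value ranges and noise level are placeholders, not values from the papers.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 40  # number of hypothetical anions

# Stoichiometry-only descriptors (synthetic)
n_oxygen = rng.integers(1, 8, n).astype(float)   # number of O atoms
ln_mw = np.log(rng.uniform(40, 300, n))          # ln(molecular weight)
n_atoms = n_oxygen + rng.integers(1, 4, n)       # total number of atoms
charge = rng.integers(1, 4, n).astype(float)     # |charge| of the anion

# Synthetic "true" entropy of formation plus noise, to exercise the fit
y = (-5.0 * n_oxygen + 12.0 * ln_mw - 3.0 * n_atoms
     - 20.0 * charge + rng.normal(0.0, 1.0, n))

# Least-squares multilinear fit against the four descriptors
X = np.column_stack([n_oxygen, ln_mw, n_atoms, charge, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((y - pred)**2) / np.sum((y - y.mean())**2)
print(f"R^2 of stoichiometry-only fit: {r2:.3f}")
```

The second abstract's observation that hydrogen-containing anions "do not follow this pattern" would show up here as a visibly lower R^2 when those species are included in the same fit.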
7.
We propose the use of Doehlert's experimental design, a second-order uniform shell design, for the optimization of molecularly imprinted polymers (MIPs). We have chosen a simple model system in which the influence of the kind and degree of cross-linking on template recognition was studied using S-propranolol as the template. We found that Doehlert's design allows one, with very few experiments, to screen the evolution of the binding capacity of a MIP as a function of the different parameters, and thus appears to be a powerful means to screen for the best composition and synthesis method for MIPs. We believe that this chemometric tool can significantly accelerate the development of new MIPs as synthetic recognition elements, particularly in the context of a given application, and will be a versatile complement or alternative to first-order designs for fitting complex processes.
Electronic supplementary material: the online version of this article contains supplementary material, which is available to authorized users.
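For two factors, the Doehlert uniform shell design is a centred hexagon, i.e. only seven experiments. The sketch below builds the standard coded matrix and maps it to real factor levels; the factor names and centre/step values (cross-linker fraction and monomer ratio) are hypothetical, not the paper's settings.

```python
import numpy as np

# Coded two-factor Doehlert matrix: a centre point plus the six
# vertices of a regular hexagon of radius 1 in coded units.
doehlert2 = np.array([
    [ 0.0,  0.0],
    [ 1.0,  0.0],
    [ 0.5,  np.sqrt(3) / 2],
    [-1.0,  0.0],
    [-0.5, -np.sqrt(3) / 2],
    [ 0.5, -np.sqrt(3) / 2],
    [-0.5,  np.sqrt(3) / 2],
])

def to_real_levels(coded, centre, step):
    """Map coded levels to real factor values: centre + coded * step."""
    return np.asarray(centre) + coded * np.asarray(step)

# Hypothetical factors: cross-linker fraction (%) and monomer ratio
plan = to_real_levels(doehlert2, centre=[50.0, 1.0], step=[25.0, 0.5])
print(plan)
```

Seven runs suffice to fit a full second-order model in two factors, which is why the abstract emphasizes screening "with very few experiments".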
8.
《Journal of computational science》2014,5(2):126-134
Traditional debuggers are of limited value for modern scientific codes that manipulate large, complex data structures. Current parallel machines make this even more complicated, because a data structure may be distributed across processors, making it difficult to view, interpret and validate its contents. Therefore, many application developers resort to placing validation code directly in the source program. This paper discusses a novel debug-time assertion, called a "statistical assertion", that uses extracted statistics instead of raw data to reason about large data structures and thereby helps locate coding defects. We present the design and implementation of an extendable statistical framework that executes the assertions in parallel by exploiting the underlying parallel system. We illustrate the debugging technique with a molecular dynamics simulation. The performance is evaluated on a 20,000-processor Cray XE6 to show that the technique is useful for real-time debugging.
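The core idea, asserting on a summary statistic rather than on raw elements, can be sketched in a few lines. The paper's framework evaluates the statistics in parallel across processors; this single-process sketch with made-up velocity data only shows the shape of such an assertion.

```python
import numpy as np

def statistical_assert(array, stat, predicate, message=""):
    """Debug-time assertion on a summary statistic of a large array,
    instead of on its raw elements."""
    value = stat(array)
    assert predicate(value), f"statistical assertion failed: {message} (got {value})"
    return value

# Example: particle velocities in a molecular-dynamics-style array
rng = np.random.default_rng(3)
velocities = rng.normal(0.0, 1.0, size=1_000_000)

# Assert on the mean rather than inspecting a million raw values
mean_v = statistical_assert(velocities, np.mean,
                            lambda m: abs(m) < 0.01,
                            "drift in mean velocity")
print(f"mean velocity within tolerance: {mean_v:.5f}")
```

A distributed implementation would compute partial sums per processor and reduce them before applying the predicate, which is what makes the approach viable at 20,000-processor scale.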
9.
10.
This paper presents a comprehensive review of the work done, during the period 1968–2005, in the application of statistical and intelligent techniques to solve the bankruptcy prediction problem faced by banks and firms. The review is categorized by taking the type of technique applied to solve this problem as an important dimension. Accordingly, the papers are grouped into the following families of techniques: (i) statistical techniques, (ii) neural networks, (iii) case-based reasoning, (iv) decision trees, (v) operational research, (vi) evolutionary approaches, (vii) rough set based techniques, (viii) other techniques subsuming fuzzy logic, support vector machines and isotonic separation, and (ix) soft computing, subsuming seamless hybridization of all the above-mentioned techniques. Of particular significance is that, for each paper, the review highlights the source of the data sets, the financial ratios used, the country of origin, the time line of the study and the comparative performance of the techniques in terms of prediction accuracy wherever available. The review also lists some important directions for future research.