1.
We analyze a variant of the EPRB experiment within a framework for quantum mechanics that rests on a radical interpretation of the sum over histories. Within this framework, reality is (just as classically) a single history, e.g. a definite collection of particles undergoing definite motions; and quantum dynamics appears as a kind of stochastic law of motion for that history, a law formulated in terms of non-classical probability-amplitudes. No state vectors enter this framework, and their attendant nonlocality is therefore absent as well.

Notes:
1. That is, before pair creations and annihilations were discovered. (The electronic and nuclear spins might also be regarded as new aspects of their kinematics. But perhaps spin is better construed, within the sum-over-histories framework, as a quality of a more dynamical character, namely as a generalized sort of probability-amplitude.)
2. A possible escape would be the so-called Everett interpretation, in which the collapse never occurs, but its effects are supposed to be recovered via a more careful analysis of closed systems in which measurement-like processes take place. Among other things, this approach tends to lead either to the view that nothing really happens [1] or to the view that everything really happens [2] (which perhaps is not that different from the former view).
3. For example, the rule that collapse occurs along the past light cone (in the Heisenberg picture) appears to be consistent.
4. And Bell's inequality shows that any theory formulated in terms of an instantaneous state evolving in time would encounter the same trouble. Indeed, the trouble shows up even more glaringly if one adapts Bell's argument to spin-1 systems, using the results of Kochen and Specker [10]. In order to use the Kochen-Specker results in the EPR manner one needs a scheme for measuring the relevant observables, but this can be accomplished by means of suitably concatenated Stern-Gerlach analyzers with recombining beams [13]. Then, as Allen Stairs has pointed out [14], even the perfect correlations become impossible to reproduce, and no reference to probability theory is needed to establish a contradiction with locality. Recently, an analogous experiment using three spin-1/2 particles instead of two spin-1 particles has also been given [15].
5. No technical problem obstructs an extension to fermionic fields (indeed the functional-integral formalism for quantum field theory is probably the most popular at present), but the realistic interpretation of the individual histories seems to get lost. One way out would be if all fermions were composites or collective excitations of fields quantized according to bosonic commutation relations. Another would be if the particle formulation were taken as basic, with the complementary field formulation being merely a mathematical artifice (at least for fermions).
6. In the approach of Gell-Mann-Hartle and Griffiths, for example, only a small subset of the possible partitions is granted meaning, in such a way that all interference terms are suppressed and quantum probabilities reduce to classical ones.
7. In stating these rules we consider an idealized situation in which the spatio-temporal indeterminacy of particle location within a given one of our trajectories is ignored; or if you prefer, you can take the experiment as only a Gedanken one affording a simplified illustration of how EPR-like correlations are understood within the sum-over-histories framework. In this connection recall also that the semiclassical propagator is in fact exact for a free particle.
8. This can be interpreted either as part of the specification of the initial conditions, or (as suggested by a referee) merely as an example of relativization of probabilities.
9. Thus a state vector may be defined as an equivalence class of sets of partial histories.
10. One such generalization applies to open systems, for example to a particle in contact with a heat reservoir. For this example see [11], wherein the two-way path formalism of §5 above is used, and the influence of the reservoir results in an effective dynamics for the particle in which the forward and backward portions of its world-line are coupled to each other by a certain interaction term in the amplitude. In this type of situation a density-operator (though not a state vector) can still be introduced, but it no longer summarizes all the relevant information about the past (and correspondingly its evolution lacks the Markov property that ρ(t + dt) is determined by ρ(t) alone). For quantum gravity, it may be that not even such a non-Markovian density-operator will be exactly definable, and only the global probabilities themselves will make sense.
11. Ironically, it is just this property of the amplitudes which, as mentioned above, makes possible the introduction of the state vectors whose collapse then introduces such a strong appearance of nonlocality into the theory.
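The nonlocality trouble mentioned in note 4 can be made concrete with a short numerical check: the quantum singlet correlation E(a, b) = -cos(a - b) violates the CHSH form of Bell's inequality, whose local-realist bound is |S| ≤ 2. This sketch is an illustration of that standard result, not part of the paper's formalism.

```python
import math

def singlet_correlation(a, b):
    """Quantum spin correlation E(a, b) = -cos(a - b) for
    measurements along angles a and b on a singlet pair."""
    return -math.cos(a - b)

def chsh(a, a2, b, b2):
    """CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b')."""
    return (singlet_correlation(a, b) - singlet_correlation(a, b2)
            + singlet_correlation(a2, b) + singlet_correlation(a2, b2))

# Standard angle choices that maximize the quantum violation:
S = chsh(0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
# |S| reaches 2*sqrt(2) ≈ 2.828, exceeding the local-realist bound of 2.
```

Any theory reproducing these correlations with an instantaneous evolving state runs into exactly the trouble the note describes.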
2.
The cuttlefish belongs to the mollusk class Cephalopoda, considered the most advanced marine invertebrates and thus widely used as models to study the biology of complex behaviors and cognition, as well as their related neurochemical mechanisms. Surprisingly, methods to quantify the biogenic monoamines and their metabolites in cuttlefish brain remain sparse and measure a limited number of analytes. This work aims to validate an HPLC‐ECD method for the simultaneous quantification of dopamine, serotonin, norepinephrine and their main metabolites in cuttlefish brain. For comparison, and in order to develop a method suitable for answering both ecological and biomedical questions, the validation was also carried out on a phylogenetically remote species: the mouse (mammals). The method was shown to be accurate, precise, selective, repeatable and sensitive over a wide range of concentrations for 5‐hydroxyindole‐3‐acetic acid, serotonin, dopamine, 3,4‐dihydroxyphenylacetic acid and norepinephrine in both cuttlefish and mouse brain extracts, though with low precision and recovery for 4‐hydroxy‐3‐methoxyphenylethylene glycol. Homovanillic acid, accurately studied in rodents, was not detectable in the brain of cuttlefish. Overall, we describe here the first fully validated HPLC method for the routine measurement of both monoamines and metabolites in cuttlefish brain. Copyright © 2016 John Wiley & Sons, Ltd.
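At the heart of any HPLC quantification method like the one validated here is a linear calibration step: fit detector response (peak area) against standard concentrations, then back-calculate the analyte concentration in an unknown extract. The sketch below illustrates that step only; the analyte name, concentrations and peak areas are hypothetical, not data from the study.

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = slope * x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Hypothetical dopamine standards (ng/mL) and detector responses (peak area)
conc = [5.0, 10.0, 25.0, 50.0, 100.0]
area = [12.1, 24.3, 60.2, 121.0, 240.5]

slope, intercept = fit_line(conc, area)

# Back-calculate an unknown sample's concentration from its peak area:
unknown_area = 85.0
estimated_conc = (unknown_area - intercept) / slope
```

Validation criteria such as accuracy, precision and recovery are then assessed by running spiked samples through the same calibration.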
3.
A visible-light-excitable rhodamine B derivative (TARDHD) has been developed for fluorescence and naked-eye detection of histidine in aqueous medium. TARDHD shows a 45-fold fluorescence enhancement in the presence of histidine. It forms a Schiff base with histidine and is stabilized via intramolecular H-bonding. TARDHD can efficiently detect intracellular histidine.
4.
Heterogeneous photocatalysis has been extensively investigated for the degradation of organic pollutants in wastewater. The remarkable advantages of the heterogeneous photocatalysis process depend upon its ability to produce reactive oxygen species under visible/UV/solar light irradiation. However, the long-term stability and reuse potential of these catalysts are of great concern these days, yet understudied. This review aims to systematically present a state-of-the-art understanding of such catalysts' reuse potential. Various important surface characteristics of the photocatalysts for improving the photostability and activity of the catalyst are discussed. In addition, the synergistic effects of different surface-modified materials and composite materials, and the surface characteristics responsible for their enhanced activity, are also covered. Finally, a discussion of the various regeneration processes used for such catalysts is presented, identifying some vital research needs in this field.
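Photocatalytic degradation of dilute organic pollutants, as studied in the works this review surveys, is commonly modelled with pseudo-first-order kinetics, C(t) = C0·exp(-k_app·t), and catalyst reuse is judged partly by how k_app holds up across cycles. This sketch estimates k_app from the linearized form ln(C0/C) = k_app·t; the concentration-time data are hypothetical, for illustration only.

```python
import math

times = [0, 15, 30, 45, 60]            # irradiation time, min (hypothetical)
concs = [20.0, 14.8, 11.0, 8.1, 6.0]   # pollutant concentration, mg/L

# Linearize: y = ln(C0 / C), then fit the slope through the origin,
# which is the apparent pseudo-first-order rate constant k_app.
y = [math.log(concs[0] / c) for c in concs]
k_app = sum(t * yi for t, yi in zip(times, y)) / sum(t * t for t in times)

# Time needed to degrade half the pollutant under these conditions:
half_life = math.log(2) / k_app
```

A drop in k_app between the first and later reuse cycles is one simple quantitative signal of catalyst deactivation.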
5.
6.
The financial crisis began with the collapse of Lehman Brothers and the subprime asset-backed securities debacle. Credit risk turned into liquidity risk, resulting in a lack of confidence among financial institutions. In this article, we propose a way to model liquidity risk and credit risk according to best practices. We show that liquidity risk is a new type of risk and that the current way of dealing with it is based solely on observed variables, without any theoretical link. We propose, for the first time, a heuristic approach that combines the numerous liquidity risk indicators with a logistic regression. Regarding credit risk, several articles show that best practice is to use an option model to assess this risk. We present our methodology using a stochastic diffusion for the interest rate, because the yield curves are currently not liquid. This approach is more relevant because the basic model in prior publications assumes a constant interest rate or a forward rate. Both models allow a better understanding of liquidity and credit risks, and further research will address the link between these two financial risks.
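The article does not specify which stochastic diffusion it uses for the interest rate; a minimal sketch of the general idea, using a Vasicek-type mean-reverting process as one illustrative choice with hypothetical parameters, looks like this:

```python
import math
import random

def simulate_vasicek(r0, kappa, theta, sigma, T, steps, seed=0):
    """Simulate a mean-reverting short rate:
    dr = kappa * (theta - r) dt + sigma dW  (Euler discretization)."""
    rng = random.Random(seed)
    dt = T / steps
    r = r0
    path = [r]
    for _ in range(steps):
        r += kappa * (theta - r) * dt + sigma * math.sqrt(dt) * rng.gauss(0, 1)
        path.append(r)
    return path

# Hypothetical parameters: start at 1%, revert toward a 3% long-run mean.
path = simulate_vasicek(r0=0.01, kappa=0.8, theta=0.03, sigma=0.005,
                        T=5.0, steps=1000)
```

Unlike a constant or forward rate, the simulated short rate fluctuates while drifting toward its long-run mean, which is the property the authors argue matters when yield curves are illiquid.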
7.
8.

Background

The new REACH legislation requires the assessment of a large number of chemicals on the European market for several endpoints. Developmental toxicity is one of the most difficult endpoints to assess, on account of the complexity, length and costs of the experiments. Following the encouragement of QSAR (in silico) methods provided in REACH itself, the CAESAR project has developed several models.

Results

Two QSAR models for developmental toxicity have been developed, using different statistical/mathematical methods. Both models performed well. The first makes a classification based on a random forest algorithm, while the second is based on an adaptive fuzzy partition algorithm. The first model has been implemented and inserted into the CAESAR on-line application, Java-based software that allows everyone to use the models freely.
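A minimal sketch (not the CAESAR implementation) of the first model's approach, a random-forest classifier over molecular descriptors: the descriptor values, their meaning, and the labels below are synthetic placeholders, and scikit-learn stands in for whatever library the project used.

```python
from sklearn.ensemble import RandomForestClassifier

# Each row: two hypothetical molecular descriptors per chemical
# (e.g. a lipophilicity measure and a scaled molecular weight).
X = [[1.2, 1.8], [0.9, 2.1], [1.1, 1.9],   # non-toxicants (synthetic)
     [3.5, 4.0], [3.8, 4.2], [3.6, 3.9]]   # toxicants (synthetic)
y = [0, 0, 0, 1, 1, 1]                     # 1 = developmental toxicant

clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(X, y)

# Classify two new (synthetic) chemicals:
pred = clf.predict([[1.0, 2.0], [3.7, 4.1]])
```

In a real QSAR workflow the descriptors number in the dozens to hundreds and the model is validated on held-out chemicals, as the abstract's "performed well" implies.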

Conclusions

The CAESAR QSAR models have been developed with the aim of minimizing false negatives in order to make them more usable for REACH. The CAESAR on-line application ensures that both industry and regulators can easily access and use the developmental toxicity model (as well as the models for the other four endpoints).
9.

Background

The new European Regulation on chemical safety, REACH (Registration, Evaluation, Authorisation and Restriction of CHemical substances), is in the process of being implemented. Many chemicals used in industry require additional testing to comply with the REACH regulations. At the same time, EU member states are attempting to reduce the number of animals used in experiments under the 3Rs policy (refining, reducing and replacing the use of animals in laboratory procedures). Computational techniques such as QSAR have the potential to offer an alternative for generating REACH data. The FP6 project CAESAR aimed to develop QSAR models for five key toxicological endpoints, of which skin sensitisation was one.

Results

This paper reports the development of two global QSAR models using two different computational approaches, both of which contribute to the hybrid model freely available online.

Conclusions

The QSAR models for assessing skin sensitisation have been developed and tested under stringent quality criteria to fulfil the principles laid down by the OECD. The final models, accessible from the CAESAR website, offer a robust and reliable method of assessing skin sensitisation for regulatory use.
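The abstract does not say how the two global models are merged into the hybrid; one simple, conservative merging rule consistent with CAESAR's stated goal of minimizing false negatives is to flag a chemical as a sensitiser whenever either model does. The rule, thresholds and probabilities below are hypothetical illustrations, not the CAESAR method.

```python
def hybrid_prediction(p_model_a, p_model_b, threshold=0.5):
    """Conservative merge: flag as a sensitiser if EITHER model's
    predicted probability crosses the threshold (hypothetical rule)."""
    return p_model_a >= threshold or p_model_b >= threshold

# Hypothetical per-chemical probabilities from the two models:
chemicals = {
    "chem_1": (0.2, 0.7),   # models disagree -> flagged (conservative)
    "chem_2": (0.1, 0.1),   # both low        -> not flagged
    "chem_3": (0.8, 0.6),   # both high       -> flagged
}
flags = {name: hybrid_prediction(pa, pb)
         for name, (pa, pb) in chemicals.items()}
```

An OR-style merge trades extra false positives for fewer false negatives, which is the preference a regulatory screening context typically dictates.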
10.