The SOLEIL synchrotron radiation source is regularly operated in special filling modes dedicated to pump–probe experiments. Among these, the low-α operation mode is characterized by a shorter pulse duration and represents a natural bridge between 50 ps synchrotron pulses and femtosecond experiments. Here, the capabilities in low-α mode of the experimental set-ups developed at the TEMPO beamline to perform pump–probe experiments with soft X-rays, based on photoelectron or photon detection, are presented. A 282 kHz repetition-rate femtosecond laser is synchronized with the synchrotron radiation time structure to induce fast electronic and/or magnetic excitations. Detection is performed using a detector providing two-dimensional spatial resolution plus time resolution, based on microchannel plates equipped with a delay line. Results of time-resolved photoelectron spectroscopy, circular dichroism and magnetic scattering experiments are reported, and their respective advantages and limitations in the framework of high-time-resolution pump–probe experiments are compared and discussed.
Quantum interference, manifest in the two-slit experiment, lies at the heart of several quantum computational speed-ups and provides a striking example of a quantum phenomenon with no classical counterpart. An intriguing feature of quantum interference arises in a variant of the standard two-slit experiment in which there are three, rather than two, slits. The interference pattern in this set-up can be written in terms of the two- and one-slit patterns obtained by blocking one, or more, of the slits. This is in stark contrast with the standard two-slit experiment, where the interference pattern cannot be written as a sum of the one-slit patterns. This was first noted by Rafael Sorkin, who raised the question of why quantum theory only exhibits irreducible interference in the two-slit experiment. One approach to this problem is to compare the predictions of quantum theory to those of operationally-defined ‘foil’ theories, in the hope of determining whether theories that do exhibit higher-order interference suffer from pathological, or at least undesirable, features. In this paper two proposed extensions of quantum theory are considered: the theory of Density Cubes proposed by Dakić, Paterek and Brukner, which has been shown to exhibit irreducible interference in the three-slit set-up, and the Quartic Quantum Theory of Życzkowski. The theory of Density Cubes will be shown to provide an advantage over quantum theory in a certain computational task and to possess a well-defined mechanism which leads to the emergence of quantum theory, analogous to the emergence of classical physics from quantum theory via decoherence. Despite this, the axioms used to define Density Cubes will be shown to be insufficient to uniquely characterise the theory. In comparison, Quartic Quantum Theory is a well-defined theory, and we demonstrate that it exhibits irreducible interference to all orders.
This feature of Życzkowski’s theory is argued not to be a genuine phenomenon, but to arise from an ambiguity in the current definition of higher-order interference in operationally-defined theories. Thus, to begin to understand why quantum theory is limited to a certain kind of interference, a new definition of higher-order interference is needed that is applicable to, and makes good operational sense in, arbitrary operationally-defined theories.
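The irreducibility discussed in the abstract above has a compact algebraic form: Sorkin's second-order interference term for three slits vanishes identically for quantum amplitudes, while the first-order (two-slit) term generally does not. A minimal sketch, using complex amplitudes as stand-ins for the slit contributions at a fixed detector point (function names are illustrative, not from the paper):

```python
def pattern(*amps):
    # intensity at a detector point with the given subset of slits open
    return abs(sum(amps)) ** 2

def sorkin_I2(a1, a2):
    # two-slit interference term: generally non-zero in quantum theory
    return pattern(a1, a2) - pattern(a1) - pattern(a2)

def sorkin_I3(a1, a2, a3):
    # three-slit term: identically zero for quantum amplitudes, since the
    # cross terms of |a1+a2+a3|^2 are exactly cancelled by the pairwise patterns
    return (pattern(a1, a2, a3)
            - pattern(a1, a2) - pattern(a1, a3) - pattern(a2, a3)
            + pattern(a1) + pattern(a2) + pattern(a3))
```

For any choice of complex amplitudes, I3 vanishes to machine precision while I2 does not; this asymmetry between the two- and three-slit cases is precisely the observation attributed to Sorkin above.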
One of the problems when conducting research on mathematical programming models for operations planning is having an adequate database of experiments that can be used to verify advances and developments, with enough factors to understand their different consequences. This paper presents a test bed generator and an instance database for rolling-horizon analysis of multi-echelon, multi-product planning with alternative processes, multiple strokes and multiple capacities under different stochastic demand patterns, to be used with a stroke-based bill of materials considering production, setup, storage and delay costs for operations management. From the analysis of the operations plans obtained from this test bed, it is concluded that a product structure with an alternative process obtains the lowest total cost and the highest service level. In addition, decreasing seasonal demand can present a lower total cost than constant demand, but generates a worse service level. This test bed will allow researchers to investigate further and verify improvements in forecasting methods, rolling-horizon parameters, software employed, etc.
Organisations are concerned about measuring the performance of the products/services they deliver to their customers. In all types of organisations, if a proper performance assessment is to be developed, it should be measured along different dimensions. At university, new study programs include the development and assessment of transversal competences because of their importance in enhancing the abilities and improving the employability of students. The achievement of transversal competences can be assessed at different levels/stages: for example, the 1st and 2nd years of a Bachelor’s degree, the 3rd and 4th years of a Bachelor’s degree, and Master’s level. The purpose of this paper is to integrate research into performance management in organisations to develop an approach consisting of four components (a methodology, a solid and integrated performance management framework, graphical diagrams and quantitative techniques) to assess and manage the achievement of transversal competences through the different levels of study using a consolidated approach. The proposal uses the Analytic Network Process (ANP) to model dependences and feedback among the elements of the competences.
Computational Optimization and Applications - The context of this research is multiobjective optimization where conflicting objectives are present. In this work, these objectives are only available...
Tomographic diffractive microscopy is a technique that permits imaging transparent living specimens in three dimensions without staining. It is commonly implemented in two configurations: either rotating the illumination while keeping the specimen fixed, or rotating the sample under fixed illumination. Under the first-order Born approximation, the volume of the frequency domain that can be mapped with the rotating-illumination method has the shape of a “doughnut”, which exhibits a so-called “missing cone” of non-captured frequencies, responsible for the strong resolution anisotropy characteristic of transmission microscopes. When rotating the sample, the resolution is almost isotropic, but the set of captured frequencies still exhibits a missing part, the shape of which resembles that of an apple core. Furthermore, its maximal extension is reduced compared to tomography with rotating illumination. We propose various configurations for tomographic diffractive microscopy which combine both approaches and aim at obtaining a high and isotropic resolution. We illustrate with simulations the expected imaging performance of these configurations.
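The captured frequency supports described above can be explored numerically. Below is a minimal 2-D sketch (names are illustrative; in a flat 2-D cut the 3-D "apple core" itself does not appear, but the missing cone of the rotating-illumination scheme and the reduced maximal extension, 2k0·sin(α/2) versus 2k0·sin α, of the rotating-sample scheme are both visible), assuming a symmetric angular aperture α for both illumination and detection:

```python
import math

def captured_frequencies(mode, alpha=math.pi / 3, n=400, k0=1.0):
    """2-D sketch of the scattered-frequency support q = k0*(s_d - s_i)
    under the first Born approximation.
    'illumination': illumination and detection angles both span [-alpha, alpha]
    about the optical axis (rotating illumination, specimen fixed).
    'sample': illumination fixed on-axis; rotating the specimen rotates the
    captured arc of the Ewald circle over the full angular range."""
    thetas = [-alpha + 2 * alpha * i / (n - 1) for i in range(n)]
    qs = []
    if mode == "illumination":
        for ti in thetas:
            for td in thetas:
                qs.append((k0 * (math.sin(td) - math.sin(ti)),
                           k0 * (math.cos(td) - math.cos(ti))))
    else:  # rotating sample: arc through the origin, swept by rotation
        for td in thetas:
            qx, qz = k0 * math.sin(td), k0 * (math.cos(td) - 1.0)
            for j in range(n):
                phi = 2 * math.pi * j / n
                qs.append((qx * math.cos(phi) - qz * math.sin(phi),
                           qx * math.sin(phi) + qz * math.cos(phi)))
    return qs
```

With α = 60°, the rotating-illumination support reaches |q| = 2k0·sin α = √3·k0 but contains no purely axial frequencies (the missing cone), while the rotating-sample support is isotropic but only extends to 2k0·sin(α/2) = k0, consistent with the reduced maximal extension noted above.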
We explain the (non-)validity of close-to-equilibrium entropy production principles in the context of linear electrical circuits.
Both the minimum and the maximum entropy production principles are understood within dynamical fluctuation theory. The starting
point is a set of Langevin equations obtained by combining Kirchhoff’s laws with Johnson–Nyquist noise at each dissipative element
in the circuit. The main observation is that the fluctuation functional for time averages, which can be read off from the path-space
action, is, to first order around equilibrium, given by an entropy production rate.
This allows one to understand, beyond the schemes of irreversible thermodynamics, (1) the validity of the least-dissipation,
minimum entropy production and maximum entropy production principles close to equilibrium; (2) the role of the observables’
parity under time-reversal and, in particular, the origin of Landauer’s counterexample (1975) in the fact that the fluctuating
observable there is odd under time-reversal; (3) the critical remark of Jaynes (1980) concerning the apparent inappropriateness
of entropy production principles in temperature-inhomogeneous circuits.
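The Langevin description sketched above can be illustrated on the simplest case, a single RC element: C dV/dt = −V/R + η(t), with Johnson–Nyquist noise ⟨η(t)η(t′)⟩ = (2kT/R)δ(t − t′), whose stationary voltage variance is kT/C by equipartition. A minimal Euler–Maruyama sketch, in units where kT = R = C = 1 (names are illustrative, not from the paper):

```python
import math
import random

def simulate_rc_variance(R=1.0, C=1.0, kT=1.0, dt=0.02, n_steps=500_000, seed=0):
    """Euler-Maruyama integration of the RC-circuit Langevin equation
    C dV/dt = -V/R + eta(t), with Johnson-Nyquist noise of strength 2kT/R.
    Returns the time-averaged V^2, which should approach kT/C."""
    rng = random.Random(seed)
    tau = R * C                                     # relaxation time
    sigma = math.sqrt(2.0 * kT * dt / (R * C ** 2))  # std of the noise increment
    V, acc, count = 0.0, 0.0, 0
    for step in range(n_steps):
        V += -V / tau * dt + sigma * rng.gauss(0.0, 1.0)
        if step > n_steps // 10:                    # discard burn-in
            acc += V * V
            count += 1
    return acc / count
```

The time-averaged V² converges to the equipartition value kT/C up to statistical error and an O(dt) discretization bias; the fluctuations of such time averages around their stationary value are exactly the objects the fluctuation functional above describes.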
In this work we report some new well-defined carbon nanostructures produced by direct chlorination of metallocenes (ferrocene and cobaltocene) and NbC at temperatures from 100 to 900 °C. Amorphous carbon nanotubes with dimensions that vary with reaction temperature were produced from ferrocene. When cobaltocene is the carbon precursor, the main products are solid amorphous nanospheres. The highly refractory metal carbide NbC as carbon source favours the growth of nanospherical cabbage-like particles with a higher degree of graphene-sheet ordering. In addition, NbC crystallites encapsulated in an amorphous carbon shell were also found at lower temperatures (T ≤ 700 °C).
The core–shell structure of a range of acrylic–acrylic latexes has been investigated by combining different specimen preparation methods with transmission electron microscopy (TEM), dark-field scanning transmission electron microscopy (DSTEM) and low-voltage scanning electron microscopy (LV-SEM), including the first reported use of LV-SEM to observe composite latex particles at ambient and subambient temperatures. Spin-coating of liquid latex dispersions directly onto TEM grids or SEM stubs is shown to be a relatively straightforward means of avoiding film formation during specimen preparation. In conjunction with double-staining techniques, it has been found to be particularly convenient for characterizing the fine structure of particles with diameters down to below 100 nm.
Extensive objective energy-based parameters have been measured in 12 Mudejar-Gothic churches in the south of Spain. Measurements took place in unoccupied churches according to the ISO 3382 standard. Monaural objective measures in the 125–4000 Hz frequency range, together with their spatial distributions, were obtained. The acoustic parameters clarity (C80), definition (D50), sound strength (G) and center time (Ts) were deduced from impulse response analysis using a maximum-length-sequence measurement system in each church. These parameters, spectrally averaged according to the most widespread criteria used for auditoria to assess acoustic quality, were studied as a function of source–receiver distance. The experimental results were compared with predictions given by classical and other existing theoretical models proposed for concert halls and churches. An analytical semi-empirical model based on the measured values of the C80 parameter is proposed in this work for these spaces. The good agreement between predicted values and experimental data for definition, sound strength and center time in the churches analyzed shows that the model can be used for design predictions and other purposes with reasonable accuracy.
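The energy-based parameters above have standard integral definitions over the squared impulse response: C80 is 10·log10 of the ratio of energy arriving before and after 80 ms, D50 the fraction of total energy within the first 50 ms, and Ts the temporal centroid of the energy. A minimal sketch computing them from a sampled impulse response (names are illustrative; octave-band filtering and the spectral averaging used in the paper are omitted):

```python
import math

def clarity_definition_ts(h, fs):
    """Energy-based room-acoustic parameters from an impulse response h
    sampled at fs Hz: clarity C80 (dB), definition D50 (ratio),
    center time Ts (s). Assumes h starts at the direct-sound arrival."""
    e = [x * x for x in h]                 # instantaneous energy
    total = sum(e)
    i80 = int(0.080 * fs)                  # sample index of the 80 ms boundary
    i50 = int(0.050 * fs)                  # sample index of the 50 ms boundary
    early80 = sum(e[:i80])
    c80 = 10.0 * math.log10(early80 / (total - early80))
    d50 = sum(e[:i50]) / total
    ts = sum(i / fs * ei for i, ei in enumerate(e)) / total
    return c80, d50, ts
```

For an idealized exponential decay h(t) = exp(−t/τ), these reduce to closed forms (for example Ts = τ/2), which makes the sketch easy to sanity-check before applying it to measured responses.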