Similar Documents
20 similar documents found
1.
Particle breakage can be characterised as attrition, chipping, fracture, abrasion and wear. All these types of breakage mechanisms are the effect of the damage caused to these particles. These mechanisms can be differentiated not just on the basis of the magnitude and direction of the force but also by the damage caused to the particles. The damage is measured by the change in the size distribution and the change in shape of the particles. In the current research, experiments were performed on the newly developed Repeated Impact Test. The unique feature of this test is that about a hundred particles can be subjected simultaneously to a monitored number of impacts, without particle‐particle interactions, at regulated velocities. The preliminary experiments were performed with single crystalline particles of different shapes and sizes. After a fixed number of impacts, images of the particles were taken. The volume and shape of the particles were determined by image analysis. It was observed that the rate of attrition was very high when the particles were irregular. The rate decreased as the particles became more spherical.

2.
Particle science and technology evolve toward ever increasing complexity with respect to the multidimensional particle properties of size, shape, surface, internal structure, and composition. In this study, the theoretical background is elaborated for multidimensional particle size distributions (PSDs) by transferring the concepts known from 1D size distributions to anisotropic particles comprising at least two different length dimensions, e.g., nanorods and platelets. After introducing 2D PSDs, the calculation of differently weighted probability density functions including their interconversion is presented. This is necessary in order to compare data resulting from different measurement techniques which probe different physical properties and thus provide differently weighted PSDs. In addition, it is shown how 1D distributions with reduced content of information can be deduced from 2D PSDs. As a proof‐of‐concept and for illustration purposes, this approach is applied to a 2D Gaussian size distribution. Furthermore, a generalized scheme is suggested which outlines the conversion of number, surface, and volume weighted densities within the 2D space. The application of these methods to the more general n‐dimensional case is straightforward.
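To illustrate the weighting conversion described above, here is a minimal sketch (not the paper's code; the grid, the Gaussian parameters, and the cylinder-volume weighting for rod-like particles are assumptions) that turns a number-weighted 2D density q0(L, D) into a volume-weighted density q3(L, D):

```python
import numpy as np

# hypothetical grid of rod length L and diameter D (both in nm)
L_ax = np.linspace(10, 200, 100)       # rod length axis
D_ax = np.linspace(2, 40, 80)          # rod diameter axis
LL, DD = np.meshgrid(L_ax, D_ax, indexing="ij")

# assumed number-weighted 2D Gaussian density (cf. the 2D Gaussian illustration case)
q0 = np.exp(-((LL - 80.0)**2 / (2 * 30.0**2) + (DD - 15.0)**2 / (2 * 5.0**2)))
q0 /= np.trapz(np.trapz(q0, D_ax, axis=1), L_ax)   # normalize to unit area

V = np.pi / 4.0 * DD**2 * LL                       # cylinder volume as weighting property
q3 = q0 * V
q3 /= np.trapz(np.trapz(q3, D_ax, axis=1), L_ax)   # volume-weighted 2D density
```

Other weightings (e.g., surface weighted) follow the same pattern with a different weighting property in place of the volume.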

3.
This paper presents an experimental study of the coordination number of ternary mixtures of particles of sizes 24.4 : 11.6 : 6.4 (mm) by the use of the liquid bridge technique. It generates detailed information about the distributed coordination numbers corresponding to different types of contacts between small, medium and large components. The analysis is focused on the mean coordination numbers corresponding to these contacts. The results indicate that these partial mean coordination numbers vary with the volume fractions of the components while the overall mean coordination number is essentially a constant and independent of particle size distribution.

4.
Recently, a new class of approximating coarse-grained stochastic processes and associated Monte Carlo algorithms was derived directly from microscopic stochastic lattice models for the adsorption/desorption and diffusion of interacting particles (12, 13, 15). The resulting hierarchy of stochastic processes is ordered by the level of coarsening in the space/time dimensions and describes mesoscopic scales while retaining a significant amount of microscopic detail on intermolecular forces and particle fluctuations. Here we rigorously compute, in terms of specific relative entropy, the information loss between the exact non-equilibrium and the approximating coarse-grained adsorption/desorption lattice dynamics. Our result is an error estimate analogous to rigorous error estimates for finite element/finite difference approximations of partial differential equations. We prove this error to be small as long as the level of coarsening is small compared to the range of interaction of the microscopic model. This result gives a first mathematical justification of the parameter regimes for which approximating coarse-grained Monte Carlo algorithms are expected to give errors within a given tolerance. MSC (2000) subject classifications: 82C80; 60J22; 94A17
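As a reminder of the quantity involved (the general definition only, not the paper's precise estimate), the information loss between the exact measure P and the coarse-grained approximation Q is measured by the relative entropy

```latex
\mathcal{R}\left(P \,\middle\|\, Q\right) \;=\; \mathbb{E}_{P}\!\left[\log \frac{dP}{dQ}\right] \;\ge\; 0 ,
```

and the estimate cited above bounds this quantity per particle by a term that becomes small when the coarse-graining length is small compared with the interaction range of the microscopic model.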

5.
Standard ensemble or particle filtering schemes do not properly represent states of low prior probability when the number of available samples is too small, as is often the case in practical applications. We introduce here a set of parametric resampling methods to solve this problem. Motivated by a general H-theorem for relative entropy, we construct parametric models for the filter distributions as maximum-entropy/minimum-information models consistent with moments of the particle ensemble. When the prior distributions are modeled as mixtures of Gaussians, our method naturally generalizes the ensemble Kalman filter to systems with highly non-Gaussian statistics. We apply the new particle filters presented here to two simple test cases: a one-dimensional diffusion process in a double-well potential and the three-dimensional chaotic dynamical system of Lorenz.
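A minimal sketch of the parametric-resampling idea (single-Gaussian moment matching only; the paper's method uses maximum-entropy models and mixtures of Gaussians, and the function below is purely illustrative):

```python
import numpy as np

def gaussian_resample(particles, weights, n_out, rng=np.random.default_rng(0)):
    """particles: (N, d) array; weights: (N,) importance weights summing to 1."""
    mean = weights @ particles                             # weighted ensemble mean
    centered = particles - mean
    cov = (centered * weights[:, None]).T @ centered       # weighted ensemble covariance
    return rng.multivariate_normal(mean, cov, size=n_out)  # fresh, equally weighted ensemble

# example: collapse a degenerate 100-member ensemble back to 100 fresh samples
ens = np.random.default_rng(1).normal(size=(100, 3))
w = np.full(100, 1.0 / 100)
new_ens = gaussian_resample(ens, w, n_out=100)
```

Fitting a parametric model to the ensemble moments and drawing new samples from it is what allows low-probability states to be repopulated even when the original sample size is small.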

6.
7.
Particle tracking is performed using a combination of dark field or fluorescence video microscopy with automatic image analysis. The optical detection together with the image analysis software allows for the time resolved localization of individual particles with diameters between 100 and 1000 nm. Observation of their Brownian motion over a set of time intervals leads to the determination of their mean square displacements at the given room temperature and viscosity. In this way, the radii of a set of particles visible within a given optical frame are derived simultaneously. Rapid data analysis leads to reliable particle size histograms. The applicability of this method is demonstrated on polystyrene latices and PMMA nanospheres with radii between 51 nm and 202 nm.
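The size determination described above rests on the Stokes-Einstein relation. A minimal sketch (an assumed helper, not the authors' software) that estimates a radius from a single 2D Brownian track:

```python
import numpy as np

kB = 1.380649e-23   # Boltzmann constant [J/K]

def radius_from_track(xy, dt, T=293.15, eta=1.0e-3):
    """xy: (N, 2) particle positions in metres; dt: frame interval in seconds."""
    lags = np.arange(1, min(20, len(xy) // 4))                          # short lag times only
    msd = np.array([np.mean(np.sum((xy[l:] - xy[:-l])**2, axis=1)) for l in lags])
    D = np.polyfit(lags * dt, msd, 1)[0] / 4.0                          # MSD(t) = 4 D t in 2D
    return kB * T / (6.0 * np.pi * eta * D)                             # Stokes-Einstein radius
```

Applying this fit to every tracked particle in a frame yields the particle size histogram mentioned above.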

8.
A parallel Particle in Cell/Monte Carlo Collision (PIC/MCC) numerical code for glow discharge plasma simulations is developed and verified. This method is based on the simultaneous solution of the Lorentz equations of motion of super particles, coupled with Poisson's equation for the electric field. Collisions between the particles are modelled by the Monte Carlo method. Proper choice of particle weighting is critically important in order to perform adequate and efficient PIC simulations of plasma. Herein, the effects of particle weighting on the simulations of capacitive radio‐frequency argon plasma discharges are studied in detail.
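For orientation, the particle weighting referred to here is the number of real particles represented by one super particle. A minimal illustration with purely hypothetical numbers:

```python
# purely hypothetical numbers for a small capacitive argon discharge
n0      = 1.0e16        # initial plasma density [m^-3]
volume  = 0.05**3       # simulated discharge volume [m^3]
n_super = 5.0e5         # super particles per species
weight  = n0 * volume / n_super
print(f"each super particle represents {weight:.2e} real particles")
```

Choosing the weight too large under-resolves fluctuations, while choosing it too small makes the simulation prohibitively expensive, which is why the weighting study matters.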

9.
Improvement of the smoothed particle method by the reproducing kernel particle method
殷建伟 马智博 《计算物理》2009,26(4):553-558
The smoothed particle method performs approximation through a kernel function, and near the boundary of the computational domain the accuracy of the kernel estimate drops markedly. The reproducing kernel particle method reconstructs the kernel function by means of a correction function, improving the accuracy of the kernel estimate of a function at both boundary and interior points as well as the stability of the computation. It is found that, although the correction function is constructed for accurate estimation of the function itself, this advantage carries over to the estimation of derivatives. Through theoretical analysis and simulations of mathematical and physical models, the improvement achieved by the reproducing kernel particle method is demonstrated and the reason for its higher accuracy is revealed.
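A minimal 1D illustration (not the paper's formulation) of the simplest instance of this correction idea, the zeroth-order (Shepard) correction, which restores exact reproduction of constants near a boundary where the plain kernel sum falls below one:

```python
import numpy as np

def cubic_kernel(r, h):
    """Standard 1D cubic spline kernel with normalization 2/(3h)."""
    q = np.abs(r) / h
    w = np.where(q < 1, 1 - 1.5*q**2 + 0.75*q**3,
        np.where(q < 2, 0.25*(2 - q)**3, 0.0))
    return w * (2.0 / (3.0 * h))

x = np.linspace(0.0, 1.0, 41)                        # particle positions, including boundaries
dx, h = x[1] - x[0], 1.5 * (x[1] - x[0])
f = 1.0 + 2.0 * x                                    # field to be approximated

W = cubic_kernel(x[:, None] - x[None, :], h) * dx    # kernel weights times particle volume
f_sph = W @ f                                        # plain kernel estimate (drops near x=0, x=1)
f_shepard = (W @ f) / (W @ np.ones_like(f))          # corrected estimate (exact for constants)
```

Higher-order correction functions extend this idea so that linear (and higher) fields are also reproduced exactly, which is the basis of the accuracy gain reported above.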

10.
An optical measuring technique is presented allowing the exact in‐situ measurement of local particle flux densities in a confined channel flow by counting single particles penetrating an optically well defined measuring volume. This enables a precise flux determination up to the immediate vicinity of planar walls. The measurement set‐up and its calibration as well as the whole test facility are described in detail. This measurement technique is used to study the particle transport in electrostatic precipitators. As an example, results for particle flux profiles as well as for precipitation, obtained from balances over parts of the precipitator channel, are presented. Furthermore, the possibility to determine particle velocity fluctuations is demonstrated.

11.
We inject a large number of newly created nano‐particle aggregates into a chamber for the purpose of removing harmful contents in an indoor environment. This study experimentally and numerically investigates the transient response of particle distributions to particle injections. A room‐sized chamber of 4 m × 2.1 m × 2.4 m is connected to a specially designed particle‐injection system, with two Optical Particle Counters used to simultaneously measure particle‐number densities in the size range from 0.3 μm to 10 μm at the inlet and in the chamber. A velocity probe measures the flow, which reaches up to 1 m/s. An Euler‐type particulate‐phase‐transport model is developed and validated by comparison with experimental data. The study shows that the transient behavior of particle distributions is determined by many factors, including particle size, particle settling speed, sampling location, and velocity distribution. Particle number densities decrease in time more quickly for large particles than for small particles, and locations farther downstream in the chamber correlate more weakly with the inlet injection.

12.
Phase-shifting holography of particle fields
赖天树 潭玉山 《光学学报》1991,11(5):71-476
Double-exposure particle-field holograms recorded by phase-shifting holography are used to improve the SNR, resolution, and contrast of the reconstructed image. The theory of the technique is developed, methods of introducing the phase shift are discussed, and experimental results are presented and compared with those of conventional in-line and off-axis double exposures.

13.
Non‐linear optical spectroscopy is a recently established technique used in the investigation of the properties of colloidal interfaces. Since it is an optical method it is non‐invasive, can be applied in situ, and can provide real-time resolution. Until recently, only a few papers concerning this method have been published, but these all show the great potential and the large field of applications of the technique. This paper gives an overview of the fundamentals of the technique and its possible applications.

14.
The objective of this study was to compare the measuring results of a fiber‐optical probe based on a modified spatial filtering technique with given size distributions of different test powders and also with particle velocity values from laser Doppler measurements. Fiber‐optical spatial filtering velocimetry was modified by fiber‐optical spot scanning in order to determine the size and the velocity of particles simultaneously. The fiber‐optical probe system can be used as an in‐line measuring device for sizing of particles in different technical applications. Spherical test particles were narrow‐sized glass beads in the range 30–100 μm and irregularly shaped test particles were limestone particles in the range 10–600 μm. Particles were dispersed by a brush disperser and the measurements were carried out at a fixed position in a free particle‐laden air stream. Owing to the measurement of chord lengths and to the influence of diffraction and divergence angle, the probe results show differences from the given test particle sizes. Owing to the particle‐probe collisions, the mean velocity determined by the probe is smaller than the laser Doppler mean velocity.

15.
Recently, Li and Liu have studied a global monopole of tachyon in a four dimensional static space–time. We analyze the motion of massless and massive particles around the tachyon monopole. Interestingly, for the bending of light rays due to the tachyon monopole, we find an angle of surplus instead of an angle of deficit. We also find that the tachyon monopole exerts an attractive gravitational force on matter.

16.
The scanning mobility particle sizer (SMPS) is one of the best known instruments for measuring particle size distributions in the submicron range. The SMPS consists of two parts: an electrostatic aerosol classifier (differential mobility particle analyser, DMA), followed by a counting device, in general a condensation particle counter (CPC). Unfortunately, commercial measurement devices such as the TSI DMA Model 3071 and the TSI CPC Model 3022 (TSI Inc., St. Paul, MN, USA), can be used only at nearly atmospheric pressure in the sampling line or in slight overpressure mode, but not in low‐pressure systems. A modification in the sampling line is shown which enhances the operating range of a standard SMPS system to low pressure. Samples taken under standard and low‐pressure conditions show good agreement in the measured particle size distributions and concentration. The behaviour observed in experimental studies agrees well with theoretical predictions.

17.
A phase‐sensitive wide field transmission microscope, combining the advantages of both interferometric and confocal techniques, has been developed and applied to the analysis of particulates, both in dry powder form and in suspensions. The microscope has also been used in detecting defects in crystals. Confocal operation is achieved by superimposing speckle illumination of a reference beam in a Mach‐Zehnder interferometer with a matched speckle pattern of the object beam. It is shown that the phase measurement enables particle size to be determined even when the particle is smaller than the focal spot size. The data acquisition time is below 1 ms, making the system suitable for dynamic process measurement. The experimental results are in good agreement with modelled results, giving rise to the possibility of simultaneous determination of both the size and refractive index of small particles.

18.
Mark A. Thomson 《Pramana》2007,69(6):1101-1107
One of the most important requirements for a detector at the ILC is good jet energy resolution. It is widely believed that the particle flow approach to calorimetry is the key to achieving the goal of 0.3/√E(GeV). This paper describes the current performance of the PandoraPFA particle flow algorithm. For 45 GeV jets in the Tesla TDR detector concept, the ILC jet energy resolution goal is reached. At higher energies the jet energy resolution becomes worse and can be described by the empirical expression: σ_E/E ≈ 0.265/√E(GeV) + 1.2 × 10⁻⁴ E(GeV).
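As a quick numerical check of the empirical expression quoted above (arithmetic only):

```python
import math

for E in (45.0, 100.0, 250.0, 500.0):
    res = 0.265 / math.sqrt(E) + 1.2e-4 * E     # sigma_E/E from the empirical fit
    print(f"E = {E:5.0f} GeV  ->  sigma_E/E = {res:.3f}")
# at E = 45 GeV this gives about 0.045, i.e. close to the 0.3/sqrt(E) goal of ~0.0447
```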

19.
The fundamental processes related to the removal of fine particles from surfaces in a hydrodynamic flow field are not adequately understood. A critical particle Reynolds number approach is proposed to assess these mechanisms for fine particles when surface roughness is small compared to particle diameter. At and above the critical particle Reynolds number, particle removal occurs, while below the critical value, particles remain attached to the surface. The system under consideration consists of glass particles adhering to a glass surface in laminar channel flow. Our results indicate that rolling is the removal mechanism, which is in agreement with the literature. Theoretical results of the critical particle Reynolds number model for rolling removal are in general agreement with experimental data when particle size distribution, particle and surface roughness, and the system Hamaker constant are taken into account.
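A minimal sketch of the criterion's form (the linear near-wall velocity profile, the water properties, and the critical value below are illustrative assumptions, not the paper's fitted values):

```python
def particle_reynolds(d, wall_shear_rate, rho_f=998.0, mu=1.0e-3):
    """d: particle diameter [m]; wall_shear_rate [1/s]; defaults: water at ~20 °C."""
    u_at_particle = wall_shear_rate * d      # assumed linear near-wall velocity profile
    return rho_f * u_at_particle * d / mu

Re_p = particle_reynolds(d=20e-6, wall_shear_rate=5000.0)
detached = Re_p >= 0.04                      # hypothetical critical value for rolling removal
```

The physical content of the approach is that removal is predicted once the hydrodynamic (rolling) moment, captured through Re_p, exceeds the adhesive resistance summarized in the critical value.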

20.
Analysis of particle distribution and numerical stability in the smoothed particle hydrodynamics method
刘谋斌 常建忠 《物理学报》2010,59(6):3654-3662
Smoothed particle hydrodynamics (SPH), a Lagrangian mesh-free particle method, has been successfully applied to studies of compressible and incompressible fluid flows, including flows with multiphase interfaces and moving boundaries. Through a detailed study of Poiseuille flow, the influence of the particle distribution on computational accuracy in the SPH method is explored, and a numerical instability caused by irregular particle distributions is revealed. The study shows that this instability originates from discontinuities in the particle approximation of the SPH method. A new particle approximation scheme is employed to ensure the continuity of the particle approximation. The computational results show that this new scheme yields stable and accurate results for both regular and irregular particle distributions.
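For reference, the particle approximation referred to above has the standard textbook form (the corrected scheme proposed in the paper is not reproduced here):

```latex
f(\mathbf{x}_i) \;\approx\; \sum_j \frac{m_j}{\rho_j}\, f(\mathbf{x}_j)\, W\!\left(\mathbf{x}_i-\mathbf{x}_j,\, h\right),
```

where W is the smoothing kernel with smoothing length h; the accuracy and continuity of this sum depend directly on how the neighbouring particles j are distributed.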

