Similar Documents
1.
Temporal clustering analysis (TCA) has been proposed as a method for detecting brain responses in a functional magnetic resonance imaging (fMRI) time series when the time and location of activation are completely unknown. However, TCA is not well suited to whole-brain time series because of the many inactive pixels they contain. In theory, active pixels are located only in gray matter (GM). In this study, SPM2 was used to segment the functional images into GM, white matter and cerebrospinal fluid, and only the pixels in GM were considered. Most of the inactive pixels are thereby removed, so the sensitivity of TCA in whole-brain analysis is greatly improved. The same set of acupuncture fMRI data was processed with both conventional TCA and the modified TCA (MTCA) to compare their analytical ability. The results clearly show a significant improvement in sensitivity achieved by MTCA.
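A minimal sketch of the MTCA idea in Python, assuming the GM segmentation has already been computed (the array names `data` and `gm_mask` are illustrative, not taken from the paper): restrict the 4-D time series to GM voxels and count, for each time point, how many voxels reach their maximum there.

```python
import numpy as np

def modified_tca(data, gm_mask):
    """Temporal clustering analysis restricted to gray-matter voxels.

    data    : 4-D array (x, y, z, t) of fMRI intensities
    gm_mask : 3-D boolean array marking gray-matter voxels (e.g., from SPM2)
    Returns the TCA histogram: for each time point, the number of GM voxels
    whose time course peaks at that point.
    """
    ts = data[gm_mask]                 # (n_gm_voxels, t)
    peak_times = ts.argmax(axis=1)     # time index of each voxel's maximum
    return np.bincount(peak_times, minlength=data.shape[-1])
```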

2.
Clustering analysis has been widely used to detect functional connectivity in functional magnetic resonance imaging (fMRI) data. However, it has limitations such as an enormous memory requirement and the difficulty of estimating the number of clusters. In this study, to resolve these deficiencies effectively, we propose a novel approach (SAAPC) for fMRI data analysis that combines sparsity, an effective assumption for analyzing fMRI signals, with affinity propagation clustering (APC).
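A hedged sketch of the sparsity-plus-APC idea using scikit-learn (the sparsification rule and the `keep_fraction` parameter are simplifications of mine, not the SAAPC algorithm itself): weak voxel-to-voxel similarities are floored so that only strong links take part in the message passing.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

def sparse_apc(voxel_ts, keep_fraction=0.1):
    """Affinity propagation on a sparsified voxel-similarity matrix.

    voxel_ts : (n_voxels, n_timepoints) array of time courses.
    """
    sim = np.corrcoef(voxel_ts)                      # voxel-by-voxel similarity
    cutoff = np.quantile(sim, 1.0 - keep_fraction)
    sim_sparse = np.where(sim >= cutoff, sim, -1.0)  # suppress weak links
    np.fill_diagonal(sim_sparse, np.median(sim))     # APC preference term
    ap = AffinityPropagation(affinity="precomputed", random_state=0)
    return ap.fit_predict(sim_sparse)                # cluster label per voxel
```

One appeal of affinity propagation here is that the exemplars, and hence the number of clusters, are chosen by the algorithm itself, which addresses the cluster-number problem mentioned above.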

3.
Correctly identifying voxels or regions of interest (ROI) that actively respond to a given stimulus is often an important objective in many functional magnetic resonance imaging (fMRI) studies. In this article, we study a nonparametric method to detect active voxels that makes minimal assumptions about the distribution of blood oxygen level-dependent (BOLD) signals. Our proposal has several interesting features. It uses time-lagged correlation to account for the delay in response to the stimulus due to hemodynamic variation. We introduce an input permutation method (IPM), a type of block permutation, to approximate the null distribution of the test statistic. We also propose pooling the permutation-derived statistics of preselected voxels for a better approximation of the null distribution. Finally, we control the multiple-testing error rate using the local false discovery rate (FDR) of Efron [Correlation and large-scale simultaneous hypothesis testing. J Am Stat Assoc 102 (2007) 93–103] and Park et al. [Estimation of empirical null using a mixture of normals and its use in local false discovery rate. Comput Stat Data Anal 55 (2011) 2421–2432] to select the active voxels.
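A simplified illustration of the two ingredients, time-lagged correlation and a block-wise permutation null (the block length, lag range and function names are illustrative assumptions, not the paper's exact IPM):

```python
import numpy as np

def lagged_corr(y, x, max_lag=3):
    """Maximum absolute correlation between a voxel time course y and the
    stimulus regressor x over a small set of hemodynamic lags."""
    return max(abs(np.corrcoef(y[lag:], x[:len(x) - lag])[0, 1])
               for lag in range(max_lag + 1))

def block_permutation_null(y, x, block_len=10, n_perm=1000, seed=0):
    """Approximate the null by permuting the input in blocks (a stand-in for
    the paper's input permutation method), preserving short-range dependence."""
    rng = np.random.default_rng(seed)
    n_blocks = len(x) // block_len
    blocks = x[:n_blocks * block_len].reshape(n_blocks, block_len)
    null = []
    for _ in range(n_perm):
        xp = blocks[rng.permutation(n_blocks)].ravel()
        null.append(lagged_corr(y[:len(xp)], xp))
    return np.array(null)
```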

4.
Constrained independent component analysis (CICA) eliminates the order ambiguity of standard ICA by incorporating prior information into the learning process so that the components are sorted intrinsically. However, the original CICA (OCICA) and its variants depend on a learning rate, which is not easy to tune across applications. To solve this problem, two learning-rate-free CICA algorithms are derived in this paper using the fixed-point learning concept. A complete stability analysis is provided for the proposed methods, which also corrects the stability analysis given for OCICA. Variants that add constraints either to the components or to the associated time courses are derived as well. On synthetic data, the proposed methods yielded better stability and better source-separation quality, in terms of a higher signal-to-noise ratio and a smaller performance index, than OCICA. For artificially generated brain activations, the new CICAs demonstrated better sensitivity/specificity than the standard univariate general linear model (GLM) and standard ICA; OCICA showed a similar sensitivity/specificity gain but failed to converge several times. On functional magnetic resonance imaging (fMRI) data acquired with a well-characterized sensorimotor task, the proposed CICAs yielded better sensitivity than OCICA, standard ICA and the GLM in all target functional regions, in terms of either higher t values or larger suprathreshold cluster extents at the same significance threshold. They were also more stable than OCICA and standard ICA for the sensorimotor fMRI data.
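An illustrative one-unit sketch of the fixed-point-with-constraint idea (a simplification under my own assumptions, not the paper's actual update rule): a FastICA-style fixed-point step on whitened data plus a soft pull of the unmixing vector toward the direction best matching a reference time course, with strength `gamma`.

```python
import numpy as np

def one_unit_constrained_ica(X, ref, gamma=0.5, n_iter=200, tol=1e-6):
    """Extract one component from whitened data X (n_channels, n_samples)
    while softly constraining it toward the reference signal `ref`.
    Illustrative sketch only; no learning rate is needed."""
    rng = np.random.default_rng(0)
    w = rng.standard_normal(X.shape[0])
    w /= np.linalg.norm(w)
    ref = (ref - ref.mean()) / ref.std()
    for _ in range(n_iter):
        s = w @ X
        # FastICA (tanh nonlinearity) fixed-point step ...
        w_new = (X * np.tanh(s)).mean(axis=1) - (1 - np.tanh(s) ** 2).mean() * w
        # ... plus a pull toward the direction most correlated with the reference
        w_new += gamma * (X @ ref) / X.shape[1]
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1) < tol:
            w = w_new
            break
        w = w_new
    return w, w @ X   # unmixing vector and extracted time course
```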

5.
The trust-region method, which originated from the Levenberg–Marquardt (LM) algorithm, is considered for mixed-effect model estimation in the context of second-level functional magnetic resonance imaging (fMRI) data analysis. We first present the mathematical and optimization details of the method for mixed-effect model analysis, and then compare it with the conventional expectation-maximization (EM) algorithm on a series of synthetic and real human fMRI datasets. From the simulation studies, we found that a higher damping factor for the LM algorithm works better than a lower one for fMRI data analysis. More importantly, in most cases the expectation trust-region algorithm is more accurate than the EM algorithm when the random-effect variance is large. We also compare the algorithms on real human datasets comprising repeated fMRI measurements in phase-encoded and random-block experimental designs, and observe that the proposed method is computationally faster and robust to Gaussian noise. The advantages and limitations of the suggested methods are discussed.
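A generic Levenberg–Marquardt iteration, shown only to make the role of the damping factor concrete (the residual/Jacobian callables and the starting damping value are illustrative; the paper applies this scheme to the second-level mixed-effect model):

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, theta0, lam=10.0, n_iter=50):
    """Damped Gauss-Newton (LM) iteration on a least-squares problem.

    residual(theta) -> (m,) residual vector
    jacobian(theta) -> (m, p) Jacobian of the residuals
    lam is the damping factor whose size is discussed above.
    """
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_iter):
        r = residual(theta)
        J = jacobian(theta)
        step = np.linalg.solve(J.T @ J + lam * np.eye(len(theta)), -J.T @ r)
        theta_new = theta + step
        if np.sum(residual(theta_new) ** 2) < np.sum(r ** 2):
            theta, lam = theta_new, lam * 0.5   # accept step, relax damping
        else:
            lam *= 2.0                          # reject step, increase damping
    return theta
```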

6.
We combine the spatial phase-shifting technique with the real-time fringe-counting capability of temporal phase unwrapping to provide simple solutions for some practical tasks in ESPI. First, we develop a method for automatically matching the data-storage intervals and apply it to a long-term observation of a biological object with a strongly varying deformation rate. Second, we easily obtain on-line displacement and deformation data during the observation of a complexly structured, discontinuous object.
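A minimal sketch of the two building blocks, assuming four π/2-shifted speckle intensity frames per state and a stack of wrapped phase maps over time (array names are illustrative):

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Wrapped phase from four pi/2-shifted intensity frames (spatial shifting)."""
    return np.arctan2(i4 - i2, i1 - i3)

def temporal_unwrap(phase_stack):
    """Unwrap along the time axis by accumulating wrapped frame-to-frame
    differences, the idea behind temporal phase unwrapping / fringe counting.

    phase_stack : (t, h, w) stack of wrapped phase maps.
    """
    d = np.diff(phase_stack, axis=0)
    d = (d + np.pi) % (2 * np.pi) - np.pi     # wrap increments into (-pi, pi]
    return phase_stack[0] + np.concatenate(
        [np.zeros_like(phase_stack[:1]), np.cumsum(d, axis=0)])
```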

7.
An improved phase-unwrapping method is proposed to reduce the number of projected fringe patterns in three-dimensional (3D) surface measurement. Color fringe patterns are generated by encoding sinusoidal and stair-phase fringe patterns in the red and blue channels. These color patterns are projected onto the tested objects and captured by a color CCD camera. The recorded patterns are separated into their RGB components, yielding two groups of four-step phase-shifting fringe patterns. One group consists of four sinusoidal patterns, used to determine the wrapped phase. The other group consists of four patterns with a codeword embedded in a stair phase whose steps are aligned with the 2π discontinuities of the sinusoidal fringe phase; these determine the fringe order for phase unwrapping. The experimental results are analyzed and compared with the method of Zheng and Da (2012, Opt Express 20(22): 24139–24150). The results show that the proposed method needs only four fringe patterns while producing smaller errors; it effectively reduces the number of projected fringes and improves the measuring speed.
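A sketch of the final unwrapping step under simple assumptions of mine (a stair phase spanning (−π, π] in `n_steps` equal levels that index the fringe order; this is not the exact codeword layout of the paper):

```python
import numpy as np

def unwrap_with_stair_code(wrapped_phase, stair_phase, n_steps):
    """Absolute phase from a wrapped sinusoidal phase plus a stair-coded phase
    whose steps mark the fringe order."""
    # map the stair phase, assumed to span (-pi, pi], onto integer fringe orders
    order = np.rint((stair_phase + np.pi) / (2 * np.pi) * n_steps - 0.5)
    order = np.clip(order, 0, n_steps - 1)
    return wrapped_phase + 2 * np.pi * order
```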

8.
Bearings are widely used in engineering practice and are highly prone to damage, so identifying their early weak responses is of great practical value. To improve the safety, reliability and maintainability of operating bearings, a fault-diagnosis method based on principal component analysis (PCA) and the dynamic time warping (DTW) distance is proposed, which can accurately identify and diagnose early weak dynamic responses. The method first applies wavelet denoising and empirical mode decomposition (EMD) to both the typical fault sample signals and the signal under test, performs PCA on several intrinsic mode functions to extract the principal components, analyzes these components to obtain characteristic values that form feature vectors, and then computes the warping distance between the feature vectors of the signal under test and those of the known fault samples; the smaller the warping distance, the more similar the two signals, and thus the fault is identified. The method can also be applied to rotor, rub-impact and gear fault diagnosis; engineering examples show that it classifies faults accurately and diagnoses them efficiently.
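A plain dynamic-time-warping distance, the similarity measure used above to match a test signal's feature vector to the fault samples (a textbook implementation; feature extraction via wavelet denoising, EMD and PCA is assumed to have been done already):

```python
import numpy as np

def dtw_distance(a, b):
    """Classical dynamic-time-warping distance between two feature sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# The diagnosed fault class is the known sample with the smallest distance, e.g.:
# best_match = min(fault_samples, key=lambda s: dtw_distance(test_features, s))
```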

9.
Laser-induced breakdown spectroscopy (LIBS) is a versatile tool for both qualitative and quantitative analysis. In this paper, LIBS combined with principal component analysis (PCA) and a support vector machine (SVM) is applied to rock analysis. Fourteen emission lines, including Fe, Mg, Ca, Al, Si and Ti, are selected as analysis lines. A good accuracy (91.38% for real rock samples) is achieved by using an SVM to classify the spectroscopic peak-area data after PCA processing. Combining PCA and SVM not only reduces noise and dimensionality, which improves the efficiency of the program, but also resolves linear inseparability. The results validate the ability of LIBS to classify rocks.
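A scikit-learn pipeline mirroring the PCA + SVM scheme (the data shapes, class count and number of retained components below are placeholders, not values from the paper):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# X: peak areas of the 14 selected emission lines per spectrum; y: rock class.
# Random placeholder data, for illustration only.
X = np.random.rand(120, 14)
y = np.random.randint(0, 3, 120)

clf = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel="rbf"))
print(cross_val_score(clf, X, y, cv=5).mean())   # score on placeholder data
```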

10.
Using experimental data from the HL-2A device, an energy-confinement database oriented toward the ITER L-mode DB2.0 format was preliminarily established. The HL-2A confinement data were analyzed and evaluated with the Statistical Analysis System (SAS), a scaling study of the energy confinement time versus density was carried out, and preliminary results were obtained. Finally, through comparison with the ITER scaling law and the original ASDEX data, the L-mode confinement quality of HL-2A and the temperature dependence of the Spitzer resistivity under ohmic heating are discussed.
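A minimal single-variable scaling fit of the kind such studies rest on (illustrative only; the actual database regression involves many engineering parameters): fit τ_E = C · n_e^α by least squares in log-log space.

```python
import numpy as np

def density_scaling_exponent(n_e, tau_e):
    """Return (alpha, C) for the power-law scaling tau_E = C * n_e**alpha,
    estimated by an ordinary least-squares fit in log-log space."""
    alpha, log_c = np.polyfit(np.log(n_e), np.log(tau_e), 1)
    return alpha, np.exp(log_c)
```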

12.
M.C. Mariani, I. Florescu. Physica A 2009, 388(8): 1659–1664
This work is devoted to the study of long correlations, memory effects and other statistical properties of high-frequency (tick) data, using a sample of 25 stocks. We verify that the behavior of the returns is compatible with that of continuous-time Lévy processes, and we also study the presence of memory effects and long-range correlations in the return values.
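A small sketch of a first diagnostic for the memory effects mentioned above (the lag range is an arbitrary choice of mine): log returns of a tick price series and their sample autocorrelation.

```python
import numpy as np

def return_autocorrelation(prices, max_lag=50):
    """Sample autocorrelation of log returns up to max_lag."""
    r = np.diff(np.log(prices))
    r = r - r.mean()
    acf = np.correlate(r, r, mode="full")[len(r) - 1:]
    return acf[:max_lag + 1] / acf[0]
```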

13.
Given the spectrum of a Hamiltonian, a methodology is developed that employs the Landau-Ginsburg theory for characterizing phase transitions in infinite systems to identify phase-transition remnants in finite fermion systems. As a first application of our approach we discuss pairing in finite nuclei.
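For reference, the standard Landau free-energy expansion that the theory builds on, written for a real order parameter ψ (textbook material, not a formula taken from the paper):

```latex
F(\psi) = F_0 + a\,(T - T_c)\,\psi^2 + b\,\psi^4, \qquad a, b > 0,
\qquad
\psi_{\min}^2 =
\begin{cases}
0, & T > T_c,\\[4pt]
\dfrac{a\,(T_c - T)}{2b}, & T < T_c.
\end{cases}
```

In an infinite system the order parameter switches on sharply at T_c; in a finite fermion system this onset is smeared, which is why one speaks of phase-transition "remnants".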

14.
We use the data envelopment analysis (DEA) method to estimate the relative efficiency of each player's current strategy in a game, taking the spatial distribution of strategies as input and the total payoff as output. Based on the optimal value of the DEA model, we present a DEA-efficient rule for updating strategies in evolutionary games. Simulations of the prisoner's dilemma game (PDG) [4] and the snowdrift game (SG) on two-dimensional regular lattices with four, six and eight neighbors and periodic boundary conditions show the emergence of a high and stable cooperator frequency. The heuristic analysis of the DEA-efficient rule is discussed in detail. Our work may be helpful in exploring the promotion of cooperative behavior.
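A generic input-oriented CCR DEA model solved as a linear program with SciPy (a textbook formulation, not the paper's exact game-theoretic setup; in that setting the inputs X would encode the local strategy distribution and the output Y the player's total payoff):

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of decision-making unit k.

    X : (n_units, n_inputs) input matrix, Y : (n_units, n_outputs) output matrix.
    Returns theta in (0, 1]; theta = 1 means unit k lies on the efficient frontier.
    """
    n, m = X.shape
    s = Y.shape[1]
    # decision vector z = [theta, lambda_1, ..., lambda_n]
    c = np.r_[1.0, np.zeros(n)]
    # inputs:  sum_j lambda_j * x_ij - theta * x_ik <= 0
    A_in = np.c_[-X[k].reshape(m, 1), X.T]
    b_in = np.zeros(m)
    # outputs: -sum_j lambda_j * y_rj <= -y_rk
    A_out = np.c_[np.zeros((s, 1)), -Y.T]
    b_out = -Y[k]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun
```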

15.
Quantitative analysis method for laser-induced fluorescence remote-sensing data of water pollution
Under ultraviolet excitation, dissolved organic matter (DOM) in polluted water produces a characteristic fluorescence spectrum, so laser-induced fluorescence (LIF) can be used to quantify the DOM content and thereby estimate the degree of eutrophication of the water body. A method for the quantitative analysis of remotely sensed water-quality data is proposed: a spectral separation algorithm based on a genetic algorithm (GA). The intensities of the Raman scattering signal and of the DOM fluorescence in the 404 nm band are first determined, the DOM fluorescence spectrum is then normalized by the Raman scattering signal, and the DOM concentration in the water is finally obtained from a concentration calibration curve.
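The core quantitative step, Raman normalization followed by a linear calibration, as a minimal sketch (the calibration coefficients are hypothetical inputs; the paper determines them from reference samples and separates the overlapping Raman and DOM bands with a genetic algorithm):

```python
def dom_concentration(f_dom, i_raman, calib_slope, calib_intercept=0.0):
    """DOM concentration from Raman-normalized fluorescence intensity and a
    linear calibration curve (hypothetical coefficients, for illustration)."""
    normalized = f_dom / i_raman          # Raman normalization of the DOM band
    return calib_slope * normalized + calib_intercept
```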

16.
The efficient market hypothesis (EMH) states that asset prices fully reflect all available information; as a result, speculators cannot predict the future behavior of asset prices and earn excess profits, at least after adjusting for risk. Although initial tests of the EMH were performed on stock-market data, it was soon applied to other markets, including foreign exchange (FX). This study uses the detrended fluctuation analysis (DFA) technique on Iranian Rial/US Dollar exchange-rate time series from 1 December 2005 to 18 April 2010 to test whether they can be explained by the weak form of the EMH. Moreover, to track changes in the degree of inefficiency over time, the whole period is divided into four subperiods. The study shows that the Iranian forex market (the Rial/Dollar case) is weak-form inefficient over the whole period and in each subperiod; however, the degree of inefficiency is not constant over time. The findings suggest that profitable risk-adjusted trades could be made using past data.
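A compact order-1 DFA implementation (the scale choices are mine): it estimates the exponent α of F(s) ∝ s^α for a series x (typically the returns), where α ≈ 0.5 is consistent with weak-form efficiency and larger values indicate long-range persistence.

```python
import numpy as np

def dfa_exponent(x, scales=None):
    """Order-1 detrended fluctuation analysis; returns the scaling exponent."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                      # integrated profile
    if scales is None:
        scales = np.unique(
            np.logspace(1, np.log10(len(x) // 4), 20).astype(int))
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        mse = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)             # local linear trend
            mse.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(mse)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]
```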

17.
Pengjian Shang, Aijing Lin. Physica A 2009, 388(5): 720–726
Detrended fluctuation analysis (DFA) and its extensions (MF-DFA) have been used extensively to determine possible long-range correlations in self-affine signals. However, recent studies have reported that DFA is susceptible to trends, which give rise to spurious crossovers and prevent reliable estimation of the scaling exponents. In this study, a smoothing algorithm based on the chaotic singular-value decomposition (CSVD) is proposed to minimize the effect of exponential trends and the distortion of the log-log plots obtained by DFA techniques. The effectiveness of the technique is demonstrated on monofractal and multifractal data corrupted with exponential trends.
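An SSA-style sketch of SVD-based detrending under my own simplifications (the embedding window and the number of removed components are illustrative): embed the series in a trajectory matrix, keep only the dominant singular components as the trend estimate, and subtract it before running DFA.

```python
import numpy as np

def svd_detrend(x, window=50, n_trend=1):
    """Remove a slowly varying trend via SVD of the trajectory matrix."""
    x = np.asarray(x, dtype=float)
    n = len(x) - window + 1
    traj = np.lib.stride_tricks.sliding_window_view(x, window)   # (n, window)
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    s_trend = s.copy()
    s_trend[n_trend:] = 0.0                  # keep only the dominant components
    trend_traj = (U * s_trend) @ Vt
    # diagonal averaging back to a 1-D trend estimate
    trend = np.zeros(len(x))
    counts = np.zeros(len(x))
    for i in range(n):
        trend[i:i + window] += trend_traj[i]
        counts[i:i + window] += 1
    return x - trend / counts                # detrended series for DFA/MF-DFA
```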

18.
A. Berk, F. Solymosi. Surface Science 1998, 400(1–3): 281–289
A method for independent control of particle size and inter-particle distance is presented for rhodium epitaxy on the TiO2(110)-(1×2) surface. Real-space imaging of the surface morphology was performed by scanning tunneling microscopy, and the amount of deposited rhodium was checked by Auger electron spectroscopy. The method consists of two steps: (i) evaporation of 0.001–0.050 ML equivalent of rhodium at room temperature followed by post-annealing at 1100 K ("seeding"); (ii) post-deposition of rhodium to grow the Rh nanoparticles formed in step (i) ("growing"). The mechanism of this procedure is based on the large difference in surface diffusion coefficient between Rh adatoms and Rh nanocrystallites larger than 1–2 nm. The first step controls the average distance between the metal particles in the range 5–200 nm, and the second step determines the particle size (2–50 nm). This work demonstrates that the diffusion of metal nanoparticles of different sizes and the growth modes of the crystallites can be studied in detail on such seeded surfaces.

