Similar Literature
20 similar documents retrieved (search time: 137 ms)
1.
We further explore the relation between random coefficients regression (RCR) and computerized tomography. Recently, Beran et al. (1996, Ann. Statist., 24, 2569–2592) explored this connection to derive an estimation method for the non-parametric RCR problem which is closely related to image reconstruction methods in X-ray computerized tomography. In this paper we emphasize the close connection of the RCR problem with positron emission tomography (PET). Specifically, we show that the RCR problem can be viewed as an idealized (continuous) version of a PET experiment, by demonstrating that the nonparametric likelihood of the RCR problem is equivalent to that of a specific PET experiment. Consequently, methods independently developed for either of the two problems can be adapted from one problem to the other. To demonstrate the close relation between the two problems we use the estimation method of Beran, Feuerverger and Hall for image reconstruction in PET.
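Since the abstract treats RCR as an idealized PET experiment, the classical workhorse for the PET side is the Shepp–Vardi ML-EM iteration. The following is a minimal sketch for a generic discretized PET model; the system matrix A, the data y, and all sizes are hypothetical placeholders, not taken from the paper.

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Shepp-Vardi ML-EM for a discretized PET model y ~ Poisson(A @ x).

    A : (m, n) nonnegative system matrix (detector sensitivities)
    y : (m,) observed counts
    """
    x = np.ones(A.shape[1])          # strictly positive starting image
    sens = A.sum(axis=0)             # column sums (sensitivity image)
    for _ in range(n_iter):
        proj = A @ x                 # forward projection
        ratio = np.where(proj > 0, y / proj, 0.0)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)  # multiplicative update
    return x

# Toy usage: random nonnegative system matrix and Poisson counts
rng = np.random.default_rng(0)
A = rng.random((200, 50))
x_true = rng.random(50)
y = rng.poisson(A @ x_true)
x_hat = mlem(A, y)
```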

2.
Electron Paramagnetic Resonance (EPR) is a spectroscopic technique that detects and characterizes molecules with unpaired electrons (i.e., free radicals). Unlike the closely related nuclear magnetic resonance (NMR) spectroscopy, EPR is still under development as an imaging modality. Although a number of physical factors have hindered its development, EPR's potential is quite promising in a number of important application areas, including in vivo oximetry. EPR images are generally reconstructed using a tomographic imaging technique, of which filtered backprojection (FBP) is the most commonly used. We apply two iterative methods for maximum-entropy image reconstruction in EPR. The first is the multiplicative algebraic reconstruction technique (MART), a well-known row-action method. We propose a second method, known as LSEnt (least-squares entropy), that maximizes entropy and performs regularization by maintaining a desired distance from the measurements. LSEnt is in part motivated by the barrier method of interior-point programming. We present studies in which images of two physical phantoms, reconstructed using FBP, MART, and LSEnt, are compared. The images reconstructed using MART and LSEnt have lower variance, better contrast recovery, subjectively better resolution, and reduced streaking artifacts compared with those reconstructed using FBP. These results suggest that maximum-entropy reconstruction methods (particularly the more flexible LSEnt) may be critical in overcoming some of the physical challenges of EPR imaging.
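A minimal sketch of the classical MART row-action update on a generic nonnegative system follows; LSEnt is specific to the paper and is not reproduced here. The relaxation parameter and the assumption of a nonnegative system matrix are generic textbook choices, not the paper's settings.

```python
import numpy as np

def mart(A, y, n_sweeps=20, lam=1.0):
    """Multiplicative ART: cycle through the rows, rescaling the image so
    that each measurement is matched in turn.  Iterates stay positive,
    which is what ties the method to maximum-entropy reconstruction.

    A : (m, n) nonnegative system matrix,  y : (m,) positive measurements
    """
    m, n = A.shape
    x = np.ones(n)                               # positive starting image
    for _ in range(n_sweeps):
        for i in range(m):                       # one row-action per row
            proj = A[i] @ x
            if proj > 0 and y[i] > 0:
                x *= (y[i] / proj) ** (lam * A[i])   # multiplicative update
    return x
```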

3.
High-Resolution Color Image Reconstruction with Neumann Boundary Conditions
This paper studies the application of preconditioned conjugate gradient methods to high-resolution color image reconstruction problems. The high-resolution color images are reconstructed from multiple undersampled, shifted, degraded color frames with subpixel displacements. The resulting degradation matrices are spatially variant. To capture the changes of reflectivity across color channels, a weighted H^1 regularization functional is used in the Tikhonov regularization. The Neumann boundary condition is also employed to reduce boundary artifacts. The preconditioners are derived by taking the cosine transform approximation of the degradation matrices. Numerical examples are given to illustrate the fast convergence of the preconditioned conjugate gradient method.
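A minimal sketch of the central device: under Neumann boundary conditions a symmetric, spatially invariant blur is diagonalized by the 2-D discrete cosine transform, so a DCT-based preconditioner can invert its symbol cheaply. The blur symbol below is a hypothetical Gaussian, and the operator is spatially invariant for brevity, so the preconditioner here is exact; in the paper's spatially variant setting the cosine transform only approximates the degradation matrices.

```python
import numpy as np
from scipy.fft import dctn, idctn
from scipy.sparse.linalg import LinearOperator, cg

n, alpha = 64, 1e-2
u, v = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
s = np.exp(-(u**2 + v**2) / (2 * 15.0**2))        # hypothetical DCT blur symbol

def blur(x):       # spatially invariant stand-in for the degradation matrix H
    return idctn(s * dctn(x.reshape(n, n), norm="ortho"), norm="ortho").ravel()

def normal_op(x):  # Tikhonov normal operator H^T H + alpha I (H symmetric here)
    return blur(blur(x)) + alpha * x

A = LinearOperator((n * n, n * n), matvec=normal_op)

def precond(r):    # cosine-transform preconditioner: invert the DCT symbol
    return idctn(dctn(r.reshape(n, n), norm="ortho") / (s**2 + alpha),
                 norm="ortho").ravel()

M = LinearOperator((n * n, n * n), matvec=precond)

b = np.random.default_rng(1).random(n * n)        # hypothetical observed frame
x, info = cg(A, blur(b), M=M)                     # preconditioned CG solve
```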

4.
Lin He, Ti-Chiun Chang, Stanley Osher, Tong Fang, Peter Speier, PAMM, 2007, 7(1): 1011207–1011208
Magnetic resonance imaging (MRI) reconstruction from sparsely sampled data has been a difficult problem in the medical imaging field. We approach this problem by formulating a cost functional that includes a constraint term imposed by the raw measurement data in k-space and the L1 norm of a sparse representation of the reconstructed image. The sparse representation is usually realized by total variation regularization and/or a wavelet transform. In our recent work we applied Bregman iteration to minimize this functional in order to recover finer scales. Here we propose nonlinear inverse scale space methods in addition to the iterative refinement procedure. Numerical results from the two methods are presented, showing that the nonlinear inverse scale space method is more efficient than the iterative refinement method. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
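A minimal sketch of Bregman iterative refinement ("adding back the residual") for an l1-regularized recovery, with a plain ISTA inner loop. The sampling mask is a hypothetical random k-space pattern, and the image is assumed sparse in the pixel basis purely for brevity; the paper uses total variation and/or wavelets instead.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 128
mask = rng.random(n) < 0.4                   # hypothetical k-space sampling

def A(x):   # undersampled orthonormal DFT (so the ISTA step size 1 is safe)
    return np.fft.fft(x, norm="ortho")[mask]

def At(z):  # adjoint: zero-fill then inverse DFT
    full = np.zeros(n, complex); full[mask] = z
    return np.fft.ifft(full, norm="ortho")

def soft(x, t):                              # soft-thresholding (prox of l1)
    return np.sign(x) * np.maximum(np.abs(x) - t, 0)

x_true = np.zeros(n); x_true[rng.choice(n, 8, replace=False)] = rng.random(8) + 0.5
y = A(x_true)

x, y_k = np.zeros(n), y.copy()
for _ in range(10):                          # outer Bregman refinement
    for _ in range(100):                     # inner ISTA iterations
        x = soft(x - At(A(x) - y_k).real, 1e-3)
    y_k = y_k + (y - A(x))                   # add back the residual
```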

5.
The problem of reconstruction of a Riemannian manifold via a nonstationary response operator is considered. An approach to solving this problem which is based on methods of boundary control theory is suggested. The procedure of reconstruction makes use of nonstationary Gaussian beams (quasiphotons). Locality is a significant feature of the procedure. A class of manifolds which can be reconstructed within the framework of this approach is described in terms of boundary control theory. For example, this class includes all analytic manifolds. Bibliography: 21 titles. Translated from Zapiski Nauchnykh Seminarov POMI, Vol. 203, 1992, pp. 21–50. Translated by A. P. Katchalov.

6.
Positron-Emission Tomography (PET) is an imaging technique in nuclear medicine used to image physiological processes. A major obstacle is the need for dynamic image reconstruction from low-quality PET data, which arises in particular for fast-decaying tracers such as H₂¹⁵O (radioactive water) when improved spatial resolution is sought. Here we present a model-based approach to overcome those difficulties. We derive a set of differential equations able to represent the kinetic behavior of H₂¹⁵O PET tracers during cardiac perfusion. This model takes into account the exchange of materials between artery, tissue and vein, and predicts the tracer activity if the reaction rates, velocities, and diffusion coefficients are known. One then interprets the computation of these distributed parameters (spatially dependent only) as a nonlinear inverse problem, which we solve using variational regularization approaches. For the minimization we use gradient-based methods and forward-backward splitting. The main advantage is the reduction of the degrees of freedom, which makes the problem overdetermined and thus allows one to proceed with low-quality data. (© 2012 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim)
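A minimal sketch of forward-backward splitting for a variational parameter identification min_p f(p) + g(p), with f a smooth data misfit and g a nonsmooth regularizer. The toy forward operator is linear; in the paper it would be the solution operator of the kinetic ODE system, and the regularizer below (l1 plus nonnegativity, generic choices) is only a stand-in.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.random((60, 20))                       # toy linear forward operator
p_true = np.maximum(rng.normal(0, 1, 20), 0)   # nonnegative "rates"
data = B @ p_true + 0.01 * rng.normal(size=60)

beta = 0.1                                     # regularization weight
tau = 1.0 / np.linalg.norm(B, 2) ** 2          # step size <= 1 / Lipschitz const.
p = np.zeros(20)
for _ in range(500):
    grad = B.T @ (B @ p - data)                # forward (gradient) step on f
    p = np.maximum(p - tau * grad - tau * beta, 0)  # prox of beta*||.||_1 + (p>=0)
print(np.round(p, 2))
```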

7.
Phylogenetic reconstruction methods attempt to reconstruct a tree describing the evolution of a given set of species using sequences of characters (e.g. DNA) extracted from these species as input. A central goal in this area is to design algorithms which guarantee reliable reconstruction of the tree from short input sequences, assuming common stochastic models of evolution. The fast converging reconstruction algorithms introduced in the last decade dramatically reduced the sequence length required to guarantee accurate reconstruction of the entire tree. However, if the tree in question contains even a few edges which cannot be reliably reconstructed from the input sequences, then known fast converging algorithms may fail to reliably reconstruct all or most of the other edges. This calls for the adaptive approach suggested in this paper, called adaptive fast convergence, in which the set of edges which can be reliably reconstructed gradually increases with the amount of information (length of input sequences) available to the algorithm. This paper presents an adaptive fast converging algorithm which returns a partially resolved topology containing no false edges: edges that cannot be reliably reconstructed are contracted into high-degree vertices. We also present an upper bound on the weights of those contracted edges, which is determined by the length of input sequences and the depth of the tree. As such, the reconstruction guarantee provided by our algorithm for individual edges is significantly stronger than any previously published edge reconstruction guarantee. This fact, together with the optimal complexity of our algorithm (linear space and quadratic time), makes it appealing for practical use. © 2011 Wiley Periodicals, Inc. Random Struct. Alg., 40, 350–384, 2011

8.
This article introduces an approach for characterizing the classes of empirical distributions that satisfy certain positive dependence notions. Mathematically, this can be expressed as studying certain subsets of the class S_N of permutations of 1, …, N, where each subset corresponds to some positive dependence notion. Explicit techniques for iteratively characterizing subsets of S_N that satisfy certain positive dependence concepts are obtained, and various counting formulas are given. Based on these techniques, graph-theoretic methods are used to introduce new and more efficient algorithms for constructively generating and enumerating the elements of several of these subsets of S_N. For example, the class of positively quadrant dependent permutations in S_N is characterized in this fashion.
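A minimal brute-force sketch of the object being counted, assuming the standard characterization of positive quadrant dependence, H(i, j) ≥ F(i)G(j), applied to the empirical distribution of the pairs (k, π(k)); the article's exact conventions and its efficient graph-theoretic algorithms are not reproduced here.

```python
from itertools import permutations

def is_pqd(pi):
    """Check positive quadrant dependence of the empirical distribution of
    the pairs (k, pi[k]):  N * #{k <= i : pi[k] <= j} >= i * j  for all
    1 <= i, j <= N (assumed characterization, uniform marginals)."""
    N = len(pi)
    for i in range(1, N + 1):
        for j in range(1, N + 1):
            if N * sum(1 for k in range(i) if pi[k] <= j) < i * j:
                return False
    return True

# Brute-force enumeration of the PQD subset of S_N for a small N
N = 4
pqd = [p for p in permutations(range(1, N + 1)) if is_pqd(p)]
print(len(pqd), "PQD permutations in S_4 (the identity is one of them)")
```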

9.
In statistical image reconstruction, data are often recorded on a regular grid of squares, known as pixels, and the reconstructed image is defined on the same pixel grid. Thus, the reconstruction of a continuous planar image is piecewise constant on pixels, and boundaries in the image consist of horizontal and vertical edges lying between pixels. This approximation to the true boundary can result in a loss of information that may be quite noticeable for small objects, only a few pixels in size. Increasing the resolution of the sensor may not be a practical alternative. If some prior assumptions are made about the true image, however, reconstruction to a greater accuracy than that of the recording sensor's pixel grid is possible. We adopt a Bayesian approach, incorporating prior information about the true image in a stochastic model that attaches higher probability to images with shorter total edge length. In reconstructions, pixels may be of a single color or split between two colors. The model is illustrated using both real and simulated data.
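A minimal sketch of the posterior-sampling machinery behind such a model: single-site Metropolis sampling from a binary image whose prior penalizes total edge length (an Ising-type prior) under independent Gaussian pixel noise. The split-pixel device for subpixel boundaries, the heart of the paper, is omitted, and beta and sigma are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta, sigma = 32, 1.0, 0.5
truth = np.zeros((n, n), int); truth[8:24, 8:24] = 1
data = truth + rng.normal(0, sigma, (n, n))      # noisy sensor recording

def local_edges(x, i, j):
    """Number of disagreeing 4-neighbour pairs touching pixel (i, j)."""
    s = 0
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        a, b = i + di, j + dj
        if 0 <= a < n and 0 <= b < n:
            s += x[i, j] != x[a, b]
    return s

x = (data > 0.5).astype(int)                     # start from thresholded data
for _ in range(100_000):                         # single-site Metropolis flips
    i, j = rng.integers(n), rng.integers(n)
    old, new = x[i, j], 1 - x[i, j]
    e_old = local_edges(x, i, j)
    x[i, j] = new; e_new = local_edges(x, i, j); x[i, j] = old
    dlogp = (-beta * (e_new - e_old)             # shorter edges more probable
             - ((data[i, j] - new) ** 2 - (data[i, j] - old) ** 2)
             / (2 * sigma ** 2))
    if np.log(rng.random()) < dlogp:
        x[i, j] = new
```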

10.
The task of this paper is to study and analyse transformed localization and generalized localization for ensemble methods in data assimilation. Localization is an important part of ensemble methods such as the ensemble Kalman filter or the square root filter. It guarantees a sufficient number of degrees of freedom when a small number of ensemble members or particles, respectively, is used. However, when the observation operators under consideration are non-local, the localization that is applicable to the problem can be severely limited, with strong effects on the quality of the assimilation step. Here, we study a transformation approach to change non-local operators into local operators in a transformed space, such that localization becomes applicable. We interpret this approach as a generalized localization and study its general algebraic formulation. Examples are provided for a compact integral operator and a non-local matrix observation operator, to demonstrate the feasibility of the approach and study the quality of the assimilation by transformation. In particular, we apply the approach to temperature profile reconstruction from infrared measurements given by the infrared atmospheric sounding interferometer (IASI) and show that the approach is feasible for this important data type in atmospheric analysis and forecasting. Copyright © 2015 John Wiley & Sons, Ltd.
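A minimal sketch of plain covariance localization in an EnKF analysis step: the rank-deficient sample covariance is tapered by a Schur (elementwise) product with a compactly supported correlation function. The paper's point is that for a non-local observation operator H this only makes sense after transforming to a space where H acts locally; here H is local, and the Gaussian taper is a stand-in for a Gaspari–Cohn function. All sizes and parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
nx, ne, nobs, r_loc, r_obs = 100, 20, 10, 10.0, 0.1

X = rng.normal(size=(nx, ne))                      # forecast ensemble
obs_idx = np.arange(0, nx, 10)                     # local (pointwise) operator
H = np.zeros((nobs, nx)); H[np.arange(nobs), obs_idx] = 1.0
y = rng.normal(size=nobs)                          # toy observations

Xm = X.mean(axis=1, keepdims=True)
P = (X - Xm) @ (X - Xm).T / (ne - 1)               # rank-deficient sample cov.
dist = np.abs(np.subtract.outer(np.arange(nx), np.arange(nx)))
C = np.exp(-0.5 * (dist / r_loc) ** 2)             # taper (Gaussian stand-in)
P_loc = C * P                                      # Schur-product localization

K = P_loc @ H.T @ np.linalg.inv(H @ P_loc @ H.T + r_obs * np.eye(nobs))
x_a = Xm.ravel() + K @ (y - H @ Xm.ravel())        # analysis mean update
```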

11.
Sliced inverse regression (SIR) and related methods were introduced in order to reduce the dimensionality of regression problems. In a general semiparametric regression framework, these methods determine linear combinations of a set of explanatory variables X related to the response variable Y, without losing information on the conditional distribution of Y given X. They are based on a "slicing step" in the population and sample versions. They are sensitive to the choice of the number H of slices, and this is particularly true for the SIR-II and SAVE methods. At present there are neither theoretical results nor practical techniques which allow the user to choose an appropriate number of slices. In this paper, we propose an approach based on the quality of the estimation of the effective dimension reduction (EDR) space: the square trace correlation between the true EDR space and its estimate can be used as a measure of goodness of estimation. We introduce a naïve bootstrap estimate of the square trace correlation criterion to allow selection of an "optimal" number of slices. Moreover, this criterion can also simultaneously select the corresponding suitable dimension K (the number of linear combinations of X). From a practical point of view, the choice of these two parameters H and K is essential. We propose a 3D graphical tool, implemented in R, which can be useful for selecting a suitable couple (H, K). An R package named "edrGraphicalTools" has been developed. In this article, we focus on the SIR-I, SIR-II and SAVE methods. Moreover, the proposed criterion can be used to determine which method seems most efficient at recovering the EDR space, that is, the structure between Y and X. We indicate how the proposed criterion can be used in practice. A simulation study is performed to illustrate the behavior of this approach and the need for properly selecting the number H of slices and the dimension K. A short real-data example is also provided.
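A minimal sketch of the SIR-I estimator itself (in Python rather than the article's R package): standardize X, slice on the ordered Y, and take the leading eigenvectors of the between-slice covariance of the slice means as estimated EDR directions. H and K appear exactly as the two tuning parameters whose choice the article addresses; the model below is a hypothetical single-index example.

```python
import numpy as np

def sir(X, Y, H=5, K=1):
    """SIR-I estimate of K EDR directions using H slices."""
    n, p = X.shape
    mu, S = X.mean(axis=0), np.cov(X, rowvar=False)
    L = np.linalg.cholesky(np.linalg.inv(S))
    Z = (X - mu) @ L                          # standardized predictors
    slices = np.array_split(np.argsort(Y), H) # equal-count slicing on Y
    M = sum(len(s) / n * np.outer(Z[s].mean(0), Z[s].mean(0)) for s in slices)
    _, vecs = np.linalg.eigh(M)               # eigenvectors, ascending order
    return L @ vecs[:, ::-1][:, :K]           # back-transform top K directions

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
Y = (X @ np.array([1.0, 2.0, 0.0, 0.0])) ** 3 + 0.1 * rng.normal(size=500)
print(sir(X, Y, H=10, K=1).ravel().round(2))  # proportional to (1, 2, 0, 0)
```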

12.
This paper continues earlier work of the authors (1998, Ann. Inst. Statist. Math., 50, 361–377). In the Wicksell corpuscle problem, the maximum size of random spheres in a volume part is to be predicted from the sectional circular distribution of spheres cut by a plane. The size of the spheres is assumed to follow the three-parameter generalized gamma distribution. Prediction methods based on moment estimation are proposed and their performance is evaluated by simulation. For a practically probable case, one of these prediction methods is as good as a method previously proposed by the authors in which the two shape parameters are assumed to be known.
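A minimal sketch of the forward Wicksell mechanism underlying the problem: spheres with generalized-gamma radii are centered in a slab, and a plane cuts those whose depth is smaller than their radius, producing section circles of radius sqrt(R² − depth²). Parameter values are hypothetical; note that the plane hits large spheres more often, so the observed radii are automatically size-biased.

```python
import numpy as np
from scipy.stats import gengamma

rng = np.random.default_rng(0)
a, c, scale = 2.0, 1.5, 1.0                 # hypothetical gen. gamma parameters
R = gengamma.rvs(a, c, scale=scale, size=200_000, random_state=rng)

depth = rng.uniform(-R.max(), R.max(), size=R.size)  # sphere centre to plane
hit = np.abs(depth) < R                               # sphere is actually cut
r_section = np.sqrt(R[hit] ** 2 - depth[hit] ** 2)    # observed circle radii
print(r_section.size, "section profiles; largest radius =", r_section.max())
```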

13.
Change detection methods are very important in many areas such as medical imaging and remote sensing. In particular, identifying the changes in medical images taken at different times is of great relevance in clinical practice. The key to detecting changes in medical images is to detect disease-related changes while rejecting "unimportant" changes induced by noise, misalignment, and other common acquisition-related artifacts (such as inhomogeneity). In this paper we first summarize the existing methods for automatic change detection, and then propose a new approach for detecting changes based on local dictionary learning techniques that automatically ignores insignificant changes. Our new approach uses the L2 norm as the similarity measure to learn the dictionary. We also apply principal component analysis as a feature extraction tool, to eliminate redundancy and hence increase computational efficiency. The performance of the algorithm is validated with synthetic and clinical images.
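A minimal sketch of one ingredient of such a pipeline: PCA as a feature extraction and redundancy reduction step on patches of the difference image, with patches scored by the magnitude of their low-dimensional features. The patch size, number of components, and threshold are hypothetical; the paper couples this with a learned local dictionary, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
img1 = rng.random((64, 64))
img2 = img1.copy(); img2[20:28, 20:28] += 1.0    # one synthetic "lesion"
diff = img2 - img1

p = 8                                            # non-overlapping 8x8 patches
patches = np.array([diff[i:i + p, j:j + p].ravel()
                    for i in range(0, 64 - p, p) for j in range(0, 64 - p, p)])
patches = patches - patches.mean(axis=0)

U, s, Vt = np.linalg.svd(patches, full_matrices=False)
feat = patches @ Vt[:3].T                        # 3-D PCA features per patch
score = np.linalg.norm(feat, axis=1)             # change score in feature space
changed = score > score.mean() + 3 * score.std()
print(changed.sum(), "patches flagged as changed")
```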

14.
In this paper, we derive time reversal imaging functionals for two strongly causal acoustic attenuation models, which have been proposed recently. The time reversal techniques are based on recently proposed ideas of Ammari et al. for the thermo-viscous wave equation. In both cases, an asymptotic analysis provides reconstruction functionals from first-order corrections for the attenuating effect. In addition, we present a novel approach for higher order corrections. Copyright © 2013 John Wiley & Sons, Ltd.

15.
Blough (1985, Ann. Inst. Statist. Math., 37, 545–555) developed a multivariate location region for a random p-vector X. The dimension of this region provides information on the degree of symmetry possessed by the distribution of X. By considering all one-dimensional projections of X, it is possible to ascertain the dimension of the location region. Projection pursuit techniques can therefore be used to study symmetry in multivariate data sets. An example from an entomological investigation is presented illustrating these methods.
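A minimal sketch of the projection idea: scan one-dimensional projections u'X and test each for symmetry. A plain skewness test stands in for whichever univariate symmetry test one prefers; directions along which projections look asymmetric reveal the asymmetric part of the distribution, and the data-generating model below is hypothetical.

```python
import numpy as np
from scipy.stats import skewtest

rng = np.random.default_rng(0)
n, p = 1000, 3
X = rng.normal(size=(n, p))
X[:, 2] = rng.exponential(size=n)      # asymmetric in the third coordinate only

for _ in range(5):                     # random one-dimensional projections
    u = rng.normal(size=p); u /= np.linalg.norm(u)
    stat, pval = skewtest(X @ u)       # stand-in univariate symmetry test
    print(f"direction {np.round(u, 2)}: skewness p-value = {pval:.3f}")
```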

16.
A Bayesian approach is used to analyze seismic events with magnitudes of at least 4.7 in Taiwan. Following the idea proposed by Ogata (1988, Journal of the American Statistical Association, 83, 9–27), an epidemic model for the process of occurrence times given the observed magnitude values is considered, incorporating gamma prior distributions for the parameters of the model, with the hyper-parameters of the prior essentially determined by seismic data from an earlier period. Bayesian inference is made on the conditional intensity function via Markov chain Monte Carlo methods. The results yield acceptable accuracy in predicting large earthquake events within short time periods.
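A minimal sketch of the conditional intensity of Ogata's epidemic-type (ETAS) model, lambda(t) = mu + sum over past events of K·exp(alpha·(M_i − M0)) / (t − t_i + c)^p, the quantity on which the Bayesian inference is carried out. The parameter values and toy catalogue below are hypothetical; in the article (mu, K, c, alpha, p) receive gamma priors and are sampled by MCMC.

```python
import numpy as np

def etas_intensity(t, times, mags,
                   mu=0.2, K=0.05, c=0.01, alpha=1.5, p=1.1, M0=4.7):
    """Conditional intensity of the ETAS model at time t (days)."""
    past = times < t
    trig = K * np.exp(alpha * (mags[past] - M0)) / (t - times[past] + c) ** p
    return mu + trig.sum()             # background rate + triggered aftershocks

times = np.array([1.0, 3.5, 3.6, 10.0])   # toy catalogue of occurrence times
mags = np.array([5.1, 4.8, 4.9, 5.6])     # corresponding magnitudes
print(etas_intensity(12.0, times, mags))
```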

17.
High angular resolution diffusion imaging (HARDI) has recently been of great interest in mapping the orientation of intravoxel crossing fibers, and such orientation information allows one to infer the connectivity patterns prevalent among different brain regions and possible changes in such connectivity over time for various neurodegenerative and neuropsychiatric diseases. The aim of this article is to propose a penalized multiscale adaptive regression model (PMARM) framework to spatially and adaptively infer the orientation distribution function (ODF) of water diffusion in regions with complex fiber configurations. In PMARM, we reformulate HARDI image reconstruction as a weighted regularized least-squares regression (WRLSR) problem. Similarity and distance weights are introduced to account for the spatial smoothness of HARDI, while preserving the unknown discontinuities (e.g., edges between white matter and gray matter) of HARDI. The L1 penalty function is introduced to ensure sparse solutions for the ODFs, while a scaled L1 weighted estimator is calculated to correct the bias introduced by the L1 penalty at each voxel. In PMARM, we integrate the multiscale adaptive regression models, the propagation-separation method, and the Lasso (least absolute shrinkage and selection operator) to adaptively estimate ODFs across voxels. Experimental results indicate that PMARM can reduce angle detection errors in fiber-crossing areas and provide more accurate reconstruction than standard voxel-wise methods. Supplementary materials for this article are available online.
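A minimal sketch of the WRLSR idea at a single voxel: a weighted, L1-penalized least-squares fit of basis coefficients to the diffusion signal. The design matrix, weights, sparsity pattern, and all sizes are hypothetical placeholders, and the sample_weight argument of Lasso.fit assumes scikit-learn >= 0.23; the multiscale propagation-separation machinery of PMARM is not reproduced here.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
Phi = rng.normal(size=(60, 30))       # 60 gradient directions, 30 basis functions
beta_true = np.zeros(30); beta_true[[2, 7, 19]] = [1.0, -0.6, 0.8]
signal = Phi @ beta_true + 0.05 * rng.normal(size=60)
w = rng.uniform(0.5, 1.0, size=60)    # similarity/distance weights (hypothetical)

model = Lasso(alpha=0.01, fit_intercept=False, max_iter=50_000)
model.fit(Phi, signal, sample_weight=w)   # weighted L1-penalised LS fit
print(np.nonzero(model.coef_)[0])         # recovered sparse support
```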

18.
The purpose of this work is to present results about the composition of Fourier integral operators with certain singularities, for which the composition is not again a Fourier integral operator. The singularities considered here are folds and blowdowns. We prove that for such operators, the Schwartz kernel of F*F belongs to a class of distributions associated with two cleanly intersecting Lagrangians. Such Fourier integral operators appear in integral geometry, inverse acoustic scattering theory and Synthetic Aperture Radar imaging, where the composition calculus can be used as a tool for finding approximate inversion formulas and for recovering images.

19.
Benjamini asked whether the scenery reconstruction methods of Matzinger (see e.g. [21], [22], [20]) can be carried out in polynomial time. In this article, we give the following answer for a 2-color scenery and a simple random walk with holding: we prove that a piece of the scenery of length of order 3^n around the origin can be reconstructed – up to a reflection and a small translation – with high probability from the first 2 · 3^(10αn) observations, with a constant α > 0 independent of n. Thus, the number of observations needed is polynomial in the length of the piece of scenery which we reconstruct. The probability that the reconstruction fails tends to 0 as n → ∞. In contrast to [21], [22], and [20], the proofs in this article are all constructive. Our reconstruction algorithm is an algorithm in the sense of computer science. This is the first article which shows that scenery reconstruction is also possible in the 2-color case with holding. The case with holding is much more difficult than that of [22] and requires completely different methods.
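A minimal sketch of the observation model only, not the reconstruction algorithm: a 2-color i.i.d. scenery on a segment of Z is observed along a simple random walk with holding (steps −1, 0, +1 with equal probability). The segment length and clipping at its ends are array-safety conveniences, not part of the model.

```python
import numpy as np

rng = np.random.default_rng(0)
half = 10_000
xi = rng.integers(0, 2, size=2 * half + 1)    # colours on {-half, ..., half}

steps = rng.integers(-1, 2, size=200_000)     # walk with holding: -1, 0, +1
S = np.clip(np.cumsum(steps), -half, half)    # keep the walk on the segment
obs = xi[S + half]                            # observed colour sequence
print(obs[:20])                               # input to a reconstruction scheme
```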

20.
Some simple models are introduced which may be used for modelling or generating sequences of dependent discrete random variables with a generalized Poisson marginal distribution. Our approach to building these models is similar to that of the Poisson ARMA processes considered by Al-Osh and Alzaid (1987, J. Time Ser. Anal., 8, 261–275; 1988, Statist. Hefte, 29, 281–300) and McKenzie (1988, Adv. in Appl. Probab., 20, 822–835). The models have the same autocorrelation structure as their counterparts among standard ARMA models. Various properties, such as joint distribution, time reversibility and regression behavior, are investigated for each model.
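A minimal sketch of the binomial-thinning device behind such count models, an INAR(1) recursion X_t = α∘X_{t−1} + ε_t. Plain Poisson innovations are used as a stand-in, giving a stationary Poisson marginal and an AR(1)-type autocorrelation α^k; the paper's models replace the innovations so that the marginals are generalized Poisson.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, lam, T = 0.6, 2.0, 1000
x = np.empty(T, dtype=int)
x[0] = rng.poisson(lam / (1 - alpha))        # stationary Poisson marginal
for t in range(1, T):
    survivors = rng.binomial(x[t - 1], alpha)  # binomial thinning of X_{t-1}
    x[t] = survivors + rng.poisson(lam)        # plus new (innovation) arrivals
print(x[:20], x.mean())                      # mean approx. lam / (1 - alpha)
```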
