Deriving new soft tissue contrasts from conventional MR images using deep learning
Affiliation:1. Medical Imaging Technologies, Siemens Medical Solutions USA, Inc., Princeton, NJ, USA;2. Department of Biomedical Engineering, University of Minnesota, Minneapolis, MN, USA;3. Siemens Healthcare, Application Development, Erlangen, Germany;4. Radiology, Case Western Reserve University, Cleveland, OH, USA;1. Department of Physics, University of Alberta, 4-181 CCIS, Edmonton, Alberta T6G 2E1, Canada;2. Department of Biomedical Engineering, University of Alberta, 1098 RTF, Edmonton, Alberta T6G 2V2, Canada
Abstract: Versatile soft tissue contrast is a unique advantage of magnetic resonance imaging, yet this versatility is not fully exploited in practice. In this study, we propose a deep learning-based strategy to derive additional soft tissue contrasts from conventional MR images obtained in standard clinical MRI. Two types of experiments are performed. First, MR images corresponding to different pulse sequences are predicted from one or more images already acquired; as an example, we predict T1ρ-weighted knee images from T2-weighted and/or T1-weighted images. Second, we estimate images corresponding to alternative imaging parameter values; in a representative case, variable flip angle images are predicted from a single T1-weighted image, and their accuracy is further validated against the quantitative T1 map subsequently derived. To accomplish these tasks, images are retrospectively collected from 56 subjects, and self-attention convolutional neural network models are trained on 1104 knee images from 46 subjects and tested on 240 images from the remaining 10 subjects. High accuracy is achieved in the resulting qualitative images as well as in the quantitative T1 maps. The proposed deep learning method can be broadly applied to obtain more versatile soft tissue contrasts without additional scans, or used to normalize MR data that were inconsistently acquired for quantitative analysis.
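The abstract's validation step, deriving a quantitative T1 map from variable flip angle (VFA) spoiled gradient-echo images, is conventionally done with the linearized DESPOT1 fit: rewriting the SPGR signal equation S(α) = M0·sin(α)·(1−E1)/(1−E1·cos(α)), with E1 = exp(−TR/T1), as the line S/sin(α) = E1·(S/tan(α)) + M0·(1−E1), so that the slope of a linear fit yields E1 and hence T1. The sketch below illustrates that standard fit; it is not the paper's code, and the TR, flip angles, and T1 value are illustrative assumptions, not values from the study.

```python
import numpy as np

def spgr_signal(m0, t1, tr, flip_rad):
    """Steady-state spoiled gradient-echo (SPGR) signal for flip angle(s) in radians."""
    e1 = np.exp(-tr / t1)
    return m0 * np.sin(flip_rad) * (1 - e1) / (1 - e1 * np.cos(flip_rad))

def vfa_t1_fit(signals, flips_rad, tr):
    """Estimate T1 from VFA SPGR signals via the linearized DESPOT1 fit:
    S/sin(a) = E1 * (S/tan(a)) + M0*(1 - E1); slope of the line is E1."""
    y = signals / np.sin(flips_rad)
    x = signals / np.tan(flips_rad)
    slope, _intercept = np.polyfit(x, y, 1)
    return -tr / np.log(slope)  # T1 = -TR / ln(E1)

# Simulate a two-flip-angle acquisition for a tissue with T1 = 1200 ms
# (all parameter values here are illustrative, not from the study).
tr = 15.0                          # repetition time in ms
flips = np.deg2rad([4.0, 18.0])    # two flip angles, a common VFA choice
sig = spgr_signal(1.0, 1200.0, tr, flips)
t1_est = vfa_t1_fit(sig, flips, tr)
```

In this noise-free simulation the linear fit recovers the true T1; applied to the network-predicted VFA images, the same fit produces the quantitative T1 map used for validation.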
Indexed by: ScienceDirect, among other databases.