Target Recognition Method for Multi-aspect Synthetic Aperture Radar Images Based on EfficientNet and BiGRU
Citation: ZHAO Pengfei, HUANG Lijia. Target Recognition Method for Multi-aspect Synthetic Aperture Radar Images Based on EfficientNet and BiGRU[J]. Journal of Radars, 2021, 10(6): 895-904.
Authors: ZHAO Pengfei, HUANG Lijia
Institutions: 1. Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100094, China; 2. Key Laboratory of Technology in Geo-spatial Information Processing and Application System, Chinese Academy of Sciences, Beijing 100190, China; 3. University of Chinese Academy of Sciences, Beijing 100049, China
Funding: National Natural Science Foundation of China (61991420, 62022082); special support from the Youth Innovation Promotion Association, Chinese Academy of Sciences

Abstract: Automatic Target Recognition (ATR) for Synthetic Aperture Radar (SAR) is now widely applied in both military and civilian fields. SAR images are extremely sensitive to the imaging azimuth: images of the same target acquired at different aspect angles differ noticeably, while a multi-aspect sequence of SAR images carries richer information for classification and recognition. This paper therefore proposes a multi-aspect SAR target recognition model based on EfficientNet and BiGRU, trained with the island loss. The method achieves 100% recognition accuracy on the 10-class recognition task of the MSTAR dataset, and reaches 99.68%, 99.95%, and 99.91% accuracy in three special cases: imaging at large depression (grazing) angles, version variants, and configuration variants, respectively. It also achieves satisfactory accuracy on small-scale datasets. Experimental results show that the method outperforms other multi-aspect SAR target recognition methods on most MSTAR subsets and exhibits a degree of robustness.

Keywords: Synthetic Aperture Radar (SAR); Automatic Target Recognition (ATR); multi-aspect recognition; EfficientNet
Received: 2020-10-26
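The abstract states that the model is trained with the island loss but does not give its formula. As a minimal illustrative sketch (assuming the commonly used formulation: a center-loss term pulling features toward their class centers, plus a penalty on the pairwise cosine similarity between class centers, shifted by +1 so every pair contributes non-negatively; the function and variable names here are hypothetical, not from the paper):

```python
import numpy as np

def island_loss(features, labels, centers, lam=0.5):
    """Sketch of an island-style loss: center loss plus a center-separation term.

    features: (n, d) embedding for each sample
    labels:   (n,)   class index of each sample
    centers:  (k, d) one center per class (learnable in practice; fixed here)
    lam:      weight of the center-separation ("island") term
    """
    # Center-loss term: pull each feature toward its own class center.
    center_term = 0.5 * np.sum((features - centers[labels]) ** 2)

    # Island term: penalize similarity between different class centers,
    # using (cosine similarity + 1) so each pair contributes >= 0.
    unit = centers / np.linalg.norm(centers, axis=1, keepdims=True)
    cos = unit @ unit.T
    pairwise = np.sum(cos + 1.0) - np.trace(cos + 1.0)  # drop the j == k terms
    return center_term + lam * pairwise
```

With orthogonal centers the island term is minimized toward zero similarity; intuitively, this both compacts each class cluster and pushes the clusters ("islands") apart, which suits the small inter-class differences typical of SAR targets.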
