Gait recognition method of infrared human body images based on improved ViT
Cite this article: YANG Yanchen, YUN Lijun, MEI Jianhua, LU Lin. Gait recognition method of infrared human body images based on improved ViT[J]. Journal of Applied Optics, 2023, 44(1): 71-78.
Authors: YANG Yanchen, YUN Lijun, MEI Jianhua, LU Lin
Affiliation: 1. College of Information, Yunnan Normal University, Kunming 650500, Yunnan, China
Funding: Key Project of the Yunnan Applied Basic Research Program (2018FA033); Graduate Research and Innovation Fund of Yunnan Normal University (YJSJJ21-B77)
Abstract: To address the tendency of convolutional neural networks (CNNs) to saturate in accuracy on gait recognition tasks, and the low efficiency with which the Vision Transformer (ViT) fits gait data sets, a symmetric dual attention mechanism model is proposed. The model preserves the temporal order of the walking posture and fits the gait image patches with several independent feature subspaces. A symmetric architecture is adopted to strengthen the role of the attention modules in fitting gait features, and heterogeneous transfer learning is used to further improve the feature fitting efficiency. The model was evaluated in repeated simulation experiments on the CASIA C infrared human gait database of the Chinese Academy of Sciences, achieving an average recognition accuracy of 96.8%. The results show that the proposed model outperforms the traditional ViT model and the CNN baselines in stability, data fitting speed, and recognition accuracy.
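The abstract does not detail the mechanism's internals, but the two ideas it names — independent feature subspaces fitting the patch sequence (i.e. attention heads) and a symmetric arrangement of attention branches — can be sketched roughly. The following is a minimal NumPy illustration under assumptions of my own (two parallel multi-head attention branches fused by averaging with a residual connection; all function names, head counts, and dimensions are hypothetical, not the authors' implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, w_q, w_k, w_v, n_heads):
    # x: (seq_len, d_model) patch embeddings, in temporal order;
    # each head attends within its own independent feature subspace
    seq_len, d_model = x.shape
    d_head = d_model // n_heads
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    out = np.empty_like(x)
    for h in range(n_heads):
        s = slice(h * d_head, (h + 1) * d_head)
        scores = softmax(q[:, s] @ k[:, s].T / np.sqrt(d_head))
        out[:, s] = scores @ v[:, s]
    return out

def symmetric_dual_attention(x, params_a, params_b, n_heads):
    # two parallel (mirror-image) attention branches over the same
    # patch sequence, fused by averaging; the residual keeps the
    # original temporal order of the walking-posture patches intact
    a = multi_head_attention(x, *params_a, n_heads)
    b = multi_head_attention(x, *params_b, n_heads)
    return x + 0.5 * (a + b)

# demo: 16 patch embeddings (e.g. flattened gait-silhouette patches), d_model = 8
rng = np.random.default_rng(0)
patches = rng.standard_normal((16, 8))
branch_a = tuple(0.1 * rng.standard_normal((8, 8)) for _ in range(3))  # w_q, w_k, w_v
branch_b = tuple(0.1 * rng.standard_normal((8, 8)) for _ in range(3))
fused = symmetric_dual_attention(patches, branch_a, branch_b, n_heads=4)
print(fused.shape)  # (16, 8)
```

The sketch only shows the shape of such a mechanism; the paper's actual model, patch embedding, and transfer-learning setup are described in the full text.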

Keywords: gait recognition; symmetric dual attention mechanism; transfer learning; infrared human body image; Vision Transformer; convolutional neural network
Received: 2022-03-21

Institutions: 1. College of Information, Yunnan Normal University, Kunming 650500, China; 2. Yunnan Provincial Key Laboratory of Optoelectronic Information Technology, Kunming 650500, China; 3. Department of Equipment Information, Yunnan Tobacco Leaf Company, Kunming 650218, China