1.
Accurate segmentation of cerebellopontine angle (CPA) tumors has an important bearing on surgical treatment and radiotherapy. This paper studies automatic CPA tumor segmentation by combining the Faster Region-based Convolutional Neural Network (Faster-RCNN) with the Level-Set method. First, T1WI-SE magnetic resonance images of 317 patients with CPA tumors were collected. Features were extracted with the VGG16 backbone of Faster-RCNN and trained together with a Region Proposal Network (RPN) to build a localization model carrying CPA tumor position information, after which the Level-Set method was applied for precise segmentation. The effect of different CPA tumor delineation ranges on the segmentation result was compared, and the localization and segmentation performance of the model was evaluated with precision, recall, mean average precision (mAP), and the Dice coefficient. Experimental results show that the model combining Faster-RCNN and Level-Set segments CPA tumors more accurately, reducing the workload of clinicians and improving treatment outcomes.
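The refinement stage described above can be illustrated with a minimal Chan-Vese level-set sketch (one common Level-Set formulation; the abstract does not specify which variant the authors use). The function names `chan_vese_step` and `segment_roi` are hypothetical, the ROI is assumed to be the crop returned by the Faster-RCNN localization model, and the curvature-regularization term of the full model is omitted for brevity:

```python
import numpy as np

def chan_vese_step(phi, img, dt=1.0, eps=1.0):
    """One Chan-Vese update of the level-set function phi over image img."""
    inside = phi > 0
    # Mean intensity inside and outside the current contour
    c1 = img[inside].mean() if inside.any() else 0.0
    c2 = img[~inside].mean() if (~inside).any() else 0.0
    # Smoothed Dirac delta concentrates the update near the zero level set
    delta = eps / (np.pi * (eps ** 2 + phi ** 2))
    # Region-competition force (curvature regularization omitted for brevity)
    force = -(img - c1) ** 2 + (img - c2) ** 2
    return phi + dt * delta * force

def segment_roi(img, n_iter=300):
    """Segment a bright object inside a detector-cropped ROI (2-D float array)."""
    h, w = img.shape
    yy, xx = np.mgrid[:h, :w]
    # Initialize phi as a scaled signed distance to a centred circle,
    # standing in for the detector's localization prior
    phi = (min(h, w) / 3.0 - np.sqrt((yy - h / 2) ** 2 + (xx - w / 2) ** 2)) / 5.0
    for _ in range(n_iter):
        phi = chan_vese_step(phi, img)
    return phi > 0  # binary tumor mask
```

In the pipeline above, restricting the evolution to the detector's bounding box is what lets the Level-Set stage converge on the tumor rather than on unrelated bright structures elsewhere in the slice.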
2.
With the rollout of intelligent substations, more and more robots are being used at industrial sites. However, most meter-reading methods suffer interference from complex background environments, which makes it difficult to extract the meter region and the pointer centerline and so fails to meet the practical needs of substations. To address these problems in industrial pointer-meter reading, this paper studies the automatic reading of pointer instruments by combining Faster Region-based Convolutional Network (Faster-RCNN) object detection with traditional computer vision. First, Faster-RCNN is used to detect the instrument-panel region. At the same time, a Poisson fusion method is proposed to expand the data set, and a K-fold validation algorithm is used to optimize data-set quality, which addresses the small size and low quality of the data set and improves detection accuracy. The image is then preprocessed with standard image-processing methods. Finally, the pointer centerline is detected by the Hough transform, from which the reading is obtained. Evaluation of the algorithm's performance shows that the proposed method is suitable for automatic reading of pointer meters in substation environments and offers a feasible approach to pointer-meter detection and reading.
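The final step, converting a detected pointer into a reading, can be sketched as follows. This is a simplified stand-in, assuming the dial centre and the scale's angular range are known from calibration; instead of a full Hough transform it votes over pixel angles about the centre (equivalent to a one-dimensional Hough restricted to lines through the dial centre), and `pointer_reading` is a hypothetical name:

```python
import numpy as np

def pointer_reading(mask, center, theta_min, theta_max, v_min, v_max, n_bins=360):
    """Read a pointer meter from a binary pointer mask.

    mask       : 2-D bool array, True on pointer pixels (after preprocessing)
    center     : (x, y) of the dial centre, from calibration
    theta_min  : pointer angle in degrees at the scale's minimum graduation
    theta_max  : pointer angle at the maximum graduation
    v_min/v_max: the values those two graduations represent
    """
    ys, xs = np.nonzero(mask)
    # Angle of every pointer pixel about the dial centre: each pixel casts
    # one vote, so the histogram peak is the pointer's direction
    angles = np.degrees(np.arctan2(ys - center[1], xs - center[0]))
    hist, edges = np.histogram(angles, bins=n_bins, range=(-180.0, 180.0))
    peak = hist.argmax()
    theta = 0.5 * (edges[peak] + edges[peak + 1])
    # Linear interpolation from pointer angle to scale value
    frac = (theta - theta_min) / (theta_max - theta_min)
    return v_min + frac * (v_max - v_min)
```

On a real dial the angle-to-value mapping may be piecewise rather than linear, in which case `frac` would be interpolated through a table of calibrated graduation angles instead.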