Research on Modeling a Grammar Error Correction Algorithm Based on Deep Learning
Cite this article: GUO Yan, ZHANG Mao. Research on modeling grammar error correction algorithm based on deep learning [J]. Information Technology, 2021(4): 148-152, 158.
Authors: GUO Yan, ZHANG Mao
Institution: Shangluo Technical and Vocational College
Abstract: To address the shortcomings of the basic seq2seq deep learning algorithm in grammar error correction precision and recall, an improved seq2seq grammar error correction algorithm incorporating an Attention mechanism and a Transformer module is proposed. The Attention mechanism records language information at the decoder and encoder ends, improving information completeness; beam search combined with a copy mechanism performs heuristic search, easing the memory consumed by the solution space; and the Transformer module performs self-attention feature extraction, expanding the sentence-vector data and yielding context-aware error correction. Finally, a suitable corpus is selected to compare the precision, recall, and F0.5 evaluation metrics of different grammar error correction algorithms. The results demonstrate the effectiveness of the improved model, which raises both the precision and recall of grammar error correction.

Keywords: seq2seq algorithm; attention mechanism; Transformer module; grammar error correction
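The self-attention feature extraction mentioned in the abstract is, in a standard Transformer, scaled dot-product attention. A minimal NumPy sketch of that computation (illustrative only, not the authors' implementation; all names here are assumptions):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention, the core of Transformer self-attention.

    Q, K, V: (seq_len, d_k) arrays. Returns the (seq_len, d_k) context
    vectors and the (seq_len, seq_len) attention-weight matrix.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # pairwise token similarities
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights

# Toy example: 3 tokens with dimension 4; self-attention uses Q = K = V
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
context, weights = scaled_dot_product_attention(X, X, X)
```

In a full Transformer block, Q, K, and V are separate learned projections of the token embeddings; the sketch skips those projections to show only the attention step itself.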

Research on modeling grammar error correction algorithm based on deep learning
GUO Yan, ZHANG Mao. Research on modeling grammar error correction algorithm based on deep learning [J]. Information Technology, 2021(4): 148-152, 158.
Authors: GUO Yan, ZHANG Mao
Institution: (Shangluo Technical and Vocational College, Shangluo 726000, Shaanxi Province, China)
Abstract: To address the shortcomings of the basic seq2seq deep learning algorithm in grammar error correction precision and recall, an improved seq2seq grammar error correction algorithm that fuses an Attention mechanism and a Transformer module is proposed. To counter the severe loss of detail information in sentence-vector computation, the Attention mechanism records language information at both the decoder and encoder ends, improving the completeness of the information. Beam search combined with a copy mechanism performs heuristic search, alleviating the excessive memory consumed by the solution space when parsing sentences. The Transformer module carries out self-attention feature extraction to capture local errors in sentence translation, expanding the sentence-vector data and yielding context-aware error correction. Finally, a suitable corpus is selected to compare the precision, recall, and F0.5 evaluation metrics of different grammar error correction algorithms. The results show that the improved algorithm model is effective, raising both the precision and recall of grammar error correction.
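The F0.5 metric used in the evaluation weighs precision twice as heavily as recall, which is conventional in grammar error correction because spurious "corrections" are costlier than missed errors. A minimal computation from hypothetical edit counts (the numbers below are illustrative, not the paper's results):

```python
def f_beta(precision, recall, beta=0.5):
    """General F-beta score; beta < 1 favors precision over recall."""
    if precision == 0 and recall == 0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Hypothetical corrector: proposes 80 edits, 60 match the reference,
# and the reference annotation contains 100 gold edits in total.
precision = 60 / 80   # 0.75
recall = 60 / 100     # 0.60
score = f_beta(precision, recall, beta=0.5)  # ≈ 0.714
```

With beta = 1 the same formula reduces to the familiar F1 harmonic mean of precision and recall.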
Keywords: seq2seq algorithm; attention mechanism; Transformer module; grammar error correction
This article is indexed in databases including VIP (Weipu).