Chinese Knowledge Graph Question Answering Method Based on ChineseBERT

DOI:
Authors: Deng Jianzhi, Fang Yutong
Affiliation: School of Information Science and Engineering, Guilin University of Technology
CLC Number: TP391
Fund Project: National Natural Science Foundation of China (No. 81660031)


Abstract:

    To address the poor question-answering performance caused by the complexity of Chinese glyphs and semantics, a knowledge graph question answering method based on the Chinese pre-trained language model ChineseBERT (ChineseBERT-KBQA) is proposed. The method uses ChineseBERT as the semantic embedding layer for text; by fusing glyph and pinyin information, it improves the performance of traditional semantic-parsing methods on the entity mention recognition and relation prediction subtasks. Specifically, an entity mention recognition model based on ChineseBERT-CRF and a relation prediction model based on ChineseBERT-TextCNN-Softmax are proposed to comprehensively improve semantic understanding of Chinese text. Finally, the information shared between the subtasks is combined to predict the final answer. Experimental results on the educational Q&A dataset MOOC Q&A and the open-domain Q&A dataset NLPCC2018 demonstrate the effectiveness of the proposed method.
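The abstract's pipeline — tag entity mentions, predict a relation, then combine both to retrieve the answer — can be sketched in miniature. The sketch below is illustrative only: it assumes the ChineseBERT-CRF tagger emits BIO labels and substitutes a hand-written tag sequence and a toy triple store for the actual models and knowledge graph; all names and data are hypothetical, not from the paper.

```python
# Hypothetical sketch of the semantic-parsing QA pipeline: (1) decode entity
# mentions from BIO tags such as a ChineseBERT-CRF tagger would emit,
# (2) pair each mention with the predicted relation, and (3) look the triple
# up in a toy knowledge graph. Tags and KG contents here are hand-written
# stand-ins for the model outputs described in the abstract.

def decode_mentions(tokens, bio_tags):
    """Collect contiguous B-/I- spans into entity mention strings."""
    mentions, current = [], []
    for token, tag in zip(tokens, bio_tags):
        if tag == "B":                  # a new mention starts here
            if current:
                mentions.append("".join(current))
            current = [token]
        elif tag == "I" and current:    # continue the open mention
            current.append(token)
        else:                           # "O" (or a stray "I") closes any span
            if current:
                mentions.append("".join(current))
            current = []
    if current:
        mentions.append("".join(current))
    return mentions

def answer(question_tokens, bio_tags, predicted_relation, kg):
    """Combine the two subtask outputs to retrieve an answer entity."""
    for mention in decode_mentions(question_tokens, bio_tags):
        if (mention, predicted_relation) in kg:
            return kg[(mention, predicted_relation)]
    return None

# Toy knowledge graph: (head entity, relation) -> tail entity.
kg = {("桂林理工大学", "所在城市"): "桂林"}

tokens = list("桂林理工大学在哪个城市")
tags = ["B", "I", "I", "I", "I", "I", "O", "O", "O", "O", "O"]
print(answer(tokens, tags, "所在城市", kg))  # -> 桂林
```

In the full method, `bio_tags` would come from the ChineseBERT-CRF model and `predicted_relation` from the ChineseBERT-TextCNN-Softmax classifier; only this final lookup-and-combine step is shown here.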

Cite this article:

Deng Jianzhi, Fang Yutong. Chinese Knowledge Graph Question Answering Method Based on ChineseBERT[J]. Science Technology and Engineering, , ():
History
  • Received: 2023-05-28
  • Revised: 2023-08-26
  • Accepted: 2023-09-01