A Chinese core journal in comprehensive science and technology (Peking University core list)

Source journal of the Chinese Science Citation Database (CSCD)

Indexed by Chemical Abstracts (CA, USA)

Indexed by Mathematical Reviews (MR, USA)

Indexed by the Russian Abstracts Journal (Referativnyi Zhurnal)


Chinese text relation extraction based on a multi-channel convolutional neural network

LIANG Yanchun, FANG Ailian

Citation: LIANG Yanchun, FANG Ailian. Chinese text relation extraction based on a multi-channel convolutional neural network[J]. Journal of East China Normal University (Natural Sciences), 2021, (3): 96-104. doi: 10.3969/j.issn.1000-5641.2021.03.010


doi: 10.3969/j.issn.1000-5641.2021.03.010
Details
    Corresponding author:

    FANG Ailian, female, associate professor, supervisor of master's students; research interest: computer application technology. E-mail: alfang@cs.ecnu.edu.cn

  • CLC number: TP391

Chinese text relation extraction based on a multi-channel convolutional neural network

  • Abstract: This paper presents a multi-channel convolutional neural network (CNN) method for end-to-end relation extraction from Chinese text. Each channel uses a layered network structure and propagates independently of the other channels, so the network can learn different representations. To address the difficulties specific to Chinese, an attention mechanism (Att) is added to capture richer semantic features, and piecewise average pooling incorporates sentence-structure information. After a max-pooling layer produces the final sentence representation, relation scores are computed, and a ranking loss function (RL) replaces the cross-entropy loss during training. Experimental results show that the proposed MCNN_Att_RL (Multi CNN_Att_RL) model effectively improves the precision, recall, and F1 score of relation extraction.
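The two ingredients the abstract highlights, piecewise average pooling and the ranking loss, can be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' code: the hyper-parameter values z = 2, a = 2.5, and b = 0.5 follow Table 2, while the segment boundaries, the exact loss formulation, and all function names are assumptions.

```python
import numpy as np

def piecewise_avg_pool(feature_map, e1, e2):
    """Average-pool a (seq_len, channels) feature map in three segments
    split at the two entity positions e1 < e2, then concatenate.
    (Assumed segmentation: [0, e1], (e1, e2], (e2, end).)"""
    segments = [feature_map[:e1 + 1],
                feature_map[e1 + 1:e2 + 1],
                feature_map[e2 + 1:]]
    return np.concatenate([seg.mean(axis=0) for seg in segments])

def ranking_loss(scores, gold, z=2.0, a=2.5, b=0.5):
    """Pairwise ranking loss over class scores: push the gold-class score
    above margin `a` and the hardest wrong-class score below `-b`,
    with `z` as a scaling factor (an assumed formulation)."""
    s_pos = scores[gold]
    s_neg = np.delete(scores, gold).max()  # hardest competing class
    return (np.log1p(np.exp(z * (a - s_pos)))
            + np.log1p(np.exp(z * (b + s_neg))))
```

The loss falls as the gold-class score rises above the others, which is the ranking behavior the abstract describes as a replacement for cross-entropy.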
  • Fig. 1  MCNN_Att_RL model structure

    Fig. 2  Pre-processed data

    Tab. 1  Relation categories

    No.  Category          Example                        Proportion/%
    1    Create            man - pottery                  2.93
    2    Use               grandmother - palm-leaf fan    4.76
    3    Near              mountain - county town         2.76
    4    Social            mother - neighbor              6.02
    5    Located           orchid - valley                37.43
    6    Ownership         villager - old house           5.10
    7    General-Special   fish - crucian carp            6.99
    8    Family            mother - grandmother           37.43
    9    Part-Whole        flower - cactus                23.76

    Tab. 2  Experimental parameters

    Parameter                       Value     Parameter   Value
    Word-embedding dimension        100       z           2
    Position-embedding dimension    10        a           2.5
    Convolution kernel size         9         b           0.5
    Learning rate                   0.001     batch       10
    Number of kernels               500       epoch       100
    Number of hidden layers         100       Dropout     0.5

    Tab. 3  Experimental results

    Model          P/%      R/%      F1/%
    BiLSTM         57.18    56.24    56.70
    Att-BiLSTM     56.26    59.20    57.70
    CNN            49.91    64.74    56.37
    MCNN           59.42    59.63    59.52
    MCNN_Att       60.72    64.27    62.45
    MCNN_Att_P     61.57    65.36    63.41
    MCNN_Att_RL    62.87    66.03    64.41
Publication history
  • Received: 2020-05-18
  • Published: 2021-05-01
