Chinese Core Journal in Comprehensive Science and Technology (Peking University Core)

Source journal of the Chinese Science Citation Database (CSCD)

Indexed by Chemical Abstracts (CA), USA

Indexed by Mathematical Reviews (MR), USA

Indexed by the Russian Abstract Journal (Referativnyi Zhurnal)

Issue 5, Dec. 2019
Citation: YANG Dong-ming, YANG Da-wei, GU Hang, HONG Dao-cheng, GAO Ming, WANG Ye. Research on knowledge point relationship extraction for elementary mathematics[J]. Journal of East China Normal University (Natural Sciences), 2019, (5): 53-65. doi: 10.3969/j.issn.1000-5641.2019.05.004

Research on knowledge point relationship extraction for elementary mathematics

doi: 10.3969/j.issn.1000-5641.2019.05.004
  • Received Date: 2019-07-29
  • Publish Date: 2019-09-25
  • With the development of Internet technology, online education has changed the way students learn. However, lacking a complete knowledge system, online education platforms offer a low degree of intelligence and leave learners with a "knowledge trek" problem. Extracting relations between concepts is one of the key elements of knowledge system construction; building knowledge systems has therefore become a core technology of online education platforms. At present, the more effective relation extraction algorithms are usually supervised; however, such methods suffer from low text quality, corpus scarcity, the difficulty of labeling data, inefficient feature engineering, and the difficulty of extracting directional relationships. This paper therefore studies relation extraction between concepts based on an encyclopedic corpus and distant supervision. An attention mechanism based on relational representation is proposed, which can extract forward relationship information between knowledge points. Combining the advantages of GCN and LSTM, a GCLSTM model is proposed, which better extracts multi-point information in sentences. Based on the attention mechanism of the Transformer architecture and relational representation, a BTRE model suited to extracting directional relationships is proposed, which reduces model complexity. Finally, a knowledge point relationship extraction system is designed and implemented. The performance and efficiency of the models are verified through three sets of comparative experiments.
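To make the GCLSTM idea above concrete, the following is a minimal, illustrative sketch (not the authors' code): a BiLSTM supplies sequential context for each token, a single graph-convolution step propagates information over a token adjacency matrix, and the pooled sentence representation is classified into relation types. The class name, hidden sizes, pooling choice, and the toy self-loop adjacency are all assumptions made for illustration.

import torch
import torch.nn as nn

class GCLSTM(nn.Module):
    """Toy GCLSTM: BiLSTM for sequence context + one GCN step over an adjacency matrix."""
    def __init__(self, emb_dim=100, hidden_dim=128, num_relations=10):
        super().__init__()
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.gcn = nn.Linear(2 * hidden_dim, hidden_dim)    # single graph-convolution layer
        self.classifier = nn.Linear(hidden_dim, num_relations)

    def forward(self, embeddings, adjacency):
        # embeddings: (batch, seq_len, emb_dim); adjacency: (batch, seq_len, seq_len)
        seq_out, _ = self.lstm(embeddings)                           # sequential (LSTM) features
        deg = adjacency.sum(dim=-1, keepdim=True).clamp(min=1.0)     # degree normalization
        graph_out = torch.relu(self.gcn(adjacency @ seq_out / deg))  # propagate over the graph
        sentence_repr = graph_out.max(dim=1).values                  # max-pool over tokens
        return self.classifier(sentence_repr)                        # scores per relation type

# Toy usage: 2 sentences, 6 tokens each, identity (self-loop) adjacency.
x = torch.randn(2, 6, 100)
adj = torch.eye(6).expand(2, 6, 6)
print(GCLSTM()(x, adj).shape)   # torch.Size([2, 10])

In the paper's setting, the adjacency would come from dependency parses or knowledge point co-occurrence rather than the identity matrix used here.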
