Journal of Scientific Information Research
Keywords
temporal knowledge graph; temporal knowledge graph construction; temporal knowledge graph completion; graph contrastive learning; sudden events
Abstract
[Purpose/significance] During emergencies, short texts on social media carry critical information but are heavily contaminated by noise. Traditional static knowledge graph completion techniques struggle to handle their dynamic evolution and data sparsity, making it imperative to introduce temporal modeling methods. [Method/process] This study proposes a dynamic completion framework that combines the temporal feature capture capability of Bidirectional Gated Recurrent Units (BiGRU) with the noise-resistant representation learning advantages of Graph Contrastive Learning (GCL). At the completion level, the ConBiTE method is introduced, which captures temporal dependencies through self-attention mechanisms and BiGRU, while leveraging GCL to enhance the completion of missing entities and relations. At the construction level, RoBERTa-CNN-BiLSTM-CRF is employed for entity recognition, and the Wenxin large language model is utilized for relation extraction, thereby improving the quality and efficiency of graph construction. [Result/conclusion] Experiments demonstrate that the proposed method outperforms traditional approaches in both completion and construction tasks, providing comprehensive technical support for dynamic information analysis and emergency response during emergencies, with significant theoretical and practical implications.
First Page
1
Last Page
11
Submission Date
May 2025
Revision Date
October 2025
Acceptance Date
October 2025
Publication Date
January 2026
Digital Object Identifier (DOI)
10.19809/j.cnki.kjqbyj.2026.01.001
Recommended Citation
WU, Peng; LU, Zhenyu; and ZHANG, Xuechen (2026) "Research on Temporal Knowledge Graph Completion Method for Emergent Events Based on BiGRU and Graph Contrastive Learning," Journal of Scientific Information Research: Vol. 8: Iss. 1, Article 1.
DOI: 10.19809/j.cnki.kjqbyj.2026.01.001
Available at: https://eng.kjqbyj.com/journal/vol8/iss1/1
Reference
[1] XU K J,LIU L,WANG H L,et al.A survey of temporal knowledge graph completion[J].Computer Engineering and Applications,2024,60(22):38-57.
[2] BAI L Y,ZHANG M C,ZHANG H,et al.FTMF:few-shot temporal knowledge graph completion based on meta-optimization and fault-tolerant mechanism[J].World Wide Web,2023,26(3):1243-1270.
[3] JIANG T,LIU T,GE T,et al.Encoding temporal information for time-aware link prediction[C]//Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing.Austin,TX,USA:Association for Computational Linguistics,2016:2350-2354.
[4] DASGUPTA S S,RAY S N,TALUKDAR P P,et al.HyTE:Hyperplane-based temporally aware knowledge graph embedding[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing.Brussels,Belgium:Association for Computational Linguistics,2018:2001-2011.
[5] SADEGHIAN A,ARMANDPOUR M,COLAS A,et al.ChronoR:Rotation based temporal knowledge graph embedding[C]//Proceedings of the AAAI Conference on Artificial Intelligence.Palo Alto,CA,USA:AAAI Press,2021,35(7):6471-6479.
[6] LI J,SU X,GAO G.TeAST:temporal knowledge graph embedding via archimedean spiral timeline[C]//Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics.Toronto,Canada:Association for Computational Linguistics,2023:15460-15474.
[7] HU S,WANG B,WANG J,et al.Transformer-based temporal knowledge graph completion[C]//Proceedings of the 2023 IEEE 3rd International Conference on Computer Communication and Artificial Intelligence.Taiyuan,China:IEEE,2023:443-448.
[8] CHEN Z,XU C,SU F,et al.Incorporating structured sentences with time-enhanced BERT for fully-inductive temporal relation prediction[C]//Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval.Taipei,China:ACM,2023:889-899.
[9] VELIČKOVIĆ P,CUCURULL G,CASANOVA A,et al.Graph attention networks[C]//Proceedings of the 6th International Conference on Learning Representations.Vancouver,BC,Canada:ICLR,2018.
[10] ZHANG S,HU Z,ARJUN S,et al.Motif-driven contrastive learning of graph representations[J].IEEE Transactions on Knowledge and Data Engineering,2024,36(8):4063-4075.
[11] LUO X,JU W,GU Y Y,et al.Self-supervised graph-level representation learning with adversarial contrastive learning[J].ACM Transactions on Knowledge Discovery from Data,2024,18(2):34.1-34.23.
[12] WU C,WANG C,XU J,et al.Graph contrastive learning with generative adversarial network[C]//Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining.Long Beach,CA,USA:ACM,2023:2721-2730.
[13] WEI C,WANG Y,BAI B,et al.Boosting graph contrastive learning via graph contrastive saliency[C]//Proceedings of the 40th International Conference on Machine Learning.Honolulu,HI,USA:PMLR,2023,202:36839-36855.
[14] JU W,GU Y,LUO X,et al.Unsupervised graph-level representation learning with hierarchical contrasts[J].Neural Networks,2023,158:359-368.
[15] LIU Y,ZHAO Y,WANG X,et al.Multi-scale subgraph contrastive learning[C]//Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence.Macao,China:IJCAI,2023:2215-2223.
[16] ZHANG Y,CHEN Y,SONG Z,et al.Contrastive cross-scale graph knowledge synergy[C]//Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining.Long Beach,CA,USA:ACM,2023:3422-3433.
[17] LIU D Y,FANG Q,ZHANG X W,et al.Knowledge graph completion based on graph contrastive attention network[J].Journal of Beijing University of Aeronautics and Astronautics,2022,48(8):1428-1435.
[18] QIAO Z F,QIN H C,HU J J,et al.Knowledge graph completion algorithm incorporating multi-view contrastive learning[J].Journal of Frontiers of Computer Science and Technology,2024,18(4):1001-1009.
[19] XU Y,OU J,XU H,et al.Temporal knowledge graph reasoning with historical contrastive learning[J].Proceedings of the AAAI Conference on Artificial Intelligence,2023,37(4):4765-4773.
[20] LU J M,ZHANG J,FENG J,et al.A survey of temporal knowledge graph construction[J].Journal of Frontiers of Computer Science and Technology,2025,19(2):295-135.
[21] ZHUANG L Y,WAYNE L,YA S,et al.A robustly optimized BERT pre-training approach with post-training[C]//Proceedings of the 20th Chinese National Conference on Computational Linguistics.Huhhot,China:CCL,2021:1218-1227.
[22] WIDIASTUTI N I.Convolution neural network for text mining and natural language processing[C]//IOP Conference Series: Materials Science and Engineering.Bristol,UK:IOP Publishing,2019,662(5):052010.
[23] YAN R G,JIANG X,DANG D P.Named entity recognition by using XLNet-BiLSTM-CRF[J].Neural Processing Letters,2021,53(5):3339-3356.
[24] ZHENG Y F,KE W J,LIU Q,et al.Making LLMs as fine-grained relation extraction data augmentor[C]//Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence.Jeju Island,Korea:IJCAI,2024:6660-6668.
[25] BAGHERSHAHI P,HOSSEINI R,MORADI H.Self-attention presents low-dimensional knowledge graph embeddings for link prediction[J].Knowledge-Based Systems,2023,260:110124.
[26] CHENG Y G,LIU K N,WANG S.Fault knowledge graph construction for aviation equipment based on BiGRU-Attention improvement[J].Acta Aeronautica et Astronautica Sinica,2024,45(18):229916.
[27] ZHU C C,CHEN M H,FAN C J,et al.Learning from history:Modeling temporal knowledge graphs with sequential copy-generation networks[C]//Proceedings of the AAAI Conference on Artificial Intelligence.Vancouver,BC,Canada:AAAI Press,2021,35(5):4732-4740.
[28] ZHU Y Y,WANG G X,KARLSSON B F.CAN-NER:Convolutional attention network for Chinese named entity recognition[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies.Minneapolis,MN,USA:Association for Computational Linguistics,2019:3384-3393.
[29] CHEN R,LU X L.Chinese named entity recognition combining dilated convolutional neural network and hierarchical attention mechanism[J].Journal of Chinese Information Processing,2020,34(8):70-77.
[30] LI D Y,YAN L,YANG J Z,et al.Dependency syntax guided BERT-BiLSTM-GAM-CRF for Chinese NER[J].Expert Systems with Applications,2022,196:116682.
[31] KONG J,ZHANG L X,JIANG M,et al.Incorporating multi-level CNN and attention mechanism for Chinese clinical named entity recognition[J].Journal of Biomedical Informatics,2021,116:103737.
[32] WAN Z,CHENG F,MAO Z Y,et al.GPT-RE:In-context learning for relation extraction using large language models[C]//Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing.Singapore:Association for Computational Linguistics,2023:3534-3547.
[33] SOARES L B,FITZGERALD N,LING J,et al.Matching the blanks:Distributional similarity for relation learning[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics.Florence,Italy:Association for Computational Linguistics,2019:2895-2905.
[34] JIN W,QU M,JIN X,et al.Recurrent event network:Autoregressive structure inference over temporal knowledge graphs[C]//Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing.Online:Association for Computational Linguistics,2020:6669-6683.
Included in
Graphics and Human Computer Interfaces Commons, Health Information Technology Commons, Library and Information Science Commons