The incompleteness of knowledge graphs limits their usefulness in downstream tasks. To tackle this, we advance reasoning over knowledge graphs with missing links across several scenarios: measuring the conflicting predictions of individual models, leveraging LLMs to answer questions more effectively as new information becomes available, and generalizing reasoning approaches to knowledge graphs beyond triple-shaped facts, such as hyper-relational temporal knowledge graphs. Below are the abstracts for each paper:
Predictive Multiplicity of Knowledge Graph Embeddings in Link Prediction: Because the training problem of knowledge graph embedding (KGE) methods is non-convex, the same KG can yield different KGE models that converge to different local minima. While these models perform similarly in aggregate accuracy, they may encode different inductive patterns and thus make conflicting predictions on individual queries in deployment—a phenomenon known as predictive multiplicity. Zhu et al. introduce two metrics to measure predictive multiplicity in KGE-based link prediction and propose voting methods from social choice theory to mitigate the issue [1].
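To make the phenomenon concrete, here is a minimal sketch of how one might quantify disagreement between independently trained KGE models and resolve it by plurality voting. The function names and the specific disagreement measure are illustrative, not the two metrics or the exact voting schemes from the paper.

```python
from collections import Counter

def disagreement_rate(predictions):
    """Fraction of queries on which at least two models disagree.

    predictions: one list per model, each holding the model's top-1
    predicted entity id for every link-prediction query.
    """
    num_queries = len(predictions[0])
    contested = sum(
        1 for q in range(num_queries)
        if len({model[q] for model in predictions}) > 1
    )
    return contested / num_queries

def majority_vote(predictions):
    """Aggregate per-query predictions by plurality vote across models,
    in the spirit of social-choice aggregation."""
    num_queries = len(predictions[0])
    return [
        Counter(model[q] for model in predictions).most_common(1)[0][0]
        for q in range(num_queries)
    ]

# Three models, three queries: they agree on query 0 but not on 1 and 2.
models = [[1, 2, 3], [1, 2, 4], [1, 5, 4]]
print(disagreement_rate(models))  # 2 of 3 queries are contested
print(majority_vote(models))      # plurality winner per query
```

Even with identical held-out accuracy, a nonzero disagreement rate signals that model choice silently decides individual predictions; voting makes that choice explicit.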
Temporal Fact Reasoning over Hyper-Relational Knowledge Graphs: This paper enhances reasoning over temporal knowledge graphs (TKGs) by incorporating hyper-relational information, which existing TKG benchmarks lack. For instance, given two temporal facts, (Al Gore, received award, Nobel Peace Prize, 2007) and (Al Gore, received award, Primetime Emmy Award, 2007), we can attach key-value qualifiers that make each fact more expressive: (Al Gore, received award, Nobel Peace Prize, 2007, (together with, Intergovernmental Panel on Climate Change)) and (Al Gore, received award, Primetime Emmy Award, 2007, (for work, Current TV)). The authors formalize this setting as the hyper-relational temporal knowledge graph (HTKG) and introduce two new datasets, YAGO-hy and Wiki-hy, that augment TKGs with such qualifiers. To evaluate the benefit of hyper-relational information, they also propose HypeTKG, a model that exploits the richer context of HTKGs to improve temporal reasoning [2].
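The fact format above can be sketched as a simple data structure: a temporal quadruple (subject, relation, object, time) extended with a dictionary of key-value qualifiers. The field names are illustrative and do not reflect the actual schema of YAGO-hy or Wiki-hy.

```python
from dataclasses import dataclass, field

@dataclass
class HTKGFact:
    """A hyper-relational temporal fact: a (s, r, o, t) quadruple
    plus optional key-value qualifiers that add context."""
    subject: str
    relation: str
    obj: str
    time: int
    qualifiers: dict = field(default_factory=dict)

# The Al Gore example from the text, with its qualifier attached.
fact = HTKGFact(
    "Al Gore", "received award", "Nobel Peace Prize", 2007,
    qualifiers={"together with": "Intergovernmental Panel on Climate Change"},
)
print(fact)
```

The qualifier dictionary is what distinguishes an HTKG fact from a plain TKG quadruple: two facts with identical (s, r, t) components, like the two 2007 awards, stay distinguishable through their qualifiers.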
LLM-Based Multi-Hop Question Answering with Knowledge Graph Integration in Evolving Environments: This work introduces Graph Memory-based Editing for Large Language Models (GMeLLo), an approach that combines two complementary tools for complex reasoning: the flexible language abilities of LLMs and the structured, easy-to-update representation of knowledge graphs. This combination lets the system not only answer questions more effectively but also quickly update its knowledge when new information becomes available. The results show that GMeLLo outperforms other leading methods, especially when many facts must be updated and multiple reasoning steps are required to answer a question accurately. These results point toward more reliable AI systems that keep their answers consistent with an evolving world [3].
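The core idea, that a structured fact store makes edits cheap while chained lookups support multi-hop questions, can be sketched with a toy dictionary-backed KG. All names and example facts here are hypothetical illustrations, not GMeLLo's actual interface or pipeline (which uses an LLM to extract facts and translate questions into graph queries).

```python
# Toy KG: (subject, relation) -> object. Example entities are hypothetical.
kg = {
    ("UK", "head of government"): "PersonA",
    ("PersonA", "spouse"): "SpouseA",
    ("PersonB", "spouse"): "SpouseB",
}

def edit_fact(kg, subject, relation, new_object):
    """Apply a knowledge edit: a single overwrite of the stored triple,
    which is what makes the structured memory easy to update."""
    kg[(subject, relation)] = new_object

def multi_hop(kg, start, relations):
    """Answer a multi-hop question by following one relation per hop."""
    entity = start
    for rel in relations:
        entity = kg[(entity, rel)]
    return entity

# Before the edit: "Who is the spouse of the UK head of government?"
before = multi_hop(kg, "UK", ["head of government", "spouse"])

# New information arrives: PersonB is now head of government.
edit_fact(kg, "UK", "head of government", "PersonB")
after = multi_hop(kg, "UK", ["head of government", "spouse"])
print(before, "->", after)
```

Note how a single edited fact changes the answer to a two-hop question, which is exactly the evolving-environment setting where parameter-editing baselines tend to struggle.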
References:
[1] Y. Zhu, N. Potyka, M. Nayyeri, B. Xiong, Y. He, E. Kharlamov, S. Staab. Predictive Multiplicity of Knowledge Graph Embeddings in Link Prediction. Findings of the 2024 Conference on Empirical Methods in Natural Language Processing.
[2] Z. Ding, J. Wu, J. Wu, Y. Xia, B. Xiong, V. Tresp. Temporal Fact Reasoning over Hyper-Relational Knowledge Graphs. Findings of the 2024 Conference on Empirical Methods in Natural Language Processing.
[3] R. Chen, W. Jiang, C. Qin, I.S. Rawal, C. Tan, D. Choi, B. Xiong, B. Ai. LLM-Based Multi-Hop Question Answering with Knowledge Graph Integration in Evolving Environments. Findings of the 2024 Conference on Empirical Methods in Natural Language Processing.