Dr. Sarah Chen
Associate Professor of Computer Science
Stanford University
My research focuses on natural language processing, machine learning, and the intersection of language understanding with knowledge representation.
KnowledgeBridge: Integrating Structured Knowledge Graphs with Large Language Models
Published in ICLR, 2026
Ryo Nakamura, Sarah Chen, Liang Wei
Abstract
Large language models possess remarkable generative capabilities but struggle with multi-hop reasoning tasks that require chaining multiple facts together, particularly when those facts are not well-represented in the training data. Knowledge graphs offer a complementary source of structured, verifiable information, but integrating them effectively with autoregressive language models remains an open challenge.
We introduce KnowledgeBridge, a framework that dynamically retrieves and grounds relevant knowledge graph subgraphs during language model generation. Our approach consists of three components: a query-driven subgraph retriever that identifies relevant KG neighborhoods at each generation step, a graph encoder that projects subgraph structure into the language model's representation space, and a gated fusion mechanism that allows the model to selectively attend to structured knowledge when beneficial.
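The gated fusion step described above can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: all names (`gated_fusion`, `W_proj`, `W_gate`) and the exact gating form (a scalar sigmoid gate over the concatenated LM state and projected subgraph vector) are assumptions for exposition; the paper's actual mechanism may gate per-dimension or per-attention-head.

```python
# Hypothetical sketch of a gated fusion mechanism: an LM hidden state is
# selectively mixed with a projected KG-subgraph encoding via a learned gate.
import numpy as np

rng = np.random.default_rng(0)

def gated_fusion(h_lm, h_kg, W_proj, W_gate, b_gate):
    """Fuse an LM hidden state with an encoded KG subgraph vector.

    h_lm:   (d,)      language-model hidden state at the current step
    h_kg:   (d_kg,)   pooled graph-encoder output for the retrieved subgraph
    W_proj: (d, d_kg) projects the subgraph encoding into the LM space
    W_gate: (2*d,)    gate weights over [h_lm; projected subgraph]
    b_gate: scalar    gate bias (large negative -> ignore the KG signal)
    """
    kg_proj = W_proj @ h_kg                       # project KG vector into LM space
    z = np.concatenate([h_lm, kg_proj])           # gate conditions on both views
    gate = 1.0 / (1.0 + np.exp(-(W_gate @ z + b_gate)))  # scalar gate in (0, 1)
    fused = gate * kg_proj + (1.0 - gate) * h_lm  # convex mix of the two states
    return fused, gate

d, d_kg = 8, 4
h_lm = rng.standard_normal(d)
h_kg = rng.standard_normal(d_kg)
fused, gate = gated_fusion(h_lm, h_kg,
                           rng.standard_normal((d, d_kg)),
                           rng.standard_normal(2 * d),
                           0.0)
```

A gate of this shape offers one plausible account of the "knowledge noise" result reported below: when the retrieved subgraph is irrelevant (e.g. on single-hop questions), the gate can close and the fused state falls back to the unmodified LM hidden state.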
We evaluate KnowledgeBridge on four multi-hop question answering datasets and two knowledge-intensive dialogue tasks. Our framework improves multi-hop QA accuracy by 18% over retrieval-augmented baselines and by 12% over the best existing KG-augmented method. Critically, KnowledgeBridge maintains strong performance on single-hop questions where KG integration is unnecessary, demonstrating that our gated fusion mechanism successfully avoids the "knowledge noise" problem that plagues simpler integration approaches.
Citation
R. Nakamura, S. Chen, L. Wei. (2026). "KnowledgeBridge: Integrating Structured Knowledge Graphs with Large Language Models." In Proceedings of ICLR 2026.