Speaker
Junnan Dong, The Hong Kong Polytechnic University
Time: Tuesday, March 11, 2025, 13:00–14:00
Venue: Meeting Room 102
Abstract: Knowledge-based question answering (KBQA) has been extensively studied to answer questions that require domain knowledge from external sources, e.g., knowledge graphs (KGs). With the emergence of large language models (LLMs), comprehensively integrating KGs and LLMs into more advanced KBQA frameworks remains important but under-explored. In this talk, I will present my research along three pipelines of solutions: (i) LLM-guided KG reasoning, which leverages the comprehension ability of LLMs to enhance complex question understanding; (ii) KG-enhanced LLMs, a novel GraphRAG framework that enriches LLMs with structured domain knowledge and reduces their hallucinations; and (iii) cost-efficient combination, which optimizes the integration of KGs and LLMs to reduce computational overhead while improving performance.
Biography: Junnan Dong is a final-year Ph.D. candidate at The Hong Kong Polytechnic University. His research focuses on large language models, knowledge graphs, and graph retrieval-augmented generation for advanced, knowledgeable domain-specific reasoning. He has published more than ten papers at top-tier conferences, including NeurIPS, ACL, WWW, KDD, SIGIR, and WSDM, and was recognized with the 2024 NeurIPS Scholar Award. He was also nominated for the 'Best Research Postgraduate Student' award by the Department of Computing at PolyU. Junnan regularly serves as a program committee member for NeurIPS, ICML, ICLR, ACL, WWW, and other venues. Personal Website: https://junnandong.github.io




