MPrompt: A Pretraining-Prompting Scheme for Enhanced Fewshot Subgraph Classification
Author(s)
Xu, Muhua
Advisor
Arvind
Chen, Jie
Chen, Xuhao
Abstract
Motivated by the significant progress in NLP prompt learning, there has recently been great research interest in adopting the prompting mechanism for graph machine learning. Despite the prior success of prompting methods in node-level and graph-level learning tasks, subgraph-level tasks remain highly underexplored, and the potential of prompting for them is unclear. This thesis fills this gap by exploring the prompting mechanism for subgraph classification, a much more challenging task because it requires understanding both global and local graph structure. Building upon state-of-the-art self-supervised graph learning models, we develop a subgraph-specific prompting scheme, Membership Prompt (MPrompt), based on traditional graph neural networks (GNNs). The proposed prompting scheme relies on node membership knowledge to help the GNN distinguish between border and local connections, which increases its expressive power while keeping the prompt independent of any specific dataset or model architecture. Additionally, we present Subgraph Reconstructive Pretraining (SRP), which provides MPrompt with better structural embeddings during pretraining. Experiments are conducted on both synthetic and real-world datasets, including protein function prediction and social network analysis. Our method demonstrates performance improvements in the few-shot setting and maintains comparable performance in the full-shot setting while requiring less computation.
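
To illustrate the idea described in the abstract, the following is a minimal, hypothetical sketch in plain PyTorch (not the thesis implementation) of a membership-style prompt: learnable vectors, keyed by whether a node lies inside the target subgraph or on its border, are added to node embeddings before a simple message-passing layer. The class names MembershipPrompt and SimpleGNNLayer, the two-role encoding, and the mean-pooling readout are illustrative assumptions, not details taken from the thesis.

import torch
import torch.nn as nn


class MembershipPrompt(nn.Module):
    """Learnable prompt vectors keyed by node membership role (assumed two roles)."""

    def __init__(self, dim: int, num_roles: int = 2):
        super().__init__()
        # Role 0: node inside the target subgraph; role 1: border node (assumption).
        self.prompts = nn.Parameter(torch.empty(num_roles, dim))
        nn.init.normal_(self.prompts, std=0.02)

    def forward(self, x: torch.Tensor, roles: torch.Tensor) -> torch.Tensor:
        # x: [num_nodes, dim]; roles: [num_nodes] integer role ids.
        return x + self.prompts[roles]


class SimpleGNNLayer(nn.Module):
    """Mean-aggregation message passing over a dense adjacency matrix."""

    def __init__(self, dim: int):
        super().__init__()
        self.lin = nn.Linear(2 * dim, dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        neigh = (adj @ x) / deg                       # mean over neighbors
        return torch.relu(self.lin(torch.cat([x, neigh], dim=-1)))


if __name__ == "__main__":
    num_nodes, dim = 6, 8
    x = torch.randn(num_nodes, dim)                   # e.g. frozen pretrained embeddings
    adj = (torch.rand(num_nodes, num_nodes) > 0.5).float()
    adj = ((adj + adj.T) > 0).float()                 # symmetrize
    roles = torch.tensor([0, 0, 0, 1, 1, 1])          # 0 = internal, 1 = border

    prompt = MembershipPrompt(dim)                    # only the prompt would be tuned
    gnn = SimpleGNNLayer(dim)
    h = gnn(prompt(x, roles), adj)
    subgraph_repr = h[roles == 0].mean(dim=0)         # pool internal nodes for classification
    print(subgraph_repr.shape)

In a few-shot setting of this kind, one would typically freeze the pretrained GNN and train only the small prompt table, which is consistent with the computational savings the abstract reports; the exact training protocol here is an assumption.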
Date issued
2024-05
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology