Yifang Qin's research while affiliated with Peking University and other places

What is this page?


This page lists the scientific contributions of an author, who either does not have a ResearchGate profile, or has not yet added these contributions to their profile.

It was automatically created by ResearchGate to create a record of this author's body of work. We create such pages to advance our goal of creating and maintaining the most comprehensive scientific repository possible. In doing so, we process publicly available (personal) data relating to the author as a member of the scientific community.


Publications (30)


Learning Graph ODE for Continuous-Time Sequential Recommendation
  • Article

July 2024 · 10 Reads · 9 Citations

IEEE Transactions on Knowledge and Data Engineering

Yifang Qin · [...]
Sequential recommendation aims to understand user preference by capturing successive behavior correlations, usually represented as item-purchase sequences based on past interactions. Existing efforts generally predict the next item by modeling sequential patterns. Despite their effectiveness, two natural deficiencies exist: (i) user preference is dynamic in nature, and the evolution of collaborative signals is often ignored; and (ii) the observed interactions are often irregularly sampled, while existing methods model item transitions assuming uniform intervals. Thus, how to effectively model and predict the underlying dynamics of user preference becomes a critical research problem. To tackle these challenges, in this paper we focus on continuous-time sequential recommendation and propose a principled graph ordinary differential equation framework named GDERec. Technically, GDERec is characterized by an autoregressive graph ordinary differential equation consisting of two components, parameterized by two tailored graph neural networks (GNNs) that capture user preference from the perspective of hybrid dynamical systems. On the one hand, we introduce a novel ordinary differential equation based GNN to implicitly model the temporal evolution of the user-item interaction graph. On the other hand, an attention-based GNN is proposed to explicitly incorporate collaborative attention to interaction signals as the interaction graph evolves over time. The two customized GNNs are trained alternately in an autoregressive manner to track the evolution of the underlying system from irregular observations, and thus learn effective representations of users and items beneficial to sequential recommendation. Extensive experiments on five benchmark datasets demonstrate the superiority of our model over various state-of-the-art recommendation methods.
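The core mechanism the abstract describes (node states evolving continuously between irregular observation times under a GNN-parameterized vector field) can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the vector field, the fixed-step Euler solver, and all names (`gnn_derivative`, `euler_graph_ode`, `adj`) are illustrative assumptions.

```python
import numpy as np

def gnn_derivative(h, adj, w):
    """dh/dt = tanh(A @ h @ W): a graph-convolution-style vector field."""
    return np.tanh(adj @ h @ w)

def euler_graph_ode(h0, adj, w, t0, t1, steps=20):
    """Integrate node states from t0 to t1 with a fixed-step Euler solver."""
    h, dt = h0, (t1 - t0) / steps
    for _ in range(steps):
        h = h + dt * gnn_derivative(h, adj, w)
    return h

# Toy user-item graph: 4 nodes, 2-d embeddings, irregularly spaced timestamps.
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
adj = adj / adj.sum(axis=1, keepdims=True)   # row-normalize the adjacency
h = rng.normal(size=(4, 2))
w = rng.normal(size=(2, 2)) * 0.1
# Propagate embeddings across irregular intervals, as in irregularly
# sampled interaction data; the state at each observation seeds the next.
for t0, t1 in [(0.0, 0.7), (0.7, 1.1), (1.1, 3.0)]:
    h = euler_graph_ode(h, adj, w, t0, t1)
```

In practice an adaptive ODE solver would replace the fixed-step Euler loop, and the attention-based GNN component the abstract mentions would update the graph between integration windows.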


Summary of graph augmentation strategies for GCL.
Towards Graph Contrastive Learning: A Survey and Beyond
  • Preprint
  • File available

May 2024 · 18 Reads

In recent years, deep learning on graphs has achieved remarkable success in various domains. However, the reliance on annotated graph data remains a significant bottleneck due to its prohibitive cost and time-intensive nature. To address this challenge, self-supervised learning (SSL) on graphs has gained increasing attention and has made significant progress. SSL enables machine learning models to produce informative representations from unlabeled graph data, reducing the reliance on expensive labeled data. While SSL on graphs has witnessed widespread adoption, one critical component, Graph Contrastive Learning (GCL), has not been thoroughly investigated in the existing literature. Thus, this survey aims to fill this gap by offering a dedicated survey on GCL. We provide a comprehensive overview of the fundamental principles of GCL, including data augmentation strategies, contrastive modes, and contrastive optimization objectives. Furthermore, we explore the extensions of GCL to other aspects of data-efficient graph learning, such as weakly supervised learning, transfer learning, and related scenarios. We also discuss practical applications spanning domains such as drug discovery, genomics analysis, and recommender systems, and conclude by outlining the challenges and potential future directions in this field.
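The contrastive optimization objectives the survey covers typically take an InfoNCE-like form: embeddings of two augmented views of the same node (or graph) are pulled together while other pairs serve as negatives. A minimal NumPy sketch of that loss, with illustrative names and a simplified single-direction formulation:

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """InfoNCE between two views: row i of z1 is the positive of row i of z2;
    all other rows act as in-batch negatives."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                            # cosine similarities
    logits = sim - sim.max(axis=1, keepdims=True)    # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.diag(log_prob).mean()                 # positives on diagonal

rng = np.random.default_rng(1)
z = rng.normal(size=(8, 16))
noise = rng.normal(size=(8, 16)) * 0.05
loss_aligned = info_nce(z, z + noise)                 # agreeing views
loss_random = info_nce(z, rng.normal(size=(8, 16)))   # unrelated views
```

When the two views agree (a mild augmentation), the loss is much lower than for unrelated embeddings, which is the training signal GCL methods exploit.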





DEER: Distribution Divergence-based Graph Contrast for Partial Label Learning on Graphs

January 2024

IEEE Transactions on Multimedia

Graph neural networks (GNNs) have emerged as powerful tools for graph classification tasks. However, contemporary graph classification methods are predominantly studied in fully supervised scenarios, while there could be label ambiguity and noise in real-world applications. In this work, we explore the weakly supervised problem of partial label learning on graphs, where each graph sample is assigned a collection of candidate labels. A novel method called Distribution Divergence-based Graph Contrast (DEER) is proposed to address this issue. At the heart of DEER is measuring the divergence among the underlying semantic distributions in the hidden space; this metric enables the identification of accurate positive graph pairs for effective graph contrastive learning. Specifically, we generate graph representations of augmented graph views that retain semantics and can be regarded as samples from the underlying semantic distributions. We employ a non-parametric metric to measure distribution divergence, which is then combined with pseudo-labeling to generate unbiased and target-oriented graph pairs. Furthermore, we introduce a label-correction method to eliminate noisy candidate labels, updating target labels using posterior distributions in a soft manner. Comprehensive experiments on various benchmarks demonstrate the superiority of our DEER in different settings compared to a range of state-of-the-art baselines.
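The abstract does not name its non-parametric divergence, so as one plausible illustration, here is a maximum mean discrepancy (MMD) estimate between two sets of graph representations; the paper's actual metric may differ, and `rbf_mmd2` is a hypothetical helper, not from the paper.

```python
import numpy as np

def rbf_mmd2(x, y, gamma=1.0):
    """Squared MMD with an RBF kernel: a non-parametric divergence between
    two sample sets, zero when they come from the same distribution."""
    def k(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

rng = np.random.default_rng(2)
# Embeddings of augmented views from the same semantic class...
same_a = rng.normal(0.0, 1.0, size=(64, 4))
same_b = rng.normal(0.0, 1.0, size=(64, 4))
# ...versus embeddings from a shifted (different) distribution.
shifted = rng.normal(2.0, 1.0, size=(64, 4))
d_same = rbf_mmd2(same_a, same_b)   # small: same underlying distribution
d_diff = rbf_mmd2(same_a, shifted)  # large: distributions diverge
```

Pairs with low divergence would be treated as positives for contrastive learning, which is the role the distribution-divergence metric plays in DEER.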




Toward Effective Semi-supervised Node Classification with Hybrid Curriculum Pseudo-labeling

October 2023 · 20 Reads · 7 Citations

ACM Transactions on Multimedia Computing, Communications and Applications

Semi-supervised node classification is a crucial challenge in relational data mining and has attracted increasing interest in research on graph neural networks (GNNs). However, previous approaches merely utilize labeled nodes to supervise the overall optimization, but fail to sufficiently explore the information of their underlying label distribution. Even worse, they often overlook the robustness of models, which may cause instability of network outputs under random perturbations. To address the aforementioned shortcomings, we develop a novel framework termed Hybrid Curriculum Pseudo-Labeling (HCPL) for efficient semi-supervised node classification. Technically, HCPL iteratively annotates unlabeled nodes by training a GNN model on the labeled samples together with any previously pseudo-labeled samples. To improve model robustness, we introduce a hybrid pseudo-labeling strategy which incorporates both prediction confidence and uncertainty under random perturbations, thereby mitigating the influence of erroneous pseudo-labels. Finally, we leverage the idea of curriculum learning to start from annotating easy samples, and gradually explore hard samples as the iterations grow. Extensive experiments on a number of benchmarks demonstrate that our HCPL beats various state-of-the-art baselines in diverse settings.
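The hybrid selection criterion (high confidence and low uncertainty under random perturbations) can be sketched as follows. This is a minimal illustration under assumed conventions, not the paper's code: `select_pseudo_labels` and the specific uncertainty measure (standard deviation of the winning-class probability across perturbed forward passes) are illustrative choices.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def select_pseudo_labels(logit_samples, conf_thresh, unc_thresh):
    """logit_samples: (T, N, C) predictions under T random perturbations.
    Keep a node only if its mean confidence is high AND the prediction is
    stable (low std of the winning-class probability across perturbations)."""
    probs = softmax(logit_samples)               # (T, N, C)
    mean_p = probs.mean(axis=0)                  # (N, C)
    labels = mean_p.argmax(axis=1)
    conf = mean_p.max(axis=1)
    unc = probs[:, np.arange(len(labels)), labels].std(axis=0)
    mask = (conf >= conf_thresh) & (unc <= unc_thresh)
    return labels, mask

rng = np.random.default_rng(3)
stable = np.tile([4.0, 0.0, 0.0], (5, 1, 1))    # confident, stable node
noisy = rng.normal(0, 3.0, size=(5, 1, 3))      # unstable node
logits = np.concatenate([stable, noisy], axis=1)  # (5 passes, 2 nodes, 3 classes)
labels, mask = select_pseudo_labels(logits, conf_thresh=0.8, unc_thresh=0.05)
```

A curriculum would then start with strict thresholds (easy samples) and gradually relax them so that harder samples are pseudo-labeled in later iterations.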


Redundancy-Free Self-Supervised Relational Learning for Graph Clustering

September 2023 · 14 Reads · 8 Citations

IEEE Transactions on Neural Networks and Learning Systems

Graph clustering, which learns the node representations for effective cluster assignments, is a fundamental yet challenging task in data analysis and has received considerable attention accompanied by graph neural networks (GNNs) in recent years. However, most existing methods overlook the inherent relational information among the nonindependent and nonidentically distributed nodes in a graph. Due to the lack of exploration of relational attributes, the semantic information of the graph-structured data fails to be fully exploited, which leads to poor clustering performance. In this article, we propose a novel self-supervised deep graph clustering method named relational redundancy-free graph clustering (R²FGC) to tackle the problem. It extracts the attribute- and structure-level relational information from both global and local views based on an autoencoder (AE) and a graph AE (GAE). To obtain effective representations of the semantic information, we preserve the consistent relationship among augmented nodes, whereas the redundant relationship is further reduced for learning discriminative embeddings. In addition, a simple yet valid strategy is used to alleviate the oversmoothing issue. Extensive experiments are performed on widely used benchmark datasets to validate the superiority of our R²FGC over state-of-the-art baselines. Our codes are available at https://github.com/yisiyu95/R2FGC.
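Redundancy reduction between augmented views is commonly implemented by decorrelating embedding dimensions via a cross-correlation objective. The sketch below illustrates that general idea, not R²FGC's exact loss; `redundancy_loss` and the weighting `lam` are illustrative assumptions.

```python
import numpy as np

def redundancy_loss(z1, z2, lam=0.01):
    """Decorrelation objective: push the cross-correlation matrix of two
    views toward the identity -- diagonal terms align the views, while
    off-diagonal terms penalize redundant (correlated) dimensions."""
    z1 = (z1 - z1.mean(0)) / z1.std(0)
    z2 = (z2 - z2.mean(0)) / z2.std(0)
    c = z1.T @ z2 / len(z1)                      # (D, D) cross-correlation
    on_diag = ((np.diag(c) - 1) ** 2).sum()
    off_diag = (c ** 2).sum() - (np.diag(c) ** 2).sum()
    return on_diag + lam * off_diag

rng = np.random.default_rng(4)
z = rng.normal(size=(128, 8))
# Two views of the same nodes (mild augmentation) vs. unrelated embeddings.
loss_consistent = redundancy_loss(z, z + rng.normal(size=(128, 8)) * 0.01)
loss_unrelated = redundancy_loss(z, rng.normal(size=(128, 8)))
```

Consistent views yield a near-identity correlation matrix and hence a small loss; minimizing this objective preserves consistency across augmented nodes while discouraging redundant dimensions, in the spirit the abstract describes.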


Citations (15)


... To capture this heterogeneity, the model employs hierarchical attention, comprising node-level attention and meta-path-level attention. COOL [17]: To capture a range of long-term transitional patterns, the model employs a unified self-attention decoder that combines sequential representations through multi-rank and multi-scale attention branches. The superior performance of ST-MEN and ST-MEN-F is not limited to the New York Taxi dataset but also extends to the Chengdu Taxi dataset, where they outperform the baseline models. ...

Reference:

Spatial-temporal memory enhanced multi-level attention network for origin-destination demand prediction
COOL: A Conjoint Perspective on Spatio-Temporal Graph Neural Network for Traffic Forecasting
  • Citing Article
  • March 2024

Information Fusion

... The literature examines graph neural networks (GNNs) in depth, covering general knowledge, network structures, potential gaps, and perspectives through various surveys [1,2,13,16,19,20,[37][38][39][40][41][42][43][44][45][46][47][48]. In addition, the existing literature includes surveys and reviews focusing on specific applications [49][50][51][52]. ...

A Comprehensive Survey on Deep Graph Representation Learning
  • Citing Article
  • February 2024

Neural Networks

... Graph-structured data is ubiquitous and pervasive across various domains, ranging from social networks [3,136] to recommender systems [62,122,173], biological networks [23,220], and knowledge graphs [12,185]. With the rise in popularity and remarkable success of Graph Neural Networks (GNNs), deep learning on graphs has garnered significant attention in numerous fields [57,65,67,175]. ...

Learning Graph ODE for Continuous-Time Sequential Recommendation
  • Citing Article
  • July 2024

IEEE Transactions on Knowledge and Data Engineering

... This bias results in unreliable predictions for underrepresented classes. Researchers seek GCL-based methods to address this issue [111,190], aiming to design models that can effectively learn from imbalanced graph data, ensuring accurate and balanced predictions across all classes. ...

RAHNet: Retrieval Augmented Hybrid Network for Long-tailed Graph Classification
  • Citing Conference Paper
  • October 2023

... Despite valuable theoretical insights, they often assume that the source and target domain are drawn from exactly the same underlying model, which usually does not hold in practical scenarios where domain shifts occur. Third, to address the domain shifts challenge, various domain adaptation techniques have been proposed, such as unsupervised adaptation [24], local structure transfer [25], adversarial domain alignment [7], and noise-resistant transfer [26]. Despite promising results, they often lack theoretical guarantees or can be sensitive to hyperparameter choices. ...

ALEX: Towards Effective Graph Transfer Learning with Noisy Labels
  • Citing Conference Paper
  • October 2023

... In recent times, GNNs have gained considerable traction for effectively processing such graph-structured data, exhibiting remarkable performance across various domains (Lv et al. 2023;Luo et al. 2023). Inspired by this success, we intend to develop a GNN to derive tweet representations from the graph and subsequently determine their stances. ...

Toward Effective Semi-supervised Node Classification with Hybrid Curriculum Pseudo-labeling
  • Citing Article
  • October 2023

ACM Transactions on Multimedia Computing, Communications and Applications

... AGC-DRR [3] reduces information redundancy in input and latent feature spaces to distinguish samples effectively. R²FGC [26] captures intrinsic relational and semantic information among non-independent and differently distributed nodes to reduce redundancy and obtain discriminative embeddings. Although these advanced methods demonstrated strong clustering performance, they have become increasingly complex. ...

Redundancy-Free Self-Supervised Relational Learning for Graph Clustering
  • Citing Article
  • September 2023

IEEE Transactions on Neural Networks and Learning Systems

... Then, graph neural network (GNN) based models [8,9,18,40] took a step further by integrating graph-augmented POI sequences, which capitalized on collaborative signals from semantically similar POIs and unveiled sequential trends, thereby outperforming RNN-based approaches in terms of accuracy. More recently, Diff-POI [30], by leveraging the powerful generality of the diffusion model, established a new standard for cutting-edge accuracy in the field. These approaches, however, predominantly rely on cloud-based infrastructure, which brings the need for substantial cloud computing capabilities. ...

A Diffusion Model for POI Recommendation

ACM Transactions on Information Systems

... Graph-level tasks. These tasks focus on predicting properties, relationships, and structures for entire graphs, including graph classification [61,105], graph matching [25], graph generation [167,229], and graph-level clustering [58]. For example, graph classification aims to predict the property category to which an entire graph belongs. ...

GLCC: A General Framework for Graph-Level Clustering

Proceedings of the AAAI Conference on Artificial Intelligence

... Three other methods, CGNN, CR-GNN, and MCLC, are informed by graph contrastive learning to refine noisy label learning. CGNN (Yuan et al., 2023b) utilizes unsupervised graph contrastive learning for feature representation and neighbor label correction. CR-GNN leverages a dual-channel feature from unsupervised graph contrast to identify prediction-consistent nodes as confident ones for learning amidst noise. ...

Learning on Graphs under Label Noise
  • Citing Conference Paper
  • June 2023