Rethinking Multi-view Representation Learning via Distilled Disentangling
Multi-view representation learning aims to derive robust representations that are both view-
consistent and view-specific from diverse data sources. This paper presents an in-depth …
Sequential attention layer-wise fusion network for multi-view classification
Q Teng, X Yang, Q Sun, P Wang, X Wang… - International Journal of …, 2024 - Springer
Graph convolutional network has shown excellent performance in multi-view classification.
Currently, to output a fused node embedding representation in multi-view scenarios, existing …
Inhomogeneous Diffusion-Induced Network for Multiview Semi-Supervised Classification
The challenges posed by heterogeneous data in practical applications have made multiview
semi-supervised classification a focus of attention for researchers. While several graph …
Self-adaptive label discovery and multi-view fusion for complementary label learning
L Tang, P Yan, Y Tian, PM Pardalos - Neural Networks, 2025 - Elsevier
Unlike traditional supervised classification, complementary label learning (CLL) operates
under a weak supervision framework, where each sample is annotated by excluding several …
Robust multi-view clustering via collaborative constraints and multi-layer concept factorization
G Liu, H Ge, T Li, S Su, P Gao - Applied Intelligence, 2024 - Springer
The design of effective multi-view clustering algorithms has recently garnered significant
research attention. In this paper, we develop a robust multi-view clustering via collaborative …
Duet: Dually Guided Knowledge Distillation from Explicit Feedback
HK Bae, J Kim, J Lee, SW Kim - Available at SSRN 4764505 - papers.ssrn.com
Various knowledge distillation (KD) methods for recommender systems have been recently
introduced to achieve two goals: (i) obtaining an inference time shorter than the cumbersome …