The neural process family: Survey, applications and perspectives
The standard approaches to neural network implementation yield powerful function
approximation capabilities but are limited in their abilities to learn meta representations and …
Beyond unimodal: Generalising neural processes for multimodal uncertainty estimation
Uncertainty estimation is an important research area to make deep neural networks (DNNs)
more trustworthy. While extensive research on uncertainty estimation has been conducted …
CLAP4CLIP: Continual Learning with Probabilistic Finetuning for Vision-Language Models
Continual learning (CL) aims to help deep neural networks learn new knowledge while
retaining what has been learned. Recently, pre-trained vision-language models such as …
Self-Expansion of Pre-trained Models with Mixture of Adapters for Continual Learning
Continual learning aims to learn from a stream of continuously arriving data with minimum
forgetting of previously learned knowledge. While previous works have explored the …
Adaptive Rank, Reduced Forgetting: Knowledge Retention in Continual Learning Vision-Language Models with Dynamic Rank-Selective LoRA
We investigate whether the pre-trained knowledge of vision-language models (VLMs), such
as CLIP, can be retained or even enhanced during continual learning (CL) while absorbing …