StyLIP: Multi-scale style-conditioned prompt learning for CLIP-based domain generalization
Large-scale foundation models, such as CLIP, have demonstrated impressive zero-shot generalization performance on downstream tasks, leveraging well-designed language …
Continual zero-shot learning through semantically guided generative random walks
Learning novel concepts, remembering previous knowledge, and adapting it to future tasks
occur simultaneously throughout a human's lifetime. To model such comprehensive abilities …
SEIC: Semantic Embedding with Intermediate Classes for Zero-Shot Domain Generalization
B Mondal, S Biswas - … of the Asian Conference on Computer …, 2022 - openaccess.thecvf.com
In this work, we address the Zero-Shot Domain Generalization (ZSDG) task, where the goal
is to learn a model from multiple source domains, such that it can generalize well to both …
Handling Class-Imbalance for Improved Zero-Shot Domain Generalization
A Arfeen, T Dutta, S Biswas - BMVC, 2022 - bmvc2022.mpi-inf.mpg.de
Zero-shot domain generalization (ZSDG) simultaneously addresses the challenges of
dissimilar distribution and disjoint label-spaces of the training and test data in the context of …
Less but Better: Enabling Generalized Zero-shot Learning Towards Unseen Domains by Intrinsic Learning from Redundant LLM Semantics
Generalized zero-shot learning (GZSL) focuses on recognizing seen and unseen classes
against the domain shift problem (DSP), where data of unseen classes may be misclassified as …
INDIGO: Intrinsic Multimodality for Domain Generalization
For models to generalize under unseen domains (aka domain generalization), it is crucial to
learn feature representations that are domain-agnostic and capture the underlying …
Prompt Tuning Is All We Need?
H Yu, H Zheng, Y Zhang, S Xie, X Cao, Z Fang - openreview.net
Recent advances in pre-trained vision-language models, e.g., CLIP, have demonstrated
remarkable success in domain generalization (DG) by tuning prompts. To promote DG, one …