Few-shot learning for feature selection with Hilbert-Schmidt independence criterion
We propose a few-shot learning method for feature selection that can select relevant
features given a small number of labeled instances. Existing methods require many labeled …
A short survey on importance weighting for machine learning
M Kimura, H Hino - arXiv preprint arXiv:2403.10175, 2024 - arxiv.org
Importance weighting is a fundamental procedure in statistics and machine learning that
weights the objective function or probability distribution based on the importance of the …
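Importance weighting in this sense is a standard device: each instance's loss is scaled by a weight such as a density ratio between test and training distributions. A minimal sketch (a hypothetical helper, not code from the survey) of an importance-weighted logistic loss:

```python
import numpy as np

def weighted_logistic_loss(w, X, y, weights):
    """Importance-weighted logistic loss: each instance's loss term is
    scaled by its importance weight (e.g., an estimated density ratio).
    y is in {-1, +1}."""
    z = X @ w
    # log(1 + exp(-y * z)), computed stably via logaddexp
    losses = np.logaddexp(0.0, -y * z)
    return np.mean(weights * losses)
```

With all weights equal to one this reduces to the ordinary (unweighted) average loss.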
Zero-Shot Task Adaptation with Relevant Feature Information
We propose a method to learn prediction models such as classifiers for unseen target tasks
where labeled and unlabeled data are absent but a few relevant input features for solving …
Class-prior probability estimation using density ratio between unlabeled instances and positively labeled noisy instances
T Yoshida, E Shin'ya, T Washio - Neurocomputing, 2025 - Elsevier
In this paper, the authors propose a new approach for determining the mixture proportions of
positive and negative instances in both a small set of positively labeled instances …
Meta-learning for Robust Anomaly Detection
We propose a meta-learning method to improve the anomaly detection performance on
unseen target tasks that have only unlabeled data. Existing meta-learning methods for …
Meta-learning for Positive-unlabeled Classification
We propose a meta-learning method for positive and unlabeled (PU) classification, which
improves the performance of binary classifiers obtained from only PU data in unseen target …
Importance-weighted variational inference model estimation for offline Bayesian model-based reinforcement learning
T Hishinuma, K Senda - IEEE Access, 2023 - ieeexplore.ieee.org
This paper proposes a model estimation method in offline Bayesian model-based
reinforcement learning (MBRL). Learning a Bayes-adaptive Markov decision process …
Unified Perspective on Probability Divergence via the Density-Ratio Likelihood: Bridging KL-Divergence and Integral Probability Metrics
This paper provides a unified view of the Kullback-Leibler (KL) divergence and the
integral probability metrics (IPMs) from the perspective of maximum likelihood density-ratio …
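For reference, the two divergence families being bridged are standardly defined as follows (these are the textbook definitions, not the paper's unified formulation):

\[
\mathrm{KL}(p\,\|\,q) \;=\; \mathbb{E}_{x\sim p}\!\left[\log \frac{p(x)}{q(x)}\right],
\qquad
\mathrm{IPM}_{\mathcal{F}}(p, q) \;=\; \sup_{f \in \mathcal{F}} \bigl|\mathbb{E}_{p}[f(x)] - \mathbb{E}_{q}[f(x)]\bigr|,
\]

so the KL divergence is an expectation of the log density ratio \(p(x)/q(x)\), which is what makes a density-ratio likelihood a natural common lens.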
Some notes concerning a generalized KMM-type optimization method for density ratio estimation
CD Alecsa - arXiv preprint arXiv:2309.07887, 2023 - arxiv.org
In the present paper we introduce new optimization algorithms for the task of density ratio
estimation. More precisely, we consider extending the well-known KMM method using the …
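The KMM (kernel mean matching) method referenced here chooses training-instance weights so that the weighted training mean matches the test mean in an RKHS. A minimal sketch of that standard objective (not the paper's generalized algorithms), simplified to a ridge-regularized linear solve instead of the original constrained QP:

```python
import numpy as np

def kmm_weights(x_tr, x_te, sigma=1.0, ridge=1e-6):
    """Kernel mean matching sketch: find weights beta minimizing
    || (1/n_tr) sum_i beta_i phi(x_tr_i) - (1/n_te) sum_j phi(x_te_j) ||^2
    in a Gaussian RKHS. Solves the ridge-regularized normal equations
    K beta = kappa (a simplification of the constrained QP)."""
    def gauss_kernel(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma**2))
    K = gauss_kernel(x_tr, x_tr)
    kappa = gauss_kernel(x_tr, x_te).mean(axis=1) * len(x_tr)
    beta = np.linalg.solve(K + ridge * np.eye(len(x_tr)), kappa)
    return np.clip(beta, 0.0, None)  # weights must be non-negative
```

When training and test samples coincide, the recovered weights are approximately uniform, as expected.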
On the improvement of density ratio estimation via probabilistic classifier: theoretical study and its applications
J Yin - 2023 - open.library.ubc.ca
Density ratio estimation has a broad application in the world of machine learning and data
science, especially in transfer learning and contrastive learning. This work mainly focuses …
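The classifier-based approach to density ratio estimation named in this entry has a standard form: train a probabilistic classifier to distinguish samples from the two densities, then convert its posterior into a ratio via Bayes' rule. A minimal sketch of that generic recipe (not this thesis's specific analysis), using logistic regression:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def density_ratio_via_classifier(x_num, x_den):
    """Estimate r(x) = p_num(x) / p_den(x): label numerator samples 1,
    denominator samples 0, fit a probabilistic classifier, and apply
    Bayes' rule: r(x) = P(y=1|x)/P(y=0|x) * (n_den / n_num)."""
    X = np.vstack([x_num, x_den])
    y = np.concatenate([np.ones(len(x_num)), np.zeros(len(x_den))])
    clf = LogisticRegression().fit(X, y)

    def ratio(x):
        p = clf.predict_proba(x)[:, 1]
        # correction for unequal sample sizes between the two groups
        return (p / (1.0 - p)) * (len(x_den) / len(x_num))

    return ratio
```

The estimated ratio exceeds 1 where the numerator density dominates and falls below 1 where the denominator density dominates.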