On mixup regularization
… can be interpreted as empirical risk minimization on modified data with random
perturbations. In Section 4 we analyze the regularization effect of Mixup through a quadratic Taylor …
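The construction this entry refers to is the standard mixup transformation: inputs and one-hot labels are replaced by random convex combinations of pairs of training examples, and the model is then trained on the mixed data, which is what the "empirical risk minimization on modified data" reading formalizes. A minimal numpy sketch (function and parameter names are ours, not taken from the paper):

import numpy as np

def mixup_batch(x, y, alpha=0.2, seed=None):
    """Standard mixup: mix a batch with a randomly shuffled copy of itself.

    x: (batch, ...) inputs; y: (batch, num_classes) one-hot labels.
    """
    rng = np.random.default_rng(seed)
    lam = rng.beta(alpha, alpha)          # mixing coefficient lambda ~ Beta(alpha, alpha)
    perm = rng.permutation(len(x))        # random pairing within the batch
    x_mixed = lam * x + (1.0 - lam) * x[perm]
    y_mixed = lam * y + (1.0 - lam) * y[perm]   # one-hot labels become soft labels
    return x_mixed, y_mixed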
Mixup as locally linear out-of-manifold regularization
… that MixUp is also a data-dependent regularization scheme in the sense that the imposed
constraints on the … Next we will argue that MixUp may be viewed as another data-dependent …
How does mixup help with robustness and generalization?
… In this section, we first introduce a lemma that characterizes the regularization effect of
Mixup. Based on this lemma, we then derive our main theoretical results on adversarial …
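Schematically, lemmas of this kind rewrite the mixup objective, after a quadratic Taylor expansion in the mixing perturbation, as the ordinary empirical risk plus data-dependent penalties on first- and second-order directional derivatives of the model. The display below is a schematic sketch of that general form, not the paper's exact statement:

L_n^{\mathrm{mix}}(\theta) \;\approx\; L_n(\theta) \;+\; \mathcal{R}_1\!\left(\nabla_x f_\theta\right) \;+\; \mathcal{R}_2\!\left(\nabla_x^2 f_\theta\right),

where the penalty terms \mathcal{R}_1 and \mathcal{R}_2 are weighted by statistics of the training data and of the mixing distribution.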
Dual mixup regularized learning for adversarial domain adaptation
… First, we apply category mixup regularization on source and target domains. Specifically,
for unlabeled target data, pseudo-labels are introduced. Since some of these pseudo-labels are inevitably false …
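As a rough illustration of the "category mixup with pseudo-labels" idea described here, one can take the current model's predictions on unlabeled target data as pseudo-labels and then mix as usual; predict_proba and all parameter names below are hypothetical placeholders, not the paper's API:

import numpy as np

def pseudo_label_mixup(x_target, predict_proba, alpha=0.2, seed=None):
    """Sketch of category mixup on unlabeled target-domain data."""
    rng = np.random.default_rng(seed)
    probs = predict_proba(x_target)                        # (batch, num_classes) class probabilities
    pseudo = np.eye(probs.shape[1])[probs.argmax(axis=1)]  # hard one-hot pseudo-labels
    lam = rng.beta(alpha, alpha)
    perm = rng.permutation(len(x_target))
    x_mix = lam * x_target + (1.0 - lam) * x_target[perm]
    y_mix = lam * pseudo + (1.0 - lam) * pseudo[perm]
    return x_mix, y_mix

Because some pseudo-labels are wrong, mixing softens their individual influence on the loss, which is the effect the excerpt gestures at.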
Using mixup as a regularizer can surprisingly improve accuracy & out-of-distribution robustness
… In fact, we observe that Mixup otherwise yields much degraded performance on detecting
out-of-distribution samples, possibly, as we show empirically, due to its tendency to learn …
C-mixup: Improving generalization in regression
… The first line of research directly imposes regularization on meta-learning algorithms [21,
30, 63, 79]. The second line of approaches introduces task augmentation to produce more …
Understanding mixup training methods
D Liang, F Yang, T Zhang, P Yang - IEEE Access, 2018 - ieeexplore.ieee.org
… that mixup-HV and mixup-HC use more data augmentation than mixup-C and mixup-H, because
the regularization of … Based on the conclusion that the input information of the mixup has …
Fair mixup: Fairness via interpolation
… regularizing the models on paths of interpolated samples between the groups. We use mixup,
… We analyze fair mixup and empirically show that it ensures better generalization for both …
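The "paths of interpolated samples between the groups" can be read as follows: pair samples from the two demographic groups, interpolate between the pairs, and penalize how quickly the mean prediction changes along the interpolation path. A finite-difference sketch under that reading (the model callable, group arrays, and n_points are our assumptions):

import numpy as np

def fair_mixup_penalty(model, x_group_a, x_group_b, n_points=11, seed=None):
    """Sketch of a fair-mixup style path-smoothness penalty between two groups."""
    rng = np.random.default_rng(seed)
    n = min(len(x_group_a), len(x_group_b))
    xa = x_group_a[rng.permutation(len(x_group_a))[:n]]   # random pairing across groups
    xb = x_group_b[rng.permutation(len(x_group_b))[:n]]
    lams = np.linspace(0.0, 1.0, n_points)
    # mean model score at each point along the interpolation path between the groups
    means = np.array([model(l * xa + (1.0 - l) * xb).mean() for l in lams])
    # penalize the finite-difference derivative of that mean along the path
    return np.abs(np.diff(means) / np.diff(lams)).mean()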
Remix: rebalanced mixup
… In order to come up with a solution that is convenient to incorporate for large-scale datasets,
we focus on regularization techniques, which normally introduce little extra cost. Despite the …
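Remix keeps the mixup interpolation for inputs but relabels in favor of the minority class when the two mixed samples come from classes of very different sizes. A hedged sketch of that relabeling rule (the thresholds kappa and tau and the exact conditions are recalled from the paper, so treat the specifics as approximate):

def remix_label_lambda(lam, n_i, n_j, kappa=3.0, tau=0.5):
    """Label-mixing coefficient for samples i, j with class sizes n_i, n_j."""
    if n_i / n_j >= kappa and lam < tau:
        return 0.0   # take the label entirely from sample j (the minority class here)
    if n_j / n_i >= kappa and (1.0 - lam) < tau:
        return 1.0   # take the label entirely from sample i (the minority class here)
    return lam       # otherwise fall back to the standard mixup label coefficient

The mixed input still uses the ordinary coefficient lam; only the label coefficient is overridden.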
Noisy feature mixup
… regularization, showing that NFM amplifies the regularizing … , amplifying the regularizing
effects of manifold mixup and noise … In Subsection 4.3, we focus on demonstrating how NFM can …
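Noisy Feature Mixup, as described here, combines manifold-mixup style interpolation of hidden features with injected additive and multiplicative noise. A minimal sketch at a single hidden layer (the noise scales and parameter names are our assumptions):

import numpy as np

def noisy_feature_mixup(h, y, alpha=0.2, add_std=0.1, mult_std=0.1, seed=None):
    """Sketch of NFM applied to hidden features h with soft labels y."""
    rng = np.random.default_rng(seed)
    lam = rng.beta(alpha, alpha)
    perm = rng.permutation(len(h))
    h_mix = lam * h + (1.0 - lam) * h[perm]      # manifold-mixup style feature interpolation
    y_mix = lam * y + (1.0 - lam) * y[perm]
    additive = add_std * rng.standard_normal(h_mix.shape)
    multiplicative = 1.0 + mult_std * rng.standard_normal(h_mix.shape)
    return multiplicative * h_mix + additive, y_mix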