Learning models with uniform performance via distributionally robust optimization
JC Duchi, H Namkoong - The Annals of Statistics, 2021 - projecteuclid.org
The Annals of Statistics 2021, Vol. 49, No. 3, 1378–1406. https://doi.org/10.1214/20-AOS2004
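For context, the distributionally robust objective studied in this line of work takes the following general form (generic notation — the loss ℓ, parameter set Θ, and ambiguity set 𝒫 are placeholders, not the paper's exact symbols):

```latex
% Minimize the worst-case risk over an ambiguity set \mathcal{P}
% of distributions near the data-generating (or empirical) law:
\[
  \min_{\theta \in \Theta} \; \sup_{P \in \mathcal{P}} \;
  \mathbb{E}_{P}\bigl[\ell(\theta; Z)\bigr]
\]
```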
Quantifying distributional model risk via optimal transport
J Blanchet, K Murthy - Mathematics of Operations Research, 2019 - pubsonline.informs.org
This paper deals with the problem of quantifying the impact of model misspecification when
computing general expected values of interest. The methodology that we propose is …
PAC-Bayesian bounds based on the Rényi divergence
L Bégin, P Germain, F Laviolette… - Artificial Intelligence …, 2016 - proceedings.mlr.press
We propose a simplified proof process for PAC-Bayesian generalization bounds that allows
the proof to be divided into four successive inequalities, easing the "customization" of PAC …
Advanced tutorial: Input uncertainty and robust analysis in stochastic simulation
H Lam - 2016 Winter Simulation Conference (WSC), 2016 - ieeexplore.ieee.org
Input uncertainty refers to errors caused by a lack of complete knowledge about the
probability distributions used to generate input variates in stochastic simulation. The …
(f, Γ)-Divergences: Interpolating between f-Divergences and Integral Probability Metrics
We develop a rigorous and general framework for constructing information-theoretic
divergences that subsume both f-divergences and integral probability metrics (IPMs), such …
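For reference, the two families this framework interpolates between are (standard definitions, in generic notation rather than the paper's own):

```latex
% f-divergence, for convex f with f(1) = 0:
\[
  D_f(Q \,\|\, P) \;=\; \mathbb{E}_{P}\!\left[ f\!\left( \frac{dQ}{dP} \right) \right]
\]
% Integral probability metric over a function class \Gamma:
\[
  W^{\Gamma}(Q, P) \;=\; \sup_{g \in \Gamma}
  \bigl| \mathbb{E}_{Q}[g] - \mathbb{E}_{P}[g] \bigr|
\]
```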
Improved bounds on lossless source coding and guessing moments via Rényi measures
This paper provides upper and lower bounds on the optimal guessing moments of a random
variable taking values on a finite set when side information may be available. These …
An operational approach to information leakage via generalized gain functions
We introduce a gain function viewpoint of information leakage by proposing maximal
leakage, a rich class of operationally meaningful leakage measures that subsumes recently …
Dependence measures bounding the exploration bias for general measurements
We propose a framework to analyze and quantify the bias in adaptive data analysis. It
generalizes that proposed by Russo and Zou '15, applying to measurements whose moment …
Variational representations and neural network estimation of Rényi divergences
We derive a new variational formula for the Rényi family of divergences, R_α(Q‖P),
between probability measures Q and P. Our result generalizes the classical Donsker …
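For reference, the Rényi divergence in question and the classical Donsker–Varadhan formula for the KL divergence that such variational results generalize (standard definitions, generic notation):

```latex
% Rényi divergence of order \alpha \in (0,1) \cup (1,\infty):
\[
  R_\alpha(Q \,\|\, P) \;=\; \frac{1}{\alpha - 1}
  \log \mathbb{E}_{P}\!\left[ \left( \frac{dQ}{dP} \right)^{\!\alpha} \right]
\]
% As \alpha \to 1 this recovers the KL divergence, whose
% Donsker--Varadhan variational representation is
\[
  D_{\mathrm{KL}}(Q \,\|\, P) \;=\; \sup_{g} \;
  \Bigl( \mathbb{E}_{Q}[g] \;-\; \log \mathbb{E}_{P}\bigl[ e^{g} \bigr] \Bigr)
\]
```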
Sensitivity analysis of Wasserstein distributionally robust optimization problems
We consider sensitivity of a generic stochastic optimization problem to model uncertainty.
We take a non-parametric approach and capture model uncertainty using Wasserstein balls …
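For context, a Wasserstein ball of radius δ around a reference measure μ (e.g., the empirical distribution) is built from the p-Wasserstein distance; in generic notation, not the paper's own:

```latex
% p-Wasserstein distance via couplings \pi with marginals \mu, \nu:
\[
  W_p(\mu, \nu) \;=\; \left( \inf_{\pi \in \Pi(\mu, \nu)}
  \int d(x, y)^p \, \mathrm{d}\pi(x, y) \right)^{1/p}
\]
% Ambiguity set: the ball \{\, \nu : W_p(\mu, \nu) \le \delta \,\}.
```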