Sample Complexity for Distributionally Robust Learning under $\chi^2$-divergence
Zhengyu Zhou, Weiwei Liu; 24(230):1−27, 2023.
Abstract
This paper investigates the sample complexity of learning a distributionally robust predictor under a particular distributional shift based on the $\chi^2$-divergence, which is well known for its computational feasibility and statistical properties. We demonstrate that any hypothesis class $\mathcal{H}$ with finite VC dimension is distributionally robustly learnable. Moreover, we show that when the perturbation size is smaller than a constant, finite VC dimension is also necessary for distributionally robust learning, by deriving a lower bound on the sample complexity in terms of the VC dimension.
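For concreteness, the $\chi^2$-divergence and the induced distributionally robust risk can be written in their standard forms as follows; the paper's exact parameterization of the perturbation set (and of the perturbation size $\rho$) may differ, so this is a reference sketch rather than the paper's definition:
$$
D_{\chi^2}(Q \,\|\, P) \;=\; \int \Big( \frac{dQ}{dP} - 1 \Big)^2 \, dP,
\qquad
R_\rho(h) \;=\; \sup_{Q \,:\, D_{\chi^2}(Q \,\|\, P) \le \rho} \mathbb{E}_{z \sim Q}\big[\ell(h, z)\big],
$$
where $P$ is the data-generating distribution, $\ell$ is the loss, and the supremum ranges over distributions $Q \ll P$ within a $\chi^2$-ball of radius $\rho$ around $P$.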