An Exponential Tail Bound for the Deleted Estimate

Authors

  • Karim Abou–Moustafa, University of Alberta
  • Csaba Szepesvári, University of Alberta

DOI:

https://doi.org/10.1609/aaai.v33i01.33013143

Abstract

There is accumulating evidence in the literature that stability of learning algorithms is a key characteristic that permits a learning algorithm to generalize. Despite various insightful results in this direction, there seems to be an overlooked dichotomy in the type of stability-based generalization bounds found in the literature. On one hand, the literature seems to suggest that exponential generalization bounds for the estimated risk, which are optimal, can only be obtained through stringent, distribution-independent, and computationally intractable notions of stability such as uniform stability. On the other hand, weaker notions of stability such as hypothesis stability, although distribution dependent and more amenable to computation, seem to yield only polynomial generalization bounds for the estimated risk, which are suboptimal. In this paper, we address the gap between these two regimes of results. In particular, the main question we address is whether it is possible to derive exponential generalization bounds for the estimated risk using a notion of stability that is computationally tractable and distribution dependent, but weaker than uniform stability. Using recent advances in concentration inequalities, together with such a notion of stability, we derive an exponential tail bound for the concentration of the estimated risk of a hypothesis returned by a general learning rule, where the estimated risk is expressed in terms of the deleted estimate. Interestingly, our final bound resembles previous exponential generalization bounds for the deleted estimate, in particular the result of Bousquet and Elisseeff (2002) for the regression case.
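The deleted estimate referred to in the abstract is the classical leave-one-out estimate of the risk: train the learning rule on the sample with one example removed, evaluate the resulting hypothesis on that held-out example, and average over all examples. The following minimal sketch illustrates the idea; the constant mean-predictor learning rule, squared loss, and function names are illustrative choices, not taken from the paper:

```python
def deleted_estimate(S, learn, loss):
    """Leave-one-out (deleted) estimate of the risk.

    S: list of (x, y) examples.
    learn: rule mapping a sample to a hypothesis h(x).
    loss: loss(h, x, y) of hypothesis h on example (x, y).
    """
    n = len(S)
    total = 0.0
    for i in range(n):
        S_minus_i = S[:i] + S[i + 1:]   # delete the i-th example
        h_i = learn(S_minus_i)          # train on the reduced sample
        x_i, y_i = S[i]
        total += loss(h_i, x_i, y_i)    # test on the held-out example
    return total / n

if __name__ == "__main__":
    # Toy example: constant (mean) predictor with squared loss.
    S = [(0, 1.0), (0, 2.0), (0, 3.0)]
    learn = lambda sample: (lambda x, m=sum(y for _, y in sample) / len(sample): m)
    loss = lambda h, x, y: (h(x) - y) ** 2
    print(deleted_estimate(S, learn, loss))  # → 1.5
```

Note that each of the n retrained hypotheses sees a sample of size n − 1, which is why the stability of the learning rule under deleting a single example governs how tightly this estimate concentrates around the true risk.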

Published

2019-07-17

How to Cite

Abou–Moustafa, K., & Szepesvári, C. (2019). An Exponential Tail Bound for the Deleted Estimate. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 3143-3150. https://doi.org/10.1609/aaai.v33i01.33013143

Section

AAAI Technical Track: Machine Learning