The Extended Littlestone's Dimension for Learning with Mistakes and Abstentions

Chicheng Zhang, Kamalika Chaudhuri

Research output: Contribution to journal › Conference article › peer-review

2 Scopus citations


This paper studies classification with an abstention option in the online setting. In this setting, examples arrive sequentially, the learner is given a hypothesis class H, and the goal of the learner is to either predict a label on each example or abstain, while ensuring that it does not make more than a pre-specified number of mistakes when it does predict a label. Previous work on this problem has left open two main challenges. First, not much is known about the optimality of algorithms, and in particular, about what an optimal algorithmic strategy is for any individual hypothesis class. Second, while the realizable case has been studied, the more realistic non-realizable scenario is not well-understood. In this paper, we address both challenges. First, we provide a novel measure, called the Extended Littlestone's Dimension, which captures the number of abstentions needed to ensure a certain number of mistakes. Second, we explore the non-realizable case, and provide upper and lower bounds on the number of abstentions required by an algorithm to guarantee a specified number of mistakes.
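To make the setting concrete, here is a minimal sketch (not the paper's algorithm) of online classification with abstention over a finite hypothesis class in the realizable case. The learner predicts only when every hypothesis consistent with the history agrees on the current example, and abstains otherwise, so it makes zero mistakes at the cost of some abstentions. The hypothesis class of thresholds in the example is purely illustrative.

```python
# Hypothetical sketch of the online learning-with-abstention protocol:
# maintain the version space of hypotheses consistent with the labels
# seen so far; predict only on unanimous agreement, else abstain.

def run(hypotheses, stream):
    """hypotheses: list of functions x -> label; stream: iterable of (x, y)."""
    version_space = list(hypotheses)
    mistakes = abstentions = 0
    for x, y in stream:
        preds = {h(x) for h in version_space}
        if len(preds) == 1:
            # All surviving hypotheses agree: safe to predict.
            (guess,) = preds
            if guess != y:
                mistakes += 1
        else:
            # Disagreement: abstain rather than risk a mistake.
            abstentions += 1
        # Keep only hypotheses consistent with the revealed label.
        version_space = [h for h in version_space if h(x) == y]
    return mistakes, abstentions

# Illustrative class: threshold classifiers h_t(x) = 1[x >= t] on small ints,
# with the true labels generated by the threshold t = 2 (realizable case).
H = [lambda x, t=t: int(x >= t) for t in range(5)]
target = H[2]
stream = [(x, target(x)) for x in [0, 3, 1, 2, 3, 0]]
print(run(H, stream))  # → (0, 4): no mistakes, four abstentions
```

In the realizable case this strategy never errs, but it may abstain often; the paper's Extended Littlestone's Dimension characterizes the trade-off between the mistake budget and the number of abstentions required, including in the non-realizable case, where a version space like the one above can become empty.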

Original language: English (US)
Pages (from-to): 1584-1616
Number of pages: 33
Journal: Journal of Machine Learning Research
Issue number: June
State: Published - Jun 6 2016
Externally published: Yes
Event: 29th Conference on Learning Theory, COLT 2016 - New York, United States
Duration: Jun 23 2016 - Jun 26 2016

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence


