Shay Moran, Ph.D. Thesis Seminar
Thursday, 19.5.2016, 12:00
[Note the change of time and date]
Generalization and simplification are deeply related to each other:
simpler explanations often reveal principles that apply more generally,
and predictive scientific theories often boil down
to a few simple (and deep) laws.
This talk is about a manifestation of this
generalization-simplification link within computational learning theory.
We will consider the following formal
frameworks for `generalization' and `simplification'.
(i) Probably Approximately Correct (PAC) learning --
a statistical framework that formalizes generalization.
(ii) Sample compression schemes --
a combinatorial framework that formalizes simplification.
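To make the second notion concrete, here is a minimal sketch (not taken from the talk) of a sample compression scheme for the classic toy class of one-dimensional threshold classifiers h_t(x) = 1 iff x >= t. For this class, any realizable labeled sample can be compressed to at most one point, from which a consistent classifier can be reconstructed; the function names `compress` and `reconstruct` are illustrative choices.

```python
# Sketch of a sample compression scheme for 1-D thresholds h_t(x) = 1 iff x >= t.
# Assumes the input sample is realizable, i.e. labeled by some threshold.

def compress(sample):
    """Keep only the smallest positively labeled point (or nothing)."""
    positives = [x for x, y in sample if y == 1]
    return [(min(positives), 1)] if positives else []

def reconstruct(compressed):
    """Rebuild a threshold classifier from the compressed subsample."""
    if not compressed:
        return lambda x: 0          # no positives kept: all-negative rule
    t = compressed[0][0]            # smallest positive point acts as threshold
    return lambda x: 1 if x >= t else 0

# Usage: a sample labeled by the threshold t = 2.5
sample = [(1.0, 0), (2.0, 0), (3.0, 1), (4.0, 1)]
h = reconstruct(compress(sample))
assert all(h(x) == y for x, y in sample)   # consistent with every sample point
```

The point of the definition is exactly this shape: a compression map that retains a small subsample, and a reconstruction map that recovers a hypothesis consistent with the entire original sample.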
Littlestone and Warmuth showed that,
for binary classification problems,
the ability to simplify implies the ability to generalize,
and asked whether the other direction holds.
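Quantitatively, the Littlestone-Warmuth direction is usually stated along the following lines (up to constants, which vary across presentations): if a concept class admits a sample compression scheme of size k, then with probability at least 1 - \delta over an i.i.d. sample of size m, any hypothesis reconstructed from a compressed subsample and consistent with the whole sample satisfies

\[
\operatorname{err}(h) \;\le\; O\!\left(\frac{k \log m + \log(1/\delta)}{m - k}\right),
\]

so a small compression size directly yields a generalization guarantee.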
In the first part of this talk we will give
an affirmative answer to this question.
Apart from being a formal manifestation of this link,
the connection between compression and generalization
has found applications in binary-classification theory.
A notable example is Freund and Schapire's boosting theory,
which is based on a weak type of compression.
It is therefore interesting to extend the notion of compression
beyond binary classification problems.
In the second part of the talk we will discuss such extensions,
and establish their equivalence with learnability.
No prior knowledge of machine learning will be assumed in the talk.