Events and Lectures at the Henry and Marilyn Taub Faculty of Computer Science

Approximating the Number of Relevant Variables of a Parity Implies Proper Learning
Speaker: George Haddad (M.Sc. Thesis Seminar)
Date: Monday, 13.04.2026, 12:30
Location: Taub 601
Advisor: Prof. Nader Bshouty

Consider a model in which we can access a parity function through labeled examples drawn uniformly at random, in the presence of random classification noise, where each label is flipped independently with some fixed probability. In this thesis, we study learning in this model and show that approximating the number of relevant variables of a parity function is as hard as properly learning it.
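For a concrete picture of this example oracle, here is a minimal sketch; the function name, the relevant-variable set, and the noise rate below are illustrative choices, not taken from the thesis.

```python
import random

def noisy_parity_example(n, relevant, eta):
    """Draw one labeled example from the model: x is uniform over {0,1}^n,
    the label is the parity of the relevant coordinates, and the label is
    flipped independently with probability eta (random classification noise)."""
    x = [random.randint(0, 1) for _ in range(n)]
    label = sum(x[i] for i in relevant) % 2   # parity over the relevant variables
    if random.random() < eta:                 # random classification noise
        label ^= 1
    return x, label

# Illustrative draw: n = 20 variables, d(f) = 3 relevant variables, noise rate 0.1.
x, y = noisy_parity_example(20, {2, 5, 11}, 0.1)
```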

More specifically, let $\gamma : \mathbb{R}^+ \to \mathbb{R}^+$ be any strictly increasing function satisfying $\gamma(x) \ge x$. In our first result, we show that from any polynomial-time algorithm that returns a $\gamma$-approximation $D$ of the number of relevant variables $d(f)$ of any parity function $f$ (i.e., $\gamma^{-1}(d(f)) \leq D \leq \gamma(d(f))$), we can, in polynomial time, construct a solution to the long-standing open problem of polynomial-time learning $k(n)$-sparse parities (parities with $k(n)\le n$ relevant variables), where $k(n) = \omega_n(1)$.
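To make the approximation guarantee concrete, the sketch below instantiates it with the hypothetical choice $\gamma(x) = x^2$ (strictly increasing, and $\gamma(x) \ge x$ on the relevant range $x \ge 1$), so that $\gamma^{-1}(x) = \sqrt{x}$ and a valid estimate must satisfy $\sqrt{d(f)} \le D \le d(f)^2$.

```python
import math

def gamma(x):        # illustrative choice gamma(x) = x^2, not from the thesis
    return x * x

def gamma_inv(x):    # inverse of gamma on [1, infinity)
    return math.sqrt(x)

def is_gamma_approximation(D, d):
    """Check gamma^{-1}(d) <= D <= gamma(d) for an estimate D of d = d(f)."""
    return gamma_inv(d) <= D <= gamma(d)

assert is_gamma_approximation(D=9, d=4)       # sqrt(4) = 2 <= 9 <= 16
assert not is_gamma_approximation(D=1, d=4)   # 1 < sqrt(4): underestimates d(f)
```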

In our second result, we show that from any $T(n)$-time algorithm that, for any parity $f$, returns a $\gamma$-approximation of the number of relevant variables $d(f)$ of $f$, we can, in polynomial time, construct a $\mathrm{poly}(\Gamma(n))\cdot T(\Gamma(n)^2)$-time algorithm that properly learns parities, where $\Gamma(x)=\gamma(\gamma(x))$.
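To see how the running time composes, consider the same hypothetical $\gamma(x) = x^2$: then $\Gamma(x) = \gamma(\gamma(x)) = x^4$, so a $T(n)$-time approximator yields a proper learner running in time $\mathrm{poly}(n^4)\cdot T(n^8)$; in particular, a polynomial-time approximator gives a polynomial-time proper learner. A minimal sketch:

```python
def Gamma(n, gamma):
    """The blow-up Gamma(n) = gamma(gamma(n)) appearing in the reduction."""
    return gamma(gamma(n))

# With the illustrative gamma(x) = x^2: Gamma(n) = n^4, so the proper learner
# built from a T(n)-time approximator runs in time poly(n^4) * T(n^8).
assert Gamma(10, lambda x: x * x) == 10_000
```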

If $T(\Gamma(n)^2)=\exp(o(n/\log n))$, this would resolve another long-standing open problem of properly learning parities in the presence of random classification noise in time $\exp(o(n/\log n))$.