Ronen Shaltiel (Haifa University)
Wednesday, 30.11.2016, 12:30
A celebrated result by Impagliazzo and Wigderson shows that, under complexity-theoretic hardness assumptions, every randomized algorithm can be transformed into one that uses only logarithmically many random bits, with polynomial slowdown. Such algorithms can then be completely derandomized, again with polynomial slowdown. In the talk I will discuss recent work attempting to extend this approach to:
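The second step above can be illustrated with a minimal sketch (not from the talk; the algorithm and names here are illustrative assumptions): once an algorithm uses only $r = O(\log n)$ random bits, one can enumerate all $2^r$ coin sequences, which is polynomially many, and take a majority vote.

```python
from itertools import product

def randomized_alg(x, coins):
    # Toy stand-in for a randomized algorithm (an illustrative assumption):
    # decides whether x has an even number of 1-bits. It ignores its coins,
    # so it is correct on a majority of coin settings trivially.
    return x.count("1") % 2 == 0

def derandomize(alg, x, num_coins):
    """Enumerate all 2^num_coins coin sequences and take a majority vote.
    When num_coins = O(log n), the 2^num_coins factor is polynomial in n,
    so full derandomization costs only a polynomial slowdown."""
    votes = sum(alg(x, coins) for coins in product((0, 1), repeat=num_coins))
    return votes * 2 > 2 ** num_coins

print(derandomize(randomized_alg, "1011", 3))  # → False (three 1-bits)
```

The hard part, of course, is the first step: constructing a pseudorandom generator, under hardness assumptions, that shrinks the number of random bits to logarithmic while preserving the algorithm's behavior.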
1. Randomized algorithms that err with probability $1-\epsilon$ for small $\epsilon$. (Here, the goal is to minimize the number of random bits and the slowdown as a function of $\epsilon$.)
2. Known randomized SAT-solving algorithms. (Here, polynomial slowdown is a deal breaker, as it gives trivial algorithms that run in super-exponential time.)
3. Randomized algorithms that sample from probability distributions. (Here, the goal is to sample a statistically close distribution using only a few random bits.)
Based on joint work with Artemenko, Impagliazzo and Kabanets.