Boaz Nadler (Weizmann Institute of Science)
Tuesday, 29.12.2015, 11:30
Collecting large amounts of data is now common in many applications. In certain cases, particularly those involving real-time processing, we may not be able to analyze all of it. This raises the following question: how well can we perform a given task under severe computational constraints?
In this talk I'll discuss such challenges for two specific problems:
1) edge detection from large and noisy images; and
2) detection of strong correlations / near duplicates in a large set of elements.
For both problems, we develop sub-linear time algorithms and study the tradeoff between statistical accuracy and computational complexity.
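To illustrate the flavor of such an accuracy/complexity tradeoff (this is a generic sketch, not the speaker's algorithm), one can estimate the correlation between two long vectors from only a small random sample of their coordinates: reading m coordinates instead of all d gives a noisy but much cheaper test for near-duplicate pairs. The function names and parameters below are hypothetical.

```python
import numpy as np

def sampled_correlation(x, y, m, rng):
    """Estimate corr(x, y) from m randomly sampled coordinates (sub-linear in d)."""
    idx = rng.choice(len(x), size=m, replace=False)
    xs = x[idx] - x[idx].mean()
    ys = y[idx] - y[idx].mean()
    denom = np.linalg.norm(xs) * np.linalg.norm(ys)
    return float(xs @ ys / denom) if denom > 0 else 0.0

def find_near_duplicates(vectors, m=64, threshold=0.9, seed=0):
    """Flag pairs whose sampled correlation exceeds the threshold.

    Each pairwise test reads only m coordinates, so the per-pair cost is
    sub-linear in the vector dimension d; this naive sketch is still
    quadratic in the number of vectors. Larger m lowers the variance of
    the estimate at higher computational cost -- the tradeoff in question.
    """
    rng = np.random.default_rng(seed)
    n = len(vectors)
    pairs = []
    for i in range(n):
        for j in range(i + 1, n):
            if sampled_correlation(vectors[i], vectors[j], m, rng) >= threshold:
                pairs.append((i, j))
    return pairs
```

With m fixed and d growing, the estimator's variance stays roughly constant, so strongly correlated pairs can be detected without ever reading the full vectors.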
Based on joint work with Inbal Horev, Meirav Galun, Ronen Basri, and Ery Arias-Castro, and with Ofer Shwartz.