Events

Events and Lectures at the Henry and Marilyn Taub Faculty of Computer Science

Speaker: Aviad Aberdam (Computer Science / Electrical Engineering, Technion)
Date: Tuesday, 08.02.2022, 10:30
Location: Zoom Lecture: https://technion.zoom.us/my/chaimbaskin
Neural networks based on unfolding an iterative solver, such as LISTA (Learned Iterative Soft Thresholding Algorithm), are widely used due to their accelerated performance. Nevertheless, as opposed to non-learned solvers, these networks are trained on a certain dictionary and are therefore inapplicable in varying-model scenarios. This talk introduces an adaptive learned solver, termed Ada-LISTA, which receives pairs of signals and their corresponding dictionaries as inputs, and learns a universal architecture to serve them all. We prove that this scheme is guaranteed to solve sparse coding at a linear rate for varying models, including dictionary perturbations and permutations. We also provide an extensive numerical study demonstrating its practical adaptation capabilities.

Another way to improve performance is to adopt coordinate descent algorithms, which are popular in machine learning and large-scale data analysis due to their low computational cost. In this talk, we define a monotone accelerated coordinate gradient descent-type method for problems consisting of minimizing f + g, where f is quadratic and g is nonsmooth and non-separable. The algorithm is enabled by employing the forward-backward envelope, a composite envelope that possesses an exact smooth reformulation of f + g. We prove that the algorithm achieves a convergence rate of O(1/k^1.5) in terms of the original objective function, improving on current coordinate descent-type algorithms. In addition, we describe an adaptive variant of the algorithm that backtracks the spectral information and the coordinate Lipschitz constants of the problem. We numerically examine our algorithms in various settings, including two-dimensional total-variation-based image inpainting problems, showing a clear advantage in performance over current coordinate descent-type methods.

Aviad Aberdam is a PhD student under the supervision of Prof. Michael Elad.
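As background for the first part of the talk, the following is a minimal sketch (not the speaker's code) of classical ISTA next to an unfolded LISTA-style forward pass; all names (soft_threshold, lista_forward, the layer count, the test problem) are illustrative assumptions. It shows why a trained LISTA network is tied to one dictionary: the learned weights W1 and W2 stand in for expressions built from a specific D, which is the limitation Ada-LISTA removes by taking (signal, dictionary) pairs as inputs.

```python
# Minimal sketch (illustrative assumptions, not the authors' code):
# ISTA and an unfolded LISTA-style forward pass for sparse coding,
#   min_z 0.5*||x - D z||^2 + lam*||z||_1 .
import numpy as np

def soft_threshold(v, theta):
    """Proximal operator of the l1 norm: elementwise shrinkage."""
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def ista(x, D, lam, n_iter=100):
    """Classical (non-learned) ISTA; works for any dictionary D."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = soft_threshold(z - (D.T @ (D @ z - x)) / L, lam / L)
    return z

def lista_forward(x, W1, W2, theta, n_layers=16):
    """Unfolded LISTA-style forward pass: each 'layer' mimics one ISTA step,
    but W1, W2, theta are trained weights tied to one training dictionary."""
    z = soft_threshold(W1 @ x, theta)
    for _ in range(n_layers - 1):
        z = soft_threshold(W1 @ x + W2 @ z, theta)
    return z

# Example: initializing the LISTA weights at their ISTA values makes the two
# agree; training would then adapt W1, W2, theta to the fixed dictionary D.
rng = np.random.default_rng(0)
n, m, lam = 20, 50, 0.1
D = rng.standard_normal((n, m)) / np.sqrt(n)
x = D @ (rng.standard_normal(m) * (rng.random(m) < 0.1))
L = np.linalg.norm(D, 2) ** 2
z_ista = ista(x, D, lam)
z_lista = lista_forward(x, D.T / L, np.eye(m) - D.T @ D / L, lam / L)
```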
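For the second part, this is a hedged sketch of plain proximal coordinate descent on a quadratic f plus a separable l1 term, using the per-coordinate Lipschitz constants A[i, i] that make the per-step cost low. The talk's actual method handles a non-separable g by running coordinate steps on the forward-backward envelope of f + g; that smooth reformulation is not implemented here, and all names below are assumptions for illustration.

```python
# Minimal sketch (illustrative, not from the talk): cyclic proximal
# coordinate descent for  min_z 0.5*z^T A z - b^T z + lam*||z||_1,
# with A symmetric positive definite (so A[i, i] > 0).
import numpy as np

def soft_threshold(v, theta):
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def prox_coordinate_descent(A, b, lam, n_epochs=50):
    """Each coordinate step uses its own Lipschitz constant A[i, i];
    the gradient of f is maintained with a cheap rank-one update."""
    m = b.size
    z = np.zeros(m)
    grad = -b                              # gradient of f = A z - b at z = 0
    for _ in range(n_epochs):
        for i in range(m):
            z_new = soft_threshold(z[i] - grad[i] / A[i, i], lam / A[i, i])
            if z_new != z[i]:
                grad += A[:, i] * (z_new - z[i])
                z[i] = z_new
    return z

# Example on a random symmetric positive-definite quadratic:
rng = np.random.default_rng(1)
m = 50
M = rng.standard_normal((m, m))
A = M.T @ M / m + 0.1 * np.eye(m)        # symmetric PD, positive diagonal
b = rng.standard_normal(m)
z_hat = prox_coordinate_descent(A, b, lam=0.1)
```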