Events

Events and lectures at the Henry and Marilyn Taub Faculty of Computer Science

Speaker: Leah Bar (Electrical Engineering, University of Minnesota)
Date: Tuesday, 23.12.2008, 11:30
Location: Room 1061, Meyer Building, Faculty of Electrical Engineering
Many problems in image processing are addressed via the minimization of a cost functional. The most prominent optimization technique is gradient descent, often used because of its simplicity and its applicability where other techniques, e.g., those coming from discrete optimization, cannot be used. Yet gradient descent suffers from slow convergence, whose rate depends strongly on the condition number of the functional's Hessian, and it often reaches only a local minimum. Newton-type methods, on the other hand, are known to have rapid, quadratic convergence. In its classical form, the Newton method relies on an $L^2$-type norm to define the descent direction. In this work, we generalize and reformulate this important optimization method by introducing a novel Newton method based on more general norms. This generalization opens up new possibilities in the extraction of the Newton step, including benefits such as mathematical stability and the incorporation of smoothness constraints. We first present the derivation of the modified Newton step in the calculus-of-variations framework needed for image processing. We then demonstrate the method on two common objective functionals: variational image deblurring and geodesic active contours for image segmentation. We show that, in addition to the fast convergence, norms adapted to the problem at hand yield different and superior results.

*Joint work with Guillermo Sapiro
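
As a rough illustration of what a "Newton method based on more general norms" can mean, the LaTeX sketch below contrasts the classical $L^2$ Newton step with a norm-adapted variant. The regularized least-squares formulation, the operator $M = I - \Delta$, and the parameter $\alpha$ are assumptions introduced here for illustration only; they are not necessarily the construction presented in the talk.

% Illustrative sketch only: the norm-adapted formulation below is an assumption,
% not necessarily the derivation given in the talk.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let $E(u)$ be the objective functional. The classical Newton step $d$ solves the
linearized optimality condition
\begin{equation}
  H_E(u)\, d = -\nabla_{L^2} E(u),
\end{equation}
where $\nabla_{L^2} E(u)$ is the $L^2$ Riesz representative of the first variation,
$\langle \nabla_{L^2} E(u), v \rangle_{L^2} = \delta E(u)[v]$, and $H_E(u)$ is the
second-variation (Hessian) operator.

Measuring the step in a different inner product
$\langle v, w \rangle_X = \langle M v, w \rangle_{L^2}$, for example the Sobolev $H^1$
product with $M = I - \Delta$, one may instead pose a norm-adapted, regularized
Newton-type step
\begin{equation}
  d = \arg\min_{d}\;
      \tfrac{1}{2}\, \bigl\| \nabla_{L^2} E(u) + H_E(u)\, d \bigr\|_{X}^{2}
      + \tfrac{\alpha}{2}\, \| d \|_{X}^{2},
\end{equation}
whose normal equations
\begin{equation}
  \bigl( H_E(u)^{*} M\, H_E(u) + \alpha M \bigr)\, d
  = -\, H_E(u)^{*} M\, \nabla_{L^2} E(u)
\end{equation}
depend explicitly on the chosen norm: as $\alpha \to 0$ with $H_E(u)$ invertible the
step reduces to the classical one, while for $\alpha > 0$ the norm $\|\cdot\|_X$
controls both the stabilization and the smoothness imposed on the step.
\end{document}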