The Taub Faculty of Computer Science Events and Talks

Curvature Shift in Neural Networks
Akiva Michael Block (M.Sc. Thesis Seminar)
Sunday, 16.06.2024, 13:00
Zoom Lecture: 94919856641 & Taub 9
Advisor: Prof. Assaf Schuster
Neural network optimization is a challenging task, in large part because the loss function is generally nonconvex. Various algorithms have been proposed for this task, but most focus on the convex subspaces of the loss function to avoid the challenge of finding the optimal step in concave subspaces. Beyond avoiding the concave subspaces entirely (which can carry a significant computational burden), they also simply take the minimizer of a second-degree Taylor approximation of the loss function as their step, without addressing the possibly significant gap between the loss function and its second-order approximation. In our work, we tackle this problem by exploiting the Lipschitz continuity of the loss function to derive the regret-optimal step in all subspaces, whether convex or concave, and we discuss the validity and (sometimes) experimental success of the resulting optimizers. We hope that our work will provide practitioners with new tools for selecting neural network training algorithms.
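The abstract's premise can be illustrated with a minimal sketch (not the thesis method): the step that minimizes a second-degree Taylor approximation is well defined where the loss is locally convex, but in a concave region the same formula moves toward a local maximum, which is one reason standard second-order methods restrict themselves to convex subspaces. The one-dimensional `newton_step` helper below is purely illustrative.

```python
# Illustrative sketch, assuming a 1-D loss with known derivatives.
# The minimizer of the quadratic model q(s) = f(x) + g*s + 0.5*h*s^2
# is s = -g/h, i.e. the classic Newton step.

def newton_step(grad: float, hess: float, x: float) -> float:
    """Return the point that minimizes (or, if hess < 0, MAXIMIZES)
    the second-degree Taylor approximation of the loss around x."""
    return x - grad / hess

# Convex region: f(x) = x^2 at x = 1 (grad = 2, hess = 2).
# The step lands on the minimizer x = 0.
x_convex = newton_step(grad=2.0, hess=2.0, x=1.0)

# Concave region: f(x) = -x^2 at x = 1 (grad = -2, hess = -2).
# The identical formula lands on x = 0, which is the MAXIMUM of f,
# so a naive quadratic-model step ascends rather than descends here.
x_concave = newton_step(grad=-2.0, hess=-2.0, x=1.0)
```

In both cases the formula returns the stationary point of the quadratic model; only the sign of the curvature decides whether that point is a minimum or a maximum of the model, which is the failure mode the thesis addresses by deriving a regret-optimal step that remains valid in concave subspaces.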