The Taub Faculty of Computer Science Events and Talks

Pixel Club: On the Connection between Deep Neural Networks and Kernel Methods
Speaker: Ronen Basri (Weizmann Institute of Science)
Date: Tuesday, 27.04.2021, 11:30
Location: Zoom Lecture: 91488539030
Recent theoretical work has shown that massively overparameterized neural networks are equivalent to kernel regressors that use Neural Tangent Kernels (NTKs). Experiments indicate that these kernel methods perform similarly to real neural networks. My work on this topic aims to better understand the properties of NTK and relate them to the properties of real neural networks. In particular, I will argue that for input data distributed uniformly on the sphere, NTK favors low-frequency predictions over high-frequency ones, potentially explaining why overparameterized networks do not overfit their training data. I will further discuss the behavior of NTK when the data is distributed nonuniformly, and finally show that NTK is closely related to the classical Laplace kernel, which has a simple closed form. Our results suggest that much insight about neural networks can be obtained from the analysis of NTK.

Bio: Ronen Basri is a Professor of Computer Science and Dean of the Faculty of Mathematics and Computer Science at the Weizmann Institute of Science. Additionally, he is the incumbent of the Elaine and Bram Goldsmith Chair of Applied Mathematics. Ronen Basri received his Ph.D. degree from the Weizmann Institute of Science and was a postdoctoral fellow at the Massachusetts Institute of Technology before assuming a faculty position at Weizmann. He has further held visiting positions at the NEC Research Institute, the Toyota Technological Institute at Chicago, the Howard Hughes Janelia Farm Campus, and the University of Maryland at College Park. His research interests include computer vision and machine learning. His recent work focuses primarily on shape modeling and 3D reconstruction, as well as theoretical aspects of deep learning.
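
To give a concrete sense of the "kernel regressor" picture mentioned in the abstract, below is a minimal illustrative sketch (not taken from the talk) of kernel ridge regression with the classical Laplace kernel k(x, z) = exp(-||x - z|| / sigma). The function names, the bandwidth sigma, the regularization constant, and the toy data on the unit circle are assumptions made for the example only.

    import numpy as np

    def laplace_kernel(X, Z, sigma=1.0):
        # Laplace kernel k(x, z) = exp(-||x - z|| / sigma), computed pairwise
        # between the rows of X and the rows of Z.
        dists = np.linalg.norm(X[:, None, :] - Z[None, :, :], axis=-1)
        return np.exp(-dists / sigma)

    def kernel_ridge_fit(X_train, y_train, sigma=1.0, reg=1e-6):
        # Solve (K + reg * I) alpha = y for the dual coefficients alpha.
        K = laplace_kernel(X_train, X_train, sigma)
        return np.linalg.solve(K + reg * np.eye(len(X_train)), y_train)

    def kernel_ridge_predict(X_test, X_train, alpha, sigma=1.0):
        # Predict f(x) = sum_i alpha_i * k(x, x_i).
        return laplace_kernel(X_test, X_train, sigma) @ alpha

    # Toy example: a noisy low-frequency target on the unit circle
    # (points on a sphere in R^2), echoing the uniform-on-the-sphere setting.
    rng = np.random.default_rng(0)
    theta = rng.uniform(0, 2 * np.pi, size=200)
    X = np.stack([np.cos(theta), np.sin(theta)], axis=1)
    y = np.sin(3 * theta) + 0.1 * rng.normal(size=len(theta))

    alpha = kernel_ridge_fit(X, y, sigma=1.0)
    y_hat = kernel_ridge_predict(X, X, alpha, sigma=1.0)
    print("training MSE:", np.mean((y_hat - y) ** 2))

The sketch only illustrates standard kernel regression; the talk's result is that the predictor obtained from a massively overparameterized network trained by gradient descent behaves like such a regressor with the NTK, and that the NTK in turn is closely related to the Laplace kernel used here.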