On the Generalization of Gaussian Dropout using PAC-Bayesian bounds and Log-Sobolev Inequalities
Yaniv Nemcovsky, M.Sc. Thesis Seminar
Monday, 29.7.2019, 10:00
Room 601 Taub Bld.
Advisor: Dr. Tamir Hazan
The omnipresence of increasingly large deep networks derives primarily from their empirical successes. Indeed, empirical evidence places a strong emphasis on operating at scale, with more parameters and layers, in order to aid both optimization and generalization. This counter-intuitive trend has eluded theoretical analysis. In this work, we present a PAC-Bayesian generalization bound for Gaussian dropout that requires only an on-average loss-function bound and on-average gradient-norm bounds, by relying on log-Sobolev inequalities for Gaussian measures. Our preliminary experimental evaluation shows that our bounds *decrease* when adding more layers or more parameters to the network.
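For readers unfamiliar with the object being analyzed: Gaussian dropout is commonly defined as multiplying each activation by noise drawn from a Gaussian with mean 1, so the layer's expected output is unchanged. The sketch below illustrates that standard formulation only; the function name and the variance parameter `alpha` are illustrative choices, not details taken from the thesis.

```python
import numpy as np

def gaussian_dropout(x, alpha=0.25, rng=None):
    """Multiply each activation by a factor drawn from N(1, alpha).

    Because the noise has mean 1, E[output] = input, so no rescaling
    is needed at test time (unlike standard Bernoulli dropout).
    `alpha` controls the noise variance and is a hypothetical default here.
    """
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.normal(loc=1.0, scale=np.sqrt(alpha), size=x.shape)
    return x * noise

# Example: perturb a batch of hidden-layer activations.
rng = np.random.default_rng(0)
activations = np.ones((4, 3))
noisy = gaussian_dropout(activations, alpha=0.25, rng=rng)
```

Over many draws the noisy activations average back to the clean ones, which is the mean-preservation property the multiplicative Gaussian noise is chosen for.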