# Colloquia and Seminars

To join the email distribution list of the CS colloquia, please visit the list subscription page.

Computer Science events calendar in ICS format for Google Calendar and for Outlook.

Academic Calendar at the Technion site.

## Upcoming Colloquia & Seminars

• ### AI WILL CHANGE THE WORLD

Speaker:
Jensen Huang (NVIDIA) - CS Special Guest Talk
Date:
Thursday, 19.10.2017, 11:00
Place:
CS Taub Auditorium 2

Jensen Huang, NVIDIA president and world technology leader, will visit CS and give a special talk: "AI WILL CHANGE THE WORLD". The lecture is open to all.

• ### A Deep Learning Approach to Autonomous Driving

Speaker:
Larry Jackel (NVIDIA) - CS Special Guest Talk
Date:
Thursday, 19.10.2017, 14:30
Place:
Room 337 Taub Bld.

This talk describes a system, known as PilotNet, that provides perception and control for self-driving cars. PilotNet learns from data, emulating the behavior of human drivers. The use of hand-crafted rules is kept to a minimum. As we gather more data we find that PilotNet performance keeps improving. Our test car, controlled by PilotNet, can drive, on average, over 20 miles on a highway before a human intervention is required. The core technology in PilotNet is convolutional neural networks (CNNs). These nets can perform lane keeping, lane changes, and turns at intersections. Similar nets have shown promise in identifying the presence and location of stop signs using only weakly supervised learning. To better understand how PilotNet works, we built a visualization tool that runs real-time on our test car and that highlights objects in the input image that are most salient in determining PilotNet’s output.
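PilotNet itself is a convolutional network trained on camera images; as a stand-alone illustration of the underlying idea (behavioral cloning: learn a controller by regressing on recorded human actions), here is a toy sketch with a linear model and made-up data. The feature dimensions, "human" steering function, and all constants are assumptions for illustration only.

```python
import random

# Toy behavioral-cloning sketch (not the real PilotNet, which is a CNN):
# learn a steering command from input features by imitating recorded
# human steering, using plain SGD on a linear model.
random.seed(0)

def true_steering(x):
    # Hypothetical "human driver" we want to imitate.
    return 0.8 * x[0] - 0.5 * x[1]

# Recorded driving data: (features, human steering angle).
data = []
for _ in range(200):
    x = [random.uniform(-1, 1), random.uniform(-1, 1)]
    data.append((x, true_steering(x)))

w = [0.0, 0.0]
lr = 0.1
for epoch in range(50):
    for x, y in data:
        pred = sum(wi * xi for wi, xi in zip(w, x))
        err = pred - y
        # SGD step on squared error: w <- w - lr * err * x
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]

print([round(wi, 2) for wi in w])  # approaches [0.8, -0.5]
```

With noise-free data the linear model recovers the imitated policy exactly; the talk's point is that the same imitation objective, with a CNN over raw pixels, scales to real lane keeping.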

• ### Large Batch Training of Convolutional Networks with Layer-wise Adaptive Rate Scaling

Speaker:
Boris Ginsburg (NVIDIA) - CS Special Guest Talk
Date:
Thursday, 19.10.2017, 14:30
Place:
Room 337 Taub Bld.

A common way to speed up training of large convolutional networks is to add computational units. Training is then performed using data-parallel synchronous Stochastic Gradient Descent (SGD), with the mini-batch divided between computational units. As the number of nodes increases, the batch size grows. However, training with a large batch often results in lower model accuracy. We argue that the current recipe for large batch training (linear learning rate scaling with warm-up) is not general enough and training may diverge. To overcome these optimization difficulties, we propose a new training algorithm based on Layer-wise Adaptive Rate Scaling (LARS). Using LARS, we scaled AlexNet and ResNet-50 to a batch size of 16K without loss in accuracy.
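The core of the LARS rule described above is a per-layer learning rate proportional to the ratio of the weight norm to the gradient norm. A minimal single-layer sketch follows; the trust coefficient and weight-decay values are illustrative assumptions, and the full algorithm applies this rule per layer together with momentum and learning-rate warm-up.

```python
import math

def l2(v):
    return math.sqrt(sum(x * x for x in v))

def lars_update(w, grad, global_lr, trust=0.001, weight_decay=0.0005):
    """One LARS step for a single layer: the step size is scaled per
    layer by ||w|| / (||grad|| + weight_decay * ||w||), so layers with
    small gradients relative to their weights still make progress."""
    wn, gn = l2(w), l2(grad)
    local_lr = trust * wn / (gn + weight_decay * wn) if wn > 0 and gn > 0 else 1.0
    step = global_lr * local_lr
    return [wi - step * (gi + weight_decay * wi) for wi, gi in zip(w, grad)]

# One illustrative step on a tiny "layer".
w = [0.5, -0.3, 0.8]
grad = [0.1, 0.02, -0.05]
w = lars_update(w, grad, global_lr=1.0)
```

Because the local rate normalizes away the gradient magnitude, the global learning rate can be scaled up with the batch size without individual layers diverging.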

• ### Restricted Optimism via Posterior Sampling

Speaker:
Snir Cohen, M.Sc. Thesis Seminar
Date:
Wednesday, 25.10.2017, 14:00
Place:
Taub 601
Advisor:
Prof. S. Mannor

Optimistic methods for solving Reinforcement Learning problems are very popular in the literature. In practice, however, these methods show inferior performance compared to other methods, such as Posterior Sampling. We propose a novel concept of Restricted Optimism to balance the well known exploration vs. exploitation trade-off for finite-horizon MDPs. We harness Posterior Sampling to construct two algorithms in the spirit of our Restricted Optimism principle. We provide theoretical guarantees for them and demonstrate through experiments that there exists a trade-off between the average cumulative regret suffered by the agent and the variance. The agent can influence this trade-off by tuning the level of optimism carried out by our proposed algorithms through a regularization parameter.
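For readers unfamiliar with the baseline the abstract builds on, here is plain posterior sampling (Thompson sampling) for a Bernoulli multi-armed bandit, the simplest setting in which it beats optimistic methods in practice. The arm probabilities and horizon are made up for illustration; this is the unmodified baseline, not the Restricted Optimism algorithms of the talk.

```python
import random

random.seed(1)
true_p = [0.2, 0.5, 0.8]  # hypothetical arm reward probabilities
wins = [1, 1, 1]          # Beta(1, 1) prior for each arm
losses = [1, 1, 1]
pulls = [0, 0, 0]

for t in range(2000):
    # Sample a mean reward for each arm from its Beta posterior,
    # then act greedily with respect to the sample.
    samples = [random.betavariate(wins[a], losses[a]) for a in range(3)]
    arm = samples.index(max(samples))
    reward = 1 if random.random() < true_p[arm] else 0
    wins[arm] += reward
    losses[arm] += 1 - reward
    pulls[arm] += 1

# The best arm (index 2) should receive most of the pulls.
print(pulls)
```

Exploration here comes entirely from posterior randomness; the talk's contribution is to inject a tunable amount of optimism on top of such sampling, trading average regret against its variance.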

• ### Complexity-Theoretic Foundations of Quantum Supremacy Experiments

Speaker:
Scott Aaronson - COLLOQUIUM LECTURE
Date:
Tuesday, 7.11.2017, 14:30
Place:
Room 337 Taub Bld.
Affiliation:
University of Texas, Austin, Computer Science
Host:
Yuval Filmus

In the near future, there will likely be special-purpose quantum computers with 50 or so high-quality qubits. In this talk, I'll discuss general theoretical foundations for how to use such devices to demonstrate "quantum supremacy": that is, a clear quantum speedup for *some* task, motivated by the goal of overturning the Extended Church-Turing Thesis (which says that all physical systems can be efficiently simulated by classical computers) as confidently as possible. Based on recent joint work with Lijie Chen, https://arxiv.org/abs/1612.05903

Short Bio: Scott Aaronson is David J. Bruton Centennial Professor of Computer Science at the University of Texas at Austin. He received his bachelor's from Cornell University and his PhD from UC Berkeley, and did postdoctoral fellowships at the Institute for Advanced Study as well as the University of Waterloo. Before coming to UT Austin, he spent nine years as a professor in Electrical Engineering and Computer Science at MIT. Aaronson's research in theoretical computer science has focused mainly on the capabilities and limits of quantum computers. His first book, Quantum Computing Since Democritus, was published in 2013 by Cambridge University Press. He has received the National Science Foundation's Alan T. Waterman Award, the United States PECASE Award, the Vannevar Bush Fellowship, the Simons Investigator award, and MIT's Junior Bose Award for Excellence in Teaching.

Refreshments will be served from 14:15. Lecture starts at 14:30.

• ### Opening the Black Box of Deep Neural Networks via Information

Speaker:
Naftali Tishby - COLLOQUIUM LECTURE
Date:
Tuesday, 21.11.2017, 14:30
Place:
Room 337 Taub Bld.
Affiliation:
Hebrew University
Host:
Yuval Filmus

• ### Faster and Simpler Distributed CONGEST-Algorithms for Testing and Correcting Graph Properties

Speaker:
Guy Even - COLLOQUIUM LECTURE
Date:
Tuesday, 5.12.2017, 14:30
Place:
Room 337 Taub Bld.
Affiliation:
Dept. of Electrical Engineering-Systems, Tel-Aviv University
Host:
Yuval Filmus

We consider the following problem, introduced by [Censor-Hillel et al., DISC 2016]: design a distributed algorithm (called an $\epsilon$-tester) that tests whether the network over which the algorithm is running satisfies a given property (e.g., acyclicity, bipartiteness) or is $\epsilon$-far from satisfying the property. If the network satisfies the property, then all processors must accept. If the network is $\epsilon$-far from satisfying the property, then (with probability at least $2/3$) at least one processor must reject. Being $\epsilon$-far from a property means that at least $\epsilon\cdot |E|$ edges need to be deleted or inserted to satisfy the property.

Suppose we have an $\epsilon$-tester that runs in $O(Diameter)$ rounds. We show how to transform this tester into an $\epsilon$-tester that runs in $O((\log |V|)/\epsilon)$ rounds. Since cycle-freeness and bipartiteness are easily tested in $O(Diameter)$ rounds, we obtain $\epsilon$-testers for these properties with a logarithmic number of rounds. Moreover, for cycle-freeness, we obtain a \emph{corrector} that locally corrects the graph so that the corrected graph is acyclic. The corrector deletes at most $\epsilon \cdot |E| + distance(G,P)$ edges and requires $O((\log |V|)/\epsilon)$ rounds.

Joint work with Reut Levy and Moti Medina.

Refreshments will be served from 14:15. Lecture starts at 14:30.
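The $\epsilon$-far notion is concrete for cycle-freeness: the minimum number of edges to delete to make a graph acyclic is exactly $|E| - (|V| - c)$, where $c$ is the number of connected components. The centralized sketch below checks this exactly with a union-find; it illustrates only the distance measure, not the distributed $O((\log |V|)/\epsilon)$-round CONGEST tester of the talk.

```python
def find(parent, x):
    # Union-find root lookup with path halving.
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def is_eps_far_from_acyclic(n, edges, eps):
    """A graph is eps-far from cycle-freeness iff at least eps*|E|
    edges must be deleted, i.e. |E| - (n - #components) >= eps*|E|."""
    parent = list(range(n))
    components = n
    for u, v in edges:
        ru, rv = find(parent, u), find(parent, v)
        if ru != rv:
            parent[ru] = rv
            components -= 1
    excess = len(edges) - (n - components)  # edges that must be deleted
    return excess >= eps * len(edges)

# A 6-cycle: exactly one edge must go, so it is 1/6-far but not 1/2-far.
cycle6 = [(i, (i + 1) % 6) for i in range(6)]
print(is_eps_far_from_acyclic(6, cycle6, 1/6))  # True
print(is_eps_far_from_acyclic(6, cycle6, 1/2))  # False
```

The tester's job is to make exactly this distinction, but with each processor seeing only its local neighborhood each round.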

• ### TBA

Speaker:
Amnon Shashua - COLLOQUIUM LECTURE
Date:
Tuesday, 2.1.2018, 14:30
Place:
Room 337 Taub Bld.
Affiliation:
Hebrew University; CEO & CTO, Mobileye; Senior Vice President, Intel Corporation
Host:
Yuval Filmus

• ### Bridging the Gap between End-users and Knowledge Sources: Discovery, Selection and Utilization

Speaker:
Yael Amsterdamer - COLLOQUIUM LECTURE
Date:
Tuesday, 9.1.2018, 14:30
Place:
Room 337 Taub Bld.
Affiliation:
Department of Computer Science, Bar Ilan University
Host:
Yuval Filmus

TBA

• ### Personalization is a Two-Way Street

Speaker:
Ronny Lempel - COLLOQUIUM LECTURE
Date:
Tuesday, 16.1.2018, 14:30
Place:
Room 337 Taub Bld.
Affiliation:
Outbrain
Host:
Yuval Filmus

Recommender systems are first and foremost about matching users with items the systems believe will delight them. The "main street" of personalization is thus about modeling users and items, and matching per user the items predicted to best satisfy the user. This holds for both collaborative filtering and content-based methods. In content discovery engines, difficulties arise from the fact that the content users natively consume on publisher sites does not necessarily match the sponsored content that drives the monetization and sustains those engines. The first part of this talk addresses this gap by sharing lessons learned and by discussing how the gap may be bridged at scale with proper techniques.

The second part of the talk focuses on personalization of audiences on behalf of content marketing campaigns. From the marketers' side, optimizing audiences was traditionally done by refining targeting criteria, basically limiting the set of users to be exposed to their campaigns. Marketers then began sharing conversion data with systems, and the systems began optimizing campaign conversions by serving the campaign to users likely to transact with the marketer. Today, a hybrid approach of lookalike modeling combines marketers' targeting criteria with recommendation systems' models to personalize audiences for campaigns, with marketer ROI as the target. This talk was given as a keynote in ACM RecSys 2017.

Short bio: Ronny Lempel joined Outbrain in May 2014 as VP of Outbrain's Recommendations Group, where he oversees the computation, delivery and auction mechanisms of the company's recommendations. Prior to joining Outbrain, Ronny spent 6.5 years as a Senior Director at Yahoo Labs. Ronny joined Yahoo in October 2007 to open and establish its Research Lab in Haifa, Israel. During his tenure at Yahoo, Ronny led R&D activities in diverse areas, including Web Search, Web Page Optimization, Recommender Systems and Ad Targeting. In January 2013 Ronny was appointed Yahoo Labs' Chief Data Scientist in addition to his managerial duties. Prior to joining Yahoo, Ronny spent 4.5 years at IBM Research, where his duties included research and development in the area of enterprise search systems. During his tenure at IBM, Ronny managed the Information Retrieval Group at IBM's Haifa Research Lab for two years. Ronny received his PhD, which focused on search engine technology, from the Faculty of Computer Science at Technion, Israel Institute of Technology in early 2003. Ronny has authored over 40 research papers in leading conferences and journals, and holds 18 granted US patents. He regularly serves on program and organization committees of Web-focused conferences, and has taught advanced courses on Search Engine Technologies and Big Data Technologies at Technion.

Refreshments will be served from 14:15. Lecture starts at 14:30.