Colloquia and Seminars

To join the email distribution list of the CS colloquia, please visit the list subscription page.


The Computer Science events calendar is available in HTTP and ICS formats, for Google Calendar and for Outlook.

Academic Calendar at the Technion site.

Upcoming Colloquia & Seminars

  • Interplays between Machine Learning and Optimization

    Speaker:
    Tomer Koren - CS-Lecture
    Date:
    Monday, 18.12.2017, 10:30
    Place:
    Room 601 Taub Bld.
    Affiliation:
    Google, Mountain View

    Over the past two decades, machine learning has rapidly evolved and emerged as a highly influential discipline of computer science and engineering. One of the pillars of machine learning is mathematical optimization, and the connection between the two fields has been a primary focus of research. In this talk, I will present two recent works that contribute to this study, focusing on online learning---a central model in machine learning for sequential decision making and learning under uncertainty. In the first part of the talk, I will describe a foundational result on the power of optimization in online learning, and give an answer to the question: does there exist a generic and efficient reduction from online learning to black-box optimization? In the second part, I will discuss a recent work that employs online learning techniques to design a new efficient and adaptive preconditioned algorithm for large-scale optimization. Despite employing preconditioning, the algorithm is practical even in modern optimization scenarios such as those arising in training state-of-the-art deep neural networks. I will present the new algorithm along with its theoretical guarantees and demonstrate its performance empirically.

    Short Bio:
    Tomer Koren is a Research Scientist at Google, Mountain View. His research focuses on machine learning and optimization, with an emphasis on online and statistical learning, sequential decision making, and stochastic optimization. Tomer joined Google in 2016 after receiving his Ph.D. from the Technion---Israel Institute of Technology, under the guidance of Prof. Elad Hazan. During his doctoral studies, he was also a research intern with Microsoft Research Herzliya, Microsoft Research Redmond, and Yahoo Research Labs in Haifa.
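
    For readers unfamiliar with the adaptive-preconditioning theme of the second part, the sketch below shows the classic diagonal (AdaGrad-style) preconditioned online gradient update, in which each coordinate's step size adapts to its accumulated gradients. This is a minimal, generic Python illustration of a well-known method, not the algorithm presented in the talk; the toy least-squares stream and all parameter values are invented for illustration.

        import numpy as np

        def adagrad_step(w, grad, accum, lr=0.1, eps=1e-8):
            """One diagonal-preconditioned update: scale each coordinate of the
            gradient by the inverse square root of its accumulated squared gradients."""
            accum += grad ** 2                       # running sum of squared gradients
            w -= lr * grad / (np.sqrt(accum) + eps)  # per-coordinate adaptive step size
            return w, accum

        # toy usage: online least-squares on a stream of (x, y) examples
        rng = np.random.default_rng(0)
        w_true = rng.normal(size=5)
        w, accum = np.zeros(5), np.zeros(5)
        for _ in range(1000):
            x = rng.normal(size=5)
            y = x @ w_true
            grad = (x @ w - y) * x                   # gradient of the squared loss
            w, accum = adagrad_step(w, grad, accum)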

  • Faster and Simpler Distributed CONGEST-Algorithms for Testing and Correcting Graph Properties

    Speaker:
    Guy Even - COLLOQUIUM LECTURE
    Date:
    Tuesday, 19.12.2017, 14:30
    Place:
    Room 337 Taub Bld.
    Affiliation:
    Dept. of Electrical Engineering-Systems, Tel-Aviv University
    Host:
    Yuval Filmus

    We consider the following problem, introduced by [Censor-Hillel et al., DISC 2016]. Design a distributed algorithm (called an $\epsilon$-tester) that tests whether the network over which the algorithm is running satisfies a given property (e.g., acyclic, bipartite) or is $\epsilon$-far from satisfying the property. If the network satisfies the property, then all processors must accept. If the network is $\epsilon$-far from satisfying the property, then (with probability at least $2/3$) at least one processor must reject. Being $\epsilon$-far from a property means that at least $\epsilon\cdot |E|$ edges need to be deleted or inserted to satisfy the property. Suppose we have an $\epsilon$-tester that runs in $O(Diameter)$ rounds. We show how to transform this tester into an $\epsilon$-tester that runs in $O((\log |V|)/\epsilon)$ rounds. Since cycle-freeness and bipartiteness are easily tested in $O(Diameter)$ rounds, we obtain $\epsilon$-testers for these properties with a logarithmic number of rounds. Moreover, for cycle-freeness, we obtain a \emph{corrector} that locally corrects the graph so that the corrected graph is acyclic. The corrector deletes at most $\epsilon \cdot |E| + distance(G,P)$ edges and requires $O((\log |V|)/\epsilon)$ rounds.

    Joint work with Reut Levy and Moti Medina.

    Short Bio:
    Guy Even received his B.Sc. degree in 1988 in Mathematics and Computer Science from the Hebrew University, and his M.Sc. and D.Sc. in Computer Science from the Technion in 1991 and 1994, respectively. He spent his post-doctorate at the University of the Saarland during 1995–1997 with Prof. Wolfgang Paul. Since 1997, he has been a faculty member in the School of Electrical Engineering at Tel-Aviv University. Dr. Even is interested in algorithms and their applications in various fields. He has published papers on the theory of VLSI, approximation algorithms, computer arithmetic, online algorithms, frequency assignment, scheduling, packet routing, linear-programming decoding of LDPC codes, and rateless codes. He is on the editorial board of "Theory of Computing Systems". Together with Moti Medina, Dr. Even wrote a digital hardware textbook titled "Digital Logic Design: A Rigorous Approach", published in 2012 by Cambridge University Press.

    Refreshments will be served from 14:15; the lecture starts at 14:30.
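
    For intuition about the $\epsilon$-farness notion in the abstract above, here is a minimal, centralized (non-distributed) Python sketch that checks whether an undirected graph is $\epsilon$-far from cycle-freeness. It uses the fact that the minimum number of edge deletions that make a graph acyclic is $|E| - (|V| - \#components)$. This only illustrates the definition; it is not the CONGEST-model tester from the talk.

        def eps_far_from_acyclic(n, edges, eps):
            """Centralized check of the epsilon-farness notion for cycle-freeness:
            the minimum number of edge deletions that make an undirected graph
            acyclic is |E| - (|V| - #components); compare it with eps * |E|."""
            parent = list(range(n))

            def find(u):
                while parent[u] != u:
                    parent[u] = parent[parent[u]]   # path halving
                    u = parent[u]
                return u

            components = n
            for u, v in edges:
                ru, rv = find(u), find(v)
                if ru != rv:
                    parent[ru] = rv
                    components -= 1

            deletions_needed = len(edges) - (n - components)
            return deletions_needed >= eps * len(edges)

        # example: a 4-cycle is one deletion away from acyclic, i.e. (1/4)-far
        print(eps_far_from_acyclic(4, [(0, 1), (1, 2), (2, 3), (3, 0)], eps=0.25))  # True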

  • Overcoming Intractability in Learning

    Speaker:
    Roi Livni - CS-Lecture
    Date:
    Thursday, 21.12.2017, 10:30
    Place:
    Room 601 Taub Bld.
    Affiliation:
    Princeton University, CS Dept.

    Machine learning has recently been revolutionized by the introduction of deep neural networks. However, from a theoretical viewpoint these methods are still poorly understood. Indeed, the key challenge in machine learning today is to derive rigorous results for optimization and generalization in deep learning. In this talk I will present several tractable approaches to training neural networks. In the second part I will discuss a new sequential algorithm for decision making that can take the structure of the action space into account and is better tuned to realistic decision-making scenarios.

    I will present our work that provides some of the first positive results and yields new, provably efficient, and practical algorithms for training certain types of neural networks. In a second work I will present a new online algorithm that learns by sequentially sampling random networks and asymptotically converges, in performance, to the optimal network. Our approach improves on previous random-features-based learning in terms of sample/computational complexity and expressiveness. In a more recent work we take a different perspective on this problem: I will provide sufficient conditions that guarantee tractable learning, using the notion of refutation complexity. I will then discuss how this new idea can lead to new, interesting generalization bounds that can potentially explain generalization in settings that are not always captured by classical theory.

    In the setting of reinforcement learning, I will present a recently developed algorithm for decision making in a metrical action space. As an application, we consider a dynamic pricing problem in which a seller is faced with a stream of patient buyers, each of whom buys at the lowest price offered within a certain time window. We use our algorithm to achieve an optimal regret, improving on the previously known regret bound.

    Short Bio:
    Roi Livni is a research instructor at the Computer Science Department at Princeton University. He is a recipient of the Eric and Wendy Schmidt fellowship for strategic innovation, and a Yad Hanadiv postdoctoral fellow. He completed his Ph.D. at the Hebrew University, during which he was a recipient of a Google Europe Fellowship in learning theory. He also served, during his graduate studies, as a long-term intern at Microsoft Research. His graduate work won a best paper award at ICML'13 and a best student paper award at COLT'13.
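
    As background for the random-features discussion above, the following is a minimal Python sketch of the standard random-features approach: sample a fixed random hidden layer (here with ReLU activations) and fit only the linear output layer by ridge regression. The target function, sizes, and regularization are arbitrary illustrative choices; this is the classical baseline that the talk's results improve upon, not the new algorithm itself.

        import numpy as np

        def random_relu_features(X, n_features, rng):
            """Map inputs through a fixed, randomly sampled hidden layer (ReLU);
            only the linear output layer on top of these features is learned."""
            W = rng.normal(size=(X.shape[1], n_features)) / np.sqrt(X.shape[1])
            return np.maximum(X @ W, 0.0), W

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 10))
        y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)   # toy nonlinear target

        H, W = random_relu_features(X, n_features=200, rng=rng)
        lam = 1e-2                                          # ridge regularization
        alpha = np.linalg.solve(H.T @ H + lam * np.eye(H.shape[1]), H.T @ y)
        pred = H @ alpha
        print("train MSE:", np.mean((pred - y) ** 2))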

  • On the Expressive Power of ConvNets and RNNs as a Function of their Architecture

    Speaker:
    Amnon Shashua - COLLOQUIUM LECTURE
    Date:
    Tuesday, 2.1.2018, 14:30
    Place:
    Room 337 Taub Bld.
    Affiliation:
    Hebrew University; CEO & CTO, Mobileye; Senior Vice President, Intel Corporation
    Host:
    Yuval Filmus
  • Bridging the Gap between End-users and Knowledge Sources: Discovery, Selection and Utilization

    Speaker:
    Yael Amsterdamer - COLLOQUIUM LECTURE
    Date:
    Tuesday, 9.1.2018, 14:30
    Place:
    Room 337 Taub Bld.
    Affiliation:
    Department of Computer Science, Bar Ilan University
    Host:
    Yuval Filmus

    TBA

  • Personalization is a Two-Way Street

    Speaker:
    Ronny Lempel - COLLOQUIUM LECTURE
    Date:
    Tuesday, 16.1.2018, 14:30
    Place:
    Room 337 Taub Bld.
    Affiliation:
    Outbrain
    Host:
    Yuval Filmus

    Recommender systems are first and foremost about matching users with items the systems believe will delight them. The "main street" of personalization is thus about modeling users and items, and matching per user the items predicted to best satisfy the user. This holds for both collaborative filtering and content-based methods. In content discovery engines, difficulties arise from the fact that the content users natively consume on publisher sites does not necessarily match the sponsored content that drives the monetization and sustains those engines. The first part of this talk addresses this gap by sharing lessons learned and by discussing how the gap may be bridged at scale with proper techniques.

    The second part of the talk focuses on personalization of audiences on behalf of content marketing campaigns. From the marketers' side, optimizing audiences was traditionally done by refining targeting criteria, basically limiting the set of users to be exposed to their campaigns. Marketers then began sharing conversion data with systems, and the systems began optimizing campaign conversions by serving the campaign to users likely to transact with the marketer. Today, a hybrid approach of lookalike modeling combines marketers' targeting criteria with recommendation systems' models to personalize audiences for campaigns, with marketer ROI as the target. This talk was given as a keynote at ACM RecSys 2017.

    Short Bio:
    Ronny Lempel joined Outbrain in May 2014 as VP of Outbrain's Recommendations Group, where he oversees the computation, delivery and auction mechanisms of the company's recommendations. Prior to joining Outbrain, Ronny spent 6.5 years as a Senior Director at Yahoo Labs. Ronny joined Yahoo in October 2007 to open and establish its Research Lab in Haifa, Israel. During his tenure at Yahoo, Ronny led R&D activities in diverse areas, including Web Search, Web Page Optimization, Recommender Systems and Ad Targeting. In January 2013 Ronny was appointed Yahoo Labs' Chief Data Scientist in addition to his managerial duties. Prior to joining Yahoo, Ronny spent 4.5 years at IBM Research, where his duties included research and development in the area of enterprise search systems. During his tenure at IBM, Ronny managed the Information Retrieval Group at IBM's Haifa Research Lab for two years. Ronny received his PhD, which focused on search engine technology, from the Faculty of Computer Science at the Technion, Israel Institute of Technology in early 2003. Ronny has authored over 40 research papers in leading conferences and journals, and holds 18 granted US patents. He regularly serves on program and organization committees of Web-focused conferences, and has taught advanced courses on Search Engine Technologies and Big Data Technologies at the Technion.

    Refreshments will be served from 14:15; the lecture starts at 14:30.
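
    To make the lookalike-modeling idea from the abstract above concrete, here is a toy Python sketch that ranks eligible users (those matching a marketer's targeting criteria) by similarity to the centroid of a seed audience, e.g. past converters for a campaign. The embedding dimensions, seed size, and eligibility mask are all made up for illustration; production audience-optimization systems are, of course, far more involved than this sketch.

        import numpy as np

        def lookalike_scores(user_vectors, seed_indices, eligible_mask):
            """Toy lookalike scoring: rank eligible users by cosine similarity to
            the centroid of a seed audience (e.g. past converters)."""
            centroid = user_vectors[seed_indices].mean(axis=0)
            centroid /= np.linalg.norm(centroid) + 1e-12
            norms = np.linalg.norm(user_vectors, axis=1) + 1e-12
            scores = (user_vectors @ centroid) / norms
            scores[~eligible_mask] = -np.inf    # honor the marketer's targeting criteria
            return scores

        rng = np.random.default_rng(0)
        users = rng.normal(size=(1000, 16))      # hypothetical user embedding vectors
        seed = rng.choice(1000, size=20, replace=False)
        eligible = rng.random(1000) < 0.5        # users matching the targeting criteria
        top_audience = np.argsort(-lookalike_scores(users, seed, eligible))[:100]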

  • Network Measurement meets Virtual Switching

    Speaker:
    Ran Ben-Basat, Ph.D. Thesis Seminar
    Date:
    Wednesday, 14.2.2018, 14:30
    Place:
    Taub 601
    Advisor:
    Prof. Roy Friedman

    In modern cloud infrastructures, each physical server often runs multiple virtual machines and employs a software Virtual Switch (VS) to handle their traffic. In addition to switching, the VS performs network measurements, such as identifying the most frequent flows, which are essential for networking applications such as load balancing and intrusion detection. Unlike traditional streaming settings, where algorithms are designed to minimize space requirements, in virtual-switching measurement the bottleneck is CPU utilization. In this talk, I will present new hardware-oriented algorithms and acceleration methods that optimize the update time in software, at the cost of a slight memory overhead.
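
    As a point of reference for the frequent-flow ("heavy hitter") measurement task mentioned above, here is a minimal Python sketch of the classic Space-Saving algorithm, which maintains a fixed number of counters and performs an update per packet. It illustrates the kind of per-packet update whose CPU cost the thesis aims to reduce; it is not the new algorithm presented in the talk.

        def space_saving(stream, k):
            """Classic Space-Saving sketch for frequent items: keep at most k
            counters; when a new item arrives and the table is full, replace
            the minimum counter and inherit its count (an overestimate)."""
            counters = {}
            for flow in stream:
                if flow in counters:
                    counters[flow] += 1
                elif len(counters) < k:
                    counters[flow] = 1
                else:
                    victim = min(counters, key=counters.get)
                    counters[flow] = counters.pop(victim) + 1
            return counters

        packets = ["a", "b", "a", "c", "a", "d", "a", "b", "e", "a"]
        print(space_saving(packets, k=3))   # "a" dominates the estimated counts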