Colloquia and Seminars

To join the email distribution list of the CS colloquia, please visit the list subscription page.


Computer Science events calendar, available in HTTP and ICS formats for Google calendars and for Outlook.

Academic Calendar at the Technion site.

Upcoming Colloquia & Seminars

  • Pixel Club: From Voxels to Pixels and Back: Self-Supervision in Natural-Image Reconstruction from fMRI

    Speaker:
    Guy Gaziv (Weizmann Institute of Science)
    Date:
    Tuesday, 28.1.2020, 11:30
    Place:
    Electrical Eng. Building 1061

    Reconstructing observed images from fMRI brain recordings is challenging. Unfortunately, acquiring sufficient "labeled" pairs of {Image, fMRI} (i.e., images with their corresponding fMRI responses) to span the huge space of natural images is prohibitive for many reasons. We present a novel approach which, in addition to the scarce labeled data (training pairs), allows training fMRI-to-image reconstruction networks on "unlabeled" data as well (i.e., images without fMRI recordings, and fMRI recordings without images). The proposed model utilizes both an Encoder network (image-to-fMRI) and a Decoder network (fMRI-to-image). Concatenating these two networks back-to-back (Encoder-Decoder & Decoder-Encoder) allows augmenting the training with both types of unlabeled data. Importantly, it allows training on the unlabeled test-fMRI data. This self-supervision adapts the reconstruction network to the new test data, despite their deviation from the statistics of the scarce training data.
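
    To make the Encoder-Decoder / Decoder-Encoder training concrete, here is a minimal PyTorch-style sketch of the loss terms. All dimensions, architectures and names below are illustrative placeholders of ours, not the networks from the talk.

        import torch
        import torch.nn as nn

        # Illustrative dimensions only; the real networks are convolutional
        # and operate on natural images and voxel recordings.
        IMG_DIM, FMRI_DIM = 4096, 2000

        # Encoder: image -> predicted fMRI response; Decoder: fMRI -> image.
        E = nn.Sequential(nn.Linear(IMG_DIM, 512), nn.ReLU(), nn.Linear(512, FMRI_DIM))
        D = nn.Sequential(nn.Linear(FMRI_DIM, 512), nn.ReLU(), nn.Linear(512, IMG_DIM))
        mse = nn.MSELoss()

        def total_loss(img_lab, fmri_lab, img_unlab, fmri_unlab):
            # Supervised terms on the scarce labeled {Image, fMRI} pairs.
            supervised = mse(E(img_lab), fmri_lab) + mse(D(fmri_lab), img_lab)
            # Encoder-Decoder cycle on unlabeled images: D(E(img)) ~ img.
            img_cycle = mse(D(E(img_unlab)), img_unlab)
            # Decoder-Encoder cycle on unlabeled fMRI (e.g., the test fMRI):
            # E(D(fmri)) ~ fmri, adapting the Decoder to the test statistics.
            fmri_cycle = mse(E(D(fmri_unlab)), fmri_unlab)
            return supervised + img_cycle + fmri_cycle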

    Short bio:
    Guy is a PhD student in the Michal Irani Computer Vision Lab at WIS. His research established a new line of work in his group on reconstructing presented visual stimuli from evoked brain activity using Deep Learning. Guy holds a BSc in Computer Engineering & Physics (Hebrew U.) and an MSc in Physics & Biology (WIS, advised by Prof. Uri Alon), and is now pursuing his PhD in CS & Neuroscience (WIS).

  • Theory Seminar: Strong Average-Case Circuit Lower Bounds from Non-trivial Derandomization

    Speaker:
    Lijie Chen (MIT)
    Date:
    Wednesday, 29.1.2020, 12:30
    Place:
    Taub 201 Taub Bld.

    We prove that for all constants a, NQP = NTIME[2^{polylog(n)}] cannot be (1/2 + 2^{-log^a n})-approximated by 2^{log^a n}-size ACC^0 circuits. Previously, it was even open whether E^NP can be (1/2 + 1/sqrt(n))-approximated by AC^0[2] circuits. As a straightforward application, we obtain an infinitely often non-deterministic pseudorandom generator for poly-size ACC^0 circuits with seed length 2^{log^eps n}, for all eps > 0.
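
    For readers outside the area, a reminder of the standard notation used above (our clarification, not part of the announcement): a circuit C is said to (1/2 + eps)-approximate a Boolean function f : {0,1}^n -> {0,1} if

        \Pr_{x \sim \{0,1\}^n} \big[ C(x) = f(x) \big] \;\ge\; \frac{1}{2} + \varepsilon,

    i.e., if it beats random guessing by advantage eps. The theorem above thus rules out even the tiny advantage eps = 2^{-log^a n} for ACC^0 circuits of size 2^{log^a n}.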

    More generally, we establish a connection showing that, for a typical circuit class C, non-trivial nondeterministic CAPP algorithms imply strong (1/2 + 1/n^{omega(1)}) average-case lower bounds for nondeterministic time classes against C circuits. The existence of such (deterministic) algorithms is a much weaker assumption than the widely believed conjecture PromiseBPP = PromiseP.

    Our new results build on a line of recent works, including [Murray and Williams, STOC 2018], [Chen and Williams, CCC 2019], and [Chen, FOCS 2019]. In particular, they strengthen the corresponding (1/2 + 1/polylog(n))-inapproximability average-case lower bounds of [Chen, FOCS 2019]. The two important technical ingredients are techniques from Cryptography in NC^0 [Applebaum et al., SICOMP 2006], and Probabilistically Checkable Proofs of Proximity with NC^1-computable proofs.

    This is joint work with Hanlin Ren from Tsinghua University.

  • Pixel Club: Learning-Based Strong Solutions to Forward and Inverse Problems in Partial Differential Equations

    Speaker:
    Leah Bar (Tel Aviv University)
    Date:
    Tuesday, 18.2.2020, 11:30
    Place:
    Electrical Eng. Building 1061

    We introduce a novel neural network-based solver for forward and inverse problems in partial differential equations. The solver is grid-free, mesh-free and shape-free, and the solution is approximated by a neural network.

    We employ an unsupervised approach in which the input to the network is a set of points in an arbitrary domain, and the output is the set of corresponding function values. The network is trained to minimize deviations of the learned function from the PDE solution and to satisfy the boundary conditions.

    The resulting solution, in turn, is an explicit, smooth, differentiable function with a known analytical form.

    Unlike other numerical methods such as finite differences and finite elements, the derivatives of the learned function can be calculated analytically to any order. This framework therefore enables the solution of high-order non-linear PDEs. The proposed algorithm is a unified formulation of both forward and inverse problems, where the optimized loss function consists of a few elements: fidelity terms in the L2 and L-infinity norms, boundary and initial condition constraints, and additional regularizers. This setting is flexible in the sense that regularizers can be tailored to specific problems. We demonstrate our method on several free-shape 2D second-order systems, with application to Electrical Impedance Tomography (EIT), diffusion and wave equations.
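
    As an illustration of the general recipe (minimize a PDE-residual fidelity term plus boundary-condition penalties over arbitrary sample points), here is a minimal PyTorch sketch for a toy 1D problem. The equation, architecture and hyperparameters are our own illustrative choices, not those from the talk.

        import torch
        import torch.nn as nn

        # Toy problem (our choice): solve u''(x) = -sin(x) on [0, 1] with
        # u(0) = 0 and u(1) = sin(1); the exact solution is u(x) = sin(x).
        net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(),
                            nn.Linear(32, 32), nn.Tanh(), nn.Linear(32, 1))
        opt = torch.optim.Adam(net.parameters(), lr=1e-3)

        for step in range(5000):
            # Arbitrary sample points: no grid or mesh is ever constructed.
            x = torch.rand(128, 1, requires_grad=True)
            u = net(x)
            # Derivatives of the learned function are exact via autograd
            # and can be taken to any order.
            u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
            u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
            fidelity = ((u_xx + torch.sin(x)) ** 2).mean()  # L2 PDE residual
            x0, x1 = torch.zeros(1, 1), torch.ones(1, 1)
            boundary = (net(x0) ** 2 + (net(x1) - torch.sin(x1)) ** 2).mean()
            loss = fidelity + boundary        # + problem-specific regularizers
            opt.zero_grad()
            loss.backward()
            opt.step()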

    Short bio:
    Leah Bar holds a B.Sc. in Physics, an M.Sc. in Bio-Medical Engineering and a PhD in Electrical Engineering from Tel-Aviv University. She worked as a post-doctoral fellow in the Department of Electrical Engineering at the University of Minnesota. She is currently a senior researcher at MaxQ-AI, a medical AI start-up, and in addition a researcher at the Mathematics Department in Tel-Aviv University. Her research interests are machine learning, image processing, computer vision and variational methods.

  • Hypernetworks and a New Feedback Model

    Speaker:
    Lior Wolf - COLLOQUIUM LECTURE
    Date:
    Tuesday, 31.3.2020, 14:30
    Place:
    Room 337 Taub Bld.
    Affiliation:
    School of Computer Science, Tel-Aviv University
    Host:
    Yuval Filmus

    Hypernetworks, also known as dynamic networks, are neural networks in which the weights of at least some of the layers vary dynamically based on the input. Such networks have composite architectures in which one network predicts the weights of another network (see the illustrative sketch below). I will briefly describe the early days of dynamic layers and present recent results from diverse domains: 3D reconstruction from a single image, image retouching, electrical circuit design, decoding block codes, graph hypernetworks for bioinformatics, and action recognition in video. Finally, I will present a new hypernetwork-based model for the role of feedback in neural computations.

    Short bio:
    Lior Wolf is a faculty member at the School of Computer Science at Tel Aviv University and a research scientist at Facebook AI Research. Before, he was a postdoc working with Prof. Poggio at CBCL, MIT. He received his PhD working with Prof. Shashua at the Hebrew U, Jerusalem.

    Refreshments will be served from 14:15. Lecture starts at 14:30.
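
    As a toy illustration of one network predicting another network's weights, here is a minimal PyTorch sketch. The sizes and names are hypothetical, and the conditioning vector stands in for whatever signal the dynamic weights should depend on.

        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        IN, OUT, CTX = 16, 4, 8   # illustrative sizes only

        class HyperLinear(nn.Module):
            """A linear layer whose weights are predicted by another network."""
            def __init__(self):
                super().__init__()
                # The hypernetwork maps a conditioning vector to the full
                # set of weights and biases of the target linear layer.
                self.hyper = nn.Linear(CTX, OUT * IN + OUT)

            def forward(self, x, ctx):
                params = self.hyper(ctx).squeeze(0)  # dynamically predicted parameters
                W = params[: OUT * IN].view(OUT, IN)
                b = params[OUT * IN:]
                return F.linear(x, W, b)             # target layer, input-dependent weights

        layer = HyperLinear()
        x, ctx = torch.randn(1, IN), torch.randn(1, CTX)  # ctx may itself depend on x
        y = layer(x, ctx)                                 # shape (1, OUT)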

  • Second-order Optimization for Machine Learning, Made Practical

    Speaker:
    Tomer Koren - COLLOQUIUM LECTURE
    Date:
    Tuesday, 5.5.2020, 14:30
    Place:
    Room 337 Taub Bld.
    Affiliation:
    School of Computer Science at Tel Aviv University
    Host:
    Yuval Filmus

    Optimization in machine learning, both theoretical and applied, is presently dominated by first-order gradient methods such as stochastic gradient descent. Second-order optimization methods, which involve second-order derivatives and/or second-order statistics of the data, have become far less prevalent, despite strong theoretical properties, due to their impractical computation, memory and communication costs. I will present some recent theoretical, algorithmic and infrastructural advances that allow for overcoming these challenges in using second-order methods and obtaining significant performance gains in practice, at very large scale, and on highly-parallel computing architectures (a toy illustration follows below).

    Short bio:
    Tomer Koren is an Assistant Professor in the School of Computer Science at Tel Aviv University since Fall 2019. Previously, he was a Senior Research Scientist at Google Brain, Mountain View. He received his PhD in December 2016 from the Technion - Israel Institute of Technology, where his advisor was Prof. Elad Hazan. His research interests are in machine learning and optimization.

    Refreshments will be served from 14:15. Lecture starts at 14:30.
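
    To illustrate what "second-order statistics of the data" can mean in this context, here is a schematic full-matrix-AdaGrad-style update on a toy quadratic objective. This is our own illustration of the general idea of preconditioning with accumulated gradient statistics, not the specific algorithm from the lecture.

        import numpy as np

        d, lr = 10, 0.5
        rng = np.random.default_rng(0)
        A = rng.standard_normal((d, d))
        A = A @ A.T + np.eye(d)          # objective f(x) = 0.5 * x^T A x
        x = rng.standard_normal(d)
        G = np.zeros((d, d))             # running sum of g g^T

        def inv_sqrt(M, floor=1e-6):
            # Matrix inverse square root via symmetric eigendecomposition.
            w, V = np.linalg.eigh(M)
            return V @ np.diag(1.0 / np.sqrt(np.maximum(w, floor))) @ V.T

        for t in range(200):
            g = A @ x                    # exact gradient of the quadratic
            G += np.outer(g, g)          # accumulate second-order statistics
            x -= lr * inv_sqrt(G) @ g    # preconditioned gradient step

        print(np.linalg.norm(x))         # x approaches the minimizer at 0

    The eigendecomposition in inv_sqrt is exactly the kind of cost (cubic in the dimension, with large memory footprint) that makes naive second-order methods impractical at scale, which is the challenge the talk addresses.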