We extend the Coupon Collector's Problem (CCP) and present a novel generalized model, referred to as the \ekecd problem, in which one is interested in recovering a bipartite graph with a perfect matching that represents the coupons and their matching labels. We present two further extensions of this variant: the heterogeneous sample size case (\efmecd) and the partial recovery case.
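For grounding, here is a minimal Monte Carlo sketch of the classical CCP that this model generalizes (plain Python with hypothetical parameters; the labeled-coupon extensions from the talk are not reproduced here):

```python
import random

def draws_to_collect_all(n, rng):
    """Classical CCP: draw uniformly from n coupon types until all are seen."""
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        draws += 1
    return draws

def harmonic(n):
    return sum(1.0 / k for k in range(1, n + 1))

n, trials = 50, 2000                      # hypothetical sizes
rng = random.Random(0)
avg = sum(draws_to_collect_all(n, rng) for _ in range(trials)) / trials
print(f"empirical mean draws : {avg:.1f}")
print(f"theoretical n * H_n  : {n * harmonic(n):.1f}")
```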
Taub 301
The software landscape is showing consistent, accelerated growth in the volume of code developed in dynamic languages. These languages are characterized by dynamic typing and laxer ahead-of-time checks, which allow rapid application development and shorter development cycles. The vast majority of these languages are interpreted; that is, programs are executed by an interpreter in a managed runtime environment. Such interpreters incur significant performance hits, and, to counter that, modern runtime environments usually employ some form of optimization. The most common is Just-in-Time (JIT) compilation, which translates source code on demand into native code that can run much faster. Some notable JIT engines (such as V8 for JavaScript) exhibit impressive speedups. Still, in most realistic scenarios, they cannot surpass the performance of hand-crafted native code written in a low-level language such as C.
There are inherent reasons why Ahead-of-Time (AOT) compilation is rarely practiced with dynamic languages. Since variables are dynamically typed, most type checking would still have to be done at runtime, which limits the range of optimizations that can be performed ahead of time and, consequently, the benefit of AOT compilation. We propose an approach that uses static analysis for sound type inference, which can then be leveraged to generate code requiring only a minimal amount of runtime type checks. Unlike previous work in this area, our approach eliminates the need for a JIT at runtime, or, indeed, any managed runtime at all. We claim that real-world JavaScript applications can, in fact, benefit considerably from AOT compilation in terms of speed, and we present empirical evidence on a set of representative benchmarks that supports this claim.
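To illustrate the underlying idea — if static analysis can prove a value's type ahead of time, the generated native code can skip dynamic dispatch — here is a toy sketch over a hypothetical mini-IR; it is not the authors' analysis or code generator:

```python
# Toy illustration: a tiny abstract interpreter that infers which variables in a
# straight-line program are provably numeric, so a hypothetical AOT backend
# could emit unchecked native arithmetic for them.

def infer_numeric_vars(program):
    """program: list of (target, op, args) tuples.
    Returns the set of variables proven to hold numbers."""
    numeric = set()
    for target, op, args in program:
        if op == "const_num":
            numeric.add(target)
        elif op == "add" and all(a in numeric for a in args):
            # both operands provably numbers -> '+' is plain numeric addition,
            # no runtime type dispatch (string concat, coercion, ...) needed
            numeric.add(target)
        else:
            # unknown or possibly non-numeric -> a runtime check remains
            numeric.discard(target)
    return numeric

prog = [
    ("x", "const_num", (3,)),
    ("y", "const_num", (4,)),
    ("z", "add", ("x", "y")),      # provably numeric: compile to a raw add
    ("w", "add", ("z", "input")),  # 'input' unknown: keep a runtime check
]
print(infer_numeric_vars(prog))    # {'x', 'y', 'z'} (set order may vary)
```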
Inverse problems in scientific imaging often seek physical characterization of heterogeneous scene materials. The scene is thus represented by physical quantities, such as the density and sizes of particles (microphysics) across a domain. Moreover, the forward image formation model is physical. An important case is that of clouds, where microphysics in three dimensions (3D) dictate the cloud dynamics, lifetime and albedo, with implications for Earth’s energy balance, sustainable energy and rainfall. Current methods, however, recover very degenerate representations of microphysics. To enable 3D volumetric recovery of all the required microphysical parameters, we introduce the neural microphysics field (NeMF). It is based on a deep neural network, whose input is multi-view polarization images. NeMF is pre-trained through supervised learning. Training relies on polarized radiative transfer, and noise modeling in polarization-sensitive sensors. The results offer unprecedented recovery, including droplet effective variance. We test NeMF in rigorous simulations and demonstrate it using real-world polarization-image data.
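As a schematic illustration only (not the NeMF architecture, its training data, or its radiative-transfer renderer), the supervised setup — multi-view polarization images in, per-voxel microphysical parameters out — can be sketched as below; all sizes, the synthetic data, and the scikit-learn regressor are stand-in assumptions:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor   # stand-in for a deep network

rng = np.random.default_rng(0)
n_scenes, n_features, n_voxels = 200, 9 * 16 * 16, 40     # hypothetical sizes

X = rng.normal(size=(n_scenes, n_features))               # "multi-view polarization images"
W = rng.normal(size=(n_features, 3 * n_voxels)) / np.sqrt(n_features)
Y = np.tanh(X @ W)        # synthetic per-voxel targets (e.g. r_eff, v_eff, LWC)

model = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=200, random_state=0)
model.fit(X[:160], Y[:160])                                # supervised "pretraining"
print("held-out R^2:", round(model.score(X[160:], Y[160:]), 3))
```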
M.Sc. student under the supervision of Prof. Yoav Schechner.
Min-Plus Weighted Finite Automata (WFAs) are a quantitative extension of Boolean automata whereby each word is assigned an integer, instead of being accepted or rejected.
Applications of WFAs span a wide spectrum, including verification, rewriting systems, tropical algebra, and speech and image processing, and they have been key to resolving the star-height problem.
Unlike Boolean automata, WFAs cannot always be determinized. The decidability of whether a WFA admits an equivalent deterministic WFA is a long-standing open problem.
We prove that this problem is decidable.
As part of the proof, we develop a new toolbox for reasoning about the run structure of weighted automata.
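For readers unfamiliar with the model, here is a minimal sketch of min-plus WFA semantics in the standard matrix presentation (the automaton below is a textbook example of a WFA with no deterministic equivalent, not one taken from the paper):

```python
# Min-plus (tropical) WFA: value(word) = init ⊗ M_{a1} ⊗ ... ⊗ M_{an} ⊗ final,
# where ⊗ is matrix product over the (min, +) semiring.
INF = float("inf")

def trop_vec_mat(v, M):
    """Tropical (min, +) product of a row vector with a matrix."""
    return [min(v[i] + M[i][j] for i in range(len(v))) for j in range(len(M[0]))]

def wfa_value(init, final, matrices, word):
    v = list(init)
    for letter in word:
        v = trop_vec_mat(v, matrices[letter])
    return min(v[i] + final[i] for i in range(len(v)))

# Hypothetical 2-state WFA over {a, b}: state 0 charges 1 per 'a', state 1
# charges 1 per 'b'; a run commits to one state, so the value of a word is
# min(#a, #b) -- a classic example that admits no deterministic equivalent.
init, final = [0, 0], [0, 0]
matrices = {
    "a": [[1, INF], [INF, 0]],
    "b": [[0, INF], [INF, 1]],
}
print(wfa_value(init, final, matrices, "aabab"))  # min(3, 2) = 2
```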
Taub 401
In Approval-Based Committee (ABC) voting, each voter lists the candidates they approve and then a voting rule aggregates the individual approvals into a committee that represents the collective choice of the voters.
An extensively studied class of such rules is the class of ABC scoring rules, where each voter contributes to each possible committee a score based on the voter's approvals.
We initiate a study of ABC voting in the presence of constraints about the general context surrounding the candidates.
Specifically, we consider a framework in which there is a relational database with information about the candidates together with integrity constraints on the relational database extended with a virtual relation representing the committee.
For an ABC scoring rule, the goal is to find a committee of maximum score such that all integrity constraints hold in the extended database.
We focus on two well-known types of integrity constraints in relational databases:
tuple-generating dependencies (TGDs) and denial constraints (DCs).
The former can express, for example, desired representations of groups, while the latter can express conflicts among candidates.
ABC voting is known to be computationally hard without integrity constraints, except for the case of approval voting, where it is tractable.
We show that integrity constraints make the problem NP-hard for approval voting, but we also identify certain tractable cases when key constraints are used.
We then present an implementation of the framework via a reduction to Mixed Integer Programming (MIP) that supports arbitrary ABC scoring rules, TGDs and DCs.
We devise heuristics for optimizing the resulting MIP, and describe an empirical study that illustrates the effectiveness of the optimized MIP over databases in three different domains.
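As a rough illustration of such a reduction (not the paper's encoding), approval voting with a committee-size constraint and a DC-style conflict can be written as a small integer program; the candidates, approvals, conflict pair, and the PuLP modeling library are all assumptions made for this sketch:

```python
from pulp import LpProblem, LpVariable, LpMaximize, LpBinary, lpSum, PULP_CBC_CMD

candidates = ["a", "b", "c", "d"]
approvals = [{"a", "b"}, {"a", "c"}, {"b", "c"}, {"c", "d"}]   # one set per voter
k = 2                                                          # committee size
conflicts = [("a", "d")]                                       # DC: not both elected

prob = LpProblem("abc_approval", LpMaximize)
x = {c: LpVariable(f"x_{c}", cat=LpBinary) for c in candidates}

# Approval score: each voter contributes the number of approved committee members.
prob += lpSum(x[c] for voter in approvals for c in voter)
prob += lpSum(x.values()) == k                                 # committee size
for c1, c2 in conflicts:                                       # denial constraint
    prob += x[c1] + x[c2] <= 1

prob.solve(PULP_CBC_CMD(msg=False))
print({c for c in candidates if x[c].value() > 0.5})           # e.g. {'b', 'c'}
```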
In this seminar I will present our paper, published at NeurIPS 2024. We introduce hierarchical selective classification, which extends selective classification to a hierarchical setting. Our approach leverages the inherent structure of class relationships, enabling models to reduce the specificity of their predictions when faced with uncertainty. We formalize hierarchical risk and coverage, and introduce hierarchical risk-coverage curves. Next, we develop algorithms for hierarchical selective classification (which we refer to as "inference rules"), and propose an efficient algorithm that guarantees a target accuracy constraint with high probability. Lastly, we conduct extensive empirical studies on over a thousand ImageNet classifiers, revealing that training regimes such as CLIP, pretraining on ImageNet21k, and knowledge distillation boost hierarchical selective performance.
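A minimal sketch of one possible inference rule (a toy climbing rule over a hypothetical hierarchy, not the calibrated algorithm from the paper): start at the most likely leaf and retreat to its ancestors until the aggregated probability clears a confidence threshold.

```python
import numpy as np

# Toy class hierarchy; None marks the root's parent.
parent = {"cat": "mammal", "dog": "mammal", "mammal": "animal",
          "eagle": "bird", "bird": "animal", "animal": None}
leaves = ["cat", "dog", "eagle"]

def node_mass(node, leaf_probs):
    """Probability mass of all leaves lying under `node`."""
    def descends(leaf):
        n = leaf
        while n is not None:
            if n == node:
                return True
            n = parent[n]
        return False
    return sum(p for leaf, p in zip(leaves, leaf_probs) if descends(leaf))

def hierarchical_predict(leaf_probs, threshold=0.9):
    node = leaves[int(np.argmax(leaf_probs))]
    while node is not None and node_mass(node, leaf_probs) < threshold:
        node = parent[node]          # reduce specificity instead of abstaining
    return node

print(hierarchical_predict([0.55, 0.40, 0.05]))   # 'mammal' (cat + dog = 0.95)
print(hierarchical_predict([0.95, 0.03, 0.02]))   # 'cat'
```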
Taub 337
I will describe several old and new applications of topological and algebraic methods in the derivation of combinatorial results. In all of them the proofs provide no efficient procedures for solving the corresponding algorithmic questions. The problem of finding such procedures (or convincing reasons indicating that they are unlikely to exist) is an intriguing challenge and I will mention some progress in the study of this problem too.
Bio:
Noga Alon is a Professor of Mathematics at Princeton University and a Professor Emeritus of Mathematics and Computer Science at Tel Aviv University.
He works in Discrete Mathematics and its applications in Theoretical Computer Science, Information Theory, Combinatorial Geometry, and Combinatorial Number Theory.
He is a member of the Israel Academy of Sciences and Humanities and of the Academia Europaea, and an honorary member of the Hungarian Academy of Sciences. He has received numerous awards; three recent ones are the 2022 Shaw Prize in Mathematical Sciences, the 2022 Knuth Prize for outstanding contributions to the foundations of computer science, and the 2024 Wolf Prize in Mathematics.