Time+Place: Thursday, 16/06/2016, 15:00, Room 337-8, Taub Bld.
Title: Understanding Word Embeddings
Speaker: Omer Levy (CS Lecture - NOTE UNUSUAL HOUR) - https://levyomer.wordpress.com/
Affiliation: Bar-Ilan University
Host: Shaul Markovitch

Abstract:

Neural word embeddings, such as word2vec (Mikolov et al., 2013), have 
become increasingly popular in both academic and industrial NLP. These 
methods attempt to capture the meanings of words by processing huge 
unlabeled corpora with methods inspired by neural networks and the 
recent rise of deep learning. The result is a vector representation of 
every word in a low-dimensional continuous space. These word vectors 
exhibit interesting arithmetic properties (e.g. king - man + woman = 
queen) (Mikolov et al., 2013), and seemingly outperform traditional 
vector-space models of meaning inspired by Harris's Distributional 
Hypothesis (Baroni et al., 2014). Our work attempts to demystify word 
embeddings and to understand what makes them so much better than 
traditional methods at capturing semantic properties.
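
To make the analogy arithmetic concrete, here is a minimal numpy 
sketch; the 4-dimensional vectors below are hypothetical stand-ins for 
real pretrained embeddings, which would typically have hundreds of 
dimensions.

    import numpy as np

    # Hypothetical toy embeddings (real ones come from training on a corpus).
    emb = {
        "king":  np.array([0.8, 0.1, 0.5, 0.3]),
        "man":   np.array([0.7, 0.2, 0.1, 0.1]),
        "woman": np.array([0.1, 0.8, 0.1, 0.1]),
        "queen": np.array([0.2, 0.7, 0.5, 0.3]),
        "apple": np.array([0.1, 0.1, 0.9, 0.1]),
    }
    # Normalize to unit length so dot products are cosine similarities.
    emb = {w: v / np.linalg.norm(v) for w, v in emb.items()}

    def analogy(a, b, c):
        # Find the word x maximizing cos(vec(x), vec(b) - vec(a) + vec(c)).
        # Excluding the three query words is standard in analogy evaluation.
        target = emb[b] - emb[a] + emb[c]
        target = target / np.linalg.norm(target)
        candidates = (w for w in emb if w not in {a, b, c})
        return max(candidates, key=lambda w: np.dot(emb[w], target))

    print(analogy("man", "king", "woman"))  # -> "queen" on these toy vectors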

Our main result shows that state-of-the-art word embeddings are actually 
"more of the same". In particular, we show that skip-gram with negative 
sampling (SGNS), the latest algorithm in word2vec, implicitly factorizes 
a word-context PMI matrix, which has been used and studied extensively 
in the NLP community for the past 20 years. We also trace the root of 
word2vec's perceived superiority to a collection of hyperparameter 
settings. While these hyperparameters were thought to be unique to 
neural-network-inspired embedding methods, we show that they can, in 
fact, be ported to traditional distributional methods, significantly 
improving their performance. Among our qualitative results are a method 
for interpreting these seemingly opaque word vectors, and an answer to 
why king - man + woman = queen.
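
As a rough sketch of the factorization result mentioned above: the PMI 
of a word-context pair is log[P(w,c) / (P(w)P(c))], and SGNS with k 
negative samples implicitly factorizes the matrix of PMI(w,c) - log k. 
The toy counts, the choice k = 5, and the SVD-based explicit 
factorization below are illustrative assumptions, not the speaker's 
exact setup.

    import numpy as np

    # Hypothetical co-occurrence counts: counts[i, j] = #(word i, context j).
    counts = np.array([[10., 0., 3., 1.],
                       [ 0., 8., 1., 2.],
                       [ 3., 1., 6., 0.],
                       [ 1., 2., 0., 7.]])
    total = counts.sum()
    p_w = counts.sum(axis=1, keepdims=True) / total  # P(w)
    p_c = counts.sum(axis=0, keepdims=True) / total  # P(c)

    with np.errstate(divide="ignore"):               # log(0) -> -inf is fine here
        pmi = np.log((counts / total) / (p_w * p_c))

    k = 5                                            # SGNS negative samples
    sppmi = np.maximum(pmi - np.log(k), 0)           # shifted positive PMI

    # Explicitly factorize what SGNS factorizes implicitly: a truncated SVD
    # of the shifted positive PMI matrix yields low-dimensional word vectors.
    U, S, Vt = np.linalg.svd(sppmi)
    d = 2                                            # embedding dimension
    W = U[:, :d] * np.sqrt(S[:d])                    # one d-dim vector per word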

(Based on joint work with Yoav Goldberg and Ido Dagan.)


Short Bio:

Omer is a Computer Science PhD student at Bar-Ilan University's 
Natural Language Processing lab, working with Prof. Ido Dagan and  
Dr. Yoav Goldberg. He will be starting a post-doc at the University of 
Washington with Prof. Luke Zettlemoyer this fall.

Omer is interested in realizing high-level semantic applications such 
as question answering and summarization to help people cope with 
information overload. At the heart of these applications are challenges 
in textual entailment and semantic similarity, which form the core of 
Omer's current research. He is also interested in current advances in 
representation learning (a.k.a. "deep learning"), particularly word 
embeddings, and how they can support semantic applications.