Talk of Kfir Y. Levy (Technion), Monday December 19, 17:15 – 19:00, Multimedia Amphitheater (basement of the Central Library building), National Technical University of Athens

Title: Beyond SGD: Efficient Learning with Non i.i.d. Data

Abstract: The tremendous success of the Machine Learning paradigm heavily relies on the development of powerful optimization methods. The canonical algorithm for training learning models is SGD (Stochastic Gradient Descent), yet this method has several limitations. In particular, it relies on the assumption that data points are i.i.d. (independent and identically distributed). Unfortunately, this assumption often fails to hold in practice.
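To make the i.i.d. assumption concrete, here is a minimal, illustrative SGD sketch (a toy example, not the speaker's method): each step draws a data point uniformly and independently from the dataset, which is precisely the sampling assumption the abstract says often fails in practice.

```python
import random

def sgd(grad, x0, data, lr=0.1, steps=100, seed=0):
    """Minimal SGD sketch (illustrative toy, not from the talk).

    Each iteration draws a data point uniformly at random -- an
    i.i.d. draw.  With contaminated or Markovian data this uniform,
    independent sampling is no longer valid, which is the failure
    mode the talk addresses.
    """
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        z = rng.choice(data)          # i.i.d. sample from the dataset
        x = x - lr * grad(x, z)       # step against the stochastic gradient
    return x

# Toy problem: minimize f(x) = (1/n) * sum_z (x - z)^2,
# whose minimizer is the sample mean of the data.
data = [1.0, 2.0, 3.0, 4.0]
x_hat = sgd(lambda x, z: 2.0 * (x - z), x0=0.0, data=data)
```

With a constant step size the iterate hovers around the sample mean (here 2.5) rather than converging exactly; decaying steps or iterate averaging would tighten this.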

In this talk, I will discuss an ongoing line of research where we develop alternative methods that resolve this limitation of SGD in two different contexts. In the first part, I will describe a method that copes well with contaminated data. In the second part, I will discuss a method that efficiently handles Markovian data. The methods that I describe are as efficient as SGD, and implicitly adapt to the underlying structure of the problem in a data-dependent manner.
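For intuition on what "Markovian data" means here, the following toy sketch (purely illustrative; the chain and its parameters are assumptions, not from the talk) generates a two-state Markov stream in which consecutive samples are strongly correlated, so the stream is not i.i.d.:

```python
import random

def markov_stream(n, p_stay=0.9, seed=0):
    """Binary Markov chain: with probability p_stay the next sample
    repeats the current state, so consecutive samples are correlated
    -- the stream is *not* i.i.d.  (Toy data for illustration.)
    """
    rng = random.Random(seed)
    state = 0
    out = []
    for _ in range(n):
        if rng.random() > p_stay:     # flip state with prob. 1 - p_stay
            state = 1 - state
        out.append(state)
    return out

stream = markov_stream(1000)
# Fraction of consecutive pairs that agree; for i.i.d. uniform bits
# this would be near 0.5, but here it is near p_stay = 0.9.
agree = sum(a == b for a, b in zip(stream, stream[1:])) / (len(stream) - 1)
```

Feeding such a stream to plain SGD violates its independence assumption, which motivates the alternative methods discussed in the talk.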

Short Bio: Kfir Y. Levy is an Assistant Professor in the Electrical and Computer Engineering Department at Technion – Israel Institute of Technology. Kfir’s research is focused on Machine Learning, AI, and Optimization, with a special interest in designing universal methods that apply to a wide class of learning scenarios. Kfir did his postdoc in the Institute for Machine Learning at ETH Zurich. He is a recipient of the Alon fellowship, the ETH Zurich Postdoctoral fellowship, as well as the Irwin and Joan Jacobs fellowship. Kfir received all of his degrees from the Technion.