Talks by distinguished ECE-NTUA alumni on Machine Learning


On January 8, 2020, four distinguished ECE-NTUA alumni, Alexandros Dimakis, Manolis Zampetakis, Christos Tzamos and Constantinos (Costis) Daskalakis, spoke in front of a large audience at the Ceremonial Hall of the NTUA. The event was attended by the NTUA Rector, Prof. Andreas G. Boudouvis, and the Dean of the School of Electrical and Computer Engineering of the NTUA, Prof. Nectarios Koziris, along with many academic members of the NTUA community.

Abstracts and Short Bios of the Speakers

Alex Dimakis’ Lecture Title: The Future will be Unsupervised: Recent Results and Open Problems in Deep Generative Models

Abstract: We will summarize ongoing work on using deep generative models for modeling high-dimensional data, such as images. Further, we will show how deep generative models can aid in solving various inverse problems, such as denoising, inpainting and super-resolution. We will mention various open research directions that involve tools from optimization, linear algebra and probability theory.
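To make the inverse-problem setup concrete, here is a minimal, hypothetical sketch (all names and dimensions are illustrative, not from the talk): a toy linear map G(z) = Wz stands in for a trained deep generative model, and a signal is recovered from a few noisy linear measurements by gradient descent over the latent code z.

    import numpy as np

    rng = np.random.default_rng(0)

    n, k, m = 100, 10, 30                       # signal dim, latent dim, measurements
    W = rng.normal(size=(n, k))                 # toy "generator": G(z) = W @ z
    A = rng.normal(size=(m, n)) / np.sqrt(m)    # random measurement matrix

    z_true = rng.normal(size=k)
    x_true = W @ z_true                         # ground-truth signal in the range of G
    y = A @ x_true + 0.01 * rng.normal(size=m)  # noisy compressed measurements

    # Recover by gradient descent on z |-> ||A G(z) - y||^2; with a real deep
    # generator, this inner loop would use autograd instead of an explicit gradient.
    z = np.zeros(k)
    lr = 0.002
    for _ in range(5000):
        residual = A @ (W @ z) - y
        z -= lr * (W.T @ (A.T @ residual))      # gradient of the squared loss in z

    print("relative error:", np.linalg.norm(W @ z - x_true) / np.linalg.norm(x_true))

With m = 30 measurements of a 100-dimensional signal, recovery succeeds here because the signal lies in the 10-dimensional range of the generator, which is exactly the kind of prior the talk discusses.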

Bio: Alex Dimakis is an Associate Professor in the Department of Electrical and Computer Engineering at the University of Texas at Austin. From 2009 until 2012, he was with the Viterbi School of Engineering, University of Southern California. He received his Ph.D. in 2008 and his M.S. in 2005, both in Electrical Engineering and Computer Sciences from UC Berkeley, and his Diploma from the National Technical University of Athens in 2003. During 2009, he was a CMI postdoctoral scholar at Caltech. He received an ARO Young Investigator Award in 2014, the NSF CAREER Award in 2011, a Google Faculty Research Award in 2012 and the Eli Jury Award in 2008. He is the co-recipient of several best paper awards, including the joint Information Theory and Communications Society Best Paper Award in 2012. He served two terms as an Associate Editor for IEEE Signal Processing Letters and is currently serving as an Associate Editor for IEEE Transactions on Information Theory. His research interests include information theory, coding theory and machine learning.


Manolis Zampetakis’ Lecture Title: Iterative Methods for Efficient Statistics in High Dimensions

Abstract: We consider the classical statistical problem of estimating, to arbitrary accuracy, the parameters of a multidimensional Gaussian distribution in the presence of the following real-world obstacles: (1) truncated samples, and (2) an inhomogeneous population.

(1) Statistical estimation from truncated samples is a classical problem, going back to Galton, Pearson and Fisher. Truncated samples are samples that are revealed only if they fall in some subset S of the multidimensional Euclidean space. We provide and analyze an iterative algorithm with almost optimal sample and time complexity that recovers the parameters of a Gaussian distribution to arbitrary accuracy from truncated samples, in two relevant cases: (a) the set S is known, and (b) the set S is unknown.
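As a minimal one-dimensional illustration of case (a) (illustrative code, not the talk’s algorithm): samples from N(μ, σ²) are revealed only when they land in a known set S = [a, ∞), the naive empirical mean and standard deviation are then biased, and maximizing the truncated log-likelihood recovers the true parameters. The talk’s iterative algorithm can be seen as a scalable, high-dimensional version of this maximum-likelihood computation.

    import numpy as np
    from scipy.stats import truncnorm
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    mu_true, sigma_true, a = 0.0, 1.0, 0.5   # the truncation threshold a is known

    raw = rng.normal(mu_true, sigma_true, size=200_000)
    x = raw[raw >= a]                        # only samples falling in S are revealed

    def nll(params):                         # negative truncated log-likelihood
        mu, log_sigma = params
        sigma = np.exp(log_sigma)            # parametrize sigma > 0
        a_std = (a - mu) / sigma             # truncnorm expects standardized bounds
        return -truncnorm.logpdf(x, a_std, np.inf, loc=mu, scale=sigma).sum()

    res = minimize(nll, x0=np.array([x.mean(), np.log(x.std())]),
                   method="Nelder-Mead")
    mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])

    print("naive estimates:", x.mean(), x.std())   # biased by the truncation
    print("truncated MLE  :", mu_hat, sigma_hat)   # close to (0, 1)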

(2) We prove the fast convergence of the celebrated iterative algorithm EM for efficiently estimating the means of Gaussian distributions in the inhomogeneous case of a mixture of two Gaussian distributions.
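For intuition, the sketch below runs plain EM on a balanced mixture of two unit-variance Gaussians in one dimension to estimate the two means; each EM step is exactly the kind of iterative map whose convergence the talk analyzes. The parameters and initialization are illustrative.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    labels = rng.integers(0, 2, size=n)
    x = rng.normal(np.where(labels == 0, -2.0, 2.0), 1.0)  # true means -2 and +2

    mu0, mu1 = -0.5, 1.0                     # crude initialization
    for _ in range(20):
        # E-step: posterior probability that each point came from component 1
        log_odds = -0.5 * ((x - mu1) ** 2 - (x - mu0) ** 2)  # equal mixing weights
        w = 1.0 / (1.0 + np.exp(-log_odds))
        # M-step: responsibility-weighted means
        mu0 = np.sum((1 - w) * x) / np.sum(1 - w)
        mu1 = np.sum(w * x) / np.sum(w)

    print(mu0, mu1)   # converges near (-2, 2)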

Abstracting the use of iterative algorithms in these important statistical problems, we consider the general question of analyzing the convergence of iterative algorithms. We prove that contraction maps provide a universal analysis tool for proving the global convergence of iterative maps.
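The contraction-map principle is easy to see in a toy example: if a map F satisfies |F(x) − F(y)| ≤ c·|x − y| with c < 1 on a domain it maps into itself, then by Banach’s fixed-point theorem iterating F from any starting point converges geometrically to its unique fixed point. For instance, F = cos is a contraction on [0, 1]:

    import math

    x = 0.0
    fixed_point = 0.7390851332151607   # the unique solution of cos(x) = x
    for t in range(1, 31):
        x = math.cos(x)
        if t % 5 == 0:
            print(t, abs(x - fixed_point))  # the error shrinks geometrically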

The talk is based on joint works with Constantinos Daskalakis, Themis Gouleakis, Vasilis Kontonis and Christos Tzamos.

Bio: Manolis Zampetakis joined the graduate program of MIT in September 2014, in the Theory of Computation Group at CSAIL. His research advisor is Constantinos Daskalakis, and he works on a wide range of problems in theoretical machine learning, learning theory, complexity theory and algorithmic game theory. Before MIT, he was an undergraduate student at the School of Electrical and Computer Engineering of the National Technical University of Athens. He has worked as an intern at Google Research (New York, 2017), Yahoo Research (New York, 2018) and Microsoft Research (New England, 2019). Since Fall 2018, his research has been supported by a Google PhD Fellowship.


Christos Tzamos’ Lecture Title: Distribution-Independent PAC Learning of Halfspaces with Massart Noise

Abstract: We study the problem of distribution-independent PAC learning of halfspaces in the presence of Massart noise. Specifically, we are given a set of labeled examples (x, y) drawn from a distribution D on R^{d+1} such that the marginal distribution on the unlabeled points x is arbitrary and the labels y are generated by an unknown halfspace corrupted with Massart noise at noise rate η < 1/2. The goal is to find a hypothesis h that minimizes the misclassification error Pr_{(x, y) ∼ D}[h(x) ≠ y].
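A small sketch of this data model (the generating process only, not the learning algorithm): labels come from an unknown halfspace and are flipped independently with a point-dependent probability η(x) ≤ η; any choice of η(x) up to the bound is allowed, which is what makes Massart noise harder than uniform random noise. The Gaussian marginal below is purely for illustration, since the model allows an arbitrary marginal on x.

    import numpy as np

    rng = np.random.default_rng(2)
    d, n, eta = 5, 10_000, 0.2

    w = rng.normal(size=d)
    w /= np.linalg.norm(w)                   # the unknown target halfspace
    X = rng.normal(size=(n, d))              # marginal on x (arbitrary in the model)
    clean = np.sign(X @ w)                   # noiseless halfspace labels

    flip_prob = eta * rng.uniform(size=n)    # any eta(x) in [0, eta] is allowed
    y = np.where(rng.uniform(size=n) < flip_prob, -clean, clean)

    # Even the target halfspace w suffers error E[eta(x)] <= eta on this data;
    # the talk gives a poly(d, 1/eps)-time learner achieving error eta + eps.
    print("error of the target w:", np.mean(np.sign(X @ w) != y))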

We give a poly(d, 1/ε)-time algorithm for this problem with misclassification error η + ε. We also provide evidence that improving on the error guarantee of our algorithm may be computationally hard. Prior to our work, no efficient weak (distribution-independent) learner was known in this model, even for the class of disjunctions. The existence of such an algorithm for halfspaces (or even disjunctions) had been posed as an open question in various works, starting with Sloan (1988) and Cohen (1997), and was most recently highlighted in Avrim Blum's FOCS 2003 tutorial.

The talk is based on joint work with Ilias Diakonikolas and Themis Gouleakis.

Bio: Christos Tzamos is an Assistant Professor in the Department of Computer Sciences at the University of Wisconsin-Madison and a member of the Theory of Computing group. His research interests lie at the interface of the Theory of Computation with Economics and Game Theory, Machine Learning, Statistics and Probability Theory. He completed his PhD in the Theory of Computation group of MIT, advised by Costis Daskalakis. Before that, he studied Electrical and Computer Engineering at NTUA and was a member of Corelab. After his PhD, he was a postdoctoral researcher at Microsoft Research (New England), working on Mechanism Design, Algorithms and Machine Learning. He is the recipient of a Simons Foundation award, the George M. Sprowls award, the best paper and best student paper awards at EC 2013, and an outstanding paper award at NeurIPS 2019.


Constantinos Daskalakis’ Lecture Title: Statistical Inference from Dependent Data

Abstract: Statistical Learning Theory has largely focused on learning and generalization from independent and identically distributed observations. This assumption is, however, too strong. In many applications, observations are collected on the nodes of a network, or across some spatial or temporal domain, and are dependent. Examples abound in financial and meteorological applications, and dependencies naturally arise in social networks through peer effects.

We study the basic statistical tasks of linear and logistic regression on networked data, where the responses on the nodes of the network are not independent, even when conditioning on the nodes’ vectors of covariates. Given a single observation (across the whole network) from a networked linear or logistic regression model, and under necessary weak-dependence assumptions, we prove strong consistency results for estimating the model parameters, recovering the rates achievable in the standard setting with independent data. We generalize these results beyond linear and logistic regression, assuming that the observations satisfy Dobrushin’s condition, and show how to use Gaussian complexity and VC dimension to control the generalization error.
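One concrete dependent-data model of this flavor (a sketch consistent with the abstract, not necessarily the talk’s exact model) is an Ising-type logistic model with peer effects: conditioned on the rest of the network, each node’s ±1 response follows a logistic law whose log-odds add a peer term β·(sum of the neighbors’ responses) to the usual θᵀx_v, so that β = 0 recovers ordinary independent logistic regression, while small β yields the kind of weak dependence that Dobrushin’s condition formalizes. The snippet draws a single networked observation by Gibbs sampling; the graph, parameters and sampler settings are illustrative.

    import numpy as np

    rng = np.random.default_rng(3)
    n, d, beta = 500, 3, 0.1                 # small beta: weak dependence

    theta = rng.normal(size=d)               # regression parameters to be estimated
    X = rng.normal(size=(n, d))              # covariates on the nodes
    adj = rng.uniform(size=(n, n)) < 4 / n   # sparse random graph, ~4 neighbors each
    adj = np.triu(adj, 1)
    adj = adj + adj.T                        # symmetrize

    y = rng.choice([-1.0, 1.0], size=n)      # responses, one per node
    for _ in range(200):                     # Gibbs sweeps over all nodes
        for v in range(n):
            log_odds = 2 * (X[v] @ theta + beta * (adj[v] @ y))
            y[v] = 1.0 if rng.uniform() < 1 / (1 + np.exp(-log_odds)) else -1.0

    # (X, y) is now a *single* observation of the whole network, with the
    # responses dependent across nodes even after conditioning on X.
    print("first responses:", y[:10])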

Bio: Constantinos (a.k.a. “Costis”) Daskalakis is a Professor of Computer Science at MIT and a member of CSAIL, the Computer Science and Artificial Intelligence Laboratory. He works on computation theory and its interface with game theory, economics, probability theory, statistics and machine learning. He holds a Diploma in Electrical and Computer Engineering from the National Technical University of Athens, Greece, and a PhD in Computer Science from UC Berkeley. He has been honored with the ACM Doctoral Dissertation Award, the Kalai Prize from the Game Theory Society, the Sloan Foundation Fellowship, the Microsoft Faculty Fellowship, the SIAM outstanding paper prize, the ACM Grace Murray Hopper Award, the Simons Investigator Award, the Bodossaki Foundation Distinguished Young Scientists Award, and the Nevanlinna Prize from the International Mathematical Union.
