I am an FDS Postdoctoral Fellow at Yale University. Before that, I was a PhD student in the Computer Science Department of the National Technical University of Athens (NTUA), working with Dimitris Fotakis and Christos Tzamos. I completed my undergraduate studies in the School of Electrical and Computer Engineering of the NTUA.
I am interested in the statistical and computational foundations of machine learning, especially their connections to causal inference, generative modeling, and theoretical computer science.
Instructor: Alkis Kalavasis
This course is about the generalization and stability of Machine Learning (ML) systems. There are various ways to define what it means for a learning algorithm to be stable. The most standard one is inspired by sensitivity analysis, which aims to determine how much a variation of the input can influence the output of a system. This abstract viewpoint allows one to introduce various notions of stability, such as uniform stability, differential privacy, and replicability. In this course, we investigate these notions of stability, their implications for learning theory, and their surprising connections among one another.
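As one illustration of the sensitivity-analysis viewpoint, the standard notion of uniform stability (in the sense of Bousquet and Elisseeff) can be stated as follows; the symbols $A$, $S$, $\ell$, and $\beta$ are generic placeholders, not notation fixed by the course:

```latex
% A learning algorithm A is \beta-uniformly stable if, for every pair of
% datasets S, S' of size n differing in a single example, and every test
% point z, the change in loss is at most \beta:
\[
\sup_{z} \; \bigl| \ell(A(S), z) - \ell(A(S'), z) \bigr| \le \beta .
\]
```

Differential privacy and replicability can be phrased in the same input-perturbation template, with the bound on the output's change measured in distribution rather than in loss.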
Lecture Notes (PDF), with Andrew Ilyas, Anay Mehrotra, and Manolis Zampetakis
FOCS (2025, 2024), STOC (2026, 2025, 2024), COLT (2026, 2025), NeurIPS (2024, 2023, 2022, 2021), ICML (2023), AISTATS (2022, 2021), ICLR (2022), ITCS (2024)