Foundations of Machine Learning: Over-Parameterization and Feature Learning

Invited Talk

ECON, KCDS, MathSEE and HIDA proudly present this talk by guest speaker Adit Radhakrishnan (Harvard), who will also teach the Deep Learning workshop.

Everyone interested in the foundations of machine learning is cordially invited to join the talk at KIT Campus South!

Time and place:

  • Wednesday, Oct 4, 2023, 17:30 to 19:00
  • NTI Lecture Hall, building 30.10, KIT Campus South

Abstract
While deep learning has achieved various empirical successes, our understanding of the fundamental principles driving these successes is still emerging. We analyze two core principles driving the success of neural networks: over-parameterization and feature learning. We then leverage these principles to design models with improved performance and interpretability. We begin by discussing the Neural Tangent Kernel (NTK) connection between infinitely wide networks and classical models known as kernel machines. While the NTK has been a useful tool for understanding properties of deep networks, it lacks a key component that is critical to the success of neural networks: feature learning. We show that the features learned by deep neural networks are accurately captured by a mathematical operator known as the average gradient outer product. We use this operator to enable feature learning in kernel machines and show that the resulting models, referred to as Recursive Feature Machines, achieve state-of-the-art performance on tabular data.
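To give a flavor of the operator mentioned in the abstract: the average gradient outer product (AGOP) of a scalar predictor f over data points x_1, …, x_n is the matrix (1/n) Σ_i ∇f(x_i) ∇f(x_i)ᵀ, whose dominant directions indicate which input features the predictor actually uses. The following is a minimal illustrative sketch, not the speaker's implementation; the finite-difference gradient, the toy predictor, and all names are assumptions chosen for clarity.

```python
import numpy as np

def agop(predict, X, eps=1e-4):
    """Estimate the average gradient outer product (AGOP) of a scalar
    predictor over the rows of X, using central finite differences.
    Illustrative sketch only; gradients would normally be analytic."""
    n, d = X.shape
    M = np.zeros((d, d))
    for x in X:
        grad = np.zeros(d)
        for j in range(d):
            e = np.zeros(d)
            e[j] = eps
            grad[j] = (predict(x + e) - predict(x - e)) / (2 * eps)
        M += np.outer(grad, grad)
    return M / n

# Hypothetical toy predictor that depends only on the first coordinate:
f = lambda x: np.sin(x[0])
X = np.random.default_rng(0).normal(size=(200, 5))
M = agop(f, X)
# The AGOP mass concentrates on entry (0, 0), exposing the one
# relevant feature; irrelevant coordinates contribute nearly zero.
```

In a Recursive Feature Machine, a matrix like M would then reshape the kernel's notion of distance and the fit/AGOP steps would be iterated, but that loop is beyond this toy sketch.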

About Adit Radhakrishnan
Adit is a George F. Carrier postdoctoral fellow at Harvard. He completed his Ph.D. in electrical engineering and computer science (EECS) at MIT, advised by Caroline Uhler, and was a Ph.D. fellow at the Eric and Wendy Schmidt Center at the Broad Institute of MIT and Harvard. He received his M.Eng. in EECS and his Bachelor of Science in Math and EECS from MIT. His research focuses on advancing the theoretical foundations of machine learning to develop new methods for tackling biomedical problems. In his talk at KIT, he will present some of his recent projects, which shed light on some of the mysteries of deep learning that are of interest to both theorists and practitioners.

NTI Lecture hall on the KIT campus map