Logistics:

  • Date: May 20th-21st

Registration form:

https://docs.google.com/forms/d/e/1FAIpQLSfjOiMEB4SISwMVLuCDLo-3ZrPruGbaG4aoDB4jsPMqn_m9Qw/viewform


Speakers:

  • Brice Huang (MIT)
  • Murat Erdogdu (University of Toronto)
  • Song Mei (UC Berkeley)
  • Subhabrata Sen (Harvard)
  • Elizabeth Collins-Woodfin (McGill)
  • Theodor Misiakiewicz (TTIC and UC Berkeley)

Titles and Abstracts:

Song Mei

Title: Revisiting neural network approximation theory in the age of generative AI

Abstract: Textbooks on deep learning theory primarily present neural networks as universal function approximators. While this classical viewpoint is fundamental, it does not adequately explain the impressive capabilities of modern generative AI models such as language models and diffusion models. This talk puts forth a refined perspective: neural networks often serve as algorithm approximators, going beyond mere function approximation. I will explain how this refined perspective offers deeper insight into the success of modern generative AI models.
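
(For intuition, a minimal toy sketch of the "algorithm approximator" viewpoint; this illustration is ours, not material from the talk. A depth-L network whose layers each perform one gradient-descent step on a least-squares objective emulates L iterations of an algorithm, so depth plays the role of iteration count rather than the network approximating a single fixed input-output map.)

    import numpy as np

    # Toy "network as algorithm approximator": each layer applies one
    # gradient-descent step on a least-squares loss, so depth plays the
    # role of iteration count. (Illustrative sketch, not from the talk.)
    rng = np.random.default_rng(0)
    n, d, depth, eta = 50, 5, 100, 0.5

    X = rng.normal(size=(n, d))   # data seen "in context"
    w_star = rng.normal(size=d)   # ground-truth parameter
    y = X @ w_star

    def layer(w):
        # one layer == one gradient step on 0.5 * ||X w - y||^2 / n
        return w - eta * (X.T @ (X @ w - y)) / n

    w = np.zeros(d)
    for _ in range(depth):        # stacking layers == running the algorithm
        w = layer(w)

    print("parameter recovery error:", np.linalg.norm(w - w_star))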

Brice Huang

Title: Capacity threshold for the Ising perceptron

Abstract: We show that the capacity of the Ising perceptron is, with high probability, upper bounded by the constant $\alpha \approx 0.833$ conjectured by Krauth and Mézard, under the condition that an explicit two-variable function $S(\lambda_1,\lambda_2)$ is maximized at $(1,0)$. Earlier work of Ding and Sun proves the matching lower bound subject to a similar numerical condition, and together these results give a conditional proof of the Krauth-Mézard conjecture.
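
(For context, here is a standard formulation of the model, included as background rather than taken from the talk; we assume the half-space version with Gaussian disorder. With $M = \alpha N$ random constraints, the solution set is

  $$S_N(\alpha) \;=\; \bigl\{\, \sigma \in \{-1,+1\}^N \;:\; \tfrac{1}{\sqrt{N}} \langle g_a, \sigma \rangle \ge 0 \ \text{ for all } 1 \le a \le \alpha N \,\bigr\}, \qquad g_a \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0, I_N),$$

and the capacity is the threshold $\alpha_c$ such that $S_N(\alpha)$ is nonempty with high probability for $\alpha < \alpha_c$ and empty with high probability for $\alpha > \alpha_c$; Krauth and Mézard predicted $\alpha_c \approx 0.833$.)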

Organizers:
