TILOS Seminar: Synthetic Tasks as Testbeds for Attributing Model Behavior

HDSI 123 and Virtual (3234 Matthews Ln, La Jolla, CA, United States)

Surbhi Goel, University of Pennsylvania

Abstract: Understanding how different components of the machine learning pipeline—spanning data composition, architectural choices, and optimization dynamics—shape model behavior remains a fundamental challenge. In this talk, I will argue that synthetic tasks, which enable precise control over data distribution and task complexity, serve as powerful testbeds for analyzing and attributing […]

TILOS Seminar: Single location regression and attention-based models

HDSI 123 and Virtual (3234 Matthews Ln, La Jolla, CA, United States)

Claire Boyer, Université Paris-Saclay

Abstract: Attention-based models, such as Transformers, excel across various tasks but lack a comprehensive theoretical understanding, especially regarding token-wise sparsity and internal linear representations. To address this gap, we introduce the single-location regression task, where only one token in a sequence determines the output, and its position is a latent random […]
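The task described in the abstract can be sketched in a few lines. This is a minimal, hypothetical construction based only on the abstract's description (one latent token position determines the output); the sequence length, embedding dimension, and regression direction below are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def single_location_sample(seq_len=16, dim=8):
    """Draw one instance of a single-location regression task:
    only one token, at a latent random position j, carries the signal."""
    X = rng.standard_normal((seq_len, dim))   # sequence of token embeddings
    j = int(rng.integers(seq_len))            # latent relevant position
    w = np.ones(dim) / np.sqrt(dim)           # fixed unit regression direction
    y = float(X[j] @ w)                       # output depends on token j only
    return X, y, j

X, y, j = single_location_sample()
```

A model solving this task must implicitly locate position j before regressing, which is why attention (a learned, input-dependent selection over positions) is a natural fit.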

TILOS Seminar: Foundational Methods for Foundation Models for Scientific Machine Learning

HDSI 123 and Virtual (3234 Matthews Ln, La Jolla, CA, United States)

Michael W. Mahoney, ICSI, LBNL, and Department of Statistics, UC Berkeley

Abstract: The remarkable successes of ChatGPT in natural language processing (NLP) and related developments in computer vision (CV) motivate the question of what foundation models would look like, and what new advances they would enable, when built on the rich, diverse, multimodal data that […]

TILOS Seminar: Amplifying human performance in combinatorial competitive programming

Virtual

Petar Veličković, Google DeepMind

Abstract: Recent years have seen a significant surge in complex AI systems for competitive programming, capable of performing at admirable levels against human competitors. While steady progress has been made, the highest percentiles still remain out of reach for these methods on standard competition platforms such as Codeforces. In this talk, […]

TILOS Seminar: Optimal Quantization for LLMs and Matrix Multiplication

HDSI 123 and Virtual (3234 Matthews Ln, La Jolla, CA, United States)

Yury Polyanskiy, MIT

Abstract: The main building block of large language models is matrix multiplication, which is often bottlenecked by the speed of loading these matrices from memory. A number of recent quantization algorithms (SmoothQuant, GPTQ, QuIP, SpinQuant, etc.) address this issue by storing matrices in lower precision. We derive the optimal asymptotic information-theoretic tradeoff between […]
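To make the memory-bandwidth motivation concrete: storing weights in int8 instead of float32 cuts bytes loaded per matrix entry by 4x, at the cost of rounding error in the product. The sketch below shows generic symmetric per-tensor int8 quantization; it is an illustration of the general idea, not the method of SmoothQuant, GPTQ, QuIP, SpinQuant, or the talk's result.

```python
import numpy as np

rng = np.random.default_rng(1)

def quantize_int8(W):
    """Symmetric per-tensor int8 quantization: map floats to
    integers in [-127, 127] with a single shared scale factor."""
    scale = np.abs(W).max() / 127.0
    Wq = np.clip(np.round(W / scale), -127, 127).astype(np.int8)
    return Wq, scale

W = rng.standard_normal((64, 64)).astype(np.float32)
x = rng.standard_normal(64).astype(np.float32)

Wq, s = quantize_int8(W)
y_full = W @ x                             # full-precision product
y_quant = (Wq.astype(np.float32) * s) @ x  # dequantize, then multiply
rel_err = np.linalg.norm(y_full - y_quant) / np.linalg.norm(y_full)
```

The information-theoretic question in the abstract is, roughly, how small this error can be made as a function of bits per weight; more sophisticated schemes (per-channel scales, rotations, adaptive rounding) trade extra computation for a better point on that curve.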