Randomized Linear Algebra with Subspace Injections
Joel Tropp, Caltech
To achieve the greatest possible speed, practitioners regularly implement randomized algorithms for low-rank approximation and least-squares regression with structured dimension reduction maps. This talk outlines a new perspective on structured dimension reduction, based on the injectivity properties of the dimension reduction map. This approach provides sharper bounds for sparse dimension reduction maps, and it leads to exponential improvements for tensor-product dimension reduction. Empirical evidence confirms that these types of structured random matrices offer exemplary performance for a range of synthetic problems and contemporary scientific applications.
Joint work with Chris Camaño, Ethan Epperly, and Raphael Meyer; available at https://arxiv.org/abs/2508.21189.
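The abstract does not include code, but the algorithms it refers to can be illustrated concretely. Below is a minimal, hedged sketch (not the talk's method) of randomized low-rank approximation in the randomized-SVD style, using a sparse sign matrix as the structured dimension-reduction map. The function names and the per-column sparsity parameter `zeta` are illustrative choices, not notation from the paper.

```python
import numpy as np
from scipy import sparse

def sparse_sign_sketch(d, n, zeta=8, rng=None):
    """Illustrative sparse sign embedding: a d x n map whose columns each
    hold zeta random +/-1 entries, scaled so the map is roughly isometric."""
    rng = np.random.default_rng(rng)
    rows = np.concatenate([rng.choice(d, size=zeta, replace=False)
                           for _ in range(n)])
    cols = np.repeat(np.arange(n), zeta)
    vals = rng.choice([-1.0, 1.0], size=zeta * n) / np.sqrt(zeta)
    return sparse.csr_matrix((vals, (rows, cols)), shape=(d, n))

def randomized_low_rank(A, k, oversample=10, rng=None):
    """Rank-k approximation via sketch + QR (randomized range finder)."""
    m, n = A.shape
    S = sparse_sign_sketch(k + oversample, n, rng=rng)
    Y = (S @ A.T).T              # m x (k + oversample) sketch of range(A)
    Q, _ = np.linalg.qr(Y)       # orthonormal basis for the sketched range
    B = Q.T @ A                  # project A onto that basis: A ~ Q @ B
    return Q, B

# Usage: recover a rank-20 matrix from its sparse sketch.
rng = np.random.default_rng(0)
A = rng.standard_normal((500, 20)) @ rng.standard_normal((20, 300))
Q, B = randomized_low_rank(A, k=20, rng=0)
err = np.linalg.norm(A - Q @ B) / np.linalg.norm(A)
```

Because the sketch is sparse, forming `Y` costs far less than a dense Gaussian sketch would; this is the practical motivation for structured dimension reduction that the abstract highlights.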
Joel A. Tropp is Steele Family Professor of Applied & Computational Mathematics at the California Institute of Technology. His research centers on applied mathematics, machine learning, data science, numerical algorithms, and random matrix theory. Some of his best-known contributions include matching pursuit algorithms, randomized SVD algorithms, matrix concentration inequalities, and statistical phase transitions. Prof. Tropp earned his Ph.D. in Computational Applied Mathematics at the University of Texas at Austin in 2004, and he joined Caltech in 2007. He won the PECASE in 2008, and he was recognized as a Highly Cited Researcher in Computer Science each year from 2014 to 2018. He is a co-founder of the SIAM Journal on Mathematics of Data Science (SIMODS), and he was co-chair of the inaugural 2020 SIAM Conference on the Mathematics of Data Science. Prof. Tropp was elected SIAM Fellow in 2019, IEEE Fellow in 2020, and IMS Fellow in 2024. He received the 2025 Richard P. Feynman Prize for Excellence in Teaching at Caltech. He is an invited speaker at the 2026 International Congress of Mathematicians (ICM).