BEGIN:VCALENDAR
VERSION:2.0
PRODID:-// - ECPv6.15.18//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://tilos.ai
X-WR-CALDESC:Events for 
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/Los_Angeles
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20250309T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20251102T020000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20260308T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20261101T020000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20270314T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20271107T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20260109T110000
DTEND;TZID=America/Los_Angeles:20260109T120000
DTSTAMP:20260404T035327Z
CREATED:20251014T195932Z
LAST-MODIFIED:20260304T210221Z
UID:7661-1767956400-1767960000@tilos.ai
SUMMARY:Optimization for ML and AI Seminar: Randomized linear algebra with subspace injections
DESCRIPTION:Joel Tropp\, Caltech \nAbstract: To achieve the greatest possible speed\, practitioners regularly implement randomized algorithms for low-rank approximation and least-squares regression with structured dimension reduction maps. This talk outlines a new perspective on structured dimension reduction\, based on the injectivity properties of the dimension reduction map. This approach provides sharper bounds for sparse dimension reduction maps\, and it leads to exponential improvements for tensor-product dimension reduction. Empirical evidence confirms that these types of structured random matrices offer exemplary performance for a range of synthetic problems and contemporary scientific applications. \nJoint work with Chris Camaño\, Ethan Epperly\, and Raphael Meyer; available at arXiv:2508.21189. \n\nJoel A. Tropp is Steele Family Professor of Applied & Computational Mathematics at the California Institute of Technology. His research centers on applied mathematics\, machine learning\, data science\, numerical algorithms\, and random matrix theory. Some of his best-known contributions include matching pursuit algorithms\, randomized SVD algorithms\, matrix concentration inequalities\, and statistical phase transitions. Prof. Tropp attained the Ph.D. degree in Computational Applied Mathematics at the University of Texas at Austin in 2004\, and he joined Caltech in 2007. He won the PECASE in 2008\, and he was recognized as a Highly Cited Researcher in Computer Science each year from 2014–2018. He is co-founder of the SIAM Journal on Mathematics of Data Science (SIMODS)\, and he was co-chair of the inaugural 2020 SIAM Conference on the Mathematics of Data Science. Prof. Tropp was elected SIAM Fellow in 2019\, IEEE Fellow in 2020\, and IMS Fellow in 2024. He received the 2025 Richard P. Feynman Prize for Excellence in Teaching at Caltech. He is an invited speaker at the 2026 International Congress of Mathematicians (ICM).
URL:https://tilos.ai/event/optimization-for-ml-and-ai-seminar-with-joel-tropp-caltech/
LOCATION:HDSI 123 and Virtual\, 3234 Matthews Ln\, La Jolla\, CA\, 92093\, United States
CATEGORIES:TILOS Seminar Series,TILOS Sponsored Event
ATTACH;FMTTYPE=image/jpeg:https://tilos.ai/wp-content/uploads/2025/10/tropp-joel-e1760471957302.jpg
END:VEVENT
END:VCALENDAR