BEGIN:VCALENDAR
VERSION:2.0
PRODID:-// - ECPv6.15.18//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://tilos.ai
X-WR-CALDESC:Events for 
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/Los_Angeles
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20230312T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20231105T020000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20240310T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20241103T020000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20250309T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20251102T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20240724T100000
DTEND;TZID=America/Los_Angeles:20240724T110000
DTSTAMP:20260404T084753Z
CREATED:20250828T200721Z
LAST-MODIFIED:20250828T200721Z
UID:7304-1721815200-1721818800@tilos.ai
SUMMARY:TILOS Seminar: What Kinds of Functions do Neural Networks Learn? Theory and Practical Applications
DESCRIPTION:Robert Nowak\, University of Wisconsin \nAbstract: This talk presents a theory characterizing the types of functions neural networks learn from data. Specifically\, the function space generated by deep ReLU networks consists of compositions of functions from the Banach space of second-order bounded variation in the Radon transform domain. This Banach space includes functions with smooth projections in most directions. A representer theorem associated with this space demonstrates that finite-width neural networks suffice for fitting finite datasets. The theory has several practical applications. First\, it provides a simple and theoretically grounded method for network compression. Second\, it shows that multi-task training can yield significantly different solutions compared to single-task training\, and that multi-task solutions can be related to kernel ridge regressions. Third\, the theory has implications for improving implicit neural representations\, where multi-layer neural networks are used to represent continuous signals\, images\, or 3D scenes. This exploration bridges theoretical insights with practical advancements\, offering a new perspective on neural network capabilities and future research directions. \n\nRobert Nowak is the Grace Wahba Professor of Data Science and Keith and Jane Nosbusch Professor in Electrical and Computer Engineering at the University of Wisconsin-Madison. His research focuses on machine learning\, optimization\, and signal processing. He serves on the editorial boards of the SIAM Journal on the Mathematics of Data Science and the IEEE Journal on Selected Areas in Information Theory.
URL:https://tilos.ai/event/tilos-seminar-what-kinds-of-functions-do-neural-networks-learn-theory-and-practical-applications/
LOCATION:HDSI 123 and Virtual\, 3234 Matthews Ln\, La Jolla\, CA\, 92093\, United States
CATEGORIES:TILOS Seminar Series
ATTACH;FMTTYPE=image/jpeg:https://tilos.ai/wp-content/uploads/2024/07/nowak-robert.jpg
END:VEVENT
END:VCALENDAR