BEGIN:VCALENDAR
VERSION:2.0
PRODID:-// - ECPv6.15.18//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://tilos.ai
X-WR-CALDESC:Events for 
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/Los_Angeles
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20210314T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20211107T020000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20220313T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20221106T020000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20230312T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20231105T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20221116T100000
DTEND;TZID=America/Los_Angeles:20221116T110000
DTSTAMP:20260406T082327Z
CREATED:20250904T172450Z
LAST-MODIFIED:20250904T172450Z
UID:7353-1668592800-1668596400@tilos.ai
SUMMARY:TILOS Seminar: Rare Gems: Finding Lottery Tickets at Initialization
DESCRIPTION:Dimitris Papailiopoulos\, Associate Professor\, University of Wisconsin–Madison\nAbstract: Large neural networks can be pruned to a small fraction of their original size\, with little loss in accuracy\, by following a time-consuming “train\, prune\, re-train” approach. Frankle & Carbin (2019) conjectured that we can avoid this by training lottery tickets\, i.e.\, special sparse subnetworks found at initialization that can be trained to high accuracy. However\, a subsequent line of work presents concrete evidence that current algorithms for finding trainable networks at initialization fail simple baseline comparisons\, e.g.\, against training random sparse subnetworks. Finding lottery tickets that train to better accuracy than simple baselines remains an open problem. In this work\, we resolve this open problem by discovering Rare Gems: sparse\, trainable networks at initialization that achieve high accuracy even before training. When Rare Gems are trained with SGD\, they achieve accuracy competitive with or better than Iterative Magnitude Pruning (IMP) with warmup.
URL:https://tilos.ai/event/tilos-seminar-rare-gems-finding-lottery-tickets-at-initialization/
LOCATION:Virtual
CATEGORIES:TILOS Seminar Series
ATTACH;FMTTYPE=image/jpeg:https://tilos.ai/wp-content/uploads/2023/10/papailiopoulos-dimitris-1-e1711660394297.jpg
END:VEVENT
END:VCALENDAR