Wednesday, November 16, 2022 (10:00 AM PST)

  • TILOS Seminar Series: Rare Gems: Finding Lottery Tickets at Initialization, Speaker: Dimitris Papailiopoulos, Associate Professor, University of Wisconsin-Madison (Zoom link)

Abstract: Large neural networks can be pruned to a small fraction of their original size, with little loss in accuracy, by following a time-consuming "train, prune, re-train" approach. Frankle & Carbin (2019) conjectured that this can be avoided by training lottery tickets, i.e., special sparse subnetworks found at initialization that can be trained to high accuracy. However, a subsequent line of work presents concrete evidence that current algorithms for finding trainable networks at initialization fail simple baseline comparisons, e.g., against training random sparse subnetworks. Finding lottery tickets that train to better accuracy than simple baselines remains an open problem. In this work, we resolve this open problem by discovering Rare Gems: sparse, trainable networks at initialization that achieve high accuracy even before training. When Rare Gems are trained with SGD, they achieve accuracy competitive with or better than Iterative Magnitude Pruning (IMP) with warmup.
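The "train, prune, re-train" baseline the abstract mentions rests on magnitude pruning: after training, the smallest-magnitude weights are removed, and the surviving sparse subnetwork is retrained. A minimal sketch of that pruning step (toy data and function names are illustrative assumptions, not from the talk):

```python
# Illustrative sketch of one magnitude-pruning round, the core step of
# the "train, prune, re-train" / IMP baseline. Not the speaker's code.

def magnitude_prune(weights, fraction):
    """Return a binary mask keeping the largest-magnitude weights.

    weights:  flat list of floats (conceptually, a trained layer)
    fraction: fraction of weights to remove (0 <= fraction < 1)
    """
    n_prune = int(len(weights) * fraction)
    # Rank indices by absolute weight value; prune the smallest ones.
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    pruned = set(order[:n_prune])
    return [0 if i in pruned else 1 for i in range(len(weights))]

def apply_mask(weights, mask):
    """Zero out pruned weights. A "lottery ticket" pairs such a mask
    with the network's original initialization rather than its trained
    weights."""
    return [w * m for w, m in zip(weights, mask)]

# Example: prune 50% of a toy "layer".
w = [0.9, -0.1, 0.05, -0.7, 0.3, 0.02]
mask = magnitude_prune(w, 0.5)
print(apply_mask(w, mask))  # smallest-magnitude half is zeroed out
```

In full IMP this round is iterated, with retraining (possibly after a short "warmup" period, as the abstract notes) between pruning rounds.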

Bio: Dimitris Papailiopoulos is the Jay & Cynthia Ihlenfeld Associate Professor of Electrical and Computer Engineering at the University of Wisconsin-Madison, a faculty fellow of the Grainger Institute for Engineering, and a faculty affiliate at the Wisconsin Institute for Discovery. His research interests span machine learning, information theory, and distributed systems, with a current focus on efficient large-scale training algorithms. Before coming to Madison, Dimitris was a postdoctoral researcher at UC Berkeley and a member of the AMPLab. He earned his Ph.D. in ECE from UT Austin under the supervision of Alex Dimakis, and received his ECE Diploma and M.Sc. degree from the Technical University of Crete, in Greece. Dimitris is a recipient of the NSF CAREER Award (2019), Sony Faculty Innovation Awards in three consecutive years (2018, 2019, and 2020), a joint IEEE ComSoc/ITSoc Best Paper Award (2020), an IEEE Signal Processing Society Young Author Best Paper Award (2015), the Vilas Associate Award (2021), the Emil Steiger Distinguished Teaching Award (2021), and the Benjamin Smith Reynolds Award for Excellence in Teaching (2019). In 2018, he co-founded MLSys, a new conference that targets research at the intersection of machine learning and systems.

Wednesday, November 16, 2022 (4:00 PM ET)

  • OPTML++ Seminar: Equiangular lines and eigenvalue multiplicities, Speaker: Yufei Zhao, MIT (Zoom link)

Abstract: Equiangular lines are configurations of lines in n-dimensional space, all passing through the origin, that pairwise make the same angle. We solve the following longstanding problem: fixing an angle, what is the maximum number of equiangular lines in high dimensions pairwise separated by that angle? A central step is a new result in spectral graph theory: the adjacency matrix of a connected bounded-degree graph has sublinear second eigenvalue multiplicity. This result is proved by counting closed walks, with a key new idea that deleting a net of vertices depresses local spectral radii. It remains an intriguing open problem to determine the maximum possible second eigenvalue multiplicity. This talk will discuss these problems and their connections.
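For context, the answer the talk describes can be stated as follows (a sketch of the published result of Jiang, Tidor, Yao, Zhang, and Zhao; the notation here is a standard rendering, not taken from the talk). Write $N_\alpha(n)$ for the maximum number of lines through the origin in $\mathbb{R}^n$ pairwise making angle $\arccos\alpha$, set $\lambda = (1-\alpha)/(2\alpha)$, and let $k = k(\lambda)$ be the smallest number of vertices in a graph with spectral radius exactly $\lambda$ (with $k = \infty$ if no such graph exists). Then, for every fixed $\alpha \in (0,1)$ and all sufficiently large $n$:

```latex
N_\alpha(n) =
\begin{cases}
  \left\lfloor \dfrac{k(n-1)}{k-1} \right\rfloor & \text{if } k < \infty, \\[1ex]
  n + o(n) & \text{if } k = \infty \quad (\text{as } n \to \infty).
\end{cases}
```

The sublinear second eigenvalue multiplicity bound mentioned in the abstract is the ingredient that pins down the exact constant $k/(k-1)$ in the first case.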

Bio: Yufei Zhao is Associate Professor of Mathematics at MIT. His research tackles a broad range of problems in discrete mathematics, including extremal, probabilistic, and additive combinatorics, graph theory, and discrete geometry, as well as applications to computer science. His honors include the SIAM Dénes Kőnig prize (2018), the Sloan Research Fellowship (2019), and the NSF CAREER Award (2021).