BEGIN:VCALENDAR
VERSION:2.0
METHOD:PUBLISH
CALSCALE:GREGORIAN
PRODID:-//WordPress - MECv6.5.3//EN
X-ORIGINAL-URL:https://tilos.ai/
X-WR-CALNAME:TILOS
X-WR-CALDESC:The Institute for Learning-enabled Optimization at Scale
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-PUBLISHED-TTL:PT1H
X-MS-OLK-FORCEINSPECTOROPEN:TRUE
BEGIN:VEVENT
CLASS:PUBLIC
DTSTART;TZID=America/Los_Angeles:20220316T100000
DTEND;TZID=America/Los_Angeles:20220316T110000
DTSTAMP:20220228T182900Z
UID:MEC-f187a23c3ee681ef6913f31fd6d6446b@tilos.ai
CREATED:20220228T000000Z
LAST-MODIFIED:20240216T000000Z
PRIORITY:5
TRANSP:OPAQUE
SUMMARY:The Connections Between Discrete Geometric Mechanics, Information Geometry, Accelerated Optimization and Machine Learning
DESCRIPTION:Melvin Leok, Department of Mathematics, University of California, San Diego\nAbstract: Geometric mechanics describes Lagrangian and Hamiltonian mechanics geometrically, and information geometry formulates statistical estimation, inference, and machine learning in terms of geometry. A divergence function is an asymmetric distance between two probability densities that induces differential geometric structures and yields efficient machine learning algorithms that minimize the duality gap. The connection between information geometry and geometric mechanics will yield a unified treatment of machine learning and structure-preserving discretizations. In particular, the divergence function of information geometry can be viewed as a discrete Lagrangian, which is a generating function of a symplectic map that arises in discrete variational mechanics. This identification allows the methods of backward error analysis to be applied, and the symplectic map generated by a divergence function can be associated with the exact time-h flow map of a Hamiltonian system on the space of probability distributions. We will also discuss how time-adaptive Hamiltonian variational integrators can be used to discretize the Bregman Hamiltonian, whose flow generalizes the differential equation that describes the dynamics of the Nesterov accelerated gradient descent method.\n\nMelvin Leok is professor of mathematics and co-director of the CSME graduate program at the University of California, San Diego. His research interests are in computational geometric mechanics, computational geometric control theory, discrete geometry, and structure-preserving numerical schemes, and particularly how these subjects relate to systems with symmetry. He received his Ph.D. in 2004 from the California Institute of Technology in Control and Dynamical Systems under the direction of Jerrold Marsden. 
 He is a three-time NAS Kavli Frontiers of Science Fellow and a Simons Fellow in Mathematics, and has received the DoD Newton Award for Transformative Ideas, the NSF Faculty Early Career Development (CAREER) award, the SciCADE New Talent Prize, the SIAM Student Paper Prize, and the Leslie Fox Prize (second prize) in Numerical Analysis. He has given plenary talks at Foundations of Computational Mathematics, NUMDIFF, and the IFAC Workshop on Lagrangian and Hamiltonian Methods for Nonlinear Control. He serves on the editorial boards of the Journal of Nonlinear Science, the Journal of Geometric Mechanics, and the Journal of Computational Dynamics, and has served on the editorial boards of the SIAM Journal on Control and Optimization and the LMS Journal of Computation and Mathematics.
URL:https://tilos.ai/events/connections-between-discrete-geometric-mechanics-information-geometry-accelerated-optimization-machine-learning/
ORGANIZER;CN=TILOS:MAILTO:
CATEGORIES:TILOS Seminar Series
LOCATION:Virtual
END:VEVENT
END:VCALENDAR