Nonlinear Feature Learning in Neural Networks

Learning nonlinear features from data is thought to be one of the fundamental reasons for the success of deep neural networks. This has been observed across a wide range of domains, including computer vision and natural language processing. Among the many theoretical approaches to studying neural networks, much work has […]


Deep Learning with Symmetries

Sample Complexity Gain of Invariances

In practice, encoding invariances into models improves sample complexity. In [Tahmasebi & Jegelka, NeurIPS 2023], we study this phenomenon from a theoretical perspective. In particular, we provide minimax optimal rates for kernel ridge regression on compact manifolds, with a target function that is invariant to a […]
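As a toy illustration of the general principle (an assumption for demonstration, not the construction in the paper): one simple way to encode an invariance into kernel ridge regression is to average a base kernel over a finite group, which makes the predictor invariant by construction. Here the group is cyclic coordinate shifts; the base kernel, data, and target are all made up for the demo.

```python
import numpy as np

# Kernel ridge regression with a group-averaged kernel (illustrative sketch).
# Averaging a base kernel k over a finite group G,
#   k_G(x, x') = (1/|G|) * sum_{g in G} k(x, g.x'),
# yields a predictor that is invariant to G by construction.

def rbf(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def shift_averaged_kernel(X, Y, gamma=1.0):
    # Average the base kernel over all cyclic shifts of the second argument.
    d = Y.shape[1]
    return np.mean([rbf(X, np.roll(Y, s, axis=1), gamma) for s in range(d)], axis=0)

def krr_fit(X, y, kernel, lam=1e-3):
    # Standard kernel ridge regression: alpha = (K + lam*I)^{-1} y.
    K = kernel(X, X)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return lambda Xq: kernel(Xq, X) @ alpha

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
y = np.sin(X.sum(axis=1))          # target is itself shift-invariant
f = krr_fit(X, y, shift_averaged_kernel)

x = rng.normal(size=(1, 4))
print(np.allclose(f(x), f(np.roll(x, 1, axis=1))))  # invariant predictor
```

Because the averaged kernel cannot distinguish a point from its shifted copies, the hypothesis space shrinks to shift-invariant functions, which is the mechanism behind the sample-complexity gain.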


Graph Representation Learning

Graph neural networks (GNNs) have been a very successful architecture in many domains. With [Barzilay et al., Nature Reviews Methods Primers 2024] we wrote an introductory survey on GNNs in a high-profile journal.

Scalability

Many graph algorithms, including some for graph representation learning, are expensive to scale to large graphs. In [Le […]
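A minimal message-passing sketch (an illustrative assumption, not code from the survey): in one GNN layer, each node averages its neighbors' features, combines them with its own, and applies a shared linear map and nonlinearity.

```python
import numpy as np

# One round of message passing with mean aggregation (illustrative sketch).

def message_passing_layer(A, H, W):
    # A: (n, n) adjacency matrix, H: (n, d) node features, W: (d, k) weights.
    deg = np.clip(A.sum(axis=1, keepdims=True), 1, None)  # avoid divide-by-zero
    agg = (A @ H) / deg                                   # mean over neighbors
    return np.maximum(0.0, (H + agg) @ W)                 # combine + ReLU

# Path graph 0-1-2 with one-hot node features.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
H = np.eye(3)
W = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
Z = message_passing_layer(A, H, W)  # (3, 2) updated node embeddings
```

Because the layer touches the graph only through A and shares W across nodes, relabeling the nodes simply permutes the output rows (permutation equivariance), which is the key inductive bias of GNNs.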


Differentiable Extensions with Rounding Guarantees for Combinatorial Optimization over Permutations and Trees

Continuously extending combinatorial optimization objectives is a powerful technique commonly applied to the optimization of set functions. However, few such methods exist for extending functions on permutations, despite the fact that many combinatorial optimization problems, such as the traveling salesperson problem (TSP), are […]
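For background, the classical example of continuously extending a set function is the Lovász extension, which agrees with the set function on indicator vectors and is convex exactly when the function is submodular. The sketch below (with a graph-cut function chosen purely for illustration, not taken from the paper) evaluates it by sorting coordinates and telescoping over prefix sets.

```python
import numpy as np

# Lovász extension of a set function F: 2^V -> R, evaluated at x in [0,1]^n.
# Sort coordinates in decreasing order and telescope F over the prefix sets.

def lovasz_extension(F, x):
    order = np.argsort(-x)          # indices of x in decreasing order
    total, prev = 0.0, F(frozenset())
    S = set()
    for i in order:
        S.add(i)
        cur = F(frozenset(S))
        total += x[i] * (cur - prev)
        prev = cur
    return total

# Example set function: cut value in the path graph 0-1-2.
edges = [(0, 1), (1, 2)]
def cut(S):
    return sum((u in S) != (v in S) for u, v in edges)

# On indicator vectors, the extension matches the set function exactly.
print(lovasz_extension(cut, np.array([1.0, 0.0, 1.0])))  # cut({0, 2}) = 2
```

The extension is piecewise linear in x, so it can be plugged into continuous (sub)gradient-based optimizers; the difficulty the excerpt points to is that no equally standard construction exists for functions on permutations.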
