Learning Ultrametric Trees for Optimal Transport Regression
Optimal transport provides a metric that quantifies the dissimilarity between probability measures. For measures supported on a discrete metric space, computing the optimal transport distance takes time cubic in the size of the space. Measures supported on trees, however, admit a closed-form optimal transport distance that can be computed in linear time. In [Chen, Tabaghi & Wang, AAAI 2024], the authors aim to find a tree structure for a given discrete metric space such that the tree-Wasserstein distance approximates the optimal transport distance in the original space. One of their key ideas is to cast the problem in ultrametric spaces and first learn an ultrametric tree. This reduces optimization over the space of ultrametric trees, a mixed discrete and continuous optimization problem, to projected gradient descent over the space of ultrametric matrices. During optimization, the parameters are projected onto the ultrametric space via a hierarchical minimum spanning tree algorithm, which is equivalent to the closest projection onto ultrametrics under the L∞ norm. Experimental results on real datasets show that this approach outperforms previous approaches (e.g., Flowtree) in approximating optimal transport distances. Finally, experiments on synthetic data generated from ground-truth trees show that the algorithm can accurately recover the underlying trees.
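The closed-form, linear-time tree-Wasserstein distance mentioned above can be sketched as follows: each tree edge contributes its weight times the absolute difference of the two measures' masses in the subtree below it. This is a minimal illustrative sketch, not the authors' implementation; the tree encoding (a parent array with edge weights) and all names are assumptions made here for clarity.

```python
# Sketch of the closed-form tree-Wasserstein distance (illustrative,
# not the paper's code). The tree is encoded by a parent array with
# parent[root] = -1, and weight[v] is the weight of edge (v, parent[v]).
# mu and nu are probability masses placed on the nodes. We assume nodes
# are ordered so that parent[v] < v (root first).

def tree_wasserstein(parent, weight, mu, nu):
    n = len(parent)
    sub_mu, sub_nu = list(mu), list(nu)
    # Accumulate subtree masses, processing children before parents.
    for v in range(n - 1, 0, -1):
        sub_mu[parent[v]] += sub_mu[v]
        sub_nu[parent[v]] += sub_nu[v]
    # Each edge contributes weight * |difference of subtree masses|,
    # giving a single O(n) pass over the tree.
    return sum(weight[v] * abs(sub_mu[v] - sub_nu[v]) for v in range(1, n))

# Path 0-1-2 with unit edge weights: moving all mass from node 0
# to node 2 costs 2.
parent = [-1, 0, 1]
weight = [0.0, 1.0, 1.0]
print(tree_wasserstein(parent, weight, [1.0, 0.0, 0.0],
                       [0.0, 0.0, 1.0]))  # -> 2.0
```

The linear running time follows because every node is visited a constant number of times, in contrast to the cubic cost of optimal transport on a general discrete metric space.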
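The projection step can also be sketched. The minimax (bottleneck) path distance, i.e., the maximum edge weight on the path between two points in a minimum spanning tree, is the largest ultrametric lying below a given metric (the subdominant ultrametric); a classical result then shifts it by half the worst-case error to obtain a closest ultrametric in the L∞ norm. The sketch below uses a simple O(n³) Floyd-Warshall-style computation rather than an MST for brevity; all names are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch of L-infinity projection onto ultrametrics
# (not the paper's implementation). d is a symmetric distance matrix
# given as a list of lists with zero diagonal.

def subdominant_ultrametric(d):
    """Largest ultrametric below d: minimax path distance."""
    n = len(d)
    u = [row[:] for row in d]
    # Floyd-Warshall variant where a path's cost is its maximum edge.
    for k in range(n):
        for i in range(n):
            for j in range(n):
                u[i][j] = min(u[i][j], max(u[i][k], u[k][j]))
    return u

def linf_ultrametric_fit(d):
    """Closest ultrametric to d under the L-infinity norm."""
    u = subdominant_ultrametric(d)
    n = len(d)
    eps = max(d[i][j] - u[i][j] for i in range(n) for j in range(n))
    # Adding a constant to all off-diagonal entries preserves
    # ultrametricity, and shifting by eps/2 halves the worst error.
    return [[u[i][j] + eps / 2 if i != j else 0.0 for j in range(n)]
            for i in range(n)]

d = [[0.0, 2.0, 4.0],
     [2.0, 0.0, 3.0],
     [4.0, 3.0, 0.0]]
print(linf_ultrametric_fit(d))  # every entry within 0.5 of d
```

In practice an MST-based (single-linkage) computation brings the projection down to near-quadratic time, which is what makes repeated projection inside gradient descent affordable.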