

Poster

Differentiability and Convergence of Filtration Learning with Multiparameter Persistence

Luis Scoccola · Siddharth Setlur · David Loiseaux · Mathieu Carrière · Steve Oudot


Abstract:

Filtration learning, that is, learning a function on a geometric object (such as node attributes on a graph), has been shown to benefit from incorporating descriptors from topological data analysis. When learning a real-valued function (the one-parameter setting), there is a canonical choice of topological descriptor: the barcode. The operation mapping a one-parameter filtration to its barcode is differentiable almost everywhere, and the convergence of gradient descent for losses based on barcodes is relatively well understood. When learning a vector-valued function (the multiparameter setting), there is no canonical choice of topological descriptor, and many distinct descriptors have been proposed. This calls for a general framework for differentiability and gradient-descent-based optimization that applies to a wide range of multiparameter topological descriptors. In this article, we develop such a framework and show that it encompasses well-known descriptors of different flavors, such as signed barcodes and the multiparameter persistence landscape. We complement the theory with numerical experiments supporting the idea that multiparameter filtration learning can lead to improved performance compared to its one-parameter counterpart, even when using the simplest and most efficiently computable multiparameter descriptors.
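To make the one-parameter baseline concrete, the following is a minimal sketch (not the authors' code) of filtration learning with a barcode-based loss, assuming PyTorch and the gudhi library are available; the path graph, the total-persistence loss, and all names are illustrative assumptions. Each endpoint of a 0-dimensional bar of a lower-star filtration equals the function value at some vertex, so bar endpoints can be re-read from the differentiable parameter tensor after the (non-differentiable) combinatorial pairing; this gather is exactly the almost-everywhere differentiability the abstract refers to.

import torch
import gudhi

def lower_star_pairs(f, edges):
    # 0-dim persistence pairs of the lower-star filtration of f on a graph.
    # Returns vertex index pairs (birth_vertex, death_vertex) so endpoints
    # can be read back from the differentiable tensor f.
    st = gudhi.SimplexTree()
    vals = f.detach().numpy()
    for v in range(len(vals)):
        st.insert([v], filtration=vals[v])
    for (u, v) in edges:
        st.insert([u, v], filtration=max(vals[u], vals[v]))
    st.compute_persistence()
    pairs = []
    for birth_simplex, death_simplex in st.persistence_pairs():
        if len(birth_simplex) == 1 and len(death_simplex) == 2:  # finite 0-dim bars
            u, v = death_simplex
            death_vertex = u if vals[u] >= vals[v] else v  # argmax vertex of the edge
            pairs.append((birth_simplex[0], death_vertex))
    return pairs

# Toy filtration learning: node values on a path graph, optimized so that
# total persistence (sum of finite bar lengths) decreases.
edges = [(i, i + 1) for i in range(9)]
f = torch.randn(10, requires_grad=True)
opt = torch.optim.SGD([f], lr=0.1)
for step in range(100):
    opt.zero_grad()
    pairs = lower_star_pairs(f, edges)  # combinatorics: non-differentiable
    # Bar endpoints gathered from f: differentiable almost everywhere.
    loss = sum(f[d] - f[b] for b, d in pairs) if pairs else (f * 0).sum()
    loss.backward()
    opt.step()

In the multiparameter setting studied in the paper, the scalar f becomes vector-valued and the barcode is replaced by a descriptor such as a signed barcode or the multiparameter persistence landscape; the paper's framework addresses differentiability and convergence for that generalization.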
