Authors: Mathieu Carrière and Théo Lacombe.
This repository is a work in progress. Comments and feedback appreciated!
A minimal working example is available as a tutorial based on this paper; see `tutorial_0*.ipynb`.
This tutorial provides an introduction to generalized gradient descent for
Topological Data Analysis and showcases its use in the (very simple) case
of total persistence minimization.
This repository was tested on Ubuntu 20.04 and relies on the following libraries:
- `tensorflow 2.2` for automatic differentiation.
- `gudhi` for persistent homology related computations. It was tested with `gudhi 3.4` and `gudhi 3.5` (which has not been released yet as of 09/20/2021).
- `cvxpy` (tested with `1.13`; other versions should work as well) to compute the element of minimal norm on a convex hull of sampled gradients.
- `numpy` (tested with `1.20`; other versions should work as well).
In addition, to run the notebooks provided in the `./tutorials/` folder, one needs (along with a jupyter notebook installation and the aforementioned packages):
- `matplotlib` (tested with `3.4`; other versions should work as well).
Note: the organization is subject to evolution depending on the future additions that will be made to this repository.
The `topt.py` file contains the most important methods of this repository.
In particular, it defines the `TopoMeanModel` class, a natural (tensorflow-like) model that computes
(in an autodiff-compatible way) a loss of the form:

where the $(d_i)_i$ represent the target persistence diagrams.
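To give a flavor of what "autodiff-compatible" means here, the following minimal sketch (not code from this repository; the diagram values are made up) differentiates a total-persistence-style loss through `tensorflow`, treating a persistence diagram as a tensor of (birth, death) pairs:

```python
import tensorflow as tf

# Hypothetical persistence diagram: rows are (birth, death) pairs.
dgm = tf.Variable([[0.0, 1.0],
                   [0.2, 0.7]])

with tf.GradientTape() as tape:
    # Total persistence: sum of lifetimes (death - birth).
    loss = tf.reduce_sum(dgm[:, 1] - dgm[:, 0])

# Gradient of the loss with respect to the diagram entries:
# -1 on each birth coordinate, +1 on each death coordinate.
grad = tape.gradient(loss, dgm)
```

In the actual pipeline, the diagram itself is produced by `gudhi` from an underlying parametrized object, so gradients flow all the way back to that object rather than stopping at the diagram.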
Non-exhaustive; feel free to mention code resources related to optimization with topological descriptors.
- Mathieu Carriere's difftda notebook, related to this paper.
- The code repository for A Topology Layer for machine learning by Rickard Brüel Gabrielsson and co-authors.
- The code repository for PLLay: Efficient Topological Layer based on Persistence Landscape by Kwangho Kim and co-authors.
- TBC
Note: no license yet, all rights reserved (will be updated later).