Neural Persistence: A Complexity Measure for Deep Neural Networks Using Algebraic Topology


Date

2023-05

Publication Type

Conference Paper

ETH Bibliography

yes

Abstract

While many approaches to making neural networks more fathomable have been proposed, they are restricted to interrogating the network with input data. Measures for characterizing and monitoring structural properties, however, have not been developed. In this work, we propose neural persistence, a complexity measure for neural network architectures based on topological data analysis of weighted stratified graphs. To demonstrate the usefulness of our approach, we show that neural persistence reflects best practices developed in the deep learning community, such as dropout and batch normalization. Moreover, we derive a neural persistence-based stopping criterion that shortens the training process while achieving accuracies comparable to early stopping based on validation loss.
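The abstract's core idea, zero-dimensional persistent homology of a layer's weighted bipartite graph, can be sketched in a few lines. The following is an illustrative sketch, not the authors' reference implementation: the function name, the choice p=2, and the max-normalization of absolute weights are assumptions made here for concreteness. Sorting edges by descending normalized weight and merging components with a union-find yields the maximum spanning forest; each merging edge with weight w contributes a persistence value 1 - w, and the layer's neural persistence is taken as the p-norm of these values.

```python
import numpy as np

def neural_persistence(weights, p=2):
    """Illustrative per-layer neural persistence for one dense layer.

    weights : (n_in, n_out) weight matrix of the layer.
    Computes 0-dim persistence of the bipartite graph filtered by
    normalized absolute weights via a union-find (Kruskal-style
    maximum spanning forest), then returns the p-norm of the
    persistence values.
    """
    w = np.abs(np.asarray(weights, dtype=float))
    w = w / w.max()  # normalize absolute weights to [0, 1] (assumption)
    n_in, n_out = w.shape

    # Edges sorted by descending weight: the filtration adds the
    # strongest connections first.
    edges = sorted(
        ((w[i, j], i, n_in + j) for i in range(n_in) for j in range(n_out)),
        reverse=True,
    )

    parent = list(range(n_in + n_out))

    def find(x):
        # Union-find root lookup with path halving.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    persistences = []
    for wt, u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            # Two components merge at filtration value wt; the
            # corresponding persistence tuple (1, wt) contributes 1 - wt.
            persistences.append(1.0 - wt)
    return float(np.linalg.norm(persistences, ord=p))
```

For a layer whose normalized weights are all equal, every merge happens at weight 1 and the measure is 0; layers with widely spread weights score higher, which is what lets the measure track training dynamics and regularization effects as described above.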

Publication status

published

Book title

International Conference on Learning Representations (ICLR 2019)

Volume

8

Pages / Article No.

6215 - 6239

Publisher

Curran

Event

7th International Conference on Learning Representations (ICLR 2019)

Subject

Algebraic topology; Persistent homology; Network complexity; Neural network

Organisational unit

09486 - Borgwardt, Karsten M. (ehemalig) / Borgwardt, Karsten M. (former)
02060 - Dep. Biosysteme / Dep. of Biosystems Science and Eng.
09847 - Moor, Michael / Moor, Michael
