Neural Persistence: A Complexity Measure for Deep Neural Networks Using Algebraic Topology
OPEN ACCESS
Date
2023-05
Publication Type
Conference Paper
ETH Bibliography
yes
Abstract
While many approaches to making neural networks more fathomable have been proposed, they are restricted to interrogating the network with input data. Measures for characterizing and monitoring structural properties, however, have not been developed. In this work, we propose neural persistence, a complexity measure for neural network architectures based on topological data analysis of weighted stratified graphs. To demonstrate the usefulness of our approach, we show that neural persistence reflects best practices developed in the deep learning community, such as dropout and batch normalization. Moreover, we derive a neural persistence-based stopping criterion that shortens the training process while achieving accuracies comparable to early stopping based on validation loss.
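The per-layer computation sketched in the abstract can be illustrated with a short example. The sketch below is an assumption-laden illustration, not the authors' reference implementation: it normalizes a layer's absolute weights to [0, 1], processes edges of the bipartite layer graph in descending weight order, and tracks zero-dimensional persistent homology with a union-find structure; the function name `neural_persistence` and the convention that every component is born at the maximal filtration value 1 are choices of this sketch.

```python
# Hedged sketch: neural persistence of one fully connected layer via
# 0-dimensional persistent homology of a descending-weight filtration.
import numpy as np

def neural_persistence(W, p=2):
    """Approximate the neural persistence of a weight matrix W (n_in, n_out).

    Returns the p-norm of the 0-dim persistence values of the bipartite
    layer graph, with weights normalized by the maximal absolute weight.
    """
    n_in, n_out = W.shape
    w = np.abs(W) / np.abs(W).max()  # transform weights into [0, 1]

    # Edges sorted by descending transformed weight = filtration order.
    edges = sorted(
        ((w[i, j], i, n_in + j) for i in range(n_in) for j in range(n_out)),
        reverse=True,
    )

    # Union-find over the n_in + n_out vertices of the bipartite graph.
    parent = list(range(n_in + n_out))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    persistences = []
    for wt, u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            # Each component is born at filtration value 1 (the maximal
            # weight); a merge at weight wt destroys one component, so
            # its persistence is 1 - wt.
            persistences.append(1.0 - wt)
    return float(np.linalg.norm(persistences, ord=p))
```

On a toy 2x2 layer, high-weight edges merge components early and contribute little persistence, while weaker edges that still connect components contribute more, so the measure responds to the distribution of large weights, consistent with the abstract's claim that it tracks structural properties rather than input-dependent behavior.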
Publication status
published
Book title
International Conference on Learning Representations (ICLR 2019)
Volume
8
Pages / Article No.
6215 - 6239
Publisher
Curran
Event
7th International Conference on Learning Representations (ICLR 2019)
Subject
Algebraic topology; Persistent homology; Network complexity; Neural network
Organisational unit
09486 - Borgwardt, Karsten M. (former)
02060 - Dep. of Biosystems Science and Engineering
09847 - Moor, Michael
Related publications and datasets
Is new version of: https://openreview.net/forum?id=ByxkijC5FQ
Is new version of: 10.48550/arXiv.1812.09764