An abstract domain for certifying neural networks
OPEN ACCESS
Author / Producer
Singh, Gagandeep; Gehr, Timon; Püschel, Markus; Vechev, Martin
Date
2019-01
Publication Type
Conference Paper
ETH Bibliography
yes
Abstract
We present a novel method for scalable and precise certification of deep neural networks. The key technical insight behind our approach is a new abstract domain which combines floating point polyhedra with intervals and is equipped with abstract transformers specifically tailored to the setting of neural networks. Concretely, we introduce new transformers for affine transforms, the rectified linear unit (ReLU), sigmoid, tanh, and maxpool functions.
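To make the shape of the domain concrete, the following minimal Python sketch (illustrative only, not the authors' DeepPoly implementation; all function names are hypothetical) shows two of the ingredients described above: sound interval propagation through an affine layer, and a DeepPoly-style linear relaxation of ReLU that keeps one linear lower and one linear upper bound per neuron on its input interval [l, u].

    import numpy as np

    def affine_bounds(l, u, W, b):
        # Sound interval bounds for y = W @ x + b when x lies in the box [l, u]:
        # positive weights take the matching bound, negative weights the opposite one.
        W_pos, W_neg = np.maximum(W, 0.0), np.minimum(W, 0.0)
        return W_pos @ l + W_neg @ u + b, W_pos @ u + W_neg @ l + b

    def relu_relaxation(l, u):
        # Per-neuron linear bounds  a_lo*x <= relu(x) <= a_hi*x + c_hi  on [l, u].
        a_lo, a_hi, c_hi = np.zeros_like(l), np.zeros_like(l), np.zeros_like(l)
        pos = l >= 0                        # always active: relu(x) = x exactly
        a_lo[pos] = 1.0
        a_hi[pos] = 1.0                     # (always-inactive neurons stay at 0)
        cross = (l < 0) & (u > 0)           # unstable: needs a convex relaxation
        lam = u[cross] / (u[cross] - l[cross])
        a_hi[cross] = lam                   # chord through (l, 0) and (u, u)
        c_hi[cross] = -lam * l[cross]
        a_lo[cross] = (u[cross] > -l[cross]).astype(l.dtype)  # slope 0 or 1
        return a_lo, a_hi, c_hi

For an unstable neuron the upper bound is the chord through (l, 0) and (u, u), while the lower-bound slope is chosen from {0, 1} so as to minimise the area between the two bounds, the heuristic the paper describes.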
We implemented our method in a system called DeepPoly and evaluated it extensively on a range of datasets, neural architectures (including defended networks), and specifications. Our experimental results indicate that DeepPoly is more precise than prior work while scaling to large networks.
We also show how to combine DeepPoly with a form of abstraction refinement based on trace partitioning. This enables us to prove, for the first time, the robustness of the network when the input image is subjected to complex perturbations such as rotations that employ linear interpolation.
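The refinement can be pictured with a short, hedged sketch (hypothetical helper names; bound_image and verify are assumed callbacks, where verify can be any sound analyzer such as the one sketched above): partition the perturbation parameter, here the rotation angle, into sub-intervals and certify each piece separately.

    import numpy as np

    def certify_partitioned(theta_lo, theta_hi, pieces, bound_image, verify):
        # bound_image(a, b) is assumed to return pixel-wise bounds (l, u)
        # over-approximating every interpolated rotation with angle in [a, b];
        # verify(l, u) returns True when it proves the spec for that input box.
        edges = np.linspace(theta_lo, theta_hi, pieces + 1)
        for a, b in zip(edges[:-1], edges[1:]):
            l, u = bound_image(a, b)
            if not verify(l, u):
                return False   # this piece could not be certified
        return True            # every sub-interval was proved robust

Smaller pieces yield tighter input boxes at the cost of more analyzer runs, which is the precision/cost trade-off the partitioning exposes.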
Publication status
published
Journal / series
Proceedings of the ACM on Programming Languages
Volume
3 (POPL)
Pages / Article No.
41
Publisher
Association for Computing Machinery
Event
46th ACM SIGPLAN Symposium on Principles of Programming Languages (POPL 2019)
Subject
Abstract interpretation; Deep Learning; Adversarial attacks
Organisational unit
03893 - Püschel, Markus
03948 - Vechev, Martin