Authors:
Guy Amir, Haoze Wu, Clark Barrett, Guy Katz
Abstract
Deep learning has emerged as an effective approach for creating modern software systems, with neural networks often surpassing hand-crafted systems. Unfortunately, neural networks are known to suffer from various safety and security issues. Formal verification is a promising avenue for tackling this difficulty, by formally certifying that networks are correct. We propose an SMT-based technique for verifying binarized neural networks — a popular kind of neural network, where some weights have been binarized in order to render the neural network more memory- and energy-efficient, and quicker to evaluate. One novelty of our technique is that it allows the verification of neural networks that include both binarized and non-binarized components. Neural network verification is computationally very difficult, and so we propose here various optimizations, integrated into our SMT procedure as deduction steps, as well as an approach for parallelizing verification queries. We implement our technique as an extension to the Marabou framework, and use it to evaluate the approach on popular binarized neural network architectures.
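To make the verification question concrete, the following is a minimal sketch of how a property of a tiny binarized network can be posed as an SMT query. This is not the paper's Marabou-based procedure: it uses Z3's Python API as a generic stand-in solver, the network sizes and weights are invented for illustration, and the sign activation is encoded naively with If-terms rather than with the deduction-step optimizations the abstract describes. The mixed setting from the paper is mirrored by giving the output layer non-binarized real weights.

```python
# Illustrative sketch only (hypothetical toy network, Z3 instead of Marabou):
# encode a 2-input binarized network and ask whether an output property
# can be violated on an input box. `sat` means a counterexample exists.
from z3 import Real, If, Solver, sat

def sign(x):
    # Binarized activation: +1 if x >= 0, else -1.
    return If(x >= 0, 1, -1)

x0, x1 = Real("x0"), Real("x1")

# Hidden layer: two neurons with binarized (+/-1) weights.
h0 = sign(1 * x0 + -1 * x1)
h1 = sign(-1 * x0 + 1 * x1)

# Output layer: non-binarized (real-weighted) linear combination,
# mirroring the mixed binarized/non-binarized setting.
y = 0.5 * h0 + 0.25 * h1

s = Solver()
# Input box, conjoined with the NEGATION of the property "y > 0 on the box".
s.add(x0 >= 0, x0 <= 1, x1 >= 0, x1 <= 1, y <= 0)

if s.check() == sat:
    print("counterexample:", s.model())   # property violated
else:
    print("unsat: property holds on the box")
```

Each sign activation in such an encoding forces a case split, which is what makes naive encodings expensive; the paper's contribution is to tame this cost inside Marabou via BNN-specific deduction steps and by parallelizing verification queries.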
Publisher
Springer International Publishing
Cited by
26 articles.