Efficient Neural Network Analysis with Sum-of-Infeasibilities

“Efficient Neural Network Analysis with Sum-of-Infeasibilities” by Haoze Wu, Aleksandar Zeljić, Guy Katz, and Clark Barrett. In Proceedings of the 28th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS '22) (Dana Fisman and Grigore Rosu, eds.), Apr. 2022, pp. 143-163.

Abstract

Inspired by sum-of-infeasibilities methods in convex optimization, we propose a novel procedure for analyzing verification queries on neural networks with piecewise-linear activation functions. Given a convex relaxation which over-approximates the non-convex activation functions, we encode the violations of activation functions as a cost function and optimize it with respect to the convex relaxation. The cost function, referred to as the Sum-of-Infeasibilities (SoI), is designed so that its minimum is zero and achieved only if all the activation functions are satisfied. We propose a stochastic procedure, DeepSoI, to efficiently minimize the SoI. An extension to a canonical case-analysis-based complete search procedure can be achieved by replacing the convex procedure executed at each search state with DeepSoI. Extending the complete search with DeepSoI achieves multiple simultaneous goals: 1) it guides the search towards a counter-example; 2) it enables more informed branching decisions; and 3) it creates additional opportunities for bound derivation. An extensive evaluation across different benchmarks and solvers demonstrates the benefit of the proposed techniques. In particular, we demonstrate that SoI significantly improves the performance of an existing complete search procedure. Moreover, the SoI-based implementation outperforms other state-of-the-art complete verifiers. We also show that our technique can efficiently improve upon the perturbation bound derived by a recent adversarial attack algorithm.
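
To make the construction concrete, the sketch below shows one way an SoI-style cost could be computed for ReLU neurons under the standard triangle relaxation, in which each post-activation value f satisfies f >= 0 and f >= b for its pre-activation b. This is an illustrative sketch only, not the authors' implementation; the function name soi_cost, the variable names b and f, and the per-neuron phase labels are all assumptions made for the example.

# Illustrative sketch only (not the authors' DeepSoI implementation).
# Each ReLU neuron i has a pre-activation value b[i] and a post-activation
# value f[i] taken from a solution of the convex (triangle) relaxation,
# which guarantees f[i] >= 0 and f[i] >= b[i].  A per-neuron phase choice
# determines its cost term: "active" contributes f[i] - b[i], "inactive"
# contributes f[i].  Each term is non-negative on the relaxation and is
# zero exactly when the neuron satisfies the ReLU constraint in that phase,
# so the sum is zero only if every chosen phase is simultaneously realized.

def soi_cost(b, f, phase):
    """Sum-of-Infeasibilities of a relaxation solution for ReLU neurons."""
    return sum((fi - bi) if ph == "active" else fi
               for bi, fi, ph in zip(b, f, phase))

# Example: b = -1.0, f = 0.5 is feasible for the relaxation but violates
# the exact ReLU (which would force f = 0), so the cost is positive.
print(soi_cost([-1.0], [0.5], ["inactive"]))  # 0.5 -> violation
print(soi_cost([-1.0], [0.0], ["inactive"]))  # 0.0 -> ReLU satisfied

Per the abstract, DeepSoI minimizes a cost of this kind stochastically over the convex relaxation; that search loop, and the interaction with the complete case-analysis procedure, are not shown in this sketch.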

BibTeX entry:

@inproceedings{WZK+22,
   author = {Haoze Wu and Aleksandar Zelji{\'c} and Guy Katz and Clark
	Barrett},
   editor = {Dana Fisman and Grigore Rosu},
   title = {Efficient Neural Network Analysis with Sum-of-Infeasibilities},
   booktitle = {Proceedings of the 28th International Conference
	on Tools and Algorithms for the Construction and Analysis of
	Systems (TACAS '22)},
   series = {Lecture Notes in Computer Science},
   volume = {13243},
   pages = {143--163},
   publisher = {Springer},
   month = apr,
   year = {2022},
   doi = {10.1007/978-3-030-99524-9_24},
   url = {http://theory.stanford.edu/~barrett/pubs/WZK+22.pdf}
}
