Global optimization of objective functions represented by ReLU networks

“Global optimization of objective functions represented by ReLU networks” by Christopher A. Strong, Haoze Wu, Aleksandar Zeljic, Kyle D. Julian, Guy Katz, Clark Barrett, and Mykel J. Kochenderfer. Machine Learning, Oct. 2021, Springer.


Neural networks can learn complex, non-convex functions, and it is challenging to guarantee their correct behavior in safety-critical contexts. Many approaches exist to find failures in networks (e.g., adversarial examples), but these cannot guarantee the absence of failures. Verification algorithms address this need and provide formal guarantees about a neural network by answering “yes or no” questions. For example, they can answer whether a violation exists within certain bounds. However, individual “yes or no” questions cannot answer quantitative questions such as “what is the largest error within these bounds”; the answers to these lie in the domain of optimization. Therefore, we propose strategies to extend existing verifiers to perform optimization and find: (i) the most extreme failure in a given input region and (ii) the minimum input perturbation required to cause a failure. A naive approach using a bisection search with an off-the-shelf verifier results in many expensive and overlapping calls to the verifier. Instead, we propose an approach that tightly integrates the optimization process into the verification procedure, achieving better runtime performance than the naive approach. We evaluate our approach implemented as an extension of Marabou, a state-of-the-art neural network verifier, and compare its performance with the bisection approach and MIPVerify, an optimization-based verifier. We observe complementary performance between our extension of Marabou and MIPVerify.
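The naive baseline mentioned in the abstract can be sketched as follows. This is an illustrative sketch, not the paper's implementation: `verifier_exceeds` is a hypothetical callback wrapping an off-the-shelf verifier query of the form "does any input in the region drive the objective above threshold t?", and each call to it stands in for one full, expensive verification run.

```python
def bisect_maximum(verifier_exceeds, lo, hi, tol=1e-3):
    """Bisection search for the maximum objective value over an input region.

    verifier_exceeds(t) is a hypothetical wrapper around a verifier:
    it returns True iff some input in the region makes the objective
    exceed threshold t. Each iteration issues one verifier call, which
    is why this naive scheme is expensive in practice.
    """
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if verifier_exceeds(mid):
            lo = mid   # a violation above mid exists: the maximum is above mid
        else:
            hi = mid   # no input exceeds mid: the maximum is at most mid
    return lo, hi      # bracket of width <= tol containing the maximum

# Toy stand-in for a verifier; pretend the true maximum is 0.7320.
true_max = 0.7320
lo, hi = bisect_maximum(lambda t: true_max > t, 0.0, 1.0)
```

Each bisection step repeats much of the previous step's search work inside the verifier, which is the overlap the paper's integrated approach avoids.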

Keywords: Neural network verification; Optimization; Adversarial examples; Marabou

BibTeX entry:

@article{strong2021global,
   author = {Christopher A. Strong and Haoze Wu and Aleksandar Zelji{\'c}
	and Kyle D. Julian and Guy Katz and Clark Barrett and Mykel J.
	Kochenderfer},
   title = {Global optimization of objective functions represented by
	{ReLU} networks},
   journal = {Machine Learning},
   publisher = {Springer},
   month = oct,
   year = {2021},
   doi = {10.1007/s10994-021-06050-2},
   url = {}
}

(This webpage was created with bibtex2web.)