neural network robustness verification

2021-01-23 15:13



Several approaches are available. One line of research I have focused on is abstract-interpretation-based methods.

AI2: uses the zonotope abstract domain and leverages the 'join' and 'meet' operators of zonotopes to handle the ReLU activation function. Other activation functions, such as sigmoid and tanh, are not supported.
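To make the zonotope domain concrete: a zonotope is a center vector plus a generator matrix, affine layers transform it exactly, and interval bounds fall out of the generators. The sketch below is a minimal illustration under that definition; the function names and numbers are mine, not from AI2 or any verification tool.

```python
import numpy as np

# A zonotope is Z = { c + G @ eps : eps in [-1, 1]^m },
# with center c and generator matrix G (one column per noise symbol).

def affine(c, G, W, b):
    """Exact affine transformer for a zonotope: W @ Z + b."""
    return W @ c + b, W @ G

def concretize(c, G):
    """Interval bounds of a zonotope: c +/- sum_j |G[:, j]|."""
    r = np.abs(G).sum(axis=1)
    return c - r, c + r

c = np.array([1.0, -1.0])          # center
G = np.array([[0.5, 0.0],          # two noise symbols
              [0.0, 0.5]])
W = np.array([[1.0, 1.0],
              [1.0, -1.0]])
b = np.zeros(2)

c2, G2 = affine(c, G, W, b)
lo, hi = concretize(c2, G2)
print(lo, hi)                      # [-1. 1.] and [1. 3.]
```

Note that the affine step loses no precision at all; the imprecision of zonotope-based analyses comes entirely from the activation transformers.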

DeepZ: also uses the zonotope abstract domain. It handles an activation function by introducing at most one fresh noise symbol and using a parallelogram to over-approximate its behavior; in principle, it sandwiches the activation function between two parallel lines.
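The two-parallel-lines idea for an unstable ReLU neuron (l < 0 < u) can be sketched as follows: with slope lam = u / (u - l), we have lam*x <= relu(x) <= lam*x - lam*l, and the gap is absorbed by one fresh noise symbol. This is only my reconstruction of the transformer described in the DeepZ paper; variable names are not from the ERAN implementation.

```python
import numpy as np

def deepz_relu(c, G, l, u):
    """DeepZ-style ReLU transformer for one zonotope neuron x = c + G @ eps
    with concrete bounds [l, u]. Appends one slot for a fresh noise symbol."""
    if l >= 0:                        # stable positive: ReLU is the identity
        return c, np.append(G, 0.0)
    if u <= 0:                        # stable negative: ReLU outputs 0
        return 0.0, np.append(np.zeros_like(G), 0.0)
    # Unstable: sandwich between y = lam*x and y = lam*x - lam*l.
    lam = u / (u - l)
    mu = -lam * l / 2.0               # shift to the midline between them
    # New noise symbol with coefficient mu covers the vertical gap.
    return lam * c + mu, np.append(lam * G, mu)

# Neuron with x = 0.5 + 1.0*e1 + 0.5*e2, so x in [-1, 2].
c_out, G_out = deepz_relu(0.5, np.array([1.0, 0.5]), -1.0, 2.0)
print(c_out, G_out)
```

In the example, the resulting zonotope concretizes to [-2/3, 2], which soundly contains the true ReLU output range [0, 2].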

DeepPoly: introduces a new abstract domain, [lexpr, uexpr, lbound, ubound]. In theory it can be viewed as an extension of DeepZ: it uses two arbitrary (not necessarily parallel) lines to approximate the activation function. Moreover, each abstract element keeps the abstractions of all preceding layers; this is used to tighten the bounds, nothing more. The writing is quite tricky: although the domain is not exact with respect to the affine transformer, [lexpr, uexpr] is exact, which is why the authors set up an invariant for the soundness proof.
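The "two arbitrary lines" for ReLU can be sketched per neuron: the upper line passes through (l, 0) and (u, u), and the lower line is either y = 0 or y = x, chosen to minimize the area of the enclosed region. This is a simplified, bounds-only sketch; the actual DeepPoly analysis keeps symbolic lexpr/uexpr over the previous layer and back-substitutes through earlier layers to tighten lbound/ubound.

```python
def deeppoly_relu(l, u):
    """DeepPoly-style ReLU abstraction for one neuron with bounds [l, u].
    Returns (ls, lb, us, ub) such that ls*x + lb <= relu(x) <= us*x + ub."""
    if l >= 0:                        # stable positive
        return 1.0, 0.0, 1.0, 0.0
    if u <= 0:                        # stable negative
        return 0.0, 0.0, 0.0, 0.0
    us = u / (u - l)                  # upper line through (l, 0) and (u, u)
    ub = -us * l
    # Lower line: y = x if u > -l else y = 0 (smaller triangle area).
    ls = 1.0 if u > -l else 0.0
    return ls, 0.0, us, ub

print(deeppoly_relu(-1.0, 2.0))       # (1.0, 0.0, 0.666..., 0.666...)
```

Compared with DeepZ, the two lines need not be parallel, so the enclosed region is never larger than DeepZ's parallelogram for the same [l, u].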

StarSet: although it looks like a set-based method, it is equivalent to abstract-interpretation-based techniques. It is more precise than DeepPoly because it allows more than two lines for the approximation. I would expect it to run into scalability issues, but I cannot confirm that yet. In the evaluation it is more effective than DeepPoly, but less efficient. The reasons I can think of are: 1) DeepPoly employs parallelism in ELINA; 2) StarSet is implemented in Matlab.
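Why star sets admit more than two lines: a star is a center plus basis, like a zonotope, but its coefficients are constrained by an arbitrary polyhedral predicate rather than a box, so each ReLU constraint can add half-spaces. Bounds then require solving LPs, which is one plausible source of the efficiency gap. The sketch below uses scipy's LP solver; names and the example are mine, not from the NNV/StarSet implementation.

```python
import numpy as np
from scipy.optimize import linprog

# A star set is { c + V @ a : P @ a <= p }: polyhedral predicate over
# the basis coefficients a, instead of a zonotope's box [-1, 1]^m.

def star_bounds(c, V, P, p):
    """Exact per-dimension bounds of a star set, via two LPs per dimension."""
    lo, hi = np.empty(len(c)), np.empty(len(c))
    for i in range(len(c)):
        r_min = linprog(V[i], A_ub=P, b_ub=p, bounds=(None, None))
        r_max = linprog(-V[i], A_ub=P, b_ub=p, bounds=(None, None))
        lo[i] = c[i] + r_min.fun      # min of V[i] @ a
        hi[i] = c[i] - r_max.fun      # max of V[i] @ a
    return lo, hi

# The unit box as a star: a in [-1, 1]^2, encoded as four half-spaces.
c = np.zeros(2)
V = np.eye(2)
P = np.vstack([np.eye(2), -np.eye(2)])
p = np.ones(4)
print(star_bounds(c, V, P, p))        # ([-1, -1], [1, 1])
```

The cost is visible even here: every bound query is an LP, whereas a zonotope's bounds are a single absolute-value sum.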

RefineZono: not a strong piece of work, in my opinion. It employs SMT techniques to tighten the bounds.

RefinePoly: k-ReLU. I have not read it yet.

 

If we could adaptively manipulate the abstraction, the analysis would be more powerful.


Original post: https://www.cnblogs.com/lijiaying/p/13276972.html

