Reluplex

Produces a satisfiability result for a formula.

An SMT solver for the theory of linear real arithmetic with ReLU constraints. ReLUs (Rectified Linear Units) are a specific kind of activation function used in deep neural networks (DNNs).
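
A minimal, tool-agnostic sketch of what a ReLU is and why it keeps DNN constraints piecewise linear (the code below is plain Python for illustration, not any solver's API):

```python
def relu(x: float) -> float:
    """Rectified Linear Unit: identity for positive inputs, zero otherwise."""
    return max(0.0, x)

# Each constraint y = relu(x) is a disjunction of two linear cases, so a
# solver for linear real arithmetic can handle it once the phase is fixed:
#   active phase:   x >= 0  and  y == x
#   inactive phase: x <= 0  and  y == 0
print(relu(-1.5), relu(2.0))  # -> 0.0 2.0
```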

Application domain/field

Verification of deep neural networks (DNNs)

Type of tool

SMT solver

Expected output

SAT, UNSAT, or TIMEOUT. Properties are checked by asserting their negation, so UNSAT indicates that a property holds, while SAT indicates that it does not hold (the satisfying assignment is a concrete counterexample).
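
For concreteness, here is how a caller would typically read those verdicts; `interpret` is an illustrative helper, not part of the tool:

```python
def interpret(result: str) -> str:
    """Map a solver verdict to its meaning for the checked property."""
    return {
        "UNSAT": "property holds (no violating assignment exists)",
        "SAT": "property violated (the model is a concrete counterexample)",
        "TIMEOUT": "inconclusive",
    }[result]

print(interpret("UNSAT"))
```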

Internals

Verifying DNNs is difficult because the problem is beyond the reach of general-purpose tools such as linear programming and SMT solvers. The authors therefore develop a specialized SMT solver for DNNs, focusing on networks whose activation function is the Rectified Linear Unit (ReLU). The tool extends the simplex algorithm to treat ReLU constraints natively: rather than eagerly encoding every ReLU as a disjunction of its two linear phases, it allows ReLU constraints to be temporarily violated during the search and case-splits on a ReLU only when necessary.
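
To make the blowup concrete, here is a hedged sketch of the eager phase-enumeration baseline that Reluplex improves on, shown for a toy one-ReLU query and using `scipy.optimize.linprog` as an off-the-shelf LP solver; this is not Reluplex's algorithm or API, and the encoding is purely illustrative:

```python
# Eager baseline: every y = relu(x) constraint splits into two linear
# phases, so n ReLUs yield up to 2**n linear programs. Reluplex avoids
# this by splitting lazily, only when a violated ReLU cannot be fixed.
from scipy.optimize import linprog

# Toy query over variables [x, y]: x in [0, 1], y = relu(2*x - 1),
# is y >= 0.8 satisfiable?
def phase_feasible(active: bool) -> bool:
    if active:
        # active phase: 2x - 1 >= 0 and y == 2x - 1, plus query y >= 0.8
        a_eq, b_eq = [[2.0, -1.0]], [1.0]               # 2x - y = 1
        a_ub, b_ub = [[-2.0, 0.0], [0.0, -1.0]], [-1.0, -0.8]
    else:
        # inactive phase: 2x - 1 <= 0 and y == 0, plus query y >= 0.8
        a_eq, b_eq = [[0.0, 1.0]], [0.0]                # y = 0
        a_ub, b_ub = [[2.0, 0.0], [0.0, -1.0]], [1.0, -0.8]
    res = linprog(c=[0.0, 0.0], A_ub=a_ub, b_ub=b_ub, A_eq=a_eq,
                  b_eq=b_eq, bounds=[(0.0, 1.0), (None, None)])
    return res.status == 0  # 0: a feasible point exists for this phase

# Eager enumeration: the query is SAT iff some phase assignment is feasible.
print("SAT" if any(phase_feasible(p) for p in (True, False)) else "UNSAT")
```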

Comments

Note: the Reluplex repository is no longer maintained; the algorithm is now implemented in Marabou.
Keywords: DNN, neural network, SMT

Links

Artifact of CAV '17 paper: https://github.com/guykatzz/ReluplexCav2017

Last commit date

8 July 2020

Related papers

https://doi.org/10.1007/978-3-319-63387-9_5 (CAV '17)

Last publication date

13 July 2017

Related tools

Marabou (the successor implementation of the Reluplex algorithm)

ProVerB is a part of SLEBoK. Last updated: July 2022.