PrIC3 checks if a given Markov decision process is safe

Application domain/field

Symbolic model checking of Markov decision processes (MDPs).

Type of tool

Probabilistic model checker

Expected input

An MDP with an initial state s_I and a set of bad states B, a probability threshold λ, and a heuristic (oracle).

Format:

Expected output

true or false. If the tool returns false, it provides a possible counterexample. There are two possible reasons for this: either the MDP is indeed unsafe, or the supplied heuristic was inappropriate. The counterexample consists of a subsystem of states of the MDP witnessing Pr^max(s_I ⊨ ◊ B) > λ.
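
To make the checked property concrete, the sketch below (plain Python, not PrIC3's input format or algorithm) computes Pr^max(s_I ⊨ ◊ B), the maximal probability of eventually reaching the bad states B, for a small hypothetical MDP via value iteration and compares it against a threshold λ. All state names, probabilities, and the threshold are invented for illustration.

```python
# Illustrative only: a toy MDP and a fixed-point computation of
# Pr^max(s_I |= <>B). This is NOT PrIC3's input format or algorithm;
# it merely makes the checked property concrete.

# MDP: state -> action -> list of (successor, probability) pairs.
mdp = {
    "s0": {"a": [("s1", 0.5), ("s2", 0.5)], "b": [("s2", 1.0)]},
    "s1": {"a": [("bad", 0.1), ("s0", 0.9)]},
    "s2": {"a": [("s2", 1.0)]},      # safe sink
    "bad": {"a": [("bad", 1.0)]},    # bad sink
}
initial_state = "s0"
bad_states = {"bad"}
lam = 0.3  # threshold lambda

def pr_max_reach(mdp, bad, iterations=1000):
    """Value iteration for maximal reachability probabilities."""
    value = {s: (1.0 if s in bad else 0.0) for s in mdp}
    for _ in range(iterations):
        value = {
            s: 1.0 if s in bad else max(
                sum(p * value[t] for t, p in succs)
                for succs in actions.values()
            )
            for s, actions in mdp.items()
        }
    return value

pr = pr_max_reach(mdp, bad_states)[initial_state]
print(f"Pr^max(s_I |= <>B) ~= {pr:.4f}; safe = {pr <= lam}")
```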

Internals

An extension of IC3 to quantitative reachability in MDPs. It reuses efficient data structures from the probabilistic model checker Storm and relies on the SMT solver Z3.
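
Roughly speaking, an IC3-style quantitative analysis maintains frames that assign probability bounds to states and asks whether a candidate frame stays above one application of the MDP's Bellman operator. The z3py snippet below is only a loose, hypothetical illustration of how a single such check could be phrased as an SMT query; it is not PrIC3's actual encoding, and the states, probabilities, and bounds are made up.

```python
# Loose illustration of the kind of SMT query an IC3-style quantitative
# analysis might pose: does a candidate frame F (per-state probability
# bounds) stay above one Bellman step? Not PrIC3's actual encoding;
# all numbers and names are hypothetical.
from z3 import Real, Solver, And, sat

f_s0, f_s1 = Real("f_s0"), Real("f_s1")

# Candidate frame: probability bounds we want to prove consistent.
candidate = And(f_s0 == 0.2, f_s1 == 0.5)

# One Bellman step for a toy MDP fragment: from s0, some action reaches
# s1 with probability 0.4 and a bad sink with probability 0.1, so local
# inductiveness requires F(s0) >= 0.4*F(s1) + 0.1*1.
violated = f_s0 < 0.4 * f_s1 + 0.1

s = Solver()
s.add(candidate, violated)
if s.check() == sat:
    print("frame not inductive here; refine bounds:", s.model())
else:
    print("frame locally inductive for this state/action")
```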

Comments

It is called a 'prototypical implementation' in the CAV '20 paper.

Links

Repository: https://github.com/moves-rwth/PrIC3

Last commit date

30 September 2020

Related papers

https://doi.org/10.1007/978-3-030-53291-8_27

Last publication date

14 July 2020

ProVerB is a part of SLEBoK. Last updated: July 2022.