Model Checking Contest @ Petri Nets 2014
Tunis, Tunisia, June 24, 2014
Call for Tools Participation
Last updated: Sept. 1, 2014
March 31, 2014: the call for tool participation was updated (the benchmark changed). See the update instructions at the bottom of this page.

Goals

The Model Checking Contest is a yearly event, jointly held with the Petri Nets conference, that gathers developers of formal verification tools for concurrent systems. The first goal of the Model Checking Contest is to enable the assessment of existing tools on a set of models (i.e., benchmarks) proposed by the scientific community. All tools are compared on the same benchmarks and on the same computing platform, so that a fair comparison can be made; this is not possible with traditional scientific publications, which use different benchmarks executed on different platforms.

The second goal of the Model Checking Contest is to draw conclusions about the respective efficiency of verification techniques for Petri nets (decision diagrams, partial orders, symmetries, etc.) depending on the particular characteristics of the model under analysis. Through this feedback on tool efficiency, we aim to identify which techniques can best tackle a given class of models. Finally, the Model Checking Contest seeks to be a friendly place where developers meet, exchange ideas, and collaborate to enhance their verification tools.

Call for Tools

Important: it is not necessary to have submitted one or more models to the MCC in order to submit a tool.

The Model Checking Contest is organized in successive steps: after the first step (the Call for Models), we launch the Call for Tools and, with it, the contest itself.

The models processed in this edition are divided into two categories: «known» models, built from those used in past editions, and «surprise» models, i.e., the 15 models submitted and selected during the first step of the MCC; the latter will not be published before the tools are submitted.

Procedure to Submit One (or Many) Tool(s)

The procedure to submit tool(s) is explained below step by step.

Step 1: initiate the submission

You first have to download the Tool Submission Kit, which contains all the material and instructions for tool integration. If your tool participates in the formula examinations, you should also download the Formula Kit.

Step 2: prepare the disk-image to be submitted

You have to integrate your tool into the disk image that will be executed in a virtual machine. Please note that you may put several tools (or several versions of your tool, if you want to submit variants) in the same image (see the instructions in the tool submission manual).

To install extra software, you have access to the root account in this VM (see the documentation). Your tool itself should be installed in the mcc home directory (see the documentation). Instructions for performing these operations are provided in the manual located in the Tool Submission Kit.
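As an illustration only, a minimal sketch of the installation inside the VM could look as follows; the archive name, the example dependency, and the paths are hypothetical, and the layout actually expected is the one described in the Tool Submission Kit manual:

# All names below (mytool-1.0.tgz, the python3 dependency) are hypothetical examples.
su -                              # switch to root to install extra system packages
apt-get install -y python3        # example dependency; adapt to your tool and to the image's package manager
exit                              # back to the mcc user
cd /home/mcc                      # the mcc home directory, where the tool itself should live
tar xzf mytool-1.0.tgz            # unpack your tool archive, previously copied into the VM
chmod +x mytool-1.0/bin/mytool    # make sure the entry point is executable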

All the required information concerning formulas is provided in the formula manual located in the Formula Kit. Some changes have been made since last year, but the format remains close to what was used in the past. Please note that, although formulas are provided in both XML and textual formats, a parser is provided for the XML only; the textual form is simply a way for a human to read formulas easily.
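If you just want a quick human-readable look at the XML before plugging it into the provided parser, a command-line pretty-print is enough; the file name used here is hypothetical, the actual names are given in the Formula Kit manual:

# Pretty-print one of the XML formula files (file name is hypothetical; requires libxml2's xmllint).
xmllint --format SomeExamination.xml | less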

Step 3: finalize the submission

You have to upload a zipped version of your disk image. To initiate the process, please create a temporary account here (set Fabrice.Kordon@lip6.fr as the LIP6 contact). You will then receive detailed instructions for uploading.
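For reference, a minimal sketch of the packaging step, assuming the disk image file is called mcc2014.vmdk (both file names below are hypothetical):

# Zip the VM disk image and check the archive before uploading it.
zip mcc2014-mytool.zip mcc2014.vmdk
unzip -l mcc2014-mytool.zip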

Update of the Submission kit (March 31, 2014)

If you downloaded your submission kit before April 1st, 2014, several modifications have since been made to it.

Important: you must either download the Submission Kit again with its new disk image or, if you have already started complex operations, log in as mcc and type the following from the home directory:

# Remove the outdated benchmark inputs.
rm -rf BenchKit/INPUTS/
# Download the updated benchmark archive and extract it in place.
wget http://mcc.lip6.fr/archives/MCC-INPUTS.tgz
tar xzf MCC-INPUTS.tgz
# Remove the downloaded archive once extracted.
rm -f MCC-INPUTS.tgz

Your disk image is now updated.

Deadline

The deadline for submission of tools for the Model Checking Contest is April 30, 2014.