optilog
Write a program in Python that, using the optilog library, finds all solutions of a set of cardinality constraints.
A set of cardinality constraints is a set of inequalities like:
x1 + x2 > 1
x2 + x3 + x4 > 2
x1 + x2 + x3 + x4 < 2
They can be translated into SAT using sequential counters: introduce auxiliary variables s(i,j), meaning "at least j of x1, ..., xi are true", defined by the equivalences:
s(i,0) <-> true
s(0,j) <-> false            (for j >= 1)
s(i,j) <-> s(i-1,j) v (xi & s(i-1,j-1))
The unit clause s(n,k), plus the clauses defining the left-to-right implications of the previous equivalences, enforce x1 + ... + xn >= k. In the last equivalence, for instance, the clauses are:
-s(i,j) v s(i-1,j) v xi
-s(i,j) v s(i-1,j) v s(i-1,j-1)
The constraint x1 + x2 > 1 is interpreted as: the number of variables assigned to true is at least 1 (symmetrically, < k is interpreted as: at most k variables are assigned to true). The minus sign is interpreted as negation. Therefore, x1 - x2 > 2 is interpreted as: x1 and the negation of x2 are both true. Therefore, as an inequality, it is in fact x1 + (not x2) >= 2, which has a unique solution { x1 -x2 }.
The input is a text (in the stdin) with the cardinality constraints, one per line. For instance, the following text in the case of our example:
x1 + x2 > 1
x2 + x3 + x4 > 2
x1 + x2 + x3 + x4 < 2
The output is also a text (in the stdout) where every line contains a variable assignment (variables assigned to 1 (true) as positive, and those assigned to 0 (false) as negative). In this example:
{ -x1 x2 x3 -x4 }
{ -x1 x2 -x3 x4 }
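This input/output behaviour can be sketched with exhaustive enumeration (the intended solution encodes the constraints to CNF and uses an optilog solver instead; all names here are illustrative, and solutions come out in binary order, which need not match the order shown above):

```python
from itertools import product

def parse_constraints(text):
    # Parse lines like "x1 - x2 > 2" into (literals, op, k) triples,
    # collecting variable names in order of appearance.
    constraints, variables = [], []
    for line in text.splitlines():
        tokens = line.split()
        if not tokens:
            continue
        lits, sign, i = [], +1, 0
        while tokens[i] not in ('>', '<'):
            t = tokens[i]
            if t == '+':
                sign = +1
            elif t == '-':
                sign = -1
            else:
                lits.append((sign, t))
                if t not in variables:
                    variables.append(t)
                sign = +1
            i += 1
        constraints.append((lits, tokens[i], int(tokens[i + 1])))
    return constraints, variables

def satisfies(val, lits, op, k):
    # A positive literal counts when its variable is true,
    # a negated one when its variable is false.
    n = sum(val[v] if s > 0 else not val[v] for s, v in lits)
    return n >= k if op == '>' else n <= k

def all_solutions(text):
    constraints, variables = parse_constraints(text)
    for bits in product((False, True), repeat=len(variables)):
        val = dict(zip(variables, bits))
        if all(satisfies(val, *c) for c in constraints):
            yield '{ ' + ' '.join(v if val[v] else '-' + v
                                  for v in variables) + ' }'
```

A full program would print each line of `all_solutions(sys.stdin.read())`.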
Input
x1 + x2 + x3 > 2
x1 + x2 + x3 < 2
Output
{ -x1 x2 x3 }
{ x1 -x2 x3 }
{ x1 x2 -x3 }
Input
x1 + x2 > 1
x2 + x3 + x4 > 2
x1 + x2 + x3 + x4 < 2
Output
{ -x1 x2 x3 -x4 }
{ -x1 x2 -x3 x4 }
Input
x1 - x2 > 2
Output
{ x1 -x2 }
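To find all solutions with a SAT solver, as the optilog-based program would, one repeatedly solves, keeps the model projected onto the original problem variables, and adds a blocking clause forbidding that projection. A sketch with an exhaustive stand-in where the call to an optilog solver would go (names are illustrative):

```python
from itertools import product

def solve(clauses, n):
    # Exhaustive stand-in for one SAT-solver call (an optilog solver
    # would replace this).  Variables are 1..n, literals signed ints.
    # Returns a model as a list of literals, or None if unsatisfiable.
    for bits in product((False, True), repeat=n):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses):
            return [i + 1 if bits[i] else -(i + 1) for i in range(n)]
    return None

def all_models(clauses, n, n_input):
    # Enumerate all models projected onto the first n_input variables
    # (the problem variables; the rest are encoding auxiliaries):
    # after each model found, block it so the next call finds a new one.
    clauses = list(clauses)
    while True:
        m = solve(clauses, n)
        if m is None:
            return
        proj = m[:n_input]
        yield proj
        clauses.append([-l for l in proj])
```

Projecting before blocking is what makes two models that differ only in auxiliary counter variables count as one solution.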