A recent trend in CP that I have noticed is the decomposition of global constraints.
See for instance the papers:
· Decomposition of the NValue Constraint. Christian Bessière, George Katsirelos, Nina Narodytska, Claude-Guy Quimper, and Toby Walsh. In Proceedings of the 16th International Conference on Principles and Practice of Constraint Programming (CP 10), pages 114-128, 2010.
· Decompositions of All-Different, Global Cardinality and Related Constraints. Christian Bessière, George Katsirelos, Nina Narodytska, Claude-Guy Quimper, and Toby Walsh. In Proceedings of the 21st International Joint Conference on Artificial Intelligence (IJCAI 09), pages 419-424, 2009.
· Decomposing Global Grammar Constraints. Claude-Guy Quimper and Toby Walsh. In Proceedings of the 13th International Conference on Principles and Practice of Constraint Programming (CP 07), pages 590-604, 2007.
You can find these papers on the personal page of Claude-Guy Quimper.
I find these decompositions very useful since they can avoid the error-prone task of implementing difficult filtering algorithms for global constraints.
We will attempt to decompose another global constraint: the sortedness constraint (I am not aware of an existing decomposition of this one).
The paper describing a bound-consistent filtering algorithm for this constraint is:
Faster Algorithms for Bound-Consistency of the Sortedness and the Alldifferent Constraint. Kurt Mehlhorn and Sven Thiel. In Proceedings of the 6th International Conference on Principles and Practice of Constraint Programming (CP 00), 2000.
If you don’t know what the sortedness constraint is about, here is a quick reminder:
sortedness(x[], s[], p[]) takes three arrays of finite-domain variables as arguments and enforces that s is a sorted version of x and that p represents the permutation realizing the sort (s[i] == x[p[i]] for every i).
For instance, sortedness(x=[5,2,4,1,3], s=[1,2,3,4,5], p=[4,2,5,3,1]) is a valid instantiation.
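A quick way to convince yourself, in plain Python (the 1-based p of the example is shifted to Python's 0-based indexing):

x = [5, 2, 4, 1, 3]
s = [1, 2, 3, 4, 5]
p = [4, 2, 5, 3, 1]
assert s == sorted(x)                                   # s is the sorted version of x
assert all(s[i] == x[p[i] - 1] for i in range(len(x)))  # s[i] == x[p[i]] (1-based p)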
If you don’t have the implementation of sortedness from the paper above at hand, here is the straightforward model you would usually post:
forall(i in 1..n) {
post(s[i] == x[p[i]]) # element constraints linking x, s and p
}
forall(i in 1..n-1) {
post(x[p[i]] <= x[p[i+1]])
post(s[i] <= s[i+1]) # s is sorted
}
post(alldifferent(p)) # p is a permutation
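If you want to try this model, here is a minimal runnable sketch of the same decomposition using Google OR-Tools CP-SAT (my choice of solver, not the system the pseudocode above is written for; CP-SAT is 0-based, and the x[p[i]] <= x[p[i+1]] constraints are omitted since they are implied by the element and ordering constraints):

from ortools.sat.python import cp_model

n = 5
model = cp_model.CpModel()
x = [model.NewIntVar(1, n, f'x{i}') for i in range(n)]
s = [model.NewIntVar(1, n, f's{i}') for i in range(n)]
p = [model.NewIntVar(0, n - 1, f'p{i}') for i in range(n)]  # 0-based permutation

for i in range(n):
    model.AddElement(p[i], x, s[i])  # s[i] == x[p[i]]
for i in range(n - 1):
    model.Add(s[i] <= s[i + 1])  # s is sorted
model.AddAllDifferent(p)  # p is a permutation

for xi, v in zip(x, [5, 2, 4, 1, 3]):  # fix x to the running example
    model.Add(xi == v)
solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    print([solver.Value(v) for v in s])      # [1, 2, 3, 4, 5]
    print([solver.Value(v) + 1 for v in p])  # [4, 2, 5, 3, 1] in 1-based terms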
Unfortunately the above model does not enforce bound consistency.
We will add redundant constraints to strengthen the decomposition and, hopefully, enforce bound consistency. The redundant constraints are based on the counting sort algorithm. Looking at our example sortedness(x=[5,2,4,1,3], s=[1,2,3,4,5], p=[4,2,5,3,1]), the idea is the following:
How many values in x can be strictly smaller than s[i] (assuming indices start at 1)? The answer is at most i-1; otherwise s[i] would appear at a greater index in the sorted array. This counting idea is the basis of the linear-time counting sort algorithm.
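To make the idea concrete, here is a small plain-Python sketch (the nb_before helper is my own, purely for illustration): for each value v it counts how many entries of x are strictly smaller than v, and the final loop checks the at-most-i-1 property on our example.

def nb_before(xs, minval, maxval):
    # occ[v - minval] = number of occurrences of value v in xs
    occ = [0] * (maxval - minval + 1)
    for v in xs:
        occ[v - minval] += 1
    # before[v - minval] = number of entries of xs strictly smaller than v (prefix sums of occ)
    before = [0] * (maxval - minval + 1)
    for v in range(minval + 1, maxval + 1):
        before[v - minval] = before[v - 1 - minval] + occ[v - 1 - minval]
    return before

x = [5, 2, 4, 1, 3]
before = nb_before(x, 1, 5)
for i, v in enumerate(sorted(x), start=1):  # i is the 1-based index in the sorted array
    assert before[v - 1] < i                # strictly fewer than i values below s[i]

Here is how this idea is translated in CP terms: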
minval = min(i in 1..n) x[i].min()
maxval = max(i in 1..n) x[i].max()
# an array of variables occ with domains {0,...,n} representing the number of occurrences of each value
var occ [minval..maxval](0..n)
# gcc to count the number of occurrences of each value in the array x
post(gcc(x,occ))
# same for the sorted ones, not sure it is useful
post(gcc(s,occ))
# nbBefore[v] = #{i | x[i] < v}, the number of entries of x strictly smaller than v
var nbBefore [minval..maxval](0..n)
# obviously no values are smaller than minval
post(nbBefore[minval] == 0)
forall(i in minval+1..maxval) {
post(nbBefore[i] == sum(j in minval..i-1) occ[j])
}
forall(i in 1..n) {
# there are strictly fewer than i values smaller than s[i]; note that nbBefore[s[i]] is an element constraint since the index s[i] is a variable
post(nbBefore[s[i]] < i)
}
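For completeness, here is a hedged CP-SAT sketch of these redundant constraints, again my own translation and not the author's code (assuming a recent OR-Tools version): CP-SAT has no gcc constraint, so occ is emulated with sums of boolean indicators, and nbBefore[s[i]] becomes an explicit element constraint.

from ortools.sat.python import cp_model

def add_counting_redundancy(model, x, s, minval, maxval):
    n = len(x)
    # occ[v] = number of occurrences of value v in x (gcc emulated with boolean indicators)
    occ = {}
    for v in range(minval, maxval + 1):
        indicators = []
        for i, xi in enumerate(x):
            b = model.NewBoolVar(f'x{i}_is_{v}')
            model.Add(xi == v).OnlyEnforceIf(b)
            model.Add(xi != v).OnlyEnforceIf(b.Not())
            indicators.append(b)
        occ[v] = model.NewIntVar(0, n, f'occ_{v}')
        model.Add(occ[v] == sum(indicators))
    # nbBefore[v] = number of entries of x strictly smaller than v, as prefix sums
    nb_before = {minval: model.NewConstant(0)}  # no value is smaller than minval
    for v in range(minval + 1, maxval + 1):
        nb_before[v] = model.NewIntVar(0, n, f'nb_{v}')
        model.Add(nb_before[v] == nb_before[v - 1] + occ[v - 1])
    # nbBefore[s[i]] < i with 1-based i, i.e. nbBefore[s[i]] <= i with 0-based i
    table = [nb_before[v] for v in range(minval, maxval + 1)]
    for i in range(n):
        idx = model.NewIntVar(0, maxval - minval, f'idx_{i}')
        model.Add(idx == s[i] - minval)  # shift s[i] into table indices
        target = model.NewIntVar(0, n, f'nbs_{i}')
        model.AddElement(idx, table, target)  # target == nbBefore[s[i]]
        model.Add(target <= i)

Calling add_counting_redundancy(model, x, s, 1, n) next to the basic model above posts the counting-based pruning; whether the solver's propagation then matches bound consistency is exactly the open question of this post.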
I still need to prove it formally, but I think this decomposition achieves bound consistency for sortedness. Any comments are welcome (whether you find it useful, you find a counterexample or a better decomposition, or something is not clear…).