Answer Set Programs
Unground and Automatically Generated
A number of programs in lparse/smodels format, along with encode / decode scripts, etc.
A Su-Doku solver; it is very simple and surprisingly quick. It took about 20 minutes to write and about 10 minutes to debug, so I use it as an example of the power of answer set programming as a problem-solving technique. Download (tar ball).
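To give a flavour of why such a solver is so quick to write, the core of a Su-Doku encoding in gringo-style syntax usually looks something like the following sketch. The predicate names here are illustrative, not the ones used in the tarball:

```prolog
% a minimal Su-Doku sketch (gringo-style syntax; names are illustrative)
num(1..9).

% each cell holds exactly one value
1 { cell(R,C,V) : num(V) } 1 :- num(R), num(C).

% a value appears at most once per row and per column
:- cell(R,C1,V), cell(R,C2,V), C1 < C2.
:- cell(R1,C,V), cell(R2,C,V), R1 < R2.

% map each cell to its 3x3 block; no value repeats within a block
block(R,C,((R-1)/3)*3 + (C-1)/3) :- num(R), num(C).
:- cell(R1,C1,V), cell(R2,C2,V), block(R1,C1,B), block(R2,C2,B),
   (R1,C1) != (R2,C2).

% the clues of the instance (given/3 facts) must be respected
:- given(R,C,V), not cell(R,C,V).
```

Each answer set of this program, combined with a set of given/3 facts for the clues, corresponds to a completed grid; the solver does all the search.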
A puzzle similar to Su-Doku with an additional condition that requires sections of the puzzle to be connected (i.e. it requires reachability). Two encodings are given: one encodes reachability and grounds to about 250,000 rules for the given example; the second is only just over 1,000 rules and is tight, but it will give answer sets that are not solutions. Now includes a puzzle generator. Download (tar ball).
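For contrast, the non-tight style of reachability encoding that the first variant uses typically has this shape (again, predicate names are illustrative, not taken from the tarball). The grounding blow-up comes from instantiating the recursive rule over all pairs of adjacent cells:

```prolog
% a sketch of a standard (non-tight) reachability encoding
% in(X,Y) holds for cells of the section; adj/4 links neighbouring cells
reached(X,Y) :- root(X,Y).
reached(X,Y) :- reached(X1,Y1), adj(X1,Y1,X,Y), in(X,Y).

% every cell of the section must be reachable from the root
:- in(X,Y), not reached(X,Y).
```

The recursion through reached/2 is what makes the program non-tight; a tight encoding avoids the loop but, as noted above, can admit answer sets that are not genuine solutions.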
Another Japanese logic puzzle, sometimes known as "bridges". Now includes a puzzle generator. Download (tar ball).
A program that gives solutions to a classic chess problem: visiting every square of a chess board once and only once using a knight. Download (tar ball).
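The knight's tour is essentially a Hamiltonian cycle problem, and a standard gringo-style encoding of it is sketched below (illustrative names; this is not necessarily the encoding in the tarball):

```prolog
% a sketch of a knight's-tour encoding as a Hamiltonian cycle
square(1..8,1..8).

% the eight knight offsets
delta(1,2).  delta(2,1).  delta(2,-1).  delta(1,-2).
delta(-1,-2). delta(-2,-1). delta(-2,1). delta(-1,2).

% choose exactly one outgoing knight move from each square
1 { next(X,Y,X+DX,Y+DY) : delta(DX,DY), square(X+DX,Y+DY) } 1 :- square(X,Y).

% and require exactly one incoming move to each square
:- square(X,Y), not 1 { next(X1,Y1,X,Y) : square(X1,Y1) } 1.

% every square must lie on the single cycle through (1,1)
on(1,1).
on(X2,Y2) :- on(X,Y), next(X,Y,X2,Y2).
:- square(X,Y), not on(X,Y).
```

The in/out-degree constraints force a set of disjoint cycles, and the reachability rules collapse them into one tour covering the whole board.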
Solitaire (also known as peg solitaire or brainvita) is a simple yet surprisingly difficult and addictive game. The program is for an English board: a cross shape with 'arms' 3 pegs wide and 2 pegs long extending from a central 3-by-3 block. To start, all but the center hole are filled; with each move a peg jumps over another (removing it) until only one is left, in the center of the board. Download (tar ball).
Derived from the TOAST code base, this is a series of programs that factor increasingly large semi-prime numbers (products of two primes). Notably, the number of literals is constant but the search space steps up slowly. Download (tar ball).
Program test sets and benchmarks used in various papers.
A pair of sets of programs that model the RC5 symmetric key cipher, as described in "Applied Cryptography". Note that these are not generic benchmark collections: one set has been designed to have constant program size but varying difficulty, and the other has constant difficulty and varying program size. Each program has exactly one answer set, but you should probably set the solver to compute all of them, as this removes the element of luck involved. Also, GrinGo or the ground2smodels program may be needed to convert these files to smodels format. Varying size and varying difficulty. A harder set of varying difficulty benchmarks is also available.
Smodels Benchmark Collection
A mixed collection of programs used to assess the speed of various smodels variants. Many of these programs were generated by / taken from the sets listed above. Download (tar ball).
2009 Solver Competition
This tarball contains a set of scripts that should handle input and output to an answer set solver for the 2009 solver competition. It contains the following:
- Smodels-IE/ the top level directory, contains a program (actually a symlink) for each of the benchmarks. These link to either bin/solve.sh if it is a decision problem, or bin/optimise.sh if it is an optimisation problem; both run the solver chain and format the output.
- Smodels-IE/bin/ the programs required: binaries of the solver and grounder, the wrapper solve.sh and optimise.sh scripts, two perl formatting scripts and a couple of support tools.
- Smodels-IE/encodings/ A collection of unground ASP encodings, borrowed from the lovely folks at Potsdam.
- Smodels-IE/lib/ The parsing library for the format scripts.
To set up for your own solver, do the following:
- Rename your top level directory to match your team name.
- Replace topLevel/bin/solver with your solver.
- If the solver output is radically different to smodels / cmodels / clasp then you may need to add a pattern to topLevel/lib/parseAnswerSet.pm (If so, please let me know as I try to keep it up to date and I'd love to support more solvers.)
- If you want to replace any of the encodings, simply replace the appropriate file in topLevel/encodings/.
- To submit solutions, enter the top level directory and run bin/submitAll.sh
Important things to note:
- Two problems, GrammarBasedInformationExtraction and StrategicCompanies, are not included. The first has (as far as I know) no natural encoding in AnsProlog (currently there is not one in asparagus); the second is of a complexity higher than NP, and while it has an encoding in asparagus, not all solvers support the necessary constructs.
- This is still very much a work in progress.